Role - Data Engineer -- SHADC5784192
Location - Durham, North Carolina
Hybrid - 2 weeks onsite per month
Must Have - Python, SQL, AWS, Data Warehouse, Snowflake, and Oracle
What are the top MUST-have technical skills (experience candidates must have coming in the door)?
1.) Python (must know the Python language to extract critical elements and rebuild them in the new ecosystem)
2.) SQL
3.) AWS (brand-new solutions built on AWS, migrating on-prem databases/pipelines into the AWS ecosystem)
4.) Data Warehouse experience
Key Responsibilities:
- Design, develop, and maintain data pipelines and ETL workflows
- Perform data analysis, data modeling, and creation of data marts
- Build and support enterprise Data Lake solutions using Snowflake on AWS
- Work with relational databases such as Oracle and Snowflake
- Develop data applications using Python
- Collaborate with cross-functional teams in an Agile environment
- Support cloud-based data solutions and migration from on-prem systems
- Contribute to CI/CD and DevOps processes where applicable
Required Skills:
- Strong proficiency in Python for data processing and transformation
- Strong SQL skills for data querying and manipulation
- Hands-on experience with the AWS cloud platform
- Experience in Data Warehousing and data modeling
- Experience with relational databases such as Oracle or Snowflake
- Experience with ETL tools such as Informatica or SnapLogic
Preferred Skills:
- Experience with Business Intelligence and dashboarding tools
- Knowledge of DevOps tools such as Maven, Jenkins, Ansible, and Docker
- Familiarity with Agile methodologies (Scrum, Kanban)
- Experience with other cloud platforms (Azure or Google Cloud)