The Data and Integration Engineer is responsible for designing, developing, and maintaining data and integration solutions that enable seamless data flow across enterprise systems. This role requires expertise in ETL/ELT processes, API integrations, data warehousing, and cloud-based data platforms. The ideal candidate will work closely with business stakeholders, developers, and data teams to ensure efficient data management, high-quality data pipelines, and scalable integration solutions.
ESSENTIAL DUTIES AND RESPONSIBILITIES
- Design, develop, and maintain ETL/ELT pipelines to extract, transform, and load data from multiple sources into data lakes and data warehouses.
- Implement data transformation logic to ensure data quality, consistency, and performance.
- Automate data ingestion and transformation processes using tools like Azure Data Factory (ADF), SSIS, or AWS Glue.
- Work with RESTful and SOAP APIs, JSON, XML, and other data formats.
- Develop and manage API-based integrations between enterprise applications, cloud platforms, and third-party services.
- Implement authentication and security protocols (OAuth, JWT, API keys).
- Design and optimize database structures for data warehousing and reporting solutions.
- Write complex SQL queries, stored procedures, and views to support data extraction and analytics.
- Work with relational databases (SQL Server, PostgreSQL, Snowflake) and NoSQL solutions.
- Develop and deploy data solutions in cloud environments like Azure, AWS, or Google Cloud.
- Utilize cloud storage, compute services, and serverless technologies for scalable data processing.
- Work with modern data lake and lakehouse architectures.
- Implement data governance best practices, including metadata management, data lineage tracking, and access controls.
- Ensure compliance with data privacy regulations (GDPR, HIPAA, CCPA) and security best practices.
- Monitor data integration jobs and troubleshoot failures or performance issues.
- Create technical documentation, process workflows, and integration diagrams.
- Support cross-functional teams in identifying and resolving data-related issues.
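To illustrate the kind of ETL/ELT work described above, the sketch below shows a minimal extract-transform-load step in Python. It is a hedged example, not a production pipeline: an inline JSON payload stands in for a REST API response, and an in-memory SQLite database stands in for a warehouse such as SQL Server or Snowflake; the table name `orders` and the validation rules are illustrative assumptions.

```python
import json
import sqlite3

# Inline payload standing in for a REST API response (a real pipeline
# would fetch this over HTTPS via ADF, Glue, or a scripted API call).
RAW_PAYLOAD = json.dumps([
    {"order_id": 1, "amount": "19.99", "region": "emea"},
    {"order_id": 2, "amount": "5.00",  "region": "AMER"},
    {"order_id": 3, "amount": None,    "region": "apac"},  # fails validation
])

def extract(payload: str) -> list[dict]:
    """Parse the source payload into records."""
    return json.loads(payload)

def transform(rows: list[dict]) -> list[tuple]:
    """Apply data-quality rules: drop rows with no amount, normalize region."""
    clean = []
    for row in rows:
        if row["amount"] is None:
            continue  # skip (or quarantine) records failing validation
        clean.append((row["order_id"], float(row["amount"]), row["region"].upper()))
    return clean

def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    """Idempotent load keyed on order_id, so reruns do not duplicate rows."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER PRIMARY KEY, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW_PAYLOAD)), conn)
print(f"loaded {loaded} rows")  # the invalid record was filtered out
```

The idempotent upsert (`INSERT OR REPLACE` keyed on the primary key) mirrors a common pipeline design choice: reruns after a failure should not create duplicate rows.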
Knowledge, Skills, and Abilities:
- Experience with ETL/ELT tools (Azure Data Factory, SSIS, Informatica, Talend, etc.).
- Experience with SQL and database technologies (SQL Server, Snowflake, PostgreSQL, MySQL).
- Hands-on experience with APIs, web services, and integration tools.
- Familiarity with cloud data platforms (Azure Synapse, AWS Redshift, Google BigQuery).
- Experience with scripting languages (Python, PowerShell, or Bash) for automation.
- Knowledge of DevOps practices, CI/CD pipelines, and Infrastructure as Code (Terraform, ARM templates).
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration abilities.
- Ability to work in a fast-paced environment and manage multiple priorities.
- Strong attention to detail and data accuracy.
Requirements:
- Bachelor’s degree in Computer Science, Information Systems, or a related field.
- 3+ years of experience in data engineering, data integration, or related roles.
PHYSICAL REQUIREMENTS:
- Office environment, with occasional work required outside of normal business hours.
- On-call rotation for critical system and data-integration emergencies.
- Ability to lift and carry up to 50 pounds.
- Ability to work in confined spaces and to climb ladders and stairs.