Position Overview:
We are looking for a Data Engineer to build and operate a scalable, secure, and reliable data platform that enables analytics, data science, and downstream applications. This role focuses on designing core data infrastructure, developer tooling, and self-service capabilities that empower teams to work with data efficiently and safely.
Key Responsibilities:
- Design, build, and maintain the data platform and core data infrastructure.
- Build reusable, self-service data pipelines and platform services for internal teams.
- Implement data orchestration, observability, and monitoring across the platform.
- Ensure data quality, integrity, and reliability across systems.
- Define and enforce data platform standards, best practices, and architectural patterns.
- Integrate data from multiple sources (APIs, databases, streaming systems, etc.).
- Manage and optimize data storage solutions (data warehouses, data lakes).
- Collaborate with cross-functional teams to understand data requirements.
- Monitor, troubleshoot, and improve the performance of data workflows.
- Implement best practices for data security, governance, and compliance.
- Document data architecture, pipelines, and processes.
Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Strong experience building and operating data platforms or large-scale data systems.
- Advanced SQL and strong programming skills (Python, Java, or Scala).
- Strong experience with Informatica (PowerCenter, Informatica Intelligent Cloud Services, or similar).
- Experience with data orchestration tools (Airflow, Dagster, Prefect, or similar).
- Solid understanding of data lake, data warehouse, and lakehouse architectures.
- Experience with infrastructure-as-code (Terraform or similar).