About Delta Tech Hub:
Delta Air Lines (NYSE: DAL) is the U.S. global airline leader in safety, innovation, reliability and customer experience. Powered by our employees around the world, Delta has for a decade led the airline industry in operational excellence while maintaining our reputation for award-winning customer service. With our mission of connecting the people and cultures of the globe, Delta strives to foster understanding across a diverse world and serve as a force for social good.

Delta has quickly emerged as a customer-oriented, innovation-led, technology-driven business, and the Delta Technology Hub contributes directly to these objectives. It sustains our long-term aspirations of delivering niche, IP-intensive, high-value, and innovative solutions. It supports various teams and functions across Delta and is an integral part of our transformation agenda, working seamlessly with a global team to create memorable experiences for customers.
Responsibilities:
Data Pipeline Development and Maintenance:
- Build and optimize scalable ETL/ELT pipelines to ingest data from diverse sources such as APIs, cloud platforms, and databases.
- Ensure pipelines are robust, efficient, and capable of handling large volumes of data.
Data Integration and Harmonization:
- Implement data transformation and enrichment processes to support analytics and reporting needs.
Data Quality and Monitoring:
- Troubleshoot and resolve issues related to data quality, latency, or performance.
Collaboration with Stakeholders:
- Provide technical support and guidance on data-related issues or projects.
Tooling and Automation:
- Leverage cloud-based solutions and frameworks (e.g., AWS) to streamline processes and enhance automation.
- Maintain and optimize existing workflows while continuously identifying opportunities for improvement.
Documentation and Best Practices:
- Document pipeline architecture, data workflows, and processes for both technical and non-technical audiences.
- Follow industry best practices for version control, security, and data governance.
Continuous Learning and Innovation:
- Stay current with industry trends, tools, and technologies in data engineering and marketing analytics.
- Recommend and implement innovative solutions to improve the scalability and efficiency of data systems.
What you need to succeed (minimum qualifications):
- Bachelor of Science degree in Computer Science or equivalent
- 2+ years of experience as a data engineer developing and maintaining data pipelines
- Strong experience with databases and data platforms (AWS preferred)
- Proficiency in Python, SQL, and PySpark
- Experience in data quality, data modeling, data analytics/BI, and data enrichment
- Understanding of concepts such as normalization, slowly changing dimensions (SCD), and change data capture (CDC)
- Experience working with streaming event platforms such as Kafka or Kinesis
- Knowledge of non-relational databases
- Experience with dbt for data transformation and modeling preferred
- Good understanding of data warehouses, ETL/ELT, and AWS architecture (Glue, SQS, SNS, S3, Step Functions, etc.)
- Understanding of orchestration tools such as Airflow
- Ability to create clean, well-designed code and systems
- Strong attention to detail and a commitment to data accuracy
- Proven ability to learn new data models quickly and apply them effectively in a fast-paced environment
- Ability to work collaboratively in a team environment