You will design, develop, and optimize large-scale data pipelines. You will take ownership of critical components of the data architecture, ensuring performance, security, and compliance.
- Design and implement scalable data pipelines for batch and real-time processing.
- Optimize data storage and computing resources to improve cost and performance.
- Ensure data security and compliance with industry regulations.
- Collaborate with data scientists, analysts, and application teams to align data storage strategies.
- Lead technical discussions with stakeholders to deliver the best possible solutions.
- Automate data workflows and develop reusable frameworks.
- Monitor and troubleshoot ETL pipelines, jobs, and cloud services.