Job Title: Senior ETL Developer – Complex Data Flows & High-Performance Systems
Location: Gurugram
Experience: 3+ years
Department: Data Engineering / Analytics
Job Summary:
We are looking for a highly skilled Senior ETL Developer with extensive experience in managing complex data flows, designing scalable pipelines, and optimizing performance in large data environments. The ideal candidate has hands-on experience with ETL tools, strong knowledge of PostgreSQL, and the ability to manage environments with thousands of stored procedures (e.g., 5000+ in a single PostgreSQL setup).
Key Responsibilities:
- Design, build, and maintain robust ETL pipelines for large-scale and complex datasets (see the Airflow sketch after this list)
- Develop, optimize, and troubleshoot SQL queries in PostgreSQL, including in high-concurrency environments
- Work with PostgreSQL setups containing 5000+ stored procedures, or environments of similar scale
- Manage data ingestion from multiple sources, ensuring data integrity, consistency, and availability
- Monitor data workflows, identify bottlenecks, and apply performance tuning
- Collaborate with data architects, analysts, and stakeholders to define and fulfill data requirements
- Ensure data quality, validation, and reconciliation across systems
- Create and maintain documentation for data processes, models, and architecture
- Ensure ETL pipelines meet security, privacy, and compliance standards
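To give candidates a concrete sense of the pipeline work above, here is a minimal sketch assuming Apache Airflow 2.x with its PostgreSQL provider; the DAG ID, connection ID, schedule, and table names are hypothetical placeholders, not a description of our actual stack.

```python
# Minimal sketch of an idempotent load task, assuming Apache Airflow 2.x
# and apache-airflow-providers-postgres. All identifiers are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.postgres.hooks.postgres import PostgresHook


def load_orders(**context):
    # Upsert from a (hypothetical) staging table into the warehouse so
    # that task retries do not create duplicate rows.
    hook = PostgresHook(postgres_conn_id="warehouse_pg")  # assumed connection ID
    hook.run(
        """
        INSERT INTO dw.orders (order_id, amount, updated_at)
        SELECT order_id, amount, updated_at FROM staging.orders
        ON CONFLICT (order_id) DO UPDATE
            SET amount = EXCLUDED.amount,
                updated_at = EXCLUDED.updated_at;
        """
    )


with DAG(
    dag_id="orders_etl",                # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",                 # Airflow 2.4+ argument name
    catchup=False,
) as dag:
    PythonOperator(task_id="load_orders", python_callable=load_orders)
```

An idempotent upsert like this keeps retries safe, which is what the data-integrity and reconciliation responsibilities above are about in practice.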
Required Skills & Experience:
- 3+ years of experience in ETL development and complex data workflows
- Strong hands-on experience with PostgreSQL, including optimization at scale
- Proven ability to manage and process data across large-scale systems (e.g., environments with 5000+ stored procedures)
- Proficient in SQL, PL/pgSQL, and performance tuning (see the tuning sketch after this list)
- Experience with ETL tools such as Talend, Apache NiFi, Informatica, or Airflow
- Familiarity with big data ecosystems (Hadoop, Spark, Kafka) is a plus
- Strong understanding of data modeling, warehousing, and data governance
- Excellent analytical, debugging, and problem-solving skills
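In the same spirit, the sketch below shows the usual starting point of PostgreSQL tuning work, assuming psycopg2 and a reachable database; the DSN and query are hypothetical placeholders.

```python
# Minimal sketch of inspecting a query plan, assuming psycopg2 is
# installed and the database is reachable. DSN and query are hypothetical.
import psycopg2

QUERY = (
    "SELECT order_id, amount FROM dw.orders "
    "WHERE updated_at >= now() - interval '1 day'"
)

with psycopg2.connect("dbname=warehouse user=etl") as conn:  # assumed DSN
    with conn.cursor() as cur:
        # EXPLAIN (ANALYZE, BUFFERS) executes the query and reports the
        # actual plan, row counts, and buffer usage: the evidence needed
        # to decide whether an index or a query rewrite is warranted.
        cur.execute("EXPLAIN (ANALYZE, BUFFERS) " + QUERY)
        for (line,) in cur.fetchall():
            print(line)
```

Comparing estimated versus actual row counts in the plan output is how bottlenecks such as missing indexes or stale statistics are identified before any rewrite.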
Preferred Qualifications:
- Experience in cloud platforms (AWS, GCP, or Azure)
- Familiarity with DevOps and CI/CD practices for data pipelines
- Exposure to real-time streaming data processing
- Knowledge of scripting languages (Python, Bash, etc.)
Education:
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field
Job Types: Full-time, Permanent
Pay: ₹2,500,000 – ₹3,000,000 per year
Ability to commute/relocate:
- Gurugram, Haryana: Able to reliably commute, or planning to relocate before starting work (Preferred)
Application Question(s):
- Are you currently serving your notice period? If yes, what is your last working day?
- What is your expected CTC (ECTC)?