Senior Data Engineer
Experience: 5 to 9 Years
Location: Remote (Work from Home) / Bangalore / India
Mode of Engagement: Full-time
No. of Positions: 2
Educational Qualification: B.E / B.Tech / M.Tech / MCA / Computer Science / IT
Industry: IT / Data / AI / LegalTech / Enterprise Solutions
Notice Period: Immediate
What We Are Looking For:
- 5–9 years of strong experience in Python-based data engineering and end-to-end data pipeline development.
- Hands-on expertise in data extraction, transformation, and automation using Python, SQL, and cloud-based tools.
- Proven ability to build scalable APIs, microservices, and ETL workflows with Docker/Kubernetes.
- Working knowledge of AWS, PostgreSQL/MySQL, and CI/CD pipelines for deployment and monitoring.
- Experience or familiarity with AI/LLM-based automation or data enrichment workflows is an advantage.
Responsibilities:
- Design, build, and maintain data ingestion and transformation pipelines for analytics and automation systems.
- Develop API-driven data delivery frameworks ensuring performance, scalability, and security.
- Work on data scraping and integration from multiple structured and unstructured sources.
- Collaborate with product, AI, and analytics teams to develop intelligent, data-driven workflows.
- Implement data quality, validation, and monitoring standards throughout the pipeline lifecycle.
- Deploy and optimize workloads using Docker, Kubernetes, AWS Lambda, and CI/CD automation.
- Troubleshoot system issues and tune performance for reliability and scalability.
Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.
- Proficiency in Python, FastAPI/Django, SQL, and RESTful API design.
- Hands-on experience with scraping tools (BeautifulSoup, Scrapy) and ETL frameworks.
- Good understanding of AWS ecosystem, Docker, Kubernetes, and Git-based CI/CD workflows.
- Excellent problem-solving, analytical, and collaboration skills.
Job Types: Full-time, Permanent
Pay: ₹60,000 – ₹90,000 per month
Benefits:
- Health insurance
- Paid sick time
- Provident Fund
- Work from home
Application Question(s):
- How many end-to-end data pipeline or automation projects have you delivered successfully?
- What tools or frameworks have you used to build or orchestrate data pipelines (e.g., Airflow, Prefect, Luigi, or custom-built using Python)?
- How many total years of experience do you have in Python-based data engineering (including data pipelines, data scraping, or API automation)?
- How many end-to-end data engineering or AI-integrated projects have you delivered or led successfully?
- How many years of experience do you have working with AWS or other cloud platforms (for deployment or data management)?
- How many large-scale data scraping projects have you handled (mention approximate count)?
- How many projects have involved AI or LLM integration (e.g., LangChain, LlamaIndex, or custom AI models)?
- What is your notice period (in days)?
- What is your current CTC (per month)?
Work Location: Remote