Job Description: Data Engineer
Position: Data Engineer
Location: Indore
Experience: 0-7 years
Employment Type: Full-Time
Job Summary:
We are looking for a skilled Data Engineer to design, develop, and maintain data pipelines, ensuring efficient data processing, storage, and retrieval for business analytics and decision-making. The ideal candidate will have strong experience working with large-scale data systems, ETL processes, and cloud-based platforms.
Key Responsibilities:
- Data Pipeline Development:
  - Design, build, and maintain scalable, reliable, and efficient data pipelines to process and integrate data from various sources.
  - Ensure the timely and accurate flow of data across systems.
- Data Architecture:
  - Develop and optimize data models and architectures for data warehouses, data lakes, and databases.
  - Collaborate with data scientists, analysts, and other stakeholders to ensure data accessibility.
- ETL/ELT Processes:
  - Design and implement ETL/ELT workflows to extract, transform, and load data from diverse sources into target systems.
  - Monitor and troubleshoot ETL processes to maintain data integrity.
- Data Governance:
  - Implement data quality checks, validation processes, and monitoring to ensure accuracy and consistency.
  - Ensure compliance with data privacy regulations and organizational policies.
- Cloud & Big Data Platforms:
  - Work with cloud-based tools (AWS, Azure, GCP) and big data technologies (Hadoop, Spark) for data storage and processing.
  - Optimize the performance and cost-effectiveness of cloud-based data solutions.
- Collaboration:
  - Partner with software engineers, data analysts, and business teams to identify and fulfill data requirements.
  - Provide support for ad-hoc data requests and troubleshoot data issues.
Key Skills and Qualifications:
- Technical Skills:
  - Proficiency in programming languages such as Python, Java, or Scala.
  - Strong knowledge of SQL and database systems (e.g., PostgreSQL, MySQL, MongoDB).
  - Hands-on experience with ETL tools (e.g., Apache NiFi, Talend, Informatica).
  - Familiarity with big data technologies such as Hadoop, Spark, and Kafka.
  - Experience with cloud platforms (AWS, GCP, Azure) and related services (e.g., S3, Redshift, BigQuery).
- Analytical Skills:
  - Solid understanding of data modeling and database design principles.
  - Ability to analyze and interpret large data sets to solve business problems.
- Other Skills:
  - Strong problem-solving and debugging skills.
  - Excellent communication and teamwork abilities.
  - A keen eye for detail and a commitment to delivering high-quality work.
Job Types: Full-time, Permanent, Fresher
Pay: ₹25,000.00 - ₹50,000.00 per month
Benefits:
- Cell phone reimbursement
- Paid sick time
- Provident Fund
Experience:
- Total work experience: 1 year (preferred)
Work Location: In person