About ScholarHat
ScholarHat trains learners in trending technologies to help them secure jobs in MNCs and startups. Our programs include Full Stack, DevOps, Cloud Engineering, Data Engineering, AI/ML, and more—supported with practical projects and placement assistance.
Key Responsibilities
- Train students on Data Engineering fundamentals and advanced concepts.
- Cover essential skills and tools such as SQL, Python, ETL processes, data pipelines, big data frameworks, and cloud-based data solutions.
- Teach technologies such as Apache Spark, Hadoop, Kafka, Snowflake, Databricks, Azure Data Factory, and AWS Glue.
- Explain data warehousing concepts (star schema, OLAP vs. OLTP, data lakes).
- Provide hands-on guidance on real-time data pipelines and end-to-end project implementation.
- Conduct assessments, mentor learners, and ensure strong conceptual clarity.
- Help improve course content to align with industry standards.
Requirements
Education
- B.Tech (CS/IT), BCA, MCA, M.Tech, or related technical degree (mandatory)
Experience & Skills
- 5–10 years in Data Engineering, plus training/mentoring experience.
- Strong SQL & Python fundamentals
- Experience with Big Data tools (Spark, Hadoop, Kafka, Hive)
- Knowledge of cloud platforms (Azure, AWS, GCP)
- Understanding of ETL, data modeling, and pipeline orchestration
- Excellent communication & teaching skills
- Ability to simplify complex data concepts
What We Offer
- Flexible freelance sessions
- Pay per session / per module
- Opportunity to lead major Data Engineering batches
- Scope for full-time conversion based on performance
Job Types: Part-time, Freelance
Contract length: 12 months
Pay: ₹1,000.00 - ₹2,500.00 per hour
Expected hours: 4 – 8 per week
Benefits:
- Flexible schedule
- Work from home
Work Location: Remote