Qureos


Intern Data Engineer

Job Requirements

  • Hires in: Not specified
  • Employment Type: Not specified
  • Company Location: Not specified
  • Salary: Not specified

Work From Office - Gurgaon Location


Build with Python • Think in SQL • Ship Real-World Systems


Role Summary: Data Engineer (Fresher)

We’re looking for a high-potential fresher who doesn’t just “know” Python and SQL — but knows how to use them to solve real-world business problems.


About the Company

At CashKaro/EarnKaro/BankKaro, the Data Engineering team powers everything from campaign automation and user retention logic to influencer analytics and fraud detection. If you’ve been building tools, scraping data, optimizing queries, or wiring systems together, this is the role where you turn that passion into impact.

This is not a purely academic or theoretical data role. You'll be expected to ship code, move data, scrape sources, and automate pipelines that power business-critical operations across marketing, product, and growth.


Key Responsibilities

Data Engineering & Automation

  • Build and maintain Python-based scripts and ETL pipelines for business teams and reporting use cases.
  • Automate repetitive data operations using Python, APIs, and SQL.
  • Develop robust web scraping pipelines to extract data from affiliate portals, third-party sources, and internal tools (a minimal sketch follows this list).
  • Support creation of production-ready pipelines used in campaign workflows, user segmentation, fraud alerts, and more.
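
To give a flavour of the scraping work, here is a minimal sketch using requests and BeautifulSoup. The portal URL, table structure, and field names are hypothetical placeholders, not a real CashKaro source.

    import requests
    from bs4 import BeautifulSoup

    # Hypothetical affiliate-portal URL -- placeholder only.
    PORTAL_URL = "https://example.com/affiliate/transactions"

    def fetch_transactions(url: str = PORTAL_URL) -> list[dict]:
        """Fetch a page and pull rows out of a simple HTML table."""
        response = requests.get(url, timeout=30)
        response.raise_for_status()
        soup = BeautifulSoup(response.text, "html.parser")
        rows = []
        for tr in soup.select("table.transactions tr")[1:]:  # skip the header row
            cells = [td.get_text(strip=True) for td in tr.find_all("td")]
            if len(cells) >= 3:
                rows.append({"order_id": cells[0], "amount": cells[1], "status": cells[2]})
        return rows

    if __name__ == "__main__":
        for row in fetch_transactions():
            print(row)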

SQL & Data Validation

  • Write advanced SQL queries (joins, window functions, CTEs) to extract and validate business-critical data (see the sketch after this list).
  • Perform data sanity checks to ensure accuracy in dashboards and downstream systems.
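
As an illustration of the kind of validation query involved, the sketch below runs a CTE-plus-window-function sanity check over a PostgreSQL/Redshift connection via psycopg2. The DSN, table, and column names are hypothetical.

    import psycopg2  # Redshift speaks the PostgreSQL wire protocol

    # Illustrative sanity check: flag users whose most recent order has a
    # non-positive amount. Table and column names are hypothetical.
    QUERY = """
    WITH ranked_orders AS (
        SELECT
            user_id,
            order_id,
            amount,
            ROW_NUMBER() OVER (PARTITION BY user_id ORDER BY created_at DESC) AS rn
        FROM orders
    )
    SELECT user_id, order_id, amount
    FROM ranked_orders
    WHERE rn = 1 AND amount <= 0;
    """

    def run_sanity_check(dsn: str) -> list[tuple]:
        """Return rows that fail the check; an empty list means the data looks sane."""
        with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
            cur.execute(QUERY)
            return cur.fetchall()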

Collaboration & Execution

  • Work closely with Data Analysts, Marketing, and Product teams to implement logic as per business requirements.
  • Debug issues in automation and scraping pipelines and suggest optimizations.
  • Own tasks end-to-end — from logic to deployment.

Learning & Growth

  • Gain hands-on exposure to AWS, Jenkins, Redshift, Power BI, and more.
  • Participate in internal L&D sessions to level up on tools, business context, and best practices.


Who You Are

  • Builder mindset: You’ve built or automated something real using Python (scripts, scrapers, bots, pipelines, etc.).
  • SQL-native: You don’t just write SELECT * — you know how to slice and dice data like a pro.
  • System thinker: You think in workflows, not just code.
  • Business curious: You ask why something is being built, not just how.
  • Hustler attitude: You learn fast, adapt quickly, and love shipping real outcomes.


Must-Have Skills

  • Strong proficiency in Python and SQL
  • Experience with web scraping using libraries like requests, BeautifulSoup, or Selenium
  • Comfort with data structures like lists, dicts, JSON, and API usage in Python (illustrated after this list)
  • Solid understanding of joins, aggregations, filtering, and window functions in SQL
  • Ability to independently learn and debug
  • Strong sense of ownership and urgency
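
As a quick illustration of the dicts/JSON/API point above, here is a minimal sketch; the endpoint and response shape are hypothetical placeholders.

    import requests

    # Hypothetical internal API endpoint and response shape -- placeholders only.
    API_URL = "https://api.example.com/v1/campaigns"

    def active_campaign_names(api_url: str = API_URL) -> list[str]:
        """Call a JSON API and filter the parsed dicts/lists with plain Python."""
        response = requests.get(api_url, params={"status": "active"}, timeout=30)
        response.raise_for_status()
        payload = response.json()                      # parsed into dicts and lists
        campaigns = payload.get("campaigns", [])
        return [c["name"] for c in campaigns if c.get("budget", 0) > 0]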


Good-to-Have Skills

  • Exposure to Jenkins, AWS (S3, Redshift), or other cloud tools
  • Familiarity with Git & working on multiple branches
  • Experience with cron jobs, scheduled scripts, or automation triggers (a sample scheduled-upload script follows this list)
  • Basic understanding of affiliate marketing or e-commerce funnels
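
For the cron and S3 items above, here is a minimal sketch of a scheduled upload script using boto3; the bucket name, key, and schedule are hypothetical.

    import boto3

    # Hypothetical bucket and key -- placeholders only. Credentials are expected
    # to come from the environment or an attached IAM role.
    BUCKET = "example-reports-bucket"

    def upload_daily_extract(local_path: str = "daily_extract.csv",
                             key: str = "reports/daily_extract.csv") -> None:
        """Upload a local CSV to S3; meant to be triggered by cron or Jenkins."""
        s3 = boto3.client("s3")
        s3.upload_file(local_path, BUCKET, key)

    if __name__ == "__main__":
        # Example cron entry (illustrative): 0 6 * * * python upload_daily_extract.py
        upload_daily_extract()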


Tools You’ll Use

  • Python
  • SQL (Redshift/PostgreSQL)
  • Jenkins
  • AWS S3, Redshift
  • Git, VS Code, Postman
  • Power BI, Excel (as needed)
  • Scraping libraries: requests, BeautifulSoup, Selenium, Scrapy
