Big Data Engineering
Remote
Good communication skills required
Must-have skills: SQL, PySpark, Databricks, AWS, Python
Experience: 5-10 years
Job Description:
1) Experience developing AWS Databricks data pipelines: DLT, streaming and data-ingestion patterns, Unity Catalog migration
2) Independently develop AWS data extraction and integration code modules for one-time historical and incremental data flows from the Databricks delta lake using PySpark, Python, SQL, AWS Databricks, and S3
3) Develop scalable data ingestion pipelines as needed
4) Implement advanced transformation logic; schedule, orchestrate, and validate pipelines
5) Perform data validation and data reconciliation
6) Deploy components
7) Perform unit testing and document test results
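As a rough illustration of the validation and reconciliation work in item 5, a minimal check between a source extract and its target might compare row counts and primary keys. This is only a sketch under assumptions: the `reconcile` helper, the `id`/`amt` columns, and the sample records are all hypothetical, and in the role described this logic would run against Delta tables via PySpark rather than plain Python lists.

```python
from collections import Counter

def reconcile(source_rows, target_rows, key):
    """Compare two extracts: row counts, missing keys, and duplicate keys.

    source_rows / target_rows are lists of dicts; `key` names the
    primary-key column. Returns a dict summarising discrepancies.
    """
    src_keys = Counter(r[key] for r in source_rows)
    tgt_keys = Counter(r[key] for r in target_rows)
    return {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "missing_in_target": sorted(set(src_keys) - set(tgt_keys)),
        "extra_in_target": sorted(set(tgt_keys) - set(src_keys)),
        "duplicate_source_keys": sorted(k for k, n in src_keys.items() if n > 1),
    }

# Hypothetical extracts: one record failed to land in the target.
source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}, {"id": 3, "amt": 30}]
target = [{"id": 1, "amt": 10}, {"id": 3, "amt": 30}]
report = reconcile(source, target, key="id")
```

A real pipeline would typically extend this with per-column aggregate checks (sums, min/max) before signing off a load.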
Primary Skills: AWS Databricks, SQL, Python
Secondary Skills: PySpark, AWS CloudWatch, AWS S3, Streaming
Job Types: Full-time, Permanent, Contractual / Temporary, Freelance
Contract length: 12 months
Pay: ₹4,000,000.00 - ₹6,000,000.00 per year
Work Location: Remote