Job Description:
- Design, develop, and optimize data platforms that support enterprise-wide data management and analytics.
- Architect scalable solutions using modern cloud-native data warehouses such as AWS Redshift, Snowflake, and Azure Synapse Analytics.
- Leverage microservices architecture to support various data workloads, including data marts, data warehouses, and AI/ML applications.
- Implement and test Python applications with a strong understanding of object-oriented programming, multi-tier architecture, and parallel/multi-threaded programming.
- Leverage expertise in Hadoop and AWS cloud services such as Lambda, Step Functions, EventBridge, DynamoDB, and CloudFormation to build efficient, secure, and scalable data solutions.
- Utilize predictive analytics to enhance data workflows, supporting real-time insights and data-driven decision-making.
- Play a critical role in improving operational efficiency, enabling business intelligence, and driving innovation in cloud-based data platforms.

This position requires up to 10% travel on an as-needed basis for project meetings, trainings, and implementation support. The position may work at various and unanticipated worksites throughout the United States. Telecommuting permitted.
Job Requirements:
Requires a Bachelor’s degree in Computer Science, Information Technology, or a directly related field, plus three (3) years of experience in data engineering, cloud computing, or software development. Must have at least three (3) years of experience with the following:
- Cloud-based data platforms such as AWS, Azure, or Google Cloud.
- Expertise with Hadoop ecosystem tools (HDFS, Sqoop, Hive, MapReduce, HBase, Airflow).
- Data processing frameworks and tools, including Spark, Python, SQL, and ETL pipelines.
- Database management and data modeling in relational and non-relational databases (e.g., Redshift, Snowflake, DynamoDB).
- Understanding of microservices architecture and its application in data-driven environments.
- Familiarity with infrastructure-as-code tools such as AWS CloudFormation or Terraform.
- Troubleshooting complex data integration issues.
- Experience with GitHub/Bitbucket Actions, CI/CD pipelines, and deployment workflows across dev-to-prod environments.
- Experience in data visualization using BI tools, including dashboard/report creation.
- Collaborating with cross-functional teams, including data governance, DevOps, and business stakeholders.

At least one relevant certification in AWS, Azure, or Google Cloud is required.
40 hours/week, 9:00am-5:00pm, Salary range: $140,000 to $150,000 per year.
To apply: Send resume and cover letter to us.careers@exlservice.com. Must cite job title and code EXL58 in response. This notice is subject to ExlService.com, LLC's employee referral program. EEO/Minorities/Females/Vets/Disabilities.