NETS International Group is seeking an experienced Data Modelling Engineer to design, develop, and optimize data models supporting advanced analytics, business intelligence, and cloud-based data solutions. The ideal candidate will build scalable data pipelines, define robust data architectures, and ensure efficient integration across platforms such as MySQL, PostgreSQL, MongoDB, Hadoop, and Azure Data Lake.
You will collaborate with data engineers, analysts, and solution architects to establish a unified data foundation aligned with business objectives and analytics needs.
Key Responsibilities:
- Design, develop, and maintain conceptual, logical, and physical data models to support enterprise data architecture and reporting needs.
- Build and manage data pipelines and integrations using ETL tools such as Talend and Informatica, along with custom scripts (Bash/Unix shell, Python).
- Collaborate with cross-functional teams to capture business requirements and translate them into optimized data models and schemas.
- Design and optimize database structures in SQL-based platforms (MySQL, PostgreSQL, SQL Server, Oracle) and NoSQL systems (MongoDB).
- Implement and maintain data processing across big data ecosystems — Hadoop, Spark, Hive, and Azure Data Lake.
- Ensure data accuracy, consistency, and governance across distributed data environments.
- Develop data lineage and metadata documentation to support transparency and compliance.
- Optimize query performance and storage design for analytics workloads.
- Collaborate with analytics and BI teams (e.g., using Looker, Power BI) to deliver accurate and performant data models.
- Participate in Agile development cycles, ensuring timely delivery of data solutions.
- Maintain documentation and provide technical support for deployed models and data architectures.
Required Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Information Systems, or Data Engineering.
- 4–7 years of experience in data modelling, data architecture, or database design.
- Proven experience with SQL and NoSQL databases (MySQL, PostgreSQL, MongoDB).
- Strong understanding of data warehouse and data lake architectures.
- Hands-on experience with ETL tools (Talend, Informatica) and scripting (Bash, Python).
- Familiarity with big data platforms (Hadoop, Spark, Hive).
- Experience in data governance, metadata management, and lineage documentation.
- Knowledge of cloud platforms (AWS, Azure Data Lake, or GCP).
- Excellent understanding of database performance tuning and query optimization.
- Knowledge of dimensional modelling techniques (Star/Snowflake schemas).
- Familiarity with RESTful APIs and data integration for analytical applications.
- Experience with BI tools (Looker, Power BI, or Tableau).
- Exposure to containerized environments (Docker, Kubernetes).
- Understanding of Agile methodologies for collaborative delivery.
Job Type: Full-time
Ability to commute/relocate:
- Muscat: Reliably commute or plan to relocate before starting work (Required)
Application Question(s):
- What is your current monthly salary (with currency)?
- What is your expected monthly salary (with currency)?
- What is your notice period?
Experience:
- Data Modelling Engineering: 4 years (Required)