You will be participating in exciting projects covering the end-to-end data lifecycle – from raw data integrations with primary and third-party systems, through advanced data modeling, to state-of-the-art data visualization and development of innovative data products.
You will have the opportunity to learn how to build and work with both batch and real-time data processing pipelines. You will work in a modern cloud-based data warehousing environment alongside a team of diverse, intense, and interesting co-workers. You will liaise with other departments – such as product & tech, the core business verticals, trust & safety, finance, and others – to enable them to be successful.
Your Responsibilities
- Design, implement, and support data warehousing
- Raw data integrations with primary and third-party systems
- Data warehouse modeling for operational and application data layers
- Development in an Amazon Redshift cluster
- SQL development as part of an agile team workflow
- ETL design and implementation in Matillion ETL
- Real-time data pipelines and applications using serverless and managed AWS services such as Lambda, Kinesis, and API Gateway
- Design and implementation of data products enabling data-driven features or business solutions
- Building data dashboards and advanced visualizations in Sisense for Cloud Data Teams (formerly Periscope Data), with a focus on UX, simplicity, and usability
- Working with other departments on data products – e.g. product & technology, marketing & growth, finance, core business, advertising, and others
- Being part of and contributing to a strong team culture and an ambition to be on the cutting edge of big data
- Evaluating and improving data quality by implementing test cases, alerts, and data quality safeguards
- Living the team values: Simpler. Better. Faster.
- Strong desire to learn
Required Minimum Experience (Must)
- 1-3 years of experience in data processing, analysis, and problem-solving with large amounts of data
- Good SQL skills across a variety of relational data warehousing technologies, especially cloud data warehouses (e.g. Amazon Redshift, Google BigQuery, Snowflake, Vertica)
- 1+ years of experience with one or more programming languages, especially Python
- Ability to communicate insights and findings to a non-technical audience
- Written and verbal proficiency in English
- Entrepreneurial spirit and ability to think creatively; highly driven and self-motivated, with strong curiosity and a drive for continuous learning
- Top-of-class university technical degree in a field such as computer science, engineering, math, or physics
Additional Experience (Strong Plus)
- Experience working with customer-centric data at big-data scale, preferably in an online / e-commerce context
- Experience with modern big data ETL tools (e.g. Matillion)
- Experience with the AWS data ecosystem (or other cloud providers)
- Track record in business intelligence solutions, building and scaling data warehouses, and data modeling
- Tagging, tracking, and reporting with Google Analytics 360
- Knowledge of modern real-time data pipelines (e.g. Serverless Framework, Lambda, Kinesis)
- Experience with modern data visualization platforms such as Periscope, Looker, Tableau, Google Data Studio, etc.
- Linux, bash scripting, JavaScript, HTML, XML
- Docker containers and Kubernetes