Senior Data Engineer
Description
SciPlay is a leading developer and publisher of digital games on mobile and web platforms, providing highly entertaining free-to-play games that millions of people play every day.
We're looking for a Senior Data Engineer to work on our massive data processing pipeline and lead our data lake and data warehouse building, to help us deliver more insights and scale our data infrastructure. You will have a chance to contribute to the company's evolving culture, bring innovative approaches and learn from your talented colleagues.
Key Responsibilities
- Designs and develops programs and tools that support data ingestion, curation, and provisioning of complex enterprise data for analytics and reporting on our current technology stack
- Designs and builds data extracts, integrations and transformations
- Ensures successful deployment and provisioning of data solutions to the required environments
- Designs and builds data architecture and applications that successfully enable speed, quality and efficient pipelines
- Interacts with cross-functional customers and the development team to gather and define requirements
- Develops an understanding of the data and builds business acumen
- Reviews discrepancies in requirements and resolves them with stakeholders
- Identifies and recommends continuous improvement opportunities, and ensures integrations are automated and have proper exception handling
- Serves as a key member of the project team designing and deploying a ground-up cloud data pipeline
Requirements
- Bachelor's or Master's degree in Computer Science, Information Systems, or another technical field
- People person, team player with a strong can-do mentality
- 5 years of experience as a Data Engineer on a data and analytics team
- Proficient in data modelling principles
- Proficiency in Snowflake and other data warehousing solutions (Redshift / BigQuery)
- Proficiency in Databricks
- Experience building streaming data pipelines and ETL using Spark and Elastic
- Advanced experience with AWS cloud services
- Advanced knowledge in Python
- Advanced working SQL experience as well as performance tuning
- Experience with data pipeline workflow management tools such as Airflow or Astronomer
- Knowledge and ability to write, test, and debug APIs
- Experience working with agile methodologies and working in cross-functional teams
- Must be proactive, demonstrate initiative and be a logical thinker
- Must be an inquisitive learner and have a thirst for improvement
- Ability to understand and apply customer requirements, including drawing out unforeseen implications and making design recommendations
- Strong facilitation and consensus-building skills; strong oral and written communication skills, with the ability to simplify complexity
- Ability to provide work guidance to junior level developers
- Enjoys continued learning, keeps track of industry best practices and trends, and applies that knowledge to process and system improvement opportunities
- Proactive, requiring minimal supervision, with strong time management and work organization skills
- Ability to prioritize workload, handle multiple tasks, and at times meet tight deadlines
- Strong problem-solving and analytical skills
- Has worked on data quality improvement projects such as Master Data Management
Highly Desired AWS certifications (any):
- AWS Certified Solutions Architect – Associate/Professional
- AWS Certified Developer – Associate/Professional
- AWS Certified DevOps Engineer
- AWS Certified Data Analytics
- AWS Certified Security - Specialty
- AWS Certified Cloud Practitioner