Sr Technical Lead
The incumbent will be part of the Data Engineering Team, responsible for developing and maintaining solutions on integration platforms such as Talend, Stitch, and Scribe, or in custom code, and for working closely with a team of developers, QAs, and Support Analysts to support building a data lake in Google Cloud BigQuery and Snowflake. This individual reports functionally to the Manager, Enterprise Data & Insights, who shapes technology demand among the customer-facing employee community across Aristocrat’s global businesses.
The role requires a broad range of technical and functional knowledge, with a good blend of problem-solving and communication skills, and the ability to align regional requirements with global templates and deliver solutions across multiple projects.
What you'll do
- We are looking for a Data Engineer with SQL and Python skills to join our team on an immediate basis.
- Build the data pipelines required for optimal extraction, transformation, and loading of data from a wide variety of sources into data lakes on Google Cloud BigQuery and Snowflake, using Python, SQL, Talend, and Stitch.
- Work with the Architect and Data & Analytics teams to assist with data-related technical issues and support their data processing needs.
- Produce clean, standards-based, modern code with an emphasis on advocacy toward end users, yielding high-quality ETL/data pipeline jobs.
- Proven experience monitoring and troubleshooting production ETL systems and developing data models.
- Prior experience integrating with different types of data sources (API, batch, stream)
- A good track record of handling ambiguous requirements and creating design specs.
- Prior experience with GitHub
- Demonstrate an understanding of technology and digital frameworks in the context of data integration
- Ensure code and design quality through the execution of test plans and assist in development of standards, methodology and repeatable processes, working closely with internal and external design, business, and technical counterparts.
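As a rough illustration of the extract-transform-load pattern the responsibilities above describe, here is a minimal sketch in Python. All table and field names are hypothetical, and the standard-library sqlite3 module stands in for a warehouse such as BigQuery or Snowflake (whose client libraries this sketch does not cover):

```python
import csv
import io
import sqlite3

def extract(csv_text):
    """Extract: parse raw CSV rows from a source extract (hypothetical schema)."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: normalize types, clean strings, and drop rows missing a key."""
    records = []
    for row in rows:
        if not row.get("player_id"):
            continue  # skip records without a primary key
        records.append(
            (int(row["player_id"]), row["game"].strip().lower(), float(row["wager"]))
        )
    return records

def load(conn, records):
    """Load: idempotent write into the target table (re-runs overwrite by key)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS wagers "
        "(player_id INTEGER PRIMARY KEY, game TEXT, wager REAL)"
    )
    conn.executemany("INSERT OR REPLACE INTO wagers VALUES (?, ?, ?)", records)
    conn.commit()

# Sample source extract: one dirty string, one row missing its key.
raw = "player_id,game,wager\n1, Buffalo ,10.5\n,unknown,0\n2,Lightning,3.0\n"
conn = sqlite3.connect(":memory:")
load(conn, transform(extract(raw)))
print(conn.execute("SELECT COUNT(*) FROM wagers").fetchone()[0])  # prints 2
```

The same shape (extract from an API or batch file, transform in Python, load via an idempotent upsert) carries over to the tools named in this posting; Talend and Stitch jobs wrap equivalent steps in their own tooling.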
Mandatory:
- Experience in Snowflake Data Warehouse
- Experience in Talend Studio and TAC/TMC (designing, developing, validating, and deploying the Talend ETL Pipelines)
Desirable:
- Proficient in Python (min 4 years)
- Proficient in writing advanced SQL.
- Experience consuming GCP cloud services, especially GCS, BigQuery, Cloud Functions, and other offered services.
- Automation, orchestration, and performance tuning of ETL processes, along with implementing best practices
- Experience with development and production support
Non-Technical Requirements:
- Proven success in contributing to a team-oriented environment.
- Proven ability to work creatively and analytically in a problem-solving environment.
- Excellent communication (written and oral) and interpersonal skills
What we're looking for
Qualifications a candidate might have that could enhance their ability to be successful:
- B.Tech in Computer Science or equivalent
- Relevant experience such as gaming, embedded systems, or work on data lake and ETL/MicroStrategy-based projects.
Travel Expectations
None