Job Title: Data Engineer
Requisition ID: R017420
Department: Information Technology
Location: Austin, TX (Open to Remote)
Your Mission
Activision is seeking a Data Engineer to join the Corporate IT Data and Analytics team. The Data Engineer will work closely with architects, product owners, business analysts, and various functional teams to understand data requirements and implement solutions in Google Cloud Platform, and will be responsible for designing and developing data pipelines that extract, transform, and load data from internal and external sources, in varied formats, into the cloud data warehouse.
The position offers opportunities to push the boundaries of new business capabilities using new Google technologies, including AI and ML. If you want to create amazing user experiences using the latest technologies, then this is the right job for you!
Player Profile
- 3+ years’ experience developing solutions utilizing programming languages, modules, and techniques such as:
- Python
- Google Cloud API
- REST/SOAP API libraries
- Pandas/Koalas/Dask Dataframes
- Pyspark
- SQL
- Stored Procedures
- Slowly Changing Dimensions
- Data Modeling
- Clustering, Partitioning, and Sharding
- Dynamic SQL
- Other Skills
- Unix Shell Scripts
- Version Control (Git, TFS, SVN, etc.)
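To illustrate one of the techniques named above, here is a minimal sketch of a Slowly Changing Dimension Type 2 update, written in pure Python over in-memory dicts with invented field names; in a real warehouse this would typically be a MERGE statement against BigQuery rather than application code.

```python
from datetime import date

def scd2_upsert(dimension, incoming, today):
    """SCD Type 2: expire current rows whose tracked attribute changed,
    then append a new current row for each changed or brand-new key.
    `dimension` is a list of row dicts; `incoming` maps business key
    to the latest attribute value."""
    for row in dimension:
        if (row["current"] and row["key"] in incoming
                and row["name"] != incoming[row["key"]]):
            row["current"] = False      # close out the old version
            row["valid_to"] = today
    still_current = {r["key"] for r in dimension if r["current"]}
    for key, name in incoming.items():
        if key not in still_current:    # changed or new key gets a fresh row
            dimension.append({"key": key, "name": name,
                              "valid_from": today, "valid_to": None,
                              "current": True})
    return dimension

dim = [{"key": 1, "name": "Austin", "valid_from": date(2023, 1, 1),
        "valid_to": None, "current": True}]
scd2_upsert(dim, {1: "Austin, TX", 2: "Remote"}, today=date(2024, 6, 1))
# dim now holds the expired "Austin" row plus two current rows.
```

The same close-out-and-append logic is what a SQL MERGE expresses declaratively when the dimension lives in a columnar warehouse.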
- 3+ years’ experience in designing, building and operationalizing enterprise data solutions and applications with experience in at least one of the following scenarios:
- Migrating data pipelines/applications from on-premises to the cloud.
- End to end data warehouse implementations.
- Data warehouse design and data modeling.
- Implementing data segregation and security policies.
- 2+ years’ experience working in SQL using relational/columnar databases such as:
- Google BigQuery (preferred)
- Snowflake
- Azure Synapse
- Amazon Redshift
- 2+ years’ experience with Google Cloud Platform, or similar cloud tools such as:
- Cloud Composer/Apache Airflow
- Cloud Storage
- Secret Manager
- Cloud Functions
- PubSub
- Dataflow/Apache Beam
- 2+ years’ experience administering, troubleshooting, and implementing solutions with visualization tools such as:
- Tableau (preferred)
- Looker
- Power BI
- 3+ years’ experience working with an ELT/ETL tool for data transformation such as:
- Data Build Tool (DBT)
- Azure Data Factory
- Matillion
- Fivetran
- Talend
- Informatica
- Highly organized team player with the ability to innovate, multi-task and set priorities effectively.
- Self-starter with strong verbal and written communication skills.
- Ability to interface effectively and decisively with all infrastructure teams, various levels of management & departments.
Priorities can often change in a fast-paced environment like ours, so this role includes, but is not limited to, the following responsibilities:
- Work with architects and business partners to build out technical solutions to import data from external cloud and internal application sources, utilizing industry standards to cleanse, integrate, transform, and load this data into the cloud data warehouse.
- Partner with cross-functional teams to design and implement data requirements to drive organizational strategies and objectives.
- Develop and support the entire data warehouse lifecycle, including requirements gathering, data profiling, design, development, testing, ongoing support and enhancements for analytics and management teams.
- Design, implement & maintain ETL/ELT procedures for intake of data from both internal and cloud sources.
- Ensure data flowing through the data pipelines is verified and quality is checked.
- Create complex SQL queries, data transformation & aggregations to support analytics.
- Consult with business partners, analytics teams, management, and other business analysts to clarify program objectives, determine scope, identify problems, and recommend solutions.
- Apply comprehensive knowledge and understanding of relational database concepts, including data architecture, operational data stores, ETL/ELT processes, interface processes, multidimensional modeling, data warehouse concepts, master data management, and data manipulation.
- Work with REST APIs to integrate third party data sources to the data warehouse.
- Support implemented BI solutions by working with the infrastructure teams to carry out monitoring, tuning, and performance analysis, addressing user questions regarding data integrity, and communicating functional and technical issues.
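As a sketch of the REST-integration responsibility above, the following pure-Python helper flattens a nested JSON record into a single-level, warehouse-ready row; the payload shape and column names are invented for illustration, and in a real pipeline the JSON would come from an HTTP client rather than a string literal.

```python
import json

def flatten_record(record, parent_key="", sep="_"):
    """Flatten one nested JSON object into a single-level dict so it
    can be loaded as a warehouse row; nested keys are joined with `sep`."""
    row = {}
    for key, value in record.items():
        col = f"{parent_key}{sep}{key}" if parent_key else key
        if isinstance(value, dict):
            row.update(flatten_record(value, col, sep))
        else:
            row[col] = value
    return row

# Hypothetical payload shaped like a typical REST API response; in a
# real pipeline this would come from e.g. requests.get(...).json().
payload = json.loads('{"id": 7, "player": {"name": "Ana", "region": "NA"}, "score": 120}')
row = flatten_record(payload)
# row == {"id": 7, "player_name": "Ana", "player_region": "NA", "score": 120}
```

Flattening like this is a common pre-load step before streaming rows into a columnar warehouse table.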
Pluses
- Experience with ML & AI capabilities.
- Experience working with Kubernetes.
- Experience working in the entertainment or video game industries.
- Experience working in the Azure data stack.
- Experience with Unix/Windows Server Administration