SENIOR DATA ENGINEER
Who We Are:
2K publishes some of the most popular video game franchises on the planet, including Mafia, Borderlands, BioShock, NBA 2K, WWE 2K, Evolve, XCOM, and Sid Meier’s Civilization. The analytics group is responsible for collecting, processing, and applying data to identify problems and opportunities across the company and to build solutions for them. As part of this, the analytics team works with the studios and other partners to build reliable, scalable, and high-performance data pipelines that help inform all aspects of our business. From data designed to improve our development processes to data designed to drive critical business decisions, our group constantly faces fun and challenging problems in the big data space.
What We Need:
We are looking for a Data Engineer to join our growing data engineering team within the analytics group. You will collaborate with cross-functional teams, studios, and external data providers to architect, design, and develop a metadata-driven data pipeline framework focused on reusability, scalability, and productivity. You will also work with data analysts and data scientists to understand data requirements and to design and develop pipelines that ingest data from multiple disparate sources. It’s a once-in-a-lifetime opportunity to be part of a great team chartered to define the future of the data and analytics platform for 2K. This position reports directly to the Director of Data Engineering and is located at our Novato office.
What You Will Do:
- Actively contribute to the architecture and design of the data platform and our data engineering practice
- Design and develop a data pipeline framework for ingesting structured and unstructured data
- Work with the game, marketing, and analytics teams to gather requirements, build, test, and deploy new data pipelines based on business requirements
- Translate data and BI requirements into technical design documents and data mapping documents
- Mentor junior engineers and be part of their career growth
Who We Think Will Be a Great Fit:
If you have experience building large-scale data solutions and are interested in a new challenge shaping the future of data and platform capabilities in a fast-paced environment, we think you will be a great fit and we’d love to hear from you!
- 5+ years of experience developing near-real-time data pipelines
- Strong hands-on experience with object-oriented/functional scripting languages: Python (preferred), Java, Scala, etc.
- Expert knowledge of data warehouse concepts and implementation of dimensional and star schema models
- Experience with data pipeline and workflow management tools: Luigi, Airflow, etc.
- Experience with AWS cloud services: EC2, EMR, Kinesis, RDS
- Prior implementation experience with stream-processing systems: Storm, Kafka, Spark Streaming, etc.
- Proficiency with source control and build/deploy tools such as Perforce/Git and Jenkins
- Strong project management, organizational and interpersonal skills