Data Engineer II
Vancouver British Columbia Canada
An excellent opportunity has arisen to join our Global Analytics & Insights department as a Data Engineer, working with a team of highly skilled developers and delivering complex development projects for internal systems across a variety of products and technologies.
This role offers challenges across a wide variety of projects and responsibilities, including the opportunity to influence the future direction of the department and systems used across the business.
To apply for this role, you must have proven technical capability and a track record in software development/data engineering. You are a motivated individual who is experienced in taking responsibility and can independently deliver results on both departmental and business projects.
- Design, model, develop and maintain data sets to support reporting, analytics and exploratory analysis
- Capable of understanding and contributing to the technical solution from design through to code level
- Contribute to technical design and ongoing development of our custom ETL solutions and analytics platforms, and help drive continuous improvement of design and delivery standards
- Work with big data developers to build scalable and supportable infrastructure
- Employ a variety of languages and tools (e.g. scripting languages) to marry systems together
- Assess, recommend and support the implementation of available and emerging big data technologies
- Participate in reviews and meetings
- Recommend ways to improve data reliability, efficiency and quality
- Collaborate with data architects, modelers and IT team members on project goals
- Contribute to post-implementation reviews, helping to demonstrate success
- Work on a wide range of projects involving the implementation of new and existing systems, solutions and processes
- A Bachelor’s or Master’s degree in computer science or software engineering
- Advanced SQL knowledge and experience with relational databases, including query authoring, as well as working familiarity with a variety of databases
- Experience building and optimizing ‘big data’ data pipelines, architectures and data sets
- Working knowledge of message queuing, stream processing, and highly scalable ‘big data’ data stores
- Strong analytic skills related to working with unstructured datasets
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
- Experience working with big data technologies including HDFS, Hive, Spark, Storm and Impala
- Experience in the development of software using Python, SQL, NoSQL and SnowSQL
- Experience in the physical and logical design of database architecture for relational databases, data warehouses and data lakes
- Experience with database optimization, data replication, database recovery, DR failover and HA solutions, and performance tuning
- Experience with distributed computing using a cloud-based tech stack on platforms such as AWS and Azure
- Excellent personal organization and ability to prioritize and carry out multiple tasks
- Ability to adapt to new technology and research troubleshooting techniques and best practices
- Ability to efficiently debug problems and issues with little supervisory input
- Able to influence and drive projects to meet key milestones and overcome challenges
- Comfortable working without routine supervision
- A desire to remain technically capable and an expert in current technologies
- We are looking for a candidate with 5+ years of experience in a Data Engineer role who holds a graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field. They should also have experience with the following software/tools:
- Experience with big data tools: Hadoop, Hive, Spark, etc.
- Experience with relational SQL and NoSQL databases, including Postgres and Cassandra
- Experience with data pipeline and workflow management tools: Airflow, etc.
- Experience with AWS cloud services (ECS, EKS, EMR, RDS, ELB) and Docker
- Experience with Snowflake
- Experience with object-oriented/functional scripting languages: Python, C#, etc.