Senior Computer Vision Cloud Data Backend Engineer
3 months ago
The Computer Vision Cloud Tools team at Magic Leap, Inc. is currently in search of a Senior Data Backend Engineer to join the team.
The Sr. Data Backend Engineer will design and develop highly scalable data processing pipelines and systems that provide insight into computer vision algorithms, deep learning models, and related techniques. You will work on cutting-edge computer vision problems that require unique software and data infrastructure solutions. The tools you develop will drive key engineering decisions and help guide the work of our Computer Vision and Machine Learning teams.
Qualified candidates need to be self-starters and able to operate in a highly dynamic environment.
- Build large scale distributed systems that leverage distributed content and data processing.
- Design and implement complex big data systems with a focus on collecting, parsing, cleaning, managing, analyzing and visualizing large sets of data to turn information into insights.
- Maintain a high level of data integrity and quality, and enforce security checks.
- Develop data pipelines and RESTful services that are distributed, robust and highly performant.
- Work with other data teams to integrate data from different sources into deep learning pipelines.
- Act as a subject matter expert and mentor junior developers.
- 5+ years of professional experience building software products.
- Experience integrating with a variety of SQL and NoSQL databases such as MySQL, PostgreSQL, MongoDB, Cassandra and Redis.
- Extensive experience in maintaining high data integrity and quality with relational databases.
- Strong knowledge in REST API design and message queues.
- Strong programming skills in one or more of Python / Java / Scala / C++.
- A proven track record of successful design and implementation of APIs and high-performance service-oriented architectures.
- Solid OOP and software design skills to create software that’s extensible, reusable and meets desired architectural objectives.
- Experience with Docker and container management and deployment.
- Comfortable with Linux, shell-scripting, and Git.
- Expertise in data warehousing solutions and proficiency in designing efficient, robust ETL workflows.
- Experience building large-scale data processing systems using MapReduce or frameworks such as Spark and Hive.
- Familiarity with message broker architectures such as RabbitMQ, ZeroMQ, and Kafka.
- Experience with processing terabytes or petabytes of data on a daily basis.
- Experience deploying and scaling high-traffic services in private and public clouds such as AWS and Google Cloud.
All your information will be kept confidential according to Equal Employment Opportunity guidelines.