Deep Learning Engineer, Habana Labs

Austin, TX 78701
Job Description

Habana Labs is an innovative company focused on developing purpose-built AI processors, disruptive solutions that will shape the future of AI and Deep Learning computing. Our vision to take AI processing from its current limits to the peak of its potential continues. We are looking for exceptionally smart and driven people who believe AI will change the world and would like to join us on our exciting journey.

We are searching for Deep Learning Engineers to join our team. In this role you will plan, design, prototype, and develop tools, techniques, and models for distributed and federated learning and more.

You will leverage and develop deep learning frameworks; apply techniques for large-scale distributed learning and state-of-the-art algorithms and models; exploit distributed accelerator architectures; and utilize data loading, augmentation, and pipelining techniques. You will work cross-functionally with engineers and product leaders across teams to help define and prioritize problem requirements.

You will be part of a small but growing team of highly motivated, self-starting, hardworking, and collaborative people who are the primary drivers for raising the quality of our products and services. You must have excellent communication and organizational skills, and the ability to stay focused on completing tasks and meeting goals in a busy workplace. You should have a desire to learn and to stay current with industry trends so you can make process improvement recommendations to the team as needed.


Minimum Requirements:

BS in Computer Science, Computer/Electrical Engineering, Physics, or a related field with 4+ years of relevant experience.

2+ years of experience in the following:
Deep learning models in domains such as NLP, vision, or speech recognition.
Deep learning frameworks such as PyTorch, TensorFlow, etc.
Programming languages such as Python, Scala, Go, C++.
Virtualization/systems such as Docker and Linux.

Preferred Requirements:

2+ years' experience with deep learning frameworks such as PyTorch, TensorFlow, etc.
Experience with orchestrators such as Kubernetes or Slurm.
Experience with distributed systems/HPC, Horovod, MPI, Distributed Deep Learning.
Understanding of CPU and GPU architectures, numeric libraries, modular software design.

Inside this Business Group

The Data Center Group (DCG) is at the heart of Intel's transformation from a PC company to a company that runs the cloud and billions of smart, connected computing devices. The data center is the underpinning for every data-driven service, from artificial intelligence to 5G to high-performance computing, and DCG delivers the products and technologies, spanning software, processors, storage, I/O, and networking solutions, that fuel cloud, communications, enterprise, and government data centers around the world.

Posting Statement

All qualified applicants will receive consideration for employment without regard to race, color, religion, religious creed, sex, national origin, ancestry, age, physical or mental disability, medical condition, genetic information, military and veteran status, marital status, pregnancy, gender, gender expression, gender identity, sexual orientation, or any other characteristic protected by local law, regulation, or ordinance.
