Date: December 9, 2016
- In recent years, advanced machine learning techniques have enabled computers to recognize objects in images, understand commands from spoken sentences, and translate written language.
- These advances will create demand for small, energy-efficient computers optimized for machine learning, especially in portable devices.
- As a result, chipmakers like Intel, graphics powerhouse Nvidia, mobile computing kingpin Qualcomm, and a number of startups are racing to develop specialized hardware to make modern deep learning significantly cheaper and faster.
- Electric carmaker Tesla Motors announced in October that all of its vehicles will be equipped with computing systems for autonomous driving, using Nvidia hardware to run neural networks that process camera and radar input.
- Graphcore is developing chips it calls “intelligent processing units,” which CEO Nigel Toon says are designed from the ground up with deep learning in mind.