According to WIRED, the potential market for artificial intelligence (AI) chips is enormous, and companies like Google and Facebook are pioneering AI hardware for their own neural networks. Google has developed its own chips, known as tensor processing units (TPUs), designed specifically for neural networks, and companies such as Intel, Nvidia, and Qualcomm are also working on chips for neural networks. Such chips are essential for keeping power consumption and costs down. WIRED’s Cade Metz writes:

As Google’s TPUs have shown, dedicated AI chips can bring a whole new level of efficiency in data centers, especially as demand for image recognition services increases. In executing neural networks, they can burn less electrical power and generate less heat. “If you don’t want to boil a small lake, you might need specialized hardware,” LeCun quips.

Meanwhile, as virtual and augmented reality become more pervasive, phones and headsets will need similar chips. As Facebook explained last week in unveiling its new augmented reality tools, this kind of technology requires neural networks that can recognize the world around you. But augmented reality systems can’t afford to run this AI back in the data center. Sending all that imagery over the internet takes too long, ruining the effect of the simulated reality. As Facebook chief technology officer Mike Schroepfer explains, Facebook is already starting to lean on GPUs and other chips, called digital signal processors, for some tasks. But in the long run, devices will surely include an entirely new breed of chip. The need is there. And chipmakers are racing to fill it.

Read more here.

TensorFlow: Open source machine learning