Google: You bring the artificial fish, we’ve got the intelligent chips

Google claims it has made a significant step towards delivering the reality of artificial intelligence after releasing details of its Tensor Processing Unit, a machine learning chip.

Jamie Davies

April 6, 2017


The search giant teased the industry with the initial announcement last year, though details were relatively thin. The latest shout-out is accompanied by a white paper, to be presented at a National Academy of Engineering meeting, claiming the TPU is 15x to 30x faster than contemporary GPUs and CPUs, as well as 30x to 80x more energy efficient in certain circumstances.

“The need for TPUs really emerged about six years ago, when we started using computationally expensive deep learning models in more and more places throughout our products,” said Norm Jouppi, Distinguished Hardware Engineer at Google. “The computational expense of using these models had us worried.

“If we considered a scenario where people use Google voice search for just three minutes a day and we ran deep neural nets for our speech recognition system on the processing units we were using, we would have had to double the number of Google data centres.”

Artificial intelligence, and in particular machine learning, is now playing a crucial role in the effectiveness of the Google business. More than 100 teams are using various machine learning components to power the Google engine, from Street View, to Inbox Smart Reply, to voice search. Just to put the challenge into perspective, Jouppi was worried about having to double data centre capacity; Google already operates 15 data centres around the world, and doubling that footprint would be an astronomical headache both practically and financially.

The Google business is reliant on the successful delivery of adverts, which increasingly have to be relevant to users' preferences, mood, activities and general state of play; artificial intelligence is key to making sense of all this data and creating a more personalized experience. The status quo had to change if Google was to meet the demands of advertisers and users while maintaining a level of profitability attractive to investors.

The TPU operates in the space Google describes as neural network inference: the prediction phase that follows training, where an already-trained model crunches incoming data to produce results. It's data heavy and very demanding from a performance perspective.
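To make "inference" concrete: it amounts to running new inputs through a network whose weights are already fixed. A minimal sketch in Python/NumPy (the layer sizes and random weights here are purely illustrative, not anything from Google's paper):

```python
import numpy as np

# Hypothetical two-layer network with fixed, already-trained weights.
# Inference is just the forward pass: no gradients, no weight updates.
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((256, 128)), np.zeros(128)   # hidden layer
W2, b2 = rng.standard_normal((128, 10)), np.zeros(10)     # output layer

def predict(x):
    """Run one input through the network and return class probabilities."""
    h = np.maximum(x @ W1 + b1, 0)        # ReLU activation
    logits = h @ W2 + b2
    e = np.exp(logits - logits.max())     # numerically stable softmax
    return e / e.sum()

x = rng.standard_normal(256)              # one incoming request
print(predict(x).argmax())                # predicted class index
```

The heavy lifting is in those matrix multiplies, which is precisely the work the TPU is built to accelerate.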

Last year, Jouppi wrote: “TPU is tailored to machine learning applications, allowing the chip to be more tolerant of reduced computational precision, which means it requires fewer transistors per operation. Because of this, we can squeeze more operations per second into the silicon, use more sophisticated and powerful machine learning models and apply these models more quickly, so users get more intelligent results more rapidly. A board with a TPU fits into a hard disk drive slot in our data centre racks.”
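The reduced-precision point Jouppi makes is essentially quantization: storing weights and activations as 8-bit integers rather than 32-bit floats, trading a little accuracy for far cheaper arithmetic. A rough sketch of a generic linear quantization scheme (not the TPU's actual format) looks like this:

```python
import numpy as np

def quantize(values, num_bits=8):
    """Map 32-bit floats onto signed integers using one shared scale factor."""
    qmax = 2 ** (num_bits - 1) - 1                        # 127 for int8
    scale = max(float(np.abs(values).max()), 1e-8) / qmax
    q = np.clip(np.round(values / scale), -qmax, qmax).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate floats from the integer representation."""
    return q.astype(np.float32) * scale

weights = np.random.randn(5).astype(np.float32)
q, scale = quantize(weights)
print(weights)                  # original 32-bit values
print(dequantize(q, scale))     # 8-bit approximation: close, not identical
```

Each 8-bit operation needs far fewer transistors than its 32-bit floating-point equivalent, which is how more operations per second get squeezed into the same silicon.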

With such bold claims, Google might be tempted to move into the licensing and hardware game, but we don't think this is going to be the case; it sits too far from the company's core business. Yes, Google is trying to diversify the technology in which it invests, but all of these examples can be tied back to data collection, ultimately feeding the artificial intelligence machine which is ever improving Google's ability to tell us where to eat lunch.

Perhaps the most significant takeaway from this move is the position it ultimately leaves Google in. Should the claims prove correct, the chip puts the search giant years ahead of the competition in a field which could decide the technology leaders of tomorrow.

If you fancy a bit of easy reading, you can find the complete study here.
