2017.05.05
Artificial intelligence chips are the cornerstone for the next wave of technology




In 1992, at the famous Bell Labs in the suburbs of New York, the scientist Yann LeCun and several of his fellow researchers designed a chip that could run deep neural networks.



Built around mathematical operations, the chip could learn on its own by analyzing large amounts of data. LeCun named the chip ANNA, but ANNA was never used on a large scale.



The neural network was good at recognizing hand-scrawled letters and numbers, but far less effective at other tasks.



Today, however, neural networks are rapidly reshaping the direction of Internet giants such as Google, Facebook and Microsoft.



LeCun is now the director of Facebook's artificial intelligence lab. Neural networks can now recognize faces and objects in photos, translate one language into another, and handle many other tasks.



Twenty-five years later, LeCun said the market is in great need of chips like ANNA, and that such chips will soon reach users.



Google has also recently built its own artificial intelligence chip, the TPU, which is deployed widely across the large data centers that power its online search empire.



In these data centers, the TPU works alongside other hardware on everything from recognizing voice commands spoken into Android phones to selecting Google search results.



However, this is just the beginning of a larger wave of technology.



CNBC reported last week that several veteran engineers from Google's TPU team are working on a similar chip at a secretive startup called Groq.



Meanwhile, other chip makers, including Intel, IBM and Qualcomm, are all moving in the same direction.



Google, Facebook and Microsoft can still run their neural networks on traditional standard computer chips, the familiar CPUs.



But the CPU is designed as a general-purpose processor, and it is not particularly efficient at this work. Specialized chips can run neural networks faster and with less energy, especially the huge volume of mathematical operations that artificial intelligence systems require.
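To make the point concrete, here is a minimal sketch in Python with NumPy (the layer sizes and names are illustrative assumptions, not any company's actual model) of the forward pass of a small neural network. Nearly all of the work is dense matrix multiplication, exactly the kind of arithmetic that TPUs and GPUs accelerate far more efficiently than a general-purpose CPU.

```python
# Minimal illustrative sketch, not production code from any company:
# the forward pass of a small fully connected network, showing that the
# dominant cost is dense matrix multiplication -- the workload that
# specialized AI chips are built to accelerate.
import numpy as np

rng = np.random.default_rng(0)

# Illustrative layer sizes: 1024-dim input -> two hidden layers -> 10 outputs.
sizes = [1024, 4096, 4096, 10]
weights = [rng.standard_normal((m, n)) * 0.01
           for m, n in zip(sizes[:-1], sizes[1:])]

def forward(x):
    """One forward pass; each matrix multiply below is the hot spot an AI chip targets."""
    for w in weights[:-1]:
        x = np.maximum(x @ w, 0.0)   # matrix multiply followed by ReLU
    return x @ weights[-1]           # final linear layer

batch = rng.standard_normal((256, sizes[0]))  # a batch of 256 inputs
print(forward(batch).shape)                   # -> (256, 10)
```

Running a network like this at data-center scale means repeating such multiplications billions of times, which is why purpose-built hardware pays off.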



Google has said that adopting the TPU saved it the cost of building roughly 15 additional data centers.



The current industry trend is that both Google and Facebook are pushing hard to run neural networks directly on mobile phones and VR headsets, eliminating the delay of sending images over long distances to remote data centers and back.



LeCun said that both Google and Facebook need artificial intelligence chips for personal devices, and that moving in this direction means developing distinct, specialized chips for different products in order to improve efficiency.



In other words, the potential artificial intelligence chip market is huge. This is why so many companies are moving in this direction.



Leading companies in artificial intelligence chips:



Intel Corporation is developing an artificial intelligence chip dedicated to machine learning after acquiring a startup called Nervana.



IBM is not far behind, having built a hardware architecture that mirrors the design of neural networks.



Qualcomm has also recently begun developing chips designed specifically to run neural networks.



LeCun is very familiar with Qualcomm's plans because Facebook is helping Qualcomm develop technology related to machine learning.



At the same time, NVIDIA is also deeply involved in this field.



Just last month, the Silicon Valley chip maker hired Clément Farabet, a well-known researcher in neural networks and artificial intelligence chips who studied under LeCun at New York University. After leaving NYU, Farabet founded the deep learning startup Madbits, which was acquired by Twitter in 2014.



NVIDIA is already the leading company in artificial intelligence hardware.



When large companies like Google and Facebook have wanted to use a neural network for tasks such as language translation, they have had to train it on NVIDIA's hardware to get the job done.



Facebook last week released a new augmented reality tool that requires a neural network to recognize the world around people.



But augmented reality systems cannot rely on distant data centers: it simply takes too long to send all the images there and back.



Facebook's chief technology officer, Mike Schroepfer, said the company has instead turned to GPUs and another kind of chip, the digital signal processor, so that the heavy image processing generated in virtual reality can be handled ever faster.



In the long run, progress in artificial intelligence and virtual reality will be inseparable from artificial intelligence chips. The market demand is there, and chip developers are moving in that direction.