Monday, May 23, 2016

Google is building its own chip. What does Intel do now?

Google wants to take AI to new heights, and for that it needs chips that draw less power and get more work done in less time. But the influence of this chip reaches far beyond the Google empire itself: commercial chipmakers such as Intel and NVIDIA can sense the threat, and the feeling only grows stronger once they see Google's vision of the future. Urs Hölzle, who oversees Google's global data center network, said the company will develop more chips like this in the future. Google will not sell the chips to other companies, so it does not compete directly with Intel or NVIDIA. But Google runs enormous data centers and is by far the largest potential customer of both companies. At the same time, as more and more businesses adopt the cloud computing services Google offers, they will buy fewer and fewer servers (and chips) of their own, dealing a further blow to the chip market.

The TPU: a chip designed for machine learning

In fact, Google is promoting the new chip as a selling point of its cloud AI services: businesses and programmers can tap Google's AI engine through a cloud service and integrate it into their own applications. Google is pitching its AI capabilities hard to other companies, claiming that its AI has the best hardware support, hardware that no one else has. The new chip is called the Tensor Processing Unit, or TPU, a name chosen because it accelerates TensorFlow, the software engine that drives Google's deep neural networks (a deep neural network is software that learns specific tasks by analyzing large amounts of data; a short code sketch at the end of this post shows what a basic TensorFlow computation looks like). Other technology giants run their deep neural networks on graphics processing units (GPUs), chips designed to render images for games and other graphics-intensive applications, which also happen to suit the class of computation that drives deep neural networks. But Google says its new chip is even more efficient at that class of computation. According to Google, because the TPU is tailored specifically for machine learning, it needs fewer transistors per operation, which means the chip can execute more operations per second.

Is the GPU on its way out?

For the time being, Google will keep using both TPUs and GPUs for its deep neural networks. Hölzle declined to explain in detail how Google uses the TPU, saying only that the TPU handles "part of the computation" behind voice recognition on Android phones. But he said Google will publish a paper explaining the benefits of the TPU, and that Google will keep devising other chips to advance machine learning. By the looks of it, the GPU may eventually be squeezed out. "The GPU is already fading a bit," Hölzle said. "The GPU is too general-purpose, not targeted at machine learning. It was never designed for machine learning." NVIDIA would not be pleased to hear that. As the world's largest GPU maker, NVIDIA is pushing to expand its business into AI. As Hölzle noted, NVIDIA's latest GPU comes in a model aimed specifically at machine learning. But clearly, Google is hoping for even bigger gains.

The FPGA: a programmable chip

Meanwhile, other companies, most notably Microsoft, are exploring a different chip: the field-programmable gate array, or FPGA, a chip that can be reprogrammed to perform specific tasks. Microsoft has been testing FPGAs for machine learning, and Intel recently acquired an FPGA vendor. Some analysts believe this is the smarter approach.
Patrick Moorhead, president and chief analyst at Moor Insights and Strategy, which follows the chip business closely, said FPGAs offer more flexibility. To him, Google's new TPU seems to go a bit too far, because a chip like this takes at least six months to develop, and in such a competitive market, where the largest Internet companies scramble for every advantage, six months is a long time. But Google does not need that kind of flexibility; what matters most to it is speed. Asked why Google built the chip from scratch instead of using an FPGA, Hölzle replied: "Because it is much faster."
The core business

Hölzle also pointed out that the chip does not replace the CPU, the central processing unit at the heart of a computer server. Google's data centers still need CPUs to run tens of thousands of machines, and CPUs are Intel's main business. But since Google is willing to develop its own chip just for one area, AI, people will naturally assume it will eventually develop its own CPUs as well. Hölzle played down that possibility. "You want to solve the problems that are still unsolved," he said, meaning the CPU is already a very mature technology with no pressing problem left to improve. He also said Google hopes for healthy competition in the chip market; in other words, it wants to be able to buy from many sellers rather than just one. After all, more competition means lower prices for Google. Hölzle explained that Google is working with the OpenPower Foundation precisely to widen its options, since OpenPower chip designs can be used and modified by anyone. That approach poses a potential threat to the world's largest chip makers. Shane Rau, an analyst at research firm IDC, estimates that about 5% of all server CPUs sold worldwide are bought by Google. Over the past year, Google bought roughly 1.2 million chips, most of which likely came from Intel. Whatever Google plans to do on the CPU side, the company will keep studying chips particularly suited to machine learning. Really figuring out what works and what does not may take years, since neural networks themselves keep evolving. "We have been studying this," Hölzle said. "I don't know what the final answer is." Of course, the world's chip makers will certainly be watching closely as Google learns.
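
As promised above: the TPU takes its name from TensorFlow, the engine behind Google's deep neural networks, and for readers who have not seen TensorFlow, here is a minimal sketch of what such a computation looks like, a single dense layer with a ReLU activation. It is written against the TensorFlow 1.x-style Python API of the era; the shapes, names, and values are purely illustrative and are not anything Google has published about the TPU itself.

    import numpy as np
    import tensorflow as tf

    # Define the computation as a graph of tensor operations: y = relu(xW + b).
    x = tf.placeholder(tf.float32, shape=[None, 4], name="inputs")  # a batch of input vectors
    W = tf.Variable(tf.random_normal([4, 3]), name="weights")       # layer weights
    b = tf.Variable(tf.zeros([3]), name="bias")                     # layer bias
    y = tf.nn.relu(tf.matmul(x, W) + b)

    # Run the graph in a session. Matrix-heavy workloads like this are the kind
    # of computation that GPUs, and now the TPU, accelerate in Google's data centers.
    with tf.Session() as sess:
        sess.run(tf.global_variables_initializer())
        activations = sess.run(y, feed_dict={x: np.random.rand(2, 4).astype(np.float32)})
        print(activations)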
