The future of artificial intelligence also depends on hardware breakthroughs
The hype around artificial intelligence has been all about algorithms. In machine learning, Google's DeepMind recently published a paper describing how AlphaGo Zero taught itself Go from scratch and, using an advanced reinforcement learning algorithm, defeated every previous version of the program. Yet while companies and organizations compete for the top talent in algorithm design and data science, the real news is coming not from the world of bits but from the world of wires, silicon, and electronics: hardware is back.
The flattening of Moore's law
In addition, machine learning applications, especially recognition tasks such as understanding speech and images, demand massive parallel processing. When Google announced that its algorithm could recognize pictures of cats, what it did not mention was that the software needed 16,000 processors to do so. That is not a big problem if you can run your algorithms on cloud servers, but what if you have to run them on a mobile device? This is increasingly an important industry demand: running advanced machine learning algorithms on the terminal brings great advantages to the user and sidesteps many data privacy issues. Imagine if Siri did not need the cloud and could process all data and run all algorithms on the smartphone's own hardware. But if your smartphone already gets hot after a few minutes of Minecraft, you will be waiting a long time for a truly personalized, on-device Siri.
Tackling the bottleneck problem
The reason devices get hot, and the main problem with our current computer hardware design, is the so-called "von Neumann bottleneck": in the classic computer architecture, data processing and data storage are separated, which means that during a computation data must constantly be transferred from one unit to the other. Parallelism alleviates the problem locally through concurrent computation and distributed processing, but in the end you still have to move the data and combine it all into the desired output. So what if there were a way to eliminate the hardware bottleneck entirely? What if processing and data lived in the same place, with nothing to move, generating little heat and consuming far less energy? After all, that is how our brain works: unlike a computer, we do not have separate areas for processing data and storing data. Everything happens in our neurons.
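One way to see why moving data, rather than doing arithmetic, becomes the limiting factor is to estimate a workload's "arithmetic intensity": operations performed per byte shuttled between memory and the processor. The sketch below is a back-of-the-envelope illustration of that idea for a square matrix multiply; the function name and the 8-bytes-per-value assumption are mine, not from the article.

```python
def arithmetic_intensity(n: int, bytes_per_value: int = 8) -> float:
    """Rough FLOPs-per-byte estimate for an n x n matrix multiply.

    The multiply performs about 2*n^3 floating-point operations, while
    roughly 3*n^2 values (two input matrices read, one output written)
    must cross the memory-processor boundary at least once.
    """
    flops = 2 * n ** 3
    bytes_moved = 3 * n ** 2 * bytes_per_value
    return flops / bytes_moved

# The ratio grows only linearly with n, so workloads with little data
# reuse leave the processor idle, waiting on memory traffic:
print(arithmetic_intensity(8))     # low reuse: memory-bound
print(arithmetic_intensity(4096))  # high reuse: compute-bound
```

The same accounting explains the heat problem: every byte moved across the memory bus costs energy, and for low-reuse workloads that movement, not the arithmetic, dominates the power budget.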
Intel's neuromorphic chip Loihi
Looking to how the brain works is nothing new in artificial intelligence research; it inspired deep neural networks, in which machine learning algorithms and parallel processing mimic the function of neurons. But what if our computers could work like our brains at the hardware level? Since the 1970s people have imagined such an approach: mapping brain function onto hardware, in other words, modeling the brain's structure directly in silicon. This approach, known as "neuromorphic computing," has finally begun to be commercialized: Intel and Qualcomm have recently announced neuromorphic chips intended for commercial use.
Neuromorphic chips can power AI applications on terminal devices, which is exciting news in itself. More importantly, they may lift machine intelligence to a new level. By carrying out machine cognition in electronic hardware rather than software, we may be able to realize the vision of general AI and invent genuinely intelligent systems.
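Neuromorphic chips such as Loihi compute with spiking neurons, in which memory (the membrane state) and processing live in the same unit. A rough software sketch of that idea is the textbook leaky integrate-and-fire model below; the constants are illustrative, not Loihi's actual parameters.

```python
def lif_neuron(inputs, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire: state and computation share one place.

    The membrane potential decays ("leaks") each step, accumulates the
    incoming current, and emits a spike (1) when it crosses the
    threshold, after which it resets to zero.
    """
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0  # reset after firing
        else:
            spikes.append(0)
    return spikes

print(lif_neuron([0.4, 0.4, 0.4, 0.4, 0.0]))  # → [0, 0, 1, 0, 0]
```

Note that no data ever leaves the neuron: its state is updated in place as inputs arrive, which is exactly the property that lets neuromorphic hardware avoid the von Neumann shuttling described above.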
Quantum: the big bang of computing
But the real big bang in computing will come not from neuromorphic chips, which for all their potential may find only niche applications, but from applications of quantum physics. As the demand for fast computing grows, so does our ambition to tackle genuinely hard problems. What if we could compute the best way to arrange a series of molecules in order to develop a cure for cancer? In practice that problem reduces to a combinatorial search, and today it is stalled by trial-and-error methods. Classical computing cannot cope: after a few iterations, the combinations of parameters explode. A quantum computer could in principle evaluate all possible combinations at once and arrive at the right answer in seconds. Many similar optimization problems could be handled by quantum computing: optimizing the allocation of resources in a complex business, making predictions that support the best economic strategy, or factoring the large numbers used in cryptography.
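The "explosion" described above is easy to quantify: a brute-force search over n binary design choices must examine 2**n configurations, doubling with every added parameter. The sketch below makes that concrete; the cost function is purely hypothetical, standing in for whatever a real molecular or business objective would score.

```python
from itertools import product

def brute_force_best(n_choices, cost):
    """Exhaustively score every combination of n binary choices."""
    return min(product([0, 1], repeat=n_choices), key=cost)

# Toy cost function (illustrative only): reward alternating choices.
cost = lambda bits: sum(abs(a + b - 1) for a, b in zip(bits, bits[1:]))

print(brute_force_best(3, cost))  # feasible: only 2**3 = 8 combinations
# At n = 60 there are 2**60 combinations, far beyond classical brute
# force; this is the regime where quantum approaches are hoped to help
# for certain structured problems.
print(2 ** 60)
```

The exhaustive loop is exactly the "trial and error" the article mentions; quantum algorithms do not literally try everything in parallel, but for some problem structures they can find answers with exponentially fewer steps than this search.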
IBM's quantum computer
Quantum computers are developing rapidly: we are now at the 50-qubit level. Let's put that number in perspective. A 32-qubit quantum computer can handle 4 billion coefficients and 265 GB of information. You might say that is not impressive, because you could run a similar program on a laptop in a few seconds. But once
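The "4 billion coefficients" figure follows from how a quantum state is described: n qubits require 2**n complex amplitudes, so the classical memory needed to simulate one doubles with every added qubit. A quick check of that arithmetic is below; the 16-bytes-per-amplitude figure assumes double-precision complex numbers, which is a common convention, not necessarily the encoding behind the article's 265 GB figure.

```python
def statevector_size(n_qubits, bytes_per_amplitude=16):
    """Amplitudes and bytes needed to store an n-qubit state classically.

    An n-qubit quantum state is a vector of 2**n complex amplitudes;
    16 bytes per amplitude assumes double-precision complex numbers.
    """
    amplitudes = 2 ** n_qubits
    return amplitudes, amplitudes * bytes_per_amplitude

amps, nbytes = statevector_size(32)
print(amps)              # 4294967296 -- the "4 billion coefficients"
print(nbytes / 2 ** 30)  # GiB required; doubles with each extra qubit
```

This doubling is why the jump from 32 to 50 qubits matters so much: each additional qubit doubles the classical resources needed to keep up, so simulation quickly becomes impossible for any conventional machine.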