Transfer learning: is deep learning a thing of the past?
Building an AI system is like building a spacecraft: besides sufficient fuel, a powerful engine is essential. Without fuel, the spacecraft cannot reach its intended orbit; without a strong enough engine, it cannot lift off at all. By analogy, the deep learning model is the engine and the mass of training data is the fuel, and both are equally indispensable to AI. At learning a precise mapping from inputs to outputs with neural networks, we have been getting better and better in recent years. Given a large number of labeled samples, this is no longer a problem, whether the task involves images, sentences, or label prediction.
But deep learning algorithms still fall short today: models struggle to generalize to new situations (data that differs from the training set), and the massive data needed to train them is hard to obtain.
Limitations of deep learning
With deep learning in full swing, anyone who speaks ill of it risks being attacked by readers who never get past the headline. Still, I will pluck up the courage to spell out the difficulties deep learning currently faces.
1. Limited expressive power. A model is, after all, a simplified reflection of reality; the more expressive the model, the more accurately the machine can capture reality. Machine learning describes the world with variables, and since the number of variables is finite, the expressive power of deep learning is finite too. Moreover, the demand for data grows with model size, yet high-quality data is scarce in practice. So on one side there is a limited amount of data, and on the other the complexity hidden in that data; deep learning is often still not expressive enough to describe it.
2. Lack of a feedback mechanism. Deep learning currently excels at image recognition, speech recognition, and similar problems, but it is not the best at everything, especially problems with delayed feedback such as robot control. Even AlphaGo's play is not deep learning doing everything: a large part of it is reinforcement learning, where the outcome is only known after the final move. Many other learning tasks cannot necessarily be accomplished by deep learning alone.
3. High model complexity. Below are some popular machine learning models and the amounts of data they need. As model complexity grows, the number of parameters and the amount of required data become staggering.
(Figure: popular machine learning models with their parameter counts and training data requirements)
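To make the parameter explosion concrete, here is a small sketch (my own illustration, not from the article; the layer widths are made-up examples) that counts the parameters of fully connected networks:

```python
# Each dense layer contributes in_features * out_features weights
# plus out_features biases; totals grow rapidly with width and depth.

def dense_params(n_in, n_out):
    """Parameter count of one fully connected layer (weights + biases)."""
    return n_in * n_out + n_out

def mlp_params(layer_sizes):
    """Total parameter count of an MLP given its layer widths."""
    return sum(dense_params(a, b) for a, b in zip(layer_sizes, layer_sizes[1:]))

print(mlp_params([784, 100, 10]))         # a small classifier: 79,510 params
print(mlp_params([784, 4096, 4096, 10]))  # a wider, deeper one: ~20 million
```

Merely widening the hidden layers from 100 units to 4096 multiplies the parameter count by roughly 250, and the data needed to fit those parameters grows with it.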
OK, from the description above, we can distill the key problems that traditional machine learning methods, deep learning included, must deal with:
1. As model complexity grows, the number of parameters becomes enormous.
2. Model generalization to new situations needs improvement.
3. Labeling the massive data needed to train a model is time-consuming and expensive.
4. Expressive power is limited, and there is no feedback mechanism.
Transfer learning helps with all of this: it makes your model small and light, killing several birds with one stone!
What exactly is transfer learning?
"You can never understand one language until you understand at least two."
Anyone who has learned a second language will likely sympathize with these words of the English writer Geoffrey Willans. But why? Because the process of learning and using a foreign language inevitably deepens a person's understanding of his mother tongue. Goethe, in fact, also recognized the power of this idea, which led him to a similar but more extreme assertion:
"He who does not know a foreign language knows nothing of his own."
This statement is striking, and, surprisingly, perhaps closer to the essence of the matter. Learning or improving one skill or mental function can positively influence other skills or mental functions; this is called transfer. It holds not only for human intelligence but for machine intelligence as well. Accordingly, transfer learning has become one of the fundamental research areas of machine learning, with broad potential in both theory and application.
Some may wonder how a computer-based learning system can exhibit transfer. Google explored this question with an experiment involving two machine learning systems; for simplicity, call them machine A and machine B. Machine A uses a freshly initialized DNN, while machine B uses a DNN that has already been trained to understand English. Now suppose we train both A and B on the same set of recordings paired with their Mandarin transcripts. What happens? Surprisingly, machine B (the one pre-trained on English) ends up with better Mandarin skills than machine A, because abilities acquired during its English training transfer to the Mandarin comprehension task.
Not only that, the experiment produced an even more striking result: machine B not only learned Mandarin better, it could still understand English, too! It seems Willans and Goethe had a point: learning a second language really can deepen the understanding of both languages, and machines are no exception.
This, in fact, is transfer learning in computers. Around us, examples of transfer abound: someone who plays the guitar learns the piano faster than someone with no musical background; a table tennis player picks up tennis more readily than someone with no racket experience; a cyclist quickly learns to ride an electric scooter. Transfer learning is all around you.
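The analogies above can be sketched in code. The following is a minimal NumPy toy of my own construction (not Google's actual system): a "pretrained" feature extractor is kept frozen, and only a small new head is trained on the target task, so far fewer parameters need updating.

```python
# Toy transfer-learning sketch: freeze a pretrained layer, train only
# a small logistic-regression head on the new task.
import numpy as np

rng = np.random.default_rng(0)

# Pretend these weights were learned on a large source task (task A).
W_pretrained = rng.normal(size=(2, 16))   # frozen: 32 parameters, never updated

def features(x):
    """Frozen pretrained layer: a fixed nonlinear projection."""
    return np.tanh(x @ W_pretrained)

# Toy target task (task B): classify points by the sign of x0 + x1.
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Only the new head (16 weights + 1 bias = 17 parameters) is trained.
w, b = np.zeros(16), 0.0
F = features(X)
for _ in range(300):                      # plain gradient descent
    p = 1 / (1 + np.exp(-(F @ w + b)))    # sigmoid head
    grad = p - y                          # gradient of the logistic loss
    w -= 0.5 * F.T @ grad / len(y)
    b -= 0.5 * grad.mean()

acc = ((F @ w + b > 0) == (y == 1)).mean()
print(f"trained parameters: {w.size + 1}, accuracy: {acc:.2f}")
```

The same pattern, scaled up, is what makes real transfer learning attractive: the expensive representation is learned once on the data-rich source task, and the target task only has to fit a lightweight head.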
The difference between transfer learning and traditional machine learning
The classic supervised learning setting in machine learning
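As a rough illustration of the setting this heading introduces (a toy 1-D example of my own, not from the article): classic supervised learning assumes training and test data come from the same distribution, and a model fitted under that assumption degrades when the target task shifts; that gap is what transfer learning aims to bridge.

```python
# Sketch: a classifier fit under the classic supervised assumption
# (train and test drawn from the same distribution) loses accuracy
# when evaluated on a shifted target task.
import numpy as np

rng = np.random.default_rng(1)

def sample_task(n, boundary):
    """1-D task: label is 1 when x exceeds `boundary`."""
    x = rng.normal(size=n) + boundary     # data centered on its boundary
    return x, (x > boundary).astype(int)

def fit_threshold(x, y):
    """Learn a decision threshold as the midpoint of the class means."""
    return (x[y == 1].mean() + x[y == 0].mean()) / 2

def accuracy(th, x, y):
    return ((x > th) == (y == 1)).mean()

x_src, y_src = sample_task(500, boundary=0.0)
th = fit_threshold(x_src, y_src)          # learned on the source task

x_same, y_same = sample_task(500, boundary=0.0)   # same distribution
x_new,  y_new  = sample_task(500, boundary=1.5)   # shifted target task

print(accuracy(th, x_same, y_same))  # high: assumption holds
print(accuracy(th, x_new,  y_new))   # much lower: assumption broken
```

Traditional supervised learning would retrain from scratch on the new task; transfer learning instead asks how much of what was learned on the source task can be reused.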