---
title: One-Shot Learning
---

# One-Shot Learning

Humans learn new concepts with very little need for repetition – for example, a child can generalize the concept
of a “monkey” from a single picture in a book, yet our best deep learning systems need hundreds or
thousands of examples to recognize an object with even decent accuracy. This motivates the setting we are interested in: “one-shot” learning, which
consists of learning a class from a single labelled example (or very few).

There are various approaches to one-shot learning, such as [similarity functions](https://www.coursera.org/lecture/convolutional-neural-networks/one-shot-learning-gjckG) and
[Bayes' theorem](https://www.youtube.com/watch?v=FIjy3lV_KJU), and DeepMind has developed its own neural-network architectures built around the one-shot learning approach.
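
To make the similarity-function approach concrete, here is a minimal sketch of one-shot classification as nearest-neighbour search over embeddings. The helper names `embed` and `one_shot_classify` are made up for illustration, and the placeholder embedding (flattened, normalised pixels) stands in for what would be a trained encoder, such as a Siamese network, in a real system.

```python
import numpy as np


def embed(image):
    """Stand-in embedding: flatten and L2-normalise the pixels.

    In a real one-shot system this would be a trained encoder
    (e.g. a Siamese convolutional network) that maps images to vectors
    where same-class images land close together.
    """
    v = np.asarray(image, dtype=np.float64).reshape(-1)
    return v / (np.linalg.norm(v) + 1e-12)


def one_shot_classify(query, support_set):
    """Label the query with the class of the most similar support example.

    `support_set` maps each class label to its single labelled example
    (the "one shot"). Similarity is the cosine similarity of embeddings.
    """
    q = embed(query)
    best_label, best_sim = None, -np.inf
    for label, example in support_set.items():
        sim = float(np.dot(q, embed(example)))
        if sim > best_sim:
            best_label, best_sim = label, sim
    return best_label, best_sim


# Toy usage: one random 8x8 "image" per class and a slightly noisy query.
rng = np.random.default_rng(0)
support = {"monkey": rng.random((8, 8)), "banana": rng.random((8, 8))}
query = support["monkey"] + 0.05 * rng.random((8, 8))
print(one_shot_classify(query, support))  # -> ('monkey', <similarity>)
```
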
### More information:
* [Siraj Raval on YouTube](https://www.youtube.com/watch?v=FIjy3lV_KJU&feature=youtu.be)
* [Andrew Ng (Deeplearning.ai)](https://www.coursera.org/lecture/convolutional-neural-networks/one-shot-learning-gjckG)
* [Scholarly article](http://web.mit.edu/cocosci/Papers/Science-2015-Lake-1332-8.pdf)
* [Wikipedia](https://en.wikipedia.org/wiki/One-shot_learning)