
Generative learning algorithms

So far, we've mainly been talking about learning algorithms that model p(y | x; θ), the conditional distribution of y given x. For instance, logistic regression modeled p(y | x; θ) as h_θ(x) = g(θᵀx), where g is the sigmoid function. In these notes, we'll talk about a different type of learning algorithm.

Consider a classification problem in which we want to learn to distinguish between elephants (y = 1) and dogs (y = 0), based on some features of an animal. Given a training set, an algorithm like logistic regression or the perceptron algorithm (basically) tries to find a straight line—that is, a decision boundary—that separates the elephants and dogs. Then, to classify a new animal as either an elephant or a dog, it checks on which side of the decision boundary it falls, and makes its prediction accordingly.

Here's a different approach. First, looking at elephants, we can build a model of what elephants look like. Then, looking at dogs, we can build a separate model of what dogs look like. Finally, to classify a new animal, we can match the new animal against the elephant model, and match it against the dog model, to see whether the new animal looks more like the elephants or more like the dogs we had seen in the training set.

Algorithms that try to learn p(y | x) directly (such as logistic regression), or algorithms that try to learn mappings directly from the space of inputs X to the labels {0, 1} (such as the perceptron algorithm), are called discriminative learning algorithms. Here, we'll talk about algorithms that instead try to model p(x | y) (and p(y)). These algorithms are called generative learning algorithms. For instance, if y indicates whether an example is a dog (0) or an elephant (1), then p(x | y = 0) models the distribution of dogs' features, and p(x | y = 1) models the distribution of elephants' features.

After modeling p(y) (called the class priors) and p(x | y), our algorithm can then use Bayes rule to derive the posterior distribution on y given x:

p(y | x) = p(x | y) p(y) / p(x).

Here, the denominator is given by p(x) = p(x | y = 1) p(y = 1) + p(x | y = 0) p(y = 0) (you should be able to verify that this is true from the standard properties of probabilities), and thus it can also be expressed in terms of the quantities p(x | y) and p(y) that we've learned. Actually, if we're calculating p(y | x) in order to make a prediction, then we don't need to calculate the denominator, since

arg max_y p(y | x) = arg max_y p(x | y) p(y) / p(x) = arg max_y p(x | y) p(y).
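To make this concrete, here is a minimal sketch in Python of how a generative model makes a prediction by maximizing p(x | y) p(y). The priors, the one-dimensional class-conditional densities, and the names prior, cond, and predict are all made up for illustration; they are not part of any particular library or of the method described above.

```python
from scipy.stats import norm

# Minimal sketch with made-up numbers: once p(y) and p(x | y) have been
# modeled, a generative classifier predicts arg max_y p(x | y) p(y),
# so the denominator p(x) never has to be computed.
prior = {0: 0.5, 1: 0.5}                 # p(y): class priors
cond = {0: norm(loc=0.0, scale=1.0),     # p(x | y = 0), e.g. "dog" features
        1: norm(loc=3.0, scale=1.0)}     # p(x | y = 1), e.g. "elephant" features

def predict(x):
    # Score each class by p(x | y) p(y) and return the arg max.
    scores = {y: cond[y].pdf(x) * prior[y] for y in prior}
    return max(scores, key=scores.get)

print(predict(0.4))  # -> 0
print(predict(2.5))  # -> 1
```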

Gaussian discriminant analysis

The first generative learning algorithm that we'll look at is Gaussian discriminant analysis (GDA). In this model, we'll assume that p(x | y) is distributed according to a multivariate normal distribution. Let's talk briefly about the properties of multivariate normal distributions before moving on to the GDA model itself.

The multivariate normal distribution

The multivariate normal distribution in n dimensions, also called the multivariate Gaussian distribution, is parameterized by a mean vector μ ∈ R^n and a covariance matrix Σ ∈ R^(n×n), where Σ ⪰ 0 is symmetric and positive semi-definite. Also written “N(μ, Σ)”, its density is given by:
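p(x; μ, Σ) = 1 / ( (2π)^(n/2) |Σ|^(1/2) ) exp( −(1/2) (x − μ)ᵀ Σ⁻¹ (x − μ) ),

where “|Σ|” denotes the determinant of the matrix Σ.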
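As a sanity check on this formula, the following sketch (assuming NumPy and SciPy are available; the helper name gaussian_density and all the numbers are made up for illustration) evaluates the density directly from the formula and compares it against scipy.stats.multivariate_normal.

```python
import numpy as np
from scipy.stats import multivariate_normal

def gaussian_density(x, mu, Sigma):
    """Evaluate the N(mu, Sigma) density at x straight from the formula.

    Assumes Sigma is symmetric positive definite (so that it is invertible).
    """
    n = mu.shape[0]
    diff = x - mu
    norm_const = 1.0 / ((2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(Sigma)))
    # diff @ solve(Sigma, diff) computes (x - mu)^T Sigma^{-1} (x - mu).
    return norm_const * np.exp(-0.5 * diff @ np.linalg.solve(Sigma, diff))

# Made-up 2-D example.
mu = np.array([1.0, 0.0])
Sigma = np.array([[1.0, 0.5],
                  [0.5, 2.0]])
x = np.array([0.5, -0.5])

print(gaussian_density(x, mu, Sigma))                   # direct evaluation
print(multivariate_normal(mean=mu, cov=Sigma).pdf(x))   # same value via SciPy
```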






Source:  OpenStax, Machine learning. OpenStax CNX. Oct 14, 2013 Download for free at http://cnx.org/content/col11500/1.4
