And basically the next thing was that we generalized the useful notion of independence to something much more prevalent in nature and in the things we want to model: a weaker notion called conditional independence.
Okay, so here's the notion.
We have two sets of things — just think of single random variables: a random variable Z1, another random variable Z2, and a third one, Z. Then we can do independence-like reasoning simply by conditioning everything on Z; that doesn't destroy the independence property. So Z1 and Z2 are conditionally independent given Z if the multiplication property holds with both factors conditioned on Z: P(Z1, Z2 | Z) = P(Z1 | Z) · P(Z2 | Z).
And if they're conditionally independent given nothing — the empty set of variables — then this just collapses to the independence rule we had before.
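The multiplication property can be checked numerically. Below is a minimal sketch (not from the lecture), assuming a small hand-built joint distribution over three binary variables in which Z1 and Z2 interact only through Z:

```python
# Check conditional independence numerically: Z1 and Z2 are conditionally
# independent given Z iff P(Z1, Z2 | Z) = P(Z1 | Z) * P(Z2 | Z) for all values.
from itertools import product

# Joint built so that Z1 and Z2 only interact through Z (illustrative numbers):
p_z  = {0: 0.7, 1: 0.3}                              # P(z)
p_z1 = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.4, 1: 0.6}}    # P(z1 | z)
p_z2 = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.5, 1: 0.5}}    # P(z2 | z)

# Atomic events keyed by (z1, z2, z).
joint = {(z1, z2, z): p_z[z] * p_z1[z][z1] * p_z2[z][z2]
         for z1, z2, z in product([0, 1], repeat=3)}

def cond(event, given):
    """P(event | given); both are dicts mapping tuple index -> value."""
    num = sum(p for k, p in joint.items()
              if all(k[i] == v for i, v in {**event, **given}.items()))
    den = sum(p for k, p in joint.items()
              if all(k[i] == v for i, v in given.items()))
    return num / den

# Verify the multiplication property for every combination of values.
for z1, z2, z in product([0, 1], repeat=3):
    lhs = cond({0: z1, 1: z2}, {2: z})
    rhs = cond({0: z1}, {2: z}) * cond({1: z2}, {2: z})
    assert abs(lhs - rhs) < 1e-9
print("conditionally independent given Z")
```

If the joint were built with a direct interaction between Z1 and Z2, the assertion would fail for at least one value combination.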
And the idea here is that we often have conditional independence, but we don't very often have full independence.
And that's something we're going to build our world models on.
So we get all the rules for independence also generalized to conditional independence.
And when we go towards Bayesian networks, we will create world models, which we're going to write down graphically so that they're easier to manipulate by humans.
And there we write down conditional independence basically in this style: if toothache and catch are conditionally independent given cavity, I draw these dependency arrows.
These are the things we're really after. You can think of them as influence arrows: here I have an influence from cavity to catch.
If there is a cavity, the probe is more likely to catch; and if there is a cavity, there is more likely to be a toothache.
But just as important are the arrows you don't draw. Namely, there is no direct arrow between catch and toothache.
And of course, if they were really independent, which they aren't, then we would have nothing in between.
They would just be disconnected subgraphs. And that, of course, is very nice.
We know that if we can disconnect subgraphs, then everything becomes much easier.
Because the cost is exponential in the size of each subgraph, but not exponential in the size of the whole graph.
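The savings can be made concrete with a quick back-of-the-envelope count, assuming n binary variables (the numbers are illustrative, not from the lecture): one connected model needs a single joint table over all variables, while two disconnected subgraphs of n/2 variables each can be stored and processed separately.

```python
# Entry counts for joint-distribution tables over n binary variables.

def entries_connected(n):
    """One joint table over all n variables."""
    return 2 ** n

def entries_split(n):
    """Two disconnected subgraphs of n//2 variables, one table each."""
    half = n // 2
    return 2 ** half + 2 ** half

for n in (10, 20, 30):
    print(n, entries_connected(n), entries_split(n))
# e.g. for n = 20: 1,048,576 entries connected vs. 2,048 split
```

The gap widens exponentially with n, which is why disconnecting subgraphs matters so much in practice.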
OK. And given this here, this conditional independence, we do what we always do.
We do normalization and marginalization.
And that gives me things I can deal with, I can compute with, things I typically have, namely sums over atomic events.
And we use the chain rule. And the nice thing about this graphical representation here is that these conditional independence results here allow me to drop dependencies here.
So those things get much shorter.
And also, the graphical model gives me a way of determining an order of variables so that much drops out.
OK. And then we exploit conditional independence.
And essentially, the thing you want to remember is that for the conditional probability distribution of a variable,
of all the things it might depend on, I only need to look at its parents in the graph.
So if Xi were toothache, then I would only have to look at cavity.
And even if I had ordered catch before toothache, I wouldn't have to take it into account.
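As a sketch of how the chain rule shortens under conditional independence — using illustrative CPT numbers that are an assumption, not taken from the slides — the full chain-rule factor P(catch | cavity, toothache) reduces to P(catch | cavity), because cavity is catch's only parent:

```python
# Network: cavity -> toothache, cavity -> catch (no toothache-catch edge).
# Illustrative conditional probability tables (assumed, not from the lecture):
p_cavity = {True: 0.2, False: 0.8}
p_toothache_given_cavity = {True: 0.6, False: 0.1}   # P(toothache=T | cavity)
p_catch_given_cavity     = {True: 0.9, False: 0.2}   # P(catch=T | cavity)

def joint(cavity, toothache, catch):
    # Full chain rule would need P(catch | cavity, toothache); conditional
    # independence of catch and toothache given cavity drops toothache.
    pt = (p_toothache_given_cavity[cavity] if toothache
          else 1 - p_toothache_given_cavity[cavity])
    pc = (p_catch_given_cavity[cavity] if catch
          else 1 - p_catch_given_cavity[cavity])
    return p_cavity[cavity] * pt * pc

# Sanity check: the eight atomic events sum to 1.
total = sum(joint(cv, t, c) for cv in (True, False)
            for t in (True, False) for c in (True, False))
print(round(total, 10))
```

So instead of a table with one entry per combination of all preceding variables, each factor only needs one entry per parent configuration.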
So we went through an example.
So if we want to compute cavity given toothache and catch,
I don't have any remaining variables, so I don't have to marginalize, but directly get something like this.
Then I can use the chain rule.
And now the magic happens. I can actually drop the toothache because the parent of catch is cavity.
OK. You remember our graph. Cavity at the top and both of the others connected with that.
And then it's easy to compute.
And in the end, whatever we wanted here is the thing the dentist is really interested in.
If I have a patient with a toothache and my probe also catches,
then the probability is almost 90% that there is a cavity.
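That posterior can be recomputed in a few lines. The numbers below are the standard ones from Russell and Norvig's cavity example, which this lecture appears to follow — an assumption, not read off the slides:

```python
# P(cavity | toothache, catch) via the factored joint and normalization.
# Network: cavity -> toothache, cavity -> catch.
p_cavity = 0.2
p_toothache = {True: 0.6, False: 0.1}   # P(toothache=T | cavity)
p_catch     = {True: 0.9, False: 0.2}   # P(catch=T | cavity)

# Unnormalized posterior terms: P(cavity) P(toothache|cavity) P(catch|cavity)
num_true  = p_cavity * p_toothache[True] * p_catch[True]           # 0.2*0.6*0.9
num_false = (1 - p_cavity) * p_toothache[False] * p_catch[False]   # 0.8*0.1*0.2

posterior = num_true / (num_true + num_false)
print(round(posterior, 3))  # 0.871 — almost 90%, as in the lecture
```

Normalization replaces the need to ever compute the full joint: only the two atomic events consistent with the evidence are summed.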
And then they'll decide, well, I have to get out my other equipment.
Good. Any questions so far? Yes.
I have one very specific question about a slide, regarding the example way back at, I think,
the introduction of the conditional—
Yeah, I'm sorry.
Yes.
OK.
Yes.
That really looks like a copy and paste error.
I'll have a look at that and correct it. Would you send me an email?
Access: open access
Duration: 00:09:11 min
Recording date: 2021-03-30
Uploaded: 2021-03-31 10:37:55
Language: en-US
Recap: Conditional Independence
Main video on this topic: chapter 3, clip 13.