8 - 20.2. Unconditional Probabilities (Part 2) [ID:29048]

What we can do once, we can do a couple of times. So given a complete tabulation of all probabilities of a given random variable, we can, instead of writing them one below the other, write them into a vector.

So Weather is a variable that has one, two, three, four possible outcomes, so we can tabulate the whole thing in a vector. And note that there is a boldface P here, which denotes the probability distribution rather than single probabilities.
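
Spelled out, the vector notation looks like this (the four outcome names are the usual textbook ones, assumed here rather than read off the slide):

\[
\mathbf{P}(Weather) = \langle P(Weather{=}sunny),\, P(Weather{=}rain),\, P(Weather{=}cloudy),\, P(Weather{=}snow) \rangle
\]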

And very often we can do operations on the whole vector in a kind of linear-algebra way, and that makes writing down formulae very simple. In particular, we get around writing all kinds of sums and so on.
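
As a minimal sketch of this idea in code (the numbers are illustrative, not from the lecture; assuming NumPy):

```python
import numpy as np

# The outcome order fixes which vector entry means what.
outcomes = ["sunny", "rain", "cloudy", "snow"]

# Illustrative distribution vector P(Weather); values are made up.
P_weather = np.array([0.6, 0.1, 0.29, 0.01])
assert np.isclose(P_weather.sum(), 1.0)  # a distribution sums to 1

# Vector operations replace per-outcome bookkeeping: scale the whole
# distribution pointwise and renormalize, with no explicit sums written out.
weights = np.array([0.9, 0.5, 0.7, 0.2])  # hypothetical pointwise weights
posterior = (P_weather * weights) / (P_weather * weights).sum()
```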

Okay.

More notation for probability distributions.

If I have a set of random variables, we can assign probabilities to all of them at once, pointwise for each combination of outcomes. We are going to call this the joint probability distribution of this set of random variables. So if I have the joint probability distribution of Headache and Weather: boldface here means we tabulate whole distributions, and multiple arguments together of course mean a matrix.

So I need to have a value for the weather being sunny and having a headache, which is essentially the joint probability of the outcome little sunny, meaning Weather = sunny, together with Headache = true. And then you have not headache, which actually means the random variable Headache being false, and so on.
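
Written out in full (a reconstruction following the running example, not a copy of the slide), the matrix has one entry per combination of outcomes:

\[
\mathbf{P}(Headache, Weather) =
\begin{pmatrix}
P(headache \land sunny) & P(headache \land rain) & P(headache \land cloudy) & P(headache \land snow) \\
P(\neg headache \land sunny) & P(\neg headache \land rain) & P(\neg headache \land cloudy) & P(\neg headache \land snow)
\end{pmatrix}
\]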

And then of course there are a couple more values here, and you can probably guess why I've only given it two arguments.

Yes?

Exactly, writing down matrices is simple; writing down cubes or, you know, 17-dimensional hypercubes is really a lot of work. So that's why my examples, when I spell them out fully, tend to be two-dimensional.

Of course, if you want to program these things, a multi-dimensional array is not a problem at all, unless you want to look at it.
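
A sketch of that remark (the values and the axis order are illustrative assumptions):

```python
import numpy as np

# Illustrative joint distribution P(Headache, Weather): axis 0 indexes
# Headache in (true, false), axis 1 indexes Weather in
# (sunny, rain, cloudy, snow). Made-up values that sum to 1.
P_joint = np.array([
    [0.10, 0.05, 0.04, 0.01],   # headache
    [0.50, 0.05, 0.25, 0.00],   # not headache
])
assert np.isclose(P_joint.sum(), 1.0)

# One cell is one combination of outcomes, e.g. P(headache and sunny):
p_headache_sunny = P_joint[0, 0]

# A joint over n variables is simply an n-dimensional array; summing
# over an axis recovers a distribution over the remaining variables.
P_weather = P_joint.sum(axis=0)   # distribution over Weather alone
```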

You're already seeing we're in the vocabulary-learning part of the theory again, so I'm going to do a couple of definitions here, which we're going to use productively in the future.

So we give ourselves a set of random variables. I'm slowly drilling into the language of probability, which talks about the probabilities and probability distributions of events; and now I'm going to drill into the language in which we actually write down events.

Remember we had events of the form: random variable capital X has some outcome little x, the weather being sunny, say. That is an event we can write down; we've already seen that.

We can also have composite events that say, for instance, that I have a headache and also the weather is sunny: we can string together things that go over various random variables. The things we can write down in these joint probability distributions are actually what our events are. All of those, if you look at them, really are things that can become true or false.

So the natural idea here is to say that we have atomic events, which are value assignments to random variables. We've essentially seen these in constraint propagation already: we had a couple of slots, as they were called there; here they're called random variables, and the whole thing we think of as an assignment.
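
To make the analogy concrete (a sketch; the variables and domains are just the running example):

```python
# Domains of the random variables under consideration.
domains = {
    "Headache": [True, False],
    "Weather": ["sunny", "rain", "cloudy", "snow"],
}

# An atomic event is a total assignment: exactly one outcome per variable,
# just like a complete assignment in constraint solving.
atomic_event = {"Headache": True, "Weather": "sunny"}

assert atomic_event.keys() == domains.keys()
assert all(atomic_event[v] in domains[v] for v in domains)
```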

Part of chapter: Chapter 20. Quantifying Uncertainty
Access: open access
Duration: 00:23:25 min
Recording date: 2021-01-28
Uploaded: 2021-02-11 16:57:20
Language: en-US

Different probability distributions, probabilities of propositional formulas, and Kolmogorov's theorem.
