7 - Recap Clip 3.7: Unconditional Probabilities (Part 1) [ID:30409]

Good.

So, into the math: we're basically recapping lots of concepts that you probably already know. We have probability theory, which is essentially the syntactic part, just like for propositional logic or something like that: we had a language and possibly inference procedures. And on the other hand, we always have models, which tell us what these languages actually mean. The setup here is completely analogous. We have a language part, which is probability theory, and it talks about random variables, events, and all those kinds of things. And we have probability models, which talk about things like sample spaces and probability functions and so on.

And just like for propositional logic or first-order logic, if we are lucky, those two sides actually do the same thing: inference on the language level, which is something we can program into our computer, does the same thing as the models would do. That is one of the things you are usually not told when you learn probability theory from a mathematician, but for us, I think it is important. We always think in terms of the models, because those are very similar to the world, or to our belief states: they talk about sample spaces, or possible worlds, and the likelihood of certain facts in those worlds.

What we are really going to do, however, is something different. We are not going to calculate in the model, because that would not be efficient. Instead, we are going to do inference at the language level, just like we did last semester. Models are big and complex, and you never know which one you are in. But you can do inference, and then you don't need all that model machinery.
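To make the two levels concrete, here is a minimal sketch (my own toy example, not from the lecture) of what "calculating in the model" means: explicitly enumerating a sample space and summing the probabilities of the outcomes that belong to an event. For realistic problems this enumeration is exactly what language-level inference lets us avoid.

```python
from fractions import Fraction

# Model-level view: an explicit sample space with a probability function.
# Toy example: two fair coin tosses.
sample_space = [("H", "H"), ("H", "T"), ("T", "H"), ("T", "T")]
P = {omega: Fraction(1, 4) for omega in sample_space}  # uniform probability function

def prob(event):
    """Probability of an event = sum of the probabilities of its outcomes."""
    return sum(P[omega] for omega in sample_space if event(omega))

# P("at least one heads"), computed by brute-force enumeration of the model.
p = prob(lambda omega: "H" in omega)
print(p)  # 3/4
```

The cost of `prob` grows with the size of the sample space, which is why, for anything beyond toy examples, one computes with the rules of probability theory instead of with the model itself.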

OK, we have talked about random variables as representations of processes we don't really understand. The only things we need to understand about them are the set of their possible outcomes and, eventually, the probabilities of those outcomes, which is something you can actually state at a summary level. So you are only interested in the long-run distribution of these outcomes, and that is what the prior probability gives us.
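As a sketch of that last point (my own illustration; the `weather` variable and its weights are hypothetical, not from the lecture): a random variable can be treated as a black box, and its prior distribution is just the long-run frequency of each outcome.

```python
import random
from collections import Counter

random.seed(0)  # fixed seed so the run is reproducible

# A "process we don't really understand", treated as a black box:
# all we care about is its set of outcomes and how often each occurs.
def weather():  # hypothetical random variable with three outcomes
    return random.choices(["sunny", "rain", "cloudy"], weights=[6, 1, 3])[0]

# Estimate the prior (unconditional) distribution P(Weather) from samples.
n = 100_000
counts = Counter(weather() for _ in range(n))
prior = {outcome: counts[outcome] / n for outcome in ["sunny", "rain", "cloudy"]}
print(prior)  # roughly {'sunny': 0.6, 'rain': 0.1, 'cloudy': 0.3}
```

Note that the summary-level description (the distribution) is all that survives: the mechanism inside `weather` never appears in the prior.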

Part of chapter: Recaps

Access: open access

Duration: 00:06:26 min

Recording date: 2021-03-30

Uploaded: 2021-03-31 10:26:37

Language: en-US

Recap: Unconditional Probabilities (Part 1)

Main video on this topic: chapter 3, clip 7.
