11 - Logic-Based Natural Language Semantics (LBS WS2024/25) [ID:55900]

Okay. Zoomies, can you hear me and can you also see the blackboard?

Yes, we can.

Excellent. I'll do my best to write to the right. So I hope you all had a very relaxing

holiday, took a break from studying, and are refreshed and eager to learn.

In case you have forgotten everything during this time, which I certainly have,

let me refresh. Okay, so what we did, we started with thinking about what modeling natural language

actually means. And we looked at kind of the philosophy of science as a background,

just to give ourselves a little bit of vocabulary. And we then went on to look at one kind of

logical model because logical models have the advantage that they're very crisp, they're

relatively small. We can look into how they work. Right? This idea that we take language,

do unspeakable things to it with the model, and then this model makes predictions.

Something of the type: listeners who hear this kind of story say that this kind of statement

must be true in the situation described by the story. Okay. It's all about making predictions.

Now the problem with this is that unlike math or so where we have objects that have very few

properties, natural language is something from out there. It has lots of properties and it's very

irregular. Language, you could say, looks like this amorphous blob of things with unclear borders.

So what scientific theories tend to do, and we also tend to do, is we look at certain phenomena

in isolation and that takes the form of a fragment of language.

This is supposed to be less amorphous and with very clear borders and so on. And we define that

by a grammar. And then we translate this into a formal language, a logic. And then we're on

safe ground. Because that's something we understand. It's just moving symbols across the

board and using inference procedures to make predictions. And the idea is that we kind of parcel out

the space of all natural language with these fragments and then hopefully be able to combine

them by logic engineering and so on. And we've done that for four fragments.

Kind of extending our understanding of natural language semantics region by region as we go along.

And then we kind of tacked on something we didn't quite call fragment five. I don't even know why.

And there the idea was, and we had a couple of ideas in between. One is that if we want to deal

with these three problems: we have the ambiguity problem. We have an utterance, a piece

of text maybe or a piece of speech. And then if that has multiple readings or semantics,

then we think of ambiguity. We have the same thing with multiple utterances

that correspond to one semantics. And we kind of have the problem that if we have

kind of an utterance like this, we might just have what is called the composition problem.

Now the composition problem we took a very specific approach to. And we said composition is something

we use the lambda calculus for. Because the simple lambda calculus, which is a very

nice little logic, actually allows us to shuffle things around. You can think of it as an extremely

simple and limited programming language. Just kind of logic internal. We're making use of this.

That allows us to do things like

the linguistic quantifier "every", which appears in sentences like "every student sleeps".

We can make this λP. λQ. ∀x. P(x) ⇒ Q(x). If we apply this guy here

to student and sleeps, that actually beta-reduces to ∀x. student(x) ⇒ sleeps(x),

which is exactly what we want.
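This determiner meaning can be sketched in a few lines of Python, evaluating the quantifier over a small finite model. This is just an illustration; the domain, individuals, and predicate extensions below are invented, not from the lecture.

```python
# A tiny model: individuals are strings; a predicate's meaning is its
# extension, i.e. the set of individuals it holds of. All names invented.
domain = {"alice", "bob", "carol"}
student = {"alice", "bob"}            # extension of "student"
sleeps = {"alice", "bob", "carol"}    # extension of "sleeps"

# "every" as lambda P. lambda Q. forall x. P(x) => Q(x),
# with implication written as (not P(x)) or Q(x)
every = lambda P: lambda Q: all((x not in P) or (x in Q) for x in domain)

# Applying it to student and then sleeps beta-reduces to
# forall x. student(x) => sleeps(x):
print(every(student)(sleeps))   # prints True: every student sleeps here
```

Note that `every(student)` on its own is still a function waiting for the verb meaning, which mirrors how the lambda term consumes its two arguments one at a time.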

Okay, that's one of the things we did. And then there are all kinds of niggly things, like

where in the grammar we don't want to allow "didn't didn't", and we have to take care of

finite verbs and all of those kinds of things. There's a lot of detail which really doesn't matter

that much. We want to somehow get from abstract syntax trees into some kind of logic and we are

willing to extend the logic as we go along. And in this case, we extended it by lambdas

to get the composition problem and the compositionality in this mapping under control.

And the last thing we looked at was things like "Peter hit the cat with a broom in the bathroom"

and so on, which really had the problem that hitting, which we think about the transitive verb,

can be kind of decorated with additional information.
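One standard way to model such decoration, sketched here in Python, is event semantics in the Davidsonian tradition: the verb introduces an event, and each modifier phrase adds one more conjunct about that event. Whether the course takes exactly this route is not stated in the recap, and all names below are illustrative.

```python
# Davidsonian-style sketch: a hitting event carries one extra conjunct
# per optional modifier. Names and roles are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class Event:
    predicate: str        # e.g. "hit"
    agent: str
    patient: str
    modifiers: dict = field(default_factory=dict)   # instrument, location, ...

def add_modifier(e: Event, role: str, value: str) -> Event:
    """Each modifier phrase contributes one more conjunct, e.g. instrument(e, broom)."""
    e.modifiers[role] = value
    return e

e = Event("hit", agent="peter", patient="cat")
e = add_modifier(e, "instrument", "broom")
e = add_modifier(e, "location", "bathroom")
# Roughly: exists e. hit(e) & agent(e, peter) & patient(e, cat)
#          & instrument(e, broom) & location(e, bathroom)
```

The appeal of this shape is that the transitive verb's meaning stays fixed while modifiers can be conjoined on in any number and order.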

Accessible via: Open access
Duration: 01:28:05 min
Recording date: 2025-01-08
Uploaded: 2025-01-08 13:16:04
Language: en-US
Tags: language computational logic