12 - Logic-Based Natural Language Semantics (LBS WS2024/25) [ID:55971]

Okay.

Any problems with the quiz?

Any questions about the exam?

We're nearing the end of the lectures.

So if you come up with problems, either use the Matrix channel or questions or ask them in class, just like any other question.

Righto.

So Zoomies, you can hear me and you can see the slides. Is that correct?

I think I might have to address that.

Yes.

Henry alone.

Okay, good. So we've been talking about discourse representation theory.

The main idea: the word discourse means more than one sentence.

You can kind of think of everything we did before as one-sentence linguistics.

Now we turn to three-sentence linguistics.

But we're looking at all kinds of data where we have multiple sentences here too, mostly.

The idea is that we have these dynamic effects that are mostly centered around anaphora.

Accessibility of things introduced by indefinite nominals: a man, Peter, a book, all of those kinds of things.

And whether they're available for anaphoric reference or not. Those are the predictions this makes.
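As a toy illustration of that prediction, here is a sketch of my own (with made-up names, not code from the lecture): a referent introduced by an indefinite at the top level is accessible to a later pronoun, while a referent introduced inside a negated sub-DRS is not.

```python
# Toy sketch (illustrative, not the lecture's code) of DRT's
# accessibility prediction: top-level referents are available to
# later pronouns; referents under negation are not.

def accessible_referents(drs):
    # Only the top-level referents of the main DRS are accessible
    # to a following sentence; referents trapped inside a negated
    # sub-DRS never reach this list.
    return list(drs["referents"])

# "A man entered."            -> x is introduced at the top level
s1 = {"referents": ["x"], "conditions": ["man(x)", "enter(x)"]}

# "Peter doesn't own a car."  -> y lives inside the negated sub-DRS
s2 = {"referents": [], "conditions": [
        ("not", {"referents": ["y"],
                 "conditions": ["car(y)", "own(peter, y)"]})]}

print(accessible_referents(s1))  # ['x']  -- "He smiled." is fine
print(accessible_referents(s2))  # []     -- "*It is red." is out
```

So the follow-up pronoun in "He smiled." finds a referent, while the one in "*It is red." does not, which is exactly the contrast the theory is built to predict.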

And of course, you can imagine that if we extend this to event semantics, like we've done with the verb modifiers, then you can have anaphora to events as well.

The witness said she saw a man being hit over the head with a crowbar or something like that.

And then you can have an anaphoric reference saying that's what's supposed to happen.

And if you think about it, the that here is really anaphoric to an event.

All of those things we can now do if we basically collect the things we've learned so far.

And so I'm not going to go into this in our example.

And the main thing is that while we're doing stuff with these boxes, we're trying to understand this in terms of logic, and what the predictions really are that this makes.

And we've looked at merging of the DRSs as essentially sentence composition.
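That merge operation can be sketched like this (an illustrative simplification of my own, assuming a DRS is just a set of referents plus a set of conditions):

```python
# Minimal sketch (not the lecture's implementation) of DRS merge:
# sentence composition is the union of the referent sets and the
# union of the condition sets.

def merge(drs1, drs2):
    return {"referents": drs1["referents"] + drs2["referents"],
            "conditions": drs1["conditions"] + drs2["conditions"]}

# "A man sleeps."  -- the indefinite introduces x
s1 = {"referents": ["x"], "conditions": ["man(x)", "sleep(x)"]}
# "He snores."     -- the pronoun is resolved to x
s2 = {"referents": [],    "conditions": ["snore(x)"]}

print(merge(s1, s2))
# {'referents': ['x'], 'conditions': ['man(x)', 'sleep(x)', 'snore(x)']}
```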

We've looked at accessibility, which has this weird and wonderful definition by a sub DRS.

And we've looked at translation into first-order logic, which is relatively straightforward, except for the case of dynamic implication.
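The translation clauses look roughly like this (the usual textbook formulation; the notation on the lecture slides may differ): top-level referents become existential quantifiers, but in dynamic implication the referents of the antecedent are quantified universally over the whole conditional.

```latex
[\,x_1,\dots,x_n \mid C_1,\dots,C_m\,]^{\mathit{fo}}
  \;=\; \exists x_1 \dots \exists x_n\,
        \bigl(C_1^{\mathit{fo}} \land \dots \land C_m^{\mathit{fo}}\bigr)

\bigl([\,x_1,\dots,x_n \mid C_1,\dots,C_m\,] \Rightarrow K\bigr)^{\mathit{fo}}
  \;=\; \forall x_1 \dots \forall x_n\,
        \bigl(C_1^{\mathit{fo}} \land \dots \land C_m^{\mathit{fo}}
              \;\rightarrow\; K^{\mathit{fo}}\bigr)
```

On the donkey sentence "If a farmer owns a donkey, he beats it", the second clause yields the intended reading: for all x and y, if farmer(x), donkey(y), and own(x, y), then beat(x, y).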

And we've basically then looked at this picture, which I want to very briefly talk about again, namely, we have a good kind of compositional story about how sentences are composed.

And then there's another story about what any box means and kind of that gives us this spanning thing.

And then you realize that sentence composition at the first-order logic level is nonstandard.

We don't quite know how.

And all of these people, they're very, very proud of this.

We've invented a new representational level on which certain things work and the logic has to follow.

You had a question.

All the examples, it seems to me, are all stative.

Like, someone was sleeping, so there's no ordering involved.

But what if there comes a point where John was crying, and Peter pushed him?

Yes. Well, we need tense for that.

We've been very brief on tense.

But that is not a problem.

Okay.

See, it's simpler to kind of have linear cause and effect.

In any case, cause and effect is not something we're modeling terribly well in logic, generally.

So it's more correlations than causes.

There's also abduction.

We've been mostly using deduction as a kind of a forward-looking thing, but you can also use abduction.

Everybody familiar with the notion of abduction?

This is the best way of creating a silent lecture room by asking, do you know this and that?

The ones who do don't want to say, yeah, I know this; it's hard.
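For those who aren't familiar with it, here is a minimal contrast between the two, as a toy sketch of my own over a single made-up rule:

```python
# Toy contrast (illustrative sketch) between deduction and abduction
# over one rule: rain -> wet_street.

rules = {"rain": "wet_street"}  # premise -> conclusion

def deduce(fact):
    """Forward-looking: from an established premise, infer the
    conclusion the rule licenses."""
    return rules.get(fact)

def abduce(observation):
    """Backward-looking: from an observed conclusion, hypothesize a
    premise that would explain it. The result is a candidate
    explanation, not a guaranteed truth."""
    return [p for p, c in rules.items() if c == observation]

print(deduce("rain"))        # 'wet_street'
print(abduce("wet_street"))  # ['rain'] -- a hypothesis, not a proof
```

Deduction is truth-preserving; abduction is not, which is why it only yields candidate explanations that still have to be checked.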

Accessible via: Open access
Duration: 01:31:42 min
Recording date: 2025-01-15
Uploaded: 2025-01-15 20:46:05
Language: en-US
Tags: language computational logic