So, constraint propagation.
As I said, we have two techniques: inference and decomposition. Let's start with inference.
The question here is: can we add additional constraints to our constraint satisfaction
problem such that we don't actually lose any of the valid solutions?
Why might that be a good idea? After all, if I just add additional constraints, am I not
making the problem harder in some sense?
Yes, and most importantly, partial assignments would conflict with those added constraints early.
That's exactly the point.
You're not ruling out any of the full solutions, but by adding additional constraints, you're
already ruling out partial assignments early.
So you can basically eliminate whole branches of your search tree without going down to the full
depth before you realize that a branch cannot lead to a solution.
So the whole idea here with inference is that we try to add additional constraints that
don't lose any solutions, which just means we get an equivalent constraint satisfaction
problem, but we have a tighter description.
As you mentioned, it's more constraining and hence we can rule out a number of partial
assignments, which speeds us up.
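As a toy illustration (hypothetical, not from the lecture): take variables X, Y, Z over {1, 2, 3} with constraints X < Y and Y < Z. Every solution also satisfies the implied constraint X < Z, so adding it loses nothing, but it rejects partial assignments such as {X: 2, Z: 2} before Y is even considered. A minimal sketch in Python:

```python
# Hypothetical toy CSP: X, Y, Z over {1, 2, 3} with X < Y and Y < Z.
constraints = [("X", "Y"), ("Y", "Z")]
# Inference adds the implied constraint X < Z: a tighter but equivalent description.
inferred = constraints + [("X", "Z")]

def consistent(assignment, constraints):
    """Check a (partial) assignment against every '<' constraint
    whose variables are both already assigned."""
    return all(assignment[a] < assignment[b]
               for a, b in constraints
               if a in assignment and b in assignment)

partial = {"X": 2, "Z": 2}
print(consistent(partial, constraints))  # True  -> this branch would still be explored
print(consistent(partial, inferred))     # False -> the branch is pruned early
```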
The second thing we can do is try to decompose our constraint network into several
smaller constraint networks that can be solved efficiently.
And then, from the solutions of the separate components, we can build up one overall solution.
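A minimal sketch of this idea, assuming constraints are represented just by their scopes (the variables they mention) and that a solver for small networks is passed in (both hypothetical):

```python
from collections import defaultdict

def connected_components(variables, scopes):
    """Group variables that are linked, directly or transitively, by some constraint."""
    adj = defaultdict(set)
    for scope in scopes:
        for a in scope:
            adj[a] |= set(scope) - {a}
    seen, components = set(), []
    for v in variables:
        if v in seen:
            continue
        comp, stack = [], [v]
        seen.add(v)
        while stack:
            u = stack.pop()
            comp.append(u)
            for w in adj[u] - seen:
                seen.add(w)
                stack.append(w)
        components.append(comp)
    return components

def solve_by_decomposition(variables, scopes, solve_component):
    """Solve each independent component separately and merge the results."""
    solution = {}
    for comp in connected_components(variables, scopes):
        sub = solve_component(comp)  # small network, hence cheap to solve
        if sub is None:
            return None              # one unsolvable component dooms the whole CSP
        solution.update(sub)         # components share no constraints, so merging is safe
    return solution
```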
Right.
So our agenda for this chapter is basically to look at how we can do this.
With respect to inference, there are two things we can use.
The simplest and most naive way to add inference is forward checking.
I think you've already done that on Thursday, so we'll briefly go over it again.
Forward checking is basically the thing you always want to do: it's comparatively cheap,
and essentially every other inference technique is at least as smart as forward checking.
So it's a good starting point.
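A minimal sketch of forward checking for binary constraints, assuming constraints are given as a dict mapping an ordered variable pair to a predicate (a hypothetical representation, not the lecture's notation):

```python
def forward_check(var, value, domains, constraints, assignment):
    """After assigning `var = value`, prune values from the domains of
    unassigned neighbors that are inconsistent with this assignment.
    Returns the prunings (to undo on backtracking) or None on a dead end."""
    pruned = []
    for (a, b), pred in constraints.items():
        if a == var and b not in assignment:
            other, ok = b, lambda v: pred(value, v)
        elif b == var and a not in assignment:
            other, ok = a, lambda v: pred(v, value)
        else:
            continue
        for v in list(domains[other]):
            if not ok(v):
                domains[other].remove(v)
                pruned.append((other, v))
        if not domains[other]:
            return None  # some neighbor has no value left: backtrack
    return pruned
```

On backtracking, the recorded (variable, value) pairs are put back into their domains before the next value is tried.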
And the one thing you probably want to use in practice is a concept called arc consistency.
I don't think you've talked about that yet, so we're going to do it today.
It's basically an advanced version of forward checking that doesn't only use variables
whose values you've already fixed in your search: it lets you do inference even on
variables you haven't assigned values to yet.
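A minimal sketch of the standard AC-3 procedure for enforcing arc consistency, again with binary constraints as a dict from ordered pairs to predicates, where both directions of each constraint are listed (hypothetical setup):

```python
from collections import deque

def ac3(domains, constraints):
    """Make every arc (x, y) consistent: delete each value of x that has no
    supporting value left in y's domain. Note this prunes the domains of
    *unassigned* variables, which is what goes beyond forward checking."""
    queue = deque(constraints)  # all arcs; both (x, y) and (y, x) must be present
    while queue:
        x, y = queue.popleft()
        pred = constraints[(x, y)]
        removed = {vx for vx in domains[x]
                   if not any(pred(vx, vy) for vy in domains[y])}
        if removed:
            domains[x] -= removed
            if not domains[x]:
                return False  # a domain ran empty: the network is inconsistent
            # x's domain shrank, so every arc pointing into x must be rechecked
            queue.extend((z, w) for (z, w) in constraints if w == x and z != y)
    return True
```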
Regarding decomposition, there's one particular technique we will talk about, called cutset
conditioning, which exploits the fact that constraint networks are easy to
solve if they don't have any cycles.
So the idea is to assign values to a suitable set of variables, the cutset, such that
the remaining variables form an acyclic constraint graph.
And once you have that, you can solve the rest rather efficiently.
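A minimal sketch of cutset conditioning, with the cutset given and two hypothetical helpers supplied from outside: a local consistency check for the cutset assignment and an efficient solver for the remaining acyclic (tree-structured) network:

```python
from itertools import product

def cutset_conditioning(variables, domains, cutset, consistent, solve_tree_csp):
    """Enumerate all assignments to the cutset; each one leaves an acyclic
    network over the remaining variables, which the tree solver handles
    efficiently (time linear in the number of remaining variables)."""
    rest = [v for v in variables if v not in cutset]
    for values in product(*(domains[v] for v in cutset)):
        assignment = dict(zip(cutset, values))
        if not consistent(assignment):
            continue  # this cutset assignment already violates a constraint
        sub = solve_tree_csp(rest, domains, assignment)
        if sub is not None:
            return {**assignment, **sub}  # combine cutset and tree solutions
    return None  # no cutset assignment extends to a full solution
```

The cost is exponential only in the size of the cutset, so the technique pays off when a small set of variables breaks all cycles.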