Okay, good morning everybody.
I hope everybody online can hear me.
I'm recording again, just so you're aware.
And I want to congratulate you because basically you can program now.
Everything we've covered so far enables you to do some basic programming.
I'm not saying that you can write perfect programs, or that everything you write will be very good.
I'm just saying you know everything that's necessary to at least look up in the docs
how to write your programs, and, just as relevant, to understand other people's code.
You now know how variables work, how data types work, basically how a very basic computer
works, some very basic data structures such as lists, errors and so on, and classes and functions.
So we've covered all of that.
That is almost universal across all imperative programming languages.
Yeah, you might still come across things you don't understand, but with what we have
covered so far, you are in a very good position to write your own programs.
It will still probably take a while to write really good programs.
Today we're discussing a somewhat auxiliary topic, which is related to writing good programs.
I will focus exclusively on program efficiency, the complexity of programs, and look a little
bit into how complexity is actually measured.
If you can even call it measuring: what you actually do is make estimates. We will also
look into various types of algorithmic complexity, the basic classes of algorithms which you
will encounter all the time.
What this lecture will give you is intuition: when you look at a new problem, you should
be able to estimate how expensive it will be to compute.
Can I compute this within a reasonable time?
Is this a fairly quick thing, or does this particular program require either
very big compute resources or a very long time on your laptop?
The key takeaway from these lectures is that you get a basic understanding
of what is fast and what is slow.
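To make that concrete, here is a minimal sketch (my own illustration, not from the lecture; the function names and the input size are made up): the same question answered by two algorithms with very different costs.

```python
import time

def has_duplicate_quadratic(items):
    # Compare every pair: roughly n * n steps, slow for large inputs.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_linear(items):
    # One pass with a set: roughly n steps, fast even for large inputs.
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False

data = list(range(10_000))  # no duplicates, so both run to the end

for f in (has_duplicate_linear, has_duplicate_quadratic):
    start = time.perf_counter()
    f(data)
    print(f.__name__, f"{time.perf_counter() - start:.3f}s")
```

On a typical laptop the linear version finishes in milliseconds, while the quadratic one already takes noticeably longer, and the gap widens rapidly as the input grows.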
What we are going to discuss mainly revolves around measuring the order of growth of an algorithm.
That means most of the time we are not doing exact measurements here.
We could, but the big topic will be something we call big O notation, which is just
a notation used to characterize certain types of asymptotic growth of algorithms.
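As a small worked example of what the notation keeps and what it throws away (again my own sketch, not from the lecture):

```python
def sum_list(items):
    total = 0        # 1 step
    for x in items:  # the loop body runs n times
        total += x   # 1 step per iteration
    return total     # 1 step

# Exact count: roughly n + 2 steps for a list of length n.
# Big O drops the constants and keeps only the dominant term: O(n).
# Whether it is n + 2 or 3n + 5, both are linear: double the input
# and the work doubles, and that is exactly what the notation captures.
```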
Then we'll discuss the basic complexity classes, which give you the tools to categorize, say,
a new algorithm, or to estimate a collection of algorithms run one after the other in a
consolidated way, so you get an idea of what the bottleneck is and how long the entire
thing will take to execute.
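Here is a quick sketch of how that works when steps run one after the other (my own example, a hypothetical pipeline, not from the lecture):

```python
def pipeline(items):
    items = sorted(items)  # step 1: sorting, O(n log n)
    total = sum(items)     # step 2: one linear pass, O(n)
    # step 3: compare all pairs, O(n^2)
    close_pairs = [(a, b) for a in items for b in items if abs(a - b) < 1]
    return total, close_pairs

# Run sequentially, the costs add up:
#   O(n log n) + O(n) + O(n^2) = O(n^2)
# The slowest step dominates: the pairwise comparison is the bottleneck,
# and speeding up the sort would barely change the total runtime.
```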
Now some of you might say: what for?
I don't care.
Computers always get faster.
I have a program that's probably slow today, but let's just wait a couple of years,
following Moore's law, which is probably not true anymore.
But what we've experienced over the last 30 years or so is that computers keep getting faster
and faster, and we could just wait until the algorithm you wrote executes in maybe half the time.
I can tell you now it's a bad idea, because unfortunately the amount of data we're looking
at is probably growing even faster than our compute power.
So waiting probably doesn't help; your algorithm may well always be slow.
Even if you wait five years, the data will have grown by whatever factor, and so you would
now be expected to process six billion more entries with your slow algorithm.