Hi, we are going to discuss one proposal for a solution to the test exam, and we're starting with question one, which is about explicit reconstruction of a solution for an inverse problem. The inverse problem we're looking at is of the following form: a given datum equals a matrix A applied to an unknown parameter vector (u1, u2, u3), and of course the goal is to infer the parameter u from the data. The first part of this question is to compute the SVD of A, and as usual we start by multiplying A transposed with A. So this is the product of A^T, which has rows (1, 1), (1, 0), (0, -1), times A, which has rows (1, 1, 0) and (1, 0, -1), and if you do the math this gets you to the matrix A^T A with rows (2, 1, -1), (1, 1, 0), (-1, 0, 1).
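If you want to double-check this product numerically, a minimal sketch in Python, assuming the 2-by-3 matrix A read off above, would be:

```python
import numpy as np

# The forward matrix of the exam problem (2x3), as read off above.
A = np.array([[1, 1,  0],
              [1, 0, -1]])

# A^T A should come out as [[2, 1, -1], [1, 1, 0], [-1, 0, 1]].
print(A.T @ A)
```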
The next step is to find the eigenvalues and eigenvectors of this matrix A^T A. So what we do is compute the determinant of A^T A minus lambda times the identity matrix, which is the determinant of the matrix with rows (2 - lambda, 1, -1), (1, 1 - lambda, 0), (-1, 0, 1 - lambda). You can compute this: the first diagonal gives (2 - lambda) times (1 - lambda) squared, then plus 0 plus 0, minus (1 - lambda) from this diagonal here, minus 0, and minus another (1 - lambda), so I'm going to put a 2 in front. Now we can pull out the common factor of (1 - lambda), which leaves us with (2 - lambda)(1 - lambda) minus 2, all multiplied by (1 - lambda). Multiplying out, (2 - lambda)(1 - lambda) is 2 - 3 lambda + lambda squared; if you subtract the 2, you can pull out a factor of minus lambda, and what remains is 3 - lambda.
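Written out as a display, the expansion just described reads:

```latex
\begin{aligned}
\det(A^\top A - \lambda I)
  &= (2-\lambda)(1-\lambda)^2 - (1-\lambda) - (1-\lambda) \\
  &= (1-\lambda)\bigl[(2-\lambda)(1-\lambda) - 2\bigr] \\
  &= (1-\lambda)\bigl(\lambda^2 - 3\lambda\bigr) \\
  &= -\lambda\,(1-\lambda)\,(3-\lambda).
\end{aligned}
```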
So this means that the three eigenvalues of this matrix A^T A are lambda 1 = 3, lambda 2 = 1 and lambda 3 = 0, and this in turn means that the singular values of A are the square root of 3 and 1. We can immediately write down the diagonal rectangular matrix Sigma: it has the same shape as A and has those singular values on the diagonal. Let's just look at the shape of A: it is 2 by 3, so that is the shape Sigma has to have, with sqrt(3) and 1 on its diagonal, and this is Sigma.
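As a sanity check on the singular values and on the 2-by-3 shape of Sigma, one can compare against numpy's SVD, again assuming the A from above:

```python
import numpy as np

A = np.array([[1, 1,  0],
              [1, 0, -1]])

# numpy returns the singular values in descending order; they should be sqrt(3) and 1.
s = np.linalg.svd(A, compute_uv=False)
print(s, np.sqrt(3))

# Sigma has the same 2x3 shape as A, with the singular values on its diagonal.
Sigma = np.zeros_like(A, dtype=float)
Sigma[0, 0], Sigma[1, 1] = np.sqrt(3), 1.0
print(Sigma)
```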
Next we have to compute the eigenvectors of A^T A. So what is the kernel of A^T A minus 3 times the identity matrix? This is the kernel of the matrix with rows (-1, 1, -1), (1, -2, 0), (-1, 0, -2). Let's call these rows 1, 2 and 3; if you look closely, you can see that the first row can be written as one half the third row minus one half the second row, so we can just drop the first row and the kernel stays the same. So this is the kernel of the matrix with rows (1, -2, 0) and (-1, 0, -2), and for convenience I'm going to swap the sign on the last row, making it (1, 0, 2). A short pondering of this kernel shows that x1 - 2 x2 has to be 0 and x1 + 2 x3 has to be 0; if you combine this into a linear system, you can figure out really quickly that the kernel has to be the span of the following vector: (2, 1, -1). Now you can pick your first right singular vector v1, which should be a normalized representative of this span, so you can take either plus or minus this vector scaled by its inverse norm. I'm going to take the representative (2, 1, -1) divided by the square root of 6, which is precisely the norm of this vector.
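The same kernel computation can be reproduced symbolically; a minimal sketch, assuming sympy is available, looks like this:

```python
import sympy as sp

ATA = sp.Matrix([[ 2, 1, -1],
                 [ 1, 1,  0],
                 [-1, 0,  1]])

# Kernel of A^T A - 3I: sympy returns a basis of the nullspace,
# which should be a multiple of (2, 1, -1).
ker = (ATA - 3 * sp.eye(3)).nullspace()
v1 = ker[0] / ker[0].norm()   # normalized representative, (2, 1, -1)/sqrt(6) up to sign
print(v1.T)
```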
Next, A^T A minus 1 times the identity matrix: its kernel is the kernel of the matrix with rows (1, 1, -1), (1, 0, 0) and (-1, 0, 0). You can see that this is the same as the kernel you get by dropping the third row, for example, and subtracting 1 times the second row from the first row, so it will look like the rows (0, 1, -1) and (1, 0, 0). As you can see, the first component has to be 0 and the second and third components have to be equal to each other, so this is the span of the vector (0, 1, 1). You choose any representative of the span which is normalized, so v2 is 1 over the square root of 2 times (0, 1, 1); fairly easy.
The third right singular vector is the following: it just spans the kernel of A^T A itself, which is the kernel of the matrix with rows (2, 1, -1), (1, 1, 0) and (-1, 0, 1). Again, if you call these the first, the second and the third row, you see that the first row is equal to the second row minus the third row, so you can drop the first row, for example; the kernel doesn't change by removing it, and we are left with the rows (1, 1, 0) and (-1, 0, 1). So you can see the first and the second component have to be equal but with opposite signs, and the first and the third entry have to be equal. This immediately gives you all scalar multiples of (1, -1, 1), so you pick again a normalized representative to fix your third right singular vector, which would be 1 over the square root of 3 times (1, -1, 1).
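To close the loop, here is a small numerical check, again assuming the A from above, that the three right singular vectors found here are orthonormal eigenvectors of A^T A with eigenvalues 3, 1 and 0:

```python
import numpy as np

A = np.array([[1, 1,  0],
              [1, 0, -1]])
ATA = A.T @ A

V = np.column_stack([
    np.array([2,  1, -1]) / np.sqrt(6),   # v1, eigenvalue 3
    np.array([0,  1,  1]) / np.sqrt(2),   # v2, eigenvalue 1
    np.array([1, -1,  1]) / np.sqrt(3),   # v3, eigenvalue 0
])

print(np.allclose(V.T @ V, np.eye(3)))                     # columns are orthonormal
print(np.allclose(ATA @ V, V @ np.diag([3.0, 1.0, 0.0])))  # A^T A V = V diag(3, 1, 0)
```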