Welcome back to Deep Learning! Today we want to continue talking about graph convolutions, and in this second part we will see whether we have to stay in the spectral domain or whether we can also go back to the spatial domain. So let's look at what I have for you. You remember that we had a polynomial that we used to define a convolution in the spectral domain. We have seen that by computing the eigenvectors of the Laplacian matrix, we were able to find an appropriate Fourier transform that gives us a spectral representation of the graph configuration. Then we could do our convolution in the spectral domain and transform back. Now, this was very expensive, because we have to compute U, and for U we have to do the eigenvalue decomposition of this entire symmetric matrix. We have also seen that we cannot use the tricks of the fast Fourier transform, because its assumptions do not necessarily hold for our U^T.
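As a reminder, the spectral convolution from the previous part can be written as follows, where U is the matrix of eigenvectors of the graph Laplacian and Λ the diagonal matrix of its eigenvalues (a reconstruction in the notation used there):

    x \ast_G g = U \hat{g}(\Lambda) U^T x, \qquad \hat{g}(\Lambda) = \sum_{k=0}^{K} \theta_k \Lambda^k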
So how can we choose K and the θ coefficients in order to get rid of U? Well, if we choose K equal to 1, θ₀ equal to 2θ, and θ₁ equal to −θ, we get the following polynomial. We still have the configuration in which x is transformed into Fourier space, multiplied by our polynomial expressed as a matrix, and then multiplied by the inverse Fourier transform. Now let's look into the configuration of ĝ. ĝ can be expressed as 2θ times Λ to the power of 0, minus θ times Λ to the power of 1. Remember that Λ is a diagonal matrix, so we take every element to the power of 0, which gives an identity matrix; Λ to the power of 1 is just Λ. So we can express our complete matrix ĝ in the following way.
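Written out, this choice gives the filter matrix

    \hat{g}(\Lambda) = \theta_0 \Lambda^0 + \theta_1 \Lambda^1 = 2\theta I - \theta \Lambda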
Of course, we can then pull in our U from the left-hand side and the right-hand side, which gives us the following expression. Now we use the property that θ is actually a scalar, so we can pull it to the front. In the first term, Λ to the power of 0 is essentially just an identity matrix, so U times U^T remains and cancels out to the identity. In the second term, the Λ remains, but we can also pull out the θ. Now we can use the definition of the symmetric version of our graph Laplacian: we find exactly U Λ U^T in our equation, so we can replace it, and you see that U is suddenly gone. We pull out θ once more, and all that remains is two times the identity matrix minus the symmetric version of the graph Laplacian.
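Step by step, using that U is orthogonal (so U U^T = I) and that U Λ U^T is exactly the symmetric version L_sym of the graph Laplacian:

    U \hat{g}(\Lambda) U^T = 2\theta \, U \Lambda^0 U^T - \theta \, U \Lambda U^T = \theta \left( 2 U U^T - U \Lambda U^T \right) = \theta \left( 2I - L_{\mathrm{sym}} \right)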
If we now plug in the definition of the symmetric version in terms of the original adjacency matrix and the degree matrix, one of the identity matrices cancels out, and we finally get the identity plus D to the power of −0.5 times A times D to the power of −0.5.
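With L_{\mathrm{sym}} = I - D^{-1/2} A D^{-1/2} plugged in, the whole convolution collapses to

    x \ast_G g = \theta \left( 2I - I + D^{-1/2} A D^{-1/2} \right) x = \theta \left( I + D^{-1/2} A D^{-1/2} \right) x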
Remember that D is a diagonal matrix, so we can easily invert the elements on the diagonal, and we can also take the square root element-wise; this is perfectly fine.
So this way, U does not come up at all anymore, and we can express our entire graph convolution in this very nice way using the graph Laplacian matrix. Now let's analyze this term a little more. On the left-hand side, we convolve x in the spectral domain and construct ĝ as a polynomial of Laplacian filters. With the particular choice K = 1, θ₀ = 2θ, and θ₁ = −θ, the term suddenly depends only on the scalar value θ, and with all these tricks we got rid of the Fourier transform U^T.
So we can suddenly express graph convolutions in this simplified way. This is the basic graph convolution operation, and you can actually find it in reference [1]. You can essentially do this with a single scalar value: you take your degree matrix and plug it in here, you take your adjacency matrix and plug it in here, and then you optimize with respect to θ in order to find the weight of your convolution.
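To make this concrete, here is a minimal NumPy sketch of the operation θ(I + D^{-1/2} A D^{-1/2}) x. The function name and the toy graph are made up for illustration, and we assume every node has at least one neighbor; in [1], the scalar θ is additionally generalized to a learned weight matrix, and a renormalization trick is applied on top.

    import numpy as np

    def simple_graph_conv(X, A, theta):
        """Basic graph convolution: theta * (I + D^{-1/2} A D^{-1/2}) X."""
        n = A.shape[0]
        deg = A.sum(axis=1)                       # node degrees from the adjacency matrix A
        D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))  # D is diagonal: invert and take the
                                                  # square root element-wise on the diagonal
        A_norm = D_inv_sqrt @ A @ D_inv_sqrt      # D^{-1/2} A D^{-1/2}
        return theta * (np.eye(n) + A_norm) @ X

    # Tiny example: a triangle graph with one scalar feature per node.
    A = np.array([[0., 1., 1.],
                  [1., 0., 1.],
                  [1., 1., 0.]])
    X = np.array([[1.], [2.], [3.]])
    print(simple_graph_conv(X, A, theta=0.5))  # -> [[1.75], [2.], [2.25]]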
Well, now the question is: is it really necessary to motivate the graph convolution from the spectral domain? The answer is no, we can also motivate it spatially. So let's look at the following concept: for a mathematician, a graph is a manifold, just a discrete one. We can discretize the manifold and do the spectral convolution using the Laplacian matrix, and this is what led us to the spectral graph convolutions.
Access: Open Access
Duration: 00:10:45 min
Recorded: 2020-10-12
Uploaded: 2020-10-12 23:36:30
Language: en-US
Deep Learning - Graph Deep Learning Part 2
In this video, we demonstrate how to go from the spectral to the spatial domain in graphs.
Further Reading:
A gentle Introduction to Deep Learning
References
[1]: Kipf, Thomas N., and Max Welling. "Semi-supervised classification with graph convolutional networks." arXiv preprint arXiv:1609.02907 (2016).
[2]: Hamilton, Will, Zhitao Ying, and Jure Leskovec. "Inductive representation learning on large graphs." Advances in neural information processing systems. 2017.
[3]: Wolterink, Jelmer M., Tim Leiner, and Ivana Išgum. "Graph convolutional networks for coronary artery segmentation in cardiac CT angiography." International Workshop on Graph Learning in Medical Imaging. Springer, Cham, 2019.
[4]: Wu, Zonghan, et al. "A comprehensive survey on graph neural networks." arXiv preprint arXiv:1901.00596 (2019).
[5]: Bronstein, Michael, et al. Lecture "Geometric deep learning on graphs and manifolds" held as a SIAM Tutorial, Portland (2018).