Hello and welcome to this presentation covering our contribution to ICASSP 2021.
My name is Andy Regensky and I'm covering our paper, A Novel Viewport-Adaptive Motion
Compensation Technique for Fisheye Video, on behalf of my co-authors Christian Herglotz
and André Kaup.
We are with the Chair of Multimedia Communications and Signal Processing at Friedrich-Alexander
University in Germany and are happy to present the results of our research here.
Fisheye lenses are frequently used in applications where large fields of view should be captured
using a single camera, such as in video surveillance, autonomous driving, or consumer action cameras.
However, these large fields of view are only possible by heavily bending incoming light
rays towards the image sensor such that strong radial distortions occur.
These distortions lead to straight lines in the real world appearing bent on the image sensor,
resulting in considerably degraded performance of classical block matching motion estimation
and compensation, which has been developed with perspective lenses in mind.
To combat this, a state-of-the-art projection-based approach for fisheye video has been
proposed by Eichenseer et al. in 2019.
It performs the translatory motion in a perspective domain instead of applying the motion
directly to the fisheye domain pixels.
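To make this projection-based idea more concrete, here is a minimal sketch of how a motion vector could be applied in the perspective domain rather than directly on fisheye pixels. It assumes the equidistant fisheye model (radial distance r = f * theta for incident angle theta) and a known focal length f; the calibration and projection function used in the actual paper may differ, and all function names are purely illustrative.

```python
import numpy as np

def fisheye_to_perspective(p_fisheye, f):
    """Map a fisheye pixel (relative to the image center) to the perspective image plane,
    assuming the equidistant model r = f * theta."""
    p_fisheye = np.asarray(p_fisheye, float)
    r_f = np.linalg.norm(p_fisheye)
    if r_f == 0:
        return np.zeros(2)
    theta = r_f / f                      # incident angle of the light ray
    r_p = f * np.tan(theta)              # perspective radius for the same angle
    return p_fisheye * (r_p / r_f)

def perspective_to_fisheye(p_persp, f):
    """Inverse mapping back to the fisheye image plane."""
    p_persp = np.asarray(p_persp, float)
    r_p = np.linalg.norm(p_persp)
    if r_p == 0:
        return np.zeros(2)
    theta = np.arctan(r_p / f)
    return p_persp * (f * theta / r_p)

def projection_based_shift(p_fisheye, mv, f):
    """Apply a translational motion vector in the perspective domain and
    return the resulting position in fisheye coordinates."""
    p_persp = fisheye_to_perspective(p_fisheye, f)
    return perspective_to_fisheye(p_persp + np.asarray(mv, float), f)
```

In this sketch, blocks are still read from and written to the fisheye images, but the translational shift itself is carried out on the re-projected perspective coordinates.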
However, it has been shown that this approach still faces problems with motion in peripheral areas.
Our contribution is a novel viewport-adaptive approach that models differently oriented
motion planes in 3D space by performing the translational motion in different perspective
viewports as visualized on the right.
All in all, we achieve average gains of approximately 2.4 dB in terms of PSNR compared to
the state of the art in fisheye motion compensation.
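As a rough illustration of the viewport idea, the sketch below extends the previous one: instead of re-projecting onto a single perspective plane aligned with the optical axis, a fisheye pixel is mapped to a 3D ray, rotated into a chosen viewport by a rotation matrix R, and projected perspectively there before the translational motion vector is applied. This again assumes the equidistant fisheye model and leaves out the handling of pixels that land on the virtual image planes discussed in the paper; all names are illustrative.

```python
import numpy as np

def fisheye_to_ray(p, f):
    """Map a fisheye pixel (relative to the image center) to a unit ray in 3D,
    assuming the equidistant model r = f * theta."""
    p = np.asarray(p, float)
    r = np.linalg.norm(p)
    theta = r / f
    phi = np.arctan2(p[1], p[0]) if r > 0 else 0.0
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])

def ray_to_fisheye(ray, f):
    """Inverse of fisheye_to_ray for the equidistant model."""
    theta = np.arccos(np.clip(ray[2], -1.0, 1.0))
    phi = np.arctan2(ray[1], ray[0])
    return f * theta * np.array([np.cos(phi), np.sin(phi)])

def viewport_adaptive_shift(p_fisheye, mv, R, f):
    """Apply a translational motion vector inside a perspective viewport whose
    orientation is given by the rotation matrix R."""
    ray = R @ fisheye_to_ray(p_fisheye, f)
    # Rays with ray[2] <= 0 lie behind the viewport plane; these are the cases
    # that require the virtual image plane treatment described in the paper.
    q = f * ray[:2] / ray[2] + np.asarray(mv, float)   # perspective projection + shift
    back = np.array([q[0], q[1], f])
    return ray_to_fisheye(R.T @ (back / np.linalg.norm(back)), f)
```

Choosing different rotation matrices R yields the differently oriented motion planes; the identity rotation recovers the single perspective plane of the projection-based state of the art.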
Although we trust that you are familiar with the basics of block matching motion compensation,
we provide a short overview of this technique first.
We'll then provide some basic intuition for motion planes in 3D space by having a look
at perspective viewports before introducing our novel viewport-adaptive motion compensation
technique for fisheye video.
In a detailed section on our experimental setup and results, we'll share the potential
of the proposed viewport-adaptive motion compensation technique and provide an outlook on
promising future applications.
In case you want to watch specific parts of the presentation, the timestamps of the corresponding
sections in this video are provided here as well.
Classical block matching works with a translatory motion model to find the best motion vector
for each block in the current image.
Thereby, the motion vector describes the horizontal and vertical shift of the regarded block in
a reference image with similar but moved image content compared to the current image.
Motion vectors are selected independently for each block based on a suitable error metric
such as the sum of squared differences.
To find the best motion vector, different search strategies are employed.
The exhaustive full search is an optimal but highly complex method, as it requires testing
every possible motion vector.
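To make this concrete, here is a minimal sketch of an exhaustive full search using the SSD error metric, with a hypothetical block size and search range rather than the configuration from the paper.

```python
import numpy as np

def full_search(current, reference, block_pos, block_size=16, search_range=8):
    """Exhaustive full search: evaluate the SSD for every candidate motion
    vector within +/- search_range pixels and keep the best one."""
    y0, x0 = block_pos
    block = current[y0:y0 + block_size, x0:x0 + block_size].astype(np.float64)
    best_mv, best_ssd = (0, 0), np.inf
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            ry, rx = y0 + dy, x0 + dx
            # Skip candidates that would reach outside the reference image.
            if ry < 0 or rx < 0 or ry + block_size > reference.shape[0] \
                    or rx + block_size > reference.shape[1]:
                continue
            cand = reference[ry:ry + block_size, rx:rx + block_size].astype(np.float64)
            ssd = np.sum((block - cand) ** 2)   # sum of squared differences
            if ssd < best_ssd:
                best_ssd, best_mv = ssd, (dy, dx)
    return best_mv, best_ssd
```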
For this reason, suboptimal smart search strategies such as the diamond search have been developed.
Here, significantly fewer motion vector candidates need to be tested, greatly speeding up the
motion estimation procedure.
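A simplified sketch of the diamond search, for intuition only: a large diamond pattern is moved until its best candidate is the pattern center, after which a small diamond pattern refines the result. The cost function is the same SSD as above; block size and iteration limit are illustrative.

```python
import numpy as np

def diamond_search(current, reference, block_pos, block_size=16, max_iters=32):
    """Simplified diamond search: move a large diamond pattern until its best
    candidate is the pattern center, then refine with a small diamond pattern."""
    large = [(0, 0), (-2, 0), (2, 0), (0, -2), (0, 2),
             (-1, -1), (-1, 1), (1, -1), (1, 1)]
    small = [(0, 0), (-1, 0), (1, 0), (0, -1), (0, 1)]
    y0, x0 = block_pos
    block = current[y0:y0 + block_size, x0:x0 + block_size].astype(np.float64)

    def cost(mv):
        ry, rx = y0 + mv[0], x0 + mv[1]
        if ry < 0 or rx < 0 or ry + block_size > reference.shape[0] \
                or rx + block_size > reference.shape[1]:
            return np.inf
        cand = reference[ry:ry + block_size, rx:rx + block_size].astype(np.float64)
        return np.sum((block - cand) ** 2)   # SSD, as above

    center = (0, 0)
    for _ in range(max_iters):
        candidates = [(center[0] + dy, center[1] + dx) for dy, dx in large]
        best = min(candidates, key=cost)
        if best == center:                   # large pattern has converged
            break
        center = best
    # Final refinement step with the small diamond pattern.
    return min([(center[0] + dy, center[1] + dx) for dy, dx in small], key=cost)
```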
Once the best motion vector has been found for every block in the current image, a complete
motion vector field can be visualized that shows the estimated motion for each block
in the image.
It is now possible to generate a motion compensated image based on the reference image and the
available motion information.
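A minimal sketch of this last step, assuming frame dimensions that are multiples of the block size, a per-block motion vector field (one (dy, dx) pair per block), and a simple block copy without sub-pixel interpolation:

```python
import numpy as np

def motion_compensate(reference, motion_field, block_size=16):
    """Assemble the motion-compensated image by copying, for each block, the
    reference block that its estimated motion vector points to."""
    height, width = reference.shape
    predicted = np.zeros_like(reference)
    for y0 in range(0, height, block_size):
        for x0 in range(0, width, block_size):
            dy, dx = motion_field[y0 // block_size, x0 // block_size]
            # Clamp so the referenced block stays inside the image.
            ry = min(max(y0 + dy, 0), height - block_size)
            rx = min(max(x0 + dx, 0), width - block_size)
            predicted[y0:y0 + block_size, x0:x0 + block_size] = \
                reference[ry:ry + block_size, rx:rx + block_size]
    return predicted
```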
Paper abstract: Although fisheye cameras are in high demand in many application areas due to their large field of view, many image and video signal processing tasks such as motion compensation suffer from the introduced strong radial distortions. A recently proposed projection-based approach takes the fisheye projection into account to improve fisheye motion compensation. However, the approach does not consider the large field of view of fisheye lenses that requires the consideration of different motion planes in 3D space. We propose a novel viewport-adaptive motion compensation technique that applies the motion vectors in different perspective viewports in order to realize these motion planes. Thereby, some pixels are mapped to so-called virtual image planes and require special treatment to obtain reliable mappings between the perspective viewports and the original fisheye image. While the state-of-the-art ultra wide-angle compensation is sufficiently accurate, we propose a virtual image plane compensation that leads to perfect mappings. All in all, we achieve average gains of +2.40 dB in terms of PSNR compared to the state of the art in fisheye motion compensation.