Thuan Bui Bach’s paper has been published in ISPRS Journal of Photogrammetry and Remote Sensing, a highly influential top journal with an Impact Factor of 8.979 (as of 2022). The paper addresses indoor 3D self-localization (camera pose estimation) from 2D camera images, and comparative experiments show that the proposed method outperforms existing methods in both accuracy and performance. The source code is also publicly available, so those interested should see the information below.
FeatLoc: Absolute pose regressor for indoor 2D sparse features with simplistic view synthesizing
Thuan Bui Bach, Tuan Tran Dinh, Joo-Ho Lee
ISPRS Journal of Photogrammetry and Remote Sensing, Volume 189, 2022, Pages 50-62, ISSN 0924-2716,
Abstract: Precise localization using visual sensors is a fundamental requirement in many applications, including robotics, augmented reality, and autonomous systems. Traditionally, the localization problem has been tackled by leveraging 3D-geometry registering approaches. Recently, end-to-end regressor strategies using deep convolutional neural networks have achieved impressive performance, but they do not achieve the same performance as 3D structure-based methods. To some extent, this problem has been tackled by leveraging the beneficial properties of sequential images or geometric constraints. However, these approaches can only achieve a slight improvement. In this work, we address this problem for indoor scenarios, and we argue that regressing the camera pose using sparse feature descriptors could significantly improve the pose regressor performance compared with deep single-feature-vector representation. We propose a novel approach that can directly consume sparse feature descriptors to regress the camera pose effectively. More importantly, we propose a simplistic data augmentation procedure to exploit the sparse descriptors of unseen poses, leading to a remarkable enhancement in the generalization performance. Lastly, we present an extensive evaluation of our method on publicly available indoor datasets. Our FeatLoc achieves 22% and 40% improvements in translation errors on 7-Scenes and 12-Scenes respectively, compared with recent state-of-the-art absolute pose regression-based approaches. Our codes are released at https://github.com/ais-lab/FeatLoc.
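To give a feel for what "directly consuming sparse feature descriptors to regress the camera pose" means, here is a minimal, purely illustrative numpy sketch. It is not FeatLoc's actual architecture (see the released code for that); all names, layer sizes, and the max-pooling choice are assumptions. The key idea it shows is an order-invariant pooling over a variable-size set of per-keypoint descriptors, followed by a head that outputs a 3D translation and a unit quaternion for rotation.

```python
import numpy as np

def regress_pose(descriptors, w_pool, w_head, b_head):
    """Toy absolute-pose regressor over a set of sparse feature descriptors.

    descriptors: (N, D) array, one D-dimensional descriptor per keypoint.
    Returns a 3-vector translation and a unit quaternion (4 values).
    """
    # Per-feature embedding followed by max pooling across the set, so the
    # result is invariant to keypoint ordering and works for any N.
    embedded = np.maximum(descriptors @ w_pool, 0.0)   # ReLU, shape (N, H)
    global_vec = embedded.max(axis=0)                  # shape (H,)

    # Linear head: first 3 outputs = translation, last 4 = quaternion.
    out = global_vec @ w_head + b_head                 # shape (7,)
    t, q = out[:3], out[3:]
    q = q / np.linalg.norm(q)                          # normalize the rotation
    return t, q

# Illustrative sizes: 100 keypoints with 256-dim descriptors (hypothetical).
rng = np.random.default_rng(0)
D, H = 256, 64
w_pool = rng.normal(scale=0.05, size=(D, H))
w_head = rng.normal(scale=0.05, size=(H, 7))
b_head = np.zeros(7)

t, q = regress_pose(rng.normal(size=(100, D)), w_pool, w_head, b_head)
print(t.shape)  # (3,)
```

In the paper the weights would be learned from posed training images (augmented with synthesized unseen views), whereas here they are random; the sketch only illustrates the set-to-pose mapping.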
Keywords: Visual localization; Sparse features; Absolute pose regression
Miran Lee from AIS Lab received the Japan Society of Mechanical Engineers' Women of the Future Award. Dr. Lee received her PhD in September 2021, and this award recognizes her research activities at AIS Lab.
For more information, please visit the following link.
Yume Matsushita’s survey paper has been published in the Journal of Computational Design and Engineering (Oxford Academic), a 34-page monograph covering recent machine-learning-based gait research for medical purposes. The paper brings medical professionals and machine learning researchers together under the keyword of gait. It is Open Access, so anyone can read it.