Title: Handling Occlusions Based on 3D Reconstruction in Augmented Reality
Authors: DJALAL MESSERHI, author; Mohamed Chaouki Babahenini, thesis supervisor
Document type: Printed monograph
Publisher: Biskra [Algeria]: Faculté des Sciences Exactes et des Sciences de la Nature et de la Vie, Université Mohamed Khider, 2021
Format: 1 vol. (89 p.); ill.; 29 cm
Language: English
Keywords: Augmented reality, occlusion handling, 3D reconstruction, photogrammetry, real-time 3D reconstruction
Abstract: Correct spatial relationships between real and virtual objects are of utmost importance to a realistic augmented reality system, so an occlusion handling method should be able to estimate these relationships and handle mutual occlusion automatically in real time. In this work, we propose two occlusion handling methods based on 3D reconstruction: a photogrammetry-based 3D reconstruction method and a real-time 3D reconstruction method. Both can judge and handle mutual occlusion in real time without human interaction, and experimental results demonstrate their effectiveness.
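For readers unfamiliar with the principle behind both methods described in the abstract, the sketch below illustrates depth-based occlusion handling in its simplest form: the reconstructed real scene supplies a per-pixel depth estimate, and a virtual object is drawn only where it lies closer to the camera than the real surface. This is a minimal, illustrative NumPy sketch, not the thesis implementation; the inputs `camera_rgb`, `real_depth`, `virtual_rgb`, and `virtual_depth` are assumed to be provided by the reconstruction and rendering stages.

```python
import numpy as np

def composite_with_occlusion(camera_rgb, real_depth, virtual_rgb, virtual_depth):
    """Per-pixel depth test: keep a virtual pixel only where the virtual
    surface is closer to the camera than the reconstructed real surface.

    camera_rgb    : (H, W, 3) live camera image
    real_depth    : (H, W)    depth of the real scene (e.g. from 3D reconstruction)
    virtual_rgb   : (H, W, 3) rendered virtual object colour
    virtual_depth : (H, W)    depth of the rendered virtual object, np.inf where empty
    """
    visible = virtual_depth < real_depth      # True where the virtual object occludes the real scene
    out = camera_rgb.copy()
    out[visible] = virtual_rgb[visible]       # overwrite only the un-occluded virtual pixels
    return out

# Tiny synthetic example (hypothetical values): a real wall at 2 m and a virtual
# object whose right half sits behind the wall and should therefore be hidden.
H, W = 4, 4
camera_rgb = np.zeros((H, W, 3), dtype=np.uint8)
real_depth = np.full((H, W), 2.0)
virtual_rgb = np.full((H, W, 3), 255, dtype=np.uint8)
virtual_depth = np.full((H, W), np.inf)
virtual_depth[:, :2] = 1.5   # left half in front of the wall -> visible
virtual_depth[:, 2:] = 2.5   # right half behind the wall     -> occluded

result = composite_with_occlusion(camera_rgb, real_depth, virtual_rgb, virtual_depth)
print(result[..., 0])  # left columns 255 (drawn), right columns 0 (occluded by the real scene)
```

The same comparison is what a GPU depth buffer performs when the reconstructed real-scene geometry is rendered depth-only before the virtual objects; the array version above simply makes the test explicit.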
Contents:
1. Augmented reality
   1.1. Introduction
   1.2. Augmented reality technologies
      1.2.1. Computer vision methods in AR
      1.2.2. AR devices
         1.2.2.1. Displays
         1.2.2.2. Input devices
         1.2.2.3. Tracking
         1.2.2.4. Computers
      1.2.3. AR interfaces
         1.2.3.1. Tangible AR interfaces
         1.2.3.2. Collaborative AR interfaces
   1.3. Applications
      1.3.1. Advertising and commercial
      1.3.2. Entertainment and education
      1.3.3. Medical applications
      1.3.4. Mobile applications
   1.4. Conclusion
2. Handling occlusions: the theoretical part
   2.1. Introduction
   2.2. Occlusions in augmented reality
   2.3. Photogrammetry for 3D reconstruction method
      2.3.1. Definition
      2.3.2. System overview
         2.3.2.1. Classification of photogrammetry
         2.3.2.2. Characteristics of close-range photogrammetry
      2.3.3. Photogrammetry used for augmented reality
      2.3.4. Photogrammetry test setup
         2.3.4.1. Acquisition of images
         2.3.4.2. Setup instructions
         2.3.4.3. Factors affecting close-range photogrammetry
         2.3.4.4. Aspects for successful photogrammetry
      2.3.5. Photogrammetry methodology and tools evaluation
         2.3.5.1. Software evaluation
         2.3.5.2. Selection of tool for integration
      2.3.6. Photogrammetry algorithm of Meshroom
         2.3.6.1. CameraInit
         2.3.6.2. Feature extraction
         2.3.6.3. Image matching
         2.3.6.4. Feature matching
         2.3.6.5. Structure from motion
         2.3.6.6. Prepare dense scene
         2.3.6.7. Depth map estimation
         2.3.6.8. Depth map filter
         2.3.6.9. Meshing
         2.3.6.10. Mesh filtering
      2.3.7. Occlusion handling
   2.4. Real-time 3D reconstruction method
      2.4.1. Definition
      2.4.2. System overview
      2.4.3. 3D scene reconstruction
         2.4.3.1. Converting depth image into a 3D point cloud
         2.4.3.2. 3D point clouds alignment based on GPU
      2.4.4. Occlusion handling
   2.5. Conclusion
3. Conception: the practical part
   3.1. Introduction
   3.2. Photogrammetry for 3D reconstruction method
      3.2.1. Methodology
      3.2.2. Global system design
      3.2.3. Detailed system design
         3.2.3.1. Photogrammetry
         3.2.3.2. Back-end
         3.2.3.3. Front-end
   3.3. Real-time 3D reconstruction method
      3.3.1. Methodology
      3.3.2. Global system design
      3.3.3. Detailed system design
         3.3.3.1. Back-end
         3.3.3.2. Front-end
   3.4. Conclusion
4. Implementation and results
   4.1. Introduction
   4.2. Implementation
      4.2.1. Photogrammetry for 3D reconstruction method
         4.2.1.1. Environments and developing tools
         4.2.1.2. Developing the photogrammetry
            4.2.1.2.1. Scan of the real scene
            4.2.1.2.2. 3D model of the real scene
         4.2.1.3. Developing the back-end
            4.2.1.3.1. Tracking model
            4.2.1.3.2. Adding virtual object
            4.2.1.3.3. Virtual environment (VE)
            4.2.1.3.4. Real environment (RE)
            4.2.1.3.5. Camera alignment for AR
         4.2.1.4. Developing the front-end
      4.2.2. Real-time 3D reconstruction method
         4.2.2.1. Environments and developing tools
         4.2.2.2. Developing the back-end
            4.2.2.2.1. Receive image RGB-D
            4.2.2.2.2. Processing the image RGB-D
            4.2.2.2.3. Create 3D point clouds
            4.2.2.2.4. Alignment of two consecutive frames
            4.2.2.2.5. Save datasets of all frames
            4.2.2.2.6. Adding virtual object
            4.2.2.2.7. Occlusion handling
         4.2.2.3. Developing the front-end
   4.3. Results
      4.3.1. Photogrammetry for 3D reconstruction method
         4.3.1.1. Photogrammetry
         4.3.1.2. Virtual environment
         4.3.1.3. Real environment
         4.3.1.4. Camera alignment for AR
         4.3.1.5. Occlusion handling
      4.3.2. Real-time 3D reconstruction method
         4.3.2.1. Result of processing the dataset for an RGB-D camera (Kinect)
         4.3.2.2. Draw 3D point clouds
         4.3.2.3. Alignment of two consecutive frames (ICP) results
         4.3.2.4. Save datasets of all frames
         4.3.2.5. Real scene drawing in Visual Studio
         4.3.2.6. Occlusion handling
   4.4. Discussion of results and comparison
   4.5. Conclusion
Document type: Master's thesis
Availability (1)
| Call number | Medium | Location | Status |
|---|---|---|---|
| MINF/619 | Master's thesis | Exact Sciences Library | Available for consultation |