Full metadata record
DC Field | Value | Language
dc.contributor.author | Wegmann, Marcel | -
dc.date.accessioned | 2022-06-09T12:58:33Z | -
dc.date.available | 2022-06-09T12:58:33Z | -
dc.date.issued | 2022-05-31 | -
dc.identifier.uri | https://digitalcollection.zhaw.ch/handle/11475/25108 | -
dc.description | Talk in the Hardware and DSP topic group | de_CH
dc.description.abstract | This presentation describes a new concept for motion tracking in augmented reality systems, using new approaches from a current research project at ZHAW-InES. Motion tracking, and the resulting recalculation of the representation of virtual objects, is the key function in augmented reality systems. One of the innovations described in this work is the sensor fusion of a Time of Flight (TOF) camera, which spatially captures the environment, with a standard color camera to compute the 3D representation. The final 3D output is then generated using Vulkan, the open-standard, low-overhead, cross-platform rendering API. Motion processing is based on extracting angular velocity and translation with the NVIDIA CUDA SIFT (scale-invariant feature transform) library. The extracted angular velocity is then fused with IMU (Inertial Measurement Unit) data using a Kalman filter. The presentation also briefly discusses the hardware architecture, which is based on the NVIDIA Jetson-AGX. In conclusion, motion tracking using a TOF camera together with the Vulkan rendering library speeds up the tracking process, resulting in a real-time and more realistic user experience. | de_CH
dc.language.iso | en | de_CH
dc.rights | Not specified | de_CH
dc.subject | Augmented reality | de_CH
dc.subject | Time of Flight Camera | de_CH
dc.subject | TOF | de_CH
dc.subject | AR | de_CH
dc.subject | RANSAC | de_CH
dc.subject | Singular Value Decomposition (SVD) | de_CH
dc.subject | Kalman Filter | de_CH
dc.subject | Sensor Fusion | de_CH
dc.subject | Vulkan API | de_CH
dc.subject.ddc | 006: Special computer methods | de_CH
dc.title | Real time motion tracking for augmented reality with TOF camera and vulkan rendering | de_CH
dc.type | Conference: other | de_CH
dcterms.type | Text | de_CH
zhaw.departement | School of Engineering | de_CH
zhaw.organisationalunit | Institute of Embedded Systems (InES) | de_CH
zhaw.conference.details | Embedded Computing Conference (ECC), Winterthur, 31 May 2022 | de_CH
zhaw.funding.eu | No | de_CH
zhaw.originated.zhaw | Yes | de_CH
zhaw.publication.status | publishedVersion | de_CH
zhaw.publication.review | No peer review | de_CH
zhaw.author.additional | No | de_CH
zhaw.display.portrait | Yes | de_CH
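The abstract describes fusing the camera-derived angular velocity with IMU data using a Kalman filter. A minimal one-dimensional sketch of that measurement-update step is shown below; all state values and variances are illustrative assumptions, not parameters from the ZHAW-InES project.

```python
# Minimal 1-D Kalman measurement update, fusing a camera-derived
# angular-velocity estimate with an IMU gyroscope reading.
# All numbers below are illustrative assumptions, not project values.

def kalman_fuse(x, P, z, R):
    """One measurement update: prior estimate x with variance P,
    measurement z with variance R."""
    K = P / (P + R)            # Kalman gain: how much to trust z
    x_new = x + K * (z - x)    # corrected angular-velocity estimate
    P_new = (1.0 - K) * P      # variance shrinks after the update
    return x_new, P_new

# Hypothetical inputs: SIFT-based camera estimate (noisier) as the prior,
# IMU gyro reading (less noisy) as the measurement.
omega, var = kalman_fuse(x=0.10, P=0.04, z=0.14, R=0.01)
# The fused estimate lies between the two inputs, weighted toward the
# lower-variance IMU reading, and its variance is below both priors.
```

The fused variance is always smaller than either input variance, which is why combining the two independent rate estimates yields a steadier tracking signal than either sensor alone.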
Appears in collections: Publikationen School of Engineering

Files in This Item:
There are no files associated with this item.
Wegmann, M. (2022, May 31). Real time motion tracking for augmented reality with TOF camera and vulkan rendering. Embedded Computing Conference (ECC), Winterthur, 31 May 2022.

