<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" version="2.0">
<channel>
<title>SAM</title>
<link>https://sam.ensam.eu:443</link>
<description>The DSpace digital repository system captures, stores, indexes, preserves, and distributes digital research material.</description>
<pubDate>Mon, 13 Apr 2026 02:05:12 GMT</pubDate>
<dc:date>2026-04-13T02:05:12Z</dc:date>
<item>
<title>Towards improving the future of manufacturing through digital twin and augmented reality technologies</title>
<link>http://hdl.handle.net/10985/14459</link>
<description>Towards improving the future of manufacturing through digital twin and augmented reality technologies
RABAH, Souad; ASSILA, Ahlem; KHOURI, Elio; MAIER, Florian; ABABSA, Fakhreddine; BOURNY, Valéry; MAIER, Paul; MERIENNE, Frédéric
We are on the cusp of a technological revolution that will fundamentally change the way we live, work, and relate to one another. In their scale, scope, and complexity, these changes are unlike anything humanity has experienced before. We do not yet know how they will unfold, but one thing is certain: our response must be comprehensive and must involve all stakeholders at the global level: the public sector, the private sector, academia, and civil society. Applications in the industrial sector are already numerous: predictive maintenance, improved real-time decision-making, stock anticipation based on production progress, etc. These improvements further optimize production tools every day and point to possibilities for the future of Industry 4.0, the crossroads of an interconnected global world. This work contributes to this industrial evolution (Usine 4.0). In this paper we introduce part of a collaboration between industry and academia aimed at developing a digital twin (DT) and augmented reality (AR) industrial solution as part of a predictive maintenance framework. In this context, we describe a proof of concept developed for a specific industrial application.
</description>
<pubDate>Mon, 01 Jan 2018 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/14459</guid>
<dc:date>2018-01-01T00:00:00Z</dc:date>
<dc:creator>RABAH, Souad</dc:creator>
<dc:creator>ASSILA, Ahlem</dc:creator>
<dc:creator>KHOURI, Elio</dc:creator>
<dc:creator>MAIER, Florian</dc:creator>
<dc:creator>ABABSA, Fakhreddine</dc:creator>
<dc:creator>BOURNY, Valéry</dc:creator>
<dc:creator>MAIER, Paul</dc:creator>
<dc:creator>MERIENNE, Frédéric</dc:creator>
<dc:description>We are on the cusp of a technological revolution that will fundamentally change the way we live, work, and relate to one another. In their scale, scope, and complexity, these changes are unlike anything humanity has experienced before. We do not yet know how they will unfold, but one thing is certain: our response must be comprehensive and must involve all stakeholders at the global level: the public sector, the private sector, academia, and civil society. Applications in the industrial sector are already numerous: predictive maintenance, improved real-time decision-making, stock anticipation based on production progress, etc. These improvements further optimize production tools every day and point to possibilities for the future of Industry 4.0, the crossroads of an interconnected global world. This work contributes to this industrial evolution (Usine 4.0). In this paper we introduce part of a collaboration between industry and academia aimed at developing a digital twin (DT) and augmented reality (AR) industrial solution as part of a predictive maintenance framework. In this context, we describe a proof of concept developed for a specific industrial application.</dc:description>
</item>
<item>
<title>Usability of Augmented Reality in Aeronautic Maintenance, Repair and Overhaul</title>
<link>http://hdl.handle.net/10985/14360</link>
<description>Usability of Augmented Reality in Aeronautic Maintenance, Repair and Overhaul
FISCHINI, Antoine; ABABSA, Fakhreddine; GRASSER, Mickaël
Augmented Reality (AR) is a fast-growing research topic in several areas, including industry, training, art, and entertainment. AR can help users achieve very complex tasks by enhancing their vision with useful, well-adapted information. This paper evaluates the usability of AR in aeronautic maintenance training tasks. A case study was conducted in the on-site maintenance department using an augmented reality application, involving operators at several levels of expertise. The results obtained highlight the full efficacy of AR in the field of aeronautic maintenance.
</description>
<pubDate>Mon, 01 Jan 2018 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/14360</guid>
<dc:date>2018-01-01T00:00:00Z</dc:date>
<dc:creator>FISCHINI, Antoine</dc:creator>
<dc:creator>ABABSA, Fakhreddine</dc:creator>
<dc:creator>GRASSER, Mickaël</dc:creator>
<dc:description>Augmented Reality (AR) is a fast-growing research topic in several areas, including industry, training, art, and entertainment. AR can help users achieve very complex tasks by enhancing their vision with useful, well-adapted information. This paper evaluates the usability of AR in aeronautic maintenance training tasks. A case study was conducted in the on-site maintenance department using an augmented reality application, involving operators at several levels of expertise. The results obtained highlight the full efficacy of AR in the field of aeronautic maintenance.</dc:description>
</item>
<item>
<title>Augmented Reality assistance for R&amp;D assembly in Aeronautics</title>
<link>http://hdl.handle.net/10985/14160</link>
<description>Augmented Reality assistance for R&amp;D assembly in Aeronautics
PRUVOST, Martin; MIALOCQ, Pierre; ABABSA, Fakhreddine
This paper presents an AR system architecture for assisting complex assembly work by adding visual information superimposed on the physical assembly parts.
</description>
<pubDate>Mon, 01 Jan 2018 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/14160</guid>
<dc:date>2018-01-01T00:00:00Z</dc:date>
<dc:creator>PRUVOST, Martin</dc:creator>
<dc:creator>MIALOCQ, Pierre</dc:creator>
<dc:creator>ABABSA, Fakhreddine</dc:creator>
<dc:description>This paper presents an AR system architecture for assisting complex assembly work by adding visual information superimposed on the physical assembly parts.</dc:description>
</item>
<item>
<title>Free Hand-Based 3D Interaction in Optical See-Through Augmented Reality Using Leap Motion</title>
<link>http://hdl.handle.net/10985/14475</link>
<description>Free Hand-Based 3D Interaction in Optical See-Through Augmented Reality Using Leap Motion
ABABSA, Fakhreddine; HE, Junhui; CHARDONNET, Jean-Rémy
In augmented reality environments, natural hand interaction between the user and virtual objects is a major issue for manipulating rendered objects conveniently. Microsoft’s HoloLens (Microsoft 2018) is an innovative augmented reality (AR) device that provides an impressive user experience. However, the gesture interactions it offers are very limited: HoloLens currently recognizes only two core gestures, Air tap and Bloom. To address this issue, we propose to integrate a Leap Motion Controller (LMC) with the HoloLens device (Figure 1). We use the 3D hand and finger tracking provided by the LMC to propose new, more natural and intuitive free hand-based interactions. We implemented three fully 3D techniques for selection, translation, and rotation manipulation. In this work, we first investigated how to combine the two devices so that they work together in real time, and then evaluated the proposed 3D hand interactions.
</description>
<pubDate>Mon, 01 Jan 2018 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/14475</guid>
<dc:date>2018-01-01T00:00:00Z</dc:date>
<dc:creator>ABABSA, Fakhreddine</dc:creator>
<dc:creator>HE, Junhui</dc:creator>
<dc:creator>CHARDONNET, Jean-Rémy</dc:creator>
<dc:description>In augmented reality environments, natural hand interaction between the user and virtual objects is a major issue for manipulating rendered objects conveniently. Microsoft’s HoloLens (Microsoft 2018) is an innovative augmented reality (AR) device that provides an impressive user experience. However, the gesture interactions it offers are very limited: HoloLens currently recognizes only two core gestures, Air tap and Bloom. To address this issue, we propose to integrate a Leap Motion Controller (LMC) with the HoloLens device (Figure 1). We use the 3D hand and finger tracking provided by the LMC to propose new, more natural and intuitive free hand-based interactions. We implemented three fully 3D techniques for selection, translation, and rotation manipulation. In this work, we first investigated how to combine the two devices so that they work together in real time, and then evaluated the proposed 3D hand interactions.</dc:description>
</item>
<item>
<title>Evaluating Added Value of Augmented Reality to Assist Aeronautical Maintenance Workers - Experimentation on On-Field Use Case</title>
<link>http://hdl.handle.net/10985/16816</link>
<description>Evaluating Added Value of Augmented Reality to Assist Aeronautical Maintenance Workers - Experimentation on On-Field Use Case
LOIZEAU, Quentin; ABABSA, Fakhreddine; MERIENNE, Frédéric; DANGLADE, Florence
Augmented Reality (AR) technology facilitates interaction with information and the understanding of complex situations. Aeronautical maintenance combines the complexity induced by the variety of products with the constraints associated with the aeronautic sector and the maintenance environment. AR tools seem well suited to addressing the productivity and quality constraints of aeronautical maintenance activities by simplifying data interactions for workers. However, few evaluations of AR have been carried out in real processes, owing to the difficulty of integrating the technology without proper tools for deployment and for assessing the results. This paper proposes a method for selecting suitable criteria for AR evaluation in an industrial environment and for deploying AR solutions suited to assisting maintenance workers. These are used to set up on-field experiments that demonstrate the benefits of AR, from both process and user points of view, for different worker profiles. Further work will consist of using these elements to extend the results to AR evaluation across the whole aeronautical maintenance process. A classification of maintenance activities linked to workers’ specific needs will lead to predicting the value that augmented reality would bring to each activity.
</description>
<pubDate>Tue, 01 Jan 2019 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/16816</guid>
<dc:date>2019-01-01T00:00:00Z</dc:date>
<dc:creator>LOIZEAU, Quentin</dc:creator>
<dc:creator>ABABSA, Fakhreddine</dc:creator>
<dc:creator>MERIENNE, Frédéric</dc:creator>
<dc:creator>DANGLADE, Florence</dc:creator>
<dc:description>Augmented Reality (AR) technology facilitates interaction with information and the understanding of complex situations. Aeronautical maintenance combines the complexity induced by the variety of products with the constraints associated with the aeronautic sector and the maintenance environment. AR tools seem well suited to addressing the productivity and quality constraints of aeronautical maintenance activities by simplifying data interactions for workers. However, few evaluations of AR have been carried out in real processes, owing to the difficulty of integrating the technology without proper tools for deployment and for assessing the results. This paper proposes a method for selecting suitable criteria for AR evaluation in an industrial environment and for deploying AR solutions suited to assisting maintenance workers. These are used to set up on-field experiments that demonstrate the benefits of AR, from both process and user points of view, for different worker profiles. Further work will consist of using these elements to extend the results to AR evaluation across the whole aeronautical maintenance process. A classification of maintenance activities linked to workers’ specific needs will lead to predicting the value that augmented reality would bring to each activity.</dc:description>
</item>
<item>
<title>Combining HoloLens and Leap-Motion for Free Hand-Based 3D Interaction in MR Environments</title>
<link>http://hdl.handle.net/10985/19485</link>
<description>Combining HoloLens and Leap-Motion for Free Hand-Based 3D Interaction in MR Environments
ABABSA, Fakhreddine; HE, Junhui; CHARDONNET, Jean-Rémy
The ability to interact with virtual objects using gestures would allow users to improve their experience in Mixed Reality (MR) environments, especially when they use AR headsets. Today, MR head-mounted displays such as the HoloLens integrate hand-gesture-based interaction, allowing users to take actions in MR environments. However, the interactions offered remain limited. In this paper, we propose to combine a Leap Motion Controller (LMC) with a HoloLens in order to improve gesture interaction with virtual objects. Two main issues are addressed: an interactive calibration procedure for the coupled HoloLens-LMC device, and an intuitive hand-based interaction approach using LMC data in the HoloLens environment. A first set of experiments was carried out to evaluate the accuracy and usability of the proposed approach.
</description>
<pubDate>Wed, 01 Jan 2020 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/19485</guid>
<dc:date>2020-01-01T00:00:00Z</dc:date>
<dc:creator>ABABSA, Fakhreddine</dc:creator>
<dc:creator>HE, Junhui</dc:creator>
<dc:creator>CHARDONNET, Jean-Rémy</dc:creator>
<dc:description>The ability to interact with virtual objects using gestures would allow users to improve their experience in Mixed Reality (MR) environments, especially when they use AR headsets. Today, MR head-mounted displays such as the HoloLens integrate hand-gesture-based interaction, allowing users to take actions in MR environments. However, the interactions offered remain limited. In this paper, we propose to combine a Leap Motion Controller (LMC) with a HoloLens in order to improve gesture interaction with virtual objects. Two main issues are addressed: an interactive calibration procedure for the coupled HoloLens-LMC device, and an intuitive hand-based interaction approach using LMC data in the HoloLens environment. A first set of experiments was carried out to evaluate the accuracy and usability of the proposed approach.</dc:description>
</item>
<item>
<title>3D Human Tracking with Catadioptric Omnidirectional Camera</title>
<link>http://hdl.handle.net/10985/16195</link>
<description>3D Human Tracking with Catadioptric Omnidirectional Camera
ABABSA, Fakhreddine; HADJ-ABDELKADER, Hicham; BOUI, Marouane
This paper deals with the problem of 3D human tracking in catadioptric images using a particle-filtering framework. While traditional perspective images are well exploited, only a few methods have been developed for catadioptric vision, whether for human detection or for tracking. We propose to extend 3D pose estimation from perspective cameras to catadioptric sensors. In this paper, we develop original likelihood functions based, on the one hand, on the geodesic distance in the spherical space SO(3) and, on the other hand, on the mapping between the human silhouette in the images and the projected 3D model. These likelihood functions, combined with a particle filter whose propagation model is adapted to the spherical space, allow accurate 3D human tracking in omnidirectional images. Both visual and quantitative analyses of the experimental results demonstrate the effectiveness of our approach.
</description>
<pubDate>Tue, 01 Jan 2019 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/16195</guid>
<dc:date>2019-01-01T00:00:00Z</dc:date>
<dc:creator>ABABSA, Fakhreddine</dc:creator>
<dc:creator>HADJ-ABDELKADER, Hicham</dc:creator>
<dc:creator>BOUI, Marouane</dc:creator>
<dc:description>This paper deals with the problem of 3D human tracking in catadioptric images using a particle-filtering framework. While traditional perspective images are well exploited, only a few methods have been developed for catadioptric vision, whether for human detection or for tracking. We propose to extend 3D pose estimation from perspective cameras to catadioptric sensors. In this paper, we develop original likelihood functions based, on the one hand, on the geodesic distance in the spherical space SO(3) and, on the other hand, on the mapping between the human silhouette in the images and the projected 3D model. These likelihood functions, combined with a particle filter whose propagation model is adapted to the spherical space, allow accurate 3D human tracking in omnidirectional images. Both visual and quantitative analyses of the experimental results demonstrate the effectiveness of our approach.</dc:description>
</item>
<item>
<title>Methodology for the Field Evaluation of the Impact of Augmented Reality Tools for Maintenance Workers in the Aeronautic Industry</title>
<link>http://hdl.handle.net/10985/20022</link>
<description>Methodology for the Field Evaluation of the Impact of Augmented Reality Tools for Maintenance Workers in the Aeronautic Industry
LOIZEAU, Quentin; ABABSA, Fakhreddine; MERIENNE, Frédéric; DANGLADE, Florence
Augmented Reality (AR) enhances the comprehension of complex situations by making the handling of contextual information easier. Maintenance activities in aeronautics consist of complex tasks carried out on various high-technology products under severe constraints from the sector and work environment. AR tools appear to be a potential solution to improve interactions between workers and technical data to increase the productivity and the quality of aeronautical maintenance activities. However, assessments of the actual impact of AR on industrial processes are limited due to a lack of methods and tools to assist in the integration and evaluation of AR tools in the field. This paper presents a method for deploying AR tools adapted to maintenance workers and for selecting relevant evaluation criteria of the impact in an industrial context. This method is applied to design an AR tool for the maintenance workshop, to experiment on real use cases, and to observe the impact of AR on productivity and user satisfaction for all worker profiles. Further work aims to generalize the results to the whole maintenance process in the aeronautical industry. The use of the collected data should enable the prediction of the impact of AR for related maintenance activities.
</description>
<pubDate>Fri, 01 Jan 2021 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/20022</guid>
<dc:date>2021-01-01T00:00:00Z</dc:date>
<dc:creator>LOIZEAU, Quentin</dc:creator>
<dc:creator>ABABSA, Fakhreddine</dc:creator>
<dc:creator>MERIENNE, Frédéric</dc:creator>
<dc:creator>DANGLADE, Florence</dc:creator>
<dc:description>Augmented Reality (AR) enhances the comprehension of complex situations by making the handling of contextual information easier. Maintenance activities in aeronautics consist of complex tasks carried out on various high-technology products under severe constraints from the sector and work environment. AR tools appear to be a potential solution to improve interactions between workers and technical data to increase the productivity and the quality of aeronautical maintenance activities. However, assessments of the actual impact of AR on industrial processes are limited due to a lack of methods and tools to assist in the integration and evaluation of AR tools in the field. This paper presents a method for deploying AR tools adapted to maintenance workers and for selecting relevant evaluation criteria of the impact in an industrial context. This method is applied to design an AR tool for the maintenance workshop, to experiment on real use cases, and to observe the impact of AR on productivity and user satisfaction for all worker profiles. Further work aims to generalize the results to the whole maintenance process in the aeronautical industry. The use of the collected data should enable the prediction of the impact of AR for related maintenance activities.</dc:description>
</item>
<item>
<title>An Efficient Human Activity Recognition Technique Based on Deep Learning</title>
<link>http://hdl.handle.net/10985/18281</link>
<description>An Efficient Human Activity Recognition Technique Based on Deep Learning
KHELALEF, Aziz; ABABSA, Fakhreddine; BENOUDJIT, Nabil
In this paper, we present a new deep learning-based human activity recognition technique. First, we track and extract the human body from each frame of the video stream. Next, we abstract human silhouettes and use them to create binary space-time maps (BSTMs), which summarize human activity within a defined time interval. Finally, we use a convolutional neural network (CNN) to extract features from the BSTMs and classify the activities. To evaluate our approach, we carried out several tests using three public datasets: Weizmann, Keck Gesture, and KTH. Experimental results show that our technique outperforms conventional state-of-the-art methods in terms of recognition accuracy and provides comparable performance against recent deep learning techniques. It is simple to implement, requires less computing power, and can be used for multi-subject activity recognition.
</description>
<pubDate>Tue, 01 Jan 2019 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/18281</guid>
<dc:date>2019-01-01T00:00:00Z</dc:date>
<dc:creator>KHELALEF, Aziz</dc:creator>
<dc:creator>ABABSA, Fakhreddine</dc:creator>
<dc:creator>BENOUDJIT, Nabil</dc:creator>
<dc:description>In this paper, we present a new deep learning-based human activity recognition technique. First, we track and extract the human body from each frame of the video stream. Next, we abstract human silhouettes and use them to create binary space-time maps (BSTMs), which summarize human activity within a defined time interval. Finally, we use a convolutional neural network (CNN) to extract features from the BSTMs and classify the activities. To evaluate our approach, we carried out several tests using three public datasets: Weizmann, Keck Gesture, and KTH. Experimental results show that our technique outperforms conventional state-of-the-art methods in terms of recognition accuracy and provides comparable performance against recent deep learning techniques. It is simple to implement, requires less computing power, and can be used for multi-subject activity recognition.</dc:description>
</item>
<item>
<title>Augmented Reality Application in Manufacturing Industry: Maintenance and Non-destructive Testing (NDT) Use Cases</title>
<link>http://hdl.handle.net/10985/19418</link>
<description>Augmented Reality Application in Manufacturing Industry: Maintenance and Non-destructive Testing (NDT) Use Cases
ABABSA, Fakhreddine
In recent years, a structural transformation of the manufacturing industry has been occurring as a result of the digital revolution. Digital tools, especially virtual and augmented reality, are now systematically used throughout the entire value chain, from design to production to marketing. The purpose of this paper is therefore to review, through concrete use cases, the progress of these novel technologies and their use in the manufacturing industry.
</description>
<pubDate>Wed, 01 Jan 2020 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/19418</guid>
<dc:date>2020-01-01T00:00:00Z</dc:date>
<dc:creator>ABABSA, Fakhreddine</dc:creator>
<dc:description>In recent years, a structural transformation of the manufacturing industry has been occurring as a result of the digital revolution. Digital tools, especially virtual and augmented reality, are now systematically used throughout the entire value chain, from design to production to marketing. The purpose of this paper is therefore to review, through concrete use cases, the progress of these novel technologies and their use in the manufacturing industry.</dc:description>
</item>
</channel>
</rss>
