<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" version="2.0">
<channel>
<title>SAM</title>
<link>https://sam.ensam.eu:443</link>
<description>The DSpace digital repository system captures, stores, indexes, preserves, and distributes digital research material.</description>
<pubDate xmlns="http://apache.org/cocoon/i18n/2.1">Tue, 10 Mar 2026 17:06:14 GMT</pubDate>
<dc:date>2026-03-10T17:06:14Z</dc:date>
<item>
<title>Prediction of CAD model defeaturing impact on heat transfer FEA results using machine learning techniques</title>
<link>http://hdl.handle.net/10985/11381</link>
<description>Prediction of CAD model defeaturing impact on heat transfer FEA results using machine learning techniques
FINE, Lionel; PERNOT, Jean-Philippe; DANGLADE, Florence; VERON, Philippe
Essential when adapting a CAD model for finite element analysis, defeaturing ensures the feasibility of the simulation and reduces computation time. Processes for CAD model preparation and defeaturing tools exist, but they are not always clearly formalized. In this paper, we propose an approach that uses machine learning techniques to design an indicator that predicts the impact of defeaturing on the quality of analysis results for heat transfer simulation. Expert knowledge is embedded in examples of defeaturing processes and analyses, which are used to find an algorithm able to predict a performance indicator. This indicator supports decision making by identifying features that are candidates for defeaturing.
</description>
<pubDate>Wed, 01 Jan 2014 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/11381</guid>
<dc:date>2014-01-01T00:00:00Z</dc:date>
<dc:creator>FINE, Lionel</dc:creator>
<dc:creator>PERNOT, Jean-Philippe</dc:creator>
<dc:creator>DANGLADE, Florence</dc:creator>
<dc:creator>VERON, Philippe</dc:creator>
<dc:description>Essential when adapting a CAD model for finite element analysis, defeaturing ensures the feasibility of the simulation and reduces computation time. Processes for CAD model preparation and defeaturing tools exist, but they are not always clearly formalized. In this paper, we propose an approach that uses machine learning techniques to design an indicator that predicts the impact of defeaturing on the quality of analysis results for heat transfer simulation. Expert knowledge is embedded in examples of defeaturing processes and analyses, which are used to find an algorithm able to predict a performance indicator. This indicator supports decision making by identifying features that are candidates for defeaturing.</dc:description>
</item>
<item>
<title>Evaluating Added Value of Augmented Reality to Assist Aeronautical Maintenance Workers - Experimentation on On-Field Use Case</title>
<link>http://hdl.handle.net/10985/16816</link>
<description>Evaluating Added Value of Augmented Reality to Assist Aeronautical Maintenance Workers - Experimentation on On-Field Use Case
LOIZEAU, Quentin; ABABSA, Fakhreddine; MERIENNE, Frédéric; DANGLADE, Florence
Augmented Reality (AR) technology facilitates interactions with information and the understanding of complex situations. Aeronautical maintenance combines the complexity induced by the variety of products with the constraints associated with the aeronautic sector and the maintenance environment. AR tools seem well suited to addressing productivity and quality constraints in aeronautical maintenance activities by simplifying data interactions for workers. However, few evaluations of AR have been carried out in real processes, due to the difficulty of integrating the technology without proper tools for deployment and for assessing the results. This paper proposes a method to select suitable criteria for evaluating AR in an industrial environment and to deploy AR solutions suited to assisting maintenance workers. These are used to set up on-field experiments that demonstrate the benefits of AR, from both process and user points of view, for different worker profiles. Further work will consist of using these elements to extend the results to AR evaluation across the whole aeronautical maintenance process. A classification of maintenance activities linked to workers' specific needs will lead to a prediction of the value that augmented reality would bring to each activity.
</description>
<pubDate>Tue, 01 Jan 2019 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/16816</guid>
<dc:date>2019-01-01T00:00:00Z</dc:date>
<dc:creator>LOIZEAU, Quentin</dc:creator>
<dc:creator>ABABSA, Fakhreddine</dc:creator>
<dc:creator>MERIENNE, Frédéric</dc:creator>
<dc:creator>DANGLADE, Florence</dc:creator>
<dc:description>Augmented Reality (AR) technology facilitates interactions with information and the understanding of complex situations. Aeronautical maintenance combines the complexity induced by the variety of products with the constraints associated with the aeronautic sector and the maintenance environment. AR tools seem well suited to addressing productivity and quality constraints in aeronautical maintenance activities by simplifying data interactions for workers. However, few evaluations of AR have been carried out in real processes, due to the difficulty of integrating the technology without proper tools for deployment and for assessing the results. This paper proposes a method to select suitable criteria for evaluating AR in an industrial environment and to deploy AR solutions suited to assisting maintenance workers. These are used to set up on-field experiments that demonstrate the benefits of AR, from both process and user points of view, for different worker profiles. Further work will consist of using these elements to extend the results to AR evaluation across the whole aeronautical maintenance process. A classification of maintenance activities linked to workers' specific needs will lead to a prediction of the value that augmented reality would bring to each activity.</dc:description>
</item>
<item>
<title>A Virtual Reality and BIM Approach for Clash Resolution</title>
<link>http://hdl.handle.net/10985/17739</link>
<description>A Virtual Reality and BIM Approach for Clash Resolution
RAIMBAUD, Pierre; BONILLA PALACIOS, Mateo; ROMERO CORTES, Juan Pablo; FIGUEROA, Pablo; HERNANDEZ, José Tiberio; MERIENNE, Frédéric; DANGLADE, Florence; LOU, Ruding
In the Architecture, Engineering and Construction (AEC) industry, a crucial task is the coordination of Building Information Modelling (BIM) models. Clashes can be detected automatically by current BIM tools, and clash origins (Pärn et al., 2018) and clash avoidance (Singh et al., 2015) have been studied, but clash resolution still requires civil engineers' expertise. Currently, engineers resolve clashes using annotations on a computer with a 3D BIM tool. As previous research has shown that Virtual Reality (VR) can help perform AEC tasks better, in terms of time and accuracy (Chalhoub and Ayer, 2018), we propose an immersive VR tool to solve clashes.
</description>
<pubDate>Tue, 01 Jan 2019 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/17739</guid>
<dc:date>2019-01-01T00:00:00Z</dc:date>
<dc:creator>RAIMBAUD, Pierre</dc:creator>
<dc:creator>BONILLA PALACIOS, Mateo</dc:creator>
<dc:creator>ROMERO CORTES, Juan Pablo</dc:creator>
<dc:creator>FIGUEROA, Pablo</dc:creator>
<dc:creator>HERNANDEZ, José Tiberio</dc:creator>
<dc:creator>MERIENNE, Frédéric</dc:creator>
<dc:creator>DANGLADE, Florence</dc:creator>
<dc:creator>LOU, Ruding</dc:creator>
<dc:description>In the Architecture, Engineering and Construction (AEC) industry, a crucial task is the coordination of Building Information Modelling (BIM) models. Clashes can be detected automatically by current BIM tools, and clash origins (Pärn et al., 2018) and clash avoidance (Singh et al., 2015) have been studied, but clash resolution still requires civil engineers' expertise. Currently, engineers resolve clashes using annotations on a computer with a 3D BIM tool. As previous research has shown that Virtual Reality (VR) can help perform AEC tasks better, in terms of time and accuracy (Chalhoub and Ayer, 2018), we propose an immersive VR tool to solve clashes.</dc:description>
</item>
<item>
<title>A priori evaluation of simulation models preparation processes using artificial intelligence techniques</title>
<link>http://hdl.handle.net/10985/14315</link>
<description>A priori evaluation of simulation models preparation processes using artificial intelligence techniques
FINE, Lionel; PERNOT, Jean-Philippe; DANGLADE, Florence; VERON, Philippe
Controlling the well-known triptych of costs, quality and time during the different phases of the Product Development Process (PDP) is an everlasting challenge for industry. Among the numerous issues to be addressed, the development of new methods and tools to adapt the models used all along the PDP to various needs is certainly one of the most challenging and promising improvement areas. This is particularly true for the adaptation of Computer-Aided Design (CAD) models to Computer-Aided Engineering (CAE) applications, notably during the CAD model simplification steps. Today, even if methods and tools exist, such a preparation phase still requires deep knowledge and a huge amount of time when considering a Digital Mock-Up (DMU) composed of several hundreds of thousands of parts. Thus, being able to estimate a priori the impact of DMU adaptation scenarios on the simulation results would help identify the best scenario right from the beginning. This paper addresses this difficult problem and uses artificial intelligence (AI) techniques to learn and accurately predict behaviours from carefully selected examples. The main idea is to identify rules from these examples, which are used as inputs to learning algorithms. Once these rules are obtained, they can be applied to a new case to estimate a priori the impact of a preparation process without having to perform it. To reach this objective, a method to build a representative database of examples has been developed, the right input (explanatory) and output (preparation process quality criteria) variables have been identified, and the learning model and its associated control parameters have been tuned. One challenge was to identify explanatory variables from key geometrical characteristics and from data characterizing the preparation processes. A second challenge was to build an effective learning model despite a limited number of examples. The rules linking the output variables to the input ones are obtained using AI techniques such as neural networks and decision trees. The proposed approach is illustrated and validated on industrial examples in the context of computational fluid dynamics simulations.
</description>
<pubDate>Sun, 01 Jan 2017 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/14315</guid>
<dc:date>2017-01-01T00:00:00Z</dc:date>
<dc:creator>FINE, Lionel</dc:creator>
<dc:creator>PERNOT, Jean-Philippe</dc:creator>
<dc:creator>DANGLADE, Florence</dc:creator>
<dc:creator>VERON, Philippe</dc:creator>
<dc:description>Controlling the well-known triptych of costs, quality and time during the different phases of the Product Development Process (PDP) is an everlasting challenge for industry. Among the numerous issues to be addressed, the development of new methods and tools to adapt the models used all along the PDP to various needs is certainly one of the most challenging and promising improvement areas. This is particularly true for the adaptation of Computer-Aided Design (CAD) models to Computer-Aided Engineering (CAE) applications, notably during the CAD model simplification steps. Today, even if methods and tools exist, such a preparation phase still requires deep knowledge and a huge amount of time when considering a Digital Mock-Up (DMU) composed of several hundreds of thousands of parts. Thus, being able to estimate a priori the impact of DMU adaptation scenarios on the simulation results would help identify the best scenario right from the beginning. This paper addresses this difficult problem and uses artificial intelligence (AI) techniques to learn and accurately predict behaviours from carefully selected examples. The main idea is to identify rules from these examples, which are used as inputs to learning algorithms. Once these rules are obtained, they can be applied to a new case to estimate a priori the impact of a preparation process without having to perform it. To reach this objective, a method to build a representative database of examples has been developed, the right input (explanatory) and output (preparation process quality criteria) variables have been identified, and the learning model and its associated control parameters have been tuned. One challenge was to identify explanatory variables from key geometrical characteristics and from data characterizing the preparation processes. A second challenge was to build an effective learning model despite a limited number of examples. The rules linking the output variables to the input ones are obtained using AI techniques such as neural networks and decision trees. The proposed approach is illustrated and validated on industrial examples in the context of computational fluid dynamics simulations.</dc:description>
</item>
<item>
<title>BIM-based mixed reality environments to improve AEC task performance</title>
<link>http://hdl.handle.net/10985/17738</link>
<description>BIM-based mixed reality environments to improve AEC task performance
RAIMBAUD, Pierre; FIGUEROA, Pablo; HERNANDEZ, José Tiberio; MERIENNE, Frédéric; DANGLADE, Florence; LOU, Ruding
Building Information Modelling (BIM) is currently contributing to a deep transformation of the Architecture, Engineering and Construction (AEC) industry by improving data management, task planning, architectural design, etc. Nevertheless, other technologies have also joined this revolution, with the aim of allowing experts to perform their tasks better than with BIM alone; this is particularly the case for mixed reality (MR). However, MR applications can take very diverse forms because of the many possible design choices: multiple data sources from the BIM (3D model, worksite monitoring, simulations...), multiple visualisation possibilities in MR (visual effects, 4D...) and multiple MR interactions (move, write, say, grasp...). Behind the design choices of an MR application, there is a task for which the application has been created. Yet providing BIM-based MR environments that truly respond to the original need and improve task performance remains difficult. In this paper, we present a proposed methodology for going from BIM to BIM-based mixed reality environments. Our inputs are the AEC tasks that are likely to benefit from being performed in a mixed reality environment, their performance measures (efficiency and effectiveness), and BIM data. Our target is to provide BIM-based mixed reality environments that support specific AEC tasks, and to prove, through appropriate indicators, that task performance improves in MR compared to traditional methods. We present here the results from our first case studies and their impact on the evolution of the methodology. Finally, our ongoing and future work is discussed in the last sections.
</description>
<pubDate>Tue, 01 Jan 2019 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/17738</guid>
<dc:date>2019-01-01T00:00:00Z</dc:date>
<dc:creator>RAIMBAUD, Pierre</dc:creator>
<dc:creator>FIGUEROA, Pablo</dc:creator>
<dc:creator>HERNANDEZ, José Tiberio</dc:creator>
<dc:creator>MERIENNE, Frédéric</dc:creator>
<dc:creator>DANGLADE, Florence</dc:creator>
<dc:creator>LOU, Ruding</dc:creator>
<dc:description>Building Information Modelling (BIM) is currently contributing to a deep transformation of the Architecture, Engineering and Construction (AEC) industry by improving data management, task planning, architectural design, etc. Nevertheless, other technologies have also joined this revolution, with the aim of allowing experts to perform their tasks better than with BIM alone; this is particularly the case for mixed reality (MR). However, MR applications can take very diverse forms because of the many possible design choices: multiple data sources from the BIM (3D model, worksite monitoring, simulations...), multiple visualisation possibilities in MR (visual effects, 4D...) and multiple MR interactions (move, write, say, grasp...). Behind the design choices of an MR application, there is a task for which the application has been created. Yet providing BIM-based MR environments that truly respond to the original need and improve task performance remains difficult. In this paper, we present a proposed methodology for going from BIM to BIM-based mixed reality environments. Our inputs are the AEC tasks that are likely to benefit from being performed in a mixed reality environment, their performance measures (efficiency and effectiveness), and BIM data. Our target is to provide BIM-based mixed reality environments that support specific AEC tasks, and to prove, through appropriate indicators, that task performance improves in MR compared to traditional methods. We present here the results from our first case studies and their impact on the evolution of the methodology. Finally, our ongoing and future work is discussed in the last sections.</dc:description>
</item>
<item>
<title>Identification of explanatory variables for DMU preparation process evaluation by using machine learning techniques</title>
<link>http://hdl.handle.net/10985/17018</link>
<description>Identification of explanatory variables for DMU preparation process evaluation by using machine learning techniques
FINE, Lionel; PERNOT, Jean-Philippe; DANGLADE, Florence; VERON, Philippe
Being able to estimate a priori the impact of DMU preparation scenarios for a dedicated activity would help identify the best scenario from the beginning. Machine learning techniques are a means to evaluate a DMU preparation process a priori, without performing it, by predicting its evaluation criteria. For that, a representative database of examples must be developed that contains the right explanatory and output variables. However, the key explanatory variables are not clearly identified. This paper proposes a method for selecting the most significant explanatory variables among all the database variables. In addition to being used for learning, these variables will make it possible to formalize the knowledge.
</description>
<pubDate>Fri, 01 Jan 2016 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/17018</guid>
<dc:date>2016-01-01T00:00:00Z</dc:date>
<dc:creator>FINE, Lionel</dc:creator>
<dc:creator>PERNOT, Jean-Philippe</dc:creator>
<dc:creator>DANGLADE, Florence</dc:creator>
<dc:creator>VERON, Philippe</dc:creator>
<dc:description>Being able to estimate a priori the impact of DMU preparation scenarios for a dedicated activity would help identify the best scenario from the beginning. Machine learning techniques are a means to evaluate a DMU preparation process a priori, without performing it, by predicting its evaluation criteria. For that, a representative database of examples must be developed that contains the right explanatory and output variables. However, the key explanatory variables are not clearly identified. This paper proposes a method for selecting the most significant explanatory variables among all the database variables. In addition to being used for learning, these variables will make it possible to formalize the knowledge.</dc:description>
</item>
<item>
<title>BIM-based Mixed Reality Application for Supervision of Construction</title>
<link>http://hdl.handle.net/10985/17238</link>
<description>BIM-based Mixed Reality Application for Supervision of Construction
RAIMBAUD, Pierre; FIGUEROA, Pablo; HERNANDEZ, José Tiberio; MERIENNE, Frédéric; DANGLADE, Florence; LOU, Ruding
Building Information Modelling (BIM) is an up-and-coming methodology and technology used in the Architecture, Engineering and Construction (AEC) industry that allows data centralization and stakeholder collaboration. However, to check the accuracy of the work done on the worksite, it is currently necessary first to go on site and then to modify the BIM model. This paper presents a mixed reality (MR) application, based on BIM data and drone videos, that allows off-site construction supervision. It lets users annotate differences between what has been planned in BIM and what has been built, by superimposing the two sources. These annotations can then be transferred to the BIM model for corrections. Finally, we evaluate our work with building construction experts, providing them with a questionnaire to grade the application and to collect feedback. Our main result is that, according to these experts, the application really helps with construction supervision; however, they suggest that the application should provide more interactions with the 3D model and with the videos.
</description>
<pubDate>Tue, 01 Jan 2019 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/17238</guid>
<dc:date>2019-01-01T00:00:00Z</dc:date>
<dc:creator>RAIMBAUD, Pierre</dc:creator>
<dc:creator>FIGUEROA, Pablo</dc:creator>
<dc:creator>HERNANDEZ, José Tiberio</dc:creator>
<dc:creator>MERIENNE, Frédéric</dc:creator>
<dc:creator>DANGLADE, Florence</dc:creator>
<dc:creator>LOU, Ruding</dc:creator>
<dc:description>Building Information Modelling (BIM) is an up-and-coming methodology and technology used in the Architecture, Engineering and Construction (AEC) industry that allows data centralization and stakeholder collaboration. However, to check the accuracy of the work done on the worksite, it is currently necessary first to go on site and then to modify the BIM model. This paper presents a mixed reality (MR) application, based on BIM data and drone videos, that allows off-site construction supervision. It lets users annotate differences between what has been planned in BIM and what has been built, by superimposing the two sources. These annotations can then be transferred to the BIM model for corrections. Finally, we evaluate our work with building construction experts, providing them with a questionnaire to grade the application and to collect feedback. Our main result is that, according to these experts, the application really helps with construction supervision; however, they suggest that the application should provide more interactions with the 3D model and with the videos.</dc:description>
</item>
<item>
<title>Environment Spatial Restitution for Remote Physical AR Collaboration</title>
<link>http://hdl.handle.net/10985/26441</link>
<description>Environment Spatial Restitution for Remote Physical AR Collaboration
CABY, Bruno; BATAILLE, Guillaume; DANGLADE, Florence; CHARDONNET, Jean-Rémy
The emergence of spatial immersive technologies enables new ways to collaborate remotely. However, these technologies still need to be studied and enhanced in order to improve their effectiveness and usability for collaborators. Remote Physical Collaborative Extended Reality (RPC-XR) consists of solving augmented physical tasks with the help of remote collaborators. This paper presents our RPC-AR system and a user study evaluating it during a network hardware assembly task. Our system offers verbal and non-verbal interpersonal communication functionalities. Users embody avatars and interact with their remote collaborators through hand, head and eye tracking, and voice. Our system also captures an environment spatially, in real time, and renders it in a shared virtual space. We designed it to be lightweight and to avoid instrumenting collaborative environments or requiring preliminary steps. It performs capture, transmission and remote rendering of real environments in less than 250 ms. We ran a cascading user study to compare our system with a commercial 2D video collaboration application, measuring mutual awareness, task load, usability and task performance. We also present an adapted Uncanny Valley questionnaire to compare the perception of remote environments between systems. We found that our application resulted in better empathy between collaborators, but a higher cognitive load and a lower, though still acceptable, level of usability for the remote user. We did not observe any significant difference in performance. These results are encouraging, as participants' observations provide insights to further improve the performance and usability of RPC-AR.
</description>
<pubDate>Thu, 01 May 2025 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/26441</guid>
<dc:date>2025-05-01T00:00:00Z</dc:date>
<dc:creator>CABY, Bruno</dc:creator>
<dc:creator>BATAILLE, Guillaume</dc:creator>
<dc:creator>DANGLADE, Florence</dc:creator>
<dc:creator>CHARDONNET, Jean-Rémy</dc:creator>
<dc:description>The emergence of spatial immersive technologies enables new ways to collaborate remotely. However, these technologies still need to be studied and enhanced in order to improve their effectiveness and usability for collaborators. Remote Physical Collaborative Extended Reality (RPC-XR) consists of solving augmented physical tasks with the help of remote collaborators. This paper presents our RPC-AR system and a user study evaluating it during a network hardware assembly task. Our system offers verbal and non-verbal interpersonal communication functionalities. Users embody avatars and interact with their remote collaborators through hand, head and eye tracking, and voice. Our system also captures an environment spatially, in real time, and renders it in a shared virtual space. We designed it to be lightweight and to avoid instrumenting collaborative environments or requiring preliminary steps. It performs capture, transmission and remote rendering of real environments in less than 250 ms. We ran a cascading user study to compare our system with a commercial 2D video collaboration application, measuring mutual awareness, task load, usability and task performance. We also present an adapted Uncanny Valley questionnaire to compare the perception of remote environments between systems. We found that our application resulted in better empathy between collaborators, but a higher cognitive load and a lower, though still acceptable, level of usability for the remote user. We did not observe any significant difference in performance. These results are encouraging, as participants' observations provide insights to further improve the performance and usability of RPC-AR.</dc:description>
</item>
<item>
<title>Identification of key evaluation criteria for co-development of XR applications in industrial contexts</title>
<link>http://hdl.handle.net/10985/26763</link>
<description>Identification of key evaluation criteria for co-development of XR applications in industrial contexts
MAAROUFI, Fahd; DANGLADE, Florence; CHARDONNET, Jean-Rémy
Extended Reality (XR) technologies—including Virtual, Augmented, and Mixed Reality—offer useful possibilities for improving industrial processes such as training, design validation, maintenance, and quality control. However, their adoption in industry remains limited, often because existing solutions do not fully meet practical needs. This study focuses on the co-development of XR applications better adapted to industrial requirements. In a first phase, a literature review helped identify 37 evaluation criteria, grouped into eight categories; their definitions were refined based on expert input. In a second phase, over 20 professionals from different industrial sectors assessed the relevance of each criterion using a 5-point Likert scale and a ranking method. The results showed differences between academic and industrial perspectives: while academic work often highlights technical or sensory aspects, industrial stakeholders emphasized usability, relevance of content, and potential for innovation. These findings provide a clearer view of industry expectations and will inform the future development of XR tools.
</description>
<pubDate>Thu, 03 Jul 2025 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/26763</guid>
<dc:date>2025-07-03T00:00:00Z</dc:date>
<dc:creator>MAAROUFI, Fahd</dc:creator>
<dc:creator>DANGLADE, Florence</dc:creator>
<dc:creator>CHARDONNET, Jean-Rémy</dc:creator>
<dc:description>Extended Reality (XR) technologies—including Virtual, Augmented, and Mixed Reality—offer useful possibilities for improving industrial processes such as training, design validation, maintenance, and quality control. However, their adoption in industry remains limited, often because existing solutions do not fully meet practical needs. This study focuses on the co-development of XR applications better adapted to industrial requirements. In a first phase, a literature review helped identify 37 evaluation criteria, grouped into eight categories; their definitions were refined based on expert input. In a second phase, over 20 professionals from different industrial sectors assessed the relevance of each criterion using a 5-point Likert scale and a ranking method. The results showed differences between academic and industrial perspectives: while academic work often highlights technical or sensory aspects, industrial stakeholders emphasized usability, relevance of content, and potential for innovation. These findings provide a clearer view of industry expectations and will inform the future development of XR tools.</dc:description>
</item>
<item>
<title>Methodology for the Field Evaluation of the Impact of Augmented Reality Tools for Maintenance Workers in the Aeronautic Industry</title>
<link>http://hdl.handle.net/10985/20022</link>
<description>Methodology for the Field Evaluation of the Impact of Augmented Reality Tools for Maintenance Workers in the Aeronautic Industry
LOIZEAU, Quentin; ABABSA, Fakhreddine; MERIENNE, Frédéric; DANGLADE, Florence
Augmented Reality (AR) enhances the comprehension of complex situations by making the handling of contextual information easier. Maintenance activities in aeronautics consist of complex tasks carried out on various high-technology products under severe constraints from the sector and work environment. AR tools appear to be a potential solution to improve interactions between workers and technical data to increase the productivity and the quality of aeronautical maintenance activities. However, assessments of the actual impact of AR on industrial processes are limited due to a lack of methods and tools to assist in the integration and evaluation of AR tools in the field. This paper presents a method for deploying AR tools adapted to maintenance workers and for selecting relevant evaluation criteria of the impact in an industrial context. This method is applied to design an AR tool for the maintenance workshop, to experiment on real use cases, and to observe the impact of AR on productivity and user satisfaction for all worker profiles. Further work aims to generalize the results to the whole maintenance process in the aeronautical industry. The use of the collected data should enable the prediction of the impact of AR for related maintenance activities.
</description>
<pubDate>Fri, 01 Jan 2021 00:00:00 GMT</pubDate>
<guid isPermaLink="false">http://hdl.handle.net/10985/20022</guid>
<dc:date>2021-01-01T00:00:00Z</dc:date>
<dc:creator>LOIZEAU, Quentin</dc:creator>
<dc:creator>ABABSA, Fakhreddine</dc:creator>
<dc:creator>MERIENNE, Frédéric</dc:creator>
<dc:creator>DANGLADE, Florence</dc:creator>
<dc:description>Augmented Reality (AR) enhances the comprehension of complex situations by making the handling of contextual information easier. Maintenance activities in aeronautics consist of complex tasks carried out on various high-technology products under severe constraints from the sector and work environment. AR tools appear to be a potential solution to improve interactions between workers and technical data to increase the productivity and the quality of aeronautical maintenance activities. However, assessments of the actual impact of AR on industrial processes are limited due to a lack of methods and tools to assist in the integration and evaluation of AR tools in the field. This paper presents a method for deploying AR tools adapted to maintenance workers and for selecting relevant evaluation criteria of the impact in an industrial context. This method is applied to design an AR tool for the maintenance workshop, to experiment on real use cases, and to observe the impact of AR on productivity and user satisfaction for all worker profiles. Further work aims to generalize the results to the whole maintenance process in the aeronautical industry. The use of the collected data should enable the prediction of the impact of AR for related maintenance activities.</dc:description>
</item>
</channel>
</rss>
