Virtual and Augmented Reality Applications in Manufacturing

7th IFAC Conference on Manufacturing Modelling, Management, and Control, International Federation of Automatic Control, June 19-21, 2013, Saint Petersburg, Russia

Virtual and Augmented Reality Applications in Manufacturing

A.Y.C. Nee, S.K. Ong
Mechanical Engineering Department, National University of Singapore, Singapore (Tel: 65-65162892; e-mail: {mpeneeyc/[email protected]})

Abstract: Augmented Reality (AR) is a fast-rising technology that has been applied in many fields such as gaming, learning, entertainment, medicine, the military, and sports. This paper reviews academic studies of AR applications in manufacturing operations. Comparatively, manufacturing is less addressed, owing to stringent requirements for high accuracy and fast response, and the desired alignment with industrial standards and practices so that users do not face a drastic transition when adopting this new technology. This paper looks into common manufacturing activities such as product design, robotics, facilities layout planning, maintenance, CNC machining simulation and assembly planning. Some of the issues and future trends of AR technology are also addressed.

Keywords: Virtual reality; Augmented reality; Manufacturing

1. INTRODUCTION

Barely a century ago, manufacturing was known as a black art, and most of its tools and technologies were primarily mechanical in nature. Mechanical moving elements were initially powered by steam and later by electric power. Elaborate overhead belt systems were used to supply power to each machine, as this was more economical than driving machines from individual power sources.

In the 1950s, numerically controlled machine tools made a huge leap, and manufacturing entered a new era. In the last several decades, owing to advances in information technology, digital manufacturing has become a common platform worldwide. Computer-integrated manufacturing systems have eliminated data-handling errors. Computer simulation using CAD modelling tools and finite element analysis has helped manufacturing engineers reach decisions faster and with fewer errors.

Virtual reality (VR) as a simulation tool was first reported in the 1960s. Since then, many different forms have appeared, from 2D monitor-based displays to immersive and sophisticated 3D set-ups such as the CAVE. In just over two decades, augmented reality (AR) technology has matured and proven to be an innovative and effective tool for simulating, guiding and improving manufacturing processes before they are launched. Activities such as design, planning and machining can now be done right-the-first-time, without the need for subsequent re-work and modifications. Much like VR, which is a great simulation tool, AR is a novel human-computer interaction tool that overlays computer-generated information on the real scene. The information display and image overlay are context-sensitive, depending on the observed objects (Azuma et al., 2001). AR can be combined with human abilities to provide efficient and complementary tools to assist manufacturing tasks. Several successful demonstrations have been made in domains such as gaming, advertising, entertainment, medicine, the military, and manufacturing (Ong and Nee, 2004, Yuan et al., 2008a).

AR research in manufacturing applications is a strong and growing area. However, it faces higher-order requirements for accuracy, response and interface design. The challenge is to implement integrated AR-assisted simulation tools that can improve manufacturing operations as well as product and process development, leading to faster learning and training, hence shorter lead-times and, consequently, reduced cost and improved quality.

2. HARDWARE DEVICES AND SOFTWARE SYSTEMS

2.1 Hardware devices

Head-mounted display (HMD) devices were a popular choice when AR applications were first developed, as the eye-level display facilitates direct perception of the combined AR scene. HMD devices, however, are uncomfortable and may cause headaches and dizziness, especially after prolonged use.

Current research in AR applications is moving towards mobility using handheld devices (HHD), either commercially available or specially designed (Hakkarainen et al., 2008, Stutzman et al., 2009, Xin et al., 2008). The advantages of using HHDs are obvious, as a high-resolution camera, touch screen, gyroscope, etc., are already embedded in these mobile devices. The use of off-the-shelf mobile phones in AR applications is on the rise. However, due to the limited processing and storage capabilities of mobile phones, some researchers use a client-server architecture to improve real-time performance. Hakkarainen et al. (2008) reported a study on the use of mobile phones for an AR-assisted assembly guidance system.
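The client-server split described above can be sketched as follows. This is an illustrative stand-in only: the function names, message format and stubbed renderer are hypothetical, and a real system would use sockets and image compression rather than an in-process call.

```python
# Illustrative sketch (not from the cited systems): the phone tracks the
# camera pose and displays frames, while a PC renders the heavy CAD model.
import json, zlib

def server_render(request: bytes) -> bytes:
    """PC side: decode the phone's camera pose, 'render' the CAD model
    (stubbed as a tiny dummy image), and return a compressed frame."""
    pose = json.loads(request.decode())
    frame = {"pose": pose, "pixels": [0] * 16}   # stand-in for an expensive render
    return zlib.compress(json.dumps(frame).encode())

def client_request_frame(pose):
    """Phone side: ship the tracked pose, then decompress the rendered frame
    for local compositing (compositing itself is omitted here)."""
    reply = server_render(json.dumps(pose).encode())  # stands in for a socket round-trip
    return json.loads(zlib.decompress(reply).decode())

frame = client_request_frame({"x": 0.1, "y": 0.2, "z": 1.5, "yaw": 30.0})
print(frame["pose"]["yaw"])   # the server rendered at the requested pose
```

The design point is simply that only a small pose message travels upstream and a compressed frame travels downstream, which is what makes the limited phone hardware viable.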

DOI: 10.3182/20130619-3-RU-3018.00637

A client-server architecture was designed in which the mobile phone acts as the display while a PC computes the CAD models, and statically rendered images are sent to the mobile phone for fast rendering. A similar set-up using an Android mobile phone was reported by Ha et al. Several AR applications use force feedback to enhance the immersive sensation of the user. Wearable datagloves have been studied in applications such as assembly and design (Valentini, 2009). Haptic devices for the path planning of a virtual robot have been reported by Chen et al. (2010).

2.2 Software systems

The most important elements in any AR application are tracking and registration. Precise tracking and registration allow the virtual and real objects to be aligned accurately. Current tracking and registration algorithms can be classified into marker-based, natural-feature-based and model-based.

Fiducial markers are popular as their unique geometric patterns allow easy detection and identification in a video stream. Marker-based tracking is suitable for a prepared environment where markers are placed a priori. ARToolKit (ARToolKit 2.11, 2011, Kato and Billinghurst, 1999) is the most well-known tracking library.

For an unprepared environment, natural feature tracking is necessary. Current natural feature tracking is based on robust point-matching approaches. Various feature descriptors, such as Binary Robust Independent Elementary Features (BRIEF), Speeded-Up Robust Features (SURF), ferns features and Scale-Invariant Feature Transform (SIFT) features, have been explored for natural feature detection. For estimating the camera pose in an unknown scene, Parallel Tracking and Mapping (PTAM) (Klein and Murray, 2007) is often used. By processing tracking and mapping in parallel threads and using keyframe-based mapping, a map of an unknown environment can be constructed from prominent features, allowing virtual objects to be registered onto the real world. Model-based tracking matches detected features against a list of pre-created models. A model-based tracking library, the Open Tracking Library (OpenTL) (Panin et al., 2008), provides good APIs and can handle multiple-object tracking. OpenTL uses multi-threading and GPU-based computing to achieve real-time performance. However, it was developed not specifically for AR applications but for general-purpose model-based object tracking.

Research on the manufacturing applications of VR and AR is a strong and growing area, and it has progressed much in the last ten years owing to advances in both hardware and software. Hardware has become considerably smaller and more powerful, while many efficient and robust algorithms have been developed to allow faster response and improved accuracy in tracking and registration.

3. MAJOR VR AND AR RESEARCH IN MANUFACTURING

Many researchers in the manufacturing industries, academic institutes and universities have started exploring the use of AR technology to address complex problems in manufacturing. Effective simulation before an actual operation ensures that it can be carried out right-the-first-time, eliminating many trials and re-works and saving materials, energy and labour. VR applications have been well reported in virtual prototyping, web-based virtual machining, assembly, fault diagnosis and learning, and various types of manufacturing operations. Advances in computer and manufacturing technologies have provided suitable interfaces that allow users to interact directly with the manufacturing information associated with manufacturing processes. AR can provide users with an intuitive way to interact directly with this information. It also allows operators to use their natural spatial processing abilities to obtain a sense of presence in the real world with virtual information.

3.1 VR and AR research in design

VR has been used in product design as it provides very intuitive interaction for designers in terms of visualization and interfacing with downstream processes. A hybrid immersive modelling environment merging desktop CAD was created by Stark et al. (2010), Wiese et al. (2009) and Israel et al. (2009). They noted that the current modelling media, paper and CAD systems, are complementary but lack interaction. Digital media offers great freedom in exploring different dimensions and features, using stored forms and shapes from a library, and the advantage of integrating a product model with its associated physical properties. In addition, some downstream processes, such as process planning, machining and inspection, can be fully integrated. Fig. 1 shows the transformation of a sketch of a ceiling lamp to a final product using rapid prototyping techniques.

Fig. 1. Example of a ceiling lamp from sketch (left) to the refined model (middle) and the final prototype (right) (Stark et al., 2010).

AR is becoming a major part of the prototyping process in product design in many industries. In the automotive industry, for example, AR has been used for assessing interior design by overlaying different car interior mock-ups, which are usually available only as 3D models in the initial phases of development, on real car bodies (Fründ et al., 2005). However, few systems can support product creation and modification in AR using 2D or 3D interaction tools.
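The overlay step itself, compositing a rendered mock-up onto the camera image once tracking has aligned the two, can be sketched as a per-pixel alpha blend. A minimal illustration (the function and toy frames are assumptions, not taken from the cited systems):

```python
# Illustrative sketch: blending a rendered virtual layer over a camera frame.
import numpy as np

def composite(camera_frame, virtual_layer, alpha_mask):
    """Blend a rendered RGB layer over a camera RGB frame.
    alpha_mask is 1.0 where the virtual object covers the scene, 0.0 elsewhere."""
    a = alpha_mask[..., None]          # broadcast alpha over the colour channels
    return (a * virtual_layer + (1.0 - a) * camera_frame).astype(camera_frame.dtype)

# 2x2 toy frame: the real scene is mid-grey, the virtual mock-up is pure red
frame = np.full((2, 2, 3), 128, dtype=np.uint8)
layer = np.zeros((2, 2, 3), dtype=np.uint8); layer[..., 0] = 255
mask = np.array([[1.0, 0.0], [0.0, 0.0]])    # the mock-up covers one pixel
out = composite(frame, layer, mask)
print(out[0, 0], out[1, 1])   # covered pixel takes the layer colour, the rest is untouched
```

In a real AR pipeline the mask and layer come from rendering the CAD model at the tracked camera pose; only the blend itself is shown here.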

An AR-based mock-up system for design evaluation was presented by Park (2008). In this system, interactive modification of shapes as well as colors, textures and user interfaces can be carried out. Physical mock-ups of handheld media players were used and, with AR, a user is able to experiment with different features of the product, such as color and interfaces (e.g., the size of touch screens). A user could also decide on a design model to evaluate and construct a mock-up by assembling specific components. On the other hand, everyday objects were used by Ng et al. (2010) to facilitate interactive design. An AR computer-aided design environment (ARCADE) has been developed to facilitate interactive design by a layman (Ng et al., 2010).

3.2 VR and AR in robotics

Fig. 2. RPAR system architecture (Ong et al., 2010).

VR has been proven useful in medical robots for surgery (Burdea, 1996), tele-robotics (Freund and Rossmann, 2005), welding (Liu et al., 2010), the modeling of a six-DOF virtual robot arm (Chen et al., 2010), etc. Chen et al. (2010) proposed a new Human-Computer Interaction (HCI) method for VR-based robot path planning and virtual assembly systems. However, the main constraint in VR-based robot programming is the need to construct the entire Virtual Environment (VE), which requires full a priori knowledge of the workpieces and working area, and thus more computational resources. AR-based robotic systems offer users graphics, text and animation by augmenting illustrative and informative elements over the real scene via a video stream. An AR cueing method was reported by Nawab et al. (2007) and Chintamani et al. (2010) to assist users in navigating the end-effector (EE) of a real robot using two joysticks. The use of AR to address human-robot interaction and robot programming issues has been reported in several studies. Operators can program and guide the virtual model without having to interact physically with the real robot. Zaeh and Vogl (2006) introduced a laser-projection-based approach in which operators can manually edit and modify the planned paths projected over the real workpiece through an interactive stylus. Reinhart et al. (2008) adopted a similar human-robot interface in remote robot laser welding applications. Chong et al. (2009) and Ong et al. (2010) presented a methodology to plan a collision-free path by guiding a virtual robot using a probe attached to a planar marker, and developed the RPAR (Robot Programming using Augmented Reality) system.
The methodology is interactive, as the human is involved in obtaining the 3D data points of the desired curve through a number of demonstrations, in defining the free space relevant to the task, and in planning the orientations of the end-effector along the curve (Fig. 2).
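Turning a handful of demonstrated 3D points into a smooth end-effector path can be sketched as follows. The RPAR papers use cubic-spline interpolation; this stand-in uses a Catmull-Rom spline, a simple interpolating cubic, and the control points are made up for illustration.

```python
# Hedged sketch: smooth a short list of demonstrated 3D control points into a
# continuous path that passes through every point (Catmull-Rom spline).
import numpy as np

def catmull_rom(points, samples_per_segment=10):
    """Interpolate an (N, 3) list of control points into a dense path."""
    P = np.asarray(points, dtype=float)
    P = np.vstack([P[0], P, P[-1]])      # duplicate endpoints to span the full list
    path = []
    for i in range(len(P) - 3):
        p0, p1, p2, p3 = P[i], P[i + 1], P[i + 2], P[i + 3]
        for t in np.linspace(0.0, 1.0, samples_per_segment, endpoint=False):
            path.append(0.5 * ((2 * p1) + (-p0 + p2) * t
                               + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t * t
                               + (-p0 + 3 * p1 - 3 * p2 + p3) * t * t * t))
    path.append(P[-2])                    # close at the final control point
    return np.array(path)

pts = [(0, 0, 0), (1, 2, 0), (3, 3, 1), (4, 0, 1)]   # e.g. points picked inside a CFV
path = catmull_rom(pts)
print(path[0], path[-1])   # the path starts and ends at the demonstrated endpoints
```

Because the spline interpolates (rather than approximates) the control points, the robot is guaranteed to pass through every demonstrated waypoint.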

RPAR was further developed and enhanced into the RPAR-II system (Fang et al., 2009, Fang et al., 2012), which is shown in Fig. 3. It includes a SCORBOT-ER VII manipulator, gripper, robot controller, desktop PC, desktop-based display, stereo camera, and an interaction device attached to a marker-cube. The augmented environment consists of the physical entities that exist in the robot operation space and a virtual robot model, which includes a virtual end-effector to replicate the real robot.

Fig. 3. Physical layout of the experimental set-up (Fang et al., 2012).

In RPAR-II (Fig. 4), a collision-free path can be generated through human-virtual robot interaction in a real working environment, as illustrated in Fig. 5. Fig. 5(a) is the setup for a robotic task, which is to transfer an object from a start point to a goal point. With the start point and goal point known a priori, after generating a collision-free volume (CFV) in the workspace (Fig. 5(b)), the user proceeds to create a series of control points within the collision-free volume using the interaction device (Fig. 5(c)). Using these points as inputs, a cubic-spline interpolation is applied to generate a smooth path automatically (Fig. 5(d)).

3.3 Factory layout planning (FLP) systems

FLP refers to the design of the layout plans of the machines/equipment on a manufacturing shopfloor. A well-designed manufacturing layout plan can reduce operating cost by up to 50% (Xie and Sahinidis, 2008). Traditionally, FLP solutions are achieved by building scaled models, as it is easy to visualize the individual components and many planners can study the model at the same time.
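As context for the algorithmic FLP formulations discussed in this section, here is a toy sketch of the quadratic-assignment view of layout planning solved with simulated annealing. The flow/distance matrices, cooling schedule and problem size are invented for illustration; real instances are far larger.

```python
# Hedged toy sketch: FLP as a Quadratic Assignment Problem (QAP) solved by
# simulated annealing (SA). perm[i] is the location assigned to facility i.
import math, random

def qap_cost(perm, F, D):
    """Total cost: material flow between facilities i, j weighted by the
    distance between their assigned locations."""
    n = len(perm)
    return sum(F[i][j] * D[perm[i]][perm[j]] for i in range(n) for j in range(n))

def anneal(F, D, steps=20000, T0=10.0, seed=0):
    rng = random.Random(seed)
    n = len(F)
    perm = list(range(n))
    cost = qap_cost(perm, F, D)
    best = perm[:], cost
    for k in range(steps):
        T = T0 * (1.0 - k / steps) + 1e-9          # linear cooling schedule
        a, b = rng.sample(range(n), 2)             # propose swapping two facilities
        perm[a], perm[b] = perm[b], perm[a]
        new = qap_cost(perm, F, D)
        if new <= cost or rng.random() < math.exp((cost - new) / T):
            cost = new                              # accept (possibly uphill) move
            if cost < best[1]:
                best = perm[:], cost
        else:
            perm[a], perm[b] = perm[b], perm[a]    # reject: undo the swap
    return best

F = [[0, 5, 2], [5, 0, 3], [2, 3, 0]]   # made-up inter-facility flows
D = [[0, 1, 4], [1, 0, 2], [4, 2, 0]]   # made-up inter-location distances
layout, cost = anneal(F, D)
print(layout, cost)
```

On a three-facility toy instance SA trivially finds the optimum; the point of the heuristic is that, unlike exhaustive search, it scales to the combinatorial instances the text describes.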

Fig. 4. Architecture of the RPAR-II system (Fang et al., 2012).

Fig. 5. Geometric path planning in the RPAR-II system (Fang et al., 2012).

Algorithmic tools and mathematical formulations of FLP are often used; examples are the Quadratic Assignment Problem model and the Mixed Integer Programming model, together with efficient algorithms to solve these models, e.g., GA (genetic algorithms), SA (simulated annealing), etc. However, due to the combinatorial complexity of the FLP problem, it is almost impossible to find the best solution. More recently, VR simulation tools have been applied to generate solutions for FLP tasks. These systems are known as plant simulation or manufacturing software. These VR systems (Calderon and Cavazza, 2003, Iqbal and Hashmi, 2001, Zetu et al., 1998) have quite similar features, as they all provide visual on-line layout planning platforms to the users. The tedious design processes, however, are difficult to improve with these systems. Moreover, as the entire plant environment is simulated virtually, any deviation from reality could reduce the usefulness of the solutions. Other shortcomings include the time-consuming modelling of the entire factory floor space and all the facilities.

Several AR-based FLP systems have been reported. These systems allow users to place virtual objects in the real environment, allowing immediate visualisation of the layout, in an attempt to integrate human intuitiveness into the layout design process. However, as AR technology was not mature then, many reported systems remained at the conceptual design stage. Recently, AR tools and technologies have advanced considerably (Nee et al., 2012, Reinhart and Patron, 2003, Ong et al., 2007). Jiang and Nee (2013) presented an on-site planning and optimization method based on AR technology, in which information on the existing facilities is obtained in real-time to formulate the layout criteria and physical constraints (Fig. 6). The enhanced sense of reality can facilitate the full utilization of a user's experience, knowledge and intuition for identifying specific issues to be addressed and examined on-site. A system named AFLP (AR-based FLP) has been developed to implement the proposed methodology (Fig. 6 and Fig. 7).

Fig. 6. Using AR to facilitate FLP for existing shopfloors (Jiang and Nee, 2013).

Fig. 7. User interface of the AFLP system (Jiang and Nee, 2013).

An on-site modeling method has been developed to obtain the geometric data of existing facilities. These data, together with the data representing the facilities to be laid out, are utilized to define the layout criteria and constraints, which are evaluated in real-time for comparison purposes. In the augmented


shopfloor, users can manipulate the layout of new facilities intuitively until a good solution has been achieved. In addition, an optimization scheme is adopted to provide alternative layout plans.

3.4 Maintenance

Maintenance plays an important role in ensuring equipment performance and reducing downtime and disruption to production schedules. However, increasing equipment complexity has posed great challenges to maintenance personnel. Several aspects of maintenance can be supported with advanced information technologies (van Houten et al., 1998, Setchi and White, 2003, van Houten and Kimura, 2000). Augmented reality can be used to enhance maintenance activities (Feiner et al., 1993). AR provides a better approach to presenting maintenance information than paper- and computer-based manuals, and can improve the workflow of maintenance operations. AR applications in maintenance activities started in the early 1990s. The current research focus has shifted from demonstrating the benefits of applying AR to improving its usefulness in routine and ad hoc maintenance activities. However, no single AR system has yet proven to be well accepted by industry (Ong et al., 2008). The usefulness of AR systems in maintenance depends on several factors. Firstly, the maintenance information should be context-aware. A system is context-aware if it can collate, reason about, and use context information (Dey, 2001), and adapt its functionality to varying contexts (Byun and Cheverst, 2004). For example, the detail of maintenance instructions can vary according to the individual expertise of the technicians. Secondly, the maintenance information provided should be editable, and hence easily updated; this allows technicians to document and correct any erroneous maintenance information in the database. Thirdly, suitable collaboration tools should be provided to allow remote experts to create AR-based instructions to assist on-site technicians who may need assistance. Lastly, a bi-directional content creation tool should allow dynamic AR maintenance content to be created both offline and on-site.

Ong and Zhu (2013) developed ARAMS (Fig. 8), which consists of: (1) on-site authoring, for maintenance technicians to create, edit and update AR contents; (2) offline authoring, for maintenance experts to develop context-aware AR maintenance contents, together forming the bi-directional tool; (3) online authoring, for experts to create AR-based instructions during remote maintenance activities; (4) a database storing virtual and AR maintenance contents; (5) context management, which collects and reasons about maintenance contexts; (6) tracking and registration; and (7) AR-based visualization, for rendering the AR contents in the maintenance environment.

Fig. 8. ARAMS system architecture (Ong and Zhu, 2013).

A bi-directional maintenance content creation tool has been developed to build context-aware AR contents. The bi-directional process (Fig. 9) consists of two main steps: context modeling and information instance modeling. Intuitive user interfaces are provided: a desktop user interface, consisting of an authoring panel and the augmented virtual scene, is provided for offline authoring (OFA) (Fig. 10(a)), and a mobile user interface, consisting of a physical marker and a virtual panel, is provided for on-site authoring (OSA). The physical marker is tracked in 3D space and acts as a 2D cursor and/or a 3D placement tool. The virtual panel is a virtual display of computer-augmented information, e.g., virtual buttons. The user can place the 2D cursor on a virtual button for a predefined time range to activate it (Fig. 10(b)). The user can use the 3D placement tool to arrange the virtual objects spatially (Fig. 10(c)).

Fig. 9. Bi-directional authoring (Ong and Zhu, 2013).

3.5 CNC simulation

Several commercial 3D graphics-based CNC machining simulation systems are available, such as DELMIA Virtual NC (Delmia, 2009), EASY-ROB NC Simulation (Easy-Rob, 2009), hyperMILL by OPEN MIND (hyperMill, 2009), etc. The operators' ability to analyse machining information can be complemented using 3D graphics and instant access to databases. AR technology enables this by rendering virtual information onto a real machining environment, thus providing the users with a real world supplemented with rich information.
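As background for the machining-simulation systems reviewed in this section, material removal is commonly simulated on a discretized stock model (for example, the dexel boards used in the Weinert et al. system). A hedged heightfield sketch with a flat-bottom cutter follows; all names, dimensions and the grid resolution are made up for illustration.

```python
# Illustrative sketch (not the ARCNC implementation): the stock is a 2D array
# of column heights, and each NC-programmed cutter position clamps nearby
# columns down to the cutter's bottom height (flat-bottom approximation).
import numpy as np

def remove_material(stock, tool_path, tool_radius, cell=1.0):
    """stock: 2D array of column heights (mm). tool_path: (x, y, z) positions."""
    h, w = stock.shape
    ys, xs = np.mgrid[0:h, 0:w]
    for x, y, z in tool_path:
        inside = (xs * cell - x) ** 2 + (ys * cell - y) ** 2 <= tool_radius ** 2
        stock[inside] = np.minimum(stock[inside], z)   # cut, never add material
    return stock

stock = np.full((8, 8), 10.0)                 # 8x8 columns of 10 mm tall stock
path = [(x, 4.0, 6.0) for x in range(8)]      # one straight pass at z = 6 mm
remove_material(stock, path, tool_radius=1.5)
print(stock[4, 0], stock[0, 0])               # cut row lowered to 6.0, corners untouched
```

An AR system would render this evolving stock model registered over the real machine; only the geometric removal step is sketched here.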

human-machine interfaces, which include a Firewire CCD camera, and a high-end PC. Either a head-mounted display or a monitor can be used in the in situ system as the display device. To achieve in situ CNC simulation, the position of the cutter is registered in the machining coordinate system using a hybrid tracking method and constraints extracted from given NC codes. The system is designed to be used by novice machinists, who can use the system to alter NC codes and observe responses from the CNC machine without possible tool breakage and machine breakdowns. According to the simulation results, alarms can be rendered in the augmented display to notify the users of dangers and errors (Fig. 12).

Fig. 10. User interfaces (Ong and Zhu, 2013). In an AR-based machining simulation environment, the user retains the awareness of the real CNC machine, while the augmented 2D or 3D information, such as cutting parameters, CNC programs, etc. can enhance the user’s visual, aural and proprioceptive senses.

Fig. 11. Experimental setup of ARCNC (Zhang et al., 2008).

Many studies in applying AR technology to the informationintensive and time-consuming tasks in manufacturing have been conducted. Comparatively, fewer AR applications can be found in CNC machining. This is probably due to the fact that the processing procedures are machine-centric and fewer human factors are involved during this stage. The ASTOR system (Olwal et al., 2008, Olwal et al., 2005) applies a projection-based AR display mechanism to allow users to visualize machining data that is projected onto a real machining scene. The system first obtains the machining process data and the resultant cutting forces, and displays the information onto a holographic optical element window, which is installed on the sliding door of the lathe. Weinert et al. (2008] developed a CNC machining simulation system for 5-axis CNC machines. In the system, ARToolKit-based tracking was applied to track the movement of the cutter with respect to the machine table, and dexel boards were applied to model the workpiece. The simulation module in the system can estimate the cutting forces and predict collision between the head stock and the workpiece.

Fig. 12. System architecture of AR-assisted CNC simulation system (Zhang et al., 2008). In the ARCNC system, machining simulation is performed between a real cutter and a virtual workpiece and displayed to the users using a video see-through scene rendering mechanism (Fig. 13). Simulations of material removal processes are displayed to the user to assist in the inspection and evaluation of the machining processes before performing real machining, thus reducing material wastage and power consumption. In addition, the user can inspect the physical aspects of the machining processes based on the estimated machining conditions, which are augmented onto the scene, e.g., machining forces, etc. The application of the video see-

An AR-assisted in situ CNC machining simulation system, namely, the ARCNC system, has been developed (Zhang et al., 2008, Zhang et al., 2010a, Zhang et al., 2010b) for machining operations on a 3-axis CNC machine. The system setup and architecture are shown respectively in Fig. 11 and Fig. 12. This AR-assisted in situ CNC simulation system consists of three main units, viz., a 3-axis vertical CNC machine in this research, a display device, and the AR-assisted 20

2013 IFAC MIM June 19-21, 2013. Saint Petersburg, Russia

Virtual Reality (VR) technology plays a vital role in simulating advanced 3D human-computer interactions (HCI), especially for mechanical assemblies, by allowing users to be completely immersed in a synthetic environment. Many VR systems have been proposed successfully to assist assembly activities, e.g., CAVE (Cruz-Neira et al., 1992, 1993); IVY (Inventor Virtual Assembly) (Kuehne and Oliver, 1995); Vshop (Pere et al.,, 1996); VADE (Virtual Assembly Design Environment) (Jayaram et al.,, 1997, 1999, 2000a, 2000b, Taylor et al., 2000), HIDRA (Haptic Integrated Dis/Reassembly Analysis) (Coutee et al., 2001, Coutee and Bras, 2002), SHARP (Seth et al.,, 2005, 2006). However, there are limitations in VR assembly systems. A major limitation is the need to create a fully immersed VR experience, which may not be highly convincing as it is not easy to fully and accurately model the actual working environments which are critical to the manufacturing process. Although there are advanced approaches to accelerate the computation process (e.g., GPU acceleration: http://www.nvidia.com), the realtime capability is still a challenge as a result of the extensive computation in VR.

through technology in the proposed system allows different users to focus on different information and tasks, which can be useful during training when several trainees are involved.

Fig. 13. An in situ CNC machining simulation system (Zhang et al., 2010b). During an in situ simulation, a virtual cutter is registered with the real cutter in near real time. A virtual workpiece is either rendered onto a worktable or aligned with a fixture on the worktable. Simulation of the machining process can be achieved according to the movements of the virtual cutter and the workpiece (which moves together with the worktable). Both geometric and physical simulations can be performed and displayed. To the operator, it would look like a real cutter machining a virtual workpiece. The operator can interactively observe the simulation as it proceeds, with NC codes, cutter coordinates, and estimated physical cutting conditions provided on a virtual interaction panel (Yuan et al., 2004). Feedback from the operator to the machine tool can be included in the architecture. When certain values in the physical simulation, e.g., cutting forces exceeding certain limits, an alarm can be displayed to the operator, who can respond accordingly, such as pressing an emergency button on the virtual panel to stop the machine tool.

AR technology can overcome some of the limitations described as it does not need the entire real world to be modelled (Ong et al., 2008), thus reducing the high cost of full immersive VR environments, in terms of both preparation and computation time. More importantly, AR enhances the interaction between the systems and the users by allowing them to manipulate the objects naturally. Therefore, AR technology has emerged as one of the most promising approaches to facilitate mechanical assembly processes, which complexity can be enormous. In the past two decades, AR has proven its ability by integrating various modalities in real time into the real assembly environment. With an AR-based environment, an intuitive way to interact directly with product design and manufacturing information is provided to the users, allowing them to use natural spatial processing abilities to obtain a sense of presence in the real assembly workspace with both real and virtual objects. Researchers in the manufacturing industries, academic institutes and universities all around the world have been exploring the use of AR technology in addressing some complex problems in mechanical assembly.

The AR-assisted in situ simulation system may perform better than 3D graphics-based simulation systems in several aspects. First of all, the cutting simulation is presented to the operator with a heightened sense of reality, and he can operate the CNC machine and observe the simulation simultaneously. The system can be used with any CNC machine that the operator is familiar with or is trained for. The selection of a machine tool will not affect the simulation procedures as long as initialization is conducted accordingly. For example, the parameters applied in a physical simulation may be different from machine to machine. Thus, calibrations should be performed first and stored in the physical model module. Scene rendering time and effort in this AR-assisted system is reduced as compared to the graphic-based simulation systems, since only a few virtual objects are modelled and updated geometrically and spatially. Furthermore, the movements of the cutter and the worktable are obtained from vision-based tracking and registration. Hence, the simulation can reflect the real dynamic tool movements, rather than an ideal model of the machine in 3D graphic-based simulation systems.

3.6 Assembly operations

Ong and Wang (2011) presented an AA system which can interpret a user’s manual assembly intent, support on-line constraint recognition, and provide a robust 3D bare-hand interaction interface giving visual feedback during assembly operations (Fig. 14). A 3D natural bare-hand interaction (3DNBHI) method has been developed to implement a dual-hand AA interface for users to manipulate and orientate components, tools and sub-assemblies simultaneously. This allows close replication of real-world interactions in an AE, making the user feel that he is assembling the real product, so that the AA process becomes more realistic and almost comparable to the real process. A tri-layer assembly data structure (TADS) is used for assembly data management in this system. An interactive constraint-based AA method using bare hands has been developed to increase the interaction between the user and the virtual components, realising the active participation of the users in the AA process. Figure 15 shows the architecture of a bare-hand interaction augmented assembly (BHAA) system.

Fig. 14. BHAA system setup (Ong and Wang, 2011).

Fig. 15. Architecture of BHAA system (Ong and Wang, 2011).

The exploded view of a pulley bracket shown in Fig. 16 is used as a case study. In Fig. 16(a), the user first grasps the pulley with his right hand and the left bush with his left hand and assembles the two parts together. If the parts collide, the system analyses the surface information in the contact list and detects any possible constraints. In this case, a cylindrical fit constraint is recognised. Next, the position and orientation of the pulley in the user’s right hand are adjusted automatically to ensure that the cylindrical fit constraint is met precisely, and the assembly operation is then completed. In Fig. 16(b), the user next assembles the right bracket with the base. After the two co-planar fit constraints have been recognised, it is realised that the right bracket can only be translated on the base. In Fig. 16(c), the user then proceeds to fasten the two bolts using a standard spanner to fix the right bracket onto the base. In the final step, shown in Fig. 16(d), the user assembles the left bracket onto the base. After a coplanar-fit constraint has been recognized, the motion of the right bracket is constrained in the planar surface of the base.

Fig. 16. The AA processes of a pulley bracket (Ong and Wang, 2011).

This case study, although quite simple, demonstrates the use of bare hands for assembly simulation. It has good potential for training a novice worker to assemble parts together: after several training sessions, the worker can operate without further AR guidance, and the initial learning curve can be shortened considerably. Work is underway to implement tactile feedback for detecting interference fits and mismatched parts. The assembly simulation can also provide feedback to the design-for-assembly approach, and incorporate ergonomic principles for reducing worker fatigue.
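The automatic pose adjustment in the case study — snapping a grasped part so that a recognised cylindrical fit is met precisely — can be sketched as an axis-alignment problem: rotate the part's cylinder axis onto the hole's axis, then translate a point on the part axis onto the hole axis. The helper below is a hypothetical illustration of that snap (Rodrigues' rotation formula), not the actual TADS/BHAA code; all function names are our own.

```python
import numpy as np

def rotation_between(a, b):
    """Smallest rotation matrix taking unit vector a onto unit vector b
    (Rodrigues' formula)."""
    a = np.asarray(a, float) / np.linalg.norm(a)
    b = np.asarray(b, float) / np.linalg.norm(b)
    v = np.cross(a, b)
    c = float(np.dot(a, b))
    if np.isclose(c, -1.0):
        # Opposite vectors: rotate 180 degrees about any axis perpendicular to a
        p = np.array([1.0, 0.0, 0.0])
        if abs(a[0]) > 0.9:
            p = np.array([0.0, 1.0, 0.0])
        v = np.cross(a, p)
        v /= np.linalg.norm(v)
        return 2.0 * np.outer(v, v) - np.eye(3)
    vx = np.array([[0.0, -v[2], v[1]],
                   [v[2], 0.0, -v[0]],
                   [-v[1], v[0], 0.0]])
    return np.eye(3) + vx + vx @ vx / (1.0 + c)

def snap_cylindrical_fit(part_axis_dir, part_axis_pt, hole_axis_dir, hole_axis_pt):
    """Return (R, t) that snaps the grasped part onto the hole: R aligns
    the cylinder axis directions, t then moves a point on the part axis
    onto the hole axis."""
    R = rotation_between(part_axis_dir, hole_axis_dir)
    t = np.asarray(hole_axis_pt, float) - R @ np.asarray(part_axis_pt, float)
    return R, t

# Pulley held at an angle; hole axis is vertical through the origin
R, t = snap_cylindrical_fit([1.0, 0.0, 1.0], [5.0, 0.0, 0.0],
                            [0.0, 0.0, 1.0], [0.0, 0.0, 0.0])
aligned = R @ (np.array([1.0, 0.0, 1.0]) / np.sqrt(2.0))
```

Note that a cylindrical fit still leaves two degrees of freedom — rotation about the shared axis and translation along it — which is consistent with the constrained motion the user retains after the snap.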

4. TECHNICAL ISSUES AND CHALLENGES IN AR

4.1 Tracking accuracy

AR applications in manufacturing, such as robot path planning and CNC machining simulation, require a high level of tracking accuracy. A combination of computer-vision (CV), inertial and hybrid tracking techniques will be required, since CV-based tracking alone cannot handle high-frequency motion and rapid camera movements well. Hybrid systems using laser, RFID and other types of sensing devices will need to be considered.

4.2 Registration of virtual information

One of the basic issues in AR is the placing of virtual objects with the correct pose in an augmented space. This is also referred to as registration, which is a difficult and much-researched topic. As different tracking methodologies possess their own inherent deficiencies and error sources, it is necessary to identify the best tracking method for a particular application, which could be subject to poor lighting conditions, moving objects, etc.

The first type of registration errors is the static errors, which arise from inaccuracies in the sensory devices, misalignments between sensors, and/or incorrect registration algorithms (Dong and Kamat, 2010). These errors can be eliminated relatively easily, as higher-accuracy sensors are available and sensor alignments can be set up accurately.

The second type is the dynamic errors, which are less predictable and can be due to latency problems between data streams caused by off-host delay, synchronization delays and computational delays (Dong and Kamat, 2010). Researchers have been working on methods to resolve these latency issues; solutions include multi-threading programming and scheduling system latency (Jacobs et al., 1997), and predicting the camera motion using a Kalman filter (Lieberknecht et al., 2009).

4.3 Latency issues

AR displays require an extremely low latency to maintain the virtual objects in a stable position (Pasman et al., 1999). An important source of alignment errors is the difference in time between the moment an observer moves and the moment the image corresponding to the observer's new position is displayed. This time difference is called the end-to-end latency; it matters because head rotations can be very fast and cause significant changes to the scene being observed. It has been suggested (Padmos and Milders, 1992) that the displacement of objects between two frames should not exceed 0.25 degree; in terms of latency, this translates to 5 ms when an observer rotates his head at a speed of 50 degrees per second. Pasman et al. (1999) described a method to meet this requirement, combining several levels of position and orientation tracking with varied relative and absolute accuracies, together with different levels of rendering that reduce the 3D data to relatively simple scenes so that they can be rendered in a shorter period of time.

4.4 AR interfacing technology

Four essential elements are needed to set up an AR environment (Kim and Dey, 2010), namely target places, AR contents, a tracking module and the display system. Kim and Dey (2010) reported a comprehensive review of AR prototyping trends and methods. They addressed three features for creating an AR environment that are essential for end-user interaction, viz. intuitive observation, informative visualization and immersive interaction, and for the development of Interactive Augmented Prototyping (IAP). These three features are further used to integrate AR technology and develop custom-built 3D simulations.

3D interfaces and wearable computing devices are popular areas of AR research on interfacing technologies. Poupyrev et al. (2002) divided the AR interface design space along two orthogonal approaches, viz. 3D AR interfaces and tangible interfaces. In 3D AR interfaces, users interact with virtual contents via HMDs and monitor-based displays, which are not the tools with which they would interact with the real world. In tangible interfaces, users employ traditional tools in the same way as they manipulate physical objects.

5. CONCLUSIONS

Augmented reality is finding new applications almost daily. Its ability to provide high user intuition and its relative ease of implementation have allowed it to outperform VR, which was one of the most notable technologies of the late 1990s. A proliferation of AR applications can be found on handheld devices and smart phones. Moving from marker-based to markerless registration and tracking, mobile and outdoor AR is rapidly gaining popularity. AR application in manufacturing operations is relatively new compared with social and entertainment applications, due largely to the more stringent requirements in tracking and registration accuracy and the need for good alignment with traditional practices. Unlike playing an AR game, which one can quit at any time and restart at will, engineering users are likely to spend a considerable amount of time using the system in their jobs, and this is where ergonomics, human factors and the cognitive strain on the users must be well recognised and taken care of.

This paper presents some of the applications of AR which are relevant to the manufacturing community, although most of them are still at the experimental stage. The paper emphasizes the importance of designing and providing intuitive and effective human interfaces, as well as suitable content development, in order to make AR a powerful tool in the manufacturing engineering field.

REFERENCES

ARTool 2.11 (Last accessed on June 29 2011).

Azuma, R., Baillot, Y., Behringer, R., Feiner, S., Julier, S., and B. MacIntyre (2001). “Recent advances in augmented reality”, IEEE Computer Graphics and Applications, vol. 21, no. 6, pp. 34-47.

Burdea, G.C. (1996). “Virtual reality and robotics in medicine”, Proceedings of the IEEE International Workshop on Robot and Human Communication, Tsukuba, Japan, 11-14 November 1996, pp. 16-25.

Byun, H.E. and K. Cheverst (2004). “Utilizing context history to provide dynamic adaptations”, Applied Artificial Intelligence, vol. 18, no. 6, pp. 533-548.

Calderon, C. and M. Cavazza (2003). “A New Approach to Virtual Design for Spatial Configuration Problems”, Proceedings of the Seventh International Conference on Information Visualization (IV ’03), pp. 518-523.



Chen, C.J., Ong, S.K., Nee, A.Y.C., and Y.Q. Zhou (2010). “Haptic-based Interactive Path Planning for a Virtual Robot Arm”, International Journal of Interactive Design and Manufacturing, vol. 4, no. 2, pp. 113-123.

Chintamani, K., Cao, A., Ellis, R.D., and A.K. Pandya (2010). “Improved tele-manipulator navigation during display-control misalignments using Augmented Reality cues”, IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, vol. 40, no. 1, pp. 29-39.

Chong, J.W.S., Nee, A.Y.C., Ong, S.K., and K. Youcef-Toumi (2009). “Robot Programming using Augmented Reality: an Interactive Method for Planning Collision-Free Paths”, International Journal of Robotics and Computer-Integrated Manufacturing, vol. 25, no. 3, pp. 689-701.

Coutee, A.S. and B. Bras (2002). “Collision detection for virtual objects in a haptic assembly and disassembly simulation environment”, In: Proceedings of the ASME design engineering technical conferences and computers and information in engineering conference, Montreal, Quebec, Canada, 2002, pp. 11-20.

Coutee, A.S., McDermott, S.D., and B. Bras (2001). “A haptic assembly and disassembly simulation environment and associated computational load optimization techniques”, Journal of Computing and Information Science in Engineering, vol. 1, no. 2, pp. 113-122.

Cruz-Neira, C., Sandin, D.J., DeFanti, T.A., Kenyon, R.V., and J.C. Hart (1992). “The CAVE: audio visual experience automatic virtual environment”, Communications of the ACM, vol. 35, no. 6, pp. 64-72.

Cruz-Neira, C., Sandin, D.J., and T.A. DeFanti (1993). “Surround-screen projection-based virtual reality: the design and implementation of the CAVE”, In: Proceedings of the 20th annual conference on Computer graphics and interactive techniques, Anaheim, CA, USA, 1993, pp. 135-142.

Delmia (2009). DELMIA Virtual NC, http://www.delmia.com. (Last accessed 3 June 2009).

Dey, A.K. (2001). “Understanding and using context”, Personal and Ubiquitous Computing, vol. 5, no. 1, pp. 4-7.

Dong, S. and V.R. Kamat (2010). “Robust mobile computing framework for visualization of simulated processes in augmented reality”, Proceedings of the 2010 Winter Simulation Conference (WSC ’10), pp. 3111-3122.

Easy-rob (2009). EASY-ROB NC Simulation, http://www.easy-rob.com/en/product/apis-additionaloptions/nc-simulation.html. (Last accessed 3 June 2009).

Fang, H.C., Ong, S.K., and A.Y.C. Nee (2009). “Robot Programming using Augmented Reality”, Proceedings of the International Conference on Cyberworlds, 7-11 September 2009, University of Bradford, UK, pp. 13-20.

Fang, H.C., Ong, S.K., and A.Y.C. Nee (2012). “Interactive Robot Trajectory Planning and Simulation using Augmented Reality”, Robotics and Computer-Integrated Manufacturing, vol. 28, no. 2, pp. 227-237.

Feiner, S., Macintyre, B., and D. Seligmann (1993). “Knowledge-based Augmented Reality”, Communications of the ACM, vol. 36, pp. 53-62.

Freund, E. and J. Rossmann (2005). “Projective virtual reality as a basis for on-line control of complex systems - not only - over the Internet”, Journal of Robotic Systems, vol. 22, no. 3, pp. 147-155.

Fründ, J., Gausemeier, J., Matysczok, C., and R. Radkowski (2005). “Using Augmented Reality Technology to Support Automobile Development”, Lecture Notes in Computer Science, vol. 3168, pp. 289-298.

Ha, J., Cho, K., Rojas, F.A., and H.S. Yang (2011). “Real-Time Scalable Recognition and Tracking based on the Server-Client Model for Mobile Augmented Reality”, Proceedings of the IEEE 1st International Symposium on Virtual Reality Innovation 2011 (ISVRI 2011), March 19-20, 2011, Singapore, pp. 267-272.

Hakkarainen, M., Woodward, C., and M. Billinghurst (2008). “Augmented Assembly using a Mobile Phone”, Proceedings of the 7th IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2008), September 16-18, 2008, Cambridge, pp. 167-168.

HyperMILL by OPEN MIND (2009). http://www.openmindtech.com/zv. (Last accessed 3 June 2009).

Iqbal, M. and M.S.J. Hashmi (2001). “Design and Analysis of a Virtual Factory Layout”, Journal of Materials Processing Technology, vol. 118, pp. 403-410.

Israel, J.H., Wiese, E., Mateescu, M., Zollner, C., and R. Stark (2009). “Investigating three-dimensional sketching for early conceptual design – Results from expert discussion and user studies”, Computers and Graphics, vol. 33, pp. 462-473.

Jacobs, M.C., Livingston, M.A., and A. State (1997). “Managing Latency in Complex Augmented Reality Systems”, Proceedings of the 1997 Symposium on Interactive 3D Graphics, pp. 49-54.

Jayaram, S., Connacher, H.I., and K.W. Lyons (1997). “Virtual assembly using virtual reality techniques”, Computer-Aided Design, vol. 29, no. 8, pp. 575-584.

Jayaram, S., Jayaram, U., Wang, Y., Tirumali, H., Lyons, K., and P. Hart (1999). “VADE: a virtual assembly design environment”, IEEE Computer Graphics and Applications, vol. 19, no. 6, pp. 44-50.

Jayaram, S., Jayaram, U., Wang, Y., and K. Lyons (2000a). “CORBA-based Collaboration in a Virtual Assembly Design Environment”, In: Proceedings of the ASME design engineering technical conferences and computers and information in engineering conference, Baltimore, MD, USA, 2000.

Jayaram, U., Tirumali, H., and S. Jayaram (2000b). “A tool/part/human interaction model for assembly in virtual environments”, In: Proceedings of the ASME design engineering technical conferences, Baltimore, MD, USA, 2000.

Jiang, S. and A.Y.C. Nee (2013). “A novel facility layout planning and optimization methodology”, CIRP Annals – Manufacturing Technology, vol. 62, no. 1 (to appear).

Kato, H. and M. Billinghurst (1999). “Marker tracking and HMD calibration for a video-based augmented reality conferencing system”, Proceedings of the 2nd IEEE and ACM International Workshop on Augmented Reality, CA, USA, pp. 85-94.

Kim, S. and A.K. Dey (2010). “AR interfacing with prototype 3D applications based on user-centered interactivity”, Computer-Aided Design, vol. 42, no. 5, pp. 373-386.

Klein, G. and D. Murray (2007). “Parallel Tracking and Mapping for Small AR Workspaces”, Proceedings of the 6th International Symposium on Mixed and Augmented Reality (ISMAR ’07), Nara, pp. 250-259.

Kuehne, R. and J. Oliver (1995). “A virtual environment for interactive assembly planning and evaluation”, In: Proceedings of the ASME design automation conference, Boston, MA, USA, 1995.

Liang, J., Shaw, C., and M. Green (1991). “On temporal-spatial realism in the virtual reality environment”, Proceedings of the 1991 Symposium on User Interface Software and Technology, Hilton Head, South Carolina, ACM, New York, pp. 19-25.

Liu, Z., Bu, W., and J. Tan (2010). “Motion navigation for arc welding robots based on feature mapping in a simulation environment”, Robotics and Computer-Integrated Manufacturing, vol. 26, no. 2, pp. 137-144.

Nee, A.Y.C., Ong, S.K., Chryssolouris, G., and D. Mourtzis (2012). “Augmented Reality Applications in Design and Manufacturing”, CIRP Annals – Manufacturing Technology, vol. 61, no. 2, pp. 657-679.

Ng, L.X., Ong, S.K., and A.Y.C. Nee (2010). “ARCADE: A Simple and Fast Augmented Reality Computer-Aided Design Environment Using Everyday Objects”, Proceedings of the IADIS Interfaces and Human Computer Interaction 2010 Conference (IHCI 2010), pp. 227-234.

Olwal, A., Lindfors, C., Gustafsson, J., Kjellberg, T., and L. Mattson (2005). “ASTOR: An autostereoscopic optical see-through augmented reality system”, Proceedings of the IEEE and ACM International Symposium on Mixed and Augmented Reality, pp. 24-27.

Olwal, A., Gustafsson, J., and C. Lindfors (2008). “Spatial augmented reality on industrial CNC machines”, Proceedings of the International Conference on The Engineering Reality of Virtual Reality 2008, vol. 6804, January 27-31, 2008, San José, California, 680409:1-9.

Ong, S.K., Chong, J.W.S., and A.Y.C. Nee (2010). “A Novel AR-Based Robot Programming and Path Planning Methodology”, Robotics and Computer-Integrated Manufacturing, vol. 26, no. 3, pp. 240-249.

Ong, S.K. and A.Y.C. Nee (2004). Virtual and Augmented Reality Applications in Manufacturing, Springer, London, ISBN 1-85233-796-6.

Ong, S.K., Pang, Y., and A.Y.C. Nee (2007). “Augmented Reality Aided Assembly Design and Planning”, CIRP Annals – Manufacturing Technology, vol. 56, no. 1, pp. 49-52.

Ong, S.K. and Y. Shen (2009). “A Mixed Reality Environment for Collaborative Product Design and Development”, CIRP Annals – Manufacturing Technology, vol. 58, no. 1, pp. 139-142.

Ong, S.K. and Z.B. Wang (2011). “Augmented assembly technologies based on 3D bare-hand interaction”, CIRP Annals – Manufacturing Technology, vol. 60, no. 1, pp. 1-4.

Ong, S.K., Yuan, M.L., and A.Y.C. Nee (2008). “Augmented reality applications in manufacturing: a survey”, International Journal of Production Research, vol. 46, no. 10, pp. 2707-2742.

Ong, S.K. and J. Zhu (2013). “A novel maintenance system for equipment serviceability improvement”, CIRP Annals – Manufacturing Technology, vol. 62, no. 1 (to appear).

Padmos, P. and M.V. Milders (1992). “Quality criteria for simulator images: A literature review”, Human Factors, vol. 34, no. 6, pp. 727-748.

Panin, G., Lenz, C., Nair, S., Roth, E., Wojtczyk, M., Friedlhuber, T., and A. Knol (2008). “A Unifying Software Architecture for Model-based Visual Tracking”, In: Proceedings of the 20th Annual Symposium of Electronic Imaging, San Jose, CA, 6813:03-17.

Park, J. (2008). “Augmented Reality Based Re-formable Mock-Up for Design Evaluation”, Proceedings of the 2008 International Symposium on Ubiquitous Virtual Reality, IEEE Computer Society, Washington, DC, USA, 2008, pp. 17-20.

Pasman, W., van der Schaaf, A., Lagendijk, R.L., and F.W. Jansen (1999). “Accurate overlaying for mobile augmented reality”, Computers & Graphics, vol. 23, no. 6, pp. 875-881.

Pere, E., Langrana, N., Gomez, D., and G. Burdea (1996). “Virtual mechanical assembly on a PC-based system”, In: Proceedings of the ASME design engineering technical conferences and computers and information in engineering conference (DETC1996/DFM-1306), Irvine, CA, USA, 1996.

Poupyrev, I., Tan, D.S., Billinghurst, M., Kato, H., Regenbrecht, H., and N. Tetsutani (2002). “Developing a generic augmented-reality interface”, IEEE Computer, vol. 35, no. 3, pp. 44-50.

Reinhart, G. and C. Patron (2003). “Integrating Augmented Reality in the Assembly Domain – Fundamentals, Benefits and Applications”, CIRP Annals – Manufacturing Technology, vol. 52, no. 1, pp. 5-8.

Reinhart, G., Munzert, U., and W. Vogl (2008). “A programming system for robot-based remote-laser-welding with conventional optics”, CIRP Annals – Manufacturing Technology, vol. 57, no. 1, pp. 37-40.

Setchi, R. and D. White (2003). “The development of a hypermedia maintenance manual for an advanced manufacturing company”, The International Journal of Advanced Manufacturing Technology, vol. 22, no. 5-6, pp. 456-464.

Seth, A., Su, H.J., and J.M. Vance (2005). “A desktop networked haptic VR interface for mechanical assembly”, In: ASME 2005 International Mechanical Engineering Congress & Exposition, Orlando, Florida, USA, 2005, pp. 173-180.

Seth, A., Su, H.J., and J.M. Vance (2006). “SHARP: A System for Haptic Assembly & Realistic Prototyping”, In: ASME 2006 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, Philadelphia, Pennsylvania, USA, 2006, pp. 905-912.

Stark, R., Israel, J.H., and T. Wöhler (2010). “Towards hybrid modeling environments – Merging desktop-CAD and virtual reality-technologies”, CIRP Annals – Manufacturing Technology, vol. 59, pp. 179-182.

Stutzman, B., Nilsen, D., Broderick, T., and J. Neubert (2009). “MARTI: Mobile Augmented Reality Tool for Industry”, Proceedings of the 2009 World Congress on Computer Science and Information Engineering (CSIE 2009), March 31 - April 2, 2009, Los Angeles, USA, pp. 425-429.

Taylor, F., Jayaram, S., and U. Jayaram (2000). “Functionality to facilitate assembly of heavy machines in a virtual environment”, In: Proceedings of the ASME design engineering technical conferences, Baltimore, MD, USA, 2000.

Valentini, P.P. (2009). “Interactive virtual assembling in augmented reality”, International Journal of Interactive Design and Manufacturing, vol. 3, no. 2, pp. 109-119.

van Houten, F.J.A.M., Tomiyama, T., and O.W. Salomons (1998). “Product modeling for model-based maintenance”, CIRP Annals – Manufacturing Technology, vol. 47, no. 1, pp. 123-128.

van Houten, F.J.A.M. and F. Kimura (2000). “The virtual maintenance system: A computer-based support tool for robust design, product monitoring, fault diagnosis and maintenance planning”, CIRP Annals – Manufacturing Technology, vol. 49, no. 1, pp. 91-94.

Weinert, K., Zabel, A., Ungemach, E., and S. Odendahl (2008). “Improved NC Path Validation and Manipulation with Augmented Reality Methods”, Production Engineering, vol. 2, no. 4, pp. 371-376.

Wiese, E., Israel, J.H., Zöllner, C., Pohlmeyer, A.E., and R. Stark (2009). “The potential of immersive 3D-sketching environments for design problem-solving”, Proceedings of the 13th International Conference on Human-Computer Interaction (HCI 2009), pp. 485-489.

Xie, W. and N.V. Sahinidis (2008). “A Branch-and-bound Algorithm for the Continuous Facility Layout Problem”, Computers and Chemical Engineering, vol. 32, no. 4, pp. 1016-1028.

Xin, M., Sharlin, E., and M.C. Sousa (2008). “Napkin Sketch – Handheld Mixed Reality 3D Sketching”, Proceedings of the 15th ACM Symposium on Virtual Reality Software and Technology (VRST 2008), October 27-29, 2008, Bordeaux, France, pp. 223-226.

Yuan, M.L., Ong, S.K., and A.Y.C. Nee (2004). “The virtual interaction panel: an easy control tool in augmented reality systems”, Computer Animation and Virtual Worlds, vol. 15, no. 3-4, pp. 425-432.

Yuan, M.L., Ong, S.K., and A.Y.C. Nee (2008a). “Augmented Reality Applications in Manufacturing: A Survey”, International Journal of Production Research, vol. 46, no. 10, pp. 2702-2742.

Yuan, M.L., Ong, S.K., and A.Y.C. Nee (2008b). “Augmented reality for assembly guidance using a virtual interactive tool”, International Journal of Production Research, vol. 46, no. 7, pp. 1745-1767.

Zaeh, M.F. and W. Vogl (2006). “Interactive laser-projection for programming industrial robots”, Proceedings of the International Symposium on Mixed and Augmented Reality, Santa Barbara, CA, 22-25 October 2006, pp. 125-128.

Zetu, D., Schneider, P., and P. Banerjee (1998). “Data Input Model for Virtual Reality-aided Factory Layout”, IIE Transactions, vol. 30, no. 7, pp. 597-620.

Zhang, J., Ong, S.K., and A.Y.C. Nee (2008). “AR-Assisted in situ Machining Simulation: Architecture and Implementation”, Proceedings of the ACM SIGGRAPH 7th International Conference on Virtual-Reality Continuum & its Applications in Industry, December 8-9, 2008, Singapore.

Zhang, J., Ong, S.K., and A.Y.C. Nee (2010a). “Development of an AR system achieving in situ machining simulation on a 3-axis CNC machine”, Computer Animation and Virtual Worlds, vol. 21, no. 2, pp. 103-115.

Zhang, J., Ong, S.K., and A.Y.C. Nee (2010b). “A Multi-Regional Computation Scheme in an AR-Assisted in situ CNC Simulation Environment”, Computer-Aided Design, vol. 42, no. 12, pp. 1167-1177.