Nuclear Engineering and Design 241 (2011) 2614–2623
In-cell maintenance by manipulator arm with 3D workspace information recreated by laser rangefinder

Akihiro Kitamura*, Koji Nakai, Takashi Namekawa, Masatoshi Watahiki

Nuclear Fuel Cycle Engineering Laboratories, Japan Atomic Energy Agency, 4-33 Muramatsu, Tokai-mura, Naka-gun, Ibaraki 319-1194, Japan
Article history: Received 6 January 2011; Received in revised form 27 April 2011; Accepted 27 April 2011
Abstract

We developed a remote control system for maintenance of in-cell type fuel fabrication equipment. The system display recreates three-dimensional information of the workspace from data obtained by a laser rangefinder and conventional cameras, and allows a manipulator arm to be operated remotely in several control modes. To evaluate the effectiveness and usefulness of the developed system, we implemented remote handling experiments using mock-up equipment. Performance was compared for remote operation conducted using several different display and operation modes. We confirmed that the system is able to maintain in-cell fuel fabrication equipment in each display and operation mode. The times required to complete the remote operations were collected and compared for each mode. It was observed that integration of 3D information from the laser rangefinder reduced operation time and reinforced visual information during remote operation.

© 2011 Elsevier B.V. All rights reserved.
1. Introduction

The Japan Atomic Energy Agency (JAEA) is developing plutonium and uranium mixed oxide (MOX) fuels containing minor actinides as a fuel for advanced fast breeder reactors (Funasaka and Itoh, 2007). In order to fabricate low-decontaminated transuranium (TRU) fuel at production scale, enhancements in remote operability and maintenance performance are required relative to conventional glovebox type processes to control high radiation exposure. Furthermore, high precision is required of this equipment. The Advanced Nuclear System Research and Development Directorate of JAEA proposed an in-cell module fuel fabrication equipment system to meet these criteria. The proposed repair and maintenance scheme of the system consists of three stages: (a) replace the out-of-order module in the main process cell, (b) decontaminate and disassemble the module in the maintenance cell, and (c) repair the module using the glovebox in the maintenance room (Funasaka et al., 2008). To confirm the feasibility of this concept, components of in-cell equipment, namely a pressing machine, pellet inspection equipment, and a powder analyzer, are introduced. Currently, a mock-up experiment is being implemented to confirm operations and efficiency during stage (a) of the repair and maintenance scheme (Namekawa et al., 2009).

There has been increased research and operational interest in the use of robotics for maintenance or decommissioning of structures
in toxic, hazardous, or radioactive environments. Several prototype systems have been developed and tested. For example, Tharr (2000) developed the Auto Blaster system for bridge maintenance, Lorenc et al. (2000) made the HydrCat for removing coatings from structures with applications in marine and non-marine industries, Ross et al. (2003) crafted crawling robots for tall buildings or tanks, and Schmitz (2003) introduced a large robotic paint stripping system for aircraft maintenance. Remote handling for maintenance of in-cell type equipment or decommissioning of old nuclear facilities is another challenging and difficult set of applications that involves direct contact tasks in hazardous and radioactive environments. Examples of existing robotic activities in such circumstances include the development of tele-operated manipulators at Oak Ridge National Laboratory (Kress et al., 1997) and the University of Tennessee (Hamel, 2006), aspects of the vitrification facility operation and dismantlement project conducted in West Valley, NY (Cain et al., 2005), experiments in remote glovebox dismantlement at JAEA (Kitamura et al., 2011), and the evolution of robotics at CEA (Detrich, 1998).

In real remote in-cell operation, a collision control strategy is indispensable due to the lack of depth information and blind spots within a limited-volume cell. In addition, robot malfunction or imprecision, and any potential for sudden changes of the environment, may be handled or mitigated through collision control strategies. Therefore, the acquisition of 3D information of the workspace is another important aspect to be resolved in this robotics application. Ordinarily, we use cameras as the primary visual information capture tool in remote operation, but recognition and full spatial characterization in real time
Fig. 1. Laser rangefinder (LRF), servo motor and their ranges and resolutions.
can be difficult or imprecise. At the same time, remotely operated systems tend to be large and complicated, requiring the simultaneous participation of multiple engineers and operators to set up and operate. In addition, revision and maintenance of the operational system are anticipated to be conducted with minimal impact, leading to requirements for plug-and-play type systems. Several schemes aimed at addressing these issues have been devised. For example, Dixon et al. (2001, 2002, 2003) developed a camera-based visual servo control system that compensates for unknown depth information (three-dimensional information from the two-dimensional image), adapts to unknown camera calibration parameters, and enables multiple uncalibrated cameras to be used to provide a larger field of view. Nonlinear Lyapunov-based techniques, in conjunction with results from projective geometry, were used to overcome the complex control issues and alleviate many of the restrictive assumptions that affect typical visual servo controlled robotic systems. Methods for reconstructing 3D data from range measurements have been intensively explored as well. Curless and Levoy (1996) used volumetric range image processing (VRIP), while Davis et al. (2002) used a volumetric diffusion method to generate surface information of the environment. Recently, Liu et al. (2008) and To et al. (2009) developed a robotic system for steel bridge maintenance. Their system consists of a 6-degree-of-freedom (DOF) industrial robot, a moving platform, a sensor package equipped with a laser rangefinder, cameras and a capacitive sensor network, and a high performance computer. They introduced an efficient sphere representation method for environment description, collision detection and distance queries.

Our development of a remote control system for maintenance of in-cell type fuel fabrication equipment integrates many of these strategies and techniques. The system display recreates three-dimensional information of the workspace from measured data obtained by the laser rangefinder, combined with conventional camera information, to allow a manipulator arm to be operated remotely in several control modes. The system also provides movable CAD drawings that track the real-time movement of the manipulator arm. In order to evaluate the effectiveness and usefulness of the developed system, we implemented remote handling experiments using mock-up equipment and compared the performance of remote operation conducted using different display and operation modes. In this paper, we describe the features of the system and report experimental results for the remote handling operation experiments. The intended and potential applications
of the developed system include remote in-cell manipulation, decommissioning of radioactive equipment, and hazardous material handling activities.

2. System configuration

Initial CAD drawings of equipment and/or the workspace environment may not accurately express the real circumstances for robotic operations, or provide flexibility to respond to changing job and equipment requirements. This includes changes in equipment shape with time (which can range from small to large), material and equipment movement within the environment, and other aspects of a real system. Therefore, generation of precise location and distance data is necessary to adjust the CAD drawings with real workspace information so that remote operation can be based on the CAD drawings. To capture the necessary real-time workspace information, we introduced a laser rangefinder for data acquisition. The 3D workspace data obtained can then be used to correct and update the position and posture of equipment and/or the workspace within the CAD system drawings.

We implemented a HOKUYO Rapid-URG scanning laser rangefinder (LRF) for the 3D scans of the workspace. The Hokuyo URG-04LX scanning LRF is a compact, affordable and accurate laser scanner designed for robotic applications (Kawata et al., 2005). It is able to report ranges from 20 mm to 4000 mm (1 mm resolution) with 0.36° angular resolution (Hokuyo, 2009). In order to obtain 3D workspace information, a servo motor (ROBOTIS) is used to rotate the LRF in the pitch direction, as shown in Fig. 1 (0.3° steps in pitch). The LRF and servo motor are connected to a computer that controls the LRF and displays the 3D scanned data. This process demands accurate measurement by the sensor; currently, an error of about 1 cm over a 5 m distance measurement is the best precision obtainable. We are continuing efforts to decrease this measurement error by mechanical modification of the LRF stand.

Rangefinder data acquired from a fixed position is transformed into a single global coordinate frame for the manipulator arm PA-10. The CAD drawings describing the three-dimensional positional and postural relationship between the PA-10 and other equipment are then adjusted by superimposing the components of the CAD drawings onto the rangefinder point data. The system then displays an adjusted CAD view of the workspace for remote operation. Once the CAD drawings are matched with the real measured 3D data, the operator is able to remotely operate the manipulator arm even in a blind workspace. The flow of the process is shown in Fig. 2. We also provide camera pictures as supplemental information using a visual capture tool.
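As a concrete illustration of the coordinate conversion described above, the sketch below converts one LRF sweep (range and in-plane beam angle) taken at a given servo pitch angle into Cartesian points, and then expresses them in the PA-10 base frame. This is a minimal sketch only: the geometric convention, the mounting transform values and the function names are assumptions for illustration, not the actual software of the developed system.

```python
import numpy as np

def scan_to_points(ranges_mm, yaw_deg, pitch_deg):
    """Convert one LRF sweep (taken at a fixed servo pitch) to 3D points.

    ranges_mm : (N,) ranges reported by the LRF in millimetres
    yaw_deg   : (N,) in-plane beam angles of the LRF (0.36 deg steps)
    pitch_deg : scalar servo pitch angle for this sweep (0.3 deg steps)
    Returns (N, 3) points in metres, expressed in the sensor frame.
    """
    r = np.asarray(ranges_mm, dtype=float) / 1000.0      # mm -> m
    yaw = np.radians(np.asarray(yaw_deg, dtype=float))
    pitch = np.radians(pitch_deg)
    # Point in the scan plane, tilted about the sensor's pitch axis.
    x = r * np.cos(yaw) * np.cos(pitch)
    y = r * np.sin(yaw)
    z = r * np.cos(yaw) * np.sin(pitch)
    return np.column_stack((x, y, z))

# Hypothetical fixed transform from the sensor frame to the PA-10 base frame.
R_base_sensor = np.eye(3)                  # assumed mounting orientation
t_base_sensor = np.array([0.5, 0.0, 1.2])  # assumed mounting position [m]

def to_manipulator_frame(points_sensor):
    """Express sensor-frame points in the PA-10 global coordinate frame."""
    return points_sensor @ R_base_sensor.T + t_base_sensor
```

Sweeping the servo over its pitch range and stacking the resulting point sets would yield the full 3D dotted representation of the workspace used by the display.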
Fig. 2. LRF measurement and CAD drawing adjustment by the developed system.
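The CAD adjustment summarized in Fig. 2 amounts to rigidly registering each CAD component to the measured point cloud. The paper does not detail the matching algorithm, so the following sketch uses a standard point-to-point ICP from the Open3D library purely as an illustration; the file names, sampling density, correspondence distance and initial pose are assumptions.

```python
import numpy as np
import open3d as o3d  # assumed registration library, for illustration only

# Measured workspace points from the LRF (hypothetical file name).
scan = o3d.io.read_point_cloud("workspace_scan.ply")

# CAD model of one equipment component, sampled into a point cloud.
cad_mesh = o3d.io.read_triangle_mesh("pellet_module.stl")  # hypothetical file
cad_points = cad_mesh.sample_points_uniformly(number_of_points=20000)

# Rough initial pose taken from the nominal CAD layout (assumed identity here).
T_init = np.eye(4)

# Point-to-point ICP refines the pose so that the CAD component
# superimposes onto the rangefinder point data.
result = o3d.pipelines.registration.registration_icp(
    cad_points, scan,
    max_correspondence_distance=0.02,  # 2 cm, roughly the stated sensor error scale
    init=T_init,
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint(),
)

# Apply the corrected pose to the CAD drawing used in the operator display.
cad_mesh.transform(result.transformation)
print("fitness:", result.fitness, "inlier RMSE:", result.inlier_rmse)
```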
In the system design phase, we also evaluated the ease of removal and installation of terminals. This was evaluated with respect to requirements of cost performance and ease of setting up the system, which resulted in the selection of commercially available terminals (allowing simple exchange of the terminals and modification/enhancement of the system displays). For the robot arm, we chose the Mitsubishi PA-10 as the slave manipulator. The Mitsubishi PA-10 is a 7-degree-of-freedom robot arm with an open control architecture and is manufactured by Mitsubishi Heavy Industries (Fig. 3). The four-layer control architecture is made up of the robot arm, servo controller, motion control card, and the upper control computer. It delivers
Fig. 3. The Mitsubishi PA-10 robot arm and its range of motion.
Fig. 4. Block diagram schematic of the developed system.
a high performance-to-price ratio and has been used in many settings, including academic and industrial fields. Components of the robot arm are required to have adequate radiation resistance to allow operation for an extended time in a high radiation environment (Shibanuma and Honda, 2001). The development of the AC servo motor, cables, electric devices and position sensor is crucial in this respect, and intensive research has been conducted (Itoh et al., 1997; Obara et al., 1996; Oka et al., 1997; Takeda et al., 2008). Modifications identified as necessary for real use in an intense radiation environment have not been implemented for the work reported here, but are not expected to change our experimental results.

To interface with the robot arm controller, a software package was developed to link to the master controller. As master terminals, we selected a 3D mouse controller and a joystick controller. The main reasons these two devices were chosen were cost effectiveness and ease of replacement, especially to handle possible controller malfunctions. However, we have also begun evaluation of a suit-type master terminal as an alternative candidate, since such a terminal is expected to have more powerful human interface potential. The block diagram of the total system is shown in Fig. 4.

For actual applications, replacement of remote devices and ancillary equipment in the in-cell environment must be carefully designed into the system, because remote devices and equipment degrade quickly in such an environment and it is beneficial to utilize enhanced capabilities as they become available. It is also an important feature of this system that the terminal units and external components of the system are replaceable and easy to modify for future demands on operational capabilities.
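As an illustration of the master-to-slave link described above, the sketch below shows one plausible way to turn normalized 3D-mouse deflections into Cartesian velocity commands for the end effector. The gains, dead-band and the read/send interfaces are hypothetical and do not describe the software package actually developed for the system.

```python
import numpy as np

# Hypothetical tuning constants for the master-slave mapping.
DEADBAND = 0.05       # ignore small, unintentional deflections
LINEAR_GAIN = 0.10    # m/s per unit of mouse deflection
ANGULAR_GAIN = 0.20   # rad/s per unit of mouse deflection

def mouse_to_twist(deflection):
    """Map a normalized 6-axis 3D-mouse reading (values in [-1, 1]) to a
    Cartesian twist (vx, vy, vz, wx, wy, wz) for the end effector."""
    d = np.clip(np.asarray(deflection, dtype=float), -1.0, 1.0)
    d[np.abs(d) < DEADBAND] = 0.0      # dead-band filtering
    twist = np.empty(6)
    twist[:3] = LINEAR_GAIN * d[:3]    # translational velocity
    twist[3:] = ANGULAR_GAIN * d[3:]   # rotational velocity
    return twist

def control_step(read_mouse, send_velocity_command):
    """One cycle of the master-to-slave loop (both interfaces are assumptions)."""
    twist = mouse_to_twist(read_mouse())
    send_velocity_command(twist)  # forwarded to the arm's motion control layer
```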
3. Experimental setup and the flow of remote operation

As described in Section 1, the representative in-cell equipment consists of a pressing machine, pellet inspection equipment and a powder analyzer. Of these, the pressing machine is the most complex piece of equipment; it has various modules and maintenance mechanisms. Therefore, a mock-up pressing machine was developed for the experimental tests, comprising a pellet pressing module and a pellet transferring module.

The basic concept of the experimental design is to use similar remote handling units as much as possible. We did not restrict the number of remote handling units in this experimental demonstration system, but the types of remote handling units were kept to the minimum possible. These units are basically restricted to three types: (a) bolts, (b) pins, and (c) connectors. We also designed the access points for these remote handling units to be concentrated within the reach of the PA-10 as much as possible, so that we could reduce the total time of all remote operations by reducing the number of unnecessary set points of the PA-10. As a result, however, the probability of interference during access to remote handling units is increased, and intensive simulation of remote handling access was conducted in the design phase to make sure that the handling activities could be performed adequately. Fig. 5 shows such a simulation involving remote handling of a pin. While we had previously designed the structure with the pins set on the outer side of the equipment frame, the simulation revealed an unnatural posture of the PA-10 required to avoid collisions with the equipment surface. Therefore, the pin locations were re-designed to be set into the inner side of the equipment frame. In addition to operations with the PA-10 robot arm, heavy tasks such as removal and reinstallation of large modules are carried out using an in-cell crane. Fig. 6 shows the schematic flow of the module removal operation using the crane.

Fig. 7 shows the experimental mock-up equipment for the pressing machine. The basic remote handling units included are pins to lock and unlock the hanger for equipment removal and re-installation; connectors for supply lines of gas, liquid (which is similar to the gas supply connector and therefore omitted from this test bed) and electricity; and bolts to fasten and unfasten modules and sub-modules in various places, as described earlier. The actions involved in unlocking a pin and removing a sub-module are shown in Fig. 8. A few exceptional remote handling units exist. The pellet pressing module includes slides to unfix the machine so it can move into the replacement position, handled by the PA-10 (although this is not a difficult task). The pellet transfer module involves a small sub-module, which allows removal of some of the pellets for inspection (line exchange equipment). This sub-module is placed in the path of the pellets (bottom right of Fig. 7) and is handled by the PA-10. In the following we describe the results of remote handling of the pellet transfer module. The flow of the remote operational tasks required for the pellet transfer module is as follows.

Module removal:
Fig. 5. Simulation of PA-10 robot arm accessing a lock pin for module removal.
Fig. 6. Schematic flow of the module removal operation using the crane.
• Start operation of manipulator arm;
• Take down the right hand side lock pins (A);
• Take down the left hand side lock pins (A);
• Disconnect compressed gas supply connector (B);
• Disconnect electricity supply connector (B);
• Attach impact driver to manipulator arm (C);
• Attach cusps of impact driver to unfasten bolt (C);
• Detach impact driver from manipulator arm (C);
Fig. 7. Mock up equipment with main manipulation points. Bottom left: lock pins for module removal (red circle). Bottom middle: gas and electrical supply connectors (green square). Bottom right: sub-module fix bolt (yellow triangle).
Fig. 8. Examples of remote operation involved in module handling.
• Remove sub-module (line exchange equipment) (D);
• Hold sub-module (line exchange equipment) (D);
• Separate sub-module (line exchange equipment) (D);
• Locate sub-module (line exchange equipment) at the repairing position (D);
• Hold module equipment by crane*;
• Separate module equipment by crane*;
• Position module equipment into repair station position by crane*.

Module re-installation:

• Locate repaired module equipment at the set position by crane*;
• Re-install repaired module equipment by crane*;
• Position repaired sub-module (line exchange equipment) at the set position (D);
• Re-install repaired sub-module (line exchange equipment) (D);
• Attach jig to hand for connecting connectors (B);
• Connect compressed gas supply connector (B);
• Connect electricity supply connector (B);
• Terminate operation of the manipulator arm.

Note that the actions involving the crane (labeled with asterisks) were not performed in this experiment. Although many operations appear necessary for the removal and re-installation of the module equipment, basically only four types of mechanical operation (A–D) are needed: taking down pins, fastening and unfastening bolts for module connection, connecting and disconnecting the gas and electricity connectors, and handling the sub-module.
For these tasks, the detailed actions involve (1) moving the PA-10 hand to a position near the access point while avoiding collisions with the equipment, (2) positioning the PA-10 hand for the appropriate operation, and (3) conducting removal and re-installation of bolts and connectors. These actions are of primary concern in this experiment, because they were found to be the most difficult in previous camera-based remote operation. Several tasks in the action set are shown in Fig. 9. Of these tasks, disconnecting and connecting the electric connector are especially difficult, because the connector is designed for nuclear use, with a complex set of holes and corresponding pins and tight tolerances. As shown in the top right picture in Fig. 9, this task requires both delicate control and precise, sharp movements.
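Action (1) is essentially a clearance problem between the arm and the measured environment. A simple way to approximate it, in the spirit of the sphere representation cited in Section 1, is a nearest-neighbour query between the LRF point cloud and spheres bounding the arm links; the sketch below is only an illustration under that assumption and is not the collision logic of the developed system.

```python
import numpy as np
from scipy.spatial import cKDTree  # nearest-neighbour queries

def min_clearance(workspace_points, arm_sphere_centers, arm_sphere_radii):
    """Smallest distance between the measured workspace point cloud and a set
    of spheres approximating the arm links (all coordinates in metres)."""
    tree = cKDTree(workspace_points)
    dists, _ = tree.query(arm_sphere_centers)        # centre-to-cloud distances
    return float(np.min(dists - arm_sphere_radii))   # subtract sphere radii

def is_pose_safe(workspace_points, arm_sphere_centers, arm_sphere_radii,
                 margin=0.05):
    """Flag a pose as unsafe if any arm sphere comes within `margin` of the
    measured environment (the margin value is an assumption)."""
    return min_clearance(workspace_points, arm_sphere_centers,
                         arm_sphere_radii) > margin
```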
4. Results and discussion

To evaluate the efficiency and usability of the developed LRF system, we performed remote maintenance experiments using the mock-up equipment. Work performance and the time needed to finish each task were recorded. The experiments were conducted using two different display modes (camera vision only, and camera with CG vision) and two different control modes (joystick and 3D mouse).

For the 3D information acquisition, around 1 min is needed to acquire data and calculate the 3D coordinates of the workspace. The 3D coordinate data are then converted into a visual 3D dotted representation, which can be viewed from any direction as defined by the operator. The 3D data are used to adjust the relative coordinate relationships between the CAD drawings of the equipment and the status of the manipulator arm.
Fig. 9. Examples of the tasks conducted during remote operation.
Once this set of information is settled, the manipulator arm can be operated using the adjusted CAD vision to remotely handle the modular equipment (this display mode is labeled the LRF based display mode), as shown in Fig. 10. In the experiment, a defined set of tasks was repeated three times in each examination. We first confirmed that the tasks described in the previous section could be conducted in any display and operational mode of the system. In Fig. 11, the remote operation times consumed to complete the tasks on the mock-up equipment are shown. As can be seen from the figure, the average time to complete the total remote operation task was smaller in the LRF based display mode for both the 3D mouse and the joystick. In the 3D
mouse operation, the time to complete the total task was halved in this mode. The average operation time with joystick manipulation was, however, not reduced markedly. The main factor that likely influences this result is that the joystick manipulation mode has several selection buttons for operation and involves large movements of the controller (joystick). In comparison, the 3D mouse manipulation mode has no selection buttons and requires only small movements of the controller (3D mouse), which can be more precise.
Fig. 10. Adjusted CAD drawings of module and PA-10 robot arm corrected by LRF data.
Fig. 11. Total remote operation time (average) and standard deviation for each experiment set involving three repetitions.
Fig. 12. Individual remote operation times for sub-tasks in each experiment.
In most cases, the 3D workspace information provides significantly more detail in the LRF based display mode, so that trial movements of the master terminals are reduced, resulting in less operation time. The variation in the times to complete the total task was also smaller in the LRF based display mode. The standard deviations of the total task operation time with 3D mouse and joystick operation under the camera based display were 93 and 179, respectively, while the corresponding values with the LRF based display were 53 and 70, respectively.

The results for individual tasks are shown in Fig. 12. The variations in time for individual tasks were large for camera based operations, especially for the difficult task of disconnecting and connecting the electricity supply connector. In fact, the variances in the time to complete disconnecting the electrical supply connector with 3D mouse and joystick in the camera based display mode were 5241 and 1133, respectively, while the corresponding variances with the LRF based display mode were 98 and 61, respectively. We also observe very small variations in some manipulations with the LRF based display mode, such as disconnection of the gas supply connector and removal of the sub-module. These results suggest that the improved man-machine interface of the LRF based display mode is superior especially for difficult remote operations.

The tasks of (1) moving the PA-10 near to the access point without collision
with the equipment, (2) locating the hand of the PA-10 at the destination, and (3) conducting removal and re-installation of bolts and connectors, were confirmed to be adequately performed in the LRF mode. As designed, in our LRF mode the overall view is available, and it was therefore easy to recognize the workspace available for operation of the robot arm. In this way, the hand of the PA-10 could be placed exactly where it was best operated. Also, many blind spots are removed and depth information is added to the camera vision, so that position and depth information for both the equipment and the manipulator arm are easily recognized. Furthermore, since the system can remove nonessential information from the view by withdrawing some of the acquired data, and can zoom in and out without blur (a usually unavoidable deficiency of camera systems), the operator can concentrate on operating the master unit to bring the hand of the PA-10 to the target (access) points without irritation.

The LRF based display mode can also be configured to show the target component even if it is covered by other components, by expressing parts of the equipment CAD drawings using a frame model (see Fig. 13). In this way, the system allows the operator to visualize a reach underneath the covering equipment components to the target, a maneuver that is usually blind in camera vision.

In addition, we made several other observations during the experiments. Tasks involving taking down pins could be done using only the adjusted CAD drawing vision (without camera vision).
Fig. 13. Access to a covered object by implementing transparency (frame) display mode for some equipment components.
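The frame display mode of Fig. 13 can be emulated in most 3D toolkits by rendering the occluding component as a wireframe (edge) model instead of a shaded surface. The snippet below sketches the idea with Open3D purely for illustration; the component file names are hypothetical and the operator display software actually used is not specified in the paper.

```python
import open3d as o3d  # used here for illustration only

# Hypothetical CAD components: a cover that occludes the target, and the target itself.
cover = o3d.io.read_triangle_mesh("module_cover.stl")
target = o3d.io.read_triangle_mesh("target_connector.stl")
target.compute_vertex_normals()
target.paint_uniform_color([1.0, 0.6, 0.0])  # highlight the manipulation target

# Replace the cover's shaded surface with its edge (frame) representation so the
# target behind it remains visible, mimicking the transparency display mode.
cover_frame = o3d.geometry.LineSet.create_from_triangle_mesh(cover)
cover_frame.paint_uniform_color([0.5, 0.5, 0.5])

o3d.visualization.draw_geometries([cover_frame, target])
```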
Positioning the robot hand at the target point is best done in the LRF based display mode, but direct contact operations are best accomplished by combined use of the LRF based display mode and the camera vision for additional information.

The testing identified the superiority of the LRF based display mode over the usual camera based mode alone, which can be summarized as follows:

• The times to complete the total remote operation task were smaller in the LRF based display mode (the combination of the 3D mouse master terminal and the LRF based display mode was the fastest operationally under these experimental conditions).
• The variation in the times to complete the total task was also generally smaller in the LRF based display mode. In particular, the variation in the times required to accomplish tight tasks, such as connecting and disconnecting connectors, was smaller.
• Removal of blind spots and addition of depth information in the LRF based display mode, relative to simple camera vision, allow position and depth information of the equipment and manipulator arm to be easily recognized and integrated into operations.
• The overall view available makes it easy to recognize the workspace.
• The risk of collision with the equipment is reduced during PA-10 robot arm operation.
• Light contact tasks, such as taking down pins, could be done using the adjusted CAD drawing vision only (without camera vision).
• The man-machine interface is enhanced, since the system can remove nonessential information from the view by withdrawing some of the acquired data, can zoom in and out without blur (a usually unavoidable deficiency of camera systems), and can show the target even when it is covered by other objects by expressing components of the equipment using the CAD frame model.
Acknowledgements

The present study is part of the results of the "Development of the in-cell remote equipment" project entrusted to the Japan Atomic Energy Agency (JAEA) by the Ministry of Education, Culture, Sports, Science and Technology of Japan (MEXT). The authors express their gratitude to Dr. D. R. Janecky of Los Alamos National Laboratory for reviewing the manuscript.
References

Cain, M.J., Dayton, C., Al-Daouk, A.M., 2005. Dismantling the vitrification facility at the West Valley Demonstration Project. Radwaste Solutions, March/April, pp. 36–42.
Curless, B., Levoy, M., 1996. A volumetric method for building complex models from range images. In: Proceedings of SIGGRAPH '96, New Orleans, Louisiana, USA, August 4–9.
Davis, J., Marschner, S.R., Garr, M., Levoy, M., 2002. Filling holes in complex surfaces using volumetric diffusion. In: Proceedings of the First International Symposium on 3D Data Processing Visualization and Transmission, Padova, Italy, June 19–21.
Detrich, J.M., 1998. Innovations in dismantling robotics. Nuclear Engineering and Design 182, 267–276.
Dixon, W.E., Dawson, D.M., Zergeroglu, E., Behal, A., 2001. Adaptive tracking control of a wheeled mobile robot via an uncalibrated camera system. IEEE Transactions on Systems, Man, and Cybernetics: Part B Cybernetics 31, 341–352.
Dixon, W.E., Fang, Y., Dawson, D.M., Flynn, T.J., 2003. Range identification for perspective vision systems. IEEE Transactions on Automatic Control 48, 2232–2238.
Dixon, W.E., Zergeroglu, E., Dawson, D.M., Costic, B.T., 2002. Repetitive learning control: a Lyapunov-based approach. IEEE Transactions on Systems, Man, and Cybernetics: Part B Cybernetics 32, 538–545.
Funasaka, H., Itoh, M., 2007. Perspective and current status on fuel cycle system of fast reactor cycle technology development (FaCT) project in Japan. In: Proceedings of Global, Boise, Idaho, USA, September 9–13.
Funasaka, H., Nakamura, H., Namekawa, T., 2008. Current status on fuel cycle system of fast reactor cycle technology development (FaCT) project in Japan. In: 16th Pacific Basin Nuclear Conference, Aomori, Japan, October 13–18.
Hamel, R., 2006. Human interaction with robots working in complex and hazardous environments. In: Colloquium on Robotics and Automation, Università degli Studi di Napoli Federico II, Napoli, Italy, December 18.
Hokuyo, 2009. Vendor web site, http://www.hokuyo-aut.jp/products/.
Itoh, A., Obara, K., Tada, E., Morita, Y., Yagi, T., Iida, K., Sato, M., 1997. Development of radiation-hard electric connector with ball bearing for in-vessel remote maintenance equipment of ITER. JAERI-Tech-97-065.
Kawata, H., Ohya, A., Yuta, S., Santosh, W., Mori, T., 2005. Development of ultra-small lightweight optical range sensor system. In: International Conference on Intelligent Robots and Systems, Edmonton, Canada, August 2–6, pp. 1078–1083.
Kitamura, A., Watahiki, M., Kashiro, K., 2011. Remote glovebox size reduction in glovebox dismantling facility. Nuclear Engineering and Design 241, 999–1005.
Kress, R.L., Jansen, J.F., Noakes, M.W., Herndon, J.N., 1997. The evolution of teleoperated manipulators at ORNL. In: ANS Topical Meeting on Robotics and Remote Systems, Augusta, Georgia, USA, April 27–May 1.
Liu, D.K., Dissanayake, G., Manamperi, P.B., Brooks, P.A., Fang, G., Paul, G., Webb, S., Kirchner, N., Chotiprayanakul, P., Kwok, N.M., Ren, T.R., 2008. A robotic system for steel bridge maintenance: research challenges and system design. In: Proceedings of the Australasian Conference on Robotics and Automation (ACRA 08), Canberra, Australia, December 3–5.
Lorenc, S.J., Handlon, B.E., Bernold, L.E., 2000. Development of a robotic bridge maintenance system. Automation in Construction 9, 251–258.
Namekawa, T., Yamada, Y., Kitamura, A., Hosogane, T., Kawaguchi, K., 2009. Handling technology of low decontaminated TRU fuel for the simplified pelletizing method fuel fabrication system. In: International Conference on Fast Reactors and Related Fuel Cycles: Challenges and Opportunities (FR09), Kyoto, Japan, December 7–11.
Obara, K., Kakudate, S., Oka, K., Furuya, K., Taguchi, H., Tada, E., Shibanuma, K., Koizumi, K., Ohkawa, Y., Morita, Y., Yagi, T., Yokoo, N., Kanazawa, T., Haneda, N., Kaneko, H., 1996. Irradiation tests of critical components for remote handling system in gamma radiation environment. JAERI-Tech-96-011.
Oka, K., Obara, K., Kakudate, S., Morita, Y., 1997. Development of radiation hard components for remote maintenance. Journal of Plasma and Fusion Research (in Japanese) 73, 69–82.
Ross, B., Bares, J., Fromme, C., 2003. A semi-autonomous robot for stripping paint from large vessels. International Journal of Robotics Research 22, 617–626.
Schmitz, W., 2003. Robotic paint stripping of large aircraft—a reality with the FLASHJET® Coatings Removal Process. In: Proceedings of the Aerospace Coating Removal and Coatings Conference, Colorado Springs, Colorado, USA, May 22–24.
Shibanuma, K., Honda, T., 2001. ITER R&D: remote handling systems: blanket remote handling systems. Fusion Engineering and Design 55, 249–257.
Takeda, N., Kakudate, S., Nakahira, M., Shibanuma, K., 2008. Current status of research and development on remote maintenance for fusion components. Journal of Plasma and Fusion Research (in Japanese) 84, 100–107.
Tharr, D., 2000. Automated abrasive blasting equipment for use on steel structures. Applied Occupational and Environmental Hygiene 15, 713–720.
To, W.K., Paul, G., Kwok, N.M., Liu, D.K., 2009. An efficient trajectory planning approach for autonomous robots in complex bridge environments. International Journal of Computer Aided Engineering and Technology 1, 185–208.