Available online at www.sciencedirect.com

ScienceDirect
Procedia Computer Science 109C (2017) 59–66
www.elsevier.com/locate/procedia
The 8th International Conference on Ambient Systems, Networks and Technologies (ANT 2017)
Curved - free-form interaction using capacitive proximity sensors

Andreas Braun a,b,*, Sebastian Zander-Walz a, Martin Majewski a, Arjan Kuijper a,b
a Fraunhofer Institute for Computer Graphics Research IGD, Fraunhoferstr. 5, 64283 Darmstadt, Germany
b Technische Universität Darmstadt, Karolinenplatz 5, 64283 Darmstadt, Germany
Abstract

Large interactive surfaces have found increased popularity in recent years. However, with increased surface size ergonomics become more important, as interacting for extended periods may cause fatigue. Curved is a large-surface interaction device, designed to follow the natural movement of a stretched arm when performing gestures. It tracks one or two hands above the surface, using an array of capacitive proximity sensors, and supports both touch and mid-air gestures. It requires specific object tracking methods and the synchronized measurement from 32 sensors. We have created an example application for users wearing a virtual reality headset while seated that may benefit from haptic feedback and ergonomically shaped surfaces. A prototype with adaptive curvature has been created that allows us to evaluate gesture recognition performance at different surface inclinations.

1877-0509 © 2017 The Authors. Published by Elsevier B.V.
Peer-review under responsibility of the Conference Program Chairs.

Keywords: Curved surfaces; capacitive sensing; gestural interaction; virtual reality
1. Main text

Large interactive surfaces have been a research interest for several decades, with various applications in research, commerce, and industry. Typically, they are controlled using various touch gestures. However, there are some varieties that enable hand tracking above the surface1,2. The last few years have seen a revived interest in virtual reality (VR) systems. Sensor-augmented headsets improve the user experience, and numerous commercial systems have either arrived on the market, or are about to do so.
* Corresponding author. Tel.: +49 6151 155 208; fax: +49 6151 155 480.
E-mail address: [email protected]

10.1016/j.procs.2017.05.295
Fig. 1. Person interacting with Curved in a VR application
Visual cues are an important aspect of human-computer interaction (HCI). Users know the position of interaction elements either by visual identification or by learning their positions. There is no straightforward way to provide these cues in VR systems. Most commonly, this limitation is overcome by relying on haptic feedback devices, or by providing visual feedback in the virtual world3. Combining interactive surfaces and VR has a number of potential applications, ranging from 3D manipulation to configurable haptic user interfaces. However, so far these surfaces rarely consider the body kinematics and remain flat or shaped non-ergonomically4. In applications where prolonged use is necessary, a device that allows for haptic feedback can help reduce fatigue by letting the user rest their hands. This is not directly supported by common hand tracking devices, such as the Leap Motion or the Microsoft Kinect5,6.

In this work we present Curved - an interaction device with a curved surface, based on capacitive proximity sensors (Fig. 1). It is able to detect the presence and position of one or two hands at a distance of approximately 20 centimeters from the surface and distinguishes mid-air and touch gestures. Curved is comprised of eight modules with four capacitive sensors each. The inclination of the modules can be adapted, so it is suited for persons sitting or standing in front of it. We have created a demonstration application in VR that visualizes the effects of different lighting systems and their impact on the energy consumption in a home environment. In addition, the gesture classification module was tested with ten users.
In short, we propose the following scientific contributions:

• A novel free-form interaction system whose shape follows the kinematic range of a person's hands,
• whose shape is adaptive for sitting/standing persons and supports touch/mid-air gestures, and
• a VR demonstration application for visualizing energy consumption, together with a user evaluation.

2. Related Works

Curved display systems have been of commercial and research interest in the past few years. Improved display technologies, such as OLED or eInk, can be applied more flexibly, which has resulted in various curved displays on the market, from small smartphone screens to home theater systems7,8. The University of Groningen created the Reality Touch Theater - a large-area system with a curved projection area that supports touch interaction from multiple users that move around the area9. The system is designed for collaborative multi-user interaction with the vertical surface, not taking into account individual kinematic constraints.

Researchers have been active in creating large-area interaction systems for standing or sitting persons. Rekimoto used an array of capacitive sensors to build SmartSkin, an early example of a flat interactive surface10. More
recently, Annett et al. created Medusa, a combination of a regular touch surface and an array of IR sensors for proximity-aware interaction4. The example applications for Medusa included user differentiation and combined touch & proximity gestures. However, this system is flat, so users have to stretch to reach different parts. Modern sensor systems enable the creation of irregularly shaped or even deformable interaction systems. Wimmer and Baudisch use time-domain reflectometry to detect a touch at a certain position of a wire11. By applying this wire to a soft silicone surface, they are able to create stretchable interactive surfaces. Roudaut et al. evaluated user preference when interacting with curved interactive surfaces12. They created some guidelines for device and interface designers on how to adapt the systems based on the chosen curvature. The created shapes are small and primarily aimed towards interaction with fingers or a single hand.

Capacitive proximity sensors detect the presence and activities of a human body at a distance. A classic example is the Gesture Wall - an interactive art installation controlled by hand movements in front of spherical antennas13. Recent examples explored using transparent electrodes and new processing methods for tracking fingers and hands14,15. Bachynskyi et al. have recently started investigating the ergonomics of various user interfaces based on biomechanical simulation16,17. They propose that a lower amount of kinetic energy used when interacting can increase efficiency and acceptance of HCI systems. This can be simulated or measured, e.g. by using motion capture systems, leading to increased research into devices that are more efficient to use. Even though many systems are aimed at gaming, VR has a high potential in work environments18.
Systems such as Curved can be another step towards the acceptance of VR systems in the work environment, by reducing necessary upper body movement, avoiding fatigue, and increasing intuitiveness of use.

3. Curved surface interaction
Fig. 2. Left - Kinematic range of hands with static shoulders. Right - rotational 3D model of arm range
The first step in the design of Curved is finding the proper layout of the interaction system. We consider a person who wears VR goggles and is sitting or standing in front of the interactive surface. In order to minimize the required movement of the body, we assume that the shoulders stay static, while the arms and hands can move freely. A simple two-dimensional kinematic model of this is shown in Fig. 2 on the left. The red line is the convex hull created when the hands are stretched out. There is an overlapping area in the middle, as indicated by the dashed line. This area right in front of the user can be reached by both hands equally.

This model can be transferred into three dimensions. While the shoulders remain static, the arms are rotated around all joints. As a simplification, the overlapping area in front of the user is modeled as a flat surface. The resulting body is the convex hull of the area that can be reached by outstretched arms and hands. It is shown in Fig. 2 on the right. This rotational body is the template for Curved.

Some limitations apply for practical purposes. The lower part of the rotational body needs to provide space for the person sitting or standing in front of it. Therefore, an area should be left open that leaves sufficient space for the user. The second limitation is based on the requirement for sitting and standing interaction. The position of the shoulder relative to Curved is considerably different for each
case. Either the system has to be moved up and down, or the surface has to adapt to the current posture. Finally, the whole rotational body is very large when installed. Therefore, the interaction area can be reduced by not providing the full 180° coverage.

The final geometric design is shown in Fig. 3. Curved is comprised of eight modules that are hinged at the side nearest to the user. The back side can be moved up and down and locked into several positions, thus fixing their inclination. A low inclination is suitable for a standing user that wants to interact with his arms down (e.g. in Fig. 1), whereas a higher inclination is more suitable for sitting users. There is a circular cutout in the center to accommodate a standing or sitting user. The geometry of the Curved system resembles a reduced polygon model of the rotational body. This reduces complexity, while keeping the system mechanically feasible. Each module is equipped with four loading-mode capacitive proximity sensors that use copper foil electrodes. They can track the presence and position of hands on or above the surface.
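The two-dimensional kinematic model introduced at the start of this section can be sketched in a few lines. The shoulder separation and arm length below are illustrative assumptions, not measurements from the paper:

```python
import math

# Illustrative dimensions in centimeters (assumed, not from the paper).
SHOULDER_GAP = 40.0  # distance between the two static shoulder joints
ARM_LENGTH = 70.0    # reach of an outstretched arm and hand

def reachable_by_both(x, y):
    """True if a point lies in the overlapping area of the 2D model,
    i.e. within arm's length of BOTH shoulders at (+-SHOULDER_GAP/2, 0)."""
    left = math.hypot(x + SHOULDER_GAP / 2, y) <= ARM_LENGTH
    right = math.hypot(x - SHOULDER_GAP / 2, y) <= ARM_LENGTH
    return left and right

# A point directly in front of the user lies in the dashed overlap zone:
print(reachable_by_both(0.0, 60.0))   # True
# A point far to the side is reachable by one hand only:
print(reachable_by_both(60.0, 40.0))  # False
```

Rotating the corresponding one-handed boundary around the shoulder joints yields the 3D rotational body used as the template for Curved.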
Fig. 3. Concept sketch of the final Curved design.
4. Prototype
Fig. 4. Left: overall view of Curved prototype, center: copper electrodes and sensor of a single module, right: detail view of mechanical joints and connected sensors
The Curved prototype system is based on the OpenCapSense toolkit for capacitive proximity sensing19. This toolkit has been used for various interaction systems in the past20,21. Four boards are synchronized over a CAN bus using customized firmware. This synchronization is necessary, since adjacent boards may disturb one another: if an electrode is charged next to one that is discharged, the measurement may be affected. Synchronization is a built-in feature for the eight sensors connected to a single board, but was unsupported across multiple boards. In our case, we perform simultaneous measurements of modules 1, 2, 5, and 6 (as shown in Fig. 4 - left), as the two connecting boards are sufficiently far away from one another. Once the measurement is complete, modules 3, 4, 7, and 8 are used. This reduces the effective sampling rate from 20 Hz to 10 Hz. Each board interfaces the sensors of two modules.

For each module we use an acrylic plate as the base. Four copper foil electrodes are glued on and connected via a soldered wire to the small sensor boards (Fig. 4 - center). Each of those modules is then attached to the joint structure that allows modifying the inclination of the system between 10° and 25° (Fig. 4 - right). The sensors are connected to the main OpenCapSense board using USB cables. Finally, the eight modules are mounted on a wooden frame (Fig. 4 - left).
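The interleaved measurement scheme can be summarized in a short sketch. The phase grouping follows the module numbering above, while the scheduler itself is an assumed simplification of the firmware's behavior:

```python
# Modules measured together in each phase; the two groups are chosen so
# that no two simultaneously active boards sit next to each other.
PHASES = [(1, 2, 5, 6), (3, 4, 7, 8)]
BOARD_RATE_HZ = 20.0  # sampling rate of a single synchronized board

def effective_rate_hz():
    """Alternating between n phases divides each module's rate by n."""
    return BOARD_RATE_HZ / len(PHASES)

def measurement_schedule(n_frames):
    """Yield (frame_index, active_modules) pairs, alternating phases."""
    for frame in range(n_frames):
        yield frame, PHASES[frame % len(PHASES)]

print(effective_rate_hz())  # 10.0
print(list(measurement_schedule(2)))
```

With two phases, each module is sampled at half the board rate, matching the drop from 20 Hz to 10 Hz described above.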
The Curved system is connected over USB-to-Serial interfaces to a PC that performs calibration, baselining, and drift compensation (described in Section 5), as well as the generation of capacitive images and gesture classification. The software is implemented in Java, using OpenCV† for image processing and the Weka framework‡ for gesture classification (using the libSVM module). The recognized gestures can be used by any application for interaction. We have created a demonstration application in VR that visualizes energy consumption choices in a Smart Home environment.

5. Data processing
Fig. 5. Processing of integer capacitance values (a) to capacitive images (b) and recognized objects and shapes (c)
The loading-mode sensors provide a capacitance measurement that is mapped to a bit value by an analog-digital converter. We apply basic methods of baseline creation, calibration, and drift compensation for a stable signal22. For further analysis, including gesture recognition, we use the geometric positions of the sensors and the capacitance values to generate a capacitive image.

5.1. Generating capacitive images

The process of creating a capacitive image is shown in Fig. 5. The integer values from the sensors can be considered a low-resolution grayscale image. We have the option to use this image directly for object recognition or to first apply additional image processing methods. Fig. 5 (c) shows an image that is scaled up with bicubic interpolation and converted to a binary image using a threshold. The small red circle marks the center of the object, which is calculated using image moments23. A new image is acquired from sensor data every 50 ms. The algorithm clusters and identifies several objects in proximity of, or touching, Curved. The analysis of subsequent centers is used as a basis for the gesture recognition.
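The centroid step can be illustrated with raw image moments on a toy sensor grid. The 4×8 grid, its values, and the threshold are made up for this example; the actual pipeline also upscales with bicubic interpolation in OpenCV before thresholding:

```python
import numpy as np

def capacitive_image_center(readings, threshold):
    """Binarize a low-resolution capacitive image and return the object
    center from raw image moments: cx = M10/M00, cy = M01/M00."""
    binary = (np.asarray(readings, dtype=float) >= threshold).astype(float)
    m00 = binary.sum()  # zeroth moment: number of active pixels
    if m00 == 0:
        return None  # no hand above the surface
    ys, xs = np.indices(binary.shape)
    return ((xs * binary).sum() / m00, (ys * binary).sum() / m00)

# Toy 4x8 grid (one value per electrode); a simulated hand covers
# rows 1-2, columns 2-3.
grid = np.zeros((4, 8))
grid[1:3, 2:4] = 5.0
print(capacitive_image_center(grid, threshold=1.0))  # (2.5, 1.5)
```

Tracking this center across the images acquired every 50 ms yields the object trajectories that feed the gesture recognition.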
Fig. 6. Gesture categories supported by Curved
† http://opencv.org/
‡ http://www.cs.waikato.ac.nz/ml/weka/
5.2. Gesture recognition

Curved supports three different types of gestures, shown in Fig. 6. Each can be performed by either touching the surface or moving the hands through the interaction space of approximately 20 cm above the surface. The system can also be configured to only support touch or mid-air interaction. The first gesture category is pointing gestures (a), performed by holding the hand in the interaction area for more than 300 ms. The second category is one-handed swipes (b) in eight directions. These again can be performed on or off the surface. The last category is two-handed pinch-and-zoom gestures (c), which are detected if two objects are present and move towards or away from one another. Here we distinguish six varieties (horizontal and diagonal movements, centers moving closer/getting further away). Gestures are classified once six consecutive images (duration 300 ms) have found objects. A support vector machine (SVM) classification is used, with the relative positions of six object centers used as features (twelve centers in the case of two-handed gestures). If two objects are present, only pinch-to-zoom gestures are enabled. SVMs are often used in discrete classification tasks and have a good performance24. Overall, there are 15 different gestures that can be performed. The system was trained by two users, with 20 samples for each gesture.

6. Interaction in Virtual Reality environments

A main application area that we see for Curved is interaction in scenarios where there is no visual reference for the position of the user's hands. Most notably, this is the case in VR when a headset is worn. In the last few years, several of those systems have been released as prototypes, ranging from smartphone-based varieties, such as Google Cardboard, to PC-based systems, such as the Oculus Rift. Our application was implemented on the latter, using the available development kits.
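A minimal sketch of the swipe handling described in Section 5.2: the relative-center feature vector mirrors the description above, while the eight-direction binning by net displacement is an assumed simplification of the SVM classification (the real system uses libSVM via Weka):

```python
import numpy as np

def swipe_features(centers):
    """Feature vector for a one-handed gesture: six consecutive object
    centers expressed relative to the first one (12 values)."""
    c = np.asarray(centers, dtype=float)  # shape (6, 2)
    return (c - c[0]).flatten()

def swipe_direction(centers):
    """Bin the net hand displacement into one of 8 directions (0 = right,
    2 = up, 4 = left, 6 = down, odd values = diagonals)."""
    dx, dy = np.asarray(centers[-1], float) - np.asarray(centers[0], float)
    angle = np.degrees(np.arctan2(dy, dx)) % 360
    return int(((angle + 22.5) % 360) // 45)

rightward = [(x, 1.0) for x in range(6)]  # hand moving left to right
print(swipe_direction(rightward))         # 0
print(len(swipe_features(rightward)))     # 12
```

For two-handed pinch gestures, the same encoding would be applied to both object trajectories, doubling the feature vector to 24 values.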
Fig. 7. Left - Traffic light system for energy usage and current energy consumption. Right - visualizing effects of different lighting scenarios (top: lamp off, middle: fluorescent lighting, bottom: LED lighting)
6.1. Making Energy Visible application

To explore energy usage and to enable users to understand the consequences of their actions and choices, as well as the impact on the environment, the demonstrator consists of a rich virtual reality simulation. This simulation goes beyond displaying lists of metrics and charts of cash movement. It provides a comprehensive and easy-to-understand visualization approach (shown in Fig. 7 - left). It connects the chosen setup (the ensemble) of energy-consuming devices (e.g. the type of light sources, from fluorescent bulbs to modern LEDs, entertainment sets, or household appliances) and brings it into correlation with the user's environmental perception (shown in Fig. 7 - right).

The environment is created using the Unity3D game engine. The input and output device management submodule accompanies the Unity3D framework by providing an abstraction layer for a variety of different input devices, like gamepads, joysticks, mice, and keyboards. Curved hooks into the C# scripting environment of Unity3D and translates the input signals into Unity3D events.
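The translation from recognized gestures to application events can be sketched as a simple dispatcher. The gesture names and handlers here are hypothetical, and the real implementation raises Unity3D events from C# rather than Python:

```python
# Hypothetical gesture-to-event dispatcher, mirroring the abstraction
# layer that feeds Curved's recognized gestures into the application.
HANDLERS = {}

def on_gesture(name):
    """Decorator registering a handler for a named gesture."""
    def register(fn):
        HANDLERS[name] = fn
        return fn
    return register

@on_gesture("point")
def open_energy_menu():
    return "energy visualization menu opened"

@on_gesture("swipe_left")
def previous_menu_item():
    return "moved to previous menu item"

@on_gesture("pinch")
def close_menu():
    return "menu closed"

def dispatch(gesture):
    """Invoke the handler for a recognized gesture, if any."""
    handler = HANDLERS.get(gesture)
    return handler() if handler else None

print(dispatch("point"))  # energy visualization menu opened
```

An abstraction of this kind keeps the application logic independent of the input device, so gamepads or keyboards can drive the same events.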
The user interacts with the environment using all supported gestures. The user selects different devices by moving the head. A device can be selected using a pointing gesture, which opens the energy visualization menu. Here, we can move through the menu using left and right swipes. A pinch gesture closes the menu again.

7. Evaluation

A study was performed with 10 users (8 m, 2 f, average age 25.4). Most participants had previous experience with gesture interaction systems. We wanted to evaluate the gesture recognition performance and the preference for inclination when using the system. The users were shown each gesture once and had no specific training time. The participants performed gestures at two different inclinations (10° and 25°), both in touch and mid-air, using a subset of the supported gestures (one point, three swipes, two pinch-to-zoom). The inclinations were chosen due to the constraints of our structure, and the gestures were chosen based on their similarity to each other; adding more gestures would have led to similar results. Each gesture was performed five times. Finally, the participants were asked to fill in a questionnaire. Overall, 1200 gestures were performed during the evaluation. The system was pre-trained by two users a day before the evaluation with a training set of 300 gestures; they did not participate in the actual study.

The accuracy of the classification is shown in Fig. 8. With the exception of swipes to the left, we achieved a good accuracy. The overall average accuracy was 81% (89% if left swipes are excluded). Left swipes are often misclassified as right swipes, since the hand tends to move inside the supported interaction space, thus sometimes doing a right swipe before doing a left swipe. This was strongly dependent on the participant. In general, it is difficult to distinguish between touch and mid-air gesture accuracy.
Both methods performed similarly in our evaluation (touch = 83%, mid-air = 79%), but the difference is not statistically significant (p = .32). The same applies to the inclination angles. While the high inclination (83%) performed slightly better than the low inclination (79%), the result is not statistically significant (p = .31). Finally, in the questionnaire we inquired about the preference of inclination (multiple-choice), of touch vs. mid-air (10-point Likert scale, 1 = touch, 10 = mid-air), and how tiring the interaction was (10-point Likert scale, 1 = not tiring, 10 = very tiring). All ten participants preferred the high-inclination setting, as the evaluation was performed standing. They slightly preferred mid-air interaction (µ = 6.6, σ = 3.2) and did not consider the system very tiring (µ = 4.5, σ = 2.2).
Fig. 8. Classification accuracy of touch and mid-air gestures at low and high inclination
8. Conclusion & Future Works

In this work we have presented Curved - a free-form interaction device based on capacitive proximity sensors that supports touch and mid-air gestural interaction. We consider it particularly suited for VR applications. We have presented the process of creating the ergonomic surface shape and how we created a prototype from it. A software application was programmed that illustrates the VR use case by combining Curved with the Oculus Rift. This application demonstrates the effects of device usage on energy consumption and how different lighting technologies
influence the environment. The evaluation showed that the gesture recognition performs sufficiently well and that users prefer a high-inclination setting for interaction.

In the future we would like to focus our research on two areas. Additive manufacturing, in particular functional 3D printing, offers the opportunity to integrate conductive material into printed objects. This would enable additional devices and use cases. The concept of ergonomic surface design can be extended to other constrained shapes. For example, if we consider a static elbow or static finger joints, we could create much smaller interaction systems that can be added to a desk.

Acknowledgements

We would like to thank Stefan Krepp, Steeven Zeiss, and Maxim Djakow for their input to the hardware and software development, as well as our study participants for their valuable comments. This work was partially supported by EC Grant Agreement No. 610840.

References

1. H. Benko, R. Jota and A. Wilson. 2012. MirageTable: freehand interaction on a projected augmented reality tabletop. Proceedings CHI 12, 199–208.
2. R. Wimmer, M. Kranz, S. Boring and A. Schmidt. 2007. CapTable and CapShelf - unobtrusive activity recognition using networked capacitive sensors. Proceedings INSS 07, 85–88.
3. R.A. Earnshaw. 2014. Virtual reality systems. Academic Press.
4. M. Annett, T. Grossman, D. Wigdor and G. Fitzmaurice. 2011. Medusa: a proximity-aware multi-touch tabletop. Proceedings UIST 11, 337–346.
5. F. Weichert, D. Bachmann, B. Rudak and D. Fisseler. 2013. Analysis of the Accuracy and Robustness of the Leap Motion Controller. Sensors 13, 5, 6380–6393.
6. Z. Ren, J. Meng, J. Yuan and Z. Zhang. 2011. Robust hand gesture recognition with Kinect sensor. Proceedings Multimedia 2011, 759–760.
7. Samsung Inc. 2014. Samsung Introduces the Latest in its Iconic Note Series - The Galaxy Note 4, and Showcases Next Generation Display with Galaxy Note Edge. Retrieved March 31st 2016 from http://www.samsungmobilepress.com/2014/09/03/Samsung-Introduces-theLatest-in-its-Iconic-Note-Series---The-Galaxy-Note-4,-and-Showcases-Next-Generation-Display-with-Galaxy-Note-Edge-1.
8. Samsung Inc. 2014. CES 2014: Samsung Unveils First Curved Ultra High Definition (UHD) TVs. Retrieved March 31st 2016 from http://www.samsung.com/uk/news/local/samsung-unveils-first-curved-ultra-high-definition-uhd-tvs.
9. University of Groningen. 2011. Reality Touch Theatre. Retrieved March 31st 2016 from http://www.rug.nl/news/2011/01/touchscreen1?lang=en.
10. J. Rekimoto. 2002. SmartSkin: an infrastructure for freehand manipulation on interactive surfaces. Proceedings CHI 02, 113–120.
11. R. Wimmer and P. Baudisch. 2011. Modular and deformable touch-sensitive surfaces based on time domain reflectometry. Proceedings UIST 11, 517–526.
12. A. Roudaut, H. Pohl and P. Baudisch. 2011. Touch input on curved surfaces. Proceedings CHI 11, 1011–1020.
13. J. Smith, T. White, C. Dodge, J. Paradiso, N. Gershenfeld and D. Allport. 1998. Electric field sensing for graphical interfaces. IEEE Computer Graphics and Applications 18, 3, 54–60.
14. M. Le Goc, S. Taylor, S. Izadi, C. Keskin and others. 2014. A Low-cost Transparent Electric Field Sensor for 3D Interaction on Mobile Devices. Proceedings CHI 14, 3167–3170.
15. T. Grosse-Puppendahl, A. Braun, F. Kamieth and A. Kuijper. 2013. Swiss-cheese extended: an object recognition method for ubiquitous interfaces based on capacitive proximity sensing. Proceedings CHI 13, 1401–1410.
16. M. Bachynskyi, A. Oulasvirta, G. Palmas and T. Weinkauf. 2014. Is motion capture-based biomechanical simulation valid for HCI studies?: study and implications. Proceedings CHI 14, 3215–3224.
17. M. Bachynskyi, G. Palmas, A. Oulasvirta, J. Steimle and T. Weinkauf. 2015. Performance and Ergonomics of Touch Surfaces: A Comparative Study Using Biomechanical Simulation. Proceedings CHI 15, 1817–1826.
18. D. Ma, X. Fan, J. Gausemeier and M. Grafe, eds. 2011. Virtual Reality & Augmented Reality in Industry. Springer Berlin Heidelberg, Berlin, Heidelberg.
19. T. Grosse-Puppendahl, Y. Berghoefer, A. Braun, R. Wimmer and A. Kuijper. 2013. OpenCapSense: A rapid prototyping toolkit for pervasive interaction using capacitive sensing. Proceedings PerCom 2013, 152–159.
20. A. Braun, S. Zander-Walz, S. Krepp, S. Rus, R. Wichert and A. Kuijper. 2016. CapTap - Combining Capacitive Gesture Recognition and Knock Detection. Proceedings iWOAR 2016, Article No. 6.
21. A. Braun and P. Hamisu. 2011. Designing a multi-purpose capacitive proximity sensing input device. Proceedings PETRA 2011, Article No. 15.
22. A. Braun, R. Wichert, A. Kuijper and D.W. Fellner. 2015. Capacitive proximity sensing in smart environments. Journal of Ambient Intelligence and Smart Environments 7, 4, 1–28.
23. M.-K. Hu. 1962. Visual pattern recognition by moment invariants. IRE Transactions on Information Theory 8, 2, 179–187.
24. A. Braun, S. Krepp and A. Kuijper. 2015. Acoustic Tracking of Hand Activities on Surfaces. Proceedings iWOAR 2015, Article No. 9.