Acta Astronautica 161 (2019) 66–74
Augmented Reality applications as digital experiments for education – An example in the Earth-Moon System
Claudia Lindner∗, Andreas Rienow, Carsten Jürgens
Geomatics Research Group, Department of Geography, Ruhr-University Bochum, Universitätsstraße 150, 44801 Bochum, Germany
ARTICLE INFO
Keywords: Education; Augmented Reality; Digital experiments; Gravitation; Moon

ABSTRACT
“You realise that the Earth is nothing but accumulated cosmic dust having formed a rock that is encompassed by a flimsy, fragile atmosphere. To grasp this, I needed the view out of the window.” German ESA astronaut Alexander Gerst's perspective on Earth was changed lastingly by the view from the International Space Station (ISS) onto our home planet. It is possible to give pupils a very similar perspective within the means of public education due to the availability of Earth Observation data from the ISS. However, the data can be put to more educational use than providing a taste of the overview effect. Applying common remote sensing methods and modern teaching concepts to EO (Earth Observation) data from the National Aeronautics and Space Administration (NASA) High Definition Earth Viewing experiment, teaching modules for several STEM (Science, Technology, Engineering, and Mathematics) subjects could be implemented successfully. Building on this success, more ISS EO sensors are being integrated into teaching materials and new media techniques are being explored. The most recent additions to the material pool are smartphone apps using Augmented Reality (AR), with which the pupils can experiment on their own. These apps are developed in a partial What You See Is What You Get (WYSIWYG) application development system called Unity with the Vuforia extension, the latter allowing the use of printed images as reference markers for AR. Complex theoretical topics can be visualised in 3-dimensional (3D) animations or turned into inexpensive, easy digital experiments. The app “The Earth-Moon System” applies this with experiments on the effects of changes in the distance between Earth and Moon and a 3D animation of the barycentre between two celestial bodies. Developing such apps to visualise their own data is feasible for researchers even without prior app development knowledge.
1. Introduction

The STEM fall report of the German Economic Institute [1] found that in October 2018, 337,900 STEM jobs could not be filled with qualified personnel, about one third of them academic positions, and the trend points towards even higher numbers of missing qualified personnel. Despite excellent job market conditions for STEM workers, too few pupils choose any kind of STEM education, whether practical or academic, when they leave school. One of the main reasons identified in the report is a lack of passion for, sometimes even a complete lack of interest in, the STEM subjects, as shown by the STEM-focused PISA study 2015 [2]. This is inevitably intertwined with the (lack of up-to-date) IT infrastructure in schools and its limited usage. The report [1] thus advocates more qualified STEM teachers, more fun in STEM, more STEM connections for schools (e.g. competition and mentorship programmes), and the right use of computers for research and team work. Most important, however, is to reach the children as
young as possible. Despite the deeply curious nature of young children [3,4], school materials rarely make the complex, abstract mathematical and physical topics attractive, resulting in the lack of attention identified in the report [1]. Scientific language also makes it hard for people outside a specific field, e.g. pupils and teachers, to understand the benefits of STEM [5]. However, manned spaceflight has a high potential to let pupils engage with many STEM topics [4] and has been used extensively for this purpose [6]. Astronaut is still one of children's most popular dream jobs [7]. Their actual work would not be possible without highly complex skills in mathematics, physics, engineering, and programming, but also knowledge in subjects like biology and biotechnology, Earth and space sciences and many more [8] – in short, the basics of many STEM topics taught in school can be covered by topics from manned spaceflight. One of these topics is the astronauts' view on Earth, i.e. remotely sensed images from the ISS.
∗ Corresponding author.
E-mail addresses: claudia.lindner-w2i@rub.de (C. Lindner), andreas.rienow@rub.de (A. Rienow), carsten.juergens@rub.de (C. Jürgens).
https://doi.org/10.1016/j.actaastro.2019.05.025
Received 28 February 2019; Received in revised form 2 May 2019; Accepted 14 May 2019; Available online 18 May 2019
0094-5765/© 2019 IAA. Published by Elsevier Ltd. All rights reserved.
Acronyms/Abbreviations
2D: 2-dimensional
3D: 3-dimensional
AR: Augmented Reality
ASTER: Advanced Spaceborne Thermal Emission and Reflection Radiometer
CATS: Cloud Aerosol Transport System
DEM: Digital Elevation Model
DLR: German Aerospace Center
E-learning: Electronic Learning
EO: Earth Observation
ESA: European Space Agency
ESERO: European Space Education Resource Office
FIS: Fernerkundung in Schulen, ger. “Remote Sensing in Schools”
GDEM: Global Digital Elevation Model
GIS: Geographic Information Systems
HD: High Definition
HDEV: High Definition Earth Viewing experiment
HICO: Hyperspectral Imager for the Coastal Ocean
HTML5: HyperText Markup Language version 5
ISS: International Space Station
IT: Information Technology
KEPLER ISS: Kompetenzbasiertes, Erfahrungsorientiertes, Praktisches Lernen mit ERdbeobachtung von der ISS, ger. “competence-based, experience-oriented, practical learning with Earth Observation from the ISS”
Lidar: LIght Detection And Ranging
M-learning: Mobile Learning
METEOR: ISS meteor observation project
NASA: National Aeronautics and Space Administration
RapidScat: Rapid Scatterometer on the ISS
RGB: Red, Green, Blue
RS: Remote Sensing
SD card: Secure Digital memory card
STEM: Science, Technology, Engineering, Mathematics
UI: User Interface
WYSIWYG: What You See Is What You Get
A total of 63 ISS experiments have dealt with RS, and at least 5 more with EO technologies [8]. The Universities of Bochum and Bonn have implemented the project Columbus Eye and its successor KEPLER ISS, which use a selection of these experiments to fascinate pupils for STEM, but most of all to foster methodological competences [9]. In addition, the FIS project, run by the same team, uses RS in general to achieve those goals. All of these projects provide teachers with free, easily accessible and usable software, teaching materials and learning environments based on imagery from the ISS and various satellite sensors [10].

Due to their high degree of visualisation, authenticity and topicality, satellite images also have a high potential as learning material. This is why 13 of the 16 German Federal States require their use in school lessons [11]. Using them properly teaches the pupils spatial decision-making and responsibility, one of the main goals of geography lessons. Especially the use of digital remote sensing data is and will remain a crucial skill in the future [12] in an increasingly digitalised, globalised and socially responsible society [13,14]. RS requires theories and methods from the school subjects of mathematics, physics, chemistry, and biology, which are used for applications in the wide field of geosciences and can thus be used to teach complex issues from all of these fields, all the while keeping the pupils' attention through fascination and application in the real world. One of those is the interconnection of human activity and different Earth ecosystems that is revealed by the view from above, helping the pupils understand the coupled human-environment system.

For the most recent app and work sheet about gravitation in the Earth-Moon system, several physical principles had to be implemented in a simplified way. This simplification was necessary due to two constraints: the topic has to be understandable for pupils with limited background knowledge in physics while transporting scientific methods and principles with enough scientific accuracy that they get the correct idea. The other problem is applying resource-intensive calculations to large volumes of EO image data on pupils' smartphones, which rarely have free data storage and performance in the amount required for actual EO data analysis.

To convey how the Moon affects everyday life on Earth on a physical level, several objectives had to be implemented considering the above-mentioned constraints. Simulating the change in the tides relative to the Moon's distance, as well as the Moon's position and its distortion when coming too close to the Earth, were the two main challenges, but during development, representing the Moon's shadow accurately became a challenge of its own. Including a video from the ISS was simple. Finally, a 3D model of a barycentre was implemented.
2. Implementing the astronaut's perspective into school lessons

2.1. Possibilities of Earth Observation from the ISS in school lessons

Out of the 63 Earth Observation experiments aboard the ISS, five were especially well suited for creating learning materials: HDEV [15], METEOR [16], HICO [17], CATS [18], and RapidScat [19] (see Table 1). All of these sensors have in common that their data is freely available, that processing is possible in common GIS, and that their topics are relevant for German school curricula.

Several teaching modules use videos from the NASA HDEV experiment (launched in March 2014 and still operational), which has been providing a nearly constant video stream since 2014. Four commercial off-the-shelf cameras were mounted on the Columbus External Payload Adapter of the ESA Columbus Module to test how cameras developed for use on Earth would fare in the harsh environment of space [15,20]. The cameras face in different directions (1 fore, 1 nadir, 2 aft) and stream in an automatic camera cycle determined by the NASA HDEV team. The Universities of Bochum and Bonn have the permission and ability to alter that cycle, which is done to track certain phenomena from all angles (hurricanes, forest fires, etc.). The ground resolution of the nadir camera is approximately 500 m per pixel, the ground swath width 530 km [15].
Table 1
EO imaging experiments aboard the ISS suitable for the creation of teaching materials in KEPLER ISS.

Experiment | Launch | End | Instrument | Objective | Teaching materials
HDEV | 03/2014 | operational | 4 RGB HD cameras for constant video stream | testing off-the-shelf cameras in space | 8 modules and apps
METEOR | 03/2016 | operational | RGB ultra-sensitive high-resolution camera | analysis of meteor streaks on the dark side of Earth | 2 apps
HICO | 09/2009 | 09/2014 | hyperspectral sensor with 128 channels | analysis of coastal environments | under development
CATS | 01/2015 | 10/2017 | LiDAR for range-resolved profile measurements | analysis of atmospheric aerosols and clouds | planned
RapidScat | 09/2014 | 08/2016 | scatterometer | measurement of near-surface wind speed and direction over the ocean | planned
All data gathered in the stream is stored exclusively on the Columbus Eye project server, currently about 42 terabytes of data in the MPEG-4 file format. The videos have a resolution of 1280 × 780 pixels. Out of the hourly segments sent to Earth, highlight videos are cut and enhanced to reduce atmospheric scattering and to improve contrast and intensity values. This is performed in MATLAB© based on individually tested images [21]. Since the off-the-shelf cameras have neither infrared nor ultraviolet bands, no real atmospheric correction can be performed [22].

2.2. Educational principles and new media

The learning modules are prepared with more goals in mind than just to fascinate pupils for STEM. They are interdisciplinary, connecting the sciences both for the theoretical basis and for the practical application. While work sheets on paper still play a great role, the intermedia approach combines videos, animations, computer-based tools, and smartphone apps. This requires a high degree of interactivity, both with the different types of media and with other pupils. These three approaches, combined in a science-oriented, moderately constructivist approach [23,24], give the pupils important hard and soft skills for their future STEM careers. Among these are the ability to transfer knowledge and to solve problems autonomously, but the material is also designed to improve competences in literacy, scientific workflows, spatial orientation, as well as decision-making and responsibility [22].

Besides reading, writing, and arithmetic, algorithms have become the fourth literacy skill of the 21st century [14]. Basic computational skills and media interaction are essential in STEM education. Pupils need to use tools to experiment and find solutions for themselves [25]. Since quantitative RS depends on computer programmes, E-learning has a pivotal role. However, few German schools have modern IT equipment readily available. The ratio of computers to pupils is often too low to use them regularly. Internet speeds are inconsistent and, at best, low [26]. At teacher trainings all over Germany, our team found that some schools' computers run on hardware that cannot support a recent operating system and sometimes has not seen a security update in years. Thus, the digital materials developed have to be small enough to be downloaded quickly and to work on computers with very low specifications. Small Flash applications with only two to three typical GIS functions, working with small embedded EO imagery, were found to work reliably on such computers and are used for most of the projects' learning modules. On the other hand, where the IT infrastructure is up-to-date, security and other requirements have to be met. So far, all e-learning modules were implemented using Flash technology, which is outdated and will no longer be supported by its creator Adobe®. The modules will be transferred to HTML5/JavaScript in the near future.

New learning modules, however, take advantage of computers that are available anywhere, anytime, usually up-to-date in hard- and software, and with predictable internet speeds: the pupils' smartphones and other mobile devices. Mobile learning (M-learning) puts a focus on interactions between the learner and a learning environment on a mobile device. AR plays a dominant role as it enriches the real world with virtual content and information, like videos, animations, maps, images, texts and other tools [27]. AR apps for research and education have only become widespread in recent years [28], along with the increased availability of devices to run them on [29]. STEM AR apps have gained popularity due to their many positive effects on pupils, among them increased attention, interest and motivation, but also improved cognitive and social skills [28].

Whether analogue, E-, or M-learning materials, all of them come with background information for the teachers, helping them to understand the sometimes complex topics and methods as well as to fit the learning materials into their curricula. A total of 12 topics in geography, 4 in biology, 3 in physics, 2 in mathematics, and 1 in informatics are offered on the FIS website [30], as well as a general introduction to remote sensing, research tools and analysis tools. About half of these materials are available in English, including several of the digital modules and the general introduction. On the Columbus Eye website [31], 5 geography, 3 physics, and 1 mathematics topics are offered, as well as the observatory, where pupils can experiment with classification and mapping. Few of these materials are available in English, as the translation of the learning materials is an ongoing project. As the university working group responsible for these projects became ESERO Germany in 2018, the modules and materials will be distributed and translated into more European languages in the coming years. All of the provided materials and apps are downloadable without any charges or in-app purchases and without additional registration. However, an internet connection and a Google Play Store account are necessary for the app downloads. All materials are created to fit the curricula of the German Federal States.

It is important to introduce teachers to the topics and the materials provided so that they can find out how to implement them in their own lessons with little additional time investment [4]. Teacher trainings by the FIS and Columbus Eye/KEPLER ISS team, which introduce the basics of RS and digital image processing and present an overview of the digital modules, are conducted regularly all over Germany and are provided on request.
3. Classroom experiments with AR

Due to the constraints of school IT infrastructure, new learning modules are based on AR apps running on the pupils' smartphones. The learning apps are provided for free in the Google Play Store and require few permissions: only access to the SD card/external storage, so that the app can be installed there if the user does not want to use the phone's internal storage, and to the camera, since the main principle of AR is to exchange something the camera detects for additional information. Google transmits information for its statistics and error messages, but no other data, including personal data, on the phone is read, written, or transmitted to or by the app creators.

The first AR app (“The Eye of the Cyclone”) focused on very basic functions, like enhancing static work sheet images to animations and annotated videos [24]. More interactive functions were added in later apps [31]. The latest AR app, “The Earth-Moon System”, belongs to a work sheet that teaches about gravitation in the Earth-Moon system and visualises it with RS data [32]. As with the E-learning modules, the app has to work offline after download. The app is ideally downloaded over a stable Wi-Fi connection, but the offline functionality makes the pupils independent of data volumes or school Wi-Fi.

3.1. Functions of the AR app “The Earth-Moon System”

This app turns the 2D marker image of Earth into a 3D rotating Earth and puts the Moon right in front of the smartphone's camera (see Fig. 1). The distance between the Earth marker image and the smartphone Moon represents the distance between the real Earth and Moon, to scale with the virtual ones. Pupils can display what the maximum and minimum tide levels in the German Bight would look like based on their phone's distance to the marker, either on a DEM or on a true-colour Sentinel-2 image. If they come too close to the Earth marker, reaching the Roche limit at which the gravitational force differential would rip the Moon apart, an animation shows an artistic impression of that effect. The time it takes the Moon to revolve around the Earth once is displayed depending on the distance as well, helping the pupils understand Kepler's laws of planetary motion. As the smartphone Moon's shadow on the virtual Earth is displayed as if the Sun was on the opposite side of the room, the pupils can also experiment with solar eclipses to understand why they are so rare and spatially limited.
Other features of the app include an HDEV video from a lunar eclipse as seen from the ISS. It is used to demonstrate the true sizes and distances in space. To help the pupils understand the centre of gravity in a star or planet system, a 3D animation is added to the respective image in the work sheet. The work sheet contains tasks to record findings, hypothesise, and prove or disprove theories, as well as calculations and a text about the formation of the Earth-Moon system. The teacher material contains the tasks' solutions and background information about the mathematics and physics involved in the processes, about the data used, and about the app's programming, all of which can be taught to the pupils as well [27].

Fig. 1. The AR app “ColEye - The Earth-Moon System” in action.

3.2. Developing the AR app

The app was developed in Unity version 5.6 [33] using the Vuforia extension [34]. Reality is augmented through “Image Targets”, which are created when marker images are uploaded to a Unity database in Vuforia. The software determines a set of edges to be detected by the app (see Fig. 2; each yellow cross represents a detected edge). Marker images that are used as Image Targets are designed partially to convey information, but most important is their capacity to be recognised by the app. Since image recognition works via edge detection, good contrast and a high number of clearly distinguishable features are crucial. Colours are irrelevant; only contrast matters [33]. Satellite mosaics of Earth have few hard edges but large transitional areas, and thus the image chosen as the Earth marker had to be enhanced with corner-rich markings, in this case a circular porthole with hexagon screws to fit the theme, as well as a scale with serif-rich text (see Fig. 2a). The 3D rotating Earth superimposed on it was taken from a downloadable package on the Unity Asset Store [35]. If an image does not have enough edges to be recognised by the Vuforia software, the surrounding text can be added into the image. This stabilises the image recognition partially, but cannot solve the problem entirely due to the repetitiveness of text edges (see Fig. 2b): the image takes longer to be recognised and the superimposed animation is unstable. The main challenge is to add something rich in edges to the graphic without reducing its comprehensibility.

In the app, all the materials' positions, sizes and rotations are stored in a 3D coordinate system. The smartphone camera moves around in this coordinate system, relative to the detected marker. Depending on the lengths and angles between the marker's known edges, the smartphone's exact position in this coordinate system is calculated. The more discernible edges the marker has, and the closer the camera is (up to a distance at which the marker takes up the entire screen), the better these calculations are.
Fig. 2. Comparison between different Image Targets as seen by the user and by the AR app's image recognition. a) Earth in a porthole. b) 2D visualisation of the barycentre with surrounding text.
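For readers unfamiliar with the Vuforia workflow, the superimposed content of an Image Target is typically switched on and off through Vuforia's trackable event callbacks. The following C# sketch is modelled on Vuforia's standard DefaultTrackableEventHandler pattern of that SDK generation and is not taken from the app itself; the class and field names are illustrative assumptions.

```csharp
using UnityEngine;
using Vuforia;

// Hypothetical handler: enables the superimposed 3D content (e.g. the rotating
// Earth) while its Image Target is detected, tracked, or extended-tracked.
public class MarkerContentToggle : MonoBehaviour, ITrackableEventHandler
{
    public GameObject content;            // e.g. the 3D Earth attached to the Earth Image Target
    private TrackableBehaviour trackable; // the Image Target this script is attached to

    void Start()
    {
        trackable = GetComponent<TrackableBehaviour>();
        if (trackable != null)
            trackable.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(TrackableBehaviour.Status previousStatus,
                                        TrackableBehaviour.Status newStatus)
    {
        bool visible = newStatus == TrackableBehaviour.Status.DETECTED ||
                       newStatus == TrackableBehaviour.Status.TRACKED ||
                       newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;
        content.SetActive(visible); // show the virtual content only while the marker is usable
    }
}
```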
With the “Extended Tracking” function, the positional device tracker's information is added to the marker's tracking information to stabilise the image and to enable tracking beyond the marker's extents [33]. The calculated position of the smartphone relative to the marker is used to calculate the distance relative to the hard-coded size of the marker. This distance is therefore automatically to scale with the marker.

Functions that go beyond the standard capabilities of Unity or Vuforia can be implemented in either C# or JavaScript. It is not necessary to be proficient in either of these to write scripts, but some programming background is useful. Scripting is done in Microsoft Visual Studio®, which includes tools for Unity, among them autocompletion for functions.

The structure of the project is depicted in Fig. 3. In the upper left part, the Scene View is visible. The Moon sphere, with a texture from an equirectangular Moon map, hangs in the app's 3D coordinate system. The lower left of the image shows a preview of the user's screen view. The Earth's size is defined relative to the image target (underneath), so that 1 cm on the target corresponds to 1000 km in the real universe. The light comes from a source orthogonal to the image target's plane, so the visible shadow of the Moon cannot be created by the Moon sphere itself. Instead, an invisible sphere that only creates shadows is put in the right position by a script to give the impression that the Moon sphere is casting that shadow (see ch. 3.2.3). The strongly pixelated edges of the Moon shadow are due to a performance issue with soft shadows. The object structure is displayed on the right of the image. The Moon is attached (as a child object) to the camera, while the “MoonShadow” object is attached to the “EMS_Earth” object, which is the respective image target. Other image targets are represented by the objects “EMS_Moon”, which receives a video overlay, and “EMS_Bary”, which activates the animation of the barycentre (see ch. 3.2.4). The final element, the “EventSystem”, is responsible for managing inputs and is a necessary component provided by Unity [33].
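A minimal sketch of the marker-to-distance conversion described above, assuming the app's 1:100,000,000 scale (1 cm on the target = 1000 km); the class, field names, and the way the camera and target transforms are obtained are illustrative assumptions, not the project's actual script.

```csharp
using UnityEngine;

// Hypothetical distance read-out: converts the tracked camera position relative
// to the Earth Image Target into a real-world Earth-Moon distance.
public class DistanceReadout : MonoBehaviour
{
    public Transform earthTarget;   // the "EMS_Earth" Image Target
    public Transform arCamera;      // the AR camera (the virtual Moon is parented to it)

    // 1:100,000,000 scale: 1 m in the tracked scene corresponds to 100,000 km.
    const float KilometresPerUnityMetre = 100000f;

    public float DistanceKm { get; private set; }

    void Update()
    {
        float metres = Vector3.Distance(arCamera.position, earthTarget.position);
        DistanceKm = metres * KilometresPerUnityMetre; // e.g. 3.844 m corresponds to 384,400 km
    }
}
```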
3.2.1. Tide simulation in relation to distance

World tide simulators exist for a uniform, shallow-ocean Earth, e.g. by Brubaker [36], tide charts are widely available, and even a video simulation of the tidal flow through the British Isles and the North Sea exists [37], based on [38]. However, the kind of simulation required to convey the influence of the Moon's distance on tides at a local level with smartphone-compatible resource consumption was not found, and thus a way to implement it had to be developed. The maximum tidal amplitude is estimated via a simplification of the tidal force
a_g ≈ ± 2RGM/r³   (1)
with R being the Earth's radius, G the gravitational constant, M the Moon's mass, and r the distance between the Earth and the Moon [39]. This simplified formula was specifically chosen because it is easy enough for high school pupils to understand and calculate with [39]. Using known tidal amplitudes for multiple gauges around the German Bight, the virtual maximum amplitudes are calculated proportionally. There is no complex modelling involved, and no time offsets or local variances, as this would require more computing power than the average smartphone has or can reasonably allot to one app.
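Because the tidal acceleration in Eq. (1) scales with 1/r³, a virtual amplitude can be derived from a gauge's known amplitude by a simple cube ratio. A minimal sketch of this proportional scaling follows; the method name and the idea of passing a single known gauge amplitude are assumptions for illustration, not the project's gauge handling.

```csharp
// Hypothetical tide scaling: the known amplitude at the real mean Moon distance
// is rescaled with the 1/r^3 dependence of the simplified tidal force, Eq. (1).
public static class TideScaling
{
    const float MeanMoonDistanceKm = 384400f;

    // knownAmplitudeM: amplitude of a gauge (in metres) at the real mean distance
    public static float AmplitudeAtDistance(float knownAmplitudeM, float moonDistanceKm)
    {
        float ratio = MeanMoonDistanceKm / moonDistanceKm;
        return knownAmplitudeM * ratio * ratio * ratio; // amplitude is proportional to 1/r^3
    }
}
```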
Fig. 3. View of the “Earth-Moon System” app's structure in Unity 5.6.
As it is, the calculation and rendering of the tidal amplitude requires so many resources that it is only performed every 10th frame, or about 3 times per second, to reduce visual stuttering and energy consumption. Part of the energy-intensive method is the image reading and writing. A DEM based on ASTER GDEM and bathymetry data of both the North Sea and the Baltic Sea [32] is stored in byte format: a height difference of 1 m corresponds to a difference of 1 (grayscale) value in the image. When the tide amplitude is calculated, a transparent overlay image is coloured in semi-transparent blue in all pixels whose corresponding values in the DEM lie below the amplitude thresholds, for low and high tide in the respective display (see Fig. 3). The rendered images are overlaid on either an image of the DEM in classic green-to-red shades or on a Sentinel-2 image. The latter was manipulated in Photoshop® to “empty out” the ocean by colour replacement for the low-tide side.

The UI (see Fig. 4) was designed to give as much space as possible to the simulation while still allowing the Earth to be seen in order to keep the function running. The transparent part is filled with text information. The maximum tide is displayed on the left side, the minimum tide on the right side. In the middle column, there are (top to bottom) a button to turn the UI off, the distance between Earth and Moon, the tidal amplitude, the month's length (this name was chosen because a month in its original sense depended on the Moon's cycle), two buttons to switch between the DEM background and the Sentinel-2 scene shown, and finally a button to turn on a white plane to find the Moon's shadow outside the Earth.
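A condensed sketch of how such a threshold overlay could be rendered from the byte-coded DEM; the texture names, the assumption that the height value sits in the red channel without an offset, the chosen colours, and the frame interval are all illustrative, not the app's actual parameters.

```csharp
using UnityEngine;

// Hypothetical overlay pass: colours all DEM pixels below the current tide
// threshold in semi-transparent blue, and only every 10th frame to save energy.
public class TideOverlay : MonoBehaviour
{
    public Texture2D dem;        // byte-coded DEM: 1 grayscale step = 1 m height difference
    public Texture2D overlay;    // transparent texture drawn on top of the DEM / Sentinel-2 image
    public float thresholdM;     // current high- or low-tide level in metres

    static readonly Color32 floodBlue = new Color32(0, 80, 200, 120);
    static readonly Color32 clear = new Color32(0, 0, 0, 0);

    void Update()
    {
        if (Time.frameCount % 10 != 0) return;   // roughly 3 updates per second at 30 fps

        Color32[] heights = dem.GetPixels32();
        Color32[] pixels = overlay.GetPixels32();
        for (int i = 0; i < heights.Length; i++)
        {
            // assumed mapping: the red channel carries the height; flood everything below the threshold
            pixels[i] = heights[i].r < thresholdM ? floodBlue : clear;
        }
        overlay.SetPixels32(pixels);
        overlay.Apply();                         // upload the changed overlay to the GPU
    }
}
```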
3.2.2. Moon position and distortion

The time it takes the Moon to revolve around the Earth, i.e. the month's length when going by the original definition of the month, also depends on the distance between the two bodies. Kepler's third law of planetary motion states that the square of the orbital period of a planet is directly proportional to the cube of the semi-major axis of its orbit, or:

(T1/T2)² = (a1/a2)³   (2)

with the orbital periods T1 and T2, and the semi-major axes a1 and a2, of two orbits around the same centre of mass [40].
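Rearranged for the app, Eq. (2) gives the displayed month length directly from the current distance: T(r) = T_ref · (r/r_ref)^(3/2). A minimal sketch follows; the reference values are standard constants for the Moon's mean distance and sidereal period, while the class and method names are illustrative assumptions.

```csharp
using UnityEngine;

// Hypothetical month-length calculation from Kepler's third law, Eq. (2):
// (T1/T2)^2 = (a1/a2)^3  =>  T(r) = T_ref * (r / r_ref)^(3/2)
public static class MonthLength
{
    const float MeanMoonDistanceKm = 384400f;   // reference semi-major axis
    const float SiderealMonthDays  = 27.32f;    // reference orbital period

    public static float Days(float moonDistanceKm)
    {
        return SiderealMonthDays * Mathf.Pow(moonDistanceKm / MeanMoonDistanceKm, 1.5f);
    }
}
```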
The Moon itself is a simple sphere with a texture (see Fig. 3). The texture is based on an equirectangular projection of the Moon, in which degrees of longitude and latitude map directly onto the number of pixels. This projection is the inverse of the distortion applied to the sphere texture and is thus the best choice for any planetary object in a virtual 3D environment.

Since there may be tracking inconsistencies, the Moon sphere's position is calculated relative to the camera of the smartphone. In the game object structure, this means the Moon sphere is “attached” to the camera object instead of the Earth image target object. Without a script, only the Moon's far side would be shown on the screen, no matter where the Moon was positioned relative to the Earth. Therefore, a script was attached that calculates the position of the camera-centred Moon relative to the Earth image target's centre and rotates the Moon sphere accordingly. Theoretically, the user would be able to see the Moon from all sides; practically, the Extended Tracking has its limits and the rotation stops working soon after the Earth image target tracking is lost, which turns off the Moon in the script, as it would otherwise malfunction.

When the smartphone comes too close to the Earth marker image and the virtual Moon thus comes close to the Roche limit, it is distorted in relation to the distance. After a threshold is passed, the Moon is distorted by adjusting the scale in all three dimensions, and the smartphone's vibration is activated. An array of textures stored in the application contains the same equirectangular map that is displayed outside the Roche limit, but with gradually stronger lava flows and red glowing canyons painted on them to represent the frictional heat generated by the tidal distortion. Along with the distortion, the texture is renewed depending on the distance, allowing the pupils to put the Moon back together. At the point where the Moon would rip apart, the screen turns orange to indicate the catastrophe.
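A sketch of how this distortion and texture swap could look in a Unity script; the threshold value, scaling factors, stretch axis, and texture handling are illustrative assumptions rather than the app's actual parameters.

```csharp
using UnityEngine;

// Hypothetical Roche-limit effect: below a distance threshold the Moon sphere is
// stretched, the texture is swapped for increasingly "molten" versions, and the
// phone vibrates.
public class RocheLimitEffect : MonoBehaviour
{
    public Transform moonSphere;
    public Renderer moonRenderer;
    public Texture2D normalTexture;          // equirectangular map shown outside the Roche limit
    public Texture2D[] heatTextures;         // gradually stronger lava flows and glowing canyons
    public float rocheThresholdKm = 18000f;  // illustrative threshold, not the exact Roche limit

    public void UpdateEffect(float moonDistanceKm)
    {
        if (moonDistanceKm >= rocheThresholdKm)
        {
            moonSphere.localScale = Vector3.one;              // undistorted Moon
            moonRenderer.material.mainTexture = normalTexture;
            return;
        }

        // t = 0 just inside the threshold, t = 1 at the break-up point
        float t = 1f - moonDistanceKm / rocheThresholdKm;
        moonSphere.localScale = new Vector3(1f - 0.3f * t, 1f - 0.3f * t, 1f + 0.8f * t);
        int index = Mathf.Min((int)(t * heatTextures.Length), heatTextures.Length - 1);
        moonRenderer.material.mainTexture = heatTextures[index];
        Handheld.Vibrate();                                   // haptic feedback while distorted
    }
}
```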
Fig. 4. UI of the tide simulation (in German).
3.2.3. The moon shadow

The Moon's shadow on the virtual Earth cannot be displayed as the soft, conic shadow it is in the real world due to restrictions on the resource usage of the smartphone (see Fig. 3). Displaying shadows at all uses a lot of computing performance and energy and is not intended for distances greater than ∼0.5 m. At the handy scale of 1:100,000,000 that the app uses to let the Earth-Moon system fit into the size of a classroom, the maximum shadow distance would have to be about 3.8 m. Instead of displaying the actual shadow of the virtual Moon, an invisible sphere is kept at a close distance to the virtual Earth on the same vector as the virtual Moon and only casts a shadow based on the point light source in the distance, orthogonal to the wall the Earth marker is stuck to (see Fig. 4). Among the Unity light assets, there is nothing that accurately represents the Sun's light distribution, meaning the double-conic shadow of the Moon as created by the much larger Sun cannot easily be re-created. The invisible sphere at constant distance has to change in size depending on the distance between Earth and Moon. For reasons of resource consumption and processing speed, it also has to cast a “hard shadow”, meaning the edges are pixelated, i.e. an inhabitant of the virtual Earth would not see partial shadows. The issue is discussed in the accompanying teacher material [32].

3.2.4. ISS video and barycentre model

There are two more markers implemented in the app, one of which is overlaid with a video taken from the NASA HDEV experiment. Shown is a lunar eclipse setting and rising above the Earth. The Moon makes up only a few pixels, while the Earth's surface, being very close to the cameras, is only barely discernible as round due to the close terminator. This is supposed to give the pupils a better sense of the dimensions in space and to incite discussion about the magnification effect of the atmosphere and the lack of intermediate objects by which to estimate the Moon's size through comparison.

Another marker serves as an anchor for a 3D model of a barycentre, the centre of mass of two or more bodies that they orbit. In the implementation, an invisible sphere is created and a spinning script is attached to it. Inside the invisible sphere are two more small, coloured spheres that represent the bodies spinning around the barycentre. The orbits are created from Line Renderers. This component takes an array of 3D points in the virtual coordinate system, of which the marker is the centre, and connects them with a vector line (see Fig. 5). A script fills this array with points forming a circle, starting and ending inside the body spheres. The two bodies and their orbits are arranged inside the invisible sphere so that they follow the same directions and are opposite each other as seen from the barycentre. As the invisible sphere spins on the spot, its contents give the impression of two planets orbiting their common centre of mass. Due to the 3D implementation and extended tracking, the smartphone can be moved around the animation to view the effect from the top or from the side, practically explaining how exoplanets can be found through wobbling stars [41].
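A minimal sketch of how such a circular orbit could be filled into a Line Renderer; the radius, segment count, and orbit plane are illustrative assumptions, not the values used in the app.

```csharp
using UnityEngine;

// Hypothetical orbit drawing: fills a LineRenderer with points on a circle around
// the barycentre so that the line starts and ends inside the orbiting body sphere.
[RequireComponent(typeof(LineRenderer))]
public class OrbitCircle : MonoBehaviour
{
    public float radius = 0.05f;   // orbit radius in the marker's coordinate system
    public int segments = 64;      // number of points on the circle

    void Start()
    {
        var line = GetComponent<LineRenderer>();
        line.positionCount = segments + 1;     // one extra point closes the circle
        line.useWorldSpace = false;            // positions relative to the barycentre object

        for (int i = 0; i <= segments; i++)
        {
            float angle = 2f * Mathf.PI * i / segments;
            line.SetPosition(i, new Vector3(Mathf.Cos(angle) * radius,
                                            0f,
                                            Mathf.Sin(angle) * radius));
        }
    }
}
```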
4. Testing the learning materials in school

The Columbus Eye materials have been used and evaluated extensively in a new elective subject called Geography-Physics, which uses the materials for interdisciplinary lessons. The subject was implemented at the projects' partner school Gymnasium Siegburg Alleestraße (a Gymnasium being an academically oriented secondary school in Germany). At the end of the two-year course, most of the 24 pupils agreed that the live images from the ISS were not only interesting, but also helped to comprehend the topics (see Fig. 6) [42].

Many of these pupils were in the alpha test group for the “Earth-Moon System” app, which was tested in two advanced geography courses of grade 10 and one Geography-Physics course in grade 9. Most of the pupils declared to have little to no interest in physics and, accordingly, low knowledge levels: of the 35 pupils participating in grade 10, only 2 had elected physics classes for the year and none of them had elected advanced physics classes. The 9th grade only performed an alpha test of the app's functions and did not do the app's tasks. The test classes were conducted primarily to test the functionalities.
The Earth marker image is recognised well by the app, even at a distance, and most of the functions work. The Moon shadow was misplaced in all tests when the smartphone was further away from the marker image than approx. 2 m, which is not enough to simulate the actual distance of Earth and Moon at 356–407 Mm, or 3.56–4.07 m in the app. A solution is still being worked on. The tide simulation surprised many of the pupils, who had imagined the Moon to be much closer to Earth. Putting it at the distance they imagined it to be relative to the size of the Earth (often less than 1 m) resulted in tides between 50 and 150 m in height. The Roche limit functions were received very positively, especially the vibration of the smartphones. The pupils had fun experimenting with the Moon distance and, despite their earlier statement about their lack of interest in physics, had a long discussion about gravity and orbital mechanics. The other two functions did not work well due to their repetitive markers and an Android camera autofocus malfunction. Up until Android Oreo, this required a manual fix that was implemented in the app. However, the fixing script made the problem worse from that version on and was removed after the test. Since these two animations are important to the overall understanding, no conclusions can be drawn about the pupils' reception and understanding from this test.

The app is accompanied by a work sheet that contains arithmetic problems and information texts and has the pupils deduce hypotheses or apply principles of gravity and orbital mechanics all over the solar system. Despite not having been interested in physics at the beginning of the class, most pupils actively engaged in the use of the app, the discussion and the maths problems, and two asked for more information on the topics at the end of class.

The “Earth-Moon System” app is currently being transferred into one comprehensive app for the KEPLER ISS project. All improvements the pupils recommended have been implemented in that version of the app, which is to be released in fall 2019, together with an iOS version. English versions will be included. A full quantitative evaluation regarding its effects on comprehension and motivation of the pupils will be performed after its release.
Fig. 5. The barycentre animation. Top: in the Scene View. Bottom: the view of the same objects in the app.
Fig. 6. Evaluation by the test group at GSA.

5. Conclusions

There is a large deficit in STEM workers that will increase in the future. Methods need to be developed that motivate pupils enough for STEM that they consider it as a career. Remote sensing covers many STEM topics, starting with mathematical, physical and programming background knowledge and ending at applications in various physical and anthropogeographic topics. Thus, it is especially suited to show pupils the possibilities of STEM applications and careers.

Smartphones work well as a replacement for lacking school IT infrastructure. AR smartphone apps are used for the demonstration of concepts as well as for digital experiments. The 3D visualisation and animation help the pupils understand complex topics and motivate them to learn more about them. Even with the constraints of low data volumes and processing power on smartphones, basic physical principles can be visualised and turned into digital experiments for educational purposes. Modelling tides does not need to be completely accurate for educational purposes and could thus be simplified to a minimum vs. maximum visualisation while still applying gravitational physics. Due to lighting constraints, the Moon and its shadow had to be implemented separately. A simple 3D animation can convey how a system's barycentre works.

Developing AR apps based on real data and scientific principles has become less complicated recently due to the availability of partial WYSIWYG application development systems. Basic programming skills allow researchers, not just dedicated programmers, to visualise their data and principles. This is useable in secondary education, where the technology helps the pupils to understand complex topics through digital experiments and visualisation.

Acknowledgements

FIS, Columbus Eye and KEPLER ISS are supported by the DLR with funds of the Federal Ministry for Economic Affairs and Energy [grant numbers 50EE1703, 50JR1307 and 50JR1701] based on a resolution of the German Bundestag.

Appendix A. Supplementary data

Supplementary data to this article can be found online at https://doi.org/10.1016/j.actaastro.2019.05.025.

References

[1] C. Anger, O. Koppel, A. Plünnecke, E. Röben, R.M. Schüler, MINT-Herbstreport 2018, Institut der Deutschen Wirtschaft, Köln, 2018, https://www.arbeitgeber.de/www/arbeitgeber.nsf/res/Mint-Herbstreport%202018.pdf/$file/Mint-Herbstreport%202018.pdf.
[2] OECD, PISA 2015 Results in Focus, OECD, 2018, www.oecd.org/pisa/pisa-2015-results-in-focus.pdf.
[3] J. Jirout, D. Klahr, Children's scientific curiosity: in search of an operational definition of an elusive concept, Dev. Rev. 32 (2) (2012) 125–160, https://doi.org/10.1016/j.dr.2012.04.002.
[4] A. Kojima, To ignite the passion in children's hearts – role and effect of space education, issues and consideration, Acta Astronaut. 127 (2016) 614–618, https://doi.org/10.1016/j.actaastro.2016.06.040.
[5] B.S. Caldwell, Spaceflight-relevant STEM education and outreach: social goals and priorities, Acta Astronaut. 112 (2015) 174–181, https://doi.org/10.1016/j.actaastro.2015.03.017.
[6] V.I. Mayorova, S.N. Samburov, O.V. Zhdanovich, V.A. Strashinsky, Utilization of the International Space Station for education and popularization of space research, Acta Astronaut. 98 (2014) 147–154, https://doi.org/10.1016/j.actaastro.2014.01.031.
[7] Appinio Research, Studie zu Traumberufen, Das sind die Berufswünsche der Kinder von heute, www.appinio.com/de/blog/studie-zu-traumberufen-berufsw%C3%BCnsche-der-kinder-von-heute, (2017), Accessed date: 20 August 2018.
[8] NASA, Space station research explorer, www.nasa.gov/mission_pages/station/research/experiments/explorer/, Accessed date: 11 April 2019.
[9] A. Rienow, H. Hodam, F. Selg, G. Menz, Columbus Eye: interactive Earth observation from the ISS in class rooms, in: A. Car, T. Jekel, J. Strobl, G. Griesebner (Eds.), GI-Forum, J. Geogr. Inf. Sci., Wichmann, Berlin, 2015, pp. 349–353, https://doi.org/10.1553/giscience2015s349.
[10] A. Rienow, V. Graw, S. Heinemann, J. Schultz, F. Selg, G. Menz, Earth Observation from the ISS Columbus Laboratory – an Open Education Approach to Foster Geographical Competences of Pupils in Secondary Schools, Proc. 'Living Planet Symposium 2016', Prague, Czech Republic, 2016, 9–13 May.
[11] A. Siegmund, Satellitenbilder im Unterricht – eine Ländervergleichsstudie zur Ableitung fernerkundungsdidaktischer Grundsätze, (2011) (diss. at PH Heidelberg).
[12] K. Voß, R. Goetzke, F. Thierfeldt, Integration von angewandten Fernerkundungsmethoden im Schulunterricht der Sekundarstufen I und II, in: T. Jekel, A. Koller, J. Strobl (Eds.), Lernen mit Geoinformation II, Wichmann, Berlin, 2007, pp. 183–191.
[13] B. Trilling, C. Fadel, 21st Century Skills: Learning for Life in Our Times, Jossey-Bass, San Francisco, 2009.
[14] C.N. Davidson, Now You See It: How the Brain Science of Attention Will Transform the Way We Live, Work, and Learn, Viking Penguin Books, New York, 2011.
[15] P. Muri, S. Runco, C. Fontanot, C. Getteau, The High Definition Earth Viewing (HDEV) payload, in: IEEE Aerosp. Conf., Big Sky, USA, 2017, 4–11 March, https://doi.org/10.1109/AERO.2017.7943749.
[16] Planetary Exploration Research Center, ISS meteor observation project "METEOR", www.perc.it-chiba.ac.jp/project/meteor/, Accessed date: 20 August 2018.
[17] M.R. Corson, D.R. Korwan, R.L. Lucke, W.A. Snyder, The Hyperspectral Imager for the Coastal Ocean (HICO) on the International Space Station, IEEE IGARSS Proc., Boston, USA, 2008, 6–11 July, https://doi.org/10.1029/2011EO190001.
[18] M.J. McGill, J.E. Yorks, V.S. Scott, A.W. Kupchock, P.A. Selmer, The Cloud-Aerosol Transport System (CATS): a technology demonstration on the International Space Station, Proc. SPIE 9612, Lidar Remote Sensing for Environmental Monitoring XV, San Diego, USA, 2015, 11–13 August, https://doi.org/10.1117/12.2190841.
[19] N. Ebuchi, Evaluation of marine vector winds observed by RapidScat on the International Space Station using statistical distribution, IEEE IGARSS Proc., Milan, Italy, 2015, 26–31 July, https://doi.org/10.1109/IGARSS.2015.7326930.
[20] S. Runco, International Space Station – High Definition Earth Viewing (HDEV), www.nasa.gov/mission_pages/station/research/experiments/917.html, (2015), Accessed date: 20 August 2018.
[21] J. Schultz, A. Ortwein, A. Rienow, Technical note: using ISS videos in Earth observation – implementations for science and education, Eur. J. Remote Sens. 51 (1) (2018) 28–32, https://doi.org/10.1080/22797254.2017.1396880.
[22] J. Schultz, C. Lindner, H. Hodam, A. Ortwein, F. Selg, J. Weppler, A. Rienow, Augmenting pupils' reality from space – digital learning media based on Earth observation data from the ISS, Proc. 68th Int. Astronaut. Congr., Adelaide, Australia, 2017, 25–29 September.
[23] K. Voß, R. Goetzke, H. Hodam, A. Rienow, Remote sensing, new media and scientific literacy – a new integrated learning portal for schools using satellite images, in: T. Jekel, A. Koller, K. Donert, R. Vogler (Eds.), Learning with GI 2011 – Implementing Digital Earth in Education, Wichmann, Berlin, 2011, pp. 172–180.
[24] A. Ortwein, V. Graw, S. Heinemann, G. Menz, J. Schultz, F. Selg, A. Rienow, Pushed beyond the pixel – interdisciplinary earth observation education from the ISS in schools, Proc. 67th Int. Astronaut. Congr., Guadalajara, Mexico, 2016, 26–30 September.
[25] S. Nag, J.G. Katz, A. Saenz-Otero, Collaborative gaming and competition for CS-STEM education using SPHERES Zero Robotics, Acta Astronaut. 83 (2013) 145–174, https://doi.org/10.1016/j.actaastro.2012.09.006.
[26] Initiative D21, Sonderstudie "Schule Digital" – Lehrwelt, Lernwelt, Lebenswelt. Digitale Bildung im Dreieck SchülerInnen-Eltern-Lehrkräfte, initiatived21.de/app/uploads/2017/01/d21_schule_digital2016.pdf, (2016), Accessed date: 24 August 2018.
[27] M. Dunleavy, C. Dede, R. Mitchell, Affordances and limitations of immersive participatory augmented reality simulations for teaching and learning, J. Sci. Educ. Technol. 18 (1) (2009) 7–22, https://doi.org/10.1007/s10956-008-9119-1.
[28] J. Li, E.D. van der Spek, L. Feijs, F. Wang, J. Hu, Augmented reality games for learning: a literature review, Int. Conf. on Distributed, Ambient, and Pervasive Interactions, Vancouver, Canada, 2017, 9–14 July.
[29] R. Wojciechowski, W. Cellary, Evaluation of learners' attitude toward learning in ARIES augmented reality environments, Comput. Educ. 68 (2013) 570–585.
[30] A. Rienow, Fernerkundung in Schulen, fis.rub.de/, Accessed date: 11 April 2019.
[31] A. Rienow, Columbus Eye, columbuseye.rub.de/, Accessed date: 11 April 2019.
[32] C. Lindner, H. Hodam, A. Ortwein, J. Schultz, F. Selg, C. Jürgens, A. Rienow, Towards a new horizon for planetary observation in education, Proc. 2nd Symp. on Space Educ. Act., Budapest, Hungary, 2018, 11–13 April, http://www.hit.bme.hu/~bacsardi/SSEA/SSEA2018_proceedings.pdf.
[33] Unity Technologies, Unity, https://unity.com/, Accessed date: 11 April 2019.
[34] PTC Inc., Vuforia Developer's Guide, https://library.vuforia.com/, Accessed date: 11 April 2019.
[35] Headwards, Planet Earth Free, Unity Asset Store, (2016), https://assetstore.unity.com/packages/3d/environments/sci-fi/planet-earth-free-23399, Accessed date: 12 February 2019.
[36] C. Brubaker, Earth's tides simulation, J. Comput. Sci. Coll. 19 (2004) 351–352.
[37] openDelft3D, The tide in the North Sea simulated with the ZUNO model, https://www.youtube.com/watch?v=JNOpJxfI_Mg, (2011).
[38] C. Coughlan, A. Stips, JRC Technical Report – Modelling the Tides on the North West European Shelf, Publications Office of the European Union, Luxembourg, 2015, https://doi.org/10.2788/17992.
[39] T. Franc, Tides in the Earth-Moon system, Proc. 21st Ann. Conf. Doct. Stud. WDS 2012, Charles University, Prague, 2012, 29 May – 1 June, https://www.mff.cuni.cz/veda/konference/wds/proc/pdf12/WDS12_318_f12_Franc.pdf, Accessed date: 10 April 2019.
[40] J. Kepler, Harmonices Mundi, Libri V., Tampachius, Linz, 1619, https://doi.org/10.3931/e-rara-8723.
[41] R. Launhardt, Exoplanet search with astrometry, N. Astron. Rev. 53 (11–12) (2009) 294–300, https://doi.org/10.1016/j.newar.2010.07.006.
[42] C. Lindner, C. Müller, H. Hodam, C. Jürgens, A. Ortwein, J. Schultz, F. Selg, J. Weppler, A. Rienow, From Earth to Moon and beyond – immersive STEM education based on remote sensing data, Proc. 69th Int. Astronaut. Congr., Bremen, Germany, 2018, 1–5 October.