The “Smart Garden” System using Augmented Reality

Tsuyoshi Okayama, Kazuya Miyawaki
*Ibaraki University, Inashiki-gun, Ibaraki, 300-0393 Japan (Tel: +81-29-888-8592; e-mail: [email protected]).

Abstract: The goal of this research is to build an advanced gardening system called “Smart Garden”. The Smart Garden supports beginners in farming operations, such as home gardeners. The objectives of this paper are to develop a support system for the Smart Garden which can 1) visualize guidance for farming operations using CGs and overlay it on the field where the operator is working, and 2) record the operator’s positions and viewpoints during operations.

Keywords: Augmented reality, gardening, precision agriculture, sensing
1. INTRODUCTION

The goal of this research is to build an advanced gardening system called “Smart-Garden”. The Smart-Garden supports beginners in farming operations, such as home gardeners and new farmers. Home vegetable gardening is becoming popular in Japan. Because Japan is entering an unprecedented aging society, many retired people have abundant leisure time, and home gardening is one way of spending it. Younger generations are also interested in vegetable gardening as a hobby, for healthy food, and for children’s education. However, it is difficult for beginners to manage their own home garden because of the enormous number of factors that must be considered, such as soil, weather, and climate conditions. The Smart-Garden supports growers’, especially beginners’, decisions with rich information based on data collected by sensors and a database.

The concept of the Smart-Garden is based on precision agriculture (PA). The report of the National Research Council in the United States of America (1997) defines PA as a management strategy that uses information technologies to bring data from multiple sources to bear on decisions associated with crop production. A key difference between conventional management and PA is the application of modern information technologies to provide, process, and analyse multi-source data of high spatial and temporal resolution for decision making and operations in the management of crop production. A decision support system helps with decisions such as the selection of crops to cultivate and the choice of farm operations. In PA, farmers need to plan carefully and act in order to pursue multiple goals, such as environmental load reduction, maintenance of soil fertility, good agricultural production, and increases in yield. PA needs to propose a strategy that balances these goals, and to provide a system that supports the farmers’ best judgment.
Optimization algorithms and information management play very important roles in a decision support system. Information management involves recording field map data and cultivation history, accumulating the experience and knowledge of farmers, and sharing these data among the personnel concerned.
In particular, the accumulation of failure cases is important to avoid irrevocable situations. The concept of the decision support system of the Smart Garden is based on PA. One of the important problems of a decision support system is how to present information to users. Therefore, we employ an augmented reality (AR) system for visualization of information.

AR is a variation of Virtual Environments (VE), or Virtual Reality (VR). VE technologies completely immerse a user inside a synthetic environment. While immersed, the user cannot see the real world around him/her. In contrast, AR allows the user to see the real world, with virtual objects superimposed upon or composited with the real world. Therefore, AR supplements reality, rather than completely replacing it (Azuma, 1997). Azuma also defined AR as systems that have the following three characteristics: 1) they combine real and virtual; 2) they are interactive in real time; and 3) they are registered in 3-D. In other words, an AR system is a technology that strengthens information on the real world by overlaying information produced by computers. AR is a specific example of what Brooks (1996) called Intelligence Amplification: using the computer as a tool to make a task easier for a human to perform.

There has been much research on applications of AR systems. For example, Rekimoto and Nagao (1995) proposed a device, called NaviCam, that has the ability to recognize the user’s situation by detecting color-code IDs in real-world environments. In the medical domain, Lievin and Keeve (2001) proposed a surgical simulation system in which a stereoscopic overlay is visually superimposed on the patient. By overlaying the three-dimensional image of internal organs obtained from MRI (Magnetic Resonance Imaging) or CT (Computed Tomography) on the patient’s video image during an operation with a head-mounted display (HMD), it is possible to operate while carefully checking the patient’s three-dimensional position. Additionally, it can be used for surgical training.
In the assembly domain, AR systems can be used in assembly planning, e.g. by superimposing virtual planning results onto the real manufacturing environment (Reinhart and Patron, 2003).
Most farm operations require both hands. It is very troublesome to search for required information during farming operations. Therefore, if HMDs are available while using an AR system, farmers can obtain information overlaid on the objects they are watching, such as plants and soils, and keep their hands free during operations. AR applications can display useful information depending on the context, e.g. which chemical fertilizer to apply, how much, and when.

It is also important to monitor and record farming operations in order to increase agricultural productivity. Professional farmers usually record their operations manually or, if they have enough IT literacy, by using assistant software. These kinds of tasks are usually very troublesome for farmers, especially elderly people. Therefore, farmers’ operations should be recorded automatically. Much research has been conducted on automatic recording systems for farming operations using GPS (e.g. Matsuo et al., 2011) and RFID (Nanseki et al., 2007; Fukatsu and Nanseki, 2009). GPS covers a very wide area, but the errors of hand-held GPS are usually more than a few meters, and the accessible distance of RFID is shorter than 200 mm (Fukatsu and Nanseki, 2009). An AR system can detect a camera’s position and orientation using fiducial markers. If the camera is attached to the head of a farmer, similar to an HMD, the AR system can detect the positions of the farmer’s head and the viewpoint. This means that the information is obtained automatically when operators reach the objects (Nanseki et al., 2007).

In this paper, we propose a support system for farming operations which can 1) visualize guidance for farming operations and virtual plants using CGs on a field where an operator is working, and 2) record the operator’s positions and viewpoints during operations.
2. MATERIALS AND METHODS

2.1 ARToolKit for Augmented Reality

ARToolKit is a software library for building AR applications (Kato and Billinghurst, 1999; Kato et al., ARToolKit homepage). ARToolKit uses computer vision algorithms to calculate the camera position and orientation relative to fiducial markers in real time. In this study, we used the “NyARToolkit for Processing” library, which is based on ARToolKit, to use AR technology in the Processing environment. Processing is a programming language for computer art made by Ben Fry and Casey Reas (Processing.org). The Processing language has many functions specialized for visual expression and interaction, and images can be controlled readily and intuitively.

A fiducial marker (Fig. 1) is a square black frame with a unique symbol inside, which makes it possible to differentiate the marker from others. In order to track the user’s viewpoint (the camera’s position and orientation), the fiducial marker plays an essential role in the following procedure. The image captured by the camera is converted to a binary image, and the black marker frame is identified. Then, the position and orientation of the marker relative to the camera are calculated. The symbol inside the marker is matched against templates stored in the computer, and virtual objects are rendered in the video frame according to this information.
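The identification steps above (binarization of the captured frame, then matching the symbol inside the black frame against stored templates at all four rotations, which also determines the marker's orientation) can be sketched as follows. This is a minimal illustration, not ARToolKit's actual implementation; the threshold value and the template data are invented for the example.

```python
def binarize(gray, threshold=128):
    """Convert a grayscale image (list of rows of pixel values) to a 0/1 binary image."""
    return [[1 if px >= threshold else 0 for px in row] for row in gray]

def rotate90(grid):
    """Rotate a square binary grid 90 degrees clockwise."""
    return [list(row) for row in zip(*grid[::-1])]

def match_marker(symbol, templates):
    """Return (template_id, rotation) of the best-matching template,
    trying all four 90-degree rotations of the extracted symbol.
    Returns (None, None) if no template matches at least 75% of cells."""
    best = (None, None, 0.0)
    n = len(symbol) * len(symbol[0])
    for tid, tpl in templates.items():
        grid = symbol
        for rot in range(4):
            score = sum(
                1 for r in range(len(tpl)) for c in range(len(tpl[0]))
                if grid[r][c] == tpl[r][c]
            ) / n
            if score > best[2]:
                best = (tid, rot, score)
            grid = rotate90(grid)
    tid, rot, score = best
    return (tid, rot) if score >= 0.75 else (None, None)
```

In the real system, NyARToolkit performs these steps on every video frame before the pose calculation.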
Fig. 1. The relationship among the marker coordinates, the camera coordinates, and the screen coordinates is estimated by image analysis.

2.2 Experimental Field: Smart Garden

The Smart Garden was created at the Ami campus of Ibaraki University in Japan (latitude: 36.035993N, longitude: 140.21488E), and its shape is a keyhole type. The Smart Garden has 11 compartments, and plants are grown and managed in each compartment. Various sensors, such as temperature, humidity, light intensity, soil moisture, EC, and pH sensors, will be installed to obtain environmental information. Conceptually, the Smart Garden consists of two “environments”: one is the real (physical) environment and the other is the virtual environment (Fig. 2). In the real environment, environmental parameters are measured with the sensors temporally and spatially. Then, in the virtual environment, the data are organized, models are built based on the data, and the models are used for planning a fertilization schedule, determining harvest timing, and forecasting crop yield.
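As an illustration of the kind of model the virtual environment could run on the collected temperature data, the sketch below accumulates growing degree days (GDD) to estimate harvest timing. The base temperature and GDD target are invented example values, not parameters from this study.

```python
def growing_degree_days(daily_mean_temps, base_temp=10.0):
    """Accumulated heat units: sum of daily mean temperature above a base temperature."""
    return sum(max(0.0, t - base_temp) for t in daily_mean_temps)

def days_until_target(daily_mean_temps, target_gdd, base_temp=10.0):
    """Return the 1-based day on which the accumulated GDD reaches the
    target (a hypothetical crop-specific threshold), or None if not reached."""
    total = 0.0
    for day, t in enumerate(daily_mean_temps, start=1):
        total += max(0.0, t - base_temp)
        if total >= target_gdd:
            return day
    return None
```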
Fig. 2. Concept of the “Smart Garden”.

Twenty-four fiducial markers were placed in the Smart-Garden (Fig. 3). Each marker has a different pattern so that it can be distinguished from the others, and the patterns are associated with locations in the “Smart-Garden” coordinate system.
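The association between marker patterns and garden locations could be represented as a simple lookup table, as sketched below. The marker IDs and coordinates are invented for illustration; in the real system 24 markers cover the keyhole-shaped garden.

```python
# Hypothetical mapping: marker id -> (x, y) in Smart-Garden coordinates [m]
MARKER_POSITIONS = {
    0: (0.0, 0.0),
    1: (1.5, 0.0),
    2: (3.0, 0.0),
    3: (0.0, 1.5),
}

def camera_position_in_garden(marker_id, cam_offset):
    """Translate a camera position measured relative to a detected marker
    into Smart-Garden coordinates by adding the marker's known location."""
    mx, my = MARKER_POSITIONS[marker_id]
    dx, dy = cam_offset
    return (mx + dx, my + dy)
```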
Fig. 3. Smart Garden with markers.

3. RESULTS AND DISCUSSION
Fig. 5. Virtual CG plants shown on a real-time captured image.
3.1 Guidance using Augmented Reality

Figure 4 shows an example of guidance for planting seedlings using CGs. The upper opaque rectangular parallelepiped indicates a ridge which should be prepared before planting, and circles on the virtual ridge indicate locations where seedlings should be planted, because it is difficult for beginners to imagine grown plants, which occupy a large space. The CGs indicate appropriate locations and distances (numbers in the CGs in Fig. 4) for planting. Therefore, growers can learn a proper arrangement for planting. The lower transparent rectangular parallelepiped shows the ground, and a rectangle at the bottom of the virtual ground indicates the position (30 cm in depth in this example) of a basal fertilizer which should be applied.
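One way the guidance could lay out the circles on the virtual ridge is to place seedling positions at a fixed spacing along the ridge centreline, as in the sketch below. The ridge length and spacing values are illustrative, not measurements from the paper.

```python
def seedling_positions(ridge_length, spacing, margin=None):
    """Return positions (in metres from the ridge start) for seedlings
    planted at `spacing` intervals, centred along the ridge.
    If `margin` is not given, it is chosen so the row is symmetric."""
    if margin is None:
        n = int(ridge_length // spacing) + 1   # how many seedlings fit
        used = (n - 1) * spacing
        margin = (ridge_length - used) / 2
    positions = []
    x = margin
    while x <= ridge_length - margin + 1e-9:   # tolerance for float drift
        positions.append(round(x, 3))
        x += spacing
    return positions
```

Each returned position would be rendered as one circle on the virtual ridge.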
3.3 Identification of the Camera’s Positions and Viewpoints using Augmented Reality

An operator holding a tablet PC with a camera on the back (Acer ICONIA TAB W500) observed the ridges in each compartment using the camera and walked around the Smart Garden slowly. The experiment was conducted at 2:00 P.M. on January 30th, on a sunny day. Figure 6 shows the result of the experiment. Red spheres and blue spheres represent the camera positions and the viewpoints, respectively. During the experiment, the sunlight was coming from the lower right (the yellow arrow in Fig. 6). When the sunlight came from behind the operator, the system could detect the fiducial markers well (right side area in Fig. 6). However, when the sunlight came from in front of the operator, the system could detect only a few markers (the area around the yellow circle).
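The recording step described above can be sketched as follows: whenever a marker is detected, the estimated camera position and a point along the viewing direction (the “viewpoint”) are appended to an operation log with a timestamp. The field names and the fixed gaze distance are our own assumptions; the paper only states that positions and viewpoints were recorded.

```python
import time

def record_observation(log, position, direction, gaze_distance=1.0, now=None):
    """Append one camera observation to the operation log.

    position  -- (x, y, z) camera position in garden coordinates [m]
    direction -- unit vector along the camera's optical axis
    """
    x, y, z = position
    dx, dy, dz = direction
    viewpoint = (x + gaze_distance * dx,
                 y + gaze_distance * dy,
                 z + gaze_distance * dz)
    log.append({
        "time": now if now is not None else time.time(),
        "camera": position,      # rendered as a red sphere in Fig. 6
        "viewpoint": viewpoint,  # rendered as a blue sphere in Fig. 6
    })
    return viewpoint
```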
Fig. 4. CGs of a ridge and the ground for instruction.

3.2 Virtual CG Plants Overlaid on a Field

In Fig. 5, a virtual tomato plant created with CGs is overlaid on a captured real image. A farmer can intuitively compare the state of a real plant with the virtual plant.
Fig. 6. Red and blue spheres represent the camera positions and the viewpoints, respectively.

4. CONCLUSIONS

We developed a basic support system for farming operations which can 1) visualize guidance for farming operations and virtual plants using CGs and overlay them on the field where the operator is working, and 2) record the operator’s positions and viewpoints during operations. In future work, we plan to attempt diagnosis of plants using image processing, i.e. estimating when harvest will be possible and diagnosing leaf diseases. To that end, we also wish to conduct research on correcting for sunlight conditions.

REFERENCES

Azuma, R.T. (1997). A survey of augmented reality. Presence: Teleoperators and Virtual Environments, 6(4): 355-385.

Azuma, R.T., Baillot, Y., Behringer, R., Feiner, S., Julier, S., and MacIntyre, B. (2001). Recent advances in augmented reality. IEEE Computer Graphics and Applications, 21(6): 34-47.

Brooks, F.P. (1996). The computer scientist as toolsmith II. CACM, 39(3): 61-68.

Fukatsu, T. and Nanseki, T. (2009). Monitoring system for farming operations with wearable devices utilized sensor networks. Sensors, 9: 6171-6184.

Kato, H. and Billinghurst, M. (1999). Marker tracking and HMD calibration for a video-based augmented reality conferencing system. In Proceedings of the 2nd International Workshop on Augmented Reality (IWAR), October, San Francisco, USA.

Kato, H., Billinghurst, M., Poupyrev, I., Imamoto, K., and Tachibana, K. ARToolKit. http://www.hitl.washington.edu/research/shared_space/download/ (Accessed January 31, 2013).

Lievin, M. and Keeve, E. (2001). Stereoscopic augmented reality system for computer-assisted surgery. International Congress Series, 1230: 107-111.

Matsuo, K., Hamaguchi, Watanabe, K., and Watanabe, Y. (2011). Use of hand-held GPS for recording agricultural machinery operation and its utilization in the management of paddy farming: evaluation of no-tillage seeding in a staple cropping area of soybean (cv. Nattosyoryu) considering the date and location of seeding. Jpn. J. Crop Sci., 80(4): 448-456.

Nanseki, T., Sugahara, K., and Fukatsu, T. (2007). Farming operation automatic recognition system with RFID (Japanese with English abstract).
Agricultural Information Research, 16(3): 132-140.

National Research Council (1997). Precision Agriculture in the 21st Century: Geospatial and Information Technologies in Crop Management. National Academy Press, Washington, D.C.

NyARToolkit project. http://nyatla.jp/nyartoolkit/wp/ (Accessed August 30, 2012).

Processing.org. http://processing.org/ (Accessed August 30, 2012).

Reinhart, G. and Patron, C. (2003). Integrating augmented reality in the assembly domain: fundamentals, benefits and applications. CIRP Annals - Manufacturing Technology, 52(1): 5-8.

Rekimoto, J. and Nagao, K. (1995). The world through the computer: computer augmented interaction with real world environments. In Proceedings of the 8th Annual ACM Symposium on User Interface Software and Technology (UIST '95), November 15-17, Pittsburgh, Pennsylvania, USA, 29-36.