Understanding a public environment via continuous robot observations

Deneth Karunarathne, Yoichi Morales, Takayuki Kanda, Hiroshi Ishiguro

PII: S0921-8890(19)30093-4
DOI: https://doi.org/10.1016/j.robot.2020.103443
Reference: ROBOT 103443

To appear in: Robotics and Autonomous Systems

Received date: 21 March 2019
Revised date: 16 December 2019
Accepted date: 20 January 2020

Please cite this article as: D. Karunarathne, Y. Morales, T. Kanda et al., Understanding a public environment via continuous robot observations, Robotics and Autonomous Systems (2020), doi: https://doi.org/10.1016/j.robot.2020.103443.

This is a PDF file of an article that has undergone enhancements after acceptance, such as the addition of a cover page and metadata, and formatting for readability, but it is not yet the definitive version of record. This version will undergo additional copyediting, typesetting and review before it is published in its final form, but we are providing this version to give early visibility of the article. Please note that, during the production process, errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.

© 2020 Published by Elsevier B.V.


Understanding a Public Environment via Continuous Robot Observations

Deneth Karunarathne*1,3 (mailto: [email protected]), Yoichi Morales*1,2, Takayuki Kanda*1, and Hiroshi Ishiguro*1,3

*1 Advanced Telecommunications Research Institute International, Intelligent Robotics and Communication Laboratories, Kyoto, Japan
*2 Nagoya University, Nagoya, Japan
*3 Osaka University, Osaka, Japan

Abstract— This paper presents a study of point cloud data captured by a robot navigating in a shopping mall environment. It investigates what type of information, and how much of it, the robot could extract from the environment. For this purpose, information regarding environmental changes and the number of people in shops was extracted and analyzed. First, the robot was manually controlled to collect data in a typical shopping mall with different types of shops and a food court. As the robot navigated thoroughly around the environment, seven recordings of data from various onboard sensors were made during afternoon hours over three consecutive days. We built a composite map by overlaying the 3D point clouds of all recordings in the same coordinate frame, which reveals the changes in the environment's static objects. The number of humans at each shop in each recording was computed using a human tracker. Then, we computed a fourteen-dimensional vector for each shop: seven dimensions for environmental changes and seven for human density. Experimental results show that the environmental changes and the human density at each shop are consistent with the visual changes that occurred in the shops and with the number of people who visited them. Correlation analysis was performed among shop changes, shop open space, and human density; the results suggest that changes in shop configuration are more often made in smaller shops, and that shops with larger open space tend to attract larger numbers of customers. Finally, the information extracted from the shops was used to categorize them according to similarity.

Keywords—Human tracking, Point cloud data, Data analysis


I. Introduction

Stakeholders of shopping malls often desire to know how people interact with their ventures. Shop managers make numerous efforts to increase their shop's popularity and make it stand out. These actions include adding display banners announcing sales, changing the merchandise displayed in front of the shops, and even remodeling the entrance. In a shopping mall, the relationship between customer actions and environmental changes could provide important information on social interactions and, moreover, reveal how shops and other ventures deal with their visitors. This information could also be used to improve the effectiveness of business establishments in the mall. For example, it might show which shops are the most attractive and at which times of day; furthermore, it could indicate which corridors and entrances are most frequented. Thus, information on the various relationships between environmental changes and the number of visitors could be used for better pricing, better store arrangement, and more effective targeting of customers.

Traditional ways of physically observing human interactions and behavioral dynamics are tiresome and highly cost-ineffective. Thus, it would be useful to have a robot perform such tasks. In the coming years, service robots will be gradually introduced into daily public environments like shopping malls as sales assistants and information providers [1, 2, 3]. Accordingly, robots could serve a dual purpose: recording useful data while providing their primary designated service. Shop owners could use the results of data analysis for the design and improvement of their businesses.

There is great promise for such a dual-service framework, as it is a common and successful approach in other domains [4]. Present online businesses conduct their activities while logging large amounts of data that are often used for dual purposes. For example, companies such as Yahoo [5] and Amazon [6] obtain hundreds of thousands of users' transactions each day through their primary functionality. Apart from serving the primary purpose, they use the collected data as the main input for big data analytics systems [4, 7], which the organizations leverage to uncover hidden relationships and patterns that cannot be inferred by traditional data analysis. The patterns uncovered are generally used for discount-ratio calculation, intelligent customer preference prediction, and personalized marketing [7, 8]. With our data recordings, we try to understand what types of useful information could be extracted from the recorded data. Subjecting the observed data to big


data analytics could potentially uncover hidden patterns and unknown correlations within the environment.

Surveys conducted on shop visitors have found that consumers are often influenced by the retailer's physical appearance [18]. For example, people preferred to visit cleaner shops with less waiting time [19]. Nevertheless, these studies have not been extended to quantitatively measure the physical changes in shops. Factors like shop size, changes over time, and location may influence how often visitors frequent the shops. Managers could use such knowledge to set merchandise pricing and to guide the shop's physical rearrangement (showcases, racks, etc.). Useful knowledge could also be obtained from data observed in other places and applied in a target environment. In this work, we study whether useful knowledge on the relationships between shops and visitors can be identified and extracted from robot observations.

The objective of this study is to identify what type of useful information can be extracted from data continuously observed by a passing mobile robot. It is centered on 3D point cloud data observed by the robot, with a sufficiently accurate representation of the environment to capture detailed shop characteristics. For a service robot to operate in crowded environments, accurate localization and human tracking are required. Based on recorded 3D point clouds, we aim to find patterns between environmental changes in a shop and human density in the shop that otherwise could not be found. Furthermore, the observed environmental data demonstrates which shops changed the most. The human density represents how often people visited the shops, which could be used as an index of shop popularity. We believe such knowledge of the relationships between environmental and human data would be useful in business decision making. It could also be useful both in designing new shops and in changing unpopular shops.

The paper is organized as follows. Section II gives an overview of related works. Section III presents details of the dataset. Section IV describes the point cloud processing, and Section V presents the analysis results. Finally, discussion and conclusions are provided in Sections VI and VII, respectively.

II. Related works This section presents works on human trajectory analysis, environmental modeling, and service robots in shopping malls.


A. Action recognition using human trajectory analysis

Studies in action recognition using trajectories are usually based on the location of human centroids, since these are stable and robust features. Human trajectories have been observed with both cameras and laser range finders in indoor and outdoor environments [45, 20]. Studies on human activity recognition have used approaches such as clustering, classification, and probabilistic generative models. For example, a human trajectory analysis method based on a hidden Markov model was proposed for the creation of an intelligent home space [9]. Another approach, by J. C. Nascimento et al. [10], used a method based on parametric motion vector fields that allows the representation of trajectories and space-dependent dynamics not often captured by clustering or classification techniques. One study [16] focused on learning semantic scene models via trajectory analysis, using unsupervised learning to segment a scene into semantic regions. The trajectories were clustered into vehicles and pedestrians, and from these learned models both similar activities and abnormal activities could be detected. In another work, a visual system statistically learned common patterns of activities from observations and identified unusual activities [11]. A similar study on pair-activity classification via bi-trajectory analysis was presented in [12], and the authors later extended the causality features into three types, covering individuals, pairs, and groups [13]. A subsequent work [14] used the group activity pattern to represent and differentiate group activities, where Gaussian parameters were calculated from the trajectories of multiple people. A vision-based system [15] was presented for recognizing groups with a varying number of members, using hidden Markov models to model the relationships among people. Another system [17] used laser range finders in an outdoor environment to detect anomalous human interactions. As in our study, humans were tracked using laser range finders. The authors defined interaction as two people being situated close to each other for the duration of an activity.

The works mentioned above studied human interaction identification and action detection using trajectories. This was achieved using trajectory segmentation and model matching with a known interaction type. Our work involves understanding how human trajectories can be used to derive the relationships between shops in the environment and their visitors. In addition to the knowledge we can gain from the trajectories of visitors, we also observe the environmental changes that have occurred in shops, which could reflect the relationships


between visitors and shops.

One comprehensive study on human trajectories in a single large public space [20] focused on the variation in human trajectories over a long period of time. This study did not discuss environmental changes and required a static sensor network. Another work analyzed the paths taken by customers in a supermarket [21] to identify possible routes. In addition, a recent study used the trajectories of people in a shopping mall to determine suitable locations for a service robot to wait [22]. Even though it is useful to know the best locations to wait, this is not directly related to gaining an understanding of the shops. These trajectory-analysis studies focused on human action recognition, interaction identification, and the determination of various patterns of human trajectories in public spaces; they did not study the interactions of visitors with shops. In our study, we observe how often humans visit each shop and how often each shop is seen from the robot. The acquired knowledge is used to identify the relationships that visitors might hold with the shops in a shopping mall.

B. Studies on environmental dynamics using mobile robot platforms

Many studies on environmental dynamics have identified various features, while other studies focused on the representation of environmental information. These studies used the obtained knowledge to build a map and localize within the environment. One work [23] proposed a mobile robot sensing solution for environmental applications to detect complex spatial and temporal structures. Other works have incorporated the observed environmental changes into the representation of the environment. Some removed moving objects from the environment's representation [24, 25]; other approaches tracked these objects and classified them as moving landmarks [26, 27, 28]. In another work [29], a new map type represented local maps at different time scales, where the best map for localization was chosen by its consistency with current readings. Adaptive approaches never assumed the map to be complete and performed continuous mapping, adding new features to the map every time the robot observed its environment [30, 31, 32]. In one method [33], the robot adapted its environment model every time it visited a place by finding the features that are most stable and neglecting those that are less useful. One study [34] proposed a method in which a dynamic occupancy grid was used to distinguish between highly dynamic objects, movable objects, and static objects. A couple of studies [35, 36] used extended occupancy grid maps, while another represented environmental


changes at different rates and time scales [37]. One study [38] maintained separate active and dynamic maps to represent a changing environment. The use of environmental dynamics to localize a robot [39] was reported as a way to improve robot localization in long-term autonomy scenarios by modeling the spatio-temporal dynamics of the environment.

Some of the above systems highlighted the importance of understanding environmental information for better representation and modeling. In our study, we represent environmental change as an occupancy value for each voxel. For each shop, the number of voxels with the same occupancy value within the shop boundary determines the amount of change that may or may not have occurred. With this information, we intend to identify the relationships that environmental change might have with the shops in the experimental environment.

C. Robot operation in shopping malls

In recent years, several studies have been conducted with communication robots in shopping malls. One study [40] recorded interactions between visitors and a communication robot to determine the robot's perceived acceptability and its ability to encourage shopping activities. Similarly, a robot [41] was deployed as a guide, and its stability was reported. Another study analyzed how shop owners and visitors identify the concept of shop territory in a shopping mall and developed a model with which a robot could identify shop territory accurately [47]; the results showed that a robot with the shop-territory model behaved better than a robot without it. The results from these studies suggest deployments in which users perceive the robot as acceptable and feel encouraged by it to engage in shopping activities. The communication robots used in these studies could be extended to record data while operating. Therefore, when introduced into public spaces, such robots could be used to monitor both human interactions and environmental changes within shops.

In this study, we investigate the relationships that human trajectories and environmental changes could have with the shops in a shopping mall, using data recorded by a passing robot. The originality of our study is that, with such data, we integrated the analysis of human density and environmental change and studied how these aspects relate to the entities (shops) of the shopping mall.


III. Data collection

We present a study in a shopping mall with data recorded by a mobile robot. In this section we discuss the robot system, the experimental environment, and the data collection procedure. The mobile robot was manually driven in the shopping mall environment (Fig. 1), and the data were recorded from the sensors mounted on the robot.

Figure 1: Robot controlled with a joystick in the shopping mall environment

A. System Overview

We used a Pioneer P3-DX differential drive robot platform originally developed by ActivMedia Robotics. The robot was equipped with a 3D laser radar sensor (Velodyne HDL-32E) capable of scanning 360 degrees up to a maximum distance of 100 m, an inertial measurement unit (IMU) (Memsic VG440), and a monocular camera to capture frontal footage. The 3D laser range finder was placed 1.30 m above ground level. Table 1 lists the sensors, their frequencies, and the recorded data.


Table 1: Sensors mounted on the robot and the recorded data

Sensor                                   Frequency (Hz)   Recorded data
Velodyne HDL-32E laser radar             10               UDP packets: time-of-flight distance, calibrated reflectivity, rotation angles, synchronized time stamps
Memsic VG440 inertial measurement unit   40               X, Y, Z axes' velocity; yaw, pitch, roll angular velocity
Pioneer P3-DX robot platform             100              Right and left wheels' velocities
Monocular camera                         13               Video frame data

In this work we present a self-contained mobile robot capable of continuously collecting data while navigating the environment. Existing environmental survey camera systems would require considerable modifications to offer full environmental coverage and would rely on complex calibration; additionally, processing the large amounts of data involved would result in computationally expensive systems.

B. Experimental Environment

We conducted data recording at a shopping mall (Temposan Market Place, located in Osaka Prefecture, Japan). The shopping mall has an area of 100 x 80 m and two floors. The walking passages have widths ranging from 2.5 m to 6.3 m; in the wider passages, there are temporary shopping kiosks in the center. Based on our observations, the shopping mall is less crowded during weekdays than on weekends. Fig. 2 shows a map of the mall with the trajectory followed by the robot, along with some typical scenes on the path. The shopping mall consists of 22 shops and 10 food stalls. The shop locations are shown in Figs. 7 and 9, and the shop numbering is shown in Fig. 18.


Figure 2. Experimental environment with shops around the route.

C. Data Collection Procedure

We collected seven data sets on the first floor of the shopping mall over three consecutive days, during afternoon hours between 4 pm and 6 pm. We used a joystick to manually drive the robot through the corridors of the mall (Fig. 1) along the set route (Fig. 2). For each recording, we logged the 3D point cloud, odometry, IMU data, and the images from the frontal monocular web camera. Time-stamped data were recorded and stored in rosbag format. The dataset consists of seven recordings1,2, one for each drive-through. Table 2 depicts the consecutive data recordings obtained, along with the distance traveled by the robot.

1 The dataset can be accessed at the following URL: http://www.irc.atr.jp/sets/TEMPOSAN_dataset/. The dataset was divided into seven categories (THU01, THU02, FRI01, FRI02, FRI03, SAT01, SAT02), each holding a different recording.
2 Data from the monocular camera cannot be shared due to privacy issues.

Table 2: Obtained data recordings

Recording Name   Obtained day   Recording Time   Distance traveled (m)   Number of Velodyne scans
THU01            Thursday       22.24 min        810                     14862
THU02            Thursday       22.20 min        785                     13026
FRI01            Friday         27.14 min        830                     16992
FRI02            Friday         25.40 min        812                     15679
FRI03            Friday         26.42 min        820                     16627
SAT01            Saturday       22.22 min        805                     14853
SAT02            Saturday       21.10 min        785                     13257

The data observed from each sensor were recorded with their own standard Unix timestamps, since different sensors operate at different frequencies. The point cloud data from the Velodyne LIDAR sensor are stored as time-of-flight distances. While the number of points per scan is not constant, a frame can have up to 69,504 valid distance measurements. Additionally, the 6-DoF data from the Velodyne sensor were also recorded. Similarly, for all sensors listed in Table 1, the data were recorded while the robot was driven in the shopping mall. Fig. 2 depicts a typical path traversed by the robot during data recording.


IV. Point cloud data processing

This section presents the point cloud data processing pipeline. The process of building multiple 3D point cloud maps in the same coordinate frame is explained, and the computation of a shop environmental change index is described, followed by the approach for detecting and tracking humans to calculate a shop's human density index. When calculating the environmental change index and the human density index, we considered how often the shops were visible to the robot: the human density index is based on the number of times a grid was occupied by people divided by the number of times it was visible to the robot, so the data are normalized by how often the robot visited an area. The robot could only observe objects within its line of sight and could cluster and track people in a moderately crowded environment. During the study, the shopping mall was never crowded to the point where the robot failed to see some meters ahead.

A. Environmental Change Index Computation

We define the shop environmental change index as a number that represents the physical change of each shop as detected by a laser sensor. First, a data log in which the robot traversed the entire environment was used to create a 2D environmental map (m2d) utilizing SLAM [46], which is used as the global reference for all maps. Then, the robot was localized against the global map while the 3D points were transformed to global coordinates. Additionally, ICP scan matching [44] was applied to align the point clouds, which were down-sampled to voxels of 0.05 m resolution. The resulting maps shared the same coordinate frame and were combined to obtain a composite 3D map composed of n voxel maps (where n = 7, since we had seven runs). Fig. 3 shows a block diagram of the process pipeline.

Figure 3. System overview of building a multi-log voxel composite 3D map to compute an environmental change index; a human tracker was also used to compute a human density index. (Pipeline: odometry, IMU, and 3D laser data feed SLAM to build a reference 2D map; localization and 3D mapping produce per-log voxel maps that are merged into the composite map, from which the per-shop environmental change index is computed, while the human tracker yields the human density index.)

In the composite map, each voxel has a counter i (0 to 7) that represents the number of times it was occupied. Voxels with value i = 7 are the most stable, mostly representing static objects (left side of Fig. 4). Voxels with value i = 1 are the most dynamic (right side of Fig. 4).

Figure 4. Composite voxel map with different index values: (a) voxels with index i = 7, seen every time during data recording; (b) voxels with index i = 1, seen only once.

To determine the voxels that fall inside each shop s, we delimited the shop borders. The borders were marked based on the floor map of the shopping mall, observations of the environment, and the 2D grid map. The borders shown in red in Fig. 5 mark each shop's area. Additionally, we define the open space of a shop as its observable area: the entrance area for regular shops, and the area of interaction in front of the food court shops, which do not have entrances. Both the entrances and the areas of interaction are visible to the passing robot. These areas are marked in blue in Fig. 5. The area covered by the borders of a given shop is denoted by $a^s$, while the observable area is denoted by $b^s$. We defined the open space index $c^s$ using the following equation:

$$c^s = \frac{b^s}{a^s} \qquad (1)$$

For each shop, we calculated the numbers of voxels $v_i^s$ ($i = 1 \ldots 7$) by counting the voxels sharing each occupancy value. For example, $v_1^s$ was calculated by counting the number of voxels with value 1 within the boundary of shop $s$, representing the voxels that were occupied only once among all seven observations. We defined the total voxel state change for a given shop as $v_T^s = \sum_{i=1}^{7} v_i^s$. The environmental change index $o_i^s$ for each shop was computed by the following equation:

$$o_i^s = \frac{v_i^s}{v_T^s} \qquad (2)$$

From the logged data, we built a seven-dimensional vector $\boldsymbol{v_e} = (o_1, o_2, \ldots, o_7)$.
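Given such a composite map, Eq. 2 amounts to a histogram of occupancy counters inside the shop border. A minimal sketch follows; `voxel_counts` and the `in_shop` predicate are hypothetical stand-ins for the composite map and the marked shop borders:

```python
from collections import Counter

def change_vector(voxel_counts, in_shop, n_runs=7):
    """Environmental change vector for one shop (Eq. 2).

    `voxel_counts` maps voxel index -> occupancy counter i (1..n_runs);
    `in_shop` decides whether a voxel lies inside the shop's marked
    border. Returns (o_1, ..., o_n) with o_i = v_i^s / v_T^s.
    """
    v = Counter(i for voxel, i in voxel_counts.items() if in_shop(voxel))
    v_total = sum(v.values())  # v_T^s: all counted voxels in the shop
    if v_total == 0:
        return (0.0,) * n_runs
    return tuple(v[i] / v_total for i in range(1, n_runs + 1))

# Toy shop: 10 voxels, 6 stable (seen in all 7 runs), 4 seen only once.
voxel_counts = {("s", k): 7 for k in range(6)}
voxel_counts.update({("d", k): 1 for k in range(4)})
v_e = change_vector(voxel_counts, in_shop=lambda vox: True)
```

In this toy shop, 40% of the voxels changed (counter 1) and 60% were fully static (counter 7), so $o_1 = 0.4$ and $o_7 = 0.6$.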

Figure 5: Marked area of shops (red) and open spaces within shops (blue)

B. Human Density Index Computation

In order to compute the human density index for each shop, we first localized the robot against the global map. We used a 3D localization module based on a particle filter [43] utilizing an end-point likelihood field model. Each particle corresponds to a possible robot pose x with x, y, z position and yaw, pitch, roll orientation.

After localization, we transform the point cloud to global coordinates and use an inflated map (an Octomap with 0.1 m resolution and 0.1 m inflation) to perform background subtraction. Points falling on occupied voxels of the map are removed, as they correspond to static elements such as the roof, staircases, and walls. We use the inflated map to tackle false positives that might arise from localization error. The difference between the point cloud and the 3D map produces a reduced point cloud mostly belonging to the dynamic entities in the environment. Next, we conducted clustering on the reduced point cloud, which was sorted by height.

Starting from the highest non-clustered point, the distance $d$ to each existing cluster centroid was computed. If the distance is smaller than a distance threshold $d_{th}$ (i.e., $d < d_{th}$), the point is merged into that cluster; otherwise, a new cluster is created. Given the nature of our laser sensor, the threshold value is a function of the distance to the robot, $d_{th} = f(d_{robot})$. The resulting clusters contain the possible humans in the environment (Fig. 6). We removed clusters whose dimensions were too large or too small to be a human, except for clusters close to humans tracked within the last few time steps, which were kept.

Figure 6. Result of clustering: (a) detected clusters near the robot; (b) detected clusters far from the robot.
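The greedy height-ordered clustering can be sketched as below. The paper does not specify the exact form of $f$, so the linear threshold and its parameters here are purely illustrative assumptions:

```python
import math

def threshold(dist_to_robot, base=0.4, gain=0.02):
    """Distance-dependent merge threshold d_th = f(d_robot): the sparser
    the scan rings get with range, the more permissive the threshold.
    `base` and `gain` are illustrative values, not the paper's."""
    return base + gain * dist_to_robot

def cluster(points, robot_xy=(0.0, 0.0)):
    """Greedy clustering of background-subtracted points, processed from
    highest to lowest, merging each point into the nearest centroid if
    it lies within d_th, otherwise opening a new cluster."""
    clusters = []  # each: {"pts": [...], "centroid": (x, y)}
    for x, y, z in sorted(points, key=lambda p: -p[2]):
        best, best_d = None, float("inf")
        for c in clusters:
            d = math.hypot(x - c["centroid"][0], y - c["centroid"][1])
            if d < best_d:
                best, best_d = c, d
        d_robot = math.hypot(x - robot_xy[0], y - robot_xy[1])
        if best is not None and best_d < threshold(d_robot):
            best["pts"].append((x, y, z))
            n = len(best["pts"])
            best["centroid"] = (sum(p[0] for p in best["pts"]) / n,
                                sum(p[1] for p in best["pts"]) / n)
        else:
            clusters.append({"pts": [(x, y, z)], "centroid": (x, y)})
    return clusters

# Two people standing ~3 m apart should yield two clusters.
person_a = [(2.0, 0.0, h) for h in (1.6, 1.2, 0.8)]
person_b = [(5.0, 0.1, h) for h in (1.7, 1.1, 0.9)]
found = cluster(person_a + person_b)
```

The dimension- and point-count-based filtering described next would then prune these candidate clusters.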

During clustering, we clustered all dynamic entities in the environment, including shopping carts, which were sometimes pushed around by people; based on their dimensions, we removed such clusters from the list of potential human clusters. Additionally, we defined a minimum number of points per cluster as a function of the distance to the robot, since the LIDAR scan rings spread out with distance. With the inflated map, the dimension-based cluster reduction, the point-count-based cluster reduction, and the height constraint, we eliminated objects in the environment that could otherwise have been misclassified as humans.

The potential humans detected at the end of clustering were then tracked using a cascade of particle filters. Every track was assigned a single filter of 50 particles to estimate the 2D position $\mathbf{x}_{human} = [x, y]$ in global coordinates.
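A single predict-update-resample cycle of such a per-human filter might look like the following sketch. The Gaussian motion model, the likelihood based on distance to the cluster centroid, and the noise parameters are our assumptions for illustration, not the paper's:

```python
import math
import random

def particle_filter_step(particles, observation, motion_std=0.1, obs_std=0.3):
    """One predict-update-resample cycle of a per-human particle filter.

    Each tracked person gets an independent filter of 50 particles over
    the 2D position [x, y] in global coordinates.
    """
    # Predict: diffuse particles with Gaussian motion noise.
    predicted = [(x + random.gauss(0, motion_std),
                  y + random.gauss(0, motion_std)) for x, y in particles]
    # Update: weight each particle by its closeness to the observed
    # cluster centroid (Gaussian likelihood, illustrative).
    ox, oy = observation
    weights = [math.exp(-((x - ox) ** 2 + (y - oy) ** 2) / (2 * obs_std ** 2))
               for x, y in predicted]
    # Resample: draw the same number of particles proportional to weight.
    return random.choices(predicted, weights=weights, k=len(particles))

def estimate(particles):
    """Position estimate as the mean of the particle set."""
    n = len(particles)
    return (sum(p[0] for p in particles) / n, sum(p[1] for p in particles) / n)

random.seed(0)
particles = [(0.0, 0.0)] * 50
for obs in [(0.2, 0.0), (0.4, 0.1), (0.6, 0.2)]:  # person walking away
    particles = particle_filter_step(particles, obs)
x_est, y_est = estimate(particles)
```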

The system performed robot localization, background subtraction, clustering, and stable human tracking up to 25 m from the robot in the shopping mall environment. During data collection, the robot was driven by hand through the environment. At certain locations, the robot may have stopped, slowed down, or sped up; consequently, the amount of time each shop was seen by the robot differed. In some cases, even for a single shop, only certain areas were visible at certain times. Therefore, when calculating human density, we need to account for how visible each shop was to the robot.

To determine how often a shop was visible to the robot, we used a 1-m resolution grid map of the environment and performed ray tracing to calculate all visible grids at each time step. We then calculated the human density for each grid as the number of times the grid was visited divided by the number of times it was visible to the robot, as indicated in Eq. 3. In Eq. 3, counter $g_j$ for each grid $g$ denotes the number of times the grid was visible to the robot, while counter $g_k$ denotes the number of times humans visited the same grid. For data recording $z$ (1 to 7), the human density index value $h$ for each shop $s$ is defined as $h_z^s$, given by the following equation:

$$h_z^s = \frac{1}{n} \sum_{g=1}^{n} \frac{g_k}{g_j} \qquad (3)$$

Here, $n$ is the number of grids inside the shop's border with $g_j > 0$ (visible to the robot at least once). The human density is thus normalized by the area of the shop that was visible to the robot.
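Eq. 3 then reduces to a visibility-normalized average over grid cells. A minimal sketch, with hypothetical count dictionaries standing in for the ray-traced visibility and human-tracking outputs:

```python
def human_density(visible, visited):
    """Human density index for one shop and one recording (Eq. 3).

    `visible[g]` counts how many time steps grid cell g was visible to
    the robot (via ray tracing); `visited[g]` counts how many times a
    tracked human occupied g. Only cells visible at least once
    contribute, normalizing for the area the robot could actually see.
    """
    cells = [g for g in visible if visible[g] > 0]
    n = len(cells)
    return sum(visited.get(g, 0) / visible[g] for g in cells) / n

# Toy shop of four 1-m grid cells; one was never visible and is excluded.
visible = {(0, 0): 10, (0, 1): 10, (1, 0): 5, (1, 1): 0}
visited = {(0, 0): 2, (1, 0): 1}
h = human_density(visible, visited)
```

Here the three visible cells contribute densities 0.2, 0.0, and 0.2, so the shop's index is their mean, 0.4/3.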

For each recording $z$, we compute the human density index value $h_z$ for each shop. Repeated for all seven recordings, seven values are obtained for each shop. These values are then represented as a seven-dimensional vector $\boldsymbol{v_h} = (h_1, h_2, \ldots, h_7)$. Finally, for each shop, we have a 14-dimensional vector ($7 \times 2$), which includes 7 values for shop environmental change $\boldsymbol{v_e}$ and 7 values for human density $\boldsymbol{v_h}$.

V. Analysis Results

We aim to identify to what extent we can retrieve useful information from the sensory data repeatedly gathered by a mobile robot. We used the recorded data reported in Section III, to which we applied the techniques reported in Section IV.

Based on the computed indexes, we first analyzed whether the observed indexes (human density and environmental change) provide meaningful information on the shops. Then we tried to identify any correlational relationships the shops might have among the observed indexes. Finally, we analyzed whether similar shops could be grouped together based on the observed indexes. The common features observed in groups of shops that performed well could then be applied to improve existing shops and to establish new ones. The obtained information was then verified against the recorded video data.

A. Observed Indexes and Shop Characteristics

For each shop, we computed the environmental change index using Equation 2 for the seven data logs (Section III). Fig. 7 depicts the environmental change index $o_1$, which represents voxels that were seen only once; thus it highlights the most unstable sections of the shops. By observing the environmental change index $o_1^s$, we obtained information on how each shop changed over the duration of the data recording. The top-left shop no. 1 (stationery shop) in Fig. 7 had one of the largest environmental change indexes ($o_1^1 = 0.112$); Fig. 8 shows typical scenes of shop no. 1, where visible changes were observed across different recordings. The stationery goods in this shop are not fixed, and their arrangement tends to vary from day to day. In contrast, shop no. 13 (jewelry shop) has a low environmental change index ($o_1^{13} = 0.034$) and few visible environmental changes (Fig. 10(a)), because the showcases of jewelry items do not change. We thus observed that the environmental change index indeed provides information about the changes made in shops and could be used to recognize shops with significant changes.

2

3

m 4

7

5

8

6

9

28

10

19 20 21

Jo

30

26

27

.12 .10

29 15

24

25

.14

Shops

32

31

14

22 100 m

Figure 7. Environmental change index of shops

11

12 13

16 17 18

.08 .06 .04 .02

17
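Equation 2 itself is defined earlier in the paper; assuming o_i is the fraction of a shop's observed voxels that were seen in exactly i of the seven recordings (consistent with o_1 marking the most unstable content and o_7 the static structure), a minimal sketch looks like this (the function name and sample counts are hypothetical):

```python
import numpy as np

def env_change_indexes(seen_counts, n_recordings=7):
    """Sketch of environmental change indexes o_1..o_7 for one shop.

    seen_counts[v] -- in how many of the 7 recordings voxel v was observed.
    Assumption: o_i is the fraction of observed voxels seen in exactly
    i recordings, so o_1 captures unstable displays and o_7 captures
    static structure such as walls and fixed showcases.
    """
    c = np.asarray(seen_counts)
    c = c[c > 0]                      # keep voxels observed at least once
    return {i: float(np.mean(c == i)) for i in range(1, n_recordings + 1)}

# Hypothetical per-voxel observation counts for a shop
o = env_change_indexes([1, 1, 7, 7, 7, 3, 2, 7])
# a large o[1] would flag shops whose displays change between recordings
```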

(a) Items arranged close to wall    (b) Items rearranged
Figure 8. Observed environmental change in the front display of the open space in shop no. 1. The arrangement of displayed items changed significantly.

We used Equation 3 to compute the human density index for each logged data item. An example of the human density index for one data recording is shown in Fig. 9. By observing the human density index h_z^s, we obtained information on how humans engaged with each shop over the duration of the data recording. For example, the jewelry shop (shop no. 13 in Fig. 10(a)) has a large human density index (0.008); in fact, some people entering the shop can be seen. In contrast, a souvenir shop (shop no. 3 in Fig. 10(b)) had a very low human density index (0.001), and we saw few people interacting with the shop. Thus, the human density index h_z^s seems to provide information on how often people visit the shops, which would indicate their popularity.

Figure 9. Human density index h_z^s for a single data recording (map of the mall with per-shop values, scale 0.001 to 0.008)

(a) People entering shop no. 13 through an open space    (b) Observed open space of shop no. 3 with no people interacting
Figure 10. Typical views of open space in different shops

B. Correlations observed among shops

From the observed human density, environmental change, and open space information on the shops, we would like to analyze the relationships between the obtained information and the shops. We calculated the Pearson correlation coefficients between the various indexes h, o, b, and c of the shops and report them in Table 3.

Table 3: Summary of correlations observed

    Correlation                                            Value
    Environmental Change Index o1 / Open Space Index       -0.42*
    Environmental Change Index o1 / o7                     -0.86**
    Human Density Index / Open Space                        0.52**
    Human Density Index: Weekday / Weekend                  0.77**

    *: p<.05, **: p<.01
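The correlations in Table 3 are plain Pearson coefficients over per-shop index values; a minimal sketch follows (the six-shop numbers are illustrative, not the paper's data; significance testing would additionally require a p-value, which is omitted here):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two per-shop vectors."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return float(xc @ yc / np.sqrt((xc @ xc) * (yc @ yc)))

# Hypothetical illustration: environmental change index o1 and open
# space index c for six shops; a negative r mirrors the -0.42 trend.
o1 = [0.119, 0.033, 0.124, 0.108, 0.013, 0.060]
c  = [0.05,  0.30,  0.04,  0.07,  0.25,  0.15]
r = pearson_r(o1, c)   # negative: less open space, more change
```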

In the graph of Fig. 11, each shop is plotted against the environmental change index (o) and the open space index (c). Based on Fig. 11, shops with a smaller open space index tend to have a larger environmental change index. The correlation between the environmental change index and the shop open space index is -0.42, which is statistically significant. Table 4 summarizes the observations on some of the shops. For example, shop no. 1 does not have a large open space, and it has visible changes in the left-front area of the shop (Fig. 8). Similarly, in shop no. 16, the yellow-colored casing in front of the shop (Fig. 17(a)) was visible during one recording but no longer visible during another (Fig. 17(b)). In contrast, shop no. 13, as seen in Fig. 12, has a larger open space but no visible environmental change. Thus, the observations provide evidence that shops with a smaller open space have considerably larger environmental change, while shops with a larger open space have considerably smaller environmental change. This could be attributed to shops with smaller open spaces tending to change their layouts as a means of attracting more people.

Figure 11. Environmental change index (o1) vs. open space index (c) for each shop (shops 1 and 16 combine a high change index with a small open space; shop 13 shows the opposite)

Table 4: Summary of environmental change observations on shops

    Shop          Open space (b) m²   Env. change index o1   Env. change index o7   Observation
    Shop no. 1    4.79                0.119                  0.459                  Visible change in the front accessories display of the shop
    Shop no. 13   14.1                0.033                  0.837                  No visible change
    Shop no. 16   2.86                0.124                  0.506                  Visible change in umbrellas hanging in front of the shop

(a) Typical view of shop no. 13 during one recording    (b) Two persons observing showcases during another recording
Figure 12. Shop no. 13 open space during two recordings

The way in which each shop modifies its appearance can also be attributed to the static environmental area in each shop. The correlation between environmental change indexes o7 and o1 is statistically significant at -0.862. For instance, shop no. 10 had a relatively large environmental change (o_1^10 = 0.108) and a relatively small static-object representation (o_7^10 = 0.498). Fig. 13 shows that the displayed items (umbrellas and clothes) of shop no. 10 changed over the recordings, which might be due to people interacting with them. On the other hand, shop no. 6 has a small environmental change value (o_1^6 = 0.013) and a large static-object representation (o_7^6 = 0.919). Based on Fig. 14, shop no. 6 does not show any visible environmental change. We can speculate that in shop no. 6, the items sold from the displays were quickly replaced by the staff, leading to no visible difference. Based on the visual evidence, we can say that shops with a large static environmental area did not change as much as shops with a smaller static environmental area.

(a) Umbrellas displayed    (b) Umbrellas' locations changed
Figure 13. Japanese umbrellas shown in front of shop no. 10. Significant changes in the open space can be observed across different data recordings.

(a) Observation of shop no. 6 during weekday data recording    (b) Observation of shop no. 6 during weekend data recording
Figure 14. No visible change observed in shop no. 6 during data recordings

Figure 15. Human density index (h) vs. open space (b, m²) for each shop (shops 13 and 22 combine large open space with high density; shop 3 is low on both)

The graph in Fig. 15 shows the human density index vs. open space for each shop. We noticed that shops with larger human density were more frequented and had a larger open space. As indicated in Fig. 15, 78% of the shops (25 of 32) have a human density below 0.004 and an open space below 8 m². Only 4 shops (12.5%) had a human density above 0.004 and an open space above 8 m² (primarily ladies' merchandise). We analyzed whether the human density index has any relation to the shop's environment. The correlation between the average human density index (h) and open space (b) is statistically significant at 0.52. For example, shop no. 13 (Figs. 12(a), (b)) has the largest open space of 14.1 m² and an average human density index of 0.008. In shop no. 13, we observed many people interacting with the shop. In contrast, shop no. 1 (Figs. 8(a) and (b)) has an open space of 4.79 m² and an average human density index of 0.001. In fact, we saw few people around shop no. 1. The shops with the highest human density sell ladies' merchandise (shop no. 13 sells jewelry, and shop no. 22 is the largest clothing shop) and have larger selections of merchandise. These shops place their goods in the large open space that attracts customers. Shop no. 3 (Fig. 10(b)) (b = 3.3 m², h = 0.001) and shop no. 2 (b = 3.9 m², h = 0.001) are souvenir shops with relatively smaller open space and human density indexes. Note that, although their open space is rather small, these shops have good enough visibility from outside the store, so the robot could observe people inside. Both shops have merchandise next to their open space. The shops are mostly open (no doors) and could be observed by the robot passing by. Nevertheless, these shops still have a relatively smaller human density index. Based on this, we can say that the larger the open space, the more people visit the shop.

To identify how people visited the shops during weekends and weekdays, we calculated the Pearson correlation coefficient between weekday and weekend human density indexes, which is statistically significant at 0.769. This means that shops frequented mostly on weekdays were similarly frequented during weekends. For example, food shop no. 8 (selling ice cream; Figs. 16(a) and (b)) had numerous people during weekdays (h = 0.006) and weekends (h = 0.008). Shops with few customers, such as shop no. 16 selling South Asian clothing (Figs. 17(a) and (b)), had a low human density on both weekdays (h = 0.001) and weekends (h = 0.002). This result indicates that the shops that were popular on weekdays were similarly popular during weekends.

(a) Three people in front of the ice cream shop on a weekday    (b) Three different people buying ice cream on a weekend
Figure 16. People visible in food shop no. 8, which sells ice cream

(a) View of shop no. 16 on a weekday    (b) View on a weekend with visible changes (yellow baskets gone)
Figure 17. Shop no. 16 on a weekday and a weekend

To summarize the correlation analysis, the experimental results indicate that open space has a negative correlation with the environmental change index; thus, shops with less open space change their open space often. Similarly, open space has a positive correlation with the human density index, meaning that the larger the open space, the higher the human density a shop will have.

C. Clustering-based results (k-means clustering and PCA)

Finally, we analyzed whether human density and environmental change could be used to identify shops with similar traits. We used the 14-dimensional vector (Section IV.B, shown in Figure 18(a)) and applied k-means clustering to group similar shops. Using k = 5, we built five clusters, as shown in Figure 18(b). Principal component analysis (PCA) was used to visualize the result. Figure 19 shows the k-means clustering results marked on the result of applying PCA to the dataset. The first principal component (PC1) corresponds to human visits to the shops, while the second principal component (PC2) corresponds to environmental change.
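The clustering and visualization pipeline described above can be sketched with plain NumPy (Lloyd's k-means and PCA via SVD); the random feature matrix below is only a stand-in for the paper's 32 × 14 per-shop vectors, and the exact k-means variant the authors used is not specified:

```python
import numpy as np

def kmeans(X, k=5, iters=100, seed=0):
    """Plain Lloyd's k-means: assign points to the nearest centroid,
    then recompute centroids, until assignments stop changing."""
    g = np.random.default_rng(seed)
    centers = X[g.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # pairwise distances: shape (n_points, k)
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return labels, centers

# Hypothetical stand-in for the 32 shops x 14 features
# (7 human-density values v_h followed by 7 change values v_e)
rng = np.random.default_rng(1)
X = rng.random((32, 14))

labels, _ = kmeans(X, k=5)

# PCA via SVD of the mean-centered data, for a 2-D view as in Fig. 19
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
coords = Xc @ Vt[:2].T        # columns: (PC1, PC2) per shop
```

Plotting `coords` colored by `labels` reproduces the style of Fig. 19, with PC1 and PC2 interpretable via the loading vectors `Vt[0]` and `Vt[1]`.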

(a) Heat map of human density index and environmental change index    (b) Five resulting clusters from unsupervised learning
Figure 18. Heat maps of the dataset and clustering results

Each cluster contains similar shops based on the observed human density index (h) and environmental change index (o). Cluster 1 (C1) contains two elements, which were the most visited clothing shops. Cluster 2 (C2) contains most of the food stalls; these shops are not often visited and did not change much over time. Cluster 3 (C3) contains shops with average visits and average environmental change. Most of the clothing shops that sell ladies' garments, as well as the hobby shops, fit into this cluster. The shops that attract more customers but do not show much environmental change can be seen in cluster 4 (C4). The jewelry shop (shop no. 13), the shoe and accessories shop (shop no. 15), and the ice cream shop (food stall no. 8) fall into this category. As discussed in the previous subsection, these shops have a higher human density index than the other shops. Cluster 5 (C5) contains shops with a high environmental change index and a low human density index. Shops no. 1 and 16 (discussed earlier) had visible environmental change and a very small human density index. Shops in this cluster were mostly accessories shops, with a couple of souvenir shops.

The clustering results indicate that shops specializing in similar merchandise and focusing on similar clientele were mostly clustered together.

Figure 19. Shops plotted on the first two principal components from PCA (PC1: visits, PC2: change), with the k-means clusters C1 to C5 marked

VI. Discussion

In this study, we analyzed range data that could be observed by an operational robot in a real shopping mall environment. We kept the path of the robot invariant, observed data, considered all the shops in the environment without emphasizing specific shops, and conducted an analysis of all 32 shops in the mall, which we consider a fair number for a single study. We paid no special attention to any shop and considered all the shops available in the environment as they are presented. Thus, we are interested in inferring knowledge that is general to the environment.

Open space is an important area of a shop, since it is where people observe, engage, and decide whether to enter the shop. We observed more people engaging with shops with large open spaces. In shops with smaller open space, significant environmental change was visible, particularly in the displayed items. The data suggest that shopkeepers change displayed items as a means of attracting customers.

Based on the observed point cloud, we proposed and calculated a human density index and an environmental change index, and the robot's video was used to visually confirm the experimental results. Human density is directly proportional to the number of people visiting a shop. Shops with the highest human density seem to be the most popular. However, without shop sales information, it is difficult to determine which shop is the most profitable.

Considering the observed correlations, we believe they could be useful in business decision making. The relationship between the human density index and open space shows a moderately positive correlation of 0.52. Since there is a higher chance of making more sales when more people visit a shop, shops could be advised to maintain a wide open space, as it would allow more visitors to interact with the merchandise. Similarly, the correlation between weekday and weekend human density is strong: shops with higher human density on weekdays had more visits during weekends. Even though this is the general notion, the correlation confirms it.

The shopping mall environment in this study contained several people during data recording. However, while the robot was driven manually, no person approached it too closely. For an actual service robot, people might approach and interact with it (like a robot distributing flyers), severely limiting its visibility of the shops and confining it to a limited area of operations. In high-density environments, tracking could occasionally be lost when groups of people were close together or when long occlusions occurred. During the study, we recorded data that could resemble the operation of a future service robot.

During robot operation, all shops in the mall were open. In this study, we were not interested in observing the environmental change and human density of a particular shop; rather, we were interested in what could be observed in the shops as the robot passed. During our study, we did not find that shops made drastic changes to their structure or changed their entrances; therefore, the overall structure remained similar. Since we did not find a direct correlation between human density and environmental change, we believe human density is the more important measure in this study and is mostly dependent on the open spaces of the shops.

Human density could also be influenced by factors such as sales or sale signs; however, this would require subjective information for each shop, which might not be easily available to the robot. Rather than using data such as whether there is a sale or not, we observed data that could be inferred by a typical passing-by robot. Additionally, the clustering results show that we could group similar shops together reasonably well, confirming that our data processing was adequate for the task.

For this study, we did not have ground truth, as it is very difficult to obtain in real shopping mall environments. Similarly, ground truth about how various visitors interact with the environment is inherently difficult to collect, because most business entities would not want to publicly share their strategies for interacting with customers, their focus on shop environment design, or their sales and profit margins. It would require an elaborate study to analyze shopping mall visitors' preferences and the factors drawing them to interact with the environment.

However, if additional data on the shops, such as sales figures, profit margins, and rent, were available, that information could be used to enhance the results. In future work, we could consider the integration of sales or promotional campaign information to study their effect on people's behavior and human density.

We believe that the methodology proposed in this paper could be used in similar environments because the analysis used real data recorded in a real shopping mall. We also assume that the robot recorded typical observations of the environment and that the results are illustrative of the basic trends observed in similar environments. We are interested in improving the system to be able to cope reliably with crowded environments.

VII. Conclusions

In this work, we presented a study on point cloud analysis in a shopping mall environment. We recorded seven different logs over three consecutive days with a mobile robot. The analysis was performed for each shop, where the shop area was extended by the open space in the area outside the shops, which mostly contained apparel displays. We proposed and computed an environmental change index that represents the amount of change in the shops. Additionally, we detected and tracked humans and computed a human density index that represents the customer visits per shop. We performed a correlation analysis and found a positive correlation between average human density and the size of open space. Furthermore, a positive correlation was found between human densities on weekdays and weekends. A negative correlation was found between the environmental change index and the size of open space. Additionally, a negative correlation was found between the environmental change indexes representing more changeable environment segments and those representing more static environment segments. Finally, we built a 14-dimensional vector from the two indices and the 7 data logs and performed k-means clustering to obtain 5 distinctive shop clusters. The analysis confirmed that the shops were grouped together according to similar features identified by the type of products they offered.

Compliance with Ethical Standards

This study was supported by JST CREST Grant Number JPMJCR17A2, Japan. The authors declare that they have no conflict of interest.


Title: Understanding a Public Environment via Continuous Robot Observations

Authors: Deneth Karunarathne, Yoichi Morales, Takayuki Kanda, Hiroshi Ishiguro

Deneth Karunarathne completed M.Eng. and Ph.D. degrees specializing in robotics in 2013 and 2019 at Osaka University's Intelligent Robotics Laboratory, Japan. From 2012 to 2018, he was a student researcher at ATR's Intelligent Robotics and Communication Laboratory in Kyoto, Japan. His current research interests include autonomous navigation, sensor networks, and field robotics.

Yoichi Morales received M.Eng. and Ph.D. degrees in 2006 and 2009 from the University of Tsukuba's Intelligent Robot Laboratory in Tsukuba, Japan, where he also stayed half a year as a post-doctoral researcher until 2009. From 2009 to 2016 he was a researcher at the ATR Intelligent Robotics and Communication Laboratories in Kyoto, Japan, where he is currently a collaborative researcher. Since 2016 he has been a designated associate professor at Nagoya University. His current research interests include autonomous navigation, spatial cognition, perception, and environment modeling. He is also a member of the IEEE, the IEEE Robotics and Automation Society, and the Robotics Society of Japan.

Takayuki Kanda is a professor in Informatics at Kyoto University, Japan. He is also a Visiting Group Leader at ATR Intelligent Robotics and Communication Laboratories, Kyoto, Japan. He received his B.Eng., M.Eng., and Ph.D. degrees in computer science from Kyoto University, Kyoto, Japan, in 1998, 2000, and 2003, respectively. He is one of the starting members of the Communication Robots project at ATR. He has developed a communication robot, Robovie, and applied it in daily situations, such as a peer tutor at an elementary school and a museum exhibit guide. His research interests include human-robot interaction, interactive humanoid robots, and field trials.

Hiroshi Ishiguro received a D.Eng. in systems engineering from Osaka University, Japan in 1991. He is currently Professor in the Department of Systems Innovation in the Graduate School of Engineering Science at Osaka University (2009-) and Distinguished Professor of Osaka University (2017-). He is also visiting Director (2014-) (group leader: 2002-2013) of Hiroshi Ishiguro Laboratories at the Advanced Telecommunications Research Institute and an ATR fellow. His research interests include sensor networks, interactive robotics, and android science.


Highlights

• There is genuine interest in observing how various stakeholders interact in public environments.
• A robot with onboard sensors could be used to observe a public environment.
• Environmental changes and the human density at each shop are consistent with the visual changes observed in the shops.
• Shop reconfigurations are more frequent in smaller shops.
• Shops with larger open space tend to attract a larger number of customers.
• Similar shops can be classified based on shop change and the number of customers they attract.


In the manuscript, the authors have indicated that there is no conflict of interest.