Available online at www.sciencedirect.com
ScienceDirect
Procedia Computer Science 159 (2019) 1375–1386
www.elsevier.com/locate/procedia

23rd International Conference on Knowledge-Based and Intelligent Information & Engineering Systems
Application of local binary patterns and cascade AdaBoost classifier for mice behavioural patterns detection and analysis

Tobechukwu Agbele (a), Blessing Ojeme (b,*), Richard Jiang (c)

(a, c) Department of Computer Science, Lancaster University, Lancaster, United Kingdom LA1 4YW
(b) CAIR, Department of Computer Science, University of Cape Town, Rondebosch 7701, South Africa
Abstract

The paper describes the application of local binary patterns and a cascade AdaBoost classifier (CAC) to detect and analyse mice behavioural movement. This was done with a view to investigating the inconsistencies associated with current practices, whereby mice behavioural classification is achieved by means of human-generated labels. The developed cascade AdaBoost algorithm was able to detect eight different mice movements, and we develop a system that allows mice behavioural analysis in videos with minimal supervision. Evaluating the results on Completeness, Consistency and Correctness, and based on the devised analysis, a solution was deployed, showing that machine learning plays an important role in translating video data into scientific knowledge. This is a useful addition to the animal behaviourist's analytical toolkit.

© 2019 The Authors. Published by Elsevier B.V.
This is an open access article under the CC BY-NC-ND license (https://creativecommons.org/licenses/by-nc-nd/4.0/)
Peer-review under responsibility of KES International.

Keywords: computer vision; machine learning; mice behavioural pattern detection; local binary patterns; cascade AdaBoost classifier
* Corresponding author. Tel.: +27725574409.
E-mail address: [email protected]

1877-0509 © 2019 The Authors. Published by Elsevier B.V.
This is an open access article under the CC BY-NC-ND license (https://creativecommons.org/licenses/by-nc-nd/4.0/)
Peer-review under responsibility of KES International.
10.1016/j.procs.2019.09.308
1. Introduction

In the last decade, significant progress has been made in the domain of computing, sensing and tracking technologies. The remarkable innovations have benefits in critical application domains such as patient monitoring, pattern recognition, video retrieval, human-computer interaction, user interface design, multimedia retrieval, robot learning, visual surveillance and many other systems that require interactions between humans and electronic devices [1, 2]. Specifically, the innovations have supported researchers and practitioners in computer vision and machine learning in developing vision-based systems for the purpose of tracking and recognising behavioural patterns, and understanding actions, intent and motive of humans, animals and objects from observations alone [3]. The successful application of vision-based approaches to detect, track and analyse the behavioural patterns of human actions [4, 5] is an indication that they can be applied to animals. This type of observation task is not well suited to humans, as it requires careful concentration over long hours or days. The need to automate behavioural detection and analysis arises from the inherent limitations of traditional manual human assessment in terms of error-proneness, misrepresentation, cost, time and reproducibility [6]. Research on mice has led to a good understanding of their genetics and their similarities with (and differences from) other vertebrates, such as humans. Though their behaviour is not well studied and understood, studying mice allows investigation of the various types of behaviour, addressing function, causation, development and evolution. To a great extent, the mouse is a promising model for increasing our understanding of vertebrate social behaviour. Moreover, basic knowledge of the social behaviour of mice remains poorly studied and understood in the literature [7–12], and would be valuable to maximize the utility of this standard and increasingly used animal model.
There is, therefore, a clear motivation to develop automated intelligent vision-based monitoring systems that can aid a human user in the process of tracking, detecting and analysing mouse behavioural patterns and habits [13]. The goal of the study is to develop a non-invasive, non-intrusive visual surveillance system for automating the detection, tracking, understanding and analysis of the patterns of home-cage mouse behaviour. Focus will be on developing models with local binary patterns (LBP) and a cascade AdaBoost classifier for pattern recognition in eight mice behavioural movements. It is hoped that the outcome of this study will lay the solid foundations necessary for the study of those wild animals, such as lions, sharks, or tigers, that are difficult or impossible to observe closely in their natural habitat. The interest in this paper is the mice behavioural monitoring task described in the works of Dell'Omo et al [8] and Weisbrod et al [9]. The task is about using a combination of a video surveillance system and machine learning techniques to monitor and quantify eight behaviour patterns of mice (Table 1) in videos. The videos contain singly housed mice filmed from an angle perpendicular to the side of the cage, as shown in Figure 1. The remaining sections of the paper are organized as follows: section 1.1 is a follow-up to the introduction, which describes mice and the elements of detecting and monitoring their behavioural patterns; the last part of this sub-section discusses the advantages of using an automated monitoring system over a manual one. Section 2 describes computer vision, OpenCV and the machine learning algorithms used in the study. In section 3, we discuss the experimental setup and the environment where it was performed, and present and analyse the experimental results.
Section 4 discusses a few studies that are related to ours, while section 5 concludes the study with an agenda for future work.

1.1. Monitoring Mice Behavioural Patterns

Mice are myopic and prefer to stay under cover; they are affected by their living conditions, including their physical environment, their social environment and their interaction with humans [15]. A very important variable in behavioural research on mice is the degree of familiarity of the environment in which the animals are observed. When assessing the responses of mice to their living conditions, assessment of physiological and behavioural parameters is useful. Negative trends in these parameters, such as loss of body weight, failure to reproduce and changed behaviour patterns, may indicate that the mice are distressed and failing to cope with their environment.
Behavioural observations (such as drink, eat, groom, hang, micro-movement, rear, rest (inactive), and walk) can provide useful cues and early warnings that something is wrong with a mouse's state of health and wellbeing [14]. A range of responses may be observed, from subtle changes in normal patterns of behaviour to stereotypy, which is a clear sign of a mouse's inability to cope with its environment. Excessive grooming, aggression or states of fear are examples of behavioural indicators that a mouse is distressed [15]. Abnormal behaviour may manifest itself as an increased reactivity to environmental stimuli, leading to panic reactions, or as an increased passivity or state of depression [16]. Persistent and intense gnawing, as well as short-distance pacing, are typical stereotypic behaviours. The duration and kind of stereotypic behaviour are important when assessing its welfare significance [14]. A host of studies have shown that there are many factors to be taken into account when designing a home-cage tracking system [7, 17], some from the biological standpoint (such as tag shape and weight) and others from the technological standpoint (such as power requirements). This study is a foundation step towards the realistic monitoring of animals in their habitats using automatic video monitoring and machine learning technologies, so that users from all around the world can analyse and observe data as it is collected by the system. Thus, one could envisage a researcher in America, for instance, obtaining real-time updates on the behaviour of a lion from a game reserve in Africa, without having to leave their desk. Ultimately, it is hoped that this work would aid conservation, by providing hard scientific data on the behaviour and habits of wild animals and their interaction with the environment.
The home-cage system, designed to mimic some aspects of a natural situation that needs long-term continuous assessment of behavioural activity in mice and other animals, has several advantages [18]. Potential advantages of behavioural automation through the home-cage compared to manual assessment include continuous and sensitive monitoring, particularly during dark periods when mice are most active [19]; objective measurements can also be obtained because of the lack of observer bias. Automated monitoring often takes place in the absence of humans, which is a key consideration when studying prey species such as mice, where stoicism may be adaptive and the presence of humans may mask behavioural indicators of ill-health, particularly when pathological changes are mild to moderate. Automation also greatly reduces the requirement for animal handling, which may be stressful and/or confound studies. If assessment of the animals using automated technologies is carried out in an enriched and/or complex environment, this is likely to encourage a broad range of species-typical behaviours as well as allowing animals to maintain some control over which resources they invest in; a key advantage from an animal welfare perspective. There may also be negative aspects of home-cage automation, many relating to animal welfare. These include the solitary housing requirement [18]. Some behavioural recording techniques require single housing or limit the environmental enrichment that can be provided; for example, minimal bedding is a requirement for some automated behavioural analyses. Automation may also encourage high-throughput phenotyping, which typically involves large numbers of animals [20], potentially increasing the total number of animals used in scientific research. Similarly, automated systems from different companies may all need to be validated using animals, which could also increase the numbers of laboratory mice used.
The challenges of analysing the large amounts of data generated by automated systems, and of transforming those data into information that is meaningful in terms of animal health and welfare, must also be overcome to harness the true power of automated technologies. Finally, because there are practical and technological limitations to automation, home-cage technologies should never be used as a substitute for regular clinical monitoring carried out by experienced, compassionate staff, but should instead provide supplemental monitoring.
Table 1: Description of the behaviours (ethogram) as scored by a human observer

1. Drinking: The mouse drinks by licking up fluids, like all mammals, with the nose protruding over the mouth-opening or the mouse's mouth juxtaposed to the tip of the drinking spout.
2. Eating: Bits of the food, held by the forepaws, are gnawed or torn off and eaten. Liquid food is licked up.
3. Grooming: During grooming the mouse hunches or squats; sometimes it takes a lying position. The animal licks all the parts of its body it can reach, generally starting at the frontal parts and working backwards. The fur is also chewed and combed out by fast movements of the incisors. The hindquarters and the tail are manipulated by the forepaws during this action. Fur-chewing may also occur as a separate element, probably in reaction to local irritation of the skin, as is often seen in dogs.
4. Hanging: Defined by grasping of the wire bars with the forelimbs and/or hindlimbs, with at least two limbs off the ground.
5. Rearing: The rearing mouse supports itself on its bent hindlegs and often also on the base of its tail. The trunk is raised almost vertically, the back is straight and the head lifted up. The mouse may rear unsupported or leaning with its front paws against an object.
6. Resting: Defined by inactivity or nearly complete stillness.
7. Walking: Moving forward or backward. During walking the trunk is close to or in contact with the ground, but not resting on it. The tail is dragged along the ground or is held horizontally.
8. Micro-movements: Defined by small movements of the animal's head or limbs.

Figure 1: Eight mice movements [21]
2. Machine learning software

With computer vision fast becoming an important technology used in robots, national security systems, automated factories, driverless cars, medical image analysis, and many forms of human-computer interaction, the need for machine learning platforms that can be used for its applications became obvious. OpenCV [22], a widely used free open-source computer vision and machine learning library, provides implementations of some commonly used machine learning algorithms, including AdaBoost. OpenCV provides an easy-to-use computer vision infrastructure along with a comprehensive library containing more than 500 functions that can run vision code in real time. The software helps to get input from cameras, transform images, match shapes, and perform pattern recognition (including face detection), image segmentation, and tracking and motion analysis in two and three dimensions. OpenCV is written in performance-optimized C/C++ code, runs on Windows, Linux, and Mac OS X, and is free for commercial and research use under a
Berkeley Software Distribution (BSD) license. It was used in the first part of the implementation of this study, using local binary patterns and the cascade AdaBoost classifier to determine the best way to predict the class. After the establishment of a correct model with an efficient prediction score, the deployment of the solution began.

2.1. Local Binary Pattern

Local Binary Pattern (LBP) is a non-parametric descriptor whose aim is to efficiently summarize the local structures of images by comparing each pixel with its neighbouring pixels. Originally introduced by Ojala et al [23] for texture analysis, the LBP has attracted increasing interest in many areas of image processing and computer vision, and has proved to be a simple but powerful approach in the following application areas [24, 25]: facial analysis (face recognition, face detection, expression recognition and gender classification), image analysis (interest region description, image forensics, image retrieval, and biometrics), texture analysis (classification, segmentation, background subtraction and visual inspection) and motion analysis (gesture recognition, lip reading, object detection, gaze tracking). The most important properties of LBP are its tolerance of monotonic illumination changes and its computational simplicity [18]. The original version of the LBP operator labels the pixels of an image with decimal numbers, called LBPs or LBP codes, that encode the local structure around each pixel. It works as follows: each pixel is compared with its eight neighbours in a 3 × 3 neighbourhood by subtracting the center pixel value. The resulting strictly negative values are encoded with 0, and the others with 1 (Eq 1.1). For each given pixel, a binary number is obtained by concatenating all the binary values in a clockwise direction, starting from the one on its top-left neighbour. The corresponding decimal value of the generated binary number is then used for labelling the given pixel.
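This pixel-wise coding can be sketched in a few lines of Python; the patch values below are illustrative and not taken from the paper.

```python
# Minimal sketch of the basic 3x3 LBP operator described above.

def lbp_code(patch):
    """Compute the 8-bit LBP code of the centre pixel of a 3x3 patch.

    Neighbours are read clockwise starting from the top-left neighbour;
    each contributes 1 if it is >= the centre value, else 0.
    """
    center = patch[1][1]
    # Clockwise order starting at the top-left neighbour.
    coords = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    bits = [1 if patch[r][c] >= center else 0 for r, c in coords]
    # Concatenate the bits and read them as one binary number.
    return int("".join(map(str, bits)), 2)

patch = [[6, 5, 2],
         [7, 6, 1],
         [9, 8, 7]]
code = lbp_code(patch)  # bits 10001111 -> decimal 143
```

Running the operator over every pixel of an image and histogramming the resulting codes yields the texture descriptor used later in the section.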
Formally,

LBP_{P,R} = \sum_{p=0}^{P-1} s(g_p - g_c) 2^p, \qquad s(x) = \begin{cases} 1, & x \ge 0 \\ 0, & x < 0 \end{cases}   (1.1)

where g_c corresponds to the gray value of the center pixel, g_p refers to the value of its neighbours, P is the total number of involved neighbours and R is the radius of the neighbourhood; s(\cdot) defines the threshold function. Suppose the coordinate of g_c is (x_c, y_c); then the coordinates of g_p are (x_c + R\cos(2\pi p/P),\; y_c - R\sin(2\pi p/P)). The gray values of neighbours that are not on the image grid can be estimated by interpolation. Suppose the image is of size I \times J. After the LBP pattern of each pixel is identified, a histogram is built to represent the texture image:

H(k) = \sum_{i=1}^{I} \sum_{j=1}^{J} f(LBP_{P,R}(i, j), k), \quad k \in [0, K], \qquad f(x, y) = \begin{cases} 1, & x = y \\ 0, & \text{otherwise} \end{cases}   (1.2)
where K is the maximal LBP pattern value. The U value of an LBP pattern is defined as the number of spatial transitions (bitwise 0/1 changes) in the circular binary presentation of that pattern:

U(LBP_{P,R}) = |s(g_{P-1} - g_c) - s(g_0 - g_c)| + \sum_{p=1}^{P-1} |s(g_p - g_c) - s(g_{p-1} - g_c)|   (1.3)

The uniform LBP patterns refer to the patterns which have limited transitions or discontinuities (U \le 2) in the circular binary presentation [26]; the superscript "u2" in LBP^{u2}_{P,R} means uniform patterns with U \le 2. In practice, the mapping from LBP_{P,R} to LBP^{u2}_{P,R}, which has P(P-1)+3 distinct output values, is implemented with a lookup table of 2^P elements. To achieve rotation invariance, a locally rotation invariant pattern can be defined as:

LBP^{riu2}_{P,R} = \begin{cases} \sum_{p=0}^{P-1} s(g_p - g_c), & U(LBP_{P,R}) \le 2 \\ P + 1, & \text{otherwise} \end{cases}   (1.4)

The mapping from LBP_{P,R} to LBP^{riu2}_{P,R} (the superscript "riu2" means rotation invariant "uniform" patterns with U \le 2), which has P + 2 distinct output values, can be implemented with a lookup table of 2^P elements.
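The u2 and riu2 mappings can be checked numerically. A small sketch, assuming P = 8 as in the 3 × 3 operator:

```python
# U counts the circular 0/1 transitions of an 8-bit pattern; uniform
# patterns have U <= 2. The riu2 label of a uniform pattern is its
# number of 1-bits; all non-uniform patterns share the label P + 1.

def u_value(code, P=8):
    """Number of bitwise 0/1 transitions in the circular pattern."""
    bits = [(code >> p) & 1 for p in range(P)]
    return sum(bits[p] != bits[(p + 1) % P] for p in range(P))

def riu2_label(code, P=8):
    """Rotation-invariant uniform label: count of 1-bits, or P + 1."""
    if u_value(code, P) <= 2:
        return bin(code).count("1")
    return P + 1

uniform = [c for c in range(256) if u_value(c) <= 2]
riu2_labels = {riu2_label(c) for c in range(256)}
# For P = 8: 58 uniform patterns, so the u2 mapping needs
# P(P-1)+3 = 59 output values; riu2 has P + 2 = 10 distinct labels.
```

Enumerating all 256 codes this way is exactly how the lookup tables mentioned above are built in practice.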
Given a central pixel g_c and its P circularly and evenly spaced neighbours g_p, p = 0, 1, ..., P-1, we can simply calculate the difference between g_c and g_p as d_p = g_p - g_c. The local difference vector [d_0, ..., d_{P-1}] characterises the image local structure at g_c. Because the central gray level is removed, [d_0, ..., d_{P-1}] is robust to illumination changes, and the differences are more efficient than the original image in pattern matching. Each d_p can be further decomposed into two components:

d_p = s_p \cdot m_p, \qquad s_p = \mathrm{sign}(d_p), \quad m_p = |d_p|   (1.5)

where s_p is the sign of d_p and m_p is the magnitude of d_p. This equation, called the local difference sign-magnitude transform (LDSMT), transforms [d_0, ..., d_{P-1}] into a sign vector [s_0, ..., s_{P-1}] and a magnitude vector [m_0, ..., m_{P-1}]. Thus, s and m are complementary, and the original difference vector can be reconstructed from them. As an illustration, Table 2(a) shows an original 3*3 local structure with the central pixel being 25; Table 2(b) shows the corresponding local differences d_p, and Tables 2(c) and 2(d) the sign and magnitude components obtained by the LDSMT. Clearly, the original LBP uses only the sign vector to code the local pattern as an 8-bit string.

Table 2: (a) A 3*3 sample block; (b) the local differences; (c) the sign and (d) magnitude components
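A minimal sketch of the LDSMT on one 3 × 3 patch; the pixel values are illustrative, not those of Table 2.

```python
# Local difference sign-magnitude transform (LDSMT) for one 3x3 patch.

def ldsmt(patch):
    """Return (differences, signs, magnitudes) of d_p = g_p - g_c."""
    center = patch[1][1]
    coords = [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2), (2, 1), (2, 0), (1, 0)]
    diffs = [patch[r][c] - center for r, c in coords]
    signs = [1 if d >= 0 else -1 for d in diffs]
    mags = [abs(d) for d in diffs]
    return diffs, signs, mags

patch = [[34, 30, 20],
         [32, 25, 12],
         [40, 36, 34]]
diffs, signs, mags = ldsmt(patch)
# d_p = s_p * m_p, so the difference vector is exactly recoverable
# from the two components, as stated above.
recon = [s * m for s, m in zip(signs, mags)]
```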
LBP methodology has been developed recently with several variations for improved performance in different application domains [23–25]. These variations focus on different aspects of the original LBP operator including: (1) improvement of its discriminative capability; (2) enhancement of its robustness; (3) selection of its neighbourhood; (4) extension to 3D data; (5) combination with other approaches.
2.2. Cascade AdaBoost Classifier

Generally, almost every machine learning algorithm can be used to construct a cascade; the key properties are that computation time and the detection rate can be adjusted. A few examples of machine learning classifiers that can be used to construct a cascade include AdaBoost, support vector machines, perceptrons, and nearest neighbour. For this study, each classifier in the cascade is AdaBoost, whose input is a set of computationally efficient binary features [27]. The main advantage of AdaBoost as a classifier over other machine learning algorithms is its excellent feature selection mechanism and speed of learning [28]. Cascade AdaBoost methods have mainly been used in image detection applications [29] and many other fast applications, such as robotics and user interfaces [28]. The training algorithm for building the cascade AdaBoost classifier is as follows:

1. The user selects values for f, the maximum acceptable false alarm rate per layer, and d, the minimum acceptable detection rate per layer
2. The user selects the target overall false alarm rate, F_target
3. P is the set of positive examples
4. N is the set of negative examples
5. F_0 = 1.0, D_0 = 1.0, i = 0
6. While F_i > F_target:
   (a) i = i + 1; n_i = 0; F_i = F_{i-1}
   (b) While F_i > f × F_{i-1}:
       - n_i = n_i + 1
       - Use P and N to train a classifier with n_i features using AdaBoost
       - Evaluate the current cascade classifier on a validation set to determine F_i and D_i
       - Decrease the threshold for the i-th classifier until the current cascade classifier has a detection rate of at least d × D_{i-1} (this also affects F_i)
   (c) N is set to NULL; if F_i > F_target, then evaluate the current cascade classifier on the set of negative examples and put any false detections into the set N

Algorithm 1: Viola-Jones cascade AdaBoost structure [29, 31]
As shown in the training algorithm for constructing the cascade AdaBoost classifier (Algorithm 1), the first step of the construction is to set the detection rate d and the false alarm rate f that the AdaBoost classifier of each layer must reach. After that, we set the target false alarm rate F_target that the cascade classifier is supposed to reach in the end. Suppose the sets of positive and negative examples are called P and N respectively; the construction algorithm of the cascade classifier is mainly composed of two loops. The internal loop uses the above-mentioned AdaBoost algorithm to train the AdaBoost classifier of the current layer. Each time a weak classifier is added, the present AdaBoost classifier is reappraised to see whether it has reached the expected target. If it has, training of the AdaBoost classifier at this layer is complete; otherwise, weak classifiers are added until all conditions of the inner loop are satisfied. Then, determine whether the false alarm rate of the cascade classifier is below F_target. If it is, terminate the training. Otherwise, reset the negative examples, put any false detections into the set N, and go back to the external loop to train the AdaBoost classifier of the next layer, until the overall false alarm rate of the cascade classifier is below F_target.
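The per-layer targets f and d can be modest because the rates multiply across the K layers of the cascade, and at evaluation time most windows are rejected by the cheap early stages. A small sketch with toy stage functions (scores and thresholds here are illustrative, not the trained AdaBoost stages of this study):

```python
# Overall cascade rates multiply across layers: F = f**K, D = d**K.
# Evaluation uses early rejection: a window is reported only if it
# passes every stage, and later stages never run on rejected windows.

def cascade_rates(f, d, K):
    """Overall false alarm and detection rates of a K-layer cascade."""
    return f ** K, d ** K

def cascade_detect(stages, window):
    """Return True iff the window passes all (score_fn, threshold) stages."""
    for score_fn, thr in stages:
        if score_fn(window) < thr:
            return False        # early rejection
    return True

F, D = cascade_rates(f=0.5, d=0.99, K=10)
# A per-layer false alarm rate of 0.5 already gives F < 0.001 after
# 10 layers, while detection only drops to about 0.90.

# Toy windows: a cheap max test first, costlier statistics later.
stages = [(max, 5), (lambda w: sum(w) / len(w), 3), (min, 1)]
hits = [w for w in ([6, 4, 2], [9, 1, 0], [2, 2, 2])
        if cascade_detect(stages, w)]
```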
3. Experimental Set-up and Results Presentation

The host computer runs the Microsoft Windows OS, and the behaviour recognition and analysis software was developed in Visual Studio 2013 (IDE) based on the .NET framework and the C# programming language, OpenCV libraries version 3.1.0 [22] and the EmguCV toolkit [30] for image processing and the 3D visualisation of targets. The key algorithms used were LBP [23] and the cascade AdaBoost classifier [29, 31]. The step-by-step procedure is as follows:

1) Habituate mice for 1-2 weeks to an existing 12 hr light/dark cycle (e.g., 8 AM - 8 PM). Perform all procedures and testing during the dark cycle, with RT, humidity and light intensity being relatively constant.
2) Mark or tail-tattoo all mice for easy numerical identification over a prolonged period and handle them 1-2 hr daily over 5-7 days.
3) Repeat daily measurements of rectal temperature, body weight and food/water intake to detect potential fever and/or malnutrition induced by aging or disease progress. Standard exclusion criteria include low body weight due to reduced food/water intake, hunched posture with ruffled fur, hydrocephalus, porphyrin discharge around the eyes, etc.
4) To identify neurological deficits that may confound overall activity and performance, perform standard sensorimotor tests such as the hind limb clasping reflex, visual placing reflex, geotaxis test, basket test, beam walking test, Rotarod, and olfactory tests.
5) Clean the plastic and glass apparatus with disinfectant to remove urinary trails while testing mice.

The general process of development is as follows:

I. A set of training videos and images was collected.
II. The images and videos were prepared for training.
III. The training interface was built over the OpenCV training tools, which produced the trained dataset used in behaviour recognition.
IV.
The trained datasets were then embedded into a Windows-based program using the EmguCV toolkit, which is essentially a wrapper for OpenCV, enabling the C# programming language to interact with OpenCV tools and resources.
V. The dataset, compiled code and sample videos and images were then packaged as an executable and zipped.

3.1. Output solution
Concerning the output of the solution, the study generates a string naming the predicted behaviour, which the user can use to retrieve the related information from the Microsoft SQL Server database. To reduce the complexity of the registration, a selection of behaviours is made from the total number of behaviours that the mice perform. This list of behavioural categories is known as the ethogram. The measurements of these categories, often expressed in units of time (duration) or in frequency of occurrence, are extensively used in several research areas including agricultural science, pharmacology, and biomedical research. Postures, transitions of postures, or series of postures constitute a behavioural element. A typical run of the system takes a little over 2 minutes per video sequence. For the drinking behaviour, the system detects the behaviour when the mouse's mouth is juxtaposed to the tip of the drinking spout (Figure 2). The experiment was repeated for the remaining seven behaviours under study, and the results are summarized in Tables 4 and 5.
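The ethogram measures mentioned above, duration and frequency of occurrence, can be derived from the per-frame classifier output. A hedged sketch, with an assumed frame rate and illustrative labels (not the paper's recorded data):

```python
# Turn a per-frame behaviour label sequence into ethogram measures:
# bout count (frequency of occurrence) and total duration in seconds.

def ethogram_measures(frame_labels, fps=30):
    """Return {behaviour: (bout_count, total_seconds)} from frame labels."""
    measures = {}
    prev = None
    for label in frame_labels:
        bouts, frames = measures.get(label, (0, 0))
        # A new bout starts whenever the label changes.
        measures[label] = (bouts + (label != prev), frames + 1)
        prev = label
    return {b: (bouts, frames / fps) for b, (bouts, frames) in measures.items()}

# 2 s of drinking, 1 s of grooming, then 1 s of drinking at 30 fps.
labels = ["drink"] * 60 + ["groom"] * 30 + ["drink"] * 30
m = ethogram_measures(labels, fps=30)  # drink: 2 bouts, 3.0 s total
```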
Figure 2: Snapshot of clip for drinking behaviour
Table 3: Results from video frames of expected and actual drinking behaviour

Input: DRINK.MPG
Expected result: DRINKING
Actual result:
  Number of Drink behaviour detected: 104
  Number of Eating behaviour detected: 89
  Number of Micro-move behaviour detected: 89
  Number of Walking behaviour detected: 89
  Number of Grooming behaviour detected: 89
  Number of Hanging behaviour detected: 89
  Number of Rear behaviour detected: 89
  Number of Resting behaviour detected: 89
Table 4: Summary of number of video frames

Actual           Machine learning classification
classification   Drink  Eat  Groom  Hang  Micro-move  Rear  Rest  Walk
Drink             104    89     89    89          89    89    89    89
Eat               165   140    140   140         140   140   140   140
Groom             267   223    223   223         223   223   223   223
Hang              382   325    325   325         325   325   325   325
Micro-move        406   347    347   347         347   347   347   347
Rear              444   380    380   380         380   380   380   380
Rest              551   482    482   482         482   482   482   482
Walk               40    37     37    37          37    37    37    37
Table 5: Summary of actual and machine learning classifications (rows: actual classification Drink, Eat, Groom, Hang, Micro-move, Rear, Rest, Walk; columns: machine learning classification)

Drink column:      0 1 1 1 0 1 0
Eat column:        1 1 1 1 1 1 1
Groom column:      0 1 0 0 0 0 1
Hang column:       0 1 0 0 0 0 1
Micro-move column: 0 1 0 0 0 0 1
Rear column:       0 0 0 0 0 0 0
Rest column:       1 1 1 1 1 1 1
Walk column:       1 1 1 1 1 1 1

3.2. Result analysis
The study has explored the challenges of tracking and monitoring eight behavioural patterns of mice using a video surveillance system and classification (supervised learning) techniques. The research is important because mice behaviour is not yet fully understood. A summary of the results of the experimental set-up in section 3 is displayed in Figure 2 and Tables 3, 4 and 5. Interestingly, drinking behaviour, as seen in Table 5, was undetected by our system. Rather, drinking behaviour was detected as other behaviours of the mouse, while other
behaviours were detected with high accuracy, with the groom behaviour recording the highest detection accuracy, followed by the walk behaviour. With this study, we have demonstrated that a video surveillance system in combination with LBP and the CAC can be used to complement existing methods for tracking and monitoring different mouse behaviours with a high degree of accuracy.

4. Review of Related Literature

This section reviews work related to the different methods and technologies for monitoring, analysing and evaluating different animal behavioural patterns. Each of these methods and technologies has associated strengths and weaknesses, which can make the approach more or less suitable for certain environments and experiments. It is therefore necessary to establish the environments or conditions for which they may be most suitable. Traditionally, preliminary or initial studies of home-cage animal behavioural patterns were provided by direct human assessment and annotation [20, 32, 33]. However, this method has been identified as expensive, slow and labour-intensive; it requires long-term study and presents obvious difficulties when animals perceive humans as predators or when they are naturally secretive and elusive [20]. Furthermore, though the animals under study may be habituated, the presence of a human observer near the home cage can still affect their behavioural interactions with other non-habituated predator, prey or competitor species, and is likely to disrupt or influence normal spontaneous activity [33, 34]. The observer is rarely undetectable, and even animals that do not appear to react to human presence may still change their behaviour in subtle ways. A plethora of studies have used sensor-based methods, including polyvinylidene fluoride (PVDF) sensors, radio frequency identification (RFID) transponders, and photobeams (also called IR photocells), for monitoring animal behaviours [8, 35–38].
In Dell’Omo et al. [8], it was shown that mice inoculated with transmissible spongiform encephalopathies (TSE) exhibited behavioural abnormalities long before the appearance of clinical signs. However, the need to automate behavioural observation using video monitoring systems has been emphasised by many researchers [8, 20, 39, 40], mainly to address the limitations of manual monitoring. These include the intensive labour and data-analysis problems of manual observation, and the fact that sensor-based approaches are suitable only for studies of coarse locomotor activity based on spatial measurements, such as the distance covered by an animal or its speed [12]. To this end, integrated video monitoring systems such as EthoVision were developed for the automatic recording of the activity, movement and interactions of rodents [21, 40].

5. Conclusions and Future Work

In this paper, an effective machine learning–based system for automatically computing interpretable quantitative measures of mouse behaviour has been presented. The automated and systematic implementation of training protocols, with simultaneous monitoring of mice within their home cages, dramatically reduces the effort involved in animal training and data acquisition while also removing human errors and biases from the process. The paper demonstrates that local binary patterns and the cascade AdaBoost classifier offer a flexible classification framework that could be used to monitor different behavioural categories. Additionally, the study methodology was presented in a way that makes it easily reproducible by other researchers, which will facilitate the creation of new research avenues in behavioural neuroscience using a low-cost platform. Results from the experiments demonstrated that the devices were usable in the field and that very different applications could be handled using the same basic design.
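To make the feature-extraction side of this framework concrete: an LBP code is formed by thresholding each pixel's eight neighbours against the centre pixel, and the histogram of codes over an image region serves as the texture feature vector passed to the classifier. The following is a minimal NumPy sketch of that computation; the function names and the simple border handling are illustrative, not the implementation used in this study.

```python
import numpy as np

def lbp_codes(gray):
    """Compute 8-neighbour local binary pattern codes for a 2-D
    grayscale image. Border pixels are skipped for simplicity."""
    g = gray.astype(np.int32)
    c = g[1:-1, 1:-1]  # centre pixels
    # Neighbour offsets, clockwise from the top-left neighbour.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(c)
    for bit, (dy, dx) in enumerate(offsets):
        # Shifted view of the image aligned with the centre pixels.
        nb = g[1 + dy: g.shape[0] - 1 + dy, 1 + dx: g.shape[1] - 1 + dx]
        # Set this bit where the neighbour is >= the centre pixel.
        codes |= (nb >= c).astype(np.int32) << bit
    return codes

def lbp_histogram(gray):
    """256-bin normalised LBP histogram, usable as a texture feature
    vector for a boosted classifier."""
    codes = lbp_codes(gray)
    hist = np.bincount(codes.ravel(), minlength=256).astype(np.float64)
    return hist / hist.sum()
```

In a detection pipeline, such histograms would be computed per frame region and fed to the cascade stages, each stage rejecting obvious negatives cheaply before later stages apply more weak learners.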
The system includes a novel clip-based video labelling tool and an efficient machine learning mechanism to predict the true labels from multiple annotations. For future work, we would like to discover typical behaviours from videos of animals as a way of detecting anomalous behaviours using unsupervised or reinforcement learning, that is, without searching for behaviours previously labelled as such by biologists. Future work should also address questions that aim to understand the factors contributing to social behaviours such as social exploration (hierarchical relations, social facilitation and imitation),
contact behaviour (inter-attraction), sexual behaviour and antagonistic behaviour (aggression) in mice.

References

[1] R. S. Kaminsky, N. Snavely, S. M. Seitz, and R. Szeliski, “Alignment of 3D point clouds to overhead images,” in 2009 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2009, pp. 63–70.
[2] R. O. Castle, G. Klein, and D. W. Murray, “Wide-area augmented reality using camera tracking and mapping in multiple regions,” Comput. Vis. Image Underst., vol. 115, no. 6, pp. 854–867, 2011.
[3] J. J. Valletta, C. Torney, M. Kings, A. Thornton, and J. Madden, “Applications of machine learning in animal behaviour studies,” Animal Behaviour, vol. 124, pp. 203–220, 2017.
[4] R. Poppe, “Vision-based human motion analysis: An overview,” Comput. Vis. Image Underst., vol. 108, no. 1–2, pp. 4–18, 2007.
[5] C. Canton-Ferrer, J. R. Casas, and M. Pards, “Human motion capture using scalable body models,” Comput. Vis. Image Underst., vol. 115, no. 10, pp. 1363–1374, 2011.
[6] A. Aydin, “Using 3D vision camera system to automatically assess the level of inactivity in broiler chickens,” Comput. Electron. Agric., vol. 135, pp. 4–10, 2017.
[7] H. Jhuang, E. Garrote, N. Edelman, and T. Poggio, “Vision-based recognition of mice home-cage behaviors,” Nat. Commun., vol. 1, pp. 1–4, 2010.
[8] G. Dell’Omo et al., “Early behavioural changes in mice infected with BSE and scrapie: Automated home cage monitoring reveals prion strain differences,” Eur. J. Neurosci., vol. 16, no. 4, pp. 735–742, 2002.
[9] A. Weissbrod et al., “Automated long-term tracking and social behavioural phenotyping of animal colonies within a semi-natural environment,” Nat. Commun., vol. 4, p. 2018, 2013.
[10] O. F. Technology, “Mouse behavior recognition with the wisdom of crowd,” 2013.
[11] E. H. Goulding, A. K. Schenk, P. Juneja, A. W. MacKay, J. M. Wade, and L. H. Tecott, “A robust automated system elucidates mouse home cage behavioral structure,” Proc. Natl. Acad. Sci., vol. 105, no. 52, pp. 20575–20582, 2008.
[12] T. Serre et al., “Automated home-cage behavioral phenotyping of mice: Technical report,” Computer Science and Artificial Intelligence Laboratory, Massachusetts Institute of Technology, Cambridge, MA, 2009.
[13] L. Giancardo et al., “Automatic visual tracking and social behaviour analysis with multiple mice,” PLoS One, vol. 8, no. 9, 2013.
[14] H. Jhuang et al., “Automated home-cage behavioural phenotyping of mice,” Nat. Commun., vol. 1, no. 5, p. 68, 2010.
[15] V. André et al., “Laboratory mouse housing conditions can be improved using common environmental enrichment without compromising data,” PLOS Biol., vol. 16, no. 4, p. e2005019, Apr. 2018.
[16] L. Lewejohann, A. M. Hoppmann, P. Kegel, M. Kritzler, A. Krüger, and N. Sachser, “Behavioral phenotyping of a murine model of Alzheimer’s disease in a seminaturalistic environment using RFID tracking,” Behav. Res. Methods, vol. 41, no. 3, pp. 850–856, 2009.
[17] P. Jirkof, “Burrowing and nest building behavior as indicators of well-being in mice,” J. Neurosci. Methods, vol. 234, pp. 139–146, 2014.
[18] B. M. Spruijt and L. DeVisser, “Advanced behavioural screening: automated home cage ethology,” Drug Discov. Today Technol., vol. 3, no. 2, pp. 231–237, 2006.
[19] T. V. Tkatchenko et al., “Photopic visual input is necessary for emmetropization in mice,” Exp. Eye Res., vol. 115, pp. 87–95, Oct. 2013.
[20] C. A. Richardson, “The power of automated behavioural homecage technologies in characterizing disease progression in laboratory mice: A review,” Appl. Anim. Behav. Sci., vol. 163, pp. 19–27, 2015.
[21] H. Jhuang et al., “Automated home-cage behavioural phenotyping of mice,” Nat. Commun., vol. 1, no. 5, p. 68, 2010.
[22] OpenCV, “OpenCV Tutorials,” OpenCV 2.4.11.0 documentation, 2015. [Online]. Available: https://docs.opencv.org/2.4/opencv_tutorials.pdf; http://docs.opencv.org/doc/tutorials/tutorials.html.
[23] T. Ojala, M. Pietikäinen, and D. Harwood, “A comparative study of texture measures with classification based on featured distributions,” Pattern Recognit., vol. 29, no. 1, pp. 51–59, 1996.
[24] D. Huang, C. Shan, M. Ardabilian, Y. Wang, and L. Chen, “Local binary patterns and its application to facial image analysis: A survey,” IEEE Trans. Syst. Man Cybern. C, vol. 41, no. 6, pp. 765–781, 2011.
[25] M. Pietikäinen, A. Hadid, G. Zhao, and T. Ahonen, “Local binary patterns for still images,” in Computer Vision Using Local Binary Patterns, vol. 40, 2011, pp. 13–43.
[26] Z. Guo, L. Zhang, and D. Zhang, “A completed modeling of local binary pattern operator for texture classification,” IEEE Trans. Image Process., vol. 19, no. 6, pp. 1657–1663, 2010.
[27] X. Tang, Z. Y. Ou, T. M. Su, and P. F. Zhao, “Cascade AdaBoost classifiers with stage features optimization for cellular phone embedded face detection system,” in Advances in Natural Computation, Pt 3, Proceedings, vol. 3612, 2005, pp. 688–697.
[28] P. Viola and M. Jones, “Fast and robust classification using asymmetric AdaBoost and a detector cascade,” Adv. Neural Inf. Process. Syst., vol. 14, pp. 1311–1318, 2002.
[29] M. Day and J. A. Robinson, “Constructing efficient cascade classifiers for object detection,” in Proceedings - International Conference on Image Processing (ICIP), 2010, pp. 3781–3784.
[30] EmguCV, “Behaviour Recognition Program,” 2017. [Online]. Available: http://www.emgu.com/wiki/index.php?title=Main_Page&oldid=2076.
[31] Y. Freund and R. E. Schapire, “A decision-theoretic generalization of on-line learning and an application to boosting,” J. Comput. Syst. Sci., vol. 55, no. 1, pp. 119–139, 1997.
[32] J. Edwards and D. Gibson, “Novel technology for the remote monitoring of animals,” Companion Anim. Soc. Newsl., vol. 23, no. 2, pp. 56–59, 2012.
[33] X. Xie et al., “Rodent behavioral assessment in the home cage using the SmartCage™ system,” in Animal Models of Acute Neurological Injuries II, vol. 1, Springer Protocols Handbooks, 2012, pp. 205–222.
[34] A. Markham, “On a wildlife tracking and telemetry system: A wireless network approach,” 2008.
[35] X. Tang and L. D. Sanford, “Home cage activity and activity-based measures of anxiety in 129P3/J, 129X1/SvJ and C57BL/6J mice,” Physiol. Behav., vol. 84, no. 1, pp. 105–115, 2005.
[36] P. Tamborini, H. Sigg, and G. Zbinden, “Quantitative analysis of rat activity in the home cage by infrared monitoring. Application to the acute toxicity testing of acetanilide and phenylmercuric acetate,” Arch. Toxicol., vol. 63, no. 2, pp. 85–96, 1989.
[37] G. Casadesus, B. Shukitt-Hale, and J. A. Joseph, “Automated measurement of age-related changes in the locomotor response to environmental novelty and home-cage activity,” Mech. Ageing Dev., vol. 122, no. 15, pp. 1887–1897, 2001.
[38] A. A. Megens, J. Voeten, J. Rombouts, T. F. Meert, and C. J. Niemegeers, “Behavioral activity of rats measured by a new method based on the piezo-electric principle,” Psychopharmacology (Berl.), vol. 93, no. 3, pp. 382–388, 1987.
[39] E. H. Goulding, A. K. Schenk, P. Juneja, A. W. MacKay, J. M. Wade, and L. H. Tecott, “A robust automated system elucidates mouse home cage behavioral structure,” Proc. Natl. Acad. Sci., vol. 105, no. 52, pp. 20575–20582, 2008.
[40] L. P. J. J. Noldus, A. J. Spink, and R. A. J. Tegelenbosch, “EthoVision: A versatile video tracking system for automation of behavioral experiments,” Behav. Res. Methods Instrum. Comput., vol. 33, no. 3, pp. 398–414, 2001.