
Accepted Manuscript

Title: An automated vision system for measurement of zebrafish length using low-cost orthogonal web cameras

Authors: Qussay Al-Jubouri, Waleed Al-Nuaimy, Majid Al-Taee, Iain Young

PII: S0144-8609(17)30048-1
DOI: http://dx.doi.org/10.1016/j.aquaeng.2017.07.003
Reference: AQUE 1909

To appear in: Aquacultural Engineering

Received date: 7-3-2017
Revised date: 14-6-2017
Accepted date: 10-7-2017

Please cite this article as: Al-Jubouri, Qussay, Al-Nuaimy, Waleed, Al-Taee, Majid, Young, Iain, An automated vision system for measurement of zebrafish length using low-cost orthogonal web cameras. Aquacultural Engineering http://dx.doi.org/10.1016/j.aquaeng.2017.07.003

This is a PDF file of an unedited manuscript that has been accepted for publication. As a service to our customers we are providing this early version of the manuscript. The manuscript will undergo copyediting, typesetting, and review of the resulting proof before it is published in its final form. Please note that during the production process errors may be discovered which could affect the content, and all legal disclaimers that apply to the journal pertain.

Title

An Automated Vision System for Measurement of Zebrafish Length Using Low-Cost Orthogonal Web Cameras

Qussay Al-Jubouri 1, Waleed Al-Nuaimy 1, Majid Al-Taee 1, Iain Young 2

1 Department of Electrical Engineering and Electronics, University of Liverpool, Liverpool L69 3GJ, UK.
2 Department of Evolution, Ecology and Behaviour, Institute of Integrative Biology, University of Liverpool, Liverpool L69 7ZB, UK.

[email protected]

Abstract
Fish size inspection is becoming increasingly important in aquaculture applications and research, including growth monitoring and early indication of health problems. However, sizing of small fish has not yet been thoroughly explored in the literature due to challenges associated with rapid and unpredictable changes in the swimming direction of such fish. This paper presents a new low-cost computer vision system for estimating the length of small fish using dual synchronized orthogonal webcams. The contour and location of the fish body are identified through continuous capturing of front and side images of the fish population under study. A mathematical model that accounts for the projection error, caused mainly by the depth from the camera and by light refraction, is derived and implemented in this study. An automatic calibration procedure is also suggested to account for light refraction during the fish-length measurements. A camera-distance calibration is also performed experimentally and considered throughout the fish-length estimation process. The performance of the developed vision system is assessed experimentally using individual free-swimming adult zebrafish. The obtained results for the subjects under test have demonstrated an average estimation error of approximately 1%. Such a relatively high estimation performance demonstrates the validity of the proposed model and compares favourably to the state of the art of small-fish sizing.

Keywords
Computer vision; Free-swimming; Zebrafish; Length measurement; Web cameras; Aquaculture.

1. Introduction
Video measurement and tracking systems have been widely used in aquaculture to measure fish and play a significant role in the enhancement of fish welfare. Furthermore, the continuous growth in fish research highlights the need for monitoring devices and procedures that can acquire relevant information remotely (e.g. Beddow et al., 1996; Lines et al., 2001; Butail et al., 2012). Numerous studies on the detection and tracking of underwater objects have been reported in the literature (e.g. Stien et al., 2007; Duarte et al., 2009; Martinez-de Dios et al., 2003), including image processing-based methods (e.g. Shi et al., 2005; Shiau et al., 2012; Chaturvedi et al., 2013). A comparison between mean length estimates from simultaneous underwater visual census and diver-operated stereo-video measurements for four fish species was reported by Davis et al. (2015) to examine a technique that quantifies and corrects for observer bias in individual length estimates. In that study, the authors emphasised that an improvement in length-estimation accuracy was achieved using a dive-specific correction factor.

Several computer vision methods have also been found to be particularly effective due to their positive impact on remote monitoring under various operational conditions (e.g. Ruff et al., 1995; White, 2006). A 3D-infrared underwater monitoring system has also been proposed by Pautsina et al. (2015), using water's high level of absorption of near-infrared light as the main distance indicator, as reflected in the brightness of the camera image. Another method was reported by Rizzo et al. (2017), using paired-laser photogrammetry for fish-length estimation. In this method, a set of digital photographs was taken of each individual fish using a waterproof camera equipped with two parallel lasers mounted on both sides of the camera. It was claimed that fish-length estimation was achieved with high accuracy. However, this system requires precise alignment of the laser beams and is expensive and cumbersome to set up.

Thus, many computer vision systems based on image processing have been proposed for monitoring, tracking and size estimation of free-swimming fish. For example, Martinez-de Dios et al. (2003) reported an automated system for fish sizing based on stereo vision in sea cages, with several pre-processing algorithms to compensate for local variations in illumination. Costa et al. (2006), meanwhile, developed a geometric algorithm with Fourier analysis to outline the fish body and to estimate its size using an underwater imaging system with a dual-camera module connected to a portable waterproof PC. It was reported that the accuracy of fish-size estimation was improved by a post-processing procedure based on a neural network for error correction. Costa et al. (2009) also reported a stereo imaging system that utilizes dual cameras with an optical ranging system to estimate the size of Bluefin Tuna remotely. An Artificial Neural Network (ANN) was suggested to recognize the fish species. Furthermore, a single underwater side camera has also been utilized to measure the size of different fish species (Zion et al., 2007). Another technique for measuring a tuna's snout to fork length (SNFL) in digital images using a hand-held camera was reported in (Hsieh et al., 2011). The images were taken on the deck of tuna fishing vessels. As reported, this technique can be efficiently used to estimate the length of large stock fish but not of free-swimming small fish.

Fish sizing systems have mainly been applied to larger-sized commercial species of fish, with only a few studies and computer vision systems having been reported to address the sizing challenges of smaller fish species. For instance, a manual separation, counting and inspection process for the small commercial tropical guppy fish (Poecilia reticulata) has been reported (Karplus et al., 2003; 2005), as has gender classification based on extracting shape and colour features for the same fish species (Zion et al., 2008). The results of these studies demonstrate that such applications can have a positive effect in terms of reductions in labour cost and other aspects. Another small tropical fish species (zebrafish) has recently emerged as an experimental animal in various aspects of biological studies (e.g. Steenbergen et al., 2011; Stewart et al., 2011; Das et al., 2013), and large numbers of this kind of fish are now commonly used in laboratory experiments. There is, therefore, a demand for new monitoring, tracking, sizing and behavioural-analysis techniques applicable to zebrafish (e.g. Al-Jubouri et al., 2014; AlZubi et al., 2016). This is a developing area of research that has not yet been thoroughly explored in the literature, with currently only a limited number of studies investigating behavioural aspects of zebrafish (e.g. Siccardi et al., 2009; Papadakis et al., 2012; Papadakis et al., 2014; Al-Jubouri et al., 2015). None of these studies, however, has tackled the challenges of sizing small fish, including the rapid and unpredictable changes in the swimming direction of the fish.

Addressing these urgent needs, here we introduce a new method for estimating the length of free-swimming fish and outline its potential application to small-fish (zebrafish) sizing. A new cost-effective vision system based on dual synchronized orthogonal webcams is designed and developed. The segmented area and location of the fish body are identified through continuous capturing of front and side images of the fish. A mathematical model is proposed for fish-length estimation that accounts for the fish-length projection error. An automatic calibration procedure is also suggested to account for light refraction during the fish-length measurements. A camera-distance calibration is also assessed experimentally, and the performance of the entire sizing system is evaluated using a pair of free-swimming zebrafish.

The remainder of the paper is organized as follows. Section 2 describes the experimental setup; the mathematical model is presented in Section 3; Section 4 describes the computer-vision system; and Section 5 presents the experimental results, which are discussed in Section 6.

2. Materials and Methods
The proposed system comprises a transparent-glass swimming container (i.e. water tank), two calibration objects, two web cameras and a computer. A generic laptop is used for data collection, processing and monitoring purposes: Intel® Core™ i5-3320M CPU @ 2.6 GHz, 8.0 GB RAM, running MATLAB version 2013b (MathWorks, Inc., Natick, MA, USA) and a 64-bit Windows™ 7 operating system. The main components of this experimental setup are shown in Fig. 1 and are described briefly as follows.

2.1 Water tank
This is made up of 3 mm thick glass with dimensions of 32 × 20 × 24 cm (length, width and height), filled with water to a height of 18 cm. The top of the tank is open and exposed to ambient lighting while its bottom is clear. Two calibration objects (coins with a 2.03 cm diameter) are fixed to two adjacent sides of the tank.

2.2 Web cameras
Two generic low-cost webcams (TRIXES) with the following specifications are used in this study: video frame size of 640 × 480 pixels (i.e. 300k pixels), 29 frames per second, and a focal length of 3.85 mm. These cameras are fixed at the front and side of the swimming container/tank at different camera-tank distances, which helps capture a full view of both sides of the tank. The front camera is used as the main source of image capturing to measure the length of the fish body, while the side camera only measures the location of the subject. Knowing the camera's depth of focus helps in obtaining a clear image, but this specific information is not available in the technical specifications of the low-cost web cameras used in this study, making calculation of this factor (i.e. the depth of focus) a difficult task. Instead, the depth of focus is assessed experimentally by immersing a coin in the centre of the tank and adjusting the focal point manually until a clear image is obtained. Two snapshot images at the first and last glass surfaces of the tank are then acquired to evaluate the effect of the obtained focal point on the detection quality of the front camera. Only a slight difference in image quality was observed between the images taken at the first and last glass surfaces of the tank. The depth of focus of the front camera is therefore considered approximately equal to the depth of the tank.

Two coins are used as calibration objects to calculate the calibration factor (mm/pixel). Utilisation of these objects helps in obtaining the real physical dimensions of the fish under test as well as in estimating the camera-tank distances Z∅1 and Z∅2 automatically. A similar calibration procedure is adopted for tracking and length estimation by both cameras.

2.3 Test Subjects
Two free-swimming adult zebrafish with different lengths (42 and 45 mm) are used in this study. These fish are placed in two tanks with the previously mentioned dimensions, filled with filtered water at room temperature (25 °C). The actual length of each fish is measured using an additional small glass tank of 30 × 11 × 17 cm (length, width and height); a flat piece of white plastic is used to temporarily confine the fish under test in a desired area close to the ruler scale, as shown in Fig. 2. The actual fish length can therefore be measured using a ruler fixed at the front side of the test tank. It should be mentioned here that this manual measurement is only required for comparison with the fish-length estimate produced by the proposed vision system.


3. Mathematical Modelling
3.1 Challenges and assumptions
The typical inverse proportionality between distance and apparent length is a common phenomenon in vision systems. In vision-based sizing systems, therefore, this phenomenon is considered a measurement challenge, where the actual length of the fish needs to be reconstructed from a measured one that normally does not represent the real physical length. In the present study, the refraction through glass is negligible due to the relatively small thickness of the tank's wall. The effect of water refraction is, however, a significant parameter. The proposed computer-vision system is based on a mathematical model derived by the authors under the following assumptions:

- Distances between the cameras and the tank's wall are different (Z∅1 < Z∅2).
- The depth of focus of the front camera is approximately equal to the depth of the tank.
- The parallax effect on the measurement of the target coordinates is reduced by pre-specifying the boundaries of the detection area for each camera.
- Calibration of the fish length is performed using an object with a known diameter (a coin).
- The subject under test must pass through the camera image centre as a condition for the image-capturing process.

3.2 Camera-tank distance measurement
Calibration of the distance between the webcam and the tank's wall is performed by measuring the distance changes incrementally within a range of 5-90 cm with a step of 0.5 cm. The distance Z∅ between the camera and the calibration object is inversely proportional to the measured diameter of the object in the image, as shown in Fig. 3. In mathematical form, this relationship is given by

$Z_{\emptyset} \propto \frac{1}{d}$    (1)

$Z_{\emptyset} = K \, \frac{1}{d}$    (2)

7

where d represents the diameter of the calibration object in pixels and K is the proportionality constant. For the webcams used in this study, the value of K is assessed by varying it from 1 to 2000. At each step, the relative error (RE%) between the estimated and measured distances is calculated with the aim of identifying the optimal value of K.
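As a minimal illustration (not the authors' code), the sweep over K described above can be sketched in MATLAB as follows; the input names z_true_cm (known camera-tank distances) and d_px (coin diameter in pixels measured at each distance) are assumptions introduced here.

```matlab
% Illustrative sketch: estimate the proportionality constant K in Eq. (2),
% Z = K*(1/d), by sweeping candidate values and minimising the mean relative
% error (RE%) against the known camera-tank distances.
% Assumed inputs: z_true_cm - vector of true distances (5:0.5:90),
%                 d_px      - coin diameter in pixels measured at each distance.
function K_best = calibrate_distance_constant(z_true_cm, d_px)
    K_candidates = 1:2000;                     % search range stated in the text
    re = zeros(size(K_candidates));
    for i = 1:numel(K_candidates)
        z_est = K_candidates(i) ./ d_px;       % Eq. (2)
        re(i) = mean(abs(z_est - z_true_cm) ./ z_true_cm) * 100;   % RE%
    end
    [~, idx] = min(re);
    K_best = K_candidates(idx);
end
```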

3.3 Model derivation
Based on the data collected by CAM1 and CAM2, the derivation of the proposed model can be described with the aid of Fig. 3, as follows. The front camera, CAM1, is used to measure the projected (measured) length Lm, for which the following trigonometric relationships are derived. These relationships are equally applicable to CAM2. Fig. 4(a) demonstrates the effect of distance and water refraction on the actual fish length, while Fig. 4(b) provides the correction-model geometry. From this geometry, the angle of front projection θ1 can be calculated from

$\theta_1 = \tan^{-1}\!\left(\dfrac{L_m}{2 Z_{\emptyset 1}}\right)$    (3)

The angle of interest θ2, used for length correction, is obtained from

$\sin\theta_2 = \dfrac{L_m}{n \sqrt{L_m^2 + 4 Z_{\emptyset 1}^2}}$    (4)

where n is the refractive index of the water in the tank, which equals 1.333 (i.e. light travels 1.333 times faster in a vacuum than it does in water). Once Zd is measured by the second camera, the estimated fish length Lest is obtained from

$L_{est} = L_m + 2 Z_d \tan\!\left(\sin^{-1}\!\left(\dfrac{L_m}{n \sqrt{L_m^2 + 4 Z_{\emptyset 1}^2}}\right)\right)$    (5)
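For illustration, a minimal MATLAB sketch of the correction in Eqs. (3)-(5) is given below; it assumes Lm and Zd have already been converted to millimetres via the calibration factor, and the variable names are hypothetical.

```matlab
% Illustrative sketch of the length-correction model, Eqs. (3)-(5).
% Assumed inputs (already converted to mm via the calibration factor):
%   Lm  - projected fish length seen by the front camera
%   Zd  - distance from the tank wall to the fish, obtained from the side camera
%   Zo1 - distance from the front camera to the tank wall (Z_phi1)
function L_est = correct_fish_length(Lm, Zd, Zo1)
    n = 1.333;                                          % refractive index of water
    sin_theta2 = Lm ./ (n .* sqrt(Lm.^2 + 4*Zo1.^2));   % Eq. (4)
    L_est = Lm + 2 .* Zd .* tan(asin(sin_theta2));      % Eq. (5)
end
```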

4. The Vision System


The proposed vision system comprises four distinct stages: pre-processing, subject segmentation, length and distance measurement, and length estimation. These stages are shown in the block diagram of Fig. 5 and are described as follows.

4.1 Pre-processing
This stage consists of three main steps: camera-tank distance measurement, calibration, and background detection. These steps are outlined briefly as follows:
1) Camera-tank distance measurement – The distance is measured directly through automatic detection of the radius of the calibration object and then applying the measured relationship represented in Eq. (2).

2) Calibration – This step focuses on obtaining a factor that converts the fish length from pixels to a metric unit (mm). It is achieved by performing three sequential steps: (i) collecting a single snapshot from each camera, (ii) detecting and measuring the diameter of the calibration object (coin) in the image, and (iii) calculating the calibration factor by dividing the actual diameter of the calibration object in millimetres by the measured diameter in pixels (see the sketch at the end of this subsection). As a result, the post-calibration maximum pixel sizes are 457 and 253 micrometres for the front and side cameras, respectively.

3) Background generation – The decoded frames are added sequentially to an artificial zero (i.e. initial) image, and the result is then divided by the maximum value of the resulting sum for normalisation. In mathematical terms, this process is given by

$G = \begin{bmatrix} 0 & \cdots & 0 \\ \vdots & \ddots & \vdots \\ 0 & \cdots & 0 \end{bmatrix} + \sum_{i=1}^{f} \Omega(i)$    (6)

where Ω(i) is the grayscale conversion of the i-th RGB frame and f is the number of selected frames. In this study, f = 29 frames are considered adequate for detection of the image background. The obtained value of G is then normalised to Gb using the following formula

$G_b = \dfrac{G \, \Omega_{max}}{G_{max}}$    (7)


where Ωmax represents the maximum scalar value in an individual frame and Gmax is the maximum scalar value in the cumulative image G. The obtained Gb then represents the required background for the subsequent segmentation stage of the methodology.
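As a minimal sketch of pre-processing steps 2 and 3 above (calibration factor and background generation), the following MATLAB fragment uses assumed variable names: coin_diameter_px is the coin diameter in pixels obtained from the detection step (the detection method itself is not specified here), and frames is a cell array of decoded RGB frames.

```matlab
% Illustrative sketch of pre-processing steps 2 and 3 (variable names assumed).
% Step 2: calibration factor (mm/pixel) from the measured coin diameter in pixels.
coin_diameter_mm = 20.3;                              % actual coin diameter (2.03 cm)
mm_per_pixel     = coin_diameter_mm / coin_diameter_px;

% Step 3: background generation, Eqs. (6) and (7).
f = numel(frames);                                    % f = 29 frames in this study
G = zeros(size(rgb2gray(frames{1})));                 % artificial zero image
for i = 1:f
    G = G + double(rgb2gray(frames{i}));              % Eq. (6): accumulate grayscale frames
end
omega_max = double(max(max(rgb2gray(frames{1}))));    % maximum value in an individual frame
Gb = (G * omega_max) / max(G(:));                     % Eq. (7): normalised background
```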

4.2 Subject segmentation
After extracting both the front and side backgrounds, each of the collected frames is subtracted from the corresponding background and the resulting image is thresholded at 50% to convert it into binary format. Fig. 6 shows several images that demonstrate the implemented subject-segmentation process. As illustrated, the quality of the segmented-area image in Fig. 6(b) is relatively poor due to noise caused by the transparency of some moving parts of the fish (i.e. fins and tail). The segmented binary image of Fig. 6(b) is enhanced using two morphological operations to eliminate fragmentation in the detected regions, as illustrated in Fig. 6(c). The first morphological operation is dilation with a 3 × 3 structuring element, which can be defined as below (Gonzalez, 1977):

$A \oplus B = \{\, z \in E \mid (B^{s})_{z} \cap A \neq \varnothing \,\}$    (9)

where $B^{s}$ denotes the symmetric (reflection) of B, which can be calculated as follows:

$B^{s} = \{\, x \in E \mid -x \in B \,\}$    (10)

A single iteration is used to prevent the growing region from crossing other boundaries. The second morphological operation is erosion with a 3 × 3 structuring element, which is defined by

$A \ominus B = \{\, z \in E \mid B_{z} \subseteq A \,\}$    (11)

where $B_{z}$ is the translation of B by the vector z, which can be calculated as below:

$B_{z} = \{\, b + z \mid b \in B \,\}, \quad z \in E$    (12)

After the morphological enhancement, the resulting image is shown in Fig. 6(d). This stage not only enhances the subject image but also improves the measurement accuracy of the subject's length.
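The segmentation and morphological enhancement described above can be sketched in MATLAB as follows; the 50% threshold and the single dilation/erosion follow the text, while the frame and background variables (frame, Gb) are assumed from the previous steps.

```matlab
% Illustrative segmentation sketch: background subtraction, 50% threshold,
% then one dilation and one erosion with a 3x3 structuring element (Eqs. (9)-(12)).
frame_gray = double(rgb2gray(frame));            % current frame in grayscale
diff_img   = abs(frame_gray - Gb);               % subtract the generated background
bw         = diff_img > 0.5 * max(diff_img(:));  % threshold at 50% -> binary image
se         = strel('square', 3);                 % 3x3 structuring element
bw         = imdilate(bw, se);                   % single dilation iteration
bw         = imerode(bw, se);                    % single erosion iteration
```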

4.3 Length-distance measurements


At this stage, two properties of the segmented regions are used to measure the length and the depth distance of the tracked subject, represented by Zd plus Z∅ for each camera (Fig. 4). First, the length Lm (in pixels) of the tracked area is obtained from one camera, taking into account the orientation of the moving subject (see Fig. 6(d)). Second, the location of the area's centroid provides the corresponding subject distance Zd from the other camera.
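One way to obtain these two region properties in MATLAB is via regionprops, as sketched below under the assumption that the largest connected region in the binary image is the fish; the exact property set used by the authors is not stated, and mm_per_pixel is the calibration factor from Section 4.1.

```matlab
% Illustrative measurement sketch: length (pixels) and centroid of the segmented fish.
% Assumes the largest connected region in the binary image 'bw' is the fish.
stats = regionprops(bw, 'MajorAxisLength', 'Centroid', 'Area');
[~, k]      = max([stats.Area]);                 % pick the largest region
Lm_px       = stats(k).MajorAxisLength;          % measured length in pixels
centroid_px = stats(k).Centroid;                 % centroid; from the other camera this yields Zd
Lm_mm       = Lm_px * mm_per_pixel;              % convert to millimetres via the calibration factor
```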

4.4 Length estimation
As illustrated by the geometry in Fig. 4, the measured values of Lm and Zd for the front camera are used to estimate the fish length through the following steps:
1) Pre-length estimation: The measured values of Lm and Zd are used in Eq. (5) to obtain the estimated length of the fish, Lest.
2) Maximum-length selection: In this step, a new vector of maximum estimated lengths, Lestmax(m), containing the estimated-length values remaining after applying a certain threshold (see Fig. 7(a)), is generated from

$L_{est_{max}}(m) = \begin{cases} \text{Select}, & L_{est}(m) \geq THR \\ \text{Ignore}, & \text{otherwise} \end{cases}$    (13)

where the threshold THR is calculated from the first set of n estimated-length values, Lest(n), as

$THR = 0.5 \times \max\!\big(L_{est}(n)\big), \qquad n < m = 1{:}100$    (14)

3) Maximum-length mode: After selection of the maximum lengths, the mode of the selected length values is computed and used to identify the final value of the fish length. Fig. 7(b) shows an example of the results obtained in this experiment.
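A compact MATLAB sketch of this selection-and-mode procedure is given below; the vector L_est is assumed to hold the per-frame estimated lengths from Eq. (5), and rounding to 0.1 mm before taking the mode is an implementation assumption (the mode of unrounded continuous values is rarely repeated).

```matlab
% Illustrative sketch of maximum-length selection (Eqs. (13)-(14)) and the mode step.
% L_est is assumed to be the vector of per-frame estimated lengths (mm) from Eq. (5).
n         = 100;                                 % first set of estimates used for the threshold
THR       = 0.5 * max(L_est(1:n));               % Eq. (14)
L_est_max = L_est(L_est >= THR);                 % Eq. (13): keep values above the threshold
L_mode    = mode(round(L_est_max * 10) / 10);    % final length: mode of the selected values
                                                 % (0.1 mm rounding before the mode is an assumption)
```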

5. Results
Ten experiments were carried out on each of the two individual adult zebrafish to estimate its length. In order to maintain appropriate welfare conditions for the fish under test, these experiments were conducted at a frequency of two experiments per day. In each experiment, the fish was moved into the testing tank for about an hour for acclimation. Next, the image-capturing process was initiated and continued until 3000 frames, considered adequate for the purpose of this study (i.e. fish-length estimation), had been captured. It should be mentioned here that this process lasts for a period of up to an hour, depending on the swimming behaviour of the fish under test. The fish is then moved back to the pre-experiment environment. Finally, the acquired images were analysed using the developed vision system to estimate the fish length.

Fig. 8 compares the actual and estimated fish-length measurements. As illustrated, a difference in the range of 2-4% is observed at this stage. This error, which is comparable to that reported previously for large-fish length/mass manual estimation (Torisawa et al., 2011), can be caused by several factors including: (i) camera resolution, (ii) accuracy of the coin calibration, (iii) refraction-correction error during rapid movement of the fish, and (iv) distortion relevant to image segmentation at large depth distances and to the morphological operations.

In this study, due to the above-mentioned sources of error, a further improvement in the measurement accuracy is suggested to minimise the systematic measurement error by identifying a new error-correction factor (ε) and noting that the correct length estimate is slightly higher than the mode, as below:

$L_{corrected} = \mathrm{Mode}\big(L_{est_{max}}(m)\big) + \varepsilon \, \sigma_{L_{est_{max}}(m)}$    (15)

where $\mathrm{Mode}(L_{est_{max}}(m))$ represents the most frequent value of the maximum estimated-length vector $L_{est_{max}}(m)$, $\sigma_{L_{est_{max}}(m)}$ represents the standard deviation of this vector, and $L_{corrected}$ is the final corrected length estimate obtained using the error-correction factor ε.
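Continuing the earlier sketch, the systematic correction of Eq. (15) can be applied as follows; ε = 0.23 is the optimised value reported in Fig. 9 and L_est_max is the selected-length vector from the previous sketch.

```matlab
% Illustrative sketch of the systematic error correction, Eq. (15).
epsilon     = 0.23;                              % optimised correction factor (Fig. 9)
L_corrected = mode(round(L_est_max * 10) / 10) + epsilon * std(L_est_max);
```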

As shown in Fig. 9, the optimisation of ε is carried out by calculating the relative percentage error (RE%) of the estimated length for a range of ε values. It can be noticed that the minimum measurement error of the fish length is obtained with an ε value of around 0.23. Table 1 summarises the effect of the identified value of ε on the length-estimation accuracy in this study. The values in the table show that the average length-estimation error drops from 2-4% (Fig. 8) to around 1%: 0.77% for the fish with a length of 42 mm and 1.08% for the fish with a length of 45 mm. These results demonstrate only a slight difference between the manual and automated measurements of the fish length, which reflects the validity of the proposed model and addresses the challenge of sizing small fish.

6. Discussion
Dual orthogonal cameras were previously used for small-fish tracking (Cachat et al., 2011a; Cachat, 2013). In these studies, the cameras were positioned as side-view and top-view, and each individual track file from each camera then had to be manually synchronized and exported (as raw track data) into spreadsheets. In this setup, the cameras needed to be started manually, and their synchronization was often difficult and required careful pre-processing of the recorded data. In a typical small-fish experiment, many frames must be synchronized, making the process time-consuming and susceptible to human error. In the developed vision system, most of these practical difficulties have been addressed: the cameras are automatically synchronized and no manual camera-distance calibration is required. This not only simplifies the system setup but also improves the estimation accuracy when compared to the state of the art.

The obtained experimental findings and observations have demonstrated the feasibility of using the proposed mathematical model to develop a robust, non-destructive and low-cost fish-length estimation system. The obtained estimation accuracy for the subjects used in this study can be considered quite acceptable, and the system is easy to use in small experimental tanks in laboratory settings to estimate the length of small fish.

Despite the fact that the proposed system offers an acceptable range of accuracy, it still has the following limitations, which are currently under consideration in the authors' ongoing research. First, the time taken for each experiment is relatively long (up to 1 hour) and depends on the swimming behaviour of the fish under test, which must pass through the camera image centre. However, the experimental time can be reduced through further improvement of the mathematical model, taking into consideration the conditions in which the fish is not directly in front of the camera. Second, the detection process uses a fixed generated background and a fixed detection threshold. This makes the system prone to error due to variations in light intensity. This problem can be addressed by using smarter detection based on dynamic adaptation of both the background and the detection threshold.

This pilot study, however, is expected to support further studies on the welfare of small fish, which has not yet been thoroughly explored in the literature. For example, further studies can be conducted on the behavioural analysis of small fish as well as on the relationship between fish behaviour and size.

Acknowledgements
Financial support for this study was provided by the University of Technology, Ministry of Higher Education and Scientific Research – Iraq, Grant: MOHESR-IQ-2013-17702. The authors would also like to extend their thanks to the staff at the Department of Life Sciences at the University of Liverpool for their support in facilitating the experimental work in the Aqua-Lab.


References
Al-Jubouri, Q., Al-Nuaimy, W., Al-Taee, M.A., Luna, J.L. and Sneddon, L.U., 2015, November. Automated electrical stimulation and physical activity monitoring of zebrafish larvae. Conference on Applied Electrical Engineering and Computing Technologies (AEECT). IEEE.
Al-Jubouri, Q., Al-Nuaimy, W., Al-Taee, M.A. and Young, I.S., 2017. Towards automated length-estimation of free-swimming fish using machine vision. 14th International Multi-Conference on Systems, Signals and Devices (SSD). IEEE.
AlZubi, H.S., Al-Nuaimy, W., Buckley, J. and Young, I., 2016, March. An intelligent behavior-based fish feeding system. In Systems, Signals & Devices (SSD), 13th International Multi-Conference on (pp. 22-29). IEEE.
Beddow, T.A., Ross, L.G. and Marchant, J.A., 1996. Predicting salmon biomass remotely using a digital stereo-imaging technique. Aquaculture, 146(3-4), pp.189-203.
Brand, M., Granato, M. and Nüsslein-Volhard, C., 2002. Keeping and raising zebrafish. Zebrafish, 261, pp.7-37.
Butail, S. and Paley, D.A., 2012. Three-dimensional reconstruction of the fast-start swimming kinematics of densely schooling fish. Journal of the Royal Society Interface, 9(66), pp.77-88.
Cachat, J., Kyzar, E.J., Collins, C., Gaikwad, S., Green, J., Roth, A., El-Ounsi, M., Davis, A., Pham, M., Landsman, S. and Stewart, A.M., 2013. Unique and potent effects of acute ibogaine on zebrafish: the developing utility of novel aquatic models for hallucinogenic drug research. Behavioural Brain Research, 236, pp.258-269.
Cachat, J., Stewart, A., Utterback, E., Hart, P., Gaikwad, S., Wong, K., Kyzar, E., Wu, N. and Kalueff, A.V., 2011. Three-dimensional neurophenotyping of adult zebrafish behavior. PLoS ONE, 6(3), p.e17597.
Costa, C., Loy, A., Cataudella, S., Davis, D. and Scardi, M., 2006. Extracting fish size using dual underwater cameras. Aquacultural Engineering, 35(3), pp.218-227.
Costa, C., Scardi, M., Vitalini, V. and Cataudella, S., 2009. A dual camera system for counting and sizing Northern Bluefin Tuna (Thunnus thynnus; Linnaeus, 1758) stock, during transfer to aquaculture cages, with a semi-automatic Artificial Neural Network tool. Aquaculture, 291(3), pp.161-167.
Chaturvedi, P.P., Rajput, A.S. and Jain, A., 2013. Video object tracking based on automatic background segmentation and updating using RBF neural network. International Journal of Advanced Computer Research, 3(2), p.86.
Das, B.C., McCormick, L., Thapa, P., Karki, R. and Evans, T., 2013. Use of zebrafish in chemical biology and drug discovery. Future Medicinal Chemistry, 5(17), pp.2103-2116.
Davis, T., Harasti, D. and Smith, S.D., 2015. Compensating for length biases in underwater visual census of fishes using stereo video measurements. Marine and Freshwater Research, 66(3), pp.286-291.
Duarte, S., Reig, L. and Oca, J., 2009. Measurement of sole activity by digital image analysis. Aquacultural Engineering, 41(1), pp.22-27.
Gonzalez, R.C., 1977. Digital Image Processing.
Hsieh, C.L., Chang, H.Y., Chen, F.H., Liou, J.H., Chang, S.K. and Lin, T.T., 2011. A simple and effective digital imaging approach for tuna fish length measurement compatible with fishing operations. Computers and Electronics in Agriculture, 75(1), pp.44-51.
Karplus, I., Gottdiener, M. and Zion, B., 2003. Guidance of single guppies (Poecilia reticulata) to allow sorting by computer vision. Aquacultural Engineering, 27(3), pp.177-190.
Karplus, I., Alchanatis, V. and Zion, B., 2005. Guidance of groups of guppies (Poecilia reticulata) to allow sorting by computer vision. Aquacultural Engineering, 32(3), pp.509-520.
Lines, J.A., Tillett, R.D., Ross, L.G., Chan, D., Hockaday, S. and McFarlane, N.J.B., 2001. An automatic image-based system for estimating the mass of free-swimming fish. Computers and Electronics in Agriculture, 31(2), pp.151-168.
Martinez-de Dios, J.R., Serna, C. and Ollero, A., 2003. Computer vision and robotics techniques in fish farms. Robotica, 21(3), p.233.
Papadakis, V.M., Papadakis, I.E., Lamprianidou, F., Glaropoulos, A. and Kentouri, M., 2012. A computer-vision system and methodology for the analysis of fish behavior. Aquacultural Engineering, 46, pp.53-59.
Papadakis, V.M., Glaropoulos, A. and Kentouri, M., 2014. Sub-second analysis of fish behavior using a novel computer-vision system. Aquacultural Engineering, 62, pp.36-41.
Pautsina, A., Císař, P., Štys, D., Terjesen, B.F. and Espmark, Å.M.O., 2015. Infrared reflection system for indoor 3D tracking of fish. Aquacultural Engineering, 69, pp.7-17.
Rizzo, A.A., Welsh, S.A. and Thompson, P.A., 2017. A paired-laser photogrammetric method for in situ length measurement of benthic fishes. North American Journal of Fisheries Management, 37(1), pp.16-22.
Ruff, B.P., Marchant, J.A. and Frost, A.R., 1995. Fish sizing and monitoring using a stereo image analysis system applied to fish farming. Aquacultural Engineering, 14(2), pp.155-173.
Stien, L.H., Bratland, S., Austevoll, I., Oppedal, F. and Kristiansen, T.S., 2007. A video analysis procedure for assessing vertical fish distribution in aquaculture tanks. Aquacultural Engineering, 37(2), pp.115-124.
Shi, Y. and Karl, W.C., 2005, June. Real-time tracking using level sets. In Computer Vision and Pattern Recognition (CVPR 2005), IEEE Computer Society Conference on (Vol. 2, pp. 34-41). IEEE.
Shiau, Y.H., Lin, S.I., Lin, F.P. and Chen, C.C., 2012. Real-time fish observation and fish category database construction. International Journal of Advanced Computer Science and Applications, 3(4), pp.45-49.
Steenbergen, P.J., Richardson, M.K. and Champagne, D.L., 2011. The use of the zebrafish model in stress research. Progress in Neuro-Psychopharmacology and Biological Psychiatry, 35(6), pp.1432-1451.
Stewart, A., Wu, N., Cachat, J., Hart, P., Gaikwad, S., Wong, K., Utterback, E., Gilder, T., Kyzar, E., Newman, A. and Carlos, D., 2011. Pharmacological modulation of anxiety-like phenotypes in adult zebrafish behavioral models. Progress in Neuro-Psychopharmacology and Biological Psychiatry, 35(6), pp.1421-1431.
Siccardi III, A.J., Garris, H.W., Jones, W.T., Moseley, D.B., D'Abramo, L.R. and Watts, S.A., 2009. Growth and survival of zebrafish (Danio rerio) fed different commercial and laboratory diets. Zebrafish, 6(3), pp.275-280.
Torisawa, S., Kadota, M., Komeyama, K., Suzuki, K. and Takagi, T., 2011. A digital stereo-video camera system for three-dimensional monitoring of free-swimming Pacific bluefin tuna, Thunnus orientalis, cultured in a net cage. Aquatic Living Resources, 24(2), pp.107-112.
Viazzi, S., Van Hoestenberghe, S., Goddeeris, B.M. and Berckmans, D., 2015. Automatic mass estimation of Jade perch Scortum barcoo by computer vision. Aquacultural Engineering, 64, pp.42-48.
White, D.J., Svellingen, C. and Strachan, N.J.C., 2006. Automated measurement of species and length of fish by computer vision. Fisheries Research, 80(2), pp.203-210.
Zion, B., Alchanatis, V., Ostrovsky, V., Barki, A. and Karplus, I., 2007. Real-time underwater sorting of edible fish species. Computers and Electronics in Agriculture, 56(1), pp.34-45.
Zion, B., Alchanatis, V., Ostrovsky, V., Barki, A. and Karplus, I., 2008. Classification of guppies' (Poecilia reticulata) gender by computer vision. Aquacultural Engineering, 38(2), pp.97-104.
Zhu, L. and Weng, W., 2007. Catadioptric stereo-vision system for the real-time monitoring of 3D behavior in aquatic animals. Physiology & Behavior, 91(1), pp.106-119.

Fig. 1. System setup

Fig. 2. Actual length measurements (La) for the two tested zebrafish (e.g. Fish 1: La ≈ 42 mm).


Fig. 3. Measured relationship between the camera and the calibration object

(a) Distance and refraction effect

(b) Correction model geometry

Fig. 4. Fish-length correction model (Lest. estimated fish-length; Lm, fish-length measured by camera; Lw, fish-length affected by water refraction; Zo, the distance between the camera and the tank wall; Zd, the dynamic distance between the tank wall and the subject under test)

Fig. 5. Block diagram of the proposed vision system

(a) Raw image

(b) Segmented area

(c) Segmentation Enhancement

(d) Length measurement

Fig. 6. Example of fish-length estimation based on major length of bounding box of the segmented fish’s body area

(a) Maximum selected length

(b) Histogram of length estimation

Fig. 7. Example of fish-length estimation based on histogram mode


[Figure: comparison of Actual Length and Estimated Length for each fish; y-axis: Fish Length (mm), 39-46; x-axis: Fish Number, 1-2]

Fig. 8. Comparison between actual and estimated fish length.

Fig. 9. Optimization of the error correction factor (ε)


Table 1. Fish-length estimation accuracy without and with the systematic error correction.

Test No. | Actual length 42 mm, RE% (ε = 0) | Actual length 42 mm, RE% (ε = 0.23) | Actual length 45 mm, RE% (ε = 0) | Actual length 45 mm, RE% (ε = 0.23)
1        | 2.38 | 0.19 | 2.22 | 0.21
2        | 2.38 | 0.04 | 4.44 | 2.08
3        | 2.38 | 0.27 | 4.44 | 2.20
4        | 4.76 | 2.18 | 2.22 | 0.09
5        | 2.38 | 0.23 | 2.22 | 0.22
6        | 2.38 | 0.80 | 2.22 | 0.14
7        | 2.38 | 0.46 | 2.22 | 0.16
8        | 2.38 | 0.27 | 2.22 | 0.42
9        | 2.38 | 0.09 | 4.44 | 1.47
10       | 2.38 | 0.14 | 2.22 | 0.12
Average  | 2.71 | 0.77 | 3.06 | 1.08