Brain Tumors Diagnosis and Prediction Based on Applying the Learning Metaheuristic Optimization Techniques of Particle Swarm, Ant Colony and Bee Colony


Procedia Computer Science 163 (2019) 165–179

16th Learning and Technology Conference 2019

Rabab Hamed M. Aly a, Kamel H. Rahouma b, Hesham F.A. Hamed b

a The Higher Institute for Management Technology and Information, Minia, 61519, Egypt
b Electrical Engineering Department, Faculty of Engineering, Minia University, Minia, 61519, Egypt

Abstract

Brain tumors are intensively studied and many techniques and algorithms have been proposed to extract the features of brain MRI images and diagnose the tumors. The different techniques are distinguished and favored based on their accuracy and speed, which has led to applying optimization methods to improve them. In this paper, we apply three metaheuristic optimization methods which have recently gained much interest. These are the Binary Particle Swarm Optimization (B-PSO), the Ant colony optimization based on the Travelling Salesman problem (ACO-TSP), and the Artificial Bee colony optimization (ABCO-BFS). These methods are carried out in two main steps. The first step is the segmentation of the brain MRI images using K-means clustering, and the second step is the extraction of features from the segmented images. We compare the results of these methods to obtain the best solution for feature extraction. The group method of data handling (GMDH) is then applied to classify the extracted features for tumor diagnosis. Prediction of any future complications is also carried out by applying the GMDH method. The accuracy of classification and prediction ranges from 88.9% to 98.8%.

© 2019 The Authors. Published by Elsevier B.V.
This is an open access article under the CC BY-NC-ND license (https://creativecommons.org/licenses/by-nc-nd/4.0/)
Peer-review under responsibility of the scientific committee of the 16th International Learning & Technology Conference 2019.

Keywords: Ant Colony Optimization, Particle Swarm Optimization, Bee Colony Optimization, Group Method of Data Handling.

1. Introduction

Nowadays, computer vision and deep learning with big data play important roles in scientific research, especially in Magnetic Resonance Imaging (MRI) diagnosis. MRI is one of the most popular techniques for diagnosing brain tumors or detecting other diseases based on image processing. Most research has focused on the method of tumor detection and on how to achieve the optimal solution in classification techniques. Actually, the diagnosis of the tumor depends on processing its features. These features can be classified to represent the type of tumor and to give the main difference between one tumor and another. Many researchers have suggested computerized techniques to analyze tumors and to diagnose the degree of tumors. These techniques have utilized different tools such as Neural Networks (NN), Fuzzy Logic (FL), Optimization Techniques (OT), Deep Learning (DL), Linear Regression (LR), etc. In this paper, we have used these tools and combined them to analyze the brain tumor and classify the degree of a tumor. MATLAB software is used to simulate the results, and the Radiopaedia database MRI cases' images are used for the analysis [1].



The paper is composed of five sections. Section (1) gives an introduction and section (2) depicts a literature review. Section (3) explains the methodology, section (4) presents the results and discusses them in comparison with a set of algorithms from the literature review, and section (5) gives some conclusions. The used references are listed at the end.

2. A Literature Review

Previous research on tumor diagnosis has focused on the method of detecting the tumor and on how to classify the degree of a tumor. Many algorithms have introduced different measurements for extracting features from tumor MRI images. On the other hand, other researchers have studied these features to help in diagnosing the degree of tumors. In the following, we introduce some of the important previous works which have focused on the classification and detection of tumors.

In [2], the authors presented a survey of the methods of segmentation and classification of brain tumors and of the computer-aided diagnosis of human brain tumors. The authors explained how computer-aided analysis can improve the diagnosis of the brain tumor, and they collected a comprehensive survey of the methods of classification and detection of the brain tumor up to the date of writing the survey. In [4, 5], the authors applied other methods of detection and classification, such as fuzzy logic techniques. In [4], the authors introduced the fuzzy logic technique as a new expert system for diagnosing glaucoma. They applied the "Randomized Hough Transform" to extract some of the feature parameters and also achieved a classification of these features based on the "Fuzzy logic" technique. The accuracy of this method has been compared to other methods and it achieved more than 96%, especially in prediction. Furthermore, in [5], the authors applied a new segmentation technique, called ocular cup segmentation, to extract features of glaucoma. They achieved high exactness in the segmentation process. The authors in [6] applied a diffusion filter for denoising and classified tumors by evaluating the tumor stage (malignant or benign) to achieve high accuracy. The method achieved a detection accuracy of 99.02% for malignant tumors and 99.67% for benign tumors.

One of the most popular classification methods is the neural network. In [7], the authors used neural networks for classification; they applied the NN to classify tumor and healthy tissue and to highlight the presence of infiltrating tumor cells with high accuracy. Furthermore, the Gabor filter is considered one of the most practical filters in computer vision. In [8], the authors described the automatic recognition of brain tumors in MRI based on Gabor wavelet features and statistical features. They compared the methods and studied several classifiers. On the other hand, Particle Swarm Optimization is considered faster and more stable in some cases than genetic algorithms [9]. Some authors have introduced it as a segmentation method, such as in [10], where the authors introduced PSO as a segmentation technique for MRI and presented the types of PSO methods used for segmentation, such as the "Darwinian Particle Swarm" (DPSO). Furthermore, they classified the results based on SVM, and the complete system achieved an accuracy near to 100%. In this paper, we will also introduce it, but as a feature extraction method for brain tumor detection.
On the other hand, there are other optimization algorithms based on a search for the shortest paths, such as ant colony optimization and bee colony optimization. Actually, metaheuristic methods, especially ant colony optimization, have attracted a lot of attention in recent research, and bee colony optimization has also played an important role. In [11], the authors applied ACO (ant colony optimization) to improve the method of feature extraction based on feature subset selection. They applied it to the TIMIT database of speech signals, extracted some of the selected features, and then classified them with an artificial neural network. The accuracy of the system was 84.21%, and they compared the results with other algorithms that they also applied. In [12], the authors introduced Artificial Bee Optimization (ABO) as a segmentation method. Actually, they applied the method on MRI images. When they compared ABO with other segmentation methods such as K-means and Fuzzy C-means, ABO achieved the best results.




In this paper, we use, combine, and modify some of the mentioned methods to analyze the MRI images of the brain tumor, diagnose the types of tumors, and classify and predict future complications.

3. Methodology

The system of this paper focuses on the detection of brain tumors and on the prediction of future complications based on the diagnosis of the degree of the tumors. The structure of this paper is composed of five main units (i.e., the preprocessing stage, the ROI segmentation process, the feature extraction stage, the classification stage, and the prediction of future complications) as shown in Figure 1. The stages of the block diagram of the paper's system are discussed in the following subsections.

Fig.1: the system block diagram

3.1 The preprocessing unit:

The MRI database images are affected by some noise, so we have applied a suitable filter to enhance the images before processing. We have chosen the "Kuwahara filter". Actually, the Kuwahara filter is introduced as a noise reduction technique for medical images. The characteristic operation of this filter is nonlinear, smoothing, and adaptive noise reduction. Furthermore, the reason for using this filter is that its enhancement makes the images more suitable for the feature extraction stage [13]. The main mathematical steps to apply this filter are:

a) Consider the gray-scale image as I(x, y).
b) Take a square window of size 2a + 1 centered around (x, y).
c) Divide the square into four smaller regions Q1, Q2, Q3 and Q4 as in equations (1) to (4):

Q1(x, y) = [x, x + a] × [y, y + a]        (1)
Q2(x, y) = [x − a, x] × [y, y + a]        (2)
Q3(x, y) = [x − a, x] × [y − a, y]        (3)
Q4(x, y) = [x, x + a] × [y − a, y]        (4)

where × denotes the Cartesian product. However, pixels located on the borders between two regions belong to both regions, so there is a slight overlap between the sub-regions. Each of the four regions is characterized by its arithmetic mean m_i(x, y) and standard deviation σ_i(x, y), and the output of the Kuwahara filter φ(x, y) for any (x, y) is the mean value of the sub-region which gives the least standard deviation, as shown in equation (5) [13-14]:

φ(x, y) = m_1(x, y)  if σ_1(x, y) = min_i σ_i(x, y)
          m_2(x, y)  if σ_2(x, y) = min_i σ_i(x, y)
          m_3(x, y)  if σ_3(x, y) = min_i σ_i(x, y)        (5)
          m_4(x, y)  if σ_4(x, y) = min_i σ_i(x, y)

Note that the filter takes into account the homogeneity of the regions, so that edges are preserved, while taking the mean inside a homogeneous sub-region provides the smoothing effect. The result of applying this filter is shown in Figure (2), and the results of each block are discussed in section (4).
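To make the quadrant rule of equations (1)-(5) concrete, the following is a minimal Python/NumPy sketch (not taken from the paper's MATLAB implementation); the window parameter a = 2 and the reflective border padding are illustrative assumptions.

# Python sketch (illustrative, not from the original paper)
import numpy as np

def kuwahara(img, a=2):
    """Kuwahara filter: for each pixel, output the mean of the quadrant
    (of size (a+1) x (a+1)) that has the smallest standard deviation."""
    img = img.astype(np.float64)
    pad = np.pad(img, a, mode='reflect')           # border handling (assumption)
    out = np.zeros_like(img)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            cy, cx = y + a, x + a                  # position in the padded image
            quads = [pad[cy:cy + a + 1, cx:cx + a + 1],   # Q1: [x, x+a] x [y, y+a]
                     pad[cy:cy + a + 1, cx - a:cx + 1],   # Q2: [x-a, x] x [y, y+a]
                     pad[cy - a:cy + 1, cx - a:cx + 1],   # Q3: [x-a, x] x [y-a, y]
                     pad[cy - a:cy + 1, cx:cx + a + 1]]   # Q4: [x, x+a] x [y-a, y]
            stds = [q.std() for q in quads]
            out[y, x] = quads[int(np.argmin(stds))].mean()  # eq. (5)
    return out

In practice the per-quadrant means and standard deviations are usually computed with box filters rather than per-pixel loops; the loop form above simply mirrors equations (1)-(5).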


Fig.2: a) Enhancement of the MRI image after the Kuwahara filter for the color image; b) Enhancement of the MRI image after the Kuwahara filter for the gray image


3.2 The ROI Segmentation unit:

Actually, this unit consists of two main parts:
a) Thresholding and binarization analysis after enhancement.
b) Segmentation process based on K-means clusters.
In the following, we discuss these processes.

3.2.1 Thresholding and Binarization:

In this paper, we apply the Otsu thresholding method because of its efficiency and simplicity. The method depends on choosing the optimal threshold value that maximizes the between-class variance of the resulting object and background classes. If we need a two-class thresholding of the image, we apply equation (6); if we need multilevel thresholding, we apply equation (8). Finally, in this paper, we have obtained multilevel thresholding for the MRI database images based on an improvement of this method [15].

t* = Argmax_{0 ≤ t ≤ 255} { σ_B²(t) }        (6)

where t* is the optimal threshold, σ_B²(t) is the between-class variance, and w_1(t), w_2(t) are the probabilities of the two classes. The between-class variance that is maximized is

σ_B²(t) = w_1(t) (μ_1(t) − μ_T)² + w_2(t) (μ_2(t) − μ_T)²        (7)

where μ_1 and μ_2 are the mean values of the two classes and μ_T is the total mean [15]. Furthermore, the multilevel thresholding of the image based on M classes is

{t_1*, t_2*, ..., t_{M−1}*} = Argmax_{0 ≤ t ≤ 255} { Σ_{k=1}^{M−1} w_k (μ_k − μ_T)² }        (8)

On the other hand, we have applied binarization to the MRI images after thresholding. Actually, binarization is very important before segmentation and helps in classifying the image pixels.
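As an illustration of equations (6)-(7), the sketch below searches the 0-255 gray levels for the single threshold that maximizes the between-class variance and then binarizes the image; it is a simplified Python/NumPy version, and the multilevel case of equation (8) would search tuples of thresholds in the same way.

# Python sketch (illustrative, not from the original paper)
import numpy as np

def otsu_threshold(img):
    """Single-threshold Otsu (eq. 6-7): pick t that maximizes the
    between-class variance of the gray-level histogram."""
    hist, _ = np.histogram(img.ravel(), bins=256, range=(0, 256))
    p = hist / hist.sum()                       # gray-level probabilities
    levels = np.arange(256)
    mu_T = (levels * p).sum()                   # total mean
    best_t, best_var = 0, -1.0
    for t in range(1, 255):
        w1, w2 = p[:t].sum(), p[t:].sum()       # class probabilities w1(t), w2(t)
        if w1 == 0 or w2 == 0:
            continue
        mu1 = (levels[:t] * p[:t]).sum() / w1
        mu2 = (levels[t:] * p[t:]).sum() / w2
        var_b = w1 * (mu1 - mu_T) ** 2 + w2 * (mu2 - mu_T) ** 2   # eq. (7)
        if var_b > best_var:
            best_var, best_t = var_b, t
    return best_t

# binarization after thresholding (img is a grayscale uint8 MRI slice)
# mask = (img >= otsu_threshold(img)).astype(np.uint8)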




3.2.2 Segmentation process:

In this paper, we have applied K-means clustering to segment the tumor from the MRI images. Segmentation helps in feature extraction and in the classification of the tumor degrees. The K-means algorithm divides a set of data into K disjoint clusters. The distance of the data to the centroids is calculated with the Euclidean distance [15]. For example, for an image with resolution (x, y) clustered into K clusters, let P(x, y) be an input pixel and c_k the center of cluster k; then the K-means algorithm is as follows:

a) Define the number of clusters K.
b) For each pixel, calculate the Euclidean distance d based on equation (9):

d = || P(x, y) − c_k ||        (9)

c) Based on the distance d, assign each pixel to the nearest cluster [17], then recalculate the new position of each center based on equation (10):

c_k = (1/k) Σ_{y∈c_k} Σ_{x∈c_k} P(x, y)        (10)

d) Repeat the process until the minimum error is satisfied.
e) Extract the segmented tumor from the MRI images.

In this paper, K-means achieved an optimal segmentation as shown in figure (3), and the results are discussed in section (4).

Fig.3: K-means clustering for segmentation of the brain tumor
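A minimal Python/NumPy sketch of the intensity-based K-means segmentation of equations (9)-(10) is given below; the number of clusters K = 4 and the rule of keeping the brightest cluster as the tumor region are illustrative assumptions rather than the paper's exact settings.

# Python sketch (illustrative, not from the original paper)
import numpy as np

def kmeans_segment(img, K=4, n_iter=50, seed=0):
    """K-means on pixel intensities: assign each pixel to the nearest center
    (eq. 9), then move each center to the mean of its pixels (eq. 10)."""
    rng = np.random.default_rng(seed)
    pixels = img.reshape(-1, 1).astype(np.float64)
    centers = rng.choice(pixels.ravel(), size=K, replace=False).reshape(K, 1)
    for _ in range(n_iter):
        d = np.abs(pixels - centers.T)                 # distance to each center, eq. (9)
        labels = d.argmin(axis=1)
        new_centers = np.array([pixels[labels == k].mean() if np.any(labels == k)
                                else centers[k, 0] for k in range(K)]).reshape(K, 1)
        if np.allclose(new_centers, centers):          # stop when centers no longer move
            break
        centers = new_centers
    labels = labels.reshape(img.shape)
    tumor_mask = (labels == centers.ravel().argmax())  # keep brightest cluster (assumption)
    return labels, tumor_mask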

3.3 Feature extraction unit:

After segmentation, feature extraction is carried out based on applying three different methods, which are explained and compared in the following subsections.

3.3.1 Binary Particle Swarm Optimization (BPSO) based feature selection algorithm:

In the last few decades, many algorithms have been introduced for feature selection. Actually, one of the most practical methods to find global optimal solutions is particle swarm optimization. The concept adopted for finding optimal solutions in this paper is the Binary Particle Swarm Optimization (BPSO) selection algorithm. The data sets consist of large numbers of redundant and heuristic features; a fast preprocessing strategy is used to generate the candidate data. The preprocessed data are considered as the input for choosing the effective features based on the Binary Particle Swarm Optimization (BPSO) selection


algorithm. In the structure of PSO, the values lie in an n-dimensional search space. The swarm of particles is initialized with random values. Furthermore, each particle keeps two main values: the fitness best (sometimes called the locally best value) and the globally best value (sometimes called the best cost). Each of them has a value and a position. In addition, there is a value called the inertia weight which gives the balance between global and local exploration [16]. Assume the ith particle is X_i = (X_i1, X_i2, ..., X_in) and its velocity is V_i = (V_i1, V_i2, ..., V_in); then the velocity and position of each particle are updated in each iteration based on equations (11) and (12), respectively:

V_iO(t+1) = w" · V_iO(t) + c_1 ρ_1 (P_iO(t) − X_iO(t)) + c_2 ρ_2 (P_g(t) − X_iO(t))        (11)

X_iO(t+1) = X_iO(t) + V_iO(t+1)        (12)

Where P_g is the global best, P_i is the personal (fitness) best position, O = (1, 2, ..., n) indexes the dimensions of each particle, w" is the inertia weight, ρ_1 and ρ_2 are random numbers in [0, 1], and c_1 and c_2 are the accelerations of each particle towards the personal best and global best positions (also called the cognitive and social acceleration coefficients). In each update, P_best and g_best are compared with the present particle; if the present particle is better than the stored values, the update happens for those values. In this paper, we use the binary particle swarm optimization (BPSO). Actually, the difference between PSO and BPSO is in the particles used in BPSO: each particle in BPSO contains a set of 0's and 1's [16]. Based on the structure of BPSO, V_iO is mapped into the interval [0, 1] through the sigmoid function:

S(V_iO) = 1 / (1 + e^(−V_iO))        (13)

The update of the velocity is still based on equation (11), but the new positions of all particles are obtained as [18]:

X_iO = 1 if ρ < S(V_iO), otherwise X_iO = 0        (14)
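The following Python/NumPy sketch condenses the BPSO update of equations (11)-(14); the inertia weight and acceleration coefficients follow the values listed later in algorithm (1), while the example fitness function (variance of the selected feature columns) is only a placeholder standing in for the table (1) fitness functions.

# Python sketch (illustrative, not from the original paper)
import numpy as np

def bpso(fitness, n_bits, n_particles=20, n_iter=100, w=0.9, c1=2.0, c2=2.0, seed=0):
    """Binary PSO (eq. 11-14): velocities are real-valued, positions are 0/1
    bit strings obtained through the sigmoid rule of eq. (13)-(14)."""
    rng = np.random.default_rng(seed)
    X = rng.integers(0, 2, size=(n_particles, n_bits))         # bit positions
    V = rng.uniform(-1, 1, size=(n_particles, n_bits))         # velocities
    pbest, pbest_val = X.copy(), np.array([fitness(x) for x in X])
    g = pbest[pbest_val.argmax()].copy()                       # global best
    for _ in range(n_iter):
        r1, r2 = rng.random(X.shape), rng.random(X.shape)
        V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (g - X)  # eq. (11)
        S = 1.0 / (1.0 + np.exp(-V))                           # eq. (13)
        X = (rng.random(X.shape) < S).astype(int)              # eq. (14)
        vals = np.array([fitness(x) for x in X])
        better = vals > pbest_val
        pbest[better], pbest_val[better] = X[better], vals[better]
        g = pbest[pbest_val.argmax()].copy()
    return g, pbest_val.max()

# usage sketch: select feature columns that maximize total variance (placeholder fitness)
# features = np.random.rand(50, 12)
# mask, score = bpso(lambda b: features[:, b.astype(bool)].var() if b.any() else 0.0,
#                    n_bits=features.shape[1])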

In this paper, the fitness function has been calculated based on the effects of the behavior of all features. This method has helped to extract the optimal effective features, which achieved an accurate classification of the tumors. We used each of the equations from table (1) as a fitness function and extracted only one effective feature per function based on the BPSO optimal solution. The optimal solutions were achieved by extracting a minimum of 50 best effective features, as shown in equation (15), and we extracted the optimal feature for each fitness function. The BPSO extracted 12 features based on the fitness functions, and each extracted optimal feature helped in the classification process. In this paper, we have applied all the fitness functions and tried to extract all 12 features for each function with minimum cost and time. The feature extraction based on BPSO is shown in algorithm (1).

Table 1. Table of fitness equations

Mean:                               μ = (Σ_i x_i) / N
Standard deviation:                 σ = √( (1/N) Σ_{i=1}^{N} (x_i − μ)² )
Energy:                             Energy = Σ_{i=1}^{m} Σ_{j=1}^{n} (GLCM(i, j))²
Homogeneity:                        Homogeneity = Σ_{i=1}^{m} Σ_{j=1}^{n} GLCM(i, j) / (1 + |i − j|)
Correlation:                        Correlation = Σ_{i=1}^{m} Σ_{j=1}^{n} ( {ij} GLCM(i, j) − {μ_x μ_y} ) / (σ_x σ_y),
                                    where μ_x, μ_y and σ_x, σ_y are the means and standard deviations of the
                                    probability matrix GLCM along row x and column y
Entropy:                            ENT = − Σ_i Σ_j x(i, j) log_2 x(i, j)
Root mean square (RMS):             RMS = √( (1/N) Σ_{i=1}^{N} x(i)² )
Variance:                           σ² = Σ_{i=0}^{m−1} Σ_{j=0}^{n−1} (i − μ)² x(i, j)
Smoothness:                         Smoothness = 1 − 1 / (1 + σ²)
Kurtosis:                           Ku = (1/σ⁴) Σ_{i=0}^{m−1} (i − μ)⁴ x(i) − 3
Skewness:                           γ = (1/σ³) Σ_{i=0}^{m−1} (i − μ)³ x(i)
Inverse Difference Movement (IDM):  IDM = Σ_{i=1}^{m} Σ_{j=1}^{n} GLCM(i, j) / (1 + (i − j)²)

B_e = min_{j=1}^{M} g_j        (15)

Where g_j is the best fitness value at iteration j.
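For illustration, the sketch below computes a normalized gray-level co-occurrence matrix (GLCM) for a single horizontal offset and three of the table (1) descriptors; the use of 8 gray levels and the (0, 1) offset are simplifying assumptions.

# Python sketch (illustrative, not from the original paper)
import numpy as np

def glcm_features(img, levels=8):
    """Normalized GLCM for the (0, 1) offset plus the energy, homogeneity
    and entropy descriptors of table (1)."""
    q = np.floor(img.astype(np.float64) / 256.0 * levels).astype(int).clip(0, levels - 1)
    glcm = np.zeros((levels, levels), dtype=np.float64)
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):   # horizontal neighbor pairs
        glcm[a, b] += 1
    glcm /= glcm.sum()
    i, j = np.indices(glcm.shape)
    energy = (glcm ** 2).sum()
    homogeneity = (glcm / (1.0 + np.abs(i - j))).sum()
    nz = glcm[glcm > 0]
    entropy = -(nz * np.log2(nz)).sum()
    return {"energy": energy, "homogeneity": homogeneity, "entropy": entropy}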

In the following, the main pseudo algorithm of this method [18-21] is given.


Algorithm 1. Feature extraction (Binary-PSO)
Inputs: Constant (C1) = Constant (C2) = 2, weight (W") = 0.9, Max_iteration = 500
Outputs: Features extracted based on the selected fitness function from table (1)
Start
1) Initialize particles:
   - The boundary is based on the size of the input data (segmentation data).
   - Positions = zeros(population, nOfSelection);   // the number of selections is less than the number of boundaries
2) Velocity initialization:
   - Velocity = ones(population, nOfSelection)
3) Initialize the global best:
   - iteration = 1;
   - globalBest.value(iteration) = inf;
4) While (Max_iteration) do
   For (i = 1 to num_particles) do
     Calculate the fitness function
     Fun = Σ_{i=1}^{m} Σ_{j=1}^{n} (GLCM_x(i, j))²   // as the first fitness function; the algorithm is run to test all fitness functions in table (1)
     if (fitness of x_i > pb_i) then
       pb_i = x_i      // update p_best
     end if
     if (fitness of x_i > pg_i) then
       pg_i = x_i      // update g_best
     end if


     For (O = 1 to population size) do   // update the velocity for each dimension
       V_iO(t+1) = w" * V_iO(t) + c_1 ρ_1 (P_iO(t) − X_iO(t)) + c_2 ρ_2 (P_g(t) − X_iO(t));
       if (V_iO(t+1) > V_max) then V_iO(t+1) = V_max; end if
       if (V_iO(t+1) < V_min) then V_iO(t+1) = V_min; end if
       if (S(V_iO(t+1)) > rand(0,1)) then   // update the position
         X_iO(t+1) = 1;
       else
         X_iO(t+1) = 0;
       end if
     end for   // end of the dimensions in a particle
   end for   // end of the particles
   End while
   best solution = g;   // best value
End

The features of this method have achieved an accuracy of 86-93% in the classification part.

3.3.2 Colony optimization algorithms:

In this section, we introduce another two types of optimization, grouped under the name of colony optimization methods. The two methods are called "Ant colony optimization" and "Bee colony optimization". The main target is to apply these methods and select the fastest optimization method based on the feature selection technique.

3.3.2.1 Ant Colony optimization algorithm based on the Travelling Salesman feature selection problem (ACO-TSP):

Feature selection is a common problem in many algorithms. Feature selection plays an important role in selecting the optimal and effective features. Recently, colony optimization has achieved satisfying levels of accuracy in giving the optimal selected features. Ant colony optimization is considered one of the more attractive and successful methods for a number of different NP-hard optimization problems. This paper presents the ACO method based on the feature selection reduction technique and applies it to the MRI brain tumor database images [1]. The role of feature selection is the reduction of the feature space, which helps in improving the classification and prediction accuracy. In ant colony optimization, the indirect communication used is called a pheromone. The quantity of pheromone depends upon the distance, quality, and quantity of the food source. The artificial ant colony algorithm is introduced in [18]. The authors introduced the solution of the ant colony optimization problem and then applied it to solve the travelling salesman problem in [23]. The target of ant colony optimization is to find the shortest path to the ants' food, and the travelling salesman formulation achieves this with the shortest time and paths. In this paper, we use the ant colony optimization based on the travelling salesman problem as a feature extraction and selection method for binary MRI images of different brain tumor cases [1]. Our algorithm for the Ant Colony is based on the travelling salesman problem (ACO-TSP) and indirect pheromone communication [18, 19]. The algorithm is as follows:

a) The design of the problem solution:
In the first part of the design, the virtual ant deposits an amount of pheromone on the edges of the graph. The probability of the ant moving is calculated using equation (16), as shown in figure (4), where i is the present node for the present step of the ant movement and k is the next node.




Fig.4: The probability of the ant movement

p_k = (τ_i^α + Φ_i^β) / Σ_{N_i} (τ_i^α + Φ_i^β)        (16)

Where:
- β is the heuristic exponential weight and α is the pheromone exponential weight.
- τ_i is the previous (past) attractiveness value.
- Φ_i adds to the transition attractiveness for the ants.
- N_i is the set of nodes connected to node i, excluding the last visited node.

b) Reverse:
In this case, the internal memory of the virtual ant plays an important role. The ant reverses the path based on its internal memory. Note that the ant on its reversed path visits the nodes in the opposite order and eliminates cycles. The pheromone update along the ant movements is shown in equation (17):

τ_ij^(t+1) = τ_ij^t + Δτ_ij^t        (17)

Where:
- τ_ij^t is the value of the pheromone at step t.
- Δτ_ij^t is the deposited value of pheromone at step t; this value can be constant in some cases.

c) The last step of Ant Colony-TSP, "Evaporation of pheromones":

There is one important step to improve the ACO-TSP algorithm so that it remains optimal for short paths. This is called "Evaporation". Evaporation is provided such that no path is prematurely fixed as the shortest, and equation (18) applies it on all graph edges with a rate in the interval (0, 1):

τ_ij^(t+1) = (1 − ρ) τ_ij^t        (18)

The ACO-TSP algorithm is given in the following:

Algorithm 2. Ant colony - Travelling salesman problem (ACO-TSP)
Inputs: maximum iterations = 300, population size (number of ants, 40 to 100)
Outputs: Feature selections (values of the shortest paths)
Start
1) Set the initial value of the Pheromone Exponential Weight (alpha) = 1, Heuristic Exponential Weight (Beta) = 1 and Evaporation Rate = 0.05;
2) Design the initial matrices of:
   a) Heuristic Information Matrix [25]
   b) Pheromone Matrix
   c) Matrix to hold the best values
   d) Ant colony initial matrix
3) While (Max_iteration) do
4)   For (i = 1 to num_ants) do
5)     Calculate the tour of the ant based on equation (16)
6)     Calculate the shortest and best path
7)     if (ant(k).Cost is less than the best cost) then update the best solution; end if
     end for
     // Update Pheromones
8)   For (i = 1 to num_ants) do   // update ant_tour randomly
       For (j = 2 to num_of_variables) do
         Calculate the final tour
       end for
     end for
     // Calculate Evaporation based on formula (18)
9)   Store the best solution (best value of the shortest path)
   End while
   best solution = best value;
End
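A condensed Python sketch of the ACO-TSP loop of algorithm (2) is given below for a generic distance matrix (in the paper this matrix would be built from the segmented-image data); the transition rule uses the standard pheromone-times-heuristic form, and the deposit amount 1/length is an assumption consistent with equation (17), where Δτ depends on tour quality.

# Python sketch (illustrative, not from the original paper)
import numpy as np

def aco_tsp(dist, n_ants=40, n_iter=300, alpha=1.0, beta=1.0, rho=0.05, seed=0):
    """ACO for TSP: build tours with the pheromone/heuristic transition rule,
    evaporate pheromone (eq. 18) and deposit along the tours (eq. 17)."""
    rng = np.random.default_rng(seed)
    n = len(dist)
    eta = 1.0 / (dist + np.eye(n))             # heuristic information (diagonal guarded)
    tau = np.ones((n, n))                      # pheromone matrix
    best_tour, best_len = None, np.inf
    for _ in range(n_iter):
        tours = []
        for _ in range(n_ants):
            tour = [int(rng.integers(n))]
            unvisited = set(range(n)) - {tour[0]}
            while unvisited:
                i = tour[-1]
                cand = np.array(sorted(unvisited))
                w = (tau[i, cand] ** alpha) * (eta[i, cand] ** beta)   # transition weights
                tour.append(int(rng.choice(cand, p=w / w.sum())))
                unvisited.discard(tour[-1])
            length = sum(dist[tour[k], tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        tau *= (1.0 - rho)                      # evaporation, eq. (18)
        for tour, length in tours:              # pheromone deposit, eq. (17)
            for k in range(n):
                tau[tour[k], tour[(k + 1) % n]] += 1.0 / length
    return best_tour, best_len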

Actually, each equation in table (1) can be used as the input function of the ant colony optimization-TSP model. We have 12 equations, and each equation is repeatedly calculated over a maximum of 300 iterations. Ant colony-TSP is found to extract the features in the shortest time, and the obtained values give 92.8-99.8% accuracy for the system. These results are discussed in more detail in section (4).

3.3.2.2 Artificial Bee Colony optimization algorithm for binary feature selection (ABCO-BFS):

ABO is another type of colony optimization technique. Like the ant colony, this technique is based on the bees' search for food. The main behavior of the bees depends on two main parameters:
a) The selection based on the size of the food source.
b) The termination of low-quality food sources.
There are four phases of the algorithm [20]:

i) Initialization phase:
Firstly, the food sources are defined randomly as follows:

x_ij = x_j^min + rand(0, 1) (x_j^max − x_j^min)        (19)

where i = {1, 2, ..., SN}, SN is the number of food sources, j = {0, ..., D}, rand(0, 1) is a random variable, D is the dimension, and x_j^max and x_j^min are the maximum and minimum bounds of dimension j.

ii) Employed foragers (bee phase):

The relation between food sources and employed bees is one to one (only one food source per employed bee). This relationship is as follows [28]:

v_ij = x_ij + φ_ij (x_ij − x_kj)        (20)

where i is the index of the current food source, k is the index of a neighboring source, φ_ij is randomly distributed in [−1, 1], (i, j) are randomly selected parameters, and x_i is evaluated by the fitness function. If the value of v_i is better than x_i, the employed bee memorizes the new position of the food source; otherwise the old value is kept in memory and the number of iterations (trials) is increased by 1.

iii) Unemployed foragers (onlooker bee phase):
The onlooker bees are the ones waiting for the employed bees to share information about the food sources. Scout bees are considered responsible for exploration, while the employed bees are responsible for exploitation. According to the roulette-wheel scheme, the fitness equation for this problem is given in equation (21) and the selection scheme in equation (22) [20]:

fitness(x) = 1 / (1 + f(x))   if f(x) ≥ 0
fitness(x) = 1 + |f(x)|       if f(x) < 0        (21)

Where f is the objective function. In this paper, we have applied the fitness functions from table (1) and used them in the roulette-wheel model based on the probability in equation (22):

p_i = fitness(x_i) / Σ_{i=1}^{ns} fitness(x_i)        (22)

Where fitness(x_i) is the fitness value of solution i and ns is the total number of food sources. After all of that, we extract the optimal features; actually, this equals 100 features for each image. We then calculated the statistical features from the 100 features based on the equations in table (1). The short algorithm which we applied in this paper is given in the following:

Algorithm 3. ABCO-BFS
Inputs: maximum iterations, population size (number of bees = 100), number of onlooker bees
Outputs: Feature selections (optimal solution)
Start
1) Set the initial value of the Acceleration Coefficient Upper Bound = 1 (cycle = 1);
2) Calculate the Abandonment Limit Parameter (ALP):
   ALP = 0.6 * number of decision variables * colony size;
3) Initialize the Population Array.
4) Initialize the Best Solution Ever Found.
5) While (Max_iteration) do
   // Employed Bees
   For (i = 1 to num_bees) do
     - Choose k randomly, not equal to i.
     - Calculate the acceleration coefficient.
     - Update the bee positions.
     - Compare the two positions, keep the best value and update the position.
     - Calculate the fitness and the probability.
   End for
   // Onlooker Bees
   For (k = 1 to num_onlooker_bees) do
     - Choose k randomly, not equal to i.
     - Calculate the acceleration coefficient.
     - Update the bee positions.
     - Compare the two positions, keep the best value and update the position.
     - Calculate the fitness and the probability.
   End for
   // Scout Bees
   For (i = 1 to population size) do
     If (Abandonment Counter >= ALP)
       Counter position = 0;
     End if
   End for
   For (i = 1 to population size) do
     - Update the Best Solution Ever Found.
   End for
   - Store the best solution.
End while
End

Actually, the method based on the previous algorithm achieved a short optimization time compared with the other optimization methods; the discussion of the results is given in section (4).
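The sketch below condenses the employed, onlooker, and scout phases of algorithm (3) for a generic real-valued objective; it follows equations (19)-(22), while the colony size, the iteration count, and the abandonment limit are illustrative values rather than the paper's exact settings.

# Python sketch (illustrative, not from the original paper)
import numpy as np

def abc_optimize(f, lb, ub, n_food=20, n_iter=200, limit=50, seed=0):
    """Artificial Bee Colony: employed and onlooker bees perturb food sources
    with eq. (20), selection uses the fitness of eq. (21)-(22), and a scout
    bee re-initializes abandoned sources with eq. (19). lb, ub are arrays."""
    rng = np.random.default_rng(seed)
    dim = len(lb)
    X = lb + rng.random((n_food, dim)) * (ub - lb)            # eq. (19)
    cost = np.array([f(x) for x in X])
    trials = np.zeros(n_food, dtype=int)

    def fit(c):                                               # eq. (21)
        return 1.0 / (1.0 + c) if c >= 0 else 1.0 + abs(c)

    def neighbor(i):
        k = rng.choice([j for j in range(n_food) if j != i])
        j = rng.integers(dim)
        v = X[i].copy()
        v[j] = X[i, j] + rng.uniform(-1, 1) * (X[i, j] - X[k, j])   # eq. (20)
        return np.clip(v, lb, ub)

    def greedy(i, v):
        c = f(v)
        if c < cost[i]:
            X[i], cost[i], trials[i] = v, c, 0
        else:
            trials[i] += 1

    for _ in range(n_iter):
        for i in range(n_food):                               # employed bees
            greedy(i, neighbor(i))
        p = np.array([fit(c) for c in cost]); p /= p.sum()    # eq. (22)
        for _ in range(n_food):                               # onlooker bees
            greedy(int(rng.choice(n_food, p=p)), neighbor(int(rng.choice(n_food, p=p))))
        worn = trials.argmax()                                # scout bee
        if trials[worn] > limit:
            X[worn] = lb + rng.random(dim) * (ub - lb)
            cost[worn], trials[worn] = f(X[worn]), 0
    best = cost.argmin()
    return X[best], cost[best]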

3.4 Classification and prediction unit (GMDH method):

In this unit, we use the group method of data handling (GMDH) for classification and for prediction of the next features of the MRI brain tumor databases. The Polynomial Neural Network (PNN) is one of the most practical types of Artificial Neural Network (ANN). Since ANNs are used in classification and prediction, we use the PNN to classify the tumors and predict future complications. Furthermore, GMDH is a multilayer network which uses quadratic neurons, offering an effective solution to the modeling of non-linear systems. It is practical and accurate in predicting the behavior of the system model [21]. Ivakhnenko used a polynomial (the Ivakhnenko polynomial) with the group method of data handling (GMDH) to obtain a more complex PNN. Layer connections are simplified, and an automatic algorithm is developed to design and adjust the GMDH structure as shown in figure (5).

Fig.5: The GMDH structure

To obtain the nonlinear relationship between the inputs and outputs of the GMDH structure, a multilayer network of second-order polynomials is used. Each quadratic neuron has two inputs, and its output is calculated as described in the equations in [21]. In this paper, we use the GMDH-PNN to classify and predict the next features of the MRI brain tumor images based on the following algorithm:




Algorithm 4. GMDH classification and prediction

Classification part
%*******************************
Input data: Enter the number of images n
  - Maximum Number of Neurons (Nu) = 50 and Maximum Number of Layers (NL) = 10,
  - Alpha (AL) = 0.6, train ratio (TR) = 0.7.
Start
1) Enter the n datasets of MRI images.
   Loop i = 1 : n
     a) Enhance the MRI images using the Kuwahara filter.
     b) Thresholding method and binarization.
     c) K-means segmentation process.
2) Select the feature extraction method (BPSO, ACO-TSP, or ABCO-BFS).
3) Extract the best feature values, namely the Mean, Standard deviation, Energy, Homogeneity, Correlation, Entropy, Root mean square (RMS), Variance, Smoothness, Kurtosis, Skewness, and Inverse Difference Movement (IDM).
   Start a loop
     Use 80% of the datasets for training using the PNN structure to obtain the system coefficients.
     Use the trained system to estimate the classification of the remaining 20% of the datasets.
   End loop
4) Compute the error of the accuracy for each feature as follows:
     Err = (actual value − estimated value) / actual value * 100%
     Accuracy = 100 − Err.
5) Print the results.
%************************************************************
Prediction part
%*******************************
   Start a loop
     - Use 70% of the datasets of "patient history" for training.
     - Use the trained system to predict the remaining 30% of the current features.
   End loop
   Compute the error of the accuracy of prediction for each feature as follows:
     Err = (actual value − estimated value) / actual value * 100%
     Accuracy = 100 − Err.
6) Print the results.
End
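To make the GMDH building block concrete, the following sketch fits a single quadratic (Ivakhnenko polynomial) neuron on a pair of input features by least squares; a full GMDH-PNN would fit such neurons for every input pair and keep the best ones layer by layer, which is a general description of the structure in figure (5) rather than the paper's exact training routine.

# Python sketch (illustrative, not from the original paper)
import numpy as np

def fit_quadratic_neuron(x1, x2, y):
    """Least-squares fit of the Ivakhnenko polynomial
    y ~ a0 + a1*x1 + a2*x2 + a3*x1^2 + a4*x2^2 + a5*x1*x2."""
    A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def quadratic_neuron(coef, x1, x2):
    a0, a1, a2, a3, a4, a5 = coef
    return a0 + a1*x1 + a2*x2 + a3*x1**2 + a4*x2**2 + a5*x1*x2

# usage sketch: fit on the training split, evaluate on the held-out split
# coef = fit_quadratic_neuron(f_train[:, 0], f_train[:, 1], labels_train)
# pred = quadratic_neuron(coef, f_test[:, 0], f_test[:, 1])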

4. Results and discussions:

The system of this paper is based on classifying the brain tumor images and predicting future complications. The brain image datasets consist of 50 different types of brain tumors which are available online [1]. The system of this paper consists of three main stages. The first stage is a preprocessing stage: we applied the Kuwahara filter to enhance the brain tumor images based on the four-window method, and the practical window size is found to be 11. Furthermore, after the thresholding and binarization processes, we applied the K-means method for the segmentation process. The feature extraction process is


based on the optimal segmentation of the tumor, which helps in the classification and prediction process. The segmented tumor is shown in figure (3). Actually, we applied three methods of feature extraction. Binary particle swarm optimization is one of the practical methods to obtain the optimal solution. In this method, we used the functions in table (1) as fitness functions to get the best solution for each one. After the segmentation processing, in BPSO, we use the correlation as a fitness function to get the best value of correlation, which gives the extracted features. The optimal obtained fitness function values give the mean, standard deviation, etc. We ran our system on a laptop (Core i3, 2.5 GHz, 6 GB RAM) using MATLAB 2017a. The dataset consists of 50 different images. We applied the BPSO algorithm for feature extraction. The results showed that each feature of any image needed 12 minutes for the whole process, including all stages (preprocessing, segmentation, feature extraction, classification and prediction), with a maximum limit of 500 iterations per feature; the maximum limit of iterations for all features is therefore equal to 6000 iterations. Applying the MATLAB parallel processing tools on the laptop system, we got a total time of 8 minutes per feature, which means that for 12 features we need 96 minutes. When the GMDH process is applied in the classification and prediction process, it achieves an accuracy from 88% to 94%. Furthermore, many authors introduced the BPSO for some cases of brain tumors but did not focus on the CPU time [16].

The Ant colony optimization based on the travelling salesman problem (ACO-TSP) achieved the best solutions in the classification and prediction process. In ACO-TSP, we created a model matrix based on the segmented data, then used principal component analysis (PCA) to extract the optimal general features from the segmented images, and then calculated the 12 features from the 100 general features based on the equations in table (1); the maximum limit of iterations is 300. The model of ACO-TSP helped to save CPU time [19]. Furthermore, we applied the ACO-TSP algorithm for feature extraction on the same laptop system. The results showed that all features of any image needed 5 minutes for the whole process, including all stages (preprocessing, segmentation, feature extraction, classification, and prediction). Applying the MATLAB parallel processing tools on the laptop system, we got a total time of 2 minutes for all features. The accuracy of using ACO-TSP after classification and prediction is from 95% up to 99.8%.

ABCO-BFS is very powerful in the extraction process but more complex. When we applied ABCO-BFS, we achieved powerful and accurate results similar to the Ant colony. The difference between them in practical programming is in the stages of the process for ants rather than bees. ABCO-BFS achieved results near to ACO-TSP. The accuracy of ABCO-BFS after classification and prediction on the same laptop system is from 88% up to 95%, and the maximum limit of iterations is 300. The last results are collected in Table (2).

Table 2. Comparison between the results of the methods

Algorithm    Maximum iterations    CPU time (sec)    CPU parallel time (sec)    Accuracy of (GMDH)
BPSO         500-6000              8640 sec          5760 sec                   88%-94%
ACO-TSP      300-1000              300 sec           120 sec                    94%-100%
ABCO-BFS     300-1000              420 sec           300 sec                    88%-95%

5. Conclusion:

This paper introduced three main feature extraction techniques: Binary particle swarm optimization (BPSO), Ant colony optimization based on the travelling salesman problem (ACO-TSP), and Artificial Bee colony optimization (ABCO-BFS). The three methods achieved a maximum accuracy of up to 98% when GMDH was applied for prediction and classification. The comparison between the methods showed that ACO-TSP is the best method for feature extraction from brain tumor images.




References

[1] https://radiopaedia.org/encyclopaedia/cases/all.
[2] El-Dahshan, El-Sayed A., et al. "Computer-aided diagnosis of human brain tumor through MRI: A survey and a new algorithm." Expert Systems with Applications 41.11 (2014), pp. 5526-5545.
[3] Mathew, A. Reema, and P. Babu Anto. "Tumor detection and classification of MRI brain image using wavelet transform and SVM." Signal Processing and Communication (ICSPC), 2017 International Conference on. IEEE, 2017.
[4] Soltani, A., Battikh, T., Jabri, I., & Lakhoua, N. A new expert system based on fuzzy logic and image processing algorithms for early glaucoma diagnosis. Biomedical Signal Processing and Control, 40, 2018, pp. 366-377.
[5] Bokhari, F., Syedia, T., Sharif, M., Yasmin, M., & Fernandes, S. L. Fundus image segmentation and feature extraction for the detection of glaucoma: A new approach. Current Medical Imaging Reviews, 14(1), 2018, pp. 77-87.
[6] Chowdhury, Avirup, et al. "Detection and classification of brain tumor using ML." International Journal 9.2, 2018.
[7] Minelli, Eleonora, et al. "Neural network approach for the analysis of AFM force-distance curves for brain cancer diagnosis." Biophysical Journal 114.3, 2018, 353a.
[8] Nabizadeh, N., & Kubat, M. Brain tumors detection and segmentation in MR images: Gabor wavelet vs. statistical features. Computers & Electrical Engineering, 45, 2015, pp. 286-301.
[9] Bonyadi, M. R., & Michalewicz, Z. Particle swarm optimization for single objective continuous space problems: a review, 2017.
[10] Lahmiri, S. Glioma detection based on multi-fractal features of segmented brain MRI by particle swarm optimization techniques. Biomedical Signal Processing and Control, 31, 2017, pp. 148-155.
[11] Al-Ani, A. Ant colony optimization for feature subset selection. In WEC (2), 2005, pp. 35-38.
[12] Hancer, E., Ozturk, C., & Karaboga, D. Extraction of brain tumors from MRI images with artificial bee colony based segmentation methodology. In Electrical and Electronics Engineering (ELECO), 2013 8th International Conference on, IEEE.
[13] Kuriakose, J., & Joy, J. Image fusion using Kuwahara filter. (IJCSIT) International Journal of Computer Science and Information Technologies, Vol. 5 (4), 2014, pp. 5944-5949.
[14] Bartyzel, K. Adaptive Kuwahara filter. Signal, Image and Video Processing, 10(4), 2016, pp. 663-670.
[15] Burney, S. A., & Tariq, H. K-means cluster analysis for image segmentation. International Journal of Computer Applications, 2014, 96(4).
[16] Majid, M. A., Abidin, A. F. Z., Anuar, N. D. K., Kadiran, K. A., Karis, M. S., Yusoff, Z. M., ... & Rizman, Z. I. A comparative study on the application of binary particle swarm optimization and binary gravitational search algorithm in feature selection for automatic classification of brain tumor MRI. Journal of Fundamental and Applied Sciences, 10(2S), 2018, pp. 486-498.
[17] Gu, S., Cheng, R., & Jin, Y. Feature selection for high-dimensional classification using a competitive swarm optimizer. Soft Computing, 22(3), 2018, pp. 811-822.
[18] Wu, Yue, et al. "High-order graph matching based on ant colony optimization." Neurocomputing, 2018.
[19] Gülcü, Şaban, et al. "A parallel cooperative hybrid method based on ant colony optimization and 3-Opt algorithm for solving traveling salesman problem." Soft Computing 22.5, 2018, pp. 1669-1685.
[20] Ahmad, R., & Choubey, N. S. Review on image enhancement techniques using biologically inspired artificial bee colony algorithms and its variants. In Biologically Rationalized Computing Techniques for Image Processing Applications, Springer, Cham, 2018, pp. 249-271.
[21] Takao, Shoichiro, et al. "Deep feedback GMDH-type neural network and its application to medical image analysis of MRI brain images." Artificial Life and Robotics 23.2, 2018, pp. 161-172.