
DESIGN OF TUNNEL SHOTCRETE-BOLTING SUPPORT BASED ON A SUPPORT VECTOR MACHINE

K.Y. Liu¹, C.S. Qiao², S.F. Tian³
¹ School of Civil Engineering and Architecture, Beijing Jiaotong University, China. E-mail: [email protected]
² School of Civil Engineering and Architecture, Beijing Jiaotong University, China. E-mail: [email protected]

Abstract: In this paper, the support vector machine (SVM) regression algorithm is introduced for the design of tunnel shotcrete-bolting support parameters. Unlike the classical support vector regression (SVR) algorithm, which can solve only single-output problems, an improved SVR algorithm is proposed for regression with multiple output variables, and the corresponding code is developed in MATLAB. Sixteen main factors influencing tunnel shotcrete-bolting support are chosen as input variables and six support parameters as output variables of the SVM network. Ninety-four engineering cases were collected, sixty-four of which were used as the learning sample and the remainder as the testing sample. A 16-24-6 BP artificial neural network (ANN) was applied to verify the effectiveness of the improved SVR algorithm on identical learning and testing samples. From the results calculated by the two algorithms, it can be concluded that the improved SVR algorithm is more effective than the ANN. The improved SVR algorithm and the computer program can be employed in similar engineering situations.

Keywords: tunnel; support vector regression (SVR); shotcrete-bolting support; improved SVR algorithm; BP neural network; parameter design

1. INTRODUCTION

Nowadays, the New Austrian Tunnelling Method (NATM) is widely used in tunnel construction, and shotcrete-bolting support has become an integral part of its basic construction method. However, NATM design still relies mainly on construction experience, which makes it difficult to meet varied engineering demands. Tunnel shotcrete-bolting support design is affected by many factors, and the relation between these influencing factors and the support parameters is uncertain and nonlinear, so it is difficult to represent with deterministic expressions. Wei (1999) and Tang and Qiao (2002) applied artificial neural networks (ANN) to determine tunnel support parameters and obtained fairly good results. However, ANN has the disadvantages of local optima and poor generalization. In recent years, many researchers have directed their attention to a new universal machine-learning algorithm, the support vector machine (SVM), which offers global optimization, good performance on small samples and good generalization. Feng and Zhao (2002), Zhao et al (2002) and Zhao and Feng (2003) used this algorithm to predict rock burst, classify engineering rock masses and evaluate slope stability, obtaining better results with SVM than with ANN. In this paper, the algorithm is applied to the design of tunnel shotcrete-bolting parameters and compared with the design results obtained with an ANN.

2. SUPPORT VECTOR REGRESSION

2.1 Classical support vector regression algorithm

Vapnik (1995), Campbell (2002) and Cristianini and Shawe-Taylor (2000) expatiated on the theoretical basis and algorithms of the support vector machine (SVM). Because SVM is designed to minimize structural risk whereas ANN is based on minimization of empirical risk, SVM is usually less vulnerable to overfitting. The training problem in SVM reduces to solving a convex quadratic programming (QP) problem, so the solution calculated by the SVM algorithm must be the global optimum. Support vector regression (SVR) adopts SVM to solve function-fitting problems.


Paper 3A 09 — SINOROCK2004 Symposium Int. J. Rock Mech. Min. Sci. Vol. 41, No. 3, CD-ROM, © 2004 Elsevier Ltd.

Given a training data set {(x_i, y_i)}_{i=1}^{l}, x_i ∈ X ⊂ R^d, y_i ∈ Y ⊂ R, construct a linear function to fit these data:

f(x) = w ⋅ x + b          (1)

where w denotes the weight vector and b is the bias.

We use the constraints y_i − w ⋅ x_i − b ≤ ε and w ⋅ x_i + b − y_i ≤ ε to allow for some deviation ε between the eventual targets y_i and the function f(x) = w ⋅ x + b modelling the data. We can visualize this as a band or tube of ±ε around the hypothesis function f(x); any points outside this tube can be viewed as training errors. The structure of the tube is defined by an ε-insensitive loss function:

L(y, f(x)) = 0                 if |y − f(x)| ≤ ε
L(y, f(x)) = |y − f(x)| − ε    otherwise
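As an illustrative sketch (the paper's own code is in MATLAB; this Python version and its name are ours), the ε-insensitive loss can be written as:

```python
def eps_insensitive_loss(y, fx, eps):
    """Epsilon-insensitive loss: zero inside the +/-eps tube, linear outside."""
    return max(abs(y - fx) - eps, 0.0)
```

Points inside the tube contribute no loss, which is why they do not become support vectors.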

We minimize ‖w‖² to penalize overcomplexity. To account for training errors, we also introduce slack variables ξ_i, ξ_i* for the two types of training error. These slack variables are zero for points inside the tube and increase progressively for points outside the tube according to the loss function used. This general approach is called ε-SVR and is the most common approach to SV regression, though not the only one. For a linear ε-insensitive loss function, the task is therefore to minimize

(1/2)‖w‖² + C Σ_{i=1}^{l} (ξ_i + ξ_i*)          (2)

subject to

((w ⋅ x_i) + b) − y_i ≤ ε + ξ_i
y_i − ((w ⋅ x_i) + b) ≤ ε + ξ_i*          (3)
ξ_i, ξ_i* ≥ 0,  i = 1, 2, …, l

where C is a parameter measuring the trade-off between complexity and losses. The corresponding dual problem can be derived using the now standard techniques: maximize

Σ_{i=1}^{l} (α_i − α_i*) y_i − ε Σ_{i=1}^{l} (α_i + α_i*) − (1/2) Σ_{i,j=1}^{l} (α_i − α_i*)(α_j − α_j*)(x_i ⋅ x_j)          (4)

subject to

Σ_{i=1}^{l} (α_i − α_i*) = 0
0 ≤ α_i, α_i* ≤ C,  i = 1, …, l

where α_i, α_i* denote the Lagrange multipliers of the ith learning sample. The regression function is then:

f(x) = (w ⋅ x) + b = Σ_{i=1}^{l} (α_i − α_i*)(x ⋅ x_i) + b          (5)

For a nonlinear regression problem, the data are mapped to a high-dimensional feature space by a nonlinear mapping φ so that linear regression can be executed there. In this high-dimensional feature space, the inner product of the linear regression is replaced by a kernel function, i.e.

K(x_i, x_j) = φ(x_i) ⋅ φ(x_j)          (6)

Finally, a regression function is obtained by the same deduction process as in the linear case:

f(x) = Σ_{i=1}^{l} (α_i − α_i*) K(x, x_i) + b          (7)

Only a few training samples have α_i and α_i* that are not both zero. Let

β_i = α_i − α_i*          (8)

If the β value of the ith training sample is nonzero, this training sample is called a support vector. The support vectors alone determine the regression function. Different kernel functions form different algorithms; the commonly used kernel functions include the following three types:

1) Polynomial kernel function

K(x, y) = (x ⋅ y + 1)^d          (9)

2) Radial basis function (RBF) kernel

K(x, y) = exp(−‖x − y‖² / (2σ²))          (10)

3) Sigmoid kernel function

K(x, y) = tanh(ν(x ⋅ y) + c)          (11)
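As an illustrative Python sketch (the paper's code is in MATLAB; the function names here are ours), the three kernels of eqs (9)-(11) and the regression function of eq. (7) might look like this:

```python
import math

def poly_kernel(x, y, d=3):
    """Polynomial kernel, eq. (9)."""
    return (sum(a * b for a, b in zip(x, y)) + 1) ** d

def rbf_kernel(x, y, sigma=15.0):
    """RBF kernel, eq. (10), with width parameter sigma."""
    sq = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-sq / (2.0 * sigma ** 2))

def sigmoid_kernel(x, y, nu=1.0, c=0.0):
    """Sigmoid kernel, eq. (11)."""
    return math.tanh(nu * sum(a * b for a, b in zip(x, y)) + c)

def svr_predict(x, support_vectors, betas, b, kernel=rbf_kernel):
    """Regression function of eq. (7): sum of beta_i * K(x, x_i) plus the bias b."""
    return sum(beta * kernel(x, sv) for sv, beta in zip(support_vectors, betas)) + b
```

Only the support vectors (nonzero β_i of eq. (8)) contribute to the sum in `svr_predict`.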

2.2 Improved SVR algorithm

The classical SVR algorithm can only solve single-output problems; to our knowledge, the multi-output regression problem is still unsolved in theory. We assume that the multi-output problem can be solved by a chain of single-output regressions. First, we solve the regression problem for the first output variable; then we add this output variable to the input variables so that we can solve the regression problem for the second output variable. Continuing in this way, the regression problem for the last output variable is solved in the end. The solution procedure is as follows:


(1) Training and fitting
Step 1: The input variable X of the learning samples is treated as the training input and the first output variable Y1 as the training output of the SVM network. The fitting value Y1* of the first output variable is obtained by machine learning.
Step 2: Y1 and X are combined as the training input and the second output variable Y2 is treated as the training output of the SVM network. The fitting value Y2* of the second output variable is then obtained by machine learning.
Step 3: Repeating Step 2, the fitting values of all output variables are obtained.

(2) Testing and prediction
Step 1: Adopting the SVM network obtained from the above training procedure and treating the input variable X of the testing samples as the input of the SVM network, the prediction value Y1 of the first output variable is obtained by extrapolation of this network.
Step 2: X and Y1 are combined as the input variable of the SVM network, and the prediction value Y2 of the second output variable is gained by extrapolation of this network.
Step 3: Repeating Step 2, the prediction values of all output variables are obtained.

This improved SVR algorithm is illustrated in the flow charts of Figure 1 (the number of output variables is assumed to be N).

Figure 1. Flow chart of the improved SVR algorithm: (a) training algorithm; (b) prediction algorithm.

3. INPUT AND OUTPUT VARIABLES OF TUNNEL SHOTCRETE-BOLTING SUPPORT DESIGN

Tunnel shotcrete-bolting support design is mainly influenced by 16 stability factors: excavation span, depth of excavation, rock strength, rock mass structure, density of joints, tightness, connectivity, type, filling, surface smoothness, roughness, dip and strike of discontinuities, shear zone and filling material, RQD and groundwater condition. These factors act as the input variables of the SVM network. The shotcrete thickness, the diameter, length and spacing of the rock bolts, and the diameter and spacing of the wire mesh are the design parameters, i.e. the output variables of the SVM network.
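The chained procedure of Section 2.2 can be sketched in Python as follows. This is our own illustration rather than the paper's MATLAB code; a trivial one-nearest-neighbour regressor stands in for the single-output SVR, and all class and variable names are ours:

```python
class NearestNeighbourRegressor:
    """Trivial single-output regressor, a stand-in for the SVR of Section 2.1."""
    def fit(self, X, y):
        self.X, self.y = X, y
        return self

    def predict(self, X):
        def nearest(x):
            dists = [sum((a - b) ** 2 for a, b in zip(x, xi)) for xi in self.X]
            return self.y[dists.index(min(dists))]
        return [nearest(x) for x in X]

class ChainedRegressor:
    """Improved SVR scheme: one single-output model per output variable,
    appending each output to the inputs of the next stage."""
    def __init__(self, base_factory):
        self.base_factory = base_factory

    def fit(self, X, Y):
        self.models = []
        aug = [list(x) for x in X]                      # augmented inputs
        for k in range(len(Y[0])):
            yk = [row[k] for row in Y]                  # kth output variable
            model = self.base_factory().fit([list(r) for r in aug], yk)
            self.models.append(model)
            for r, v in zip(aug, yk):                   # append true output k
                r.append(v)
        return self

    def predict(self, X):
        aug = [list(x) for x in X]
        preds = []
        for model in self.models:
            yk = model.predict(aug)
            preds.append(yk)
            for r, v in zip(aug, yk):                   # append predicted output k
                r.append(v)
        return [list(p) for p in zip(*preds)]           # one row per sample
```

In the paper's setting X has 16 components (the stability factors of Section 3) and Y has 6 (the support parameters), so six chained models are trained in sequence.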

4. REALIZATION OF THE IMPROVED SVR ALGORITHM AND APPLICATION

Wei (1999) collected ninety-five engineering cases of tunnel shotcrete-bolting support and gave the input values of the learning and testing samples in tables 3.3 and 3.4 of that work. In this paper, we delete one data point whose irrationality is obvious. The first 64 samples were chosen as the learning samples and the latter 30 samples as the testing samples. In order to examine the effectiveness of the improved SVR algorithm, a three-layer BP network was utilized to train and forecast with the same training and testing samples. The variance S² is employed to judge the fitting and prediction accuracy of the ANN and SVM networks:

S² = (1/(n−1)) Σ_{i=1}^{n} (y_i* − y_i)²          (12)

where n is the number of learning (or testing) samples and y_i*, y_i are the fitting (or prediction) and actual values of the ith learning (or testing) sample, respectively. The transfer function chosen for the BP network is:

f(x) = 1 / (1 + e^(−x))          (13)

Normalizing the sample data and using the trial-and-error method, the numbers of nodes in the input layer, the middle layer and the output layer of this BP network are 16, 24 and 6, respectively. When S² is less than 0.01, the training error descends but the prediction error ascends; that is the "overfitting" phenomenon of ANN. We consider the S² of the best BP network to be 0.01. For SVM, the network with the smallest training error must be the network with the smallest prediction error. Because the improved SVR algorithm employs sequential computation, we can obtain the optimal SVM network as long as we optimise the first output variable. Gunn (1998) provided source code programmed in MATLAB. In this paper, the ε-insensitive loss function and the RBF kernel function were adopted, and a corresponding code was developed in MATLAB based on that source code. The network with the smallest training error S² can be obtained by the following procedure (optimising only the first output variable of the learning samples):
(1) Assume the initial values of σ, ε and C are 15, 0.001 and 0.5, respectively, and the loop step length and terminal value of C are 0.5 and 500, respectively. Searching in this interval, S² is minimum when C equals 500.
(2) Take 500 instead of 0.5 as the value of C, with the initial values of σ and ε unchanged. Assume the loop step length and terminal value of ε are 0.005 and 5, respectively. When ε equals 0.251, S² is at its minimum.
(3) Replace the initial value of ε with 0.251 and assume the initial value, loop step length and terminal value of σ are 1, 1 and 25, respectively. By the loop computation, S² is at its minimum when σ equals 15.
(4) By now we have obtained the best SVM network, with C, ε and σ being 500, 0.251 and 15, respectively.
The best ANN and SVM networks have been found by machine learning; the next step is to use these networks to predict the testing samples.
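Steps (1)-(4) amount to a coordinate-wise grid search. A minimal Python sketch of this search follows (our own illustration, not the paper's MATLAB code; `train_error` is a hypothetical callback that trains the SVR with the given parameters and returns the training variance S²):

```python
def frange(start, step, stop):
    """Inclusive arithmetic grid of floats."""
    v = start
    while v <= stop + 1e-12:
        yield round(v, 10)
        v += step

def coordinate_search(train_error, C0=0.5, eps0=0.001, sigma0=15.0):
    """Sweep one parameter at a time, holding the others fixed, keeping the
    value that minimises the training variance S^2 (steps (1)-(4) above)."""
    # step (1): sweep C
    C = min(frange(C0, 0.5, 500.0), key=lambda c: train_error(c, eps0, sigma0))
    # step (2): sweep epsilon with the best C
    eps = min(frange(eps0, 0.005, 5.0), key=lambda e: train_error(C, e, sigma0))
    # step (3): sweep sigma with the best C and epsilon
    sigma = min(frange(1.0, 1.0, 25.0), key=lambda s: train_error(C, eps, s))
    return C, eps, sigma
```

A coordinate sweep of this kind is quick, but, as the conclusions note, it only explores the stated intervals and need not find the joint optimum.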
The calculation results for the testing samples are shown in Table 1.

5. ANALYSIS OF CALCULATION RESULT

Here, we define a variable S as:

S = ((S²_ANN − S²_SVM) / S²_SVM) × 100%          (14)

where S²_ANN and S²_SVM denote the S² of the results calculated by ANN and SVM, respectively. Calculated by equation (14), the S values of the design results for the six support parameters shown in Table 1 are 77%, 92%, 94%, 57%, 59% and 33%, respectively. The variances S² of the first three support parameters calculated by ANN are about twice those calculated by SVM, and the variances of the bolt spacing and wire mesh diameter calculated by ANN are about 1.5 times those of SVM. For the last support parameter, the design results of the two methods are about the same, because the fluctuation of the sample data of the latter two support parameters is too high.

Another variable e is defined to measure the relative error:

e = (1/m) Σ_{j=1}^{m} (|y_j − y'_j| / y_j) × 100%          (15)

where m denotes the number of testing samples whose actual value is nonzero, and y_j, y'_j denote the actual value and the prediction value of the jth testing sample, respectively. From Table 1, the e calculated by SVM is also much smaller than that calculated by ANN. The effectiveness of shotcrete-bolting support depends mainly on the shotcrete thickness and on the length, diameter and spacing of the bolts; the wire mesh plays only an auxiliary role. The values of e of the first four support parameters calculated by SVM do not exceed 15%. For those testing samples whose actual value is zero, the design values are greater than zero; this is acceptable in engineering because the design is conservative.
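The three evaluation measures of eqs (12), (14) and (15) can be sketched in Python as follows (an illustration rather than the paper's MATLAB code; the function names are ours):

```python
def variance(actual, fitted):
    """S^2 of eq. (12): sum of squared deviations over n - 1."""
    n = len(actual)
    return sum((f - a) ** 2 for a, f in zip(actual, fitted)) / (n - 1)

def improvement(s2_ann, s2_svm):
    """S of eq. (14): excess of the ANN variance over the SVM variance, in %."""
    return (s2_ann - s2_svm) / s2_svm * 100.0

def mean_relative_error(actual, predicted):
    """e of eq. (15): mean absolute relative error in %, taken only over
    testing samples whose actual value is nonzero."""
    pairs = [(a, p) for a, p in zip(actual, predicted) if a != 0]
    return sum(abs(a - p) / a for a, p in pairs) / len(pairs) * 100.0
```

With the shotcrete-thickness variances of Table 1, `improvement(4.55, 2.57)` gives about 77%, matching the first S value quoted above.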

6. CONCLUSIONS

(1) Being restricted in model building and in the selection of input parameters, conventional methods have difficulty designing the parameters of tunnel shotcrete-bolting support. Adopting artificial intelligence can solve this problem; as can be seen from the design results, the SVR algorithm obtains feasible and good designs.
(2) Compared with the BP network, the improved SVR algorithm greatly improves the precision of the design results, and can be employed in similar engineering situations.
(3) The method for searching the optimal parameters of the SVM network has the advantages of being feasible, simple and quick. Owing to the restrictions on the parameter intervals, the calculation effectiveness of this method is still not ideal and further studies are needed.

Table 1. The design result of tunnel shotcrete-bolting support by ANN and SVM (cells give actual value / ANN design value / SVM design value)

Serial | Shotcrete thickness (cm) | Diameter of bolt (mm) | Length of bolt (m)
1  | 8 / 9.65 / 8.58   | 16 / 19.6 / 19.88 | 2.2 / 2.53 / 2.16
2  | 12 / 14.4 / 12.22 | 18 / 19.1 / 19.24 | 2 / 2.01 / 2.23
3  | 8 / 10.8 / 10.91  | 16 / 19.6 / 20.84 | 2.2 / 2.21 / 2.33
4  | 11 / 13.7 / 13.56 | 18 / 19.1 / 19.74 | 2.4 / 2.39 / 2.38
5  | 10 / 11.2 / 12.16 | 18 / 19.8 / 21.2  | 2.4 / 2.57 / 2.4
6  | 13 / 13 / 12.42   | 20 / 21.5 / 20.1  | 2.6 / 2.58 / 2.44
7  | 16 / 17.9 / 14.42 | 22 / 20.3 / 19.26 | 2.5 / 2.28 / 2.33
8  | 15 / 14.4 / 14.66 | 22 / 19.7 / 21.49 | 2.6 / 2.46 / 2.51
9  | 18 / 18.2 / 16.25 | 22 / 24.2 / 21.74 | 2.6 / 2.58 / 2.41
10 | 15 / 17 / 17.46   | 22 / 22.2 / 22.75 | 2.6 / 2.8 / 2.45
11 | 16 / 12.2 / 14.52 | 22 / 19.4 / 20.02 | 2.6 / 2.76 / 2.45
12 | 6 / 4.44 / 5.36   | 16 / 2.4 / 10.47  | 2 / 0.6 / 1.32
13 | 8 / 9.1 / 8.7     | 16 / 18.2 / 17.34 | 2.2 / 2.26 / 2.06
14 | 9 / 9.25 / 8.18   | 18 / 17.7 / 16.8  | 2 / 2.03 / 1.98
15 | 11 / 12.6 / 11.75 | 18 / 17.2 / 18.66 | 2.4 / 2.03 / 2.3
16 | 10 / 8.41 / 10.04 | 18 / 11.9 / 17.24 | 2.4 / 1.81 / 2.27
17 | 13 / 11.1 / 12.96 | 20 / 16.9 / 19.9  | 2.6 / 2.26 / 2.5
18 | 13 / 16.3 / 14.12 | 18 / 23 / 21.77   | 2.4 / 2.68 / 2.44
19 | 12 / 10.2 / 12.15 | 18 / 15.2 / 19.16 | 2 / 2.08 / 2.23
20 | 7 / 7.04 / 8.31   | 0 / 13.2 / 14.21  | 0 / 1.72 / 1.91
21 | 16 / 13.3 / 12.68 | 22 / 20.3 / 20.26 | 2.5 / 2.32 / 2.33
22 | 15 / 16.1 / 15.19 | 22 / 20.5 / 21.6  | 2.6 / 2.6 / 2.59
23 | 15 / 12.3 / 12.69 | 22 / 21.8 / 21.26 | 2.6 / 2.49 / 2.44
24 | 5 / 6.15 / 7.29   | 14 / 13.4 / 14.81 | 1.8 / 1.77 / 1.85
25 | 6 / 6.13 / 6.11   | 16 / 8.47 / 12.89 | 2 / 1.03 / 1.56
26 | 8 / 6.5 / 9.05    | 16 / 9.19 / 15.8  | 2.2 / 1.08 / 2.06
27 | 10 / 8.83 / 9.27  | 18 / 11 / 14.93   | 2.4 / 1.32 / 2.07
28 | 13 / 12 / 13.5    | 20 / 18.3 / 20.02 | 2.6 / 2.31 / 2.52
29 | 18 / 12.3 / 15.0  | 22 / 19.7 / 21.05 | 2.6 / 2.25 / 2.4
30 | 13 / 14.9 / 14.55 | 18 / 18.2 / 19.75 | 2.4 / 2.36 / 2.47
S² | — / 4.55 / 2.57   | — / 22.86 / 11.92 | — / 0.33 / 0.17
e  | — / 14.9 / 11.5   | — / 16.6 / 9.44   | — / 13.1 / 6.75

Table 1 (continued)

Serial | Spacing of bolt (m) | Diameter of wire mesh (mm) | Spacing of wire mesh (cm)
1  | 1 / 1.14 / 0.98   | 0 / 0.79 / 2.58  | 0 / 0.38 / 6.93
2  | 0.8 / 0.9 / 0.89  | 8 / 5.13 / 4.61  | 25 / 14.06 / 14.69
3  | 1 / 1.16 / 0.94   | 0 / 0 / 0        | 0 / 0 / 0
4  | 1 / 1.1 / 0.9     | 0 / 3.01 / 2.0   | 0 / 8.91 / 7.38
5  | 1 / 0.93 / 0.9    | 6 / 1.39 / 4.49  | 20 / 3.48 / 13.08
6  | 1 / 0.9 / 0.93    | 0 / 0 / 0        | 0 / 0 / 0
7  | 1.8 / 0.69 / 0.84 | 0 / 7.06 / 3.83  | 0 / 17.57 / 12.96
8  | 1 / 1.02 / 0.84   | 6 / 7.53 / 1.83  | 20 / 18.99 / 6.93
9  | 1 / 0.89 / 0.81   | 8 / 12.29 / 8    | 20 / 30.73 / 19.83
10 | 1 / 1.09 / 0.78   | 8 / 8.01 / 8.16  | 20 / 23.32 / 21.02
11 | 1 / 1.18 / 0.88   | 8 / 6.08 / 8.16  | 20 / 18.94 / 20.2
12 | 1 / 0.29 / 0.78   | 0 / 0 / 1.02     | 0 / 2.71 / 4.67
13 | 1 / 0.88 / 0.95   | 0 / 0 / 0        | 0 / 0 / 0
14 | 1 / 0.85 / 0.96   | 0 / 0 / 0        | 0 / 0 / 0
15 | 1 / 0.88 / 0.93   | 0 / 2.29 / 1.46  | 0 / 6.27 / 5.4
16 | 1 / 0.7 / 0.95    | 0 / 0.31 / 0.71  | 0 / 1.81 / 2.97
17 | 1 / 0.86 / 0.9    | 0 / 6.81 / 2.3   | 0 / 18.65 / 8.7
18 | 1 / 0.73 / 0.88   | 6 / 0 / 1.67     | 20 / 0 / 4.95
19 | 0.8 / 1.05 / 0.89 | 8 / 7.38 / 6.49  | 25 / 21.84 / 19.39
20 | 0 / 0.75 / 0.88   | 0 / 1.14 / 0     | 0 / 3.69 / 0
21 | 0.8 / 0.84 / 0.9  | 0 / 2.7 / 2.93   | 0 / 5.82 / 8.79
22 | 1 / 0.78 / 0.84   | 6 / 9.61 / 8.23  | 20 / 25.87 / 22.82
23 | 1 / 1.11 / 0.91   | 8 / 4.93 / 3.34  | 20 / 11.3 / 10.13
24 | 1 / 0.78 / 0.9    | 0 / 0 / 0        | 0 / 0 / 0
25 | 1 / 0.67 / 0.83   | 0 / 0.39 / 0.5   | 0 / 0.66 / 2.76
26 | 1 / 0.64 / 0.91   | 0 / 0.58 / 1.44  | 0 / 4.1 / 6.71
27 | 1 / 0.57 / 0.91   | 0 / 0 / 0        | 0 / 0.05 / 0
28 | 1 / 0.95 / 0.89   | 0 / 2.79 / 5.3   | 0 / 8.97 / 17.22
29 | 1 / 1.09 / 0.84   | 8 / 8.92 / 6.51  | 20 / 25.58 / 17.48
30 | 1 / 0.92 / 0.86   | 6 / 8.86 / 7.7   | 20 / 24.87 / 20.63
S² | — / 0.11 / 0.07   | — / 8.62 / 5.43  | — / 69.9 / 52.71
e  | — / 19.91 / 12.75 | — / 40.12 / 31.23| — / 37.06 / 27.08

Acknowledgements - This research is supported by the National Natural Science Foundation of the People's Republic of China (No. 50078002). The authors greatly appreciate the assistance of Professor Y. Xiang.

7. REFERENCES
Campbell, C. 2002. Kernel methods: a survey of current techniques. Neurocomputing 48: pp. 63-84.
Cristianini, N. & Shawe-Taylor, J. 2000. An introduction to support vector machines and other kernel-based learning methods. Cambridge University Press.
Feng & Zhao. 2002. Prediction of rockburst using support vector machine. Journal of Northeastern University (Natural Science) 23(1): pp. 57-59.
Gunn, S.R. 1998. Support vector machines for classification and regression. Technical Report, University of Southampton.
Tang & Qiao. 2002. Method of artificial neural network for bolt-shotcrete support design in tunnels. China Journal of Highway and Transport 15(3): pp. 68-72.
Vapnik, V. 1995. The nature of statistical learning theory. Springer, New York.
Wei. 1999. Study of case-based design for shotcrete-bolting support in tunnel engineering and its integration with expert system and artificial neural network. Ph.D. thesis, Northern Jiaotong University, Beijing, China.
Zhao & Feng. 2003. Application of support vector machines function fitting in slope stability evaluation. Chinese Journal of Rock Mechanics and Engineering 22(2): pp. 241-245.
Zhao et al. 2002. Classification of engineering rock based on support vector machine. Rock and Soil Mechanics 23(6): pp. 698-701.

