Fuzzy Sets and Systems 58 (1993) 155-170 North-Holland
On solutions and distribution problems of the linear programming with fuzzy random variable coefficients

Qiao Zhong
Hebei Institute of Architectural Engineering, Zhangjiakou, Hebei, China

Wang Guang-yuan
Harbin Architectural and Civil Engineering Institute, Harbin, Heilongjiang, China

Received September 1992
Revised January 1993

Abstract: This paper studies the solution method and distribution problems for two kinds of linear programming with fuzzy random variable coefficients. Some concepts of solutions (for example, the fuzzy (pseudo-) random optimization solution) are introduced for these new programming problems. We prove equivalence theorems that transform the fuzzy random programming problems into a series of random programming problems. By using the simplex method for linear programming with random variable coefficients, we give the solution method for linear programming with fuzzy random variable coefficients. Formulas for the probability distribution function, projective distribution function and expectation of these new programmings are presented.

Keywords: Interval analysis; fuzzy random variable; stochastic programming; fuzzy programming; fuzzy random programming.
1. Introduction
The theory and methods of fuzzy programming have been developed over the past twenty years (see the references cited in this paper) and have become a useful tool for treating decision-making problems in fuzzy systems. However, many engineering systems contain not only fuzzy factors but also random factors, and these two kinds of uncertainty are often fused with each other, so that it is very difficult to separate them. In this case, fuzzy programming or stochastic programming alone loses its effectiveness, and it becomes necessary to build a new kind of programming for decision-making in a fuzzy random system. Wang and Qiao [20] have studied two cases of linear programming with fuzzy random variable coefficients. In this paper, we shall use the theory of fuzzy random variables (cf. [7, 13, 19]) to discuss two other cases of linear programming with fuzzy random variable coefficients. The fuzzy random programming studied in this paper and in [20] can be used to make decisions in a fuzzy random system. In Section 2, we state some basic concepts and conclusions on fuzzy random variables in preparation for our discussion. In Section 3, we introduce two kinds of models of linear programming with fuzzy random variable coefficients and state their distribution problems. In Section 4, we discuss the solution and distribution problem of Model 1. Finally, we study the solution and distribution problem of Model 2 in Section 5.

2. Preliminaries
In this section, we state some necessary concepts and results on fuzzy numbers and fuzzy random variables in preparation for our discussion.
Moore [12] introduced the concept of closed interval numbers. Denote

$I(R) = \{[a^-, a^+] \mid a^-, a^+ \in R = (-\infty, +\infty),\ a^- \le a^+\}$.

Definition 2.1. Let $[a^-, a^+], [b^-, b^+] \in I(R)$. We have

$[a^-, a^+] + [b^-, b^+] = [a^- + b^-,\ a^+ + b^+]$,  (2.1)

$[a^-, a^+] - [b^-, b^+] = [a^- - b^+,\ a^+ - b^-]$,  (2.2)

$[a^-, a^+] \cdot [b^-, b^+] = [\min(a^-b^-, a^+b^+, a^-b^+, a^+b^-),\ \max(a^-b^-, a^+b^+, a^-b^+, a^+b^-)]$,  (2.3)

$[a^-, a^+] / [b^-, b^+] = [a^-, a^+] \cdot [1/b^+,\ 1/b^-]$  (if $0 \notin [b^-, b^+]$),  (2.4)

$[a^-, a^+] \vee [b^-, b^+] = [a^- \vee b^-,\ a^+ \vee b^+]$,  (2.5)

$[a^-, a^+] \wedge [b^-, b^+] = [a^- \wedge b^-,\ a^+ \wedge b^+]$.  (2.6)

Obviously, for any $r \in R$, if $r = [r, r]$, then

$r \cdot [a^-, a^+] = [ra^-, ra^+]$ if $r \ge 0$, and $r \cdot [a^-, a^+] = [ra^+, ra^-]$ if $r \le 0$.  (2.7)

The order relation '$\le$' is defined by $[a^-, a^+] \le [b^-, b^+]$ if and only if $a^- \le b^-$ and $a^+ \le b^+$.
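To make Definition 2.1 concrete, here is a minimal Python sketch of the closed-interval arithmetic above; the class name, the method names and the example values are ours, not the paper's.

```python
# A minimal sketch of the closed-interval arithmetic of Definition 2.1.
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float  # a^-
    hi: float  # a^+

    def __add__(self, other):            # (2.1)
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):            # (2.2)
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):            # (2.3)
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))

    def __truediv__(self, other):        # (2.4), only when 0 is not in [b^-, b^+]
        assert other.lo > 0 or other.hi < 0
        return self * Interval(1.0 / other.hi, 1.0 / other.lo)

    def join(self, other):               # (2.5): componentwise maximum
        return Interval(max(self.lo, other.lo), max(self.hi, other.hi))

    def meet(self, other):               # (2.6): componentwise minimum
        return Interval(min(self.lo, other.lo), min(self.hi, other.hi))

    def scale(self, r):                  # (2.7)
        return Interval(r * self.lo, r * self.hi) if r >= 0 else Interval(r * self.hi, r * self.lo)

    def __le__(self, other):             # order relation: a^- <= b^- and a^+ <= b^+
        return self.lo <= other.lo and self.hi <= other.hi

# Example: [1, 2] * [-1, 3] = [-2, 6]
print(Interval(1, 2) * Interval(-1, 3))
```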
Denote the set of all bounded closed fuzzy numbers on $R$ by $\mathscr{F}_0(R)$ (where a bounded closed fuzzy number is the same as in [20]). It is clear that if $f \in \mathscr{F}_0(R)$, then for any $\alpha \in (0, 1]$, $f_\alpha = [f_\alpha^-, f_\alpha^+]$ is a closed interval number on $R$, and $f = \bigcup_{\alpha \in (0,1]} \alpha [f_\alpha^-, f_\alpha^+]$. Observe that $R \subset I(R) \subset \mathscr{F}_0(R)$.

Definition 2.2. Suppose that $*$ is an algebraic operation on $R$, and $f, g \in \mathscr{F}_0(R)$. The algebraic operation $*$ on $\mathscr{F}_0(R)$ is defined by
$(f * g)(z) = \bigvee_{z = x * y} (f(x) \wedge g(y))$.  (2.8)

Particularly, when $*$ is '+', '$-$', '$\cdot$', '/' respectively, we obtain the corresponding operations on $\mathscr{F}_0(R)$. Let $r \in R$. We define $r(x) = 1$ if $x = r$ and $r(x) = 0$ if $x \ne r$. Then $r \in \mathscr{F}_0(R)$, and we have $(r * f)(z) = \bigvee_{z = x * y} (r(x) \wedge f(y))$. The order relation '$\le$' on $\mathscr{F}_0(R)$ is defined by $f \le g$ if and only if $f_\alpha \le g_\alpha$ for any $\alpha \in (0, 1]$.

Theorem 2.3. Let $f, g \in \mathscr{F}_0(R)$. For any $\alpha \in (0, 1]$ we have $(f * g)_\alpha = f_\alpha * g_\alpha$, where $*$ may be any continuous algebraic operation. Particularly, $(r \cdot f)_\alpha = r \cdot f_\alpha$, where $r \in R$ and $r \cdot f_\alpha = [r, r] \cdot f_\alpha$.

Definition 2.4. Let $\{f_t \mid t \in T\} \subseteq \mathscr{F}_0(R)$. (1) $\bigwedge_{t \in T} f_t$ is defined by a fuzzy number $g \in \mathscr{F}_0(R)$ such that

$g_\alpha = [\bigwedge_{t \in T} (f_t)_\alpha^-,\ \bigwedge_{t \in T} (f_t)_\alpha^+]$ for any $\alpha \in (0, 1]$.
(2) $\bigvee_{t \in T} f_t$ is defined by a fuzzy number $g \in \mathscr{F}_0(R)$ such that

$g_\alpha = [\bigvee_{t \in T} (f_t)_\alpha^-,\ \bigvee_{t \in T} (f_t)_\alpha^+]$ for any $\alpha \in (0, 1]$,

where $(f_t)_\alpha = [(f_t)_\alpha^-, (f_t)_\alpha^+]$ and $T$ is an index set.

The references [7, 13, 19] give the concept of a fuzzy random variable under different conditions. In this paper, we adopt the following definition, based on the results given in [19].

Definition 2.5. Let $(\Omega, \mathscr{A}, P)$ be a probability measure space. A mapping $\tilde a : \Omega \to \mathscr{F}_0(R)$ is called a fuzzy random variable on $(\Omega, \mathscr{A})$ if for any $\alpha \in (0, 1]$,

$\tilde a_\alpha(w) = \{x \mid x \in R,\ \tilde a(w)(x) \ge \alpha\} = [a_\alpha^-(w), a_\alpha^+(w)]$

is a random interval, namely, $a_\alpha^-(w)$ and $a_\alpha^+(w)$ are two random variables on $(\Omega, \mathscr{A}, P)$. Denote the set of all fuzzy random variables on $(\Omega, \mathscr{A})$ by $FR(\Omega)$.
Remark. We may prove that Definition 2.5 is equivalent to the concepts in [7, 13, 19] when the value region of a fuzzy random variable is taken as $\mathscr{F}_0(R)$.

Definition 2.6. Let $*$ be an algebraic operation on $\mathscr{F}_0(R)$. The algebraic operation $*$ on $FR(\Omega)$ may be defined by

$(\tilde a * \tilde b)(w) \triangleq \tilde a(w) * \tilde b(w)$ for any $w \in \Omega$,  (2.9)

and $\tilde a \le \tilde b$ if and only if $\tilde a(w) \le \tilde b(w)$ for any $w \in \Omega$, where $\tilde a, \tilde b \in FR(\Omega)$. Furthermore, let $\{\tilde a_t \mid t \in T\} \subseteq FR(\Omega)$. Then

$(\bigwedge_{t \in T} \tilde a_t)(w) \triangleq \bigwedge_{t \in T} \tilde a_t(w)$ for any $w \in \Omega$.  (2.10)
Definition 2.7. Suppose that $\tilde a \in FR(\Omega)$, $\tilde a_\alpha(w) = [a_\alpha^-(w), a_\alpha^+(w)]$, $\alpha \in (0, 1]$.
(1) For any $\alpha \in (0, 1]$, the function

$PD_\alpha(x) \triangleq P(\{w \mid w \in \Omega,\ x \in \tilde a_\alpha(w) = [a_\alpha^-(w), a_\alpha^+(w)]\})$  (2.11)

is called a projective distribution function of $\tilde a$, where $x \in R$.
(2) Suppose that $F_\alpha^-(x)$ (resp. $F_\alpha^+(x)$) is the probability distribution function of $a_\alpha^-(w)$ (resp. $a_\alpha^+(w)$). The function $\tilde F(x) = \bigcup_{\alpha \in (0,1]} \alpha [F_\alpha^+(x), F_\alpha^-(x)]$ is called the fuzzy probability distribution function of $\tilde a$. We may prove that $F_\alpha^+(x) \le F_\alpha^-(x)$ and $PD_\alpha(x) = F_\alpha^-(x + 0) - F_\alpha^+(x)$, where

$F_\alpha^-(x + 0) = P(\{w \mid w \in \Omega,\ a_\alpha^-(w) \le x\})$,  $F_\alpha^+(x) = P(\{w \mid w \in \Omega,\ a_\alpha^+(w) < x\})$.

Definition 2.8. Let $\tilde a \in FR(\Omega)$ be such that $\tilde a_\alpha(w) = [a_\alpha^-(w), a_\alpha^+(w)]$. The expectation of $\tilde a$ is defined by $E(\tilde a) = \bigcup_{\alpha \in (0,1]} \alpha [E(a_\alpha^-), E(a_\alpha^+)]$, where $E(a_\alpha^-)$ and $E(a_\alpha^+)$ are the expectations of $a_\alpha^-$ and $a_\alpha^+$ respectively.
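As a concrete illustration of Definitions 2.5-2.8, the following Monte Carlo sketch estimates the $\alpha$-cut of the expectation, the corresponding cut of the fuzzy distribution function and the projective distribution function for one invented fuzzy random variable; the triangular shape and the exponential law of its centre are our assumptions, not anything prescribed by the paper.

```python
# A Monte Carlo sketch of Definitions 2.5-2.8 for one illustrative fuzzy random
# variable: a~(w) is triangular with random centre r(w) ~ Exp(1) and unit spreads,
# so a_alpha^-(w) = r(w) - 1 + alpha and a_alpha^+(w) = r(w) + 1 - alpha.
import numpy as np

rng = np.random.default_rng(0)
r = rng.exponential(scale=1.0, size=100_000)   # samples of the underlying r(w)

alpha, x = 0.5, 1.2
a_lo = r - 1 + alpha        # a_alpha^-(w)
a_hi = r + 1 - alpha        # a_alpha^+(w)

# Definition 2.8: the alpha-cut of E(a~) is [E(a_alpha^-), E(a_alpha^+)].
print("E-cut at alpha:", a_lo.mean(), a_hi.mean())

# Definition 2.7(2): [F_alpha^+(x), F_alpha^-(x)] is the cut of the fuzzy d.f. at x.
F_minus = (a_lo <= x).mean()     # distribution function of a_alpha^-
F_plus  = (a_hi <= x).mean()     # distribution function of a_alpha^+
print("fuzzy d.f. cut:", F_plus, F_minus)

# Definition 2.7(1): projective distribution PD_alpha(x) = P(x in [a_alpha^-, a_alpha^+]).
print("PD_alpha(x):", ((a_lo <= x) & (x <= a_hi)).mean())
```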
3. Models of linear programming with fuzzy random variable coefficients

In this section, we first introduce two models of linear programming with fuzzy random variable coefficients (which are different from those studied in [20]). Then we state their distribution problems.
Denote $R(\Omega) = \{h \mid h$ is a random variable on $(\Omega, \mathscr{A}, P)\}$. Suppose that $\tilde a_{ij}, \tilde b_i, \tilde c_j \in FR(\Omega)$, $a_{ij}, b_i \in R(\Omega)$, $x_j \in \{x \mid x : \Omega \to R\}$, $\tilde x_j \in \{\tilde x \mid \tilde x : \Omega \to \mathscr{F}_0(R)\}$, where $i = 1, \ldots, m$; $j = 1, \ldots, n$. Write

$\tilde A = (\tilde a_{ij})_{m \times n}$, $\tilde B = (\tilde b_1, \ldots, \tilde b_m)^T$, $\tilde C = (\tilde c_1, \ldots, \tilde c_n)^T$, $\tilde X = (\tilde x_1, \ldots, \tilde x_n)^T$,
$A = (a_{ij})_{m \times n}$, $B = (b_1, \ldots, b_m)^T$, $X = (x_1, \ldots, x_n)^T$.
We introduce the following two models.

Model 1 (Linear programming of objective function with fuzzy random variable coefficients). Find $X$ to minimize

$W(\tilde C, X) = \tilde C^T X$  (3.1a)

subject to

$A X \le B$,  (3.1b)
$X \ge 0$,  (3.1c)
where $X \ge 0$ if and only if $x_j \ge 0$, $j = 1, \ldots, n$.

Model 2 (Linear programming with fuzzy random variable coefficients and decision vector $\tilde X$). Find $\tilde X$ to minimize

$W(\tilde C, \tilde X) = \tilde C^T \tilde X$  (3.2a)

subject to

$\tilde A \tilde X \le \tilde B$,  (3.2b)
$\tilde X \ge 0$,  (3.2c)

where $\tilde X \ge 0$ means $(\tilde x_j)_\alpha \ge 0$ for any $\alpha \in (0, 1]$ and any $j$.

Distribution problem of Model 1. Find the fuzzy probability distribution function, projective distribution function and expectation of $\tilde W(w) = \min \tilde C^T X$ in Model 1.

Distribution problem of Model 2. Find the fuzzy probability distribution function, projective distribution function and expectation of $\tilde W(w) = \min \tilde C^T \tilde X$ in Model 2.

Remark. Evidently, stochastic linear programming (cf. [8, 14, 18]), linear programming with fuzzy number coefficients (cf. [4]) and classical linear programming are special cases of Models 1 and 2 (cf. [20]).
4. The solution and distribution problem on Model 1

In this section, we first discuss some problems on linear programming with random variable coefficients. Then we study the solution and distribution problem of Model 1. Let us first consider the following problem (RP1): Find $X$ to minimize

$W(C, X) = C^T X$  (4.1a)

subject to

$A X = B$,  (4.1b)
$X \ge 0$,  (4.1c)

where $A = (a_{ij})_{m \times n}$, $B = (b_1, \ldots, b_m)^T$, $C = (c_1, \ldots, c_n)^T$, $X = (x_1, \ldots, x_n)^T$, $a_{ij}, b_i, c_j \in R(\Omega)$, $x_j \in \{x \mid x : \Omega \to R\}$, $i = 1, \ldots, m$; $j = 1, \ldots, n$. For convenience, we take $D = (A_1, \ldots, A_m)$ as the base matrix, chosen such that $D^{-1} B \ge 0$. Thus $A = (D\ N)$, and in correspondence with this we may denote $X = (X_D^T\ X_N^T)^T$, $C^T = (C_D^T\ C_N^T)$, where $N = (A_{m+1}, \ldots, A_n)$, $X_D = (x_1, \ldots, x_m)^T$, $X_N = (x_{m+1}, \ldots, x_n)^T$, $C_D = (c_1, \ldots, c_m)^T$, $C_N = (c_{m+1}, \ldots, c_n)^T$. The constraint (4.1b) of (RP1) may then be written as follows:

$X_D + D^{-1} N X_N = D^{-1} B$.  (4.2)
Thus we have

$C^T X = C_D^T X_D + C_N^T X_N = C_D^T D^{-1} B + (C_N^T - C_D^T D^{-1} N) X_N = C_D^T D^{-1} B + (C^T - C_D^T D^{-1} A) X$.  (4.3)

Denote

$W_1 \triangleq C_D^T D^{-1} B$,  (4.4)

$(c'_{m+1}, \ldots, c'_n) \triangleq C_N^T - C_D^T D^{-1} N$,  (4.5)

$(b'_1, \ldots, b'_m)^T \triangleq D^{-1} B$,  (4.6)

$(a'_{ij})_{i = 1, \ldots, m;\ j = m+1, \ldots, n} \triangleq D^{-1} N$.  (4.7)
Then the problem (RP1) may be expressed by (RP2): Minimize

$W(C, X) = W_1 + \sum_{j = m+1}^{n} c'_j x_j$  (4.8a)

subject to

$x_i + \sum_{j = m+1}^{n} a'_{ij} x_j = b'_i$, $i = 1, \ldots, m$; $x_j \ge 0$, $j = 1, \ldots, n$.  (4.8b)
It is clear that $X' = (b'_1, \ldots, b'_m, 0, \ldots, 0)^T$ is a basic feasible solution of (RP2). Wang and Qiao [20] discussed the simplex algorithm for (RP1); namely, they proved the following theorem.

Theorem 4.1. If $c'_j \ge 0$ ($j = m+1, \ldots, n$) in (RP2), then $X'$ is an optimization solution of (RP1), and $\min C^T X = W_1$. If (RP2) satisfies the following conditions:
(1) there exists $j_0$ ($m+1 \le j_0 \le n$) such that $c'_{j_0} < 0$;
(2) there exists $i_0$ ($1 \le i_0 \le m$) such that $a'_{i_0 j_0} > 0$;
then (RP2) has a new basic feasible solution $X''$ such that $C^T X'' < C^T X'$.

This theorem provides a foundation for finding the optimization solutions of Models 1 and 2.

Assume that $H = (A, B, C)$ is an $r$-random vector consisting of all elements of $A$, $B$ and $C$ in (RP1), where $r = m + n + mn$. Thus (RP1) may be expressed by (RP3):

$W(H) = \min C^T X$  (4.9a)

subject to

$A(H) X = B(H)$,  (4.9b)
$X \ge 0$.  (4.9c)
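For a single realization of the random coefficients, the reduction (4.2)-(4.7) and the optimality test of Theorem 4.1 are routine matrix computations. The following numpy sketch carries them out on invented $2 \times 4$ data; all names and numbers are ours.

```python
# A numpy sketch of the reduction (4.2)-(4.7) for one realization of A, B, C.
# D is the basis formed by the first m columns, as in the text.
import numpy as np

A = np.array([[1.0, 0.0, 2.0, 1.0],
              [0.0, 1.0, 1.0, 3.0]])
B = np.array([4.0, 6.0])
C = np.array([3.0, 2.0, 5.0, 4.0])
m = B.size

D, N = A[:, :m], A[:, m:]
C_D, C_N = C[:m], C[m:]
D_inv = np.linalg.inv(D)

W1      = C_D @ D_inv @ B          # (4.4): objective value of the basic solution
c_red   = C_N - C_D @ D_inv @ N    # (4.5): reduced costs c'_{m+1}, ..., c'_n
b_prime = D_inv @ B                # (4.6): right-hand side b'_1, ..., b'_m
a_prime = D_inv @ N                # (4.7): canonical coefficients a'_{ij}

print(W1, c_red, b_prime, a_prime)
# Theorem 4.1: if all reduced costs were >= 0, X' = (b', 0) would be optimal with
# value W1; here they are negative and a' has positive entries, so a pivot improves X'.
```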
Definition 4.2. (1) Let $D(H)$ be an $m$-submatrix of $A(H)$. If there exists $w \in \Omega$ such that $\mathrm{Det}(D(H(w))) \ne 0$, then $D(H)$ is called almost everywhere nonsingular.
(2) Let $\{D_i(H) \mid i = 1, \ldots, p\}$ be the class of all almost everywhere nonsingular $m$-submatrices of $A(H)$. If $w \in \Omega$ is such that $\mathrm{Det}(D_i(H(w))) = 0$, we define $D_i^{-1}(H(w)) = 0$ ($i = 1, \ldots, p$). We construct the sets

$U_i = \{H(w) \mid w \in \Omega,\ D_i^{-1}(H) B(H)\big|_w \ge 0,\ (C^T(H) - C_{D_i}^T D_i^{-1}(H) A(H))\big|_w \ge 0\}$,

$V_i = U_i \setminus \bigcup_{j=1}^{i-1} U_j$,  $i = 1, \ldots, p$.

Then the $U_i$ or $V_i$ ($i = 1, \ldots, p$) are called decision regions of (RP3).
Remark. $D(H)$ is almost everywhere nonsingular if and only if $u(\{H(w) \mid w \in \Omega,\ \mathrm{Det}(D(H(w))) = 0\}) = 0$, where $u$ is the Lebesgue measure on $R^r$.
In the theory of stochastic programming, the following conclusions have been proved (cf. [8, 14, 18]).

Theorem 4.3. (1) $W(H) = \min C^T X$ is a Borel measurable function (or a random variable) on $(\Omega, \mathscr{A})$. (2) If (RP3) has a feasible solution, there exists an optimization solution which is a random vector on $(\Omega, \mathscr{A})$.

Theorem 4.4. Assume that $P(\{w \mid -\infty < W(H(w)) < +\infty\}) = 1$ and there is a $w \in \Omega$ such that $\mathrm{Rank}(A(H(w))) = m$. If $f_H(s)$ is the probability density function of $H$, then the probability distribution function $F_{W(H)}(x)$ and the expectation $E(W(H))$ of $W(H)$ may be expressed by

$F_{W(H)}(x) = \sum_{i=1}^{p} \int_{\{H(w) \,\mid\, H(w) \in V_i,\ W(H(w)) \le x,\ w \in \Omega\}} f_H(s)\, \mathrm{d}s$,  (4.10)

$E(W(H)) = \sum_{i=1}^{p} \int_{V_i} W(H) \cdot f_H(s)\, \mathrm{d}s$,  where $x \in R$.  (4.11)
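In practice the integrals (4.10) and (4.11) can be approximated by sampling $H$ and solving each realized linear program. The sketch below does this with scipy's LP solver on a small two-variable random LP whose probability model is invented purely for illustration.

```python
# A Monte Carlo sketch of Theorem 4.4: sample H = (A, B, C), solve each realized
# LP, and estimate F_{W(H)}(x) and E(W(H)) from the optimal values.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)

def sample_and_solve():
    c = rng.uniform(1.0, 2.0, size=2)          # random objective coefficients
    b = rng.uniform(3.0, 5.0, size=2)          # random right-hand sides
    A_eq = np.array([[1.0, 1.0], [1.0, 2.0]])  # kept deterministic here
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * 2, method="highs")
    return res.fun if res.success else np.nan  # W(H(w)) = min C^T X

W = np.array([sample_and_solve() for _ in range(2000)])
W = W[~np.isnan(W)]                            # keep the feasible realizations

x = 6.0
print("F_W(x) ~", (W <= x).mean())             # estimate of (4.10)
print("E(W)   ~", W.mean())                    # estimate of (4.11)
```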
In the following, we study the solution and distribution problem of Model 1. We introduced the following model in Section 3.

Model 1 (Linear programming of objective function with fuzzy random variable coefficients). Find $X$ to minimize

$W(\tilde C, X) = \tilde C^T X$  (4.12a)

subject to

$A X \le B$,  (4.12b)
$X \ge 0$.  (4.12c)

Remark. All matrices and vectors are identical with those given in Section 3.

Theorem 4.5. Model 1 is equivalent to the following linear programming with random variable coefficients:

Model 1.1. Find $X$ to minimize

$W(\tilde C, X)_\alpha^- = (C_\alpha^-)^T X$,  (4.13a)
$W(\tilde C, X)_\alpha^+ = (C_\alpha^+)^T X$  (4.13b)

subject to

$A X \le B$, $X \ge 0$, for any $\alpha \in (0, 1]$,  (4.13c)

where $(\tilde c_j)_\alpha = [(c_j)_\alpha^-, (c_j)_\alpha^+]$, $C_\alpha^- = ((c_1)_\alpha^-, \ldots, (c_n)_\alpha^-)^T$, $C_\alpha^+ = ((c_1)_\alpha^+, \ldots, (c_n)_\alpha^+)^T$.
Proof. We first prove that

$(\tilde C^T X)_\alpha^- = (C_\alpha^-)^T X$, $(\tilde C^T X)_\alpha^+ = (C_\alpha^+)^T X$  (*)

for any $w \in \Omega$, $\alpha \in (0, 1]$, where $(\tilde C^T X)_\alpha = [(\tilde C^T X)_\alpha^-, (\tilde C^T X)_\alpha^+]$. By using Definition 2.1 and Theorem 2.3, for any $w \in \Omega$ we have

$[(\tilde C^T X)_\alpha^-, (\tilde C^T X)_\alpha^+] = (\tilde C^T X)_\alpha = \sum_{j=1}^{n} (\tilde c_j)_\alpha \cdot x_j = \sum_{j=1}^{n} [(c_j)_\alpha^-, (c_j)_\alpha^+] \cdot x_j = \left[\sum_{j=1}^{n} (c_j)_\alpha^- x_j,\ \sum_{j=1}^{n} (c_j)_\alpha^+ x_j\right] = [(C_\alpha^-)^T X,\ (C_\alpha^+)^T X]$.

Thus it follows that $(\tilde C^T X)_\alpha^- = (C_\alpha^-)^T X$ and $(\tilde C^T X)_\alpha^+ = (C_\alpha^+)^T X$ for any $w \in \Omega$, $\alpha \in (0, 1]$.
Now we prove that $\tilde W = \min \tilde C^T X$ if and only if $W_\alpha^- = \min (C_\alpha^-)^T X$ and $W_\alpha^+ = \min (C_\alpha^+)^T X$ for $\alpha \in (0, 1]$, $w \in \Omega$, where $\tilde W_\alpha = [W_\alpha^-, W_\alpha^+]$.

Necessity: Let $\tilde W = \min \tilde C^T X$. By Definition 2.4 and (*), we have

$[W_\alpha^-, W_\alpha^+] = (\min \tilde C^T X)_\alpha = [\min (\tilde C^T X)_\alpha^-,\ \min (\tilde C^T X)_\alpha^+] = [\min (C_\alpha^-)^T X,\ \min (C_\alpha^+)^T X]$ for any $\alpha \in (0, 1]$, $w \in \Omega$.

Therefore $W_\alpha^- = \min (C_\alpha^-)^T X$ and $W_\alpha^+ = \min (C_\alpha^+)^T X$ for any $\alpha \in (0, 1]$, $w \in \Omega$.

Sufficiency: Let $W_\alpha^- = \min (C_\alpha^-)^T X$, $W_\alpha^+ = \min (C_\alpha^+)^T X$. Using the result (*) above, we have

$\tilde W_\alpha = [W_\alpha^-, W_\alpha^+] = [\min (C_\alpha^-)^T X,\ \min (C_\alpha^+)^T X] = [\min (\tilde C^T X)_\alpha^-,\ \min (\tilde C^T X)_\alpha^+]$ for any $\alpha \in (0, 1]$, $w \in \Omega$,

and by Definition 2.4 it follows that $\tilde W = \min \tilde C^T X$. Consequently, Model 1 is equivalent to Model 1.1.

Remark. For any given $\alpha \in (0, 1]$, Model 1.1 is a multiobjective linear programming with random variable coefficients. We may find (absolute, Pareto or weak Pareto) optimization solutions for any given $\alpha \in (0, 1]$. Hence the optimization solutions of Model 1 may be obtained. We construct two programming problems from the coefficients in Model 1.1.
Model 1.2. Find $X$ to minimize

$W(\tilde C, X)_\alpha^- = (C_\alpha^-)^T X$  (4.13a)

subject to

$A X \le B$, $X \ge 0$, for any $\alpha \in (0, 1]$.  (4.13c)

Model 1.3. Find $X$ to minimize

$W(\tilde C, X)_\alpha^+ = (C_\alpha^+)^T X$  (4.13b)

subject to

$A X \le B$, $X \ge 0$, for any $\alpha \in (0, 1]$.  (4.13c)
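For a fixed $\alpha$ and one realized scenario $w$, Models 1.2 and 1.3 are ordinary linear programs that share the constraints and differ only in the objective vector ($C_\alpha^-$ or $C_\alpha^+$). A minimal sketch with scipy follows; the coefficient data are our own toy values.

```python
# A sketch of Models 1.2 and 1.3 at a fixed alpha for one realized scenario w:
# the two LPs share the constraints and differ only in the objective, which uses
# the lower or upper alpha-cut endpoints of the realized fuzzy costs c~_j(w).
import numpy as np
from scipy.optimize import linprog

alpha = 0.5
# alpha-cut endpoints (C_alpha^-, C_alpha^+) of the realized fuzzy costs
C_lo = np.array([1.0, 2.0]) - (1 - alpha) * np.array([0.5, 0.5])
C_hi = np.array([1.0, 2.0]) + (1 - alpha) * np.array([0.5, 0.5])

A = np.array([[-1.0, -1.0]])      # encodes x1 + x2 >= 3 as -x1 - x2 <= -3
B = np.array([-3.0])
bounds = [(0, None), (0, None)]

lo = linprog(C_lo, A_ub=A, b_ub=B, bounds=bounds, method="highs")
hi = linprog(C_hi, A_ub=A, b_ub=B, bounds=bounds, method="highs")
# [W_alpha^-, W_alpha^+] is the alpha-cut of the fuzzy optimal value for this w.
print("W_alpha cut:", lo.fun, hi.fun)
```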
Now we discuss the distribution problem of Model 1. In this discussion, we assume that Models 1, 1.2 and 1.3 have the equality constraint $A X = B$. Let $H_\alpha^- = (A, B, C_\alpha^-)$ be the $r$-random vector of all elements of $A$, $B$ and $C_\alpha^-$, and let $H_\alpha^+ = (A, B, C_\alpha^+)$ be the $r$-random vector of all elements of $A$, $B$ and $C_\alpha^+$, where $r = m + n + mn$.

Theorem 4.6. Assume that $W(H_\alpha^-) = \min (C_\alpha^-)^T X$, $W(H_\alpha^+) = \min (C_\alpha^+)^T X$, that $P(\{w \mid -\infty < W(H_\alpha^-(w)) < +\infty\}) = 1$, $P(\{w \mid -\infty < W(H_\alpha^+(w)) < +\infty\}) = 1$, and that there exists $w \in \Omega$ such that $\mathrm{Rank}(A(w)) = m$. If $f_{H_\alpha^-}(s)$ and $f_{H_\alpha^+}(s)$ are the probability density functions of $H_\alpha^-$ and $H_\alpha^+$ respectively, then we have the following conclusions:

(1) The probability distribution function $F_{W(H_\alpha^-)}(x)$ and the expectation of $W(H_\alpha^-)$ may be expressed by

$F_{W(H_\alpha^-)}(x) = \sum_{i=1}^{p} \int_{\{H_\alpha^-(w) \,\mid\, H_\alpha^-(w) \in (V_\alpha^-)_i,\ W(H_\alpha^-(w)) \le x,\ w \in \Omega\}} f_{H_\alpha^-}(s)\, \mathrm{d}s$,  (4.14)

$E(W(H_\alpha^-)) = \sum_{i=1}^{p} \int_{(V_\alpha^-)_i} W(H_\alpha^-) \cdot f_{H_\alpha^-}(s)\, \mathrm{d}s$,  (4.15)

where $(V_\alpha^-)_i$ ($i = 1, \ldots, p$) is a decision region of Model 1.2, $\alpha \in (0, 1]$, $x \in R$.
(2) The probability distribution function $F_{W(H_\alpha^+)}(x)$ and the expectation of $W(H_\alpha^+)$ may be expressed by

$F_{W(H_\alpha^+)}(x) = \sum_{j=1}^{q} \int_{\{H_\alpha^+(w) \,\mid\, H_\alpha^+(w) \in (V_\alpha^+)_j,\ W(H_\alpha^+(w)) \le x,\ w \in \Omega\}} f_{H_\alpha^+}(s)\, \mathrm{d}s$,  (4.16)

$E(W(H_\alpha^+)) = \sum_{j=1}^{q} \int_{(V_\alpha^+)_j} W(H_\alpha^+) \cdot f_{H_\alpha^+}(s)\, \mathrm{d}s$,  (4.17)

where $(V_\alpha^+)_j$ ($j = 1, \ldots, q$) is a decision region of Model 1.3, $\alpha \in (0, 1]$, $x \in R$.

(3) The fuzzy probability distribution function of $\tilde W = \min \tilde C^T X$ of Model 1 may be expressed by

$F_{\tilde W}(x) = \bigcup_{\alpha \in (0, 1]} \alpha [F_{W(H_\alpha^+)}(x),\ F_{W(H_\alpha^-)}(x)]$,  where $x \in R$.  (4.18)

(4) The projective distribution function of $\tilde W = \min \tilde C^T X$ may be expressed by

$PD_\alpha(x) = F_{W(H_\alpha^-)}(x + 0) - F_{W(H_\alpha^+)}(x)$, $x \in R$,  (4.19)

where $F_{W(H_\alpha^-)}(x + 0) = P(\{w \mid w \in \Omega,\ W(H_\alpha^-(w)) \le x\})$.

(5) The expectation of $\tilde W = \min \tilde C^T X$ may be expressed by

$E(\tilde W) = \bigcup_{\alpha \in (0, 1]} \alpha [E(W(H_\alpha^-)),\ E(W(H_\alpha^+))]$.  (4.20)
Proof. (1) and (2) follow easily from Theorem 4.4. Now we prove (3)-(5).

(3) From the proof of Theorem 4.5 we know that $\tilde W = \min \tilde C^T X$ if and only if $W_\alpha^- = \min (C_\alpha^-)^T X$ and $W_\alpha^+ = \min (C_\alpha^+)^T X$, where $\tilde W_\alpha = [W_\alpha^-, W_\alpha^+]$. Hence $\{w \mid W_\alpha^-(w) \le x\} = \{w \mid W(H_\alpha^-(w)) \le x\}$ and $\{w \mid W_\alpha^+(w) \le x\} = \{w \mid W(H_\alpha^+(w)) \le x\}$, and therefore

$F_{\tilde W}(x) = \bigcup_{\alpha \in (0, 1]} \alpha [F_{W(H_\alpha^+)}(x),\ F_{W(H_\alpha^-)}(x)]$.

(4) This follows from the result given in Section 2 (Definition 2.7).

(5) This follows from Definition 2.8 and the fact that $\tilde W = \min \tilde C^T X$ if and only if $W_\alpha^- = W(H_\alpha^-)$ and $W_\alpha^+ = W(H_\alpha^+)$.
W~=W(H:,). D. Dubois and H. Prade [5] introduced the concept of L-R fuzzy number. Definit;on 4.7. f ~ .~(R) is called an L-R fuzzy number, if its membership function may be expressed by
~sl((m - x)la) f ( x ) = [s2((x - m)/b)
if x ~< m, a > 0, if x >~m, b > 0
(4.21)
where sl and s2 are two reference functions (cf. [5]) . ~ ( R ) = {f [f:R---> [0, 1]}. Denote this L-R fuzzy number by f = (m, a, b),.i,. Theorem 4.8. Let f = (m, a, b ).,i,, g = (n, c, d).,t, be L-R filzzy numbers. Then (1) f +g =(m + n , a + c , b +d)~lr. (2) f - g = (m - n , a +d, b +c)sm ifst =s2.
(rm, ra, rb ).,./, if r > 0 ' -rb, -ra)~t, ifr
(3) r ' f = t ( r m
where r e R.
(4) $f \le g$ if and only if $m \le n$, $a \ge c$, $b \le d$.
(5) $f \wedge g = (m \wedge n, a \vee c, b \wedge d)_{s_1 s_2}$.

In the following, we discuss the case in which the elements of $\tilde C$ are L-R fuzzy numbers in $\mathscr{F}_0(R)$ for any given $w \in \Omega$. Denote

$\tilde c_j(w) = (c_j(w), \underline{c}_j(w), \bar{c}_j(w))_{s_1 s_2}$,  (4.22)

$C = (c_1(w), \ldots, c_n(w))^T$, $\underline{C} = (\underline{c}_1(w), \ldots, \underline{c}_n(w))^T$, $\bar{C} = (\bar{c}_1(w), \ldots, \bar{c}_n(w))^T$.
We have the following theorem.

Theorem 4.9. If $\tilde c_j(w)$, $j = 1, \ldots, n$, are L-R fuzzy numbers in $\mathscr{F}_0(R)$ satisfying (4.22) for any given $w \in \Omega$, then Model 1 is approximately equivalent to the following Model 1.4: Find $X$ to minimize

$W_1 = C^T X$, $W_2 = -\underline{C}^T X$, $W_3 = \bar{C}^T X$  (4.23a)

subject to

$A X \le B$,  (4.23b)
$X \ge 0$.  (4.23c)

Proof. By Theorem 4.8(5), we may prove this theorem.
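A minimal sketch of the L-R arithmetic of Theorem 4.8 and of the three crisp objectives of Model 1.4 is given below; the class, the field names and the sample data are our own illustration (the reference functions $s_1, s_2$ are left implicit, since only the triple $(m, a, b)$ enters the computation).

```python
# A sketch of the L-R arithmetic of Theorem 4.8 and of the three objective
# values used in Model 1.4.  An L-R number is stored as (m, a, b).
from dataclasses import dataclass

@dataclass(frozen=True)
class LR:
    m: float   # mean value
    a: float   # left spread  (a > 0)
    b: float   # right spread (b > 0)

    def __add__(self, g):                 # Theorem 4.8(1)
        return LR(self.m + g.m, self.a + g.a, self.b + g.b)

    def __sub__(self, g):                 # Theorem 4.8(2), assuming s1 = s2
        return LR(self.m - g.m, self.a + g.b, self.b + g.a)

    def scale(self, r):                   # Theorem 4.8(3)
        return LR(r * self.m, r * self.a, r * self.b) if r > 0 else LR(r * self.m, -r * self.b, -r * self.a)

    def __le__(self, g):                  # Theorem 4.8(4)
        return self.m <= g.m and self.a >= g.a and self.b <= g.b

# Model 1.4 for one realized scenario: with c~_j(w) = (c_j, c_j_low, c_j_up),
# the three crisp objective values at a point X are C^T X, -C_low^T X, C_up^T X.
C = [LR(3.0, 0.5, 1.0), LR(2.0, 0.2, 0.4)]
X = [1.0, 2.0]
W1 = sum(c.m * x for c, x in zip(C, X))          # C^T X
W2 = -sum(c.a * x for c, x in zip(C, X))         # -C_low^T X
W3 = sum(c.b * x for c, x in zip(C, X))          # C_up^T X
print(W1, W2, W3)
```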
In the following, we discuss a numerical example.

Example 4.10. Suppose that $r = r(w)$ ($w \in \Omega$) is a random variable taking values in the interval $[0, +\infty)$ continuously. We take a fuzzy random variable $\tilde a : \Omega \to \mathscr{F}_0(R)$ with the following membership function:

$\tilde a(w) \triangleq \tilde a(r, t) = t + 1 - r$ if $t \in [r - 1, r]$;  $\tilde a(r, t) = -t + 1 + r$ if $t \in (r, r + 1]$;  $\tilde a(r, t) = 0$ otherwise,
where $w \in \Omega$ (see Figure 1). Let us consider the problem (E1): Find $X = (x_1, x_2)$ to minimize

$W(\tilde a, X) = \tilde a \cdot x_1 + x_2$  (1°)

subject to

$x_1 - x_2 \ge -1$,  (2°)
$x_1 + x_2 \ge r$,  (3°)
$x_1 \ge 0$, $x_2 \ge 0$.  (4°)

This is a linear programming whose objective function contains a fuzzy random variable coefficient $\tilde a$.

Fig. 1. Representation of the membership function of $\tilde a(w)$ (axis marks at $r - 1$, $r$, $r + 1$).
By Theorem 4.5, (E1) is equivalent to the following problem (E2): Find $X = (x_1, x_2)$ to minimize

$W(\tilde a, X)_\alpha^- = (\alpha + r - 1) x_1 + x_2$,  (5°)
$W(\tilde a, X)_\alpha^+ = (-\alpha + r + 1) x_1 + x_2$  (6°)

subject to

$x_1 - x_2 \ge -1$,  (2°)
$x_1 + x_2 \ge r$,  (3°)
$x_1 \ge 0$, $x_2 \ge 0$,  (4°)

for any $\alpha \in (0, 1]$. (E2) is a linear programming with random variable coefficients. We introduce two slack variables $x_3, x_4$ such that $x_3 \ge 0$ and $x_4 \ge 0$. The inequality constraints (2°), (3°) may then be written as

$-x_1 + x_2 + x_3 = 1$,  (7°)
$x_1 + x_2 - x_4 = r$.  (8°)
Through calculation we obtain the following programming: Minimize

$W(\tilde a, X)_\alpha^- = \tfrac{1}{2}(r^2 + (\alpha - 1) r + (2 - \alpha)) + \tfrac{1}{2}(\alpha + r - 2) x_3 + \tfrac{1}{2}(\alpha + r) x_4$,  (9°)
$W(\tilde a, X)_\alpha^+ = \tfrac{1}{2}(r^2 + (1 - \alpha) r + \alpha) + \tfrac{1}{2}(r - \alpha) x_3 + \tfrac{1}{2}(r - \alpha + 2) x_4$  (10°)

subject to

$x_1 - \tfrac{1}{2} x_3 - \tfrac{1}{2} x_4 = \tfrac{1}{2}(r - 1)$,  (11°)
$x_2 + \tfrac{1}{2} x_3 - \tfrac{1}{2} x_4 = \tfrac{1}{2}(r + 1)$,  (12°)
$x_1, x_2, x_3, x_4 \ge 0$,

for any $\alpha \in (0, 1]$.
When $\alpha + r > 2$ (in particular when $r > 2$), all coefficients of $x_3$ and $x_4$ in (9°) and (10°) are positive. It follows from Theorem 4.1 that $(\tfrac{1}{2}(r - 1), \tfrac{1}{2}(r + 1), 0, 0)$ is an absolute optimization solution, and therefore $X^* = (\tfrac{1}{2}(r - 1), \tfrac{1}{2}(r + 1))$ is an absolute optimization solution of (E2), or (E1), for any $\alpha, r$ such that $\alpha + r > 2$ (resp. $r > 2$); $X^*$ is a random vector on $\Omega$. Substituting $(\tfrac{1}{2}(r - 1), \tfrac{1}{2}(r + 1), 0, 0)$ into (9°) and (10°), we have

$\min W(\tilde a, X)_\alpha^- = \tfrac{1}{2}(r^2 + (\alpha - 1) r + (2 - \alpha))$,  (13°)
$\min W(\tilde a, X)_\alpha^+ = \tfrac{1}{2}(r^2 + (1 - \alpha) r + \alpha)$  (14°)

whenever $\alpha + r > 2$ (resp. $r > 2$). These are the optimization values of (E2); evidently, they are also random variables on $\Omega$ (see Figure 2). We call the fuzzy random variable $\tilde W(r)$ defined by

$\tilde W(r) \triangleq \bigcup_{\alpha \in (0, 1]} \alpha [\min W(\tilde a, X)_\alpha^-,\ \min W(\tilde a, X)_\alpha^+]$  (15°)

a fuzzy random optimization value of (E1). Furthermore, if we find the probability distribution functions of $\min W(\tilde a, X)_\alpha^-$ and $\min W(\tilde a, X)_\alpha^+$ from the probability distribution function of $r = r(w)$, we may also obtain the fuzzy probability distribution function of $\tilde W(r)$.
Fig. 2. Representation of the programming (E2) ($\alpha + r > 2$ or $r > 2$).
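The closed-form optima (13°) and (14°) can be checked numerically: for one sampled $r$ and one $\alpha$ with $\alpha + r > 2$, solve (E2) with an off-the-shelf LP solver and compare. The sketch below uses scipy and an exponential law for $r$, both of which are our own choices, not the paper's.

```python
# A numerical check of Example 4.10: for one sampled r(w) and one alpha with
# alpha + r > 2, solve (E2) and compare with the closed-form optima (13), (14).
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(2)
r = 2.0 + rng.exponential(1.0)      # ensures r > 2, hence alpha + r > 2
alpha = 0.5

# constraints of (E1)/(E2): x1 - x2 >= -1 and x1 + x2 >= r, written as <=
A_ub = np.array([[-1.0, 1.0], [-1.0, -1.0]])
b_ub = np.array([1.0, -r])
bounds = [(0, None), (0, None)]

lo = linprog([alpha + r - 1, 1.0], A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
hi = linprog([-alpha + r + 1, 1.0], A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")

W_lo = 0.5 * (r**2 + (alpha - 1) * r + (2 - alpha))   # (13)
W_hi = 0.5 * (r**2 + (1 - alpha) * r + alpha)         # (14)
print(lo.x, hi.x)                     # both should be ((r-1)/2, (r+1)/2)
print(lo.fun - W_lo, hi.fun - W_hi)   # both differences should be ~ 0
```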
5. The solution and distribution problem on Model 2

In this section, we shall discuss the solution and distribution problem of Model 2. All matrices and vectors used in this section are identical with those given in Section 3, where we gave the following model.

Model 2 (Linear programming with fuzzy random variable coefficients and decision vector $\tilde X$). Find $\tilde X$ to minimize

$W(\tilde C, \tilde X) = \tilde C^T \tilde X$  (5.1a)

subject to

$\tilde A \tilde X \le \tilde B$,  (5.1b)
$\tilde X \ge 0$,  (5.1c)

for any $\alpha \in (0, 1]$. Since $\tilde a_{ij}, \tilde b_i, \tilde c_j \in FR(\Omega)$, for any $\alpha \in (0, 1]$ their $\alpha$-level cuts are random interval numbers. Denote them by

$(\tilde a_{ij})_\alpha = [(a_{ij})_\alpha^-, (a_{ij})_\alpha^+]$, $(\tilde b_i)_\alpha = [(b_i)_\alpha^-, (b_i)_\alpha^+]$, $(\tilde c_j)_\alpha = [(c_j)_\alpha^-, (c_j)_\alpha^+]$, $i = 1, \ldots, m$; $j = 1, \ldots, n$.

Moreover, we denote $(\tilde x_j)_\alpha = [(x_j)_\alpha^-, (x_j)_\alpha^+]$, $j = 1, \ldots, n$.
Remark. Observe that $\tilde x_j \in \{\tilde x \mid \tilde x : \Omega \to \mathscr{F}_0(R)\}$, so $(x_j)_\alpha^-$ and $(x_j)_\alpha^+$ may not be random variables on $\Omega$ (cf. Section 2).

Write

$A_\alpha^- = ((a_{ij})_\alpha^-)_{m \times n}$, $A_\alpha^+ = ((a_{ij})_\alpha^+)_{m \times n}$,
$B_\alpha^- = ((b_1)_\alpha^-, \ldots, (b_m)_\alpha^-)^T$, $B_\alpha^+ = ((b_1)_\alpha^+, \ldots, (b_m)_\alpha^+)^T$,
$C_\alpha^- = ((c_1)_\alpha^-, \ldots, (c_n)_\alpha^-)^T$, $C_\alpha^+ = ((c_1)_\alpha^+, \ldots, (c_n)_\alpha^+)^T$,
$X_\alpha^- = ((x_1)_\alpha^-, \ldots, (x_n)_\alpha^-)^T$, $X_\alpha^+ = ((x_1)_\alpha^+, \ldots, (x_n)_\alpha^+)^T$,

$\tilde G = \{\tilde X \mid \tilde A \tilde X \le \tilde B,\ \tilde X \ge 0\}$,
$G_\alpha^- = \{X_\alpha \mid A_\alpha^- X_\alpha \le B_\alpha^-,\ X_\alpha \ge 0\}$,
$G_\alpha^+ = \{X_\alpha \mid A_\alpha^+ X_\alpha \le B_\alpha^+,\ X_\alpha \ge 0\}$,

where $X_\alpha = ((x_1)_\alpha, \ldots, (x_n)_\alpha)^T$ and $(x_j)_\alpha \in \{x \mid x : \Omega \to R\}$, $j = 1, \ldots, n$.
For convenience, we rewrite Model 2 as follows.

Model 2. Find $\tilde X^*$ from $\tilde G$ such that

$\tilde C^T \tilde X^* = \min_{\tilde X \in \tilde G} \tilde C^T \tilde X$.  (5.2)

Definition 5.1. The $\tilde X^*$ which satisfies the condition in Model 2 is called a fuzzy pseudorandom optimization solution of Model 2. Furthermore, if $\tilde X^*$ is a fuzzy random vector on $\Omega$, then it is called a fuzzy random optimization solution of Model 2.

Corresponding to Model 2, we construct the following two programmings.

Model 2.1. Find $X_\alpha^*$ from $G_\alpha^-$ such that

$(C_\alpha^-)^T X_\alpha^* = \min_{X_\alpha \in G_\alpha^-} (C_\alpha^-)^T X_\alpha$,  (5.3)

for any $\alpha \in (0, 1]$.
Model 2.2. Find $X_\alpha^*$ from $G_\alpha^+$ such that

$(C_\alpha^+)^T X_\alpha^* = \min_{X_\alpha \in G_\alpha^+} (C_\alpha^+)^T X_\alpha$,  (5.4)

for any $\alpha \in (0, 1]$.

Remark. Models 2.1 and 2.2 are linear programmings with random variable coefficients, which can be solved by the simplex method given in [20].

In order to study the relation between the optimization solutions of Models 2, 2.1 and 2.2, we first prove a lemma.

Lemma 5.2. Let $\tilde A \ge 0$ (i.e. for all $\tilde a_{ij}$ in $\tilde A$ we have $(\tilde a_{ij})_\alpha \ge 0$ for any $\alpha \in (0, 1]$). If $\tilde X \in \tilde G$, then $X_\alpha^- \in G_\alpha^-$ and $X_\alpha^+ \in G_\alpha^+$ for any $\alpha \in (0, 1]$.

Proof. Let $\tilde X \in \tilde G$. It follows from Definitions 2.6 and 2.2 that
$\sum_{j=1}^{n} (\tilde a_{ij})_\alpha \cdot (\tilde x_j)_\alpha \le (\tilde b_i)_\alpha$  for any $\alpha \in (0, 1]$, $i = 1, \ldots, m$.

Observe that $(\tilde x_j)_\alpha \ge 0$ and $(\tilde a_{ij})_\alpha \ge 0$ for any $i, j$. Using Definition 2.1 and Theorem 2.3, we have

$\sum_{j=1}^{n} (\tilde a_{ij})_\alpha \cdot (\tilde x_j)_\alpha = \sum_{j=1}^{n} [(a_{ij})_\alpha^-, (a_{ij})_\alpha^+] \cdot [(x_j)_\alpha^-, (x_j)_\alpha^+] = \sum_{j=1}^{n} [(a_{ij})_\alpha^- (x_j)_\alpha^-,\ (a_{ij})_\alpha^+ (x_j)_\alpha^+] = \left[\sum_{j=1}^{n} (a_{ij})_\alpha^- (x_j)_\alpha^-,\ \sum_{j=1}^{n} (a_{ij})_\alpha^+ (x_j)_\alpha^+\right]$.

Thus we obtain

$\sum_{j=1}^{n} (a_{ij})_\alpha^- (x_j)_\alpha^- \le (b_i)_\alpha^-$,  $\sum_{j=1}^{n} (a_{ij})_\alpha^+ (x_j)_\alpha^+ \le (b_i)_\alpha^+$

for every $i = 1, \ldots, m$. Consequently, $X_\alpha^- \in G_\alpha^-$ and $X_\alpha^+ \in G_\alpha^+$ for any $\alpha \in (0, 1]$.

The following theorems show the relation between the optimization solutions of Models 2, 2.1 and 2.2.
Theorem 5.3. Suppose that $\tilde A \ge 0$, $\tilde C \ge 0$, $G_\alpha^- = \{X_\alpha^- \mid \tilde X \in \tilde G\}$ and $G_\alpha^+ = \{X_\alpha^+ \mid \tilde X \in \tilde G\}$, where $\alpha \in (0, 1]$. If $\tilde X^*$ is a fuzzy pseudorandom (resp. fuzzy random) optimization solution of Model 2, then for any $\alpha \in (0, 1]$ we have:
(1) $(X^*)_\alpha^-$ is a pseudorandom (resp. random) optimization solution of Model 2.1.
(2) $(X^*)_\alpha^+$ is a pseudorandom (resp. random) optimization solution of Model 2.2.
(3) $W_\alpha^- = \min_{X_\alpha \in G_\alpha^-} (C_\alpha^-)^T X_\alpha$, $W_\alpha^+ = \min_{X_\alpha \in G_\alpha^+} (C_\alpha^+)^T X_\alpha$, where $\tilde W = \min_{\tilde X \in \tilde G} \tilde C^T \tilde X$ and $\tilde W_\alpha = [W_\alpha^-, W_\alpha^+]$.
Proof. Let $\tilde X^*$ be a fuzzy pseudorandom (resp. fuzzy random) optimization solution of Model 2. We have

$\tilde X^* \in \tilde G$ and $\tilde C^T \tilde X^* = \min_{\tilde X \in \tilde G} \tilde C^T \tilde X$.  (5.5)

First of all, from Lemma 5.2 and the conditions of the theorem, we can see that

$(X^*)_\alpha^- \in G_\alpha^-$, $(X^*)_\alpha^+ \in G_\alpha^+$, $G_\alpha^- = \{X_\alpha^- \mid \tilde X \in \tilde G\}$ and $G_\alpha^+ = \{X_\alpha^+ \mid \tilde X \in \tilde G\}$.
Since $\tilde C \ge 0$, it follows by Theorem 2.3 and Definition 2.1 that

$[(C_\alpha^-)^T (X^*)_\alpha^-,\ (C_\alpha^+)^T (X^*)_\alpha^+] = \left[\sum_{j=1}^{n} (c_j)_\alpha^- (x_j^*)_\alpha^-,\ \sum_{j=1}^{n} (c_j)_\alpha^+ (x_j^*)_\alpha^+\right] = \sum_{j=1}^{n} [(c_j)_\alpha^-, (c_j)_\alpha^+] \cdot [(x_j^*)_\alpha^-, (x_j^*)_\alpha^+] = \sum_{j=1}^{n} (\tilde c_j)_\alpha \cdot (\tilde x_j^*)_\alpha = (\tilde C^T \tilde X^*)_\alpha$.  (5.6)

Furthermore, using Definitions 2.1, 2.6, 2.4 and Theorem 2.3, we have

$\left(\min_{\tilde X \in \tilde G} \tilde C^T \tilde X\right)_\alpha = \left[\min_{\tilde X \in \tilde G} (\tilde C^T \tilde X)_\alpha^-,\ \min_{\tilde X \in \tilde G} (\tilde C^T \tilde X)_\alpha^+\right] = \left[\min_{\tilde X \in \tilde G} \sum_{j=1}^{n} (c_j)_\alpha^- (x_j)_\alpha^-,\ \min_{\tilde X \in \tilde G} \sum_{j=1}^{n} (c_j)_\alpha^+ (x_j)_\alpha^+\right] = \left[\min_{X_\alpha^- \in \{X_\alpha^- \mid \tilde X \in \tilde G\}} (C_\alpha^-)^T X_\alpha^-,\ \min_{X_\alpha^+ \in \{X_\alpha^+ \mid \tilde X \in \tilde G\}} (C_\alpha^+)^T X_\alpha^+\right] = \left[\min_{X_\alpha \in G_\alpha^-} (C_\alpha^-)^T X_\alpha,\ \min_{X_\alpha \in G_\alpha^+} (C_\alpha^+)^T X_\alpha\right]$.  (5.7)

Equation (5.7) shows that part (3) of this theorem is true. Synthesizing (5.5), (5.6) and (5.7), we obtain

$(C_\alpha^-)^T (X^*)_\alpha^- = \min_{X_\alpha \in G_\alpha^-} (C_\alpha^-)^T X_\alpha$,  $(C_\alpha^+)^T (X^*)_\alpha^+ = \min_{X_\alpha \in G_\alpha^+} (C_\alpha^+)^T X_\alpha$.

These two equalities show that parts (1) and (2) of this theorem are true as well.
Theorem 5.4. Assume that $\tilde A \ge 0$, $\tilde C \ge 0$, $G_\alpha^- = \{X_\alpha^- \mid \tilde X \in \tilde G\}$ and $G_\alpha^+ = \{X_\alpha^+ \mid \tilde X \in \tilde G\}$, where $\alpha \in (0, 1]$. If $X'_\alpha = (x'_1(w, \alpha), \ldots, x'_n(w, \alpha))^T$ and $X''_\alpha = (x''_1(w, \alpha), \ldots, x''_n(w, \alpha))^T$ are pseudorandom (resp. random) optimization solutions of Models 2.1 and 2.2 respectively, and there exists $\tilde X^* = (\tilde x_1^*, \ldots, \tilde x_n^*)^T$ such that $(\tilde x_j^*)_\alpha = [x'_j(w, \alpha), x''_j(w, \alpha)]$ for any $\alpha \in (0, 1]$ and any $j$ (where $w \in \Omega$), then:
(1) $\tilde X^*$ is a fuzzy pseudorandom (resp. fuzzy random) optimization solution of Model 2.
(2) $W_\alpha^- = \min_{X_\alpha \in G_\alpha^-} (C_\alpha^-)^T X_\alpha$, $W_\alpha^+ = \min_{X_\alpha \in G_\alpha^+} (C_\alpha^+)^T X_\alpha$, where $\tilde W = \min_{\tilde X \in \tilde G} \tilde C^T \tilde X$ and $\tilde W_\alpha = [W_\alpha^-, W_\alpha^+]$.
Proof. It is sufficient to prove that

$\tilde X^* \in \tilde G$ and $\tilde C^T \tilde X^* = \min_{\tilde X \in \tilde G} \tilde C^T \tilde X$.

In fact, the conditions of the theorem tell us that

$X'_\alpha \in G_\alpha^-$ and $(C_\alpha^-)^T X'_\alpha = \min_{X_\alpha \in G_\alpha^-} (C_\alpha^-)^T X_\alpha$,  (5.8)

$X''_\alpha \in G_\alpha^+$ and $(C_\alpha^+)^T X''_\alpha = \min_{X_\alpha \in G_\alpha^+} (C_\alpha^+)^T X_\alpha$.  (5.9)
First of all, we prove $\tilde X^* \in \tilde G$. By using Definition 2.1, Theorem 2.3 and the conditions $\tilde A \ge 0$, $\tilde X^* \ge 0$, it follows that, for any $\alpha \in (0, 1]$, $i = 1, \ldots, m$,

$\left(\sum_{j=1}^{n} \tilde a_{ij} \tilde x_j^*\right)_\alpha = \sum_{j=1}^{n} (\tilde a_{ij})_\alpha \cdot (\tilde x_j^*)_\alpha = \sum_{j=1}^{n} [(a_{ij})_\alpha^-, (a_{ij})_\alpha^+] \cdot [x'_j(w, \alpha), x''_j(w, \alpha)] = \sum_{j=1}^{n} [(a_{ij})_\alpha^- x'_j(w, \alpha),\ (a_{ij})_\alpha^+ x''_j(w, \alpha)] = \left[\sum_{j=1}^{n} (a_{ij})_\alpha^- x'_j(w, \alpha),\ \sum_{j=1}^{n} (a_{ij})_\alpha^+ x''_j(w, \alpha)\right]$.

Since $X'_\alpha \in G_\alpha^-$ and $X''_\alpha \in G_\alpha^+$, we have

$\left(\sum_{j=1}^{n} \tilde a_{ij} \tilde x_j^*\right)_\alpha \le (\tilde b_i)_\alpha$  for any $\alpha \in (0, 1]$.

Using Definition 2.2, we know that $\tilde A \tilde X^* \le \tilde B$ and $\tilde X^* \ge 0$, namely, $\tilde X^* \in \tilde G$.

Furthermore, on the one hand, it follows from Definition 2.1, Theorem 2.3 and $\tilde C \ge 0$ that for any $\alpha \in (0, 1]$,

$(\tilde C^T \tilde X^*)_\alpha = \sum_{j=1}^{n} (\tilde c_j)_\alpha \cdot (\tilde x_j^*)_\alpha = \sum_{j=1}^{n} [(c_j)_\alpha^-, (c_j)_\alpha^+] \cdot [x'_j(w, \alpha), x''_j(w, \alpha)] = \sum_{j=1}^{n} [(c_j)_\alpha^- x'_j(w, \alpha),\ (c_j)_\alpha^+ x''_j(w, \alpha)] = [(C_\alpha^-)^T X'_\alpha,\ (C_\alpha^+)^T X''_\alpha]$.  (5.10)

On the other hand, by Lemma 5.2 and the conditions of this theorem, we can see that $G_\alpha^- = \{X_\alpha^- \mid \tilde X \in \tilde G\}$ and $G_\alpha^+ = \{X_\alpha^+ \mid \tilde X \in \tilde G\}$. Hence, for any $\alpha \in (0, 1]$, by using (5.7) given in the proof of Theorem 5.3, we obtain

$\left(\min_{\tilde X \in \tilde G} \tilde C^T \tilde X\right)_\alpha = \left[\min_{X_\alpha \in G_\alpha^-} (C_\alpha^-)^T X_\alpha,\ \min_{X_\alpha \in G_\alpha^+} (C_\alpha^+)^T X_\alpha\right]$.  (5.11)

Synthesizing (5.8), (5.9), (5.10) and (5.11), we have

$(\tilde C^T \tilde X^*)_\alpha = \left(\min_{\tilde X \in \tilde G} \tilde C^T \tilde X\right)_\alpha$  for any $\alpha \in (0, 1]$.

Therefore $\tilde C^T \tilde X^* = \min_{\tilde X \in \tilde G} \tilde C^T \tilde X$, that is, part (1) of this theorem is true. From (5.11), we can see that part (2) of the theorem is true as well.
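For a fixed $\alpha$ and one realized scenario, Theorem 5.4 reduces Model 2 to the pair of crisp problems Model 2.1 and Model 2.2 built from the lower and upper $\alpha$-cut endpoints of $\tilde A$, $\tilde B$, $\tilde C$. The sketch below illustrates this with scipy, using the equality-constraint form adopted for the distribution problem just below (so that the minimum is not trivially attained at $X = 0$); all interval data are invented and satisfy $\tilde A \ge 0$, $\tilde C \ge 0$.

```python
# A sketch of Theorem 5.4 for one realized scenario w and one alpha: Model 2.1
# uses the lower alpha-cut endpoints (A_alpha^-, B_alpha^-, C_alpha^-) and
# Model 2.2 the upper ones; together they give the cut [W_alpha^-, W_alpha^+].
import numpy as np
from scipy.optimize import linprog

# lower / upper alpha-cut endpoints of the realized fuzzy coefficients
A_lo = np.array([[1.0, 2.0]]);  A_hi = np.array([[1.5, 2.5]])
B_lo = np.array([4.0]);         B_hi = np.array([5.0])
C_lo = np.array([2.0, 3.0]);    C_hi = np.array([2.5, 3.5])
bounds = [(0, None), (0, None)]

# Model 2.1: min (C_alpha^-)^T X  s.t.  A_alpha^- X = B_alpha^-, X >= 0
lo = linprog(C_lo, A_eq=A_lo, b_eq=B_lo, bounds=bounds, method="highs")
# Model 2.2: min (C_alpha^+)^T X  s.t.  A_alpha^+ X = B_alpha^+, X >= 0
hi = linprog(C_hi, A_eq=A_hi, b_eq=B_hi, bounds=bounds, method="highs")
print("cut of the fuzzy optimal value:", [lo.fun, hi.fun])
```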
In the following, we study the distribution problem of Model 2. We always assume that Models 2, 2.1 and 2.2 possess the equality constraints

$\tilde A \tilde X = \tilde B$, $A_\alpha^- X_\alpha = B_\alpha^-$, $A_\alpha^+ X_\alpha = B_\alpha^+$, $\tilde X \ge 0$, $X_\alpha \ge 0$,

respectively. Let $H_\alpha^- = (A_\alpha^-, B_\alpha^-, C_\alpha^-)$ be the $r$-random vector of all elements of $A_\alpha^-$, $B_\alpha^-$ and $C_\alpha^-$, and let $H_\alpha^+ = (A_\alpha^+, B_\alpha^+, C_\alpha^+)$ be the $r$-random vector of all elements of $A_\alpha^+$, $B_\alpha^+$ and $C_\alpha^+$, where $r = m + n + mn$. We have the following theorem on the distribution problem of Model 2.

Theorem 5.5. Suppose that Models 2.1, 2.2 and 2 satisfy the same conditions as in Theorem 5.4. Denote

$W(H_\alpha^-) = \min_{X_\alpha \in G_\alpha^-} (C_\alpha^-)^T X_\alpha$,  $W(H_\alpha^+) = \min_{X_\alpha \in G_\alpha^+} (C_\alpha^+)^T X_\alpha$,

and assume that $P(\{w \mid -\infty < W(H_\alpha^-) < +\infty\}) = 1$, $P(\{w \mid -\infty < W(H_\alpha^+) < +\infty\}) = 1$, and there exists $w \in \Omega$ such that $\mathrm{Rank}(A_\alpha^-(w)) = \mathrm{Rank}(A_\alpha^+(w)) = m$. If $f_{H_\alpha^-}(s)$ and $f_{H_\alpha^+}(s)$ are the probability density functions of $H_\alpha^-$ and $H_\alpha^+$ respectively, then we have the following conclusions:
(1) The probability distribution function $F_{W(H_\alpha^-)}(x)$ and the expectation of $W(H_\alpha^-)$ may be expressed by

$F_{W(H_\alpha^-)}(x) = \sum_{i=1}^{p} \int_{\{H_\alpha^-(w) \,\mid\, H_\alpha^-(w) \in (V_\alpha^-)_i,\ W(H_\alpha^-(w)) \le x,\ w \in \Omega\}} f_{H_\alpha^-}(s)\, \mathrm{d}s$,  (5.12)

$E(W(H_\alpha^-)) = \sum_{i=1}^{p} \int_{(V_\alpha^-)_i} W(H_\alpha^-) \cdot f_{H_\alpha^-}(s)\, \mathrm{d}s$,  (5.13)

where $(V_\alpha^-)_i$ ($i = 1, \ldots, p$) is a decision region of Model 2.1, $\alpha \in (0, 1]$, $x \in R$.

(2) The probability distribution function $F_{W(H_\alpha^+)}(x)$ and the expectation of $W(H_\alpha^+)$ may be expressed by

$F_{W(H_\alpha^+)}(x) = \sum_{j=1}^{q} \int_{\{H_\alpha^+(w) \,\mid\, H_\alpha^+(w) \in (V_\alpha^+)_j,\ W(H_\alpha^+(w)) \le x,\ w \in \Omega\}} f_{H_\alpha^+}(s)\, \mathrm{d}s$,  (5.14)

$E(W(H_\alpha^+)) = \sum_{j=1}^{q} \int_{(V_\alpha^+)_j} W(H_\alpha^+) \cdot f_{H_\alpha^+}(s)\, \mathrm{d}s$,  (5.15)

where $(V_\alpha^+)_j$ ($j = 1, \ldots, q$) is a decision region of Model 2.2, $\alpha \in (0, 1]$, $x \in R$.

(3) Denote $\tilde W = \min_{\tilde X \in \tilde G} \tilde C^T \tilde X$. Then the fuzzy probability distribution function of $\tilde W$ may be expressed by

$F_{\tilde W}(x) = \bigcup_{\alpha \in (0, 1]} \alpha [F_{W(H_\alpha^+)}(x),\ F_{W(H_\alpha^-)}(x)]$,  where $x \in R$.  (5.16)

(4) The projective distribution function of $\tilde W$ may be expressed by

$PD_\alpha(x) = F_{W(H_\alpha^-)}(x + 0) - F_{W(H_\alpha^+)}(x)$,  (5.17)

where $F_{W(H_\alpha^-)}(x + 0) = P(\{w \mid w \in \Omega,\ W(H_\alpha^-(w)) \le x\})$, $x \in R$.

(5) The expectation of $\tilde W$ may be expressed by

$E(\tilde W) = \bigcup_{\alpha \in (0, 1]} \alpha [E(W(H_\alpha^-)),\ E(W(H_\alpha^+))]$.  (5.18)
By using Definitions 2.7 and 2.8 and Theorems 4.4 and 5.4(2), we can easily prove that Theorem 5.5 is true.

Remark. Theorem 5.4 tells us that fuzzy pseudorandom (resp. fuzzy random) optimization solutions of Model 2 can be obtained through the simplex method in [20]. The distribution problem of Model 2 can be solved by Theorem 5.5.

6. Conclusion

In this paper, two cases of linear programming with fuzzy random variable coefficients have been studied, which are different from the models presented in [20].

The first case (Model 1) is the linear programming whose objective function has fuzzy random variable coefficients, while its decision vector $X$ is a pseudorandom (or random) vector on $\Omega$. A solution method for this model has been proposed: Model 1 can be transformed into a class of linear programmings with random variable coefficients, which may be solved by the simplex method given in [20]. This model may also be transformed into a multiobjective linear programming with pseudorandom (or random) variable coefficients by the theory of L-R fuzzy numbers. Moreover, the distribution problem of Model 1 has been treated.

The second case (Model 2) is the linear programming whose objective and constraints have fuzzy random variable coefficients, and whose decision vector $\tilde X$ is a fuzzy pseudorandom (or fuzzy random) vector on $\Omega$. The model may be solved by way of two classes of linear programmings with random variable coefficients.
The distribution problem (concerning the fuzzy probability distribution function, projective distribution function and expectation of the optimization value) of Model 2 has also been discussed.

Acknowledgement

The authors would like to thank Dr. Zhang Yue, Prof. H.-J. Zimmermann and the referees for their critical reading of the manuscript and their many valuable suggestions for improvements.

References

[1] R.E. Bellman and L.A. Zadeh, Decision-making in a fuzzy environment, Management Science 17 (1970) B141-B164.
[2] J.J. Buckley, Multiobjective possibilistic linear programming, Fuzzy Sets and Systems 35 (1990) 23-28.
[3] S. Chanas, The use of parametric programming in fuzzy linear programming, Fuzzy Sets and Systems 11 (1983) 243-251.
[4] C. Carlsson and P. Korhonen, A parametric approach to fuzzy linear programming, Fuzzy Sets and Systems 20 (1986) 17-30.
[5] D. Dubois and H. Prade, Systems of linear fuzzy constraints, Fuzzy Sets and Systems 3 (1980) 37-48.
[6] L.R. Foulds, Optimization Techniques: An Introduction (Springer-Verlag, New York, 1981).
[7] H. Kwakernaak, Fuzzy random variables - Definitions and theorems, Information Sciences 15 (1978) 1-29.
[8] P. Kall, Stochastic Linear Programming (Springer-Verlag, Berlin-Heidelberg-New York, 1976).
[9] M.K. Luhandjula, Satisfying solutions for a possibilistic linear program, Information Sciences 40 (1986) 247-265.
[10] M.K. Luhandjula, Fuzzy optimization: an appraisal, Fuzzy Sets and Systems 30 (1989) 257-282.
[11] Y.J. Lai and C.L. Hwang, Interactive fuzzy linear programming, Fuzzy Sets and Systems 45 (1992) 169-183.
[12] R.E. Moore, Methods and Applications of Interval Analysis (SIAM, Philadelphia, 1979).
[13] M.L. Puri and D.A. Ralescu, Fuzzy random variables, J. Math. Anal. Appl. 114 (1986) 409-422.
[14] J.K. Sengupta, Stochastic Programming - Methods and Applications (North-Holland, Amsterdam-New York, 1972).
[15] M. Sakawa and H. Yano, Feasibility and Pareto optimality for multiobjective nonlinear programming problems with fuzzy parameters, Fuzzy Sets and Systems 43 (1991) 1-15.
[16] H. Tanaka, T. Okuda and K. Asai, On fuzzy mathematical programming, J. Cybernetics 3 (1974) 37-46.
[17] H. Tanaka, H. Ichihashi and K. Asai, A formulation of fuzzy linear programming problem based on comparison of fuzzy numbers, Control and Cybernetics 13 (1984) 186-194.
[18] S. Vajda, Probabilistic Programming (Academic Press, New York and London, 1972).
[19] Wang Guangyuan and Zhang Yue, The theory of fuzzy stochastic processes, Fuzzy Sets and Systems 51 (1992) 161-178.
[20] Wang Guangyuan and Qiao Zhong, Linear programming with fuzzy random variable coefficients, Fuzzy Sets and Systems 57 (1993) 295-311.
[21] Wang Guangyuan, Theory of Soft Design in Engineering (Science Press, Beijing, 1992).
[22] Wang Guangyuan and Wang Wenquan, Fuzzy optimum design of structure, Engineering Optimization 8 (1985) 291-300.
[23] Y. Sawaragi, H. Nakayama and T. Tanino, Theory of Multiobjective Optimization (Academic Press, Orlando, FL, 1985).
[24] H.-J. Zimmermann, Description and optimization of fuzzy systems, Internat. J. General Systems 2 (1976) 209-215.
[25] H.-J. Zimmermann, Fuzzy programming and linear programming with several objective functions, Fuzzy Sets and Systems 1 (1978) 45-55.