International Congress Series 1269 (2004) 161–164. doi:10.1016/j.ics.2004.05.013
Constraint satisfaction problems and neurocomputing

Masahiro Nagamatu*, Takahiro Nakano, Kairong Zhang

Division of Neural Information Processing, Department of Brain Science and Engineering, Graduate School of Life Science and Systems Engineering, Kyushu Institute of Technology, 2-4 Hibikino, Wakamatsu, Kitakyushu, Fukuoka 808-0196, Japan

* Corresponding author. Tel./fax: +81-93-695-6088. E-mail address: [email protected] (M. Nagamatu).
Abstract. The constraint satisfaction problem (CSP) is an important problem in computer science, but it is also a well-known NP-complete problem. Local search, a representative incomplete method, can usually find a solution considerably faster than complete methods; however, it has the serious drawback of being trapped by local minima. We proposed a neural network called the Lagrange programming neural network with polarized high-order connection (LPPH) for the satisfiability problem (SAT). The LPPH solves a problem called CONSAT, a continuous-valued variant of the SAT. Experimental results show that the LPPH can find solutions effectively without using any stochastic moves or multiple restarts. In this paper, we propose an extension of the LPPH, called LPPH-CSP, for solving the CSP, and we show its effectiveness through experiments. The LPPH, and hence the LPPH-CSP, can update the values of all variables simultaneously, which is an advantage for VLSI implementation. Both are also flexible and robust in dealing with uncertainty or errors contained in problem specifications or in requirements for solutions. We think these properties are very useful for brain-like computing and its hardware realization. © 2004 Published by Elsevier B.V.

Keywords: Satisfiability problem; Constraint satisfaction; Local minima; Lagrangian method; Neurocomputing; VLSI implementation
1. Introduction

The CSP is the problem of finding a solution, i.e., a value assignment to all variables that satisfies all given constraints. The CSP can represent many problems in computer science and its applications; however, it is a well-known NP-complete problem. Local search, a representative incomplete method, can usually find a solution considerably faster than complete methods, but it has the serious drawback of being trapped by local minima. We proposed a neural network called the Lagrange programming neural network with polarized high-order connection (LPPH) [1] for the SAT. The SAT is the problem of finding an assignment of truth values to variables that satisfies a given conjunctive normal form (CNF). In our approach, the SAT is first converted to an equivalent continuous-valued problem called CONSAT, and then the LPPH is used to find a solution of CONSAT in the continuous state space [0,1]^n.
[email protected] (M. Nagamatu). 0531-5131/ D 2004 Published by Elsevier B.V. doi:10.1016/j.ics.2004.05.013
The dynamics of the LPPH have the following properties: (1) they do not stop at any point that is not a solution of CONSAT; (2) if they come near a solution of CONSAT, they converge to it. These properties do not exclude the case in which the dynamics wander forever in the state space; however, experimental results show that the LPPH can find solutions more effectively than previously proposed methods, without using any stochastic moves or multiple restarts.

In this paper, we propose an extension of the LPPH, called LPPH-CSP, for solving the CSP. We show the effectiveness of the LPPH-CSP by experimentally comparing its performance with that of GENET [2]. In GENET, all variables cannot update their values simultaneously, since this may cause the network to "oscillate between a small number of states indefinitely," whereas the LPPH-CSP can update the values of all variables simultaneously. We think this is an advantage of the LPPH-CSP for VLSI implementation. The LPPH, and hence also the LPPH-CSP, is known to treat several types of requirements for solutions flexibly [3,4]. The LPPH is also robust in dealing with uncertainty or errors contained in problem specifications or in requirements for solutions. We think these properties are very useful for brain-like computing and its hardware realization.

2. CSP

The CSP is the problem of finding a value assignment to variables that satisfies the given constraints. It is defined by a triple (X, D, C) [5]. X = {X_1, X_2, ..., X_n} is a finite set of variables. D = {D_1, D_2, ..., D_n} is a finite set of domains; each domain is a set of discrete values, and each variable X_i is assigned a value in D_i. C = {C_1, C_2, ..., C_m} is a finite set of constraints; each constraint C_r describes a restriction that must be satisfied by the value assignment of the variables included in C_r.

Let x_ij be the Boolean variable representing the statement that variable X_i is assigned the jth value of D_i; x_ij is called a variable-value pair (VVP). A literal is a VVP or its negation; in the former case we say the literal is positive, and in the latter case negative.
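To make the triple (X, D, C) and the VVP notion concrete, the following sketch encodes a tiny CSP in Python. It is our illustration only; the instance and all names are hypothetical, not from the paper.

```python
# A tiny CSP (X, D, C): three variables, discrete domains, and one
# illustrative constraint ("X1 and X2 must take different values").
X = ["X1", "X2", "X3"]                              # variables
D = {"X1": [0, 1], "X2": [0, 1, 2], "X3": [0, 1]}   # domains

def vvps(X, D):
    """All variable-value pairs: (Xi, j) states 'Xi takes the j-th value of Di'."""
    return [(Xi, j) for Xi in X for j in range(len(D[Xi]))]

# A constraint is a predicate over a full assignment {variable: value index}.
def c1(assign):
    return D["X1"][assign["X1"]] != D["X2"][assign["X2"]]

C = [c1]

print(vvps(X, D))                        # the 7 VVPs of this instance
print(c1({"X1": 0, "X2": 0, "X3": 1}))   # False: X1 and X2 both take value 0
```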
In this paper, we consider the following types of constraints; a sketch of the reductions used below follows the list.

ALT(n, l_1, l_2, ..., l_k) [at-least-n-true constraint]: l_1, l_2, ..., l_k are literals, and at least n of l_1, l_2, ..., l_k must be true.
AMT(n, l_1, l_2, ..., l_k) [at-most-n-true constraint] = ALT(k − n, ¬l_1, ¬l_2, ..., ¬l_k).
ALF(n, l_1, l_2, ..., l_k) [at-least-n-false constraint] = ALT(n, ¬l_1, ¬l_2, ..., ¬l_k).
AMF(n, l_1, l_2, ..., l_k) [at-most-n-false constraint] = ALT(k − n, l_1, l_2, ..., l_k).
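The three derived types reduce to ALT by negating literals and/or adjusting the threshold. Below is a minimal sketch of these identities over Boolean assignments; it is our illustration, assuming literals are represented as (VVP, sign) pairs.

```python
def neg(lit):
    vvp, positive = lit
    return (vvp, not positive)          # negation flips the sign of a literal

def alt(n, lits, truth):
    """ALT(n, l1..lk): at least n literals are true under truth: vvp -> bool."""
    return sum(truth[v] if pos else not truth[v] for v, pos in lits) >= n

def amt(n, lits, truth):                # at most n true == at least k-n negations true
    return alt(len(lits) - n, [neg(l) for l in lits], truth)

def alf(n, lits, truth):                # at least n false == at least n negations true
    return alt(n, [neg(l) for l in lits], truth)

def amf(n, lits, truth):                # at most n false == at least k-n true
    return alt(len(lits) - n, lits, truth)

truth = {"a": True, "b": True, "c": False}
lits = [("a", True), ("b", True), ("c", True)]
assert alt(2, lits, truth) and amt(2, lits, truth)
assert alf(1, lits, truth) and amf(1, lits, truth)
```

An exactly-n-true requirement can then be stated as the conjunction of ALT(n, ...) and AMT(n, ...) over the same literals.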
3. SAT and LPPH

The SAT can be considered a special case of the CSP in which each domain D_i is restricted to {false(0), true(1)} and each constraint C_r is restricted to the form ALT(1, l_1, l_2, ..., l_k). In the SAT, each constraint C_r = ALT(1, l_1, l_2, ..., l_k) = (l_1 ∨ l_2 ∨ ... ∨ l_k) is called a clause. When the VVP x_{i2} is abbreviated as x_i, the VVP x_{i1} can be written as ¬x_i. The SAT can be converted to an equivalent continuous-valued problem called CONSAT:

(CONSAT) find x such that
\[ h_r(x) = 0, \quad r = 1, 2, \ldots, m, \qquad x_i \in [0, 1], \quad i = 1, 2, \ldots, n. \]

If we consider x_i to represent the degree of certainty that variable X_i is assigned the value "true", then h_r(x) represents the degree of unsatisfaction of clause C_r: h_r(x) = 0 if the clause is certainly satisfied, and h_r(x) > 0 otherwise. The dynamics of the LPPH are defined as follows:
\[ \frac{dx_i}{dt} = x_i (1 - x_i) \sum_{r=1}^{m} w_r s_{ri}(x), \quad i = 1, 2, \ldots, n, \]
\[ \frac{dw_r}{dt} = -\alpha w_r + h_r(x), \quad r = 1, 2, \ldots, m, \]
where w_r is the weight of clause C_r and s_{ri}(x) is the force put on variable x_i to satisfy clause C_r. The parameter α specifies the degree of attenuation of the weights. From experiments, it is known that the CPU time needed to find a solution is influenced by the value of α, and that the best value of α, which ranges from 0.0 to about 0.2, depends on the problem to be solved. The above equations are similar to those obtained by applying the Lagrangian method to constrained optimization problems [6].

4. LPPH-CSP

Here we extend the LPPH to solve the CSP. As in the case of the SAT, we consider that each VVP x_{ij} takes a real value between 0 and 1 and represents the degree of certainty that variable X_i is assigned the jth value of D_i. The LPPH-CSP has the following dynamics:
\[ \frac{dx_{ij}}{dt} = x_{ij} (1 - x_{ij}) \sum_{r=1}^{m} w_r s_{rij}(x), \quad \text{for all VVPs } x_{ij}, \]
\[ \frac{dw_r}{dt} = -\alpha w_r + h_r(x), \quad r = 1, 2, \ldots, m, \]
where w_r is the weight of constraint C_r, s_{rij}(x) represents the force put on VVP x_{ij} to satisfy constraint C_r, and h_r(x) represents the degree of unsatisfaction of constraint C_r. For a constraint C_r = ALT(n, l_1, l_2, ..., l_k), h_r(x) and s_{rij}(x) are defined as follows:
\[ h_r(x) = \mathrm{NMin}(n, \{ g(l) \mid l \in C_r \}), \]
\[
s_{rij}(x) =
\begin{cases}
\mathrm{NMin}(n + 1, \{ g(l) \mid l \in C_r \}), & \text{if } x_{ij} \text{ is included in } C_r \text{ as a positive literal and } 1 - x_{ij} \le h_r(x), \\
h_r(x), & \text{if } x_{ij} \text{ is included in } C_r \text{ as a positive literal and } 1 - x_{ij} > h_r(x), \\
-\mathrm{NMax}(n, \{ g(l) \mid l \in C_r \}), & \text{if } x_{ij} \text{ is included in } C_r \text{ as a negative literal and } x_{ij} \le h_r(x), \\
-h_r(x), & \text{if } x_{ij} \text{ is included in } C_r \text{ as a negative literal and } x_{ij} > h_r(x), \\
0, & \text{if } x_{ij} \text{ is not included in } C_r,
\end{cases}
\]
where g(l) is the degree of unsatisfaction of literal l (1 − x_{ij} for a positive literal x_{ij}, and x_{ij} for a negative literal ¬x_{ij}), NMin(n, S) is the nth minimum element of set S, and NMax(n, S) is the nth maximum element of S. For other types of constraints, h_r(x) and s_{rij}(x) are defined similarly.
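To make the dynamics concrete, here is a minimal numerical sketch, not the authors' implementation: it integrates the LPPH-CSP equations with a simple Euler step on a small hypothetical instance of ALT constraints, reading g(l) as the degree of unsatisfaction of a literal as defined above. The instance, step size, stopping test, and parameter values are our assumptions.

```python
import random

# ALT constraints as (n, literals); a literal is (vvp_index, positive).
# Hypothetical toy instance over 4 VVPs, not from the paper.
constraints = [
    (1, [(0, True), (1, True)]),             # ALT(1, x0, x1)
    (1, [(1, False), (2, True)]),            # ALT(1, ~x1, x2)
    (2, [(0, True), (2, True), (3, True)]),  # ALT(2, x0, x2, x3)
]
NV = 4  # number of VVPs

def g(lit, x):
    """Degree of unsatisfaction of a literal."""
    i, pos = lit
    return 1.0 - x[i] if pos else x[i]

def h(con, x):
    """h_r(x): n-th minimum of the literal unsatisfaction degrees."""
    n, lits = con
    return sorted(g(l, x) for l in lits)[n - 1]

def s(con, i, x):
    """Force s_rij(x) put on VVP i by ALT constraint con."""
    n, lits = con
    hr = h(con, x)
    vals = sorted(g(l, x) for l in lits)
    for j, pos in lits:
        if j == i:
            if pos:                                     # positive literal
                return vals[n] if 1.0 - x[i] <= hr else hr
            return -vals[-n] if x[i] <= hr else -hr     # negative literal
    return 0.0                                          # VVP not in constraint

def solve(alpha=0.1, dt=0.05, steps=20000, eps=1e-2, seed=0):
    rng = random.Random(seed)
    x = [rng.random() for _ in range(NV)]
    w = [1.0] * len(constraints)
    for _ in range(steps):
        hs = [h(c, x) for c in constraints]
        if max(hs) < eps:
            return x                  # every constraint is (nearly) satisfied
        # Euler step; all VVPs and all weights are updated simultaneously.
        dx = [x[i] * (1 - x[i]) * sum(w[r] * s(c, i, x)
                                      for r, c in enumerate(constraints))
              for i in range(NV)]
        x = [min(1.0, max(0.0, x[i] + dt * dx[i])) for i in range(NV)]
        w = [w[r] + dt * (-alpha * w[r] + hs[r]) for r in range(len(constraints))]
    return None

print(solve())
```

Note that every VVP is updated simultaneously in each step; this is the property highlighted above as an advantage for VLSI implementation.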
Fig. 1. LPPH-CSP vs. GENET for n-Queen problems.
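To illustrate how the benchmark of Fig. 1 can be expressed with the constraint types of Section 2, the following hypothetical encoding sketch (ours, not the authors' benchmark generator) takes the column of the queen in row r as variable X_r, so that the VVP (r, c) states "the queen of row r is in column c"; the column and diagonal restrictions then become AMT constraints.

```python
def queens_constraints(n):
    """Hypothetical n-Queens encoding with the ALT/AMT constraint types."""
    cons = []
    for r in range(n):                  # each row variable takes some value
        cons.append(("ALT", 1, [((r, c), True) for c in range(n)]))
    for c in range(n):                  # at most one queen per column
        cons.append(("AMT", 1, [((r, c), True) for r in range(n)]))
    for d in range(-(n - 1), n):        # at most one queen per "/" diagonal (r - c = d)
        lits = [((r, r - d), True) for r in range(n) if 0 <= r - d < n]
        if len(lits) > 1:
            cons.append(("AMT", 1, lits))
    for d in range(2 * n - 1):          # at most one queen per "\" diagonal (r + c = d)
        lits = [((r, d - r), True) for r in range(n) if 0 <= d - r < n]
        if len(lits) > 1:
            cons.append(("AMT", 1, lits))
    return cons

print(len(queens_constraints(8)))       # 8 ALT + (8 + 13 + 13) AMT = 42 constraints
```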
5. Experimental results

We compared the LPPH-CSP with GENET on several kinds of CSPs. Fig. 1 shows the average CPU time for n-Queen problems (n = 200, 210, ..., 300). For both methods, the average is taken over 30 runs with different initial assignments. The value of the parameter α of the LPPH-CSP used in this experiment is 0.1. From the experimental results in this graph, we can say that the LPPH-CSP is as efficient as GENET when both methods are executed on a conventional computer.

6. Conclusions

In this paper, we extended the LPPH to solve the CSP. Experimental results show that the LPPH-CSP is very efficient, even when simulated on a conventional computer. The LPPH-CSP can update all variables simultaneously; we think this is an advantage of the LPPH-CSP for neurocomputing. As future work, we plan to investigate VLSI implementation and improvements of the functions h_r(x) and s_{rij}(x).

References

[1] M. Nagamatu, T. Yanaru, On the stability of Lagrange programming neural networks for satisfiability problems of propositional calculus, Neurocomputing 13 (2-4) (1996) 119-133.
[2] A.J. Davenport, et al., GENET: a connectionist architecture for solving constraint satisfaction problems by iterative improvement, Proc. 12th National Conference on Artificial Intelligence, 1994, pp. 325-330.
[3] M. Nagamatu, et al., Solving SAT with hint by Lagrange programming neural network, International Journal of Chaos Theory and Applications 5 (3) (2000) 11-21.
[4] M. Nagamatu, et al., Extensions of Lagrange programming neural network for satisfiability problem and its several variations, Proc. 9th International Conference on Neural Information Processing (ICONIP 2002), 2002, pp. 1781-1785.
[5] E. Tsang, Foundations of Constraint Satisfaction, Academic Press, London, 1993.
[6] D.P. Bertsekas, Constrained Optimization and Lagrange Multiplier Methods, Academic Press, London, 1982.