New Reoptimization Techniques applied to Steiner Tree Problem

Electronic Notes in Discrete Mathematics 37 (2011) 387–392, www.elsevier.com/locate/endm

Anna Zych 1, ETH Zürich, Switzerland

Davide Bilò, University of L'Aquila, Italy

Abstract

Given an instance of an optimization problem together with an optimal solution for it, a reoptimization problem asks for a solution for a locally modified input instance. In this paper we develop new reoptimization techniques and apply them to the Steiner Tree Problem. Our techniques significantly improve the previous results and apply to a variety of reoptimization problems.

Keywords: Reoptimization, Steiner Tree Problem, Approximation.

1 Introduction

In classical algorithmics, one is interested in finding good solutions to problem instances about which not much is known in advance. Many practically relevant problems, however, are computationally hard, so different approaches such as approximation algorithms are used for computing reasonably good solutions efficiently.

1 This work was partially supported by SBF grant C 06.0108 as part of the COST 293 (GRAAL) project funded by the European Union.


In reality, some extra knowledge about the instance might be at hand. The concept of reoptimization employs a special kind of additional knowledge: under the assumption that we are given an instance of an optimization problem together with an optimal solution for it, we want to efficiently compute a good solution for a locally modified input instance. Reoptimization has been successfully applied to the Traveling Salesman Problem [1,8], various covering problems [4] and the Shortest Common Superstring problem [3]. A survey of reoptimization problems can be found in [7].

The Minimum Steiner Tree (SMT) problem is a very prominent optimization problem with many practical applications, especially in network design. It is APX-complete; however, a variety of approximation algorithms has been proposed. The 1.834-approximation algorithm by Zelikovsky [12] achieves a running time of O(n^6), the 1.55-approximation algorithm by Robins and Zelikovsky [11] achieves O(n^2), and the running time of the best up-to-date linear-programming-based 1.39-approximation algorithm [9] has, to the best of our knowledge, never been determined. The problem of reoptimizing SMT where the local modification consists of adding or removing one vertex to/from the input graph was considered in [10]. The local modification of adding or removing a vertex to/from the terminal set was addressed in [6,2]. All these reoptimization problems are strongly NP-hard [6].

In this paper we present new reoptimization techniques (in Section 3) applicable to many problems. We apply them (in Section 4) to the problem of reoptimizing SMT for the local modification of adding a terminal. We obtain a significant improvement over the previous results [2], as shown in the table below. The adapting method gives two efficient algorithms with good approximation ratios (see A1 and A2 in the table). The Iterative Parametrized Contraction (IPC) technique improves all known approximation guarantees at the price of a worse running time. Using IPC, we obtain parametrized approximation algorithms for both adding and removing a terminal. Our IPC-based algorithms, as well as the algorithms proposed in [2], use an approximation algorithm for SMT as a subroutine. We denote its approximation ratio by σ and its running time by t(σ). The results in [2] were calculated using the then best 1.55-approximation algorithm. We calculate our ratios for σ = 1.55 (for the sake of a fair comparison) as well as in general form and for the currently best σ = 1.39. Due to space limitations, the terminal removal and the second adapting algorithm are addressed in the full version of the paper [5].

|                       | Previous results   | Adapting A1 | Adapting A2 | IPC (general σ)    | IPC (σ = 1.55) | IPC (σ = 1.39) |
|-----------------------|--------------------|-------------|-------------|--------------------|----------------|----------------|
| Addition ratio        | 1.34               | 4/3         | 1.3125      | (3σ−2)/(2σ−1) + ε  | 1.262          | 1.2            |
| Addition running time | O(n^3 t(σ = 1.55)) | O(n^3)      | O(n^7)      |                    | O(n^64 t(σ))   | O(n^16 t(σ))   |
| Removal ratio         | 1.4                |             |             | (3σ−2)/(2σ−1) + ε  | 1.262          | 1.2            |

2 Preliminaries

The Minimum Steiner Tree problem (SMT), given a metric graph G = (V, E, d) and a terminal set S ⊆ V, asks for a minimum subtree of G spanning S. In this abstract we focus on applying our techniques to the Minimum Steiner Tree Terminal Addition problem. Given a metric graph G = (V, E, d), a terminal set S ⊆ V, an optimal Steiner tree Opt_S for (G, S), and a new terminal t ∈ V, it asks for a minimum Steiner tree for (G, S ∪ {t}).

In the remainder of this section we introduce the notation used further on; standard notation can be found in [5]. The set S′ denotes S ∪ {t}, and e_t is the cheapest edge connecting t with a terminal in Opt_S. The path P_t is the cheapest path in Opt_{S′} connecting t with a terminal in S; let this terminal be w_t. The neighborhood of a vertex v ∈ V(G) in a graph G is Γ_G(v), and the degree deg_G(v) equals |Γ_G(v)|. By metricity we may assume that deg_{Opt_S}(v) ≥ 3 for every non-terminal vertex v of Opt_S. If H is a subgraph of G, we write H ⊆ G. For the sake of simplicity we also use H to denote the cost of H, defined as the sum of the costs of its edges. For G1, G2 ⊆ G, let G1 + G2 = (V(G1) ∪ V(G2), E(G1) ∪ E(G2)) and G1 − G2 = (V′(G1), E(G1) \ E(G2)), where V′(G1) is V(G1) without the vertices isolated by removing the edges E(G2). For a set of vertices X ⊆ V, we denote by G_X the subgraph of G induced by the vertices of X.

Our algorithms work only for metric graphs. This is no restriction for SMT, since the Steiner trees for an arbitrarily weighted graph and for its metric closure coincide (see e.g. [6]). We view all algorithms as functions from the instance space to the solution space, providing the input in brackets when necessary.
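Since all algorithms in the paper assume a metric graph, an arbitrarily weighted instance would first be replaced by its metric closure. The following is a minimal sketch of that preprocessing step (plain Floyd–Warshall; the function name, the dictionary-based weight representation, and the handling of undirected edges are assumptions of this sketch, not notation from the paper):

```python
from itertools import product

def metric_closure(weights, vertices):
    """All-pairs shortest-path distances via Floyd-Warshall.

    `weights` maps vertex pairs (u, v) to edge costs (undirected edges may be
    given in either order); missing pairs are treated as non-adjacent.  The
    result is a complete distance dictionary satisfying the triangle
    inequality, on which the metric SMT algorithms can be run (cf. [6]).
    """
    INF = float("inf")
    d = {(u, v): 0 if u == v else weights.get((u, v), weights.get((v, u), INF))
         for u, v in product(vertices, repeat=2)}
    for m in vertices:                      # intermediate vertex
        for i, j in product(vertices, repeat=2):
            if d[(i, m)] + d[(m, j)] < d[(i, j)]:
                d[(i, j)] = d[(i, m)] + d[(m, j)]
    return d
```

In the reoptimization setting this closure has to be computed only once, since adding a terminal changes the terminal set but not the edge costs.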

3 Our Methods and Techniques

In this section we introduce the new reoptimization techniques at a general level.

Adapting Approximation Algorithms. We illustrate this method on a minimization problem; the maximization case is analogous. Let A be an approximation algorithm with approximation ratio α, i.e., A(I) ≤ α·Opt(I) holds for every instance I. The first step of our method is to observe that for some algorithms the stronger bound A(I) ≤ α·Opt(I) − f(I) holds, where f(I) is a part of I and can be cheap or expensive depending on I. Clearly, if f(I) is expensive, this yields a ratio better than α. For the case when f(I) is cheap, we provide an alternative solution of cost at most Opt(I) plus (some fraction of) f(I). This solution is a simple modification of an optimal solution given for a similar problem instance.

Iterative Parametrized Contraction Technique. Iterative Parametrized Contraction (IPC) is a generalization of the shrinking technique introduced in [2]. This technique applies an approximation algorithm for SMT to the instance in which a part of Opt_S is contracted. If we apply a σ-approximation algorithm to the graph G after contracting T ⊆ Opt_S into a terminal vertex x, and then substitute x with T to


obtain a solution for the initial instance, such a solution costs at most σ·(Opt_S − T) + T. We further refer to the algorithm described above as Shrink(G, S, T), where (G, S), the instance before contraction, and T are given as input.

Proposition 3.1 If T ⊆ Opt_S, then Shrink(G, S, T) ≤ σ·Opt_S − (σ − 1)·T.

The drawback of the shrinking technique is that it does not help unless T ⊆ Opt_S and T costs substantially more than 0. The idea is therefore to shrink a subtree T that can be found in polynomial time by exhaustive search and for which we can provide an alternative solution with an upper bound on its cost proportional to the cost of T. The IPC technique goes a step further by applying shrinking to polynomially many T's, each of size bounded by a constant parameter (hence the polynomial running time). The larger the parameter, the larger the part of Opt_S that is shrunk, and thus the better the bound on the approximation ratio we obtain.
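A minimal sketch of the Shrink subroutine described above may be helpful. It contracts the subtree T into a single super-terminal, calls an arbitrary σ-approximation passed in as a black box, and expands the result back; the complete-distance-dictionary representation and all identifiers are assumptions of this sketch rather than the paper's implementation:

```python
def shrink(dist, vertices, terminals, T_vertices, T_edges, approx):
    """Contract the subtree T into one super-terminal, run a black-box
    sigma-approximation `approx(dist, vertices, terminals) -> set of edges`
    on the contracted instance, and expand the result back."""
    x = ("contracted", frozenset(T_vertices))        # fresh super-vertex
    rest = [v for v in vertices if v not in T_vertices]
    # Cheapest connection from every remaining vertex into T.
    closest = {v: min(T_vertices, key=lambda u: dist[(u, v)]) for v in rest}
    d = {(u, v): dist[(u, v)] for u in rest for v in rest}
    for v in rest:
        d[(x, v)] = d[(v, x)] = dist[(closest[v], v)]
    d[(x, x)] = 0
    contracted_terminals = {s for s in terminals if s not in T_vertices} | {x}
    tree = approx(d, rest + [x], contracted_terminals)
    # Expansion: re-route edges incident to x to their cheapest endpoint in T
    # and put the contracted subtree back.
    expanded = set()
    for (u, v) in tree:
        u, v = (closest[v] if u == x else u), (closest[u] if v == x else v)
        if u != v:
            expanded.add((u, v))
    return expanded | set(T_edges)
```

By Proposition 3.1, whenever T ⊆ Opt_S the expanded solution costs at most σ·Opt_S − (σ − 1)·T. (If the black-box algorithm requires a metric instance, one would additionally take the metric closure of the contracted distances.)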

4 Our Techniques and Minimum Steiner Tree Problem

In this section we apply our techniques to the scenario of adding a terminal. We further assume that deg_{Opt_{S′}}(t) ≥ 2. If we have a reoptimization algorithm A for this case, we can easily adapt it to the case when deg_{Opt_{S′}}(t) = 1. We exhaustively search for the only neighbor t′ of t in Opt_{S′}. Either t′ ∈ S, in which case t′ ∈ Opt_S and we return the optimum given by Opt_S + {t, t′}, or t′ ∈ V \ S′, in which case deg_{Opt_{S′} − {t,t′}}(t′) ≥ 2. We then apply A for adding t′ to Opt_S and add the edge {t, t′}. This solution provides an approximation ratio at least as good as the ratio of A.

4.1 Adapting Minimum Spanning Tree (Mst) Algorithm

We adapt here for reoptimization an approximation algorithm for SMT which computes Mst(G_S) and trivially admits an approximation ratio of 2.

Fact 4.1 If P is a simple path in Opt_S, then Mst(G_S) ≤ 2·Opt_S − P.

Our reoptimization algorithm computes two solutions, Sol_0 = Opt_S + e_t and Mst(G_{S′}), and returns the better of the two. We analyze this simple algorithm as follows. Observe that Sol_0 gives a 4/3-approximation if P_t ≤ (1/3)·Opt_{S′}, since e_t ≤ P_t and Opt_S ≤ Opt_{S′}. On the other hand, since deg_{Opt_{S′}}(t) ≥ 2, there is a path P through t in Opt_{S′} from w_t ∈ S to some other terminal. Naturally, P ≥ 2·P_t. Moreover, by Fact 4.1, we have Mst(G_{S′}) ≤ 2·Opt_{S′} − P. If P_t > (1/3)·Opt_{S′}, this gives us a solution of cost at most 2·Opt_{S′} − (2/3)·Opt_{S′} = (4/3)·Opt_{S′}. Therefore we have a 4/3-approximation for this local modification (a sketch of this algorithm is given below).

4.2 Employing Iterative Parametrized Contraction

In this section we apply IPC to the Minimum Steiner Tree Terminal Addition problem. As described in Section 3, we define the subtrees of Opt_{S′} to be shrunk.
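A minimal sketch of the adapting algorithm of Section 4.1: it builds Sol_0 by attaching t to the old optimum via the cheapest edge, builds a spanning tree of the new terminal set with Prim's algorithm, and returns the cheaper of the two. The complete symmetric distance dictionary (e.g., as produced by the metric-closure sketch in Section 2) and all function names are assumptions of this illustration:

```python
def prim_mst(dist, nodes):
    """Minimum spanning tree (Prim) on the complete metric graph over `nodes`."""
    nodes = list(nodes)
    in_tree, edges = {nodes[0]}, set()
    while len(in_tree) < len(nodes):
        u, v = min(((a, b) for a in in_tree for b in nodes if b not in in_tree),
                   key=lambda e: dist[e])
        in_tree.add(v)
        edges.add((u, v))
    return edges

def cost(dist, edges):
    return sum(dist[e] for e in edges)

def add_terminal(dist, S, opt_S_edges, t):
    """Return the better of Sol_0 = Opt_S + e_t and an MST of S' = S + {t}."""
    # e_t: cheapest edge connecting the new terminal t to an old terminal.
    e_t = min(((t, v) for v in S), key=lambda e: dist[e])
    sol0 = set(opt_S_edges) | {e_t}
    sol1 = prim_mst(dist, set(S) | {t})
    return min(sol0, sol1, key=lambda sol: cost(dist, sol))
```

By the argument above, the cheaper of the two candidates costs at most (4/3)·Opt_{S′}.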


[Figure: a binary expansion tree T_1 ⊆ T_2 ⊆ … ⊆ T_k rooted at t, the paths P_1, P_2, …, P_{2^{k+1}} connecting its leaves to terminals, and the imaginary edges used to complete T_k to a full binary tree.]

Definition 4.2 (Binary Expansion Trees, BETs) Let k be a fixed parameter. A BET sequence is a series of k trees T_i rooted at t and satisfying T_1 ⊆ T_2 ⊆ · · · ⊆ T_k ⊆ Opt_{S′}. Let T_0 = {t}. For a leaf v ∈ T_i, let γ_v^i = ∅ if v ∈ S; otherwise let γ_v^i = {v_l, v_r} ⊆ Γ_{Opt_{S′}}(v) \ V(T_i) with v_l ≠ v_r (if deg_{Opt_{S′}}(v) > 3, the choice is arbitrary). The tree T_{i+1} expands T_i by connecting each leaf v ∈ T_i with its neighbors in γ_v^i: T_{i+1} = T_i + Σ_{v leaf of T_i} Σ_{u ∈ γ_v^i} {u, v}; see the figure above.

The IPC-based algorithm exhaustively tries all candidates for T_k in G, i.e., it searches through all binary trees of height k rooted at t. For each guess it constructs k solutions Sol_i = Shrink(G, S′, T_i). It then returns the best among these solutions and Sol_0 = Opt_S + e_t. We focus the analysis on the case when T_k ⊆ Opt_{S′}.

Lemma 4.3 For any k ≥ 1 it holds that 2^{k+1}·P_t ≤ Opt_{S′} + Σ_{i=1}^{k} 2^{k−i}·T_i.

Proof. Imagine that T_k is a full binary tree with 2^k non-terminal leaves. For each leaf there are at least two edge-disjoint paths in Opt_{S′} connecting it with S, so there are in total 2^{k+1} such paths. If we extend each of them with the path from the corresponding leaf to t, we obtain 2^{k+1} paths of cost at least P_t. If we sum their costs, only the edges of the BETs contribute multiple times, thus 2^{k+1}·P_t − Σ_{i=1}^{k} 2^{k−i}·T_i ≤ Opt_{S′}. If T_k is not a full binary tree, we consider instead a full binary tree with imaginary edges of cost 0 (see the figure above). □

Lemma 4.4 Sol_k ≤ (2σ − 1)·Opt_{S′} − 2^{k+1}·(σ − 1)·P_t + (σ − 1)·Σ_{i=1}^{k−1} 2^{k−i}·T_i.

Proof. Lemma 4.3 gives the following bound on the cost of T_k: T_k ≥ 2^{k+1}·P_t − Opt_{S′} − Σ_{j=1}^{k−1} 2^{k−j}·T_j. We plug this bound into the estimate of the cost of Sol_k provided by Proposition 3.1, Sol_k ≤ σ·Opt_{S′} − (σ − 1)·T_k, and obtain the lemma. □

The analysis of the algorithm is based on the following inequalities: Sol_0 ≤ Opt_{S′} + e_t ≤ Opt_{S′} + P_t by the metricity of G; Sol_i ≤ σ·Opt_{S′} − (σ − 1)·T_i by Proposition 3.1; and Sol_k ≤ (2σ − 1)·Opt_{S′} − 2^{k+1}·(σ − 1)·P_t + (σ − 1)·Σ_{i=1}^{k−1} 2^{k−i}·T_i by Lemma 4.4. We arrange them in the table below: the rows correspond to the solutions, the columns to the variables, and the coefficients are put in the table. One of the solutions has cost at most ((2^{k+1} + 2^k)·σ − 2^{k+1} − 1) / (2^{k+1}·σ − 2^k − 1) · Opt_{S′}, the weighted average of the right-hand sides of the inequalities, where the weights are the multipliers. We can see that the approximation ratio converges to (3σ − 2)/(2σ − 1).


|           | Opt_{S′} | P_t            | T_1           | …  | T_{k−1} | Multipliers   |
|-----------|----------|----------------|---------------|----|---------|---------------|
| Sol_0     | 1        | 1              |               |    |         | 2^{k+1}·(σ−1) |
| Sol_1     | σ        |                | −(σ−1)        |    |         | 2^{k−1}       |
| …         |          |                |               |    |         | …             |
| Sol_{k−1} | σ        |                |               |    | −(σ−1)  | 2             |
| Sol_k     | 2σ−1     | −2^{k+1}·(σ−1) | 2^{k−1}·(σ−1) | …  | 2·(σ−1) | 1             |
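For completeness, the weighted combination behind the stated bound can be checked directly. Multiplying each inequality by its multiplier and summing, the P_t and T_i columns cancel by the choice of the multipliers, and with Σ_{i=1}^{k−1} 2^{k−i} = 2^k − 2 one obtains the following (our own verification of the ratio claimed in the text):

```latex
\begin{align*}
\bigl(2^{k+1}\sigma - 2^{k} - 1\bigr)\min_{0\le i\le k}\mathrm{Sol}_i
  &\le 2^{k+1}(\sigma-1)\,\mathrm{Sol}_0
     + \sum_{i=1}^{k-1} 2^{k-i}\,\mathrm{Sol}_i + \mathrm{Sol}_k\\
  &\le \Bigl(2^{k+1}(\sigma-1) + \sigma\,(2^{k}-2) + 2\sigma - 1\Bigr)\mathrm{Opt}_{S'}\\
  &= \bigl((2^{k+1}+2^{k})\sigma - 2^{k+1} - 1\bigr)\,\mathrm{Opt}_{S'}.
\end{align*}
```

Dividing by 2^{k+1}·σ − 2^k − 1 gives the claimed bound on the best of the k + 1 solutions, and letting k grow yields the limit (3σ − 2)/(2σ − 1).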

References

[1] G. Ausiello, B. Escoffier, J. Monnot, and V. T. Paschos. Reoptimization of minimum and maximum traveling salesman's tours. In SWAT 2006, volume 4059 of LNCS, pages 196–207. Springer, 2006.
[2] D. Bilò, H.-J. Böckenhauer, J. Hromkovič, R. Královič, T. Mömke, P. Widmayer, and A. Zych. Reoptimization of Steiner trees. In SWAT 2008, volume 5124 of LNCS, pages 258–269. Springer, 2008.
[3] D. Bilò, H.-J. Böckenhauer, D. Komm, R. Královič, T. Mömke, S. Seibert, and A. Zych. Reoptimization of the shortest common superstring problem. In CPM 2009, pages 78–91, 2009.
[4] D. Bilò, P. Widmayer, and A. Zych. Reoptimization of weighted graph and covering problems. In WAOA 2008, 2008.
[5] D. Bilò and A. Zych. New reoptimization techniques employed to Steiner tree problem. Full version, available at http://pwgrp1.inf.ethz.ch/publications/publications-2010.html.
[6] H.-J. Böckenhauer, J. Hromkovič, R. Královič, T. Mömke, and P. Rossmanith. Reoptimization of Steiner trees: Changing the terminal set. Volume 410, pages 3428–3435. Elsevier, August 2009.
[7] H.-J. Böckenhauer, J. Hromkovič, T. Mömke, and P. Widmayer. On the hardness of reoptimization. In SOFSEM 2008, volume 4910 of LNCS. Springer, 2008.
[8] H.-J. Böckenhauer and D. Komm. Reoptimization of the metric deadline TSP. In MFCS 2008, volume 5162 of LNCS, pages 156–167. Springer, 2008.
[9] J. Byrka, F. Grandoni, T. Rothvoß, and L. Sanità. An improved LP-based approximation for Steiner tree. In STOC 2010 (Best Paper Award), 2010.
[10] B. Escoffier, M. Milanič, and V. T. Paschos. Simple and fast reoptimizations for the Steiner tree problem. 4(2):86–94, 2009.
[11] G. Robins and A. Zelikovsky. Improved Steiner tree approximation in graphs. In ACM-SIAM Symposium on Discrete Algorithms (SODA), pages 770–779. ACM, 2000.
[12] A. Zelikovsky. An 11/6-approximation algorithm for the network Steiner problem. Algorithmica, 9(5):463–470, 1993.