$\min \sum_i \min_{j \in J} d_{ij}$, where $J \subseteq L$ and $|J| = p$. Let us denote by $\mathcal{S} = \{x \mid x \text{ is a set of } p \text{ (out of } m\text{) locations of } L\}$ the solution space of the problem. The distance between two solutions $x_1$ and $x_2$ ($x_1, x_2 \in \mathcal{S}$) is equal to $k$ if and only if they differ in $k$ locations. Since $\mathcal{S}$ is a set of sets, a (symmetric) distance function $\rho$ can be defined as

$\rho(x_1, x_2) = |x_1 \setminus x_2| = |x_2 \setminus x_1|, \quad \forall x_1, x_2 \in \mathcal{S}.$   (2)

It can easily be checked that $\rho$ is a metric in $\mathcal{S}$; thus, $\mathcal{S}$ is a metric space. The neighborhood structures $\mathcal{N}_k$ we use are induced by the metric $\rho$, i.e., $k$ locations of facilities ($k \le p$) in the current solution are replaced by $k$ others.

2.4. MW

$\min_{u_i, v_i, z_{ij}} \ \sum_{i=1}^{p} \sum_{j=1}^{n} w_j z_{ij} d_{ij}$
s.t. $\sum_{i=1}^{p} z_{ij} = 1, \quad j = 1, 2, \ldots, n,$
$z_{ij} \in [0, 1], \quad i = 1, 2, \ldots, p, \ j = 1, 2, \ldots, n,$

where p facilities must be located to satisfy the demand of n users; $u_i, v_i$ denote the coordinates of the ith facility, $d_{ij}$ the Euclidean distance from $(u_i, v_i)$ to the jth user, $w_j$ the demand (or weight) of the jth user and $z_{ij}$ the fraction of this demand satisfied from the ith facility. It is easy to see that there is always an optimal solution with all $z_{ij} \in \{0, 1\}$, i.e., each user is satisfied from a single facility, which is the (or a) closest one.

An early application of VNS to MW is given in [5]. Several other ones are discussed at length in [6]. It appears that the choice of neighborhoods is crucial. Reassignment of customers to facilities a few at a time is a poor choice, as it entails only marginal changes in the solutions considered. Much better results are obtained when the facilities themselves are moved. As they may be located anywhere in the plane, target locations are needed. An easy and efficient choice is the locations of customers where there is no facility as yet. Using this neighborhood structure, several basic TS and VNS heuristics were developed and an extensive empirical study carried out to evaluate various heuristics - old, recent, and new - in a unified setting. The different methods (i.e., Genetic search, three TS variants, four VNS variants, etc.) were compared on the basis of equivalent CPU times. Results of this study indicate that VNS can be effectively used to obtain superior solutions. For instance, on a series of 20 problems with 1060 users, the average error (by comparison with the best known solution) is only 0.02% for the best VNS, while it can rise to more than 20% for some well-known heuristics of the literature. The average error for a Genetic algorithm was 1.27% and for the best TS 0.13%.

2.5. MSSC

Consider a set $X$ of $n$ entities in $q$-dimensional Euclidean space: $X = \{x_1, \ldots, x_n\}$, $x_j = (x_{1j}, \ldots, x_{qj}) \in \mathbb{R}^q$. The MSSC problem is to find a partition of $X$ into $m$ disjoint subsets (or clusters) $C_i$ such that the sum of squared distances from each entity to the centroid $\bar{x}_i$ of its cluster is minimum, i.e.,

$\min \sum_{i=1}^{m} \sum_{l: x_l \in C_i} \|x_l - \bar{x}_i\|^2, \quad \bar{x}_i = \frac{1}{n_i} \sum_{l: x_l \in C_i} x_l,$   (3)

where $\|\cdot\|$ denotes the Euclidean norm and $n_i = |C_i|$ ($n_1 + \cdots + n_m = n$). Among the many heuristics for MSSC (see [28] or [63] for surveys), the k-means local search algorithm [39,48] is the most popular. Given an initial partition, an entity $x_j$ that belongs to cluster $C_t$ in the current solution is reassigned to some other cluster $C_i$, $i \ne t$. New centroids and objective function values are then updated using simple formulae; a neighborhood of the current solution is defined by all possible such exchanges (i.e., for all $i$ and $j$). A move is made if the best solution in the neighborhood is better than the current one. Otherwise, the procedure stops.
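To make this exchange neighborhood concrete, here is a minimal Python sketch of the descent just described; it is illustrative only (all names are ours), and it recomputes the objective from scratch rather than using the incremental update formulae of [39,48].

    import numpy as np

    def mssc_objective(X, labels, m):
        # Sum of squared distances from each entity to the centroid of its cluster.
        total = 0.0
        for i in range(m):
            pts = X[labels == i]
            if len(pts) > 0:
                total += ((pts - pts.mean(axis=0)) ** 2).sum()
        return total

    def exchange_descent(X, labels, m):
        # Best-improvement descent over single-entity reassignments: stop when no
        # reassignment of an entity to another cluster improves the objective.
        current = mssc_objective(X, labels, m)
        while True:
            best_val, best_move = current, None
            for j in range(len(X)):
                for i in range(m):
                    if i == labels[j]:
                        continue
                    trial = labels.copy()
                    trial[j] = i
                    val = mssc_objective(X, trial, m)
                    if val < best_val:
                        best_val, best_move = val, (j, i)
            if best_move is None:
                return labels, current
            labels[best_move[0]] = best_move[1]
            current = best_val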
Another popular heuristic is the so-called h-means algorithm [1], which is very similar to Cooper's alternating heuristic [10] for the MW problem. A new descent local search heuristic, called j-means, is proposed in [34], where the cluster centroid $\bar{x}_i$ is relocated to some entity which does not already coincide with a centroid. Since this move may be a large one and corresponds to $n_i$ reallocations (or $n_i$ k-means moves), we refer to it as a jump, hence the name j-means. Obviously, heuristic solutions obtained with the jump neighborhood could be improved by the h-means and/or k-means heuristics. Using them after j-means gives a VND heuristic called j+h+k-means.

Table 3
MSSC: results for a 1060-entity problem; average and best results over 10 trials; stopping rule: 150 seconds for each trial. Columns under each method give the % deviation from the best value found.

m     Best found        k-means          h-means          h+k-means        VNS-1            VNS-2
                        Av.     Best     Av.     Best     Av.     Best     Av.     Best     Av.     Best
10    1754840264.9      0.19    0.19     0.01    0.00     0.01    0.00     0.14    0.00     0.14    0.04
20    791925963.7       2.89    0.00     1.87    1.29     1.31    0.46     3.50    1.84     0.74    0.01
30    482302357.1       9.34    6.68     11.77   9.01     8.20    5.79     10.64   5.16     0.86    0.00
40    342844809.0       15.50   12.04    20.01   15.28    16.86   11.53    15.95   9.64     0.81    0.00
50    256892529.0       27.16   24.57    35.60   30.59    28.66   17.14    29.95   13.50    1.42    0.00
60    199151542.6       35.79   32.53    44.59   33.56    36.21   30.00    35.19   20.50    0.59    0.00
70    159781533.1       44.28   33.08    56.63   47.90    47.39   38.89    43.85   25.59    0.78    0.00
80    130038918.6       53.15   46.64    62.34   50.77    56.69   43.98    51.08   35.07    0.75    0.00
90    111322621.7       56.12   48.94    63.94   51.38    54.40   48.38    44.94   28.17    0.73    0.00
100   97352045.7        60.41   54.74    46.21   46.21    35.95   35.95    42.99   28.00    1.16    0.00
110   86287804.2        60.69   52.78    59.92   49.73    41.44   40.79    43.97   23.76    1.34    0.00
120   76380389.5        62.90   54.00    62.32   52.66    48.96   41.28    38.58   33.56    1.02    0.00
130   68417681.6        65.91   50.73    54.66   38.95    42.34   24.64    38.46   26.30    0.67    0.00
140   61727504.5        62.16   49.82    53.05   45.51    36.00   36.00    30.85   22.04    1.43    0.00
150   56679822.6        66.06   55.05    47.82   40.74    33.43   26.88    25.41   20.05    1.34    0.00
160   52210995.2        59.37   53.16    41.74   34.88    30.85   25.61    25.83   19.32    0.70    0.00
Average error           42.62   35.93    41.40   34.28    32.42   26.71    30.08   19.53    0.90    0.00

In Table 3, two variants of VNS [34] are compared with multi-start versions of the k-means, h-means and h+k-means local searches on the basis of equivalent CPU time (150 seconds). Both VNS-1 and VNS-2 are parameter free, i.e., $k_{\max} = m$. VNS-1 uses k-means neighborhoods (for step 2a of the basic VNS from Fig. 1) and h+k-means for step 2b. VNS-2 uses relocation neighborhoods and the j+h+k-means local search. The large difference in the quality of the solutions obtained by these two VNS versions can be explained by the effectiveness of the relocation (jump) neighborhood structure. A similar difference in efficiency between reallocation and relocation neighborhoods for the MW problem is reported in [6].

2.6. BBLP

Structured global optimization problems, while having several and often many local optima, possess some particular structure, which may be exploited in heuristics or exact algorithms. One such problem, of considerable generality, is the BBLP. This problem has three sets of variables, $x$, $y$ and $z$, with cardinalities $n_1$, $n_2$ and $n_3$, respectively. When all variables of $y$ are fixed, it becomes a linear program in $x$ and $z$. When all variables of $z$ are fixed, it becomes a linear program in $x$ and $y$. It can be expressed as follows:

$\min\ c_1^T x + c_2^T y + c_3^T z + y^T Q z + \cdots$

$\rho(G, G') = k.$   (6)
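As with the PM problem, graph neighborhoods can be induced by a distance. The sketch below assumes, for illustration only, that $\rho(G, G')$ counts the edge additions and deletions needed to transform $G$ into $G'$ (AGX also uses other elementary moves, such as splits, mergings and edge moves, mentioned below); it relies on networkx, which is not part of AGX, and all names are ours.

    import itertools
    import random
    import networkx as nx

    def rho(G, H):
        # Illustrative distance between two graphs on the same vertex set:
        # number of edges to add plus number of edges to delete to turn G into H.
        return len(set(map(frozenset, G.edges())) ^ set(map(frozenset, H.edges())))

    def shake(G, k):
        # Draw a random graph G' with rho(G, G') = k by toggling k distinct vertex
        # pairs, i.e., a random point of the neighborhood N_k(G) used for shaking.
        H = G.copy()
        for u, v in random.sample(list(itertools.combinations(G.nodes(), 2)), k):
            if H.has_edge(u, v):
                H.remove_edge(u, v)
            else:
                H.add_edge(u, v)
        return H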
In addition to these components, which follow the basic schemes of Figs. 1 and 2, AGX comprises a routine for graph analysis which performs the following: (i) recognition of the class to which the extremal graph G obtained belongs (if it is one of a series of well-known classes such as path, star, tree, complete or bipartite graph, ...); (ii) computation of various invariants for G; (iii) visualization of G on the screen using X-Windows, with interactive modification of the positions of vertices; (iv) subroutines for interactive modification of G by deletion or addition of edges or vertices. Moreover, AGX also has a routine for parametric analysis which obtains, stores, analyzes and represents values of invariants, and of functions corresponding to conjectures.

VNS can be used in graph theory for several purposes:

(a) Find a graph satisfying given constraints. Let $i_1(G), i_2(G), \ldots, i_l(G)$ denote $l$ invariants of $G$ and $p_1, p_2, \ldots, p_l$ values given to them. Consider the problem

$\min f(G) = \sum_{k=1}^{l} |i_k(G) - p_k|.$   (7)

Any graph such that $f(G) = 0$ satisfies these constraints. Note that constraints involving formulae on several invariants can be treated in a similar way. Expression (7) shows constrained problems can be reduced to unconstrained ones.

(b) Find optimal or near-optimal values for an invariant subject to constraints. Let $i_0(G)$ denote the objective function invariant and assume constraints expressed as above. Consider the problem

$\min f(G) = i_0(G) + M \sum_{k=1}^{l} |i_k(G) - p_k|,$   (8)

where $M$ is a constant sufficiently large to ensure that, for any pair of graphs $G$, $G'$ such that $\sum_{k=1}^{l} |i_k(G) - p_k| = 0$ and $\sum_{k=1}^{l} |i_k(G') - p_k| > 0$, $f(G) < f(G')$. Maximum values are tackled in a similar way. Note that some invariants take integer values and remain constant on large plateaus when $G$ is modified a little at a time, i.e., replaced sequentially by a graph in the neighborhood $\mathcal{N}_1(G)$. It is then convenient to modify the objective function by adding a secondary objective, with a coefficient sufficiently small not to change the ordering of graphs according to the first objective, and orienting the transformations in the right direction.

(c) Refute a conjecture. Consider a conjecture $h(G) \le g(G)$, where $h(G)$ and $g(G)$ are formulae depending on one or more invariants of $G$. It corresponds to the problem

$\min f(G) = g(G) - h(G).$   (9)

If a graph $G$ for which $f(G) < 0$ is found, the conjecture is refuted.

(d) Suggest a conjecture (or sharpen one). This can be done in various ways, which usually use parameterization on $n$ or on other invariants $i_1(G), i_2(G), i_3(G), \ldots, i_l(G)$. For instance, consider

$\min f(G) = i_2(G) - i_1(G).$   (10)

If no graph $G$ with $f(G) < 0$ is found, then this suggests $i_1(G) \le i_2(G)$. If the extremal graphs found belong to a recognizable class, then it may be possible to deduce a more precise inequality in $i_1(G)$, $i_2(G)$ and $n$.

(e) Suggest a proof. The way the extremal graphs are obtained, e.g., which transformations of $G$ are used, may suggest strategies to prove conjectures for all graphs or for some classes of them.

The Graffiti system of Fajtlowicz [15-19] has been used to generate hundreds of conjectures in graph theory which have been studied by many mathematicians, some of them proved and more disproved. The following conjecture is a reformulation of one of them (i.e., # 834):

$\delta\,\bar{l}(G) \le n,$   (11)

where $\delta$ denotes the minimum degree and $\bar{l}$ the average distance between pairs of distinct vertices of $G$.
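For instance, following (9), refuting (11) amounts to finding a graph with negative gap $n - \delta\,\bar{l}(G)$. A small sketch of this objective (our own illustration using networkx, which is not part of AGX, and assuming $G$ is connected):

    import networkx as nx

    def conjecture_834_gap(G):
        # f(G) = g(G) - h(G) for conjecture (11), i.e., n - delta * l_bar(G);
        # any graph with a negative gap refutes the conjecture.
        delta = min(d for _, d in G.degree())          # minimum degree
        l_bar = nx.average_shortest_path_length(G)     # average distance over pairs of distinct vertices
        return G.number_of_nodes() - delta * l_bar

    # Example: for the complete graph K_5 the gap is 5 - 4 * 1 = 1 >= 0.
    print(conjecture_834_gap(nx.complete_graph(5)))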
This conjecture has been refuted by AGX [9] in 153 seconds of CPU time, using an edge addition, 15 edge deletions, a split, two mergings, six moves, three 2-opts, another merging and split after VNS with neighborhoods of size up to seven, and a final edge deletion after another application of VNS with neighborhoods of size up to seven. Two other conjectures of Graffiti are also refuted in [9]. Recently, five more conjectures of Graffiti could be refuted by AGX in conjunction with a program to generate cubic graphs [7].

The energy $E(G)$ of a graph $G$ is the sum of the absolute values of the eigenvalues of $G$'s adjacency matrix. It has an important interpretation in chemistry, in connection with Hückel's molecular orbital theory [29]. A systematic study of $E(G)$, using AGX, has been initiated in [8]. Curves of (presumably) minimum and maximum values of $E(G)$ as functions of the numbers $n$ of vertices and $m$ of edges of the graph considered were systematically obtained (in several days of computing time). While extremal graphs were not always obtained, and sometimes better ones could be found interactively, graphs corresponding to local minima or linear portions of the curves were easily recognized. AGX then led to several conjectures, including the following:

$2\sqrt{m} \le E \le 2m,$

which despite its simplicity does not appear to have been proposed before. These bounds could then be proved, and shown to be tight for complete bipartite graphs and for disjoint edges plus additional isolated vertices, respectively. Moreover, several conjectures related to various other graph invariants (chromatic number, independence number, ...) were obtained and some of them proved.

5. Conclusions

When evaluating metaheuristics, it appears that several, possibly conflicting, criteria should be taken into account. Desirable properties of metaheuristics include:
1. Simplicity: The metaheuristic should be based on a simple and clear principle, which should be widely applicable.
2. Coherence: The various steps of heuristics for particular problems should follow naturally from the principle of the metaheuristic.
3. Efficiency: Heuristics for particular problems should be efficient, i.e., provide optimal or near-optimal solutions for all or at least most realistic instances. Preferably, they should solve most benchmark problems optimally, when benchmarks are available.
4. Effectiveness: Heuristics for particular problems should provide good, perhaps optimal or near-optimal, solutions in moderate CPU time.
5. Robustness: Heuristics for various problems should be effective and efficient, and for each of these problems should give good solutions for a variety of instances, i.e., not just be fine-tuned to some training sets and less good elsewhere.
6. User-friendliness: Heuristics should be well defined, easy to understand and, most important, easy to use. This implies they should have few, ideally no, parameters.
7. Innovation: Preferably, the principle of the metaheuristic and/or the efficiency and effectiveness of the heuristics derived from it should lead to new types of applications.

While VNS is still in its infancy, results for a variety of problems described in this paper allow a first assessment. Clearly, VNS is based on a simple principle, i.e., systematic change of neighborhood within the search. This principle, explained in the basic schemes of Figs. 1 and 2, is widely applicable, often in an easy way, when a local search heuristic is available.
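As a reminder of that scheme, here is a compact Python sketch of the basic VNS loop (function names and the CPU-time stopping rule are illustrative):

    import time

    def basic_vns(x, f, shake, local_search, k_max, t_max):
        # Basic VNS: repeat shaking (step 2a), local search (step 2b) and
        # "move or not" (step 2c) over neighborhoods N_1, ..., N_k_max.
        start = time.time()
        while time.time() - start < t_max:      # stopping condition: total CPU time
            k = 1
            while k <= k_max:
                x1 = shake(x, k)                # 2a: random solution from N_k(x)
                x2 = local_search(x1)           # 2b: descend to a local optimum
                if f(x2) < f(x):                # 2c: recenter the search, restart with k = 1
                    x, k = x2, 1
                else:
                    k += 1
        return x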
(This simplicity implies that isolated steps of several heuristics or metaheuristics are particular cases of, or come close to, some steps of VNS; their derivation often follows, when it is explicit, from different principles.) While VNS heuristics use several and sometimes many neighborhoods, their various steps stem from a common principle, i.e., hybridization is avoided and VNS is coherent. Experiments with benchmarks of large PM and MW problems show VNS is very efficient: it solves many instances exactly and otherwise gives solutions very close to the optimum. For several other problems VNS appears to perform better than recent heuristics developed by leading researchers. Moreover, VNS is effective: the best solutions are obtained in moderate computing time. This is particularly true when using VNDS. For instance, the time to find the best solution of very large instances of the PM problem is less than that for a single descent with the fast interchange heuristic. Computational results on simulated and real data show VNS is robust, i.e., about equally efficient and effective for both types of problems. Moreover, VNS is very user-friendly. The basic steps are easy to understand and apply, and versions without parameters (except for the total CPU time allocated) are not hard to obtain, while remaining efficient and effective. Finally, VNS is innovative: on the one hand, the efficiency and effectiveness of VNS heuristics for PM, MW, MSSC and other problems are an indispensable ingredient in stabilized column generation algorithms for solving those problems exactly; on the other hand, the basic principle of VNS naturally suggests the approach to graph theory developed in AGX.

Developments of VNS first focussed on applications of its basic principle to combinatorial optimization as well as to a few global optimization problems. More recently, several extensions were proposed. RVNS aims at solving very large instances very rapidly, if approximately. VNDS aims at increasing precision and at reducing solution time for decomposable problems. This avenue deserves more research, particularly as there are many ways to decompose a problem. SVNS aims at making the search for better solutions far from the incumbent one efficient. It is a first step towards the solution of a problem crucial to obtaining optimal, or very close to optimal, solutions for large instances. Neighborhoods used in VND also strongly influence the efficiency of VNS, as, e.g., for the MW problem. Their nature and properties should be investigated in more detail.

Some comparisons of VNS with other metaheuristics have been made and clearly there is room for many more. Hybrid heuristics have not been studied up to now. While they appear to be promising in terms of finding better solutions, particularly for large instances of difficult problems, they would also imply a decrease in simplicity and consequently in the understanding of the heuristics' behavior. The same appears to be true for the introduction of additional parameters to guide the search more efficiently, either in the descent or in the diversification phase. A first study of valleys (or mountains) of the objective function and their exploration by VNS has been conducted for the weighted maximum satisfiability problem. It led to the development of an efficient SVNS heuristic. Much more work on the topology of local optima and valleys appears to be desirable.
VNS, which does not alter the shape of the valleys explored, could prove to be a very good tool for that purpose. It might thus help to address the important and largely open theoretical question "Why do metaheuristics work as well as they do?"

Acknowledgements

Research of the authors was supported by ONR grant N00014-95-1-0917, NSERC grant OGP0039682 and FCAR grant 32EQ1048. We thank Jack Brimberg, Gilles Caporossi, Michel Gendreau, Fred Glover, Brigitte Jaumard, Anderson Parreira and Dionisio Perez-Brito for discussions of VNS, Dragos Cvetkovic, Ivan Gutman, Paul Seymour and Bjarne Toft for discussions of AGX, and David Schilling for providing test problems for the PM problem.

References

[1] M.R. Anderberg, Cluster Analysis for Applications, Academic Press, New York, 1973.
[2] J.E. Beasley, A note on solving large p-median problems, European Journal of Operational Research 21 (1985) 270-273.
[3] J.L. Bentley, Fast algorithms for geometric traveling salesman problems, ORSA Journal on Computing 4 (1992) 387-411.
[4] K.D. Boese, A.B. Kahng, S. Muddu, A new adaptive multi-start technique for combinatorial global optimizations, Operations Research Letters 16 (1994) 101-113.
[5] J. Brimberg, N. Mladenovic, A variable neighborhood algorithm for solving the continuous location-allocation problem, Studies in Locational Analysis 10 (1996) 1-12.
[6] J. Brimberg, P. Hansen, N. Mladenovic, E. Taillard, Improvements and comparison of heuristics for solving the multisource Weber problem, Operations Research 48 (3) (2000).
[7] G. Brinkmann, Fast generation of cubic graphs, Journal of Graph Theory 23 (2) (1996) 139-149.
[8] G. Caporossi, D. Cvetkovic, I. Gutman, P. Hansen, Variable neighborhood search for chemical graphs. Part 2. Graphs with extremal energy, Journal of Chemical Information and Computer Sciences 39 (1999) 984-996.
[9] G. Caporossi, P. Hansen, Variable neighborhood search for extremal graphs. 1. The AutoGraphiX system, Discrete Mathematics 212 (2000) 29-44.
[10] L. Cooper, Location-allocation problems, Operations Research 11 (1963) 331-343.
[11] M. Dorigo, V. Maniezzo, A. Colorni, The ant system: Optimization by a colony of cooperating agents, IEEE Transactions on Systems, Man, and Cybernetics - Part B 26 (1) (1996) 29-41.
[12] O. du Merle, D. Villeneuve, J. Desrosiers, P. Hansen, Stabilized column generation, Discrete Mathematics 194 (1999) 229-237.
[13] O. du Merle, P. Hansen, B. Jaumard, N. Mladenovic, An interior point algorithm for minimum sum-of-squares clustering, SIAM Journal on Scientific Computing 21 (2000) 1485-1505.
[14] T. Feo, M. Resende, Greedy randomized adaptive search procedures, Journal of Global Optimization 6 (1995) 109-133.
[15] S. Fajtlowicz, On conjectures of Graffiti, Discrete Mathematics 72 (1987) 113-118.
[16] S. Fajtlowicz, On conjectures of Graffiti-II, Congressus Numerantium 60 (1987) 187-197.
[17] S. Fajtlowicz, On conjectures of Graffiti-III, Congressus Numerantium 66 (1988) 23-32.
[18] S. Fajtlowicz, On conjectures of Graffiti-IV, Congressus Numerantium 70 (1990) 231-240.
[19] S. Fajtlowicz, On conjectures of Graffiti-V, Seventh International Quadrennial Conference on Graph Theory 1 (1995) 367-376.
[20] R.A. Fisher, The use of multiple measurements in taxonomic problems, Annals of Eugenics VII, Part II, 1936, pp. 179-188.
[21] M. Gendreau, A. Hertz, G. Laporte, New insertion and postoptimization procedures for the traveling salesman problem, Operations Research 40 (1992) 1086-1094.
[22] M. Gendreau, A. Hertz, G. Laporte, The traveling salesman problem with backhauls, Computers & Operations Research 23 (1996) 501-508.
[23] F. Glover, Tabu search - Part I, ORSA Journal on Computing 1 (1989) 190-206.
[24] F. Glover, Tabu search - Part II, ORSA Journal on Computing 2 (1990) 4-32.
[25] F. Glover, M. Laguna, Tabu search, in: C. Reeves (Ed.), Modern Heuristic Techniques for Combinatorial Problems, Blackwell, Oxford, 1993, pp. 70-150.
[26] F. Glover, Tabu thresholding: Improved search by nonmonotonic trajectories, ORSA Journal on Computing 7 (1995) 426-442.
[27] F. Glover, M. Laguna, Tabu Search, Kluwer Academic Publishers, Boston, MA, 1997.
[28] A.D. Gordon, Classification: Methods for the Exploratory Analysis of Multivariate Data, Chapman & Hall, New York, 1981.
[29] I. Gutman, O.E. Polansky, Mathematical Concepts in Organic Chemistry, Springer, Berlin, 1986.
[30] P. Hansen, B. Jaumard, Algorithms for the maximum satisfiability problem, Computing 44 (1990) 279-303.
[31] P. Hansen, B. Jaumard, S. Krau, O. du Merle, A stabilized column generation algorithm for the multisource Weber problem (in preparation).
[32] P. Hansen, B. Jaumard, N. Mladenovic, A. Parreira, Variable neighborhood search for weighted maximum satisfiability (in preparation).
[33] P. Hansen, N. Mladenovic, Variable neighborhood search for the p-median, Location Science 5 (4) (1998) 207-226.
[34] P. Hansen, N. Mladenovic, J-Means: A new local search heuristic for minimum sum-of-squares clustering, Les Cahiers du GERAD G-99-14 and Pattern Recognition, forthcoming.
[35] P. Hansen, N. Mladenovic, An introduction to variable neighbourhood search, in: S. Voss et al. (Eds.), Meta-Heuristics: Advances and Trends in Local Search Paradigms for Optimization, Kluwer Academic Publishers, Dordrecht, 1999, pp. 433-458.
[36] P. Hansen, N. Mladenovic, D. Perez-Brito, Variable neighborhood decomposition search, Journal of Heuristics, forthcoming.
[37] A. Hertz, B. Jaumard, M. Poggi de Aragao, Local optima topology for the k-coloring problem, Discrete Applied Mathematics 49 (1994) 257-280.
[38] J.H. Holland, Adaptation in Natural and Artificial Systems, The University of Michigan Press, Ann Arbor, MI, 1975.
[39] R.C. Jancey, Multidimensional group analysis, Australian Journal of Botany 14 (1966) 127-130.
[40] D.S. Johnson, L.A. McGeoch, The traveling salesman problem: A case study in local optimization, in: E.H.L. Aarts, J.K. Lenstra (Eds.), Local Search in Combinatorial Optimization, Wiley, London, 1997, pp. 215-310.
[41] O. Kariv, S.L. Hakimi, An algorithmic approach to network location problems; Part 2, The p-medians, SIAM Journal on Applied Mathematics 37 (1979) 539-560.
[42] S. Kirkpatrick, C.D. Gelatt, M. Vecchi, Optimization by simulated annealing, Science 220 (1983) 671-680.
[43] S. Kirkpatrick, G. Toulouse, Configuration space analysis of traveling salesman problems, Journal de Physique 46 (1985) 1277-1292.
[44] S. Krau, Extensions du probleme de Weber, Ph.D. Thesis, Ecole Polytechnique de Montreal (under the direction of P. Hansen and B. Jaumard), 1997.
[45] R.E. Kuenne, R.M. Soland, Exact and approximate solutions to the multisource Weber problem, Mathematical Programming 3 (1972) 193-209.
[46] S. Lin, Computer solutions of the traveling salesman problem, Bell System Technical Journal 44 (1965) 2245-2269.
[47] S. Lin, B.W. Kernighan, An effective heuristic algorithm for the traveling salesman problem, Operations Research 21 (1973) 498-516.
[48] J.B. MacQueen, Some methods for classification and analysis of multivariate observations, in: Proceedings of the Fifth Berkeley Symposium on Mathematical Statistics and Probability 1 (1967) 281-297.
[49] P. Mirchandani, R. Francis (Eds.), Discrete Location Theory, Wiley, New York, 1990.
[50] N. Mladenovic, A variable neighborhood algorithm: A new metaheuristic for combinatorial optimization, Abstracts of Papers Presented at Optimization Days, Montreal, 1995, p. 112.
[51] N. Mladenovic, J.P. Moreno, J. Moreno-Vega, A chain-interchange heuristic method, Yugoslav Journal of Operations Research 6 (1) (1996) 41-54.
[52] N. Mladenovic, P. Hansen, Variable neighborhood search, Computers & Operations Research 24 (1997) 1097-1100.
[53] I.H. Osman, N. Christofides, Capacitated clustering problems by hybrid simulated annealing and tabu search, International Transactions in Operational Research 1 (3) (1994) 317-336.
[54] I.H. Osman, G. Laporte, Metaheuristics: A bibliography, Annals of Operations Research 63 (1996) 513-628.
[55] C.H. Papadimitriou, K. Steiglitz, Combinatorial Optimization: Algorithms and Complexity, Prentice-Hall, Englewood Cliffs, NJ, 1982.
[56] C. Reeves (Ed.), Modern Heuristic Techniques for Combinatorial Problems, Blackwell, Oxford, 1993.
[57] G. Reinelt, TSPLIB - A traveling salesman problem library, ORSA Journal on Computing 3 (1991) 376-384.
[58] M.G.C. Resende, L.S. Pitsoulis, P.M. Pardalos, Approximate solution of weighted MAX-SAT problems using GRASP, in: D. Du, J. Gu, P.M. Pardalos (Eds.), Satisfiability Problem: Theory and Applications, DIMACS Series in Discrete Mathematics and Theoretical Computer Science 35, American Mathematical Society, Providence, RI, 1997.
[59] E. Rolland, D.A. Schilling, J.R. Current, An efficient tabu search procedure for the p-median problem, European Journal of Operational Research 96 (1996) 329-342.
[60] K.E. Rosing, Private communication.
[61] K.E. Rosing, C.S. ReVelle, Heuristic concentration: Two stage solution construction, European Journal of Operational Research 97 (1997) 75-86.
[62] K.E. Rosing, C.S. ReVelle, E. Rolland, D.A. Schilling, J.R. Current, Heuristic concentration and tabu search: A head to head comparison, European Journal of Operational Research 104 (1998) 93-99.
[63] H. Spath, Cluster Dissection and Analysis (Theory, Fortran Programs, Examples), Ellis Horwood, Chichester, 1985.
[64] N. Thabet, Des algorithmes de generation de colonnes pour le probleme de la p-mediane, Master's thesis, Ecole des HEC (under the direction of P. Hansen), 1998.
[65] S. Voss, A reverse elimination approach for the p-median problem, Studies in Locational Analysis 8 (1996) 49-58.
[66] R. Whitaker, A fast algorithm for the greedy interchange for large-scale clustering and median location problems, INFOR 21 (1983) 95-108.