Abstract

The box-constrained weighted maximin dispersion problem is to find a point in an n-dimensional box such that the minimum of the weighted Euclidean distances to m given points is maximized. In this paper, we first reformulate the maximin dispersion problem as a non-convex quadratically constrained quadratic programming (QCQP) problem. We then adopt the successive convex approximation (SCA) algorithm to solve it. Numerical results show that the proposed algorithm is efficient.
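As an illustrative sketch (not necessarily the exact formulation used in the paper), the QCQP reformulation can be obtained by introducing an auxiliary variable t for the inner minimum; writing the box as l ≤ x ≤ u (the bounds l and u are notation introduced here) gives

\[
\begin{aligned}
\max_{x \in \mathbb{R}^n,\; t \in \mathbb{R}} \quad & t \\
\text{s.t.} \quad & \omega_i \,\| x - x_i \|^2 \;\ge\; t, \qquad i = 1, \dots, m, \\
& l \le x \le u,
\end{aligned}
\]

where the points $x_i$ and weights $\omega_i$ are as in (1) below. Each constraint $\omega_i \| x - x_i \|^2 \ge t$ excludes the interior of an ellipsoid, so the feasible region is non-convex even though the objective is linear.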

Highlights

  • The weighted maximin problem model with box constraints is as follows:
    \[
    \max_{x \in \chi} \;\Big\{ f(x) := \min_{i=1,\dots,m} \omega_i \,\| x - x_i \|^2 \Big\} \tag{1}
    \]
    where $\chi = \{\, y \in \mathbb{R}^n : (y_1^2, \dots, y_n^2, 1)^T \in \kappa \,\}$, $\kappa$ is a convex cone; $x_1, \dots, x_m \in \mathbb{R}^n$ are $m$ given points; these $m$ points are equivalent to $m$ locations; $\omega_i > 0$ for $i = 1, \dots, m$, and $\|\cdot\|$ denotes the Euclidean norm

  • The box-constrained weighted maximin dispersion problem is to find a point in an n-dimensional box such that the minimum of the weighted Euclidean distances to m given points is maximized

  • Numerical results show that the proposed algorithm is efficient

Summary

Introduction

The weighted maximin problem model with box constraints is as follows:
\[
\max_{x \in \chi} \;\Big\{ f(x) := \min_{i=1,\dots,m} \omega_i \,\| x - x_i \|^2 \Big\} \tag{1}
\]
where $\chi = \{\, y \in \mathbb{R}^n : (y_1^2, \dots, y_n^2, 1)^T \in \kappa \,\}$, $\kappa$ is a convex cone; $x_1, \dots, x_m \in \mathbb{R}^n$ are $m$ given points, which represent $m$ locations; $\omega_i > 0$ for $i = 1, \dots, m$, and $\|\cdot\|$ denotes the Euclidean norm. In [5], the authors consider the problem of finding a point in a unit $n$-dimensional $\ell_p$-ball ($p \ge 2$) such that the minimum of the weighted Euclidean distances to $m$ given points is maximized. It is shown in [6] that the SDP-relaxation-based approximation algorithm provides the first theoretical approximation bound of $1 - O\!\big(\sqrt{\ln(m)/n}\big)$. We model the maximin dispersion problem as a quadratically constrained quadratic program (QCQP), noting that (1) is a non-smooth, non-convex optimization problem, because the point-wise minimum of convex quadratics is non-differentiable and non-concave. We solve this problem with a general approximation framework, successive convex approximation (SCA), which can be summarized as follows: each quadratic component of (1) is locally linearized at the current iterate to construct a convex approximation function, so that each subproblem is convex.
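A minimal sketch of this SCA scheme for the box-constrained case is given below, assuming the epigraph reformulation sketched after the Abstract; linearizing each $\omega_i\|x - x_i\|^2$ at the current iterate makes every subproblem a linear program, solved here with SciPy. The function name sca_maximin and its arguments (lower, upper, x0, max_iter, tol) are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.optimize import linprog


def sca_maximin(points, weights, lower, upper, x0, max_iter=200, tol=1e-8):
    """SCA sketch for  max_{l <= x <= u}  min_i  w_i * ||x - x_i||^2.

    At iterate x_k, each convex quadratic ||x - x_i||^2 is replaced by its
    tangent (first-order lower bound) at x_k, so the epigraph subproblem
    'max t  s.t.  t <= surrogate_i(x),  l <= x <= u'  is a linear program.
    """
    X = np.asarray(points, dtype=float)        # (m, n): the given points x_1..x_m
    w = np.asarray(weights, dtype=float)       # (m,):   positive weights
    x = np.asarray(x0, dtype=float).copy()     # feasible starting point in the box
    m, n = X.shape

    def f(z):                                  # true objective min_i w_i ||z - x_i||^2
        return np.min(w * np.sum((z - X) ** 2, axis=1))

    for _ in range(max_iter):
        # tangent of w_i ||x - x_i||^2 at x_k:
        #   w_i ( 2 (x_k - x_i)^T x + ||x_i||^2 - ||x_k||^2 )
        # subproblem variables z = (x, t); maximize t  <=>  minimize -t
        c = np.zeros(n + 1)
        c[-1] = -1.0
        A_ub = np.hstack([-2.0 * w[:, None] * (x - X), np.ones((m, 1))])
        b_ub = w * (np.sum(X ** 2, axis=1) - np.sum(x ** 2))
        bounds = list(zip(lower, upper)) + [(0, None)]
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
        x_new = res.x[:n]
        if f(x_new) - f(x) < tol:              # ascent has stalled: stationary point
            x = x_new
            break
        x = x_new
    return x, f(x)


# tiny usage example with made-up data
if __name__ == "__main__":
    pts = np.array([[0.0, 0.0], [1.0, 1.0], [-1.0, 1.0]])
    x_star, val = sca_maximin(pts, [1.0, 1.0, 1.0],
                              lower=[-1, -1], upper=[1, 1], x0=[0.3, -0.2])
    print(x_star, val)
```

Because the tangent surrogates underestimate the true quadratics and are exact at the current iterate, the objective value is non-decreasing along the iterates, which is the usual argument for SCA converging to a stationary point of (1).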

Technical Preliminaries
Algorithm of Generation
Numerical Results
Conclusion