Maximizing a convex function over convex constraints is an NP-hard problem in general. We prove that such a problem can be reformulated as an adjustable robust optimization (ARO) problem in which each adjustable variable corresponds to a unique constraint of the original problem. We use ARO techniques to obtain approximate solutions to the convex maximization problem. In order to demonstrate the complete approximation scheme, we distinguish the case of a single nonlinear constraint from the case of multiple linear constraints. For the first case, we give three examples in which one can analytically eliminate the adjustable variable and approximately solve the resulting static robust optimization problem efficiently. More specifically, we show that the norm-constrained log-sum-exp (geometric) maximization problem can be approximated by (convex) exponential cone optimization techniques. For the second case of multiple linear constraints, the equivalent ARO problem can be represented as an adjustable robust linear optimization problem. Using linear decision rules then yields a safe approximation of the constraints. The resulting problem is a convex optimization problem, and solving it gives an upper bound on the global optimum value of the original problem. By using the optimal linear decision rule, we obtain a lower bound solution as well. We derive the approximation problems explicitly for quadratic maximization, geometric maximization, and sum-of-max-linear-terms maximization problems with multiple linear constraints. Numerical experiments show that, in contrast to state-of-the-art solvers, we can approximate large-scale problems quickly with tight bounds. In several cases, the upper and lower bounds coincide, which gives global optimality guarantees for those instances.
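The norm-constrained log-sum-exp maximization mentioned above has a simple exhaustive baseline when the norm is the infinity norm: a convex function over a polytope attains its maximum at a vertex, so for tiny dimensions one can enumerate the corners of the box and compare the result against the bounds produced by any approximation scheme. The sketch below is only that baseline under illustrative data (A, b, and the radius r are made up); it is not the paper's ARO/linear-decision-rule construction.

```python
# Exhaustive baseline for tiny instances of norm-constrained log-sum-exp
# maximization:  maximize  log(sum_i exp(a_i^T x + b_i))  s.t.  ||x||_inf <= r.
# A convex objective over a polytope attains its maximum at a vertex, so for
# the infinity-norm ball we may enumerate the 2^n corners.  This exponential
# enumeration is exactly what the paper's approximation scheme avoids.
# A, b, and r below are arbitrary illustrative data, not taken from the paper.
import itertools
import numpy as np
from scipy.special import logsumexp

def lse_max_on_box(A, b, r):
    """Exact maximum of log-sum-exp(Ax + b) over ||x||_inf <= r (tiny n only)."""
    n = A.shape[1]
    best_val, best_x = -np.inf, None
    for signs in itertools.product([-r, r], repeat=n):
        x = np.array(signs, dtype=float)
        val = logsumexp(A @ x + b)
        if val > best_val:
            best_val, best_x = val, x
    return best_val, best_x

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 4))   # 5 exponential terms, 4 variables
b = rng.standard_normal(5)
val, x = lse_max_on_box(A, b, r=1.0)
print(f"global maximum {val:.4f} attained at vertex {x}")
```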
The simplicial algorithm is a popular branch-and-bound approach to the convex maximization problem with multiple local maxima. In this paper, we discuss some difficulties revealed when implementing this algorithm under the omega-subdivision rule. To overcome these difficulties, we modify the bounding process and extend the omega-subdivision rule. We also report numerical results for the simplicial algorithm under the new subdivision rule.
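A minimal sketch of the bounding step that a simplicial branch-and-bound scheme of this kind relies on: for convex f and a simplex with vertices v_0, ..., v_n, Jensen's inequality gives f(Σ λ_i v_i) ≤ Σ λ_i f(v_i), so maximizing the right-hand side over the feasible part of the simplex is a linear program whose optimum is an upper bound and whose maximizer ω is the usual ω-subdivision point. The objective, polytope, and simplex below are illustrative, and this is the textbook bounding step rather than the authors' modified process.

```python
# One bounding step of a generic simplicial branch-and-bound scheme.
# For convex f on a simplex conv(V), the affine interpolant of the vertex
# values overestimates f (Jensen), so maximizing it over the feasible part
# of the simplex (an LP in the barycentric coordinates) gives an upper bound;
# the maximizer omega is the natural omega-subdivision point.
import numpy as np
from scipy.optimize import linprog

def simplex_upper_bound(f, V, G, h):
    """Upper bound for max f(x) over {x: Gx <= h} intersected with conv(V).

    V has shape (n+1, n): one simplex vertex per row."""
    fV = np.array([f(v) for v in V])              # vertex values of f
    A_ub = G @ V.T                                # G (V^T lam) <= h
    A_eq = np.ones((1, V.shape[0]))               # barycentric weights sum to 1
    res = linprog(-fV, A_ub=A_ub, b_ub=h, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * V.shape[0])
    lam = res.x
    omega = V.T @ lam                             # candidate subdivision point
    return -res.fun, omega

f = lambda x: float(x @ x)                            # convex objective ||x||^2
V = np.array([[0.0, 0.0], [2.0, 0.0], [0.0, 2.0]])    # simplex covering D
G = np.array([[1.0, 1.0]]); h = np.array([1.5])       # D: x1 + x2 <= 1.5, x >= 0
ub, omega = simplex_upper_bound(f, V, G, h)
print(ub, omega)   # upper bound 3.0 versus true maximum 2.25
```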
We consider mathematical programming problems with so-called piecewise convex objective functions. A solution method for this interesting and important class of nonconvex problems is presented. This method is based on Newton's law of universal gravitation, multicriteria optimization, and Helly's theorem on convex bodies. Numerical experiments using well-known classes of test problems for piecewise convex maximization and convex maximization, as well as the maximum clique problem, show the efficiency of the approach.
In this article we provide an algorithm in which, to escape from a local maximum y of a convex function f over D, we (locally) solve the piecewise convex maximization problem max{min{f(x) − f(y), p_y(x)} | x ∈ D} with an additional convex function p_y(·). The latter problem can be seen as a strictly convex improvement of the standard cutting plane technique for convex maximization. We report some computational results that show the efficiency of the algorithm.
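The escape criterion can be checked numerically on a one-dimensional toy instance: if the maximum of min{f(x) − f(y), p_y(x)} over D is strictly positive, the maximizer is a feasible point strictly better than y. The choice of f, D, the ball-type cut p_y, and the grid search below are all illustrative stand-ins, not the paper's algorithmic components.

```python
# Tiny numeric illustration of the escape criterion: at a local maximum y of
# convex f over D, maximize min{f(x) - f(y), p_y(x)}.  A strictly positive
# optimal value certifies a point better than y.  The particular f, D, convex
# cut p_y, and grid search here are illustrative only.
import numpy as np

f = lambda x: (x - 0.3) ** 2              # convex, two local maxima on D = [0, 1]
y = 0.0                                   # local (not global) maximum of f over D
rho = 0.2
p_y = lambda x: (x - y) ** 2 - rho ** 2   # convex cut: negative near y

grid = np.linspace(0.0, 1.0, 1001)        # D = [0, 1]
phi = np.minimum(f(grid) - f(y), p_y(grid))
x_new = grid[np.argmax(phi)]

if phi.max() > 0:
    print(f"escaped: f({x_new:.3f}) = {f(x_new):.4f} > f(y) = {f(y):.4f}")
else:
    print("no improving point found; y may be globally optimal")
```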
In this note we establish a relation between two bounds for convex maximization problems: one based on a concavity cut, and the surrogate dual bound. Both bounds have been known for a few decades but, to the authors' knowledge, the relation between them has not been previously observed in the literature.
Many important classes of decision models give rise to the problem of finding a global maximum of a convex function over a convex set. This problem is also known as concave minimization, concave programming, or convex maximization. Such problems can have many local maxima, so finding the global maximum is a computationally difficult problem for which standard nonlinear programming procedures fail. In this article, we provide a very simple and practical approach to finding the global solution of quadratic convex maximization problems over a polytope. A convex function achieves its global maximum at an extreme point of the feasible domain. Since an inscribed ball does not contain any extreme points of the domain, we use the largest inscribed ball for an inner approximation, while a minimal enclosing box is exploited for an outer approximation of the domain. The approach is based on the use of these approximations along with a standard local search algorithm and cutting plane techniques.
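The two geometric ingredients named here, the largest inscribed ball and a minimal enclosing box, are both computable by linear programming for a polytope given in the form {x : Ax ≤ b}. The sketch below implements just these generic building blocks (the Chebyshev-center LP plus 2n bounding LPs) on an arbitrary example polytope; it is not the article's full local-search-plus-cutting-plane procedure.

```python
# Generic building blocks: largest inscribed ball (Chebyshev center, one LP)
# and minimal axis-aligned enclosing box (2n LPs) of a polytope {x : Ax <= b}.
# The example polytope is arbitrary; this is not the authors' full algorithm.
import numpy as np
from scipy.optimize import linprog

def chebyshev_ball(A, b):
    """Center and radius of the largest ball inside {x : Ax <= b}."""
    m, n = A.shape
    norms = np.linalg.norm(A, axis=1)
    # variables (x, r): maximize r  s.t.  a_i^T x + ||a_i|| r <= b_i
    c = np.zeros(n + 1); c[-1] = -1.0
    A_ub = np.hstack([A, norms[:, None]])
    res = linprog(c, A_ub=A_ub, b_ub=b, bounds=[(None, None)] * n + [(0, None)])
    return res.x[:n], res.x[n]

def enclosing_box(A, b):
    """Componentwise lower/upper bounds of {x : Ax <= b} (assumed bounded)."""
    n = A.shape[1]
    lo, hi = np.empty(n), np.empty(n)
    for j in range(n):
        e = np.zeros(n); e[j] = 1.0
        lo[j] = linprog(e, A_ub=A, b_ub=b, bounds=[(None, None)] * n).fun
        hi[j] = -linprog(-e, A_ub=A, b_ub=b, bounds=[(None, None)] * n).fun
    return lo, hi

# unit triangle {x >= 0, x1 + x2 <= 1} as an illustrative polytope
A = np.array([[-1.0, 0.0], [0.0, -1.0], [1.0, 1.0]])
b = np.array([0.0, 0.0, 1.0])
print(chebyshev_ball(A, b))   # center ~ (0.293, 0.293), radius ~ 0.293
print(enclosing_box(A, b))    # box [0, 1] x [0, 1]
```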
In this article, we provide a global search algorithm for maximizing a piecewise convex function F over a compact set D. We propose to iteratively refine the function F at a local solution y by a virtual cutting function p_y(·) and to solve max{min{F(x) − F(y), p_y(x)} | x ∈ D} instead. We call this function either a patch, when it avoids returning to the same local solutions, or a pseudo patch, when it possibly yields a better point. It is virtual in the sense that the role of cutting constraints is played by additional convex pieces in the objective function. We report some computational results that represent an improvement on previous linearization-based techniques.
The simplicial algorithm is a kind of branch-and-bound method for computing a globally optimal solution of a convex maximization problem. Its convergence under the omega-subdivision strategy was an open question for some decades until Locatelli and Raber proved it (J Optim Theory Appl 107: 69-79, 2000). In this paper, we modify their linear programming relaxation and give a different and simpler proof of the convergence. We also develop a new convergent subdivision strategy and report numerical results comparing it with existing strategies.
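For reference, the omega-subdivision step itself is simple to state: once the relaxation over a simplex returns a point ω, the simplex is split by replacing, in turn, each vertex whose barycentric coordinate at ω is positive. The sketch below implements this textbook form of the rule on an example simplex; it is not the new strategy developed in the paper.

```python
# Textbook omega-subdivision: given a point omega returned by the relaxation
# over a simplex, create one child simplex per vertex with positive barycentric
# weight at omega, obtained by replacing that vertex with omega.
import numpy as np

def omega_subdivide(V, omega, tol=1e-9):
    """Split the simplex with vertex rows V at the point omega."""
    # barycentric coordinates: solve [V^T; 1 ... 1] lam = [omega; 1]
    M = np.vstack([V.T, np.ones(V.shape[0])])
    rhs = np.append(omega, 1.0)
    lam, *_ = np.linalg.lstsq(M, rhs, rcond=None)
    children = []
    for i, w in enumerate(lam):
        if w > tol:                      # only vertices "seen" by omega
            child = V.copy()
            child[i] = omega
            children.append(child)
    return children

V = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])   # unit triangle
for child in omega_subdivide(V, omega=np.array([0.25, 0.25])):
    print(child)                                      # three child simplices
```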
In this paper we consider the problem of assigning frequencies to mobile terminals in a cellular network. We show that an optimal solution can be obtained by solving a sequence of alternating linear and quadratic maximization programming problems. We address co-channel constraints and adopt as the objective function the maximization of potentially established calls. Our algorithm is fairly general and does not depend on any special network structure. This study indicates that mathematical programming can be used as an efficient technique for solving the aforementioned problem.
We prove polynomial-time solvability of a large class of clustering problems where a weighted set of items has to be partitioned into clusters with respect to some balancing constraints. The data points are weighted with respect to different features, and the clusters adhere to given lower and upper bounds on the total weight of their points with respect to each of these features. Further, the weight contribution of a vector to a cluster can depend on the cluster it is assigned to. Our interest in these types of clustering problems is motivated by an application in land consolidation, where the ability to perform this kind of balancing is crucial. Our framework maximizes an objective function that is convex in the summed-up utility of the items in each cluster. Despite the hardness of convex maximization and many related problems, for fixed dimension and number of clusters we are able to show that our clustering model is solvable in time polynomial in the number of items if the weight-balancing restrictions are defined using vectors from a fixed, finite domain. We conclude our discussion with a new, efficient model and algorithm for land consolidation.