ISBN: (Print) 9781424400225
We consider the problem of partitioning a set of elements (or objects) into mutually exclusive classes (or groups), where elements that are "similar" to each other are, ideally, placed in the same class. This problem has been shown to be NP-hard, and the literature reports solutions in which the similarity constraint consists of a single index. For example, typical "similarity" conditions that have been used in the literature include those in which "similar" objects are accessed together, or in which they communicate (as processes do) with each other. In this paper, we present the first reported solution to the case in which the objects can be linked together in a multi-constraint manner, and indeed, we consider the scenario in which the constraints could, themselves, be contradictory. The solution we propose is based on the theory of estimator-based Learning Automata (LA) operating in non-stationary environments. Rather than use traditional estimates, we advocate the use of stochastic weak estimates [2] and the specific digraph properties of the relations between the elements. Although the proposed solutions perform admirably when the number of elements is small, the simulation results demonstrate that the quality of the final solution decreases as the number of elements grows. Thus, although this is the first reported solution to the problem that incorporates specific digraph properties of the objects, the scalability of the solution remains open.
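The abstract does not spell out the estimator update, but the stochastic weak estimator it cites is essentially an exponentially weighted average that can track a drifting (non-stationary) probability. The minimal Python sketch below shows only that update and a hypothetical linear combination of per-constraint estimates into a single similarity score; the combination weights and the toy drifting environment are illustrative assumptions, not the authors' actual learning automaton.

```python
import random

# SLWE-style stochastic weak estimator for a non-stationary Bernoulli stream:
# p_new = lam * p_old + (1 - lam) * x, an exponentially weighted average
# that can follow a drifting probability instead of converging to a fixed one.
def weak_estimate_update(p_hat, x, lam=0.9):
    return lam * p_hat + (1.0 - lam) * x

# Hypothetical multi-constraint combination: each constraint k keeps its own
# weak estimate of "objects i and j belong together"; here they are merged by
# a simple weighted sum (this rule is an assumption for illustration only).
def combined_similarity(estimates, weights):
    return sum(w * p for w, p in zip(weights, estimates))

if __name__ == "__main__":
    random.seed(0)
    # Two constraints whose true probabilities drift in opposite directions,
    # a toy stand-in for "contradictory" constraints.
    p_true = [0.8, 0.2]
    p_hat = [0.5, 0.5]
    for t in range(500):
        if t == 250:                      # the environment switches half-way
            p_true = [0.2, 0.8]
        for k in range(2):
            x = 1.0 if random.random() < p_true[k] else 0.0
            p_hat[k] = weak_estimate_update(p_hat[k], x)
    print("tracked estimates:", [round(p, 2) for p in p_hat])
    print("combined similarity:", round(combined_similarity(p_hat, [0.5, 0.5]), 2))
```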
Several successful partitioning models and methods have been proposed and used for computational load balancing of irregularly sparse applications in a distributed-memory setting. However, the literature lacks partitioning models and methods that encode both computational and data load balancing. In this article, we try to close this gap in the literature by proposing two hypergraph partitioning (HP) models which simultaneously encode computational and data load balancing. Both models utilize a two-constraint formulation, where the first constraint encodes the computational loads and the second constraint encodes the data loads. In the first model, we introduce explicit data vertices for encoding data load, and we replicate those data vertices at each recursive bipartitioning (RB) step for encoding data replication. In the second model, we introduce a data weight distribution scheme for encoding data load, and we update those weights at each RB step. A nice property of both proposed models is that they do not necessitate developing a new partitioner from scratch: both models can easily be implemented by invoking any HP tool that supports multi-constraint partitioning as a two-way partitioner at each RB step. The validity of the proposed models is tested on two widely used irregularly sparse applications: parallel mesh simulations and parallel sparse matrix-sparse matrix multiplication. Both proposed models achieve significant improvement over a baseline model.
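To make the RB structure concrete, the Python sketch below mimics the shape of the second model: each vertex carries a two-element weight vector (compute load, data load), every level performs a two-way split, and the data weights are re-scaled after each split. The `two_way_partition` and `update_data_weights` functions are hypothetical placeholders standing in for a real multi-constraint HP tool (such as PaToH or hMETIS) and for the paper's data weight distribution scheme, respectively; the greedy balancing and the re-scaling rule are assumptions made only to keep the example self-contained.

```python
# Sketch of two-constraint recursive bipartitioning (RB) with per-step data
# weight updates. Vertex weights are (compute_load, data_load) pairs.

def two_way_partition(vertices):
    """Placeholder two-way multi-constraint partitioner: greedily balances the
    summed weight vectors. A real HP tool would also minimize the cut size."""
    order = sorted(vertices, key=lambda v: sum(v["weights"]), reverse=True)
    parts, totals = ([], []), ([0.0, 0.0], [0.0, 0.0])
    for v in order:
        p = 0 if sum(totals[0]) <= sum(totals[1]) else 1
        parts[p].append(v)
        for c in range(2):
            totals[p][c] += v["weights"][c]
    return parts

def update_data_weights(part):
    """Placeholder for the data weight distribution scheme: re-scale the second
    (data) weight after a split. Halving is an assumed rule for illustration."""
    for v in part:
        v["weights"] = (v["weights"][0], 0.5 * v["weights"][1])
    return part

def recursive_bipartition(vertices, k):
    """Split into k parts by repeated two-way splits, updating data weights
    after every split, as in the second model's RB-based formulation."""
    if k == 1 or len(vertices) <= 1:
        return [vertices]
    left, right = two_way_partition(vertices)
    left, right = update_data_weights(left), update_data_weights(right)
    return recursive_bipartition(left, k // 2) + recursive_bipartition(right, k - k // 2)

if __name__ == "__main__":
    vs = [{"id": i, "weights": (1.0 + i % 3, 2.0 + i % 5)} for i in range(16)]
    for p, part in enumerate(recursive_bipartition(vs, 4)):
        print("part", p, [v["id"] for v in part])
```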