Large-scale matrix inversion is widely used in many scientific research domains. As a classical method for inverting large matrices, the block-based Gauss-Jordan (BbGJ) algorithm has attracted considerable attention, and many parallel versions of BbGJ have been developed. However, coarse-grained task parallelism (parallelism based only on intra-iterative data dependences) prevents mapping more tasks onto computing resources simultaneously. As a result, the performance of those parallel versions degrades when running on a Grid platform consisting of many heterogeneous PCs and workstations. This paper therefore presents a fine-grained parallel BbGJ algorithm (Max-par BbGJ) in which both intra-iterative and inter-iterative data dependences are considered. Based on an analysis of the data dependences in BbGJ, a global flag is adopted to control all data dependences during algorithm execution. The flag is represented as a three-tuple. At the beginning, the logical values of all three-tuples are set to false; when the logical value of a three-tuple becomes true, the tasks governed by that three-tuple can be executed immediately and simultaneously. Max-par BbGJ aims to achieve maximum parallelism across the different parts of the computation. To evaluate its performance, YML, a new high-level parallel programming tool, is adopted for its attractive features such as component reuse, ease of use, and platform independence. The Grid environments are based on the Grid'5000 platform in France. Experiments show that better performance is achieved by executing more tasks simultaneously in Max-par BbGJ: it saves about 30% of execution time compared with the conventional algorithm. The experiments also validate that, despite a small overhead, YML remains an acceptable and easy-to-use tool for scientific researchers, especially non-specialists, to carry out large-scale computations.
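The abstract describes the flag as a set of three-tuples whose logical values gate task execution. The sketch below is one possible reading of that mechanism, assuming the tuple indexes something like (iteration k, block row i, block column j) and that a task may start once every flag it depends on is true. The names `DependenceFlags`, `run_fine_grained`, and the (k, i, j) keys are illustrative, not taken from the paper.

```python
# Illustrative sketch of a three-tuple dependence-flag scheduler (assumptions,
# not the paper's implementation): flags start False; a task becomes runnable
# when all three-tuples it depends on are True, and on completion it sets the
# three-tuples it produces to True, unlocking further tasks.
import threading
from concurrent.futures import ThreadPoolExecutor


class DependenceFlags:
    """Global flag table: one boolean per three-tuple, all False at start."""

    def __init__(self):
        self._flags = {}                # (k, i, j) -> bool
        self._lock = threading.Lock()

    def set_true(self, key):
        with self._lock:
            self._flags[key] = True

    def is_true(self, key):
        with self._lock:
            return self._flags.get(key, False)


def run_fine_grained(tasks, flags, max_workers=4):
    """Run every task whose three-tuple dependences are satisfied.

    `tasks` maps a task id to (deps, produces, fn):
      deps     - three-tuples that must already be True,
      produces - three-tuples the task sets to True when it finishes,
      fn       - the block computation (e.g. pivot inversion or block update).
    Scheduling here is wave-by-wave for brevity; a real runtime would launch
    each task as soon as its own flags flip.
    """
    pending = dict(tasks)
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        while pending:
            ready = [tid for tid, (deps, _, _) in pending.items()
                     if all(flags.is_true(d) for d in deps)]
            if not ready:
                raise RuntimeError("no runnable task: unmet or cyclic dependences")
            futures = {}
            for tid in ready:
                _, produces, fn = pending.pop(tid)
                futures[pool.submit(fn)] = produces
            for fut, produces in futures.items():
                fut.result()
                for key in produces:
                    flags.set_true(key)   # unlock dependent tasks in later waves
```

Under this reading, tasks with no unmet dependences (for example, inverting the first pivot block) launch immediately, and each completed block update flips the flags that allow updates in later iterations to start, which is what lets intra-iterative and inter-iterative work overlap.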