ISBN (print): 9781509056484
It is shown that the structuring of the solution set down to an acceptable one begins at the stage of building the initial set. At the same time, for consistent structuring of the diversity of the solution set during the synthesis of problem-solving algorithms, it is advisable to apply the principle of progressive reduction of the initial information uncertainty. At the first stage, the set of alternative solutions is structured into a set of valid solutions; at the second stage, the set of feasible solutions is restructured into the set of efficient solutions; finally, at the third stage a unique solution is selected from the set of efficient solutions. The paper discusses the possibility of splitting the probability distributions of the source data for a committee (serial) algorithm synthesizing a solution to the problem. A procedure is proposed for finding the optimal model estimate of a corrective algorithm to be composed with the basic (reference) algorithm so that their errors mutually compensate. A number of examples illustrate the use of the formulated approach. The article presents the results achieved in improving the efficiency and quality of the algebraic approach to solving processing, analysis and control tasks, which is a topical and promising direction in the design of modern information systems.
The service-based approach has been successfully applied to distributed environments, modelling them as pieces of functionality that exchange information by means of messages in order to achieve a common goal. The advantages of this approach can also be applied to distributed real-time systems, increasing their flexibility and allowing the creation of brand new applications from existing services in the system. If this is an online process, time-bounded composition algorithms are needed so as not to jeopardize the performance of the whole system. Different composition algorithms are studied and proposed: two of them optimal and two based on heuristics. This paper presents an analytical solution that selects, depending on the structure of the application and on the load of the whole system, the most suitable composition algorithm to execute in order to obtain a composed application in bounded time. Copyright (C) 2011 John Wiley & Sons, Ltd.
We study the Cutwidth problem, where the input is a graph G and the objective is to find a linear layout of the vertices that minimizes the maximum number of edges intersected by any vertical line inserted between two consecutive vertices. We give an algorithm for Cutwidth with running time O(2^k · n^O(1)), where k is the size of a minimum vertex cover of the input graph G and n is the number of vertices in G. As a corollary, our algorithm yields an O(2^(n/2) · n^O(1))-time algorithm for Cutwidth on bipartite graphs. This is the first non-trivial exact exponential-time algorithm for Cutwidth on a graph class where the problem remains NP-complete. Additionally, we show that Cutwidth parameterized by the size of the minimum vertex cover of the input graph does not admit a polynomial kernel unless NP ⊆ coNP/poly. Our kernelization lower bound contrasts with the recent results of Bodlaender et al. (ICALP, Springer, Berlin, 2011; SWAT, Springer, Berlin, 2012) that both Treewidth and Pathwidth parameterized by vertex cover do admit polynomial kernels.
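To make the objective concrete, the following sketch evaluates the cut profile of a given linear layout and finds the exact cutwidth by brute force over all layouts. This is only an O(n!) illustration of the problem definition, not the paper's O(2^k · n^O(1)) vertex-cover-parameterized algorithm; the function names and edge representation are my own.

```python
from itertools import permutations

def cutwidth_of_layout(edges, layout):
    """Max number of edges crossing any gap between consecutive vertices."""
    pos = {v: i for i, v in enumerate(layout)}
    width = 0
    for gap in range(len(layout) - 1):
        # an edge crosses the gap if its endpoints lie on opposite sides
        cut = sum(1 for u, v in edges
                  if min(pos[u], pos[v]) <= gap < max(pos[u], pos[v]))
        width = max(width, cut)
    return width

def cutwidth_bruteforce(vertices, edges):
    """Exact cutwidth by trying every layout -- illustration only."""
    return min(cutwidth_of_layout(edges, p) for p in permutations(vertices))
```

For example, a path on four vertices has cutwidth 1, while the star K_{1,3} has cutwidth 2 (the centre's edges must split across the gaps on either side of it).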
This paper presents a model for quality-of-service (QoS)-aware service composition in distributed systems with real-time and fault-tolerance requirements. This model can be applied in application domains such as remote monitoring, control and surveillance. Classic approaches to real-time systems do not provide the flexibility and fault-tolerance required in new emerging environments that need to combine a high degree of dynamism with temporal predictability. Our approach addresses these new challenges by combining concepts from the service-oriented paradigm and distributed real-time systems. We propose a concrete system model based on a holistic time-triggered approach for design and configuration. Based on this model, we propose two algorithms for the composition of QoS-aware service-based applications with temporal requirements: an exhaustive algorithm that computes the optimal service combination in terms of a figure of merit, suitable for offline composition; and an improved algorithm based on heuristics and partial figures of merit, suitable for online composition. Experimental results show that the latter dramatically reduces the number of combinations explored with minimal degradation in the quality of the solution, making it feasible for online execution in dynamic environments.
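The exhaustive offline algorithm described above can be sketched as an enumeration over all provider combinations, keeping the best figure of merit among those that meet the end-to-end deadline. The candidate data, the additive figure of merit, and all names below are illustrative assumptions, not the paper's actual model.

```python
from itertools import product

# One list of candidate providers per abstract service; each candidate is
# (worst_case_response_ms, quality_score).  Values are made up for illustration.
candidates = [
    [(10, 0.9), (5, 0.6)],              # service A
    [(20, 0.8), (12, 0.7), (8, 0.4)],   # service B
    [(15, 0.95), (9, 0.5)],             # service C
]

def compose_exhaustive(candidates, deadline_ms):
    """Optimal offline composition: enumerate every provider combination and
    keep the highest aggregate quality among those meeting the deadline."""
    best, best_quality = None, -1.0
    for combo in product(*candidates):
        wcrt = sum(t for t, _ in combo)     # end-to-end worst-case time
        quality = sum(q for _, q in combo)  # figure of merit (additive here)
        if wcrt <= deadline_ms and quality > best_quality:
            best, best_quality = combo, quality
    return best, best_quality
```

The heuristic online variant in the paper prunes this product space using partial figures of merit; the full enumeration above is exponential in the number of services, which is exactly why it is only suitable offline.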
Preprocessing (data reduction or kernelization) to reduce instance size is one of the most commonly deployed heuristics in implementation practice to tackle computationally hard problems. However, a systematic theoretical study of such algorithms has remained elusive so far. One of the reasons for this is that if every input to an NP-hard problem could be processed in polynomial time into an equivalent one of smaller size, then the preprocessing algorithm could be used to actually solve the problem in polynomial time, proving P = NP, which is considered unlikely. However, the situation regarding systematic study changed drastically with the advent of parameterized complexity, which provides a natural framework to analyse preprocessing algorithms. In a parameterized problem, every instance x comes with a positive integer, or parameter, k. The problem is said to admit a kernel if, in polynomial time, we can reduce the size of the instance x to a function of k while preserving the answer. The central notion in parameterized complexity is fixed-parameter tractability (FPT): solvability in f(k) · p(|x|) time for any given instance (x, k), where f is an arbitrary function of the parameter k and p is a polynomial in the input size |x|. It is well known that a parameterized problem Π is fixed-parameter tractable if and only if there exists a computable function g(k) such that Π admits a kernel of size g(k). However, the kernels obtained by this theoretical result are usually of exponential (or even worse) size, while problem-specific data reductions often achieve quadratic- or even linear-size kernels. So a natural question for any concrete FPT problem is whether it admits polynomial-time kernelization to a problem kernel that in the worst case is bounded by a polynomial function of the parameter. Despite several attempts, there are fixed-parameter tractable problems that have on...
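A classic concrete instance of the polynomial kernels mentioned above is Buss's kernelization for Vertex Cover(k): any vertex of degree greater than k must be in every size-≤k cover, and after exhausting that rule a YES-instance can retain at most k² edges. The sketch below implements this rule under those assumptions; function name and edge representation are my own.

```python
def vc_buss_kernel(edges, k):
    """Buss kernelization for Vertex Cover(k): repeatedly take any vertex of
    degree > k into the cover, then reject if more than k^2 edges remain.
    Returns (kernel_edges, remaining_k), or (None, k) for a provable NO."""
    edges = {frozenset(e) for e in edges}
    changed = True
    while changed and k >= 0:
        changed = False
        deg = {}
        for e in edges:
            for v in e:
                deg[v] = deg.get(v, 0) + 1
        for v, d in deg.items():
            if d > k:  # v is forced into every cover of size <= k
                edges = {e for e in edges if v not in e}
                k -= 1
                changed = True
                break
    if k < 0 or len(edges) > k * k:
        return None, k      # NO-instance: too many edges survive
    return edges, k         # kernel with at most k^2 edges
```

For a star with five leaves and k = 1, the high-degree rule takes the centre and the kernel becomes empty (YES); for a triangle with k = 1, more than k² edges survive and the instance is rejected.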
Complex behaviour generation for artificial systems requires huge implementation efforts that can be partially overcome by decomposing global tasks into simpler, well-specified behaviours that are easier to design and to tune independently of each other. Robot behaviour can be implemented as a set of fuzzy rules that mimic expert knowledge in specific tasks. The present work shows that a complex navigational behaviour such as "Safe follow wall" can be decomposed into three basic behaviours. Two combination methods are proposed and compared; they are controlled by fuzzy metarules that fuse the actions derived from the predefined elementary behaviours. These action combination/arbitration algorithms have been demonstrated in both simulated and real worlds. Some results are presented to display the major issues related to each of the proposed algorithms.
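One simple fusion scheme of the kind such metarules can drive is an activation-weighted average of the commands proposed by the elementary behaviours. This is a generic sketch of behaviour fusion, not the paper's specific metarules; the command format and all numbers are assumptions.

```python
def fuse_behaviours(commands):
    """Weighted fusion of elementary behaviours.  Each entry is
    (activation_weight, steering, speed) with weight in [0, 1]; the fused
    command is the activation-weighted average of the proposals."""
    total = sum(w for w, _, _ in commands)
    if total == 0:
        return 0.0, 0.0  # no behaviour active: stop
    steer = sum(w * s for w, s, _ in commands) / total
    speed = sum(w * v for w, _, v in commands) / total
    return steer, speed

# e.g. wall-following wants to go straight; obstacle avoidance pushes left
fused = fuse_behaviours([
    (0.8, 0.0, 0.5),    # follow wall: high activation, straight, cruise
    (0.4, -0.6, 0.2),   # avoid obstacle: lower activation, steer left, slow
])
```

An arbitration alternative, by contrast, would let the metarules hand full control to a single winning behaviour instead of blending commands; the trade-off between the two is exactly what the paper's comparison explores.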