We present here a new randomized algorithm for repairing the topology of objects represented by 3D binary digital images. By "repairing the topology", we mean a systematic way of modifying a given binary image in order to produce a similar binary image which is guaranteed to be well-composed. A 3D binary digital image is said to be well-composed if, and only if, the square faces shared by background and foreground voxels form a 2D manifold. Well-composed images enjoy some special properties which make them very desirable in practical applications. For instance, well-known algorithms for extracting surfaces from and thinning binary images can be simplified and optimized for speed if the input image is assumed to be well-composed. Furthermore, some algorithms for computing surface curvature and extracting adaptive triangulated surfaces directly from the binary data can only be applied to well-composed images. Finally, we introduce an extension of the aforementioned algorithm to the repair of 3D digital multivalued images. Such an algorithm finds application in repairing segmented images resulting from multi-object segmentations of other 3D digital multivalued images.
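Well-composedness can be tested locally. The sketch below (Python with NumPy; the function name and scanning strategy are ours) checks a binary volume against the two critical configurations commonly used to characterize 3D well-composedness, assuming that characterization: a checkerboard 2x2 square in any axis-aligned plane, and a 2x2x2 block whose only two foreground (or only two background) voxels sit on a main diagonal. It illustrates the property being repaired, not the repair algorithm of the paper.

    import numpy as np

    def is_well_composed(img):
        """Check 3D well-composedness by scanning for the two critical
        configurations: a checkerboard 2x2 square (C1) and a 2x2x2 block whose
        only two foreground, or only two background, voxels are diagonally
        opposite corners (C2)."""
        img = np.asarray(img, dtype=bool)
        Z, Y, X = img.shape
        # C1: a 2x2 square, in a plane perpendicular to any axis, whose diagonal
        # voxels agree with each other but disagree with the anti-diagonal ones.
        for axis in range(3):
            a = np.moveaxis(img, axis, 0)
            p, q = a[:, :-1, :-1], a[:, :-1, 1:]
            r, s = a[:, 1:, :-1], a[:, 1:, 1:]
            if np.any((p == s) & (q == r) & (p != q)):
                return False
        # C2: a 2x2x2 block with exactly two foreground (or exactly two
        # background) voxels that share only a corner of the block.
        for z in range(Z - 1):
            for y in range(Y - 1):
                for x in range(X - 1):
                    block = img[z:z+2, y:y+2, x:x+2]
                    for b in (block, ~block):
                        if b.sum() == 2:
                            idx = np.argwhere(b)
                            if np.abs(idx[0] - idx[1]).sum() == 3:
                                return False
        return True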
Many algorithms and data structures employing hashing have been analyzed under the uniform hashing assumption, i.e., the assumption that hash functions behave like truly random functions. Starting with the discovery of universal hash functions, many researchers have studied to what extent this theoretical ideal can be realized by hash functions that do not take up too much space and can be evaluated quickly. In this paper we present an almost ideal solution to this problem: a hash function h : U → V that, on any set of n inputs, behaves like a truly random function with high probability, can be evaluated in constant time on a RAM, and can be stored in (1+ε) n lg |V| + O(n + lg lg |U|) bits. Here ε can be chosen to be any positive constant, so this essentially matches the entropy lower bound. For many hashing schemes this is the first hash function that makes their uniform hashing analysis come true, with high probability, without incurring overhead in time or space.
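As a point of reference for "fast to evaluate, small to store" hash families, the sketch below implements simple tabulation hashing for 32-bit keys. This is a standard textbook construction, not the construction of the paper (it is only 3-wise independent, far from full uniform-hashing behaviour), but it shows the kind of constant-time, table-based evaluation the abstract aims to combine with near-optimal space.

    import random

    # Simple tabulation hashing: split a 32-bit key into 4 bytes and XOR together
    # one random table entry per byte. Evaluation is constant time and the
    # tables occupy 4 * 256 words.
    class TabulationHash:
        def __init__(self, out_bits=32, seed=None):
            rng = random.Random(seed)
            self.mask = (1 << out_bits) - 1
            self.tables = [[rng.getrandbits(out_bits) for _ in range(256)]
                           for _ in range(4)]

        def __call__(self, key):
            h = 0
            for i in range(4):
                h ^= self.tables[i][(key >> (8 * i)) & 0xFF]
            return h & self.mask

    h = TabulationHash(seed=1)
    print([h(k) for k in (0, 1, 2, 42, 2**31)])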
We study the problem of finding a deepest point in an arrangement of regions and provide a fast random-sampling algorithm for it, showing that it suffices to solve the problem when the deepest point is shallow. This implies, among other results, a fast algorithm for approximately solving linear programming problems with violations. We also use this technique to approximate the disk covering the largest number of red points while avoiding all the blue points, given two such sets in the plane. Similar techniques imply that approximate range counting queries have roughly the same time and space complexity as emptiness range queries.
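The sampling idea can be illustrated numerically. In the hedged toy below (our own setup, not the paper's algorithm), the depth of a point is the number of random disks containing it; keeping each disk independently with probability p scales every point's depth by roughly p, so a deep point in the full arrangement stays relatively deep, and cheaper to find, in the much smaller sample.

    import random

    random.seed(0)
    # 2000 random disks (center_x, center_y, radius) in a 10 x 10 square
    disks = [(random.uniform(0, 10), random.uniform(0, 10), random.uniform(1, 3))
             for _ in range(2000)]

    def depth(x, y, ds):
        """Number of disks in ds that contain the point (x, y)."""
        return sum((x - cx) ** 2 + (y - cy) ** 2 <= r * r for cx, cy, r in ds)

    # brute-force evaluation on a coarse grid of candidate points
    grid = [(i * 0.5, j * 0.5) for i in range(21) for j in range(21)]
    full_best = max(grid, key=lambda q: depth(q[0], q[1], disks))

    p = 0.1
    sample = [d for d in disks if random.random() < p]
    samp_best = max(grid, key=lambda q: depth(q[0], q[1], sample))

    print("depth of deepest grid point (all disks):    ", depth(*full_best, disks))
    print("same point's expected depth in a p-sample:  ",
          round(p * depth(*full_best, disks), 1))
    print("depth of deepest grid point (sampled disks):", depth(*samp_best, sample))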
In this paper, we address the problem of analyzing the performance of an electrical circuit in the presence of uncertainty in the network components. In particular, we consider the case when the uncertainties are known to be bounded and of a probabilistic nature, and we aim at evaluating the probability that a given system property holds. In contrast with the standard Monte Carlo approach, which uses random samples of the uncertainty to estimate "soft" bounds on this probability, we present a methodology that provides "hard" (deterministic) upper and lower bounds. To this aim, we develop an iterative algorithm, based on a property oracle, which is shown to converge asymptotically to the true probability of property satisfaction. The construction of property oracles for specific applications in circuit analysis is presented explicitly. In particular, we study in full detail the problems of assessing the probability that the gain of a purely resistive network does not exceed a prescribed value, and of evaluating the probability of stability of an uncertain network under parameter variations. The paper is accompanied by illustrative examples and extensive numerical simulations.
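A minimal sketch of the "hard bounds via a property oracle" idea, under our own simplifying assumptions (a two-resistor divider, uniform uncertainty on a box, and monotonicity-based interval bounds standing in for the paper's oracle construction): sub-boxes certified to satisfy or violate the property contribute to deterministic lower and upper bounds on the probability, and bisecting the undecided boxes tightens both bounds.

    # Uncertain resistors R1, R2 are uniform on a box; the property is that the
    # divider gain R2/(R1+R2) does not exceed G_MAX.
    G_MAX = 0.6

    def oracle(box):
        """Certify a sub-box as SAT (property holds everywhere), UNSAT (fails
        everywhere) or UNKNOWN, using monotonicity of the gain."""
        (r1_lo, r1_hi), (r2_lo, r2_hi) = box
        g_lo = r2_lo / (r1_hi + r2_lo)   # gain increases in R2, decreases in R1
        g_hi = r2_hi / (r1_lo + r2_hi)
        if g_hi <= G_MAX:
            return "SAT"
        if g_lo > G_MAX:
            return "UNSAT"
        return "UNKNOWN"

    def volume(box):
        return (box[0][1] - box[0][0]) * (box[1][1] - box[1][0])

    def bisect(box):
        """Split the box along its longest side."""
        i = 0 if (box[0][1] - box[0][0]) >= (box[1][1] - box[1][0]) else 1
        lo, hi = box[i]
        mid = 0.5 * (lo + hi)
        left, right = list(box), list(box)
        left[i], right[i] = (lo, mid), (mid, hi)
        return tuple(left), tuple(right)

    root = ((8.0, 12.0), (9.0, 15.0))    # R1 in [8,12] ohm, R2 in [9,15] ohm
    total, sat, unsat = volume(root), 0.0, 0.0
    queue = [root]
    for _ in range(2000):                # fixed bisection budget
        if not queue:
            break
        box = queue.pop()
        tag = oracle(box)
        if tag == "SAT":
            sat += volume(box)
        elif tag == "UNSAT":
            unsat += volume(box)
        else:
            queue.extend(bisect(box))

    # unprocessed UNKNOWN boxes only widen the gap; the bounds stay valid
    print("P(property) in [%.4f, %.4f]" % (sat / total, 1.0 - unsat / total))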
Dynamic connection graph changes are inherent in networks such as peer-to-peer networks, wireless ad hoc networks, and wireless sensor networks. Accounting for these frequent graph changes is thus essential for precisely assessing the performance of applications and algorithms on such networks. In this paper, using stochastic hybrid systems (SHSs), we model the dynamics and analyze the performance of an epidemic-like algorithm, Distributed Random Grouping (DRG), for average aggregate computation on a wireless sensor network with dynamic graph changes. In particular, we derive convergence criteria and upper bounds on the running time of the DRG algorithm for a set of graphs that are individually disconnected but jointly connected over time. An effective technique for computing a key parameter in the derived bounds is also developed. Numerical results, together with an application of our analytical results to controlling the graph sequences, are presented to illustrate the analysis.
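To make the scheme concrete, the following hedged simulation runs simplified DRG-style rounds on a randomly redrawn topology: every node independently volunteers as a group leader, each leader groups with its currently unassigned neighbors, and all group members replace their values by the group average. Group averaging preserves the global sum, so the values converge to the true average when the graph sequence is jointly connected. The parameters and graph model below are ours, and the sketch illustrates the algorithm being analyzed, not the SHS analysis itself.

    import random

    random.seed(1)
    n, p_leader, rounds = 20, 0.2, 200
    values = [random.uniform(0.0, 100.0) for _ in range(n)]
    target = sum(values) / n

    def random_graph(n, p_edge=0.2):
        """One snapshot of the dynamic topology (edges redrawn every round)."""
        adj = {i: set() for i in range(n)}
        for i in range(n):
            for j in range(i + 1, n):
                if random.random() < p_edge:
                    adj[i].add(j)
                    adj[j].add(i)
        return adj

    for _ in range(rounds):
        adj = random_graph(n)                       # dynamic graph change
        free = set(range(n))                        # nodes not yet in a group
        for leader in range(n):
            if leader in free and random.random() < p_leader:
                group = {leader} | (adj[leader] & free)
                free -= group
                avg = sum(values[i] for i in group) / len(group)
                for i in group:                     # local averaging step
                    values[i] = avg

    print("true average:", round(target, 3))
    print("max deviation after %d rounds: %.6f"
          % (rounds, max(abs(v - target) for v in values)))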
ISBN (print): 9783540212362
Recently, Charikar et al. investigated the problem of evaluating AND/OR trees with non-uniform costs on their leaves from the perspective of competitive analysis. For an AND/OR tree T they presented a μ(T)-competitive deterministic polynomial-time algorithm, where μ(T) is the number of leaves that must be read, in the worst case, in order to determine the value of T. Furthermore, they proved that μ(T) is a lower bound on the deterministic competitiveness, which establishes the optimality of their algorithm. The power of randomization in this context has remained an open question. Here, we take a step towards solving this problem by presenting a (5/6)μ(T)-competitive randomized polynomial-time algorithm. This contrasts with the best known lower bound of μ(T)/2. (c) 2008 Elsevier B.V. All rights reserved.
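For readers unfamiliar with the setting, the sketch below shows the classical randomized evaluation of an AND/OR tree: recurse into the children in random order and stop as soon as the node's value is forced. This is the textbook uniform-cost heuristic, not the cost-aware competitive algorithm of the paper, but it conveys why randomizing the reading order reduces the number of leaf reads against adversarial inputs.

    import random

    def evaluate(node, reads):
        """node is ('LEAF', value) or (op, [children]) with op in {'AND', 'OR'}."""
        if node[0] == 'LEAF':
            reads[0] += 1              # count how many leaves were read
            return node[1]
        op, children = node
        children = list(children)
        random.shuffle(children)       # random order defeats worst-case inputs
        for child in children:
            v = evaluate(child, reads)
            if op == 'AND' and v == 0:
                return 0               # short-circuit: one 0 decides an AND
            if op == 'OR' and v == 1:
                return 1               # short-circuit: one 1 decides an OR
        return 1 if op == 'AND' else 0 # all children agreed with the identity

    tree = ('AND', [('OR', [('LEAF', 0), ('LEAF', 1)]),
                    ('OR', [('LEAF', 1), ('LEAF', 0)]),
                    ('LEAF', 1)])
    random.seed(3)
    reads = [0]
    print("value:", evaluate(tree, reads), "leaves read:", reads[0])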
In this paper, we consider a constrained sequential resource allocation problem in which an individual needs to accomplish a task by repeatedly guessing/investing a sufficient level of effort/input. If the investment falls short of a minimum required level that is unknown to the individual, she fails; with each unsuccessful attempt, the individual then increases the input and tries again until she succeeds. The objective is to complete the task with as little resource/cost as possible, subject to a delay constraint. The optimal strategy lies in the proper balance between 1) selecting a level (far) below the minimum required and therefore having to try again, thus wasting resources, and 2) selecting a level (far) above the minimum required and therefore overshooting, again wasting resources. A number of motivating applications arising from communication networks are provided. Assuming that the individual has no knowledge of the distribution of the minimum effort required to complete the task, we adopt a worst-case cost measure and a worst-case delay measure to formulate the above constrained optimization problem. We derive a class of optimal strategies, shown to be randomized, and obtain their performance as a function of the constraint.
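The flavour of the problem can be reproduced with a toy strategy. The sketch below (our own illustration, not the optimal delay-constrained strategy derived in the paper) guesses a geometric sequence with a uniformly random phase against an unknown threshold b, charges the sum of all failed and successful guesses, and empirically measures the worst expected-cost-to-b ratio; randomizing the phase is what smooths out the worst case over b.

    import random

    def run(b, r=2.0, rng=random.Random(0)):
        """Guess r**(k + U) for k = 0, 1, 2, ... with a random phase U in [0,1);
        a guess succeeds iff it is at least b. Return (total cost, attempts)."""
        phase = rng.random()
        k, cost, attempts = 0, 0.0, 0
        while True:
            guess = r ** (k + phase)
            cost += guess                  # failed attempts are wasted too
            attempts += 1
            if guess >= b:
                return cost, attempts
            k += 1

    worst = 0.0
    for b in [1.5 ** i for i in range(1, 30)]:
        avg_cost = sum(run(b, rng=random.Random(s))[0] for s in range(200)) / 200
        worst = max(worst, avg_cost / b)   # compare with the clairvoyant cost b
    print("worst observed (expected cost)/b ratio: %.3f" % worst)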
It is becoming increasingly apparent that probabilistic approaches can overcome the conservatism and computational complexity of the classical worst-case deterministic framework and may lead to designs that are actually safer. In this paper we argue that a comprehensive probabilistic robustness analysis requires a detailed evaluation of the robustness function, and we show that such an evaluation can be performed with essentially any desired accuracy and confidence using algorithms whose complexity is linear in the dimension of the uncertainty space. Moreover, we show that the average memory requirements of such algorithms are absolutely bounded and well within the capabilities of today's computers. In addition to efficiency, our approach permits control over the statistical sampling error and the error due to discretization of the uncertainty radius. For a specific level of tolerance of the discretization error, our techniques provide an efficiency improvement over conventional methods that is inversely proportional to the accuracy level; i.e., our algorithms get better as the demands for accuracy increase.
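The sketch below illustrates the object being computed, not the paper's linear-complexity algorithm: for each uncertainty radius it estimates the probability that a (toy, made-up) property holds by Monte Carlo sampling, with the sample size chosen from the standard Hoeffding bound so that the stated accuracy and confidence hold.

    import math, random

    eps, delta = 0.02, 1e-3
    # Hoeffding sample size: N >= ln(2/delta) / (2 * eps^2)
    N = math.ceil(math.log(2.0 / delta) / (2.0 * eps ** 2))

    def property_holds(q):
        """Toy uncertain system: the property holds iff a perturbed gain stays
        below 1 (this stands in for a real robustness property)."""
        return 0.6 + q[0] * 0.3 + q[1] * 0.2 < 1.0

    rng = random.Random(0)
    for rho in (0.25, 0.5, 1.0, 1.5, 2.0):
        hits = 0
        for _ in range(N):
            # uniform sample from the radius-rho box [-rho, rho]^2
            q = (rng.uniform(-rho, rho), rng.uniform(-rho, rho))
            hits += property_holds(q)
        print("rho = %.2f  estimated P(property) = %.3f  (+/- %.2f, conf %.3f)"
              % (rho, hits / N, eps, 1.0 - delta))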
This article investigates the following problem: given the fractional relaxation of the edge-disjoint routing problem, how small a fractional congestion is sufficient to guarantee efficient edge-disjoint routing? That is, what is the largest possible value v such that a fractional flow with congestion at most v can be efficiently converted into an edge-disjoint routing? Leighton, Lu, Rao, and Srinivasan (SIAM J Comput 2001) established that a fractional congestion of at most the order of O(1/(d log k)) is sufficient, where d is the maximum path length in the fractional relaxation and k is the number of pairs to be routed. It is also known that Θ(1/d) is the correct bound if we are only interested in an existence result (Leighton, Rao, and Srinivasan, Hawaii International Conference on System Sciences, 1998). Motivated by the fact that d is small for many types of routing problems, in particular polylogarithmic for expander graphs, this article improves upon the former result by showing that a fractional congestion of O(1/(d log d)) suffices. (C) 2007 Wiley Periodicals, Inc.
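The generic rounding step behind such conversions can be sketched in a few lines. The toy below (in the spirit of Raghavan-Thompson randomized rounding; the instance and identifiers are made up, and this is not the specific argument of the article) decomposes each pair's fractional flow into weighted paths, lets every pair pick one path with probability proportional to its weight, and reports the congestion of the resulting integral routing.

    import random
    from collections import Counter

    random.seed(0)

    # toy fractional solution: pair -> list of (path, weight), weights summing
    # to 1, where a path is a tuple of edge identifiers
    fractional = {
        "s1-t1": [(("e1", "e2"), 0.5), (("e3", "e4"), 0.5)],
        "s2-t2": [(("e2", "e5"), 0.3), (("e6", "e7"), 0.7)],
        "s3-t3": [(("e4", "e8"), 0.4), (("e5", "e9"), 0.6)],
    }

    def fractional_congestion(frac):
        load = Counter()
        for paths in frac.values():
            for path, w in paths:
                for e in path:
                    load[e] += w
        return max(load.values())

    def round_once(frac):
        """Each pair picks one path with probability proportional to its weight;
        return the maximum edge load of the resulting integral routing."""
        load = Counter()
        for paths in frac.values():
            r, acc = random.random(), 0.0
            for path, w in paths:
                acc += w
                if r <= acc:
                    break
            for e in path:
                load[e] += 1
        return max(load.values())

    print("fractional congestion:", fractional_congestion(fractional))
    print("integral congestion after rounding (best of 20 trials):",
          min(round_once(fractional) for _ in range(20)))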
Shape analysis is a fundamental problem in the image processing field. In shape analysis, lines, circles, and ellipses are three important features since they frequently occur in images. Based on windows determined on the edge map, this paper first presents a novel pruning-and-voting strategy to speed up the detection of lines, circles, and ellipses. In particular, the proposed strategy can be plugged into several existing randomized algorithms to reduce the required computation time while preserving the same robustness. In addition, time complexity analyses are provided to show the computational advantage of the proposed strategy. Experimental results on real images confirm our theoretical analyses.
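For context, the sketch below shows the basic randomized sample-and-vote loop that such pruning strategies plug into (in the spirit of randomized circle detection; the thresholds and the synthetic edge map are ours, and this is not the paper's pruning-and-voting method): repeatedly pick three edge points, fit the circle through them, and count how many edge points vote for the candidate.

    import math, random

    def circle_from_3_points(p1, p2, p3):
        """Circumcircle (cx, cy, r) of three points, or None if collinear."""
        (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
        d = 2.0 * (x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
        if abs(d) < 1e-9:
            return None
        ux = ((x1**2 + y1**2) * (y2 - y3) + (x2**2 + y2**2) * (y3 - y1)
              + (x3**2 + y3**2) * (y1 - y2)) / d
        uy = ((x1**2 + y1**2) * (x3 - x2) + (x2**2 + y2**2) * (x1 - x3)
              + (x3**2 + y3**2) * (x2 - x1)) / d
        return ux, uy, math.hypot(x1 - ux, y1 - uy)

    def detect_circle(edge_pts, trials=500, tol=2.0, min_votes=40,
                      rng=random.Random(0)):
        best = None
        for _ in range(trials):
            c = circle_from_3_points(*rng.sample(edge_pts, 3))
            if c is None:
                continue
            cx, cy, r = c
            votes = sum(abs(math.hypot(x - cx, y - cy) - r) <= tol
                        for x, y in edge_pts)
            if votes >= min_votes and (best is None or votes > best[0]):
                best = (votes, cx, cy, r)
        return best

    # synthetic edge map: a noisy circle of radius 20 around (50, 50) plus clutter
    rng = random.Random(1)
    pts = [(50 + 20 * math.cos(t) + rng.uniform(-0.5, 0.5),
            50 + 20 * math.sin(t) + rng.uniform(-0.5, 0.5))
           for t in [i * 2 * math.pi / 60 for i in range(60)]]
    pts += [(rng.uniform(0, 100), rng.uniform(0, 100)) for _ in range(60)]
    print(detect_circle(pts))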