A fundamental approach to efficiently finding best routes or optimal itineraries in traffic information systems is to reduce the search space (the part of the graph visited) of the most commonly used shortest path routine (Di...
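As a point of reference for the "search space" being reduced, the sketch below is a minimal textbook Dijkstra implementation in Python; it is only the unpruned baseline, not the speed-up technique this entry describes, and the graph encoding and names are illustrative assumptions.

```python
import heapq

def dijkstra(graph, source, target=None):
    """Textbook Dijkstra on an adjacency-list graph.

    graph: dict mapping node -> list of (neighbor, edge_weight) pairs.
    Returns shortest distances from source. The set of settled nodes is
    the 'search space' that speed-up techniques try to shrink; the search
    stops early if a target is given and settled.
    """
    dist = {source: 0}
    settled = set()
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u in settled:
            continue
        settled.add(u)
        if u == target:  # early termination for point-to-point queries
            break
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Tiny example: a four-node road network
g = {'A': [('B', 2), ('C', 5)], 'B': [('C', 1), ('D', 4)], 'C': [('D', 1)], 'D': []}
print(dijkstra(g, 'A', 'D'))  # {'A': 0, 'B': 2, 'C': 3, 'D': 4}
```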
Massive data streams are now fundamental to many data processing applications. For example, Internet routers produce large scale diagnostic data streams. Such streams are rarely stored in traditional databases and instead must be processed "on the fly" as they are produced. Similarly, sensor networks produce multiple data streams of observations from their sensors. There is growing focus on manipulating data streams and, hence, there is a need to identify basic operations of interest in managing data streams, and to support them efficiently. We propose computation of the Hamming norm as a basic operation of interest. The Hamming norm formalizes ideas that are used throughout data processing. When applied to a single stream, the Hamming norm gives the number of distinct items that are present in that data stream, which is a statistic of great interest in databases. When applied to a pair of streams, the Hamming norm gives an important measure of (dis)similarity: the number of unequal item counts in the two streams. Hamming norms have many uses in comparing data streams. We present a novel approximation technique for estimating the Hamming norm for massive data streams; this relies on what we call the "l_0 sketch" and we prove its accuracy. We test our approximation method on a large quantity of synthetic and real stream data, and show that the estimation is accurate to within a few percentage points.
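To make the definition concrete, here is a minimal, exact (non-streaming) computation of the Hamming norm of one stream and of the difference of two streams, assuming each stream is given as (item, count-delta) updates. This is not the paper's l_0 sketch; the point of the sketch is precisely to avoid the per-distinct-item memory this exact version needs. Function names are illustrative.

```python
from collections import Counter

def hamming_norm(updates):
    """Exact Hamming norm of a single stream: the number of items whose
    net count is nonzero, i.e. the number of distinct items present.
    `updates` is an iterable of (item, count_delta) pairs."""
    counts = Counter()
    for item, delta in updates:
        counts[item] += delta
    return sum(1 for c in counts.values() if c != 0)

def hamming_distance(updates_a, updates_b):
    """Hamming norm of the difference of two streams: the number of items
    whose total counts disagree between the two streams."""
    counts = Counter()
    for item, delta in updates_a:
        counts[item] += delta
    for item, delta in updates_b:
        counts[item] -= delta
    return sum(1 for c in counts.values() if c != 0)

a = [('x', 3), ('y', 1), ('z', 2)]
b = [('x', 3), ('y', 2)]
print(hamming_norm(a))         # 3 distinct items in stream a
print(hamming_distance(a, b))  # 2: the counts for 'y' and 'z' differ
```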
Here we describe the equational theorem prover Barcelona, in its version that participated in the CADE-13 ATP System Competition. The system was built on top of our toolkit of data structures and algorithms for automated deduction in first-order logic with equality and was devised mainly to test the performance of this toolkit.
Hierarchical terrain models provide a multiresolution description of a topographic surface based on a nested partition of the domain. The tree-like structure of these models is an effective support for processing spatial operations. In this paper, we consider visibility computations on hierarchical terrain models based on triangular subdivisions, called Hierarchical Triangulated Irregular Networks (HTINs). We address two basic problems in visibility computation, namely determining the visibility of a query object, and computing the viewshed of a given viewpoint. We propose algorithms for performing such operations on an HTIN at variable resolution. A general drawback of hierarchical models is the inconsistency of the variable-resolution representations obtained from them, since vertical gaps may occur at edges where different resolutions meet. The algorithms proposed here avoid this undesired effect. A related, but independent, contribution of this paper is also a new algorithm for extracting a consistent terrain representation at variable resolution from an HTIN.
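For readers unfamiliar with the viewshed operation, the brute-force sketch below computes a viewshed on a regular elevation grid by testing the line of sight to every cell. It is only a toy illustration of the operation itself, not the variable-resolution HTIN algorithm proposed in the paper, and all names and parameters are illustrative.

```python
import numpy as np

def viewshed(elev, vi, vj, observer_height=0.0):
    """Brute-force viewshed on a regular elevation grid: a cell is visible
    if the straight segment from the viewpoint to it passes above every
    intermediate terrain sample along the way."""
    rows, cols = elev.shape
    visible = np.zeros_like(elev, dtype=bool)
    z0 = elev[vi, vj] + observer_height
    for i in range(rows):
        for j in range(cols):
            n = max(abs(i - vi), abs(j - vj))
            if n == 0:
                visible[i, j] = True
                continue
            blocked = False
            for t in range(1, n):
                # intermediate point along the segment from viewpoint to (i, j)
                fi = vi + (i - vi) * t / n
                fj = vj + (j - vj) * t / n
                zt = elev[int(round(fi)), int(round(fj))]  # nearest-neighbour sample
                zline = z0 + (elev[i, j] - z0) * t / n     # height of the sight line
                if zt > zline:
                    blocked = True
                    break
            visible[i, j] = not blocked
    return visible

terrain = np.array([[0, 1, 3, 1],
                    [0, 2, 4, 1],
                    [0, 1, 3, 0],
                    [0, 0, 1, 0]], dtype=float)
print(viewshed(terrain, 0, 0, observer_height=1.0))
```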
Proposed is a new algorithmic approach to segmentation-based image coding. A good compromise is achieved between segmentations by quadtree-based decomposition and by free region-growing in terms of time complexity and scene adaptability. Encoding recursively partitions an image into convex n-gons, 3 ≤ n ≤ 8, until the pixels in the current n-gon satisfy a uniformity criterion. The recursive partition generates a valid segmentation by aligning the polygon boundaries with image edges. This segmentation is embedded into a binary tree for compact encoding of its geometry. The compressed image is sent as a labeled pointerless binary tree, and decoding is simply polygon filling. High compression ratios are obtained by balancing the accuracy and geometric complexity of the image segmentation, a key issue for segmentation-based image coding that was not addressed before. Due to its tree structure, the new method is also suitable for progressive image coding.
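As a baseline illustration of uniformity-driven recursive splitting, the sketch below implements plain quadtree decomposition, one of the two approaches the paper positions itself between; it is not the convex n-gon partition proposed here, and the uniformity threshold and block layout are illustrative assumptions.

```python
import numpy as np

def quadtree_segment(img, x, y, size, max_range=10, min_size=2):
    """Quadtree-based decomposition: split a square block into four
    quadrants until the pixel range inside the block satisfies a
    uniformity criterion. Returns a list of (x, y, size, mean) leaves."""
    block = img[y:y + size, x:x + size]
    if size <= min_size or int(block.max()) - int(block.min()) <= max_range:
        return [(x, y, size, float(block.mean()))]
    half = size // 2
    leaves = []
    for dx, dy in ((0, 0), (half, 0), (0, half), (half, half)):
        leaves += quadtree_segment(img, x + dx, y + dy, half, max_range, min_size)
    return leaves

# Toy 8x8 image: a bright square on a dark background
img = np.zeros((8, 8), dtype=np.uint8)
img[2:6, 2:6] = 200
leaves = quadtree_segment(img, 0, 0, 8)
print(len(leaves), "leaf blocks")  # the decomposition refines wherever blocks are non-uniform
```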