Background: A common challenge in medicine, exemplified by the analysis of biomarker data, is that large studies are needed for sufficient statistical power. Often, this is only achievable by aggregating multiple cohorts. However, different studies may use disparate platforms for laboratory analysis, which can hinder merging. Methods: Using circulating placental growth factor (PlGF), a potential biomarker for hypertensive disorders of pregnancy (HDP) such as preeclampsia, as an example, we investigated how such issues can be overcome by inter-platform standardization and merging algorithms. We studied 16,462 pregnancies from 22 study cohorts. PlGF measurements (gestational age ≥ 20 weeks), analyzed on one of four platforms (R&D Systems, Alere® Triage, Roche® Elecsys or Abbott® Architect), were available for 13,429 women. Two merging algorithms, using Z-score and multiple-of-median transformations, were applied. Results: Best reference curves (BRC), based on merged, transformed PlGF measurements in uncomplicated pregnancy across six gestational age groups, were estimated. Identification of HDP by these PlGF-BRCs was compared with that of platform-specific curves. Conclusions: We demonstrate the feasibility of merging PlGF concentrations from different analytical platforms. Overall BRC identification of HDP performed at least as well as platform-specific curves. Our method can be extended to any set of biomarkers obtained from different laboratory platforms in any field. Merged biomarker data from multiple studies will improve statistical power and enlarge our understanding of the pathophysiology and management of medical syndromes. © 2015 International Society for the Study of Hypertension in Pregnancy. Published by Elsevier B.V. All rights reserved.
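A minimal sketch of the two transformations named in the Methods, assuming per-platform reference statistics (mean/SD or median) are available for each gestational-age group; the function names and example figures are illustrative stand-ins, not the authors' pipeline:

```python
import numpy as np

def z_score(values, ref_mean, ref_sd):
    """Standardize platform-specific measurements against that
    platform's reference mean and SD for the same GA group."""
    return (np.asarray(values, dtype=float) - ref_mean) / ref_sd

def multiple_of_median(values, ref_median):
    """Express each measurement as a multiple of the platform's
    reference median (MoM) for the same GA group."""
    return np.asarray(values, dtype=float) / ref_median

# Hypothetical PlGF values (pg/mL) from two platforms with very
# different scales become comparable after transformation, so a
# single reference curve can be fitted to the merged data.
platform_a = z_score([310.0, 95.0], ref_mean=280.0, ref_sd=110.0)
platform_b = z_score([12.5, 4.1], ref_mean=11.0, ref_sd=4.5)
merged = np.concatenate([platform_a, platform_b])
```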
This paper presents lossless prefix codes optimized with respect to a payoff criterion consisting of a convex combination of maximum codeword length and average codeword length. The optimal codeword lengths obtained are based on a new coding algorithm, which transforms the initial source probability vector into a new probability vector according to a merging rule. The coding algorithm is equivalent to a partition of the source alphabet into disjoint sets on which a new transformed probability vector is defined as a function of the initial source probability vector and a scalar parameter. The payoff criterion considered encompasses a tradeoff between maximum and average codeword length; it is related to a payoff criterion consisting of a convex combination of average codeword length and the average of an exponential function of the codeword length, and to an average codeword length payoff criterion subject to a limited-length constraint. A special case of the first related payoff is connected to coding problems involving source probability uncertainty and codeword overflow probability, whereas the second related payoff complements limited-length Huffman coding algorithms.
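The payoff criterion is easy to state concretely. The sketch below builds an ordinary Huffman code (not the paper's transformed-probability algorithm) and evaluates the convex combination of maximum and average codeword length for a weight `lam`:

```python
import heapq

def huffman_lengths(probs):
    """Codeword lengths of a standard Huffman code for `probs`."""
    heap = [(p, [i]) for i, p in enumerate(probs)]
    lengths = [0] * len(probs)
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, s1 = heapq.heappop(heap)
        p2, s2 = heapq.heappop(heap)
        for i in s1 + s2:          # every merged symbol gains one bit
            lengths[i] += 1
        heapq.heappush(heap, (p1 + p2, s1 + s2))
    return lengths

def payoff(probs, lengths, lam):
    """Convex combination of maximum and average codeword length."""
    avg = sum(p * l for p, l in zip(probs, lengths))
    return lam * max(lengths) + (1 - lam) * avg

probs = [0.5, 0.25, 0.125, 0.125]
print(payoff(probs, huffman_lengths(probs), lam=0.3))  # 2.125
```

Sweeping `lam` over [0, 1] traces out the tradeoff that the paper's algorithm optimizes directly.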
ISBN:
(Print) 9783030992774; 9783030992767
The k-XOR problem can be generically formulated as follows: given many n-bit strings generated uniformly at random, find k distinct strings among them that XOR to zero. This generalizes collision search (two equal elements) to a k-tuple of inputs. The problem has become ubiquitous in cryptanalytic algorithms, including variants in which the XOR operation is replaced by modular addition (k-SUM) or by non-commutative operations (e.g., the composition of permutations). The case where a single solution exists on average is of special importance. At EUROCRYPT 2020, Naya-Plasencia and Schrottenloher defined a class of quantum merging algorithms for the k-XOR problem, obtained by combining quantum search with list merging. They represented these algorithms by a set of merging trees and obtained the best ones through linear optimization of their parameters. In this paper, we give a simplified representation of merging trees that makes their analysis easier. We give better quantum algorithms for the Single-solution k-XOR problem by relaxing one of the previous constraints and making use of quantum walks. Our algorithms subsume or improve over all previous quantum algorithms for Single-solution k-XOR. For example, we give an algorithm for 4-XOR (or 4-SUM) running in quantum time Õ(2^(7n/24)).
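For intuition, here is the classical list-merging idea for 4-XOR in its simplest hash-based form (quadratic list size, not the optimal time/memory tradeoff); the paper's subject is the quantum analogue of such merging:

```python
import os
from collections import defaultdict

def four_xor(strings):
    """Return indices (i, j, k, l), all distinct, whose strings
    XOR to zero, by hashing pairwise XORs and merging buckets."""
    pair_by_xor = defaultdict(list)
    for i in range(len(strings)):
        for j in range(i + 1, len(strings)):
            pair_by_xor[strings[i] ^ strings[j]].append((i, j))
    for pairs in pair_by_xor.values():
        for a in range(len(pairs)):
            for b in range(a + 1, len(pairs)):
                quad = set(pairs[a]) | set(pairs[b])
                if len(quad) == 4:      # the two pairs are disjoint
                    return tuple(sorted(quad))
    return None

# 16-bit strings; with 64 inputs a 4-XOR exists with high probability.
inputs = [int.from_bytes(os.urandom(2), "big") for _ in range(64)]
print(four_xor(inputs))
```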
ISBN:
(Print) 9783031159817; 9783031159824
Meet-in-the-middle (MITM) is a general paradigm where internal states are computed along two independent paths ('forwards' and 'backwards') that are then matched. Over time, MITM attacks have improved through more refined techniques that exploit additional freedoms and structure, which makes such attacks more involved to find and optimize. This has led to the use of detailed attack models for generic solvers to automatically search for improved attacks, notably a MILP model developed by Bao et al. at EUROCRYPT 2021. In this paper, we study a simpler MILP modeling that combines a greatly reduced attack representation, used as input to the generic solver, with a theoretical analysis that, for any solution, proves the existence and complexity of a detailed attack. This modeling allows us to find both classical and quantum attacks on a broad class of cryptographic permutations. First, Present-like constructions, with the permutations from the Spongent hash functions: we improve the MITM step in distinguishers by up to 3 rounds. Second, AES-like designs: despite being much simpler than Bao et al.'s, our model allows us to recover the best previous results. The only limitation is that we do not use degrees of freedom from the key schedule. Third, we show that the model can be extended to target more permutations, such as Feistel networks. In this context we give new guess-and-determine attacks on reduced Simpira v2 and SPARKLE. Finally, using our model, we find several new quantum preimage and pseudo-preimage attacks (e.g., Haraka v2, Simpira v2...) targeting the same number of rounds as the classical attacks.
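Independent of the paper's MILP model, the paradigm itself can be illustrated by the textbook double-encryption example, here with a hypothetical invertible 16-bit toy cipher; the forward path is tabulated and matched against backward computations:

```python
M = 0x9E37                       # odd, hence invertible mod 2^16
M_INV = pow(M, -1, 1 << 16)

def enc(k, x):                   # hypothetical 16-bit toy cipher
    return ((x ^ k) * M) & 0xFFFF

def dec(k, y):                   # its inverse
    return ((y * M_INV) & 0xFFFF) ^ k

def mitm_double(p1, c1, p2, c2):
    """Recover a key pair (k1, k2) with enc(k2, enc(k1, p)) == c:
    tabulate the forward path over k1, walk backwards over k2, and
    filter false matches with a second plaintext/ciphertext pair."""
    forward = {enc(k1, p1): k1 for k1 in range(1 << 16)}
    for k2 in range(1 << 16):
        mid = dec(k2, c1)        # backward computation meets the table
        k1 = forward.get(mid)
        if k1 is not None and enc(k2, enc(k1, p2)) == c2:
            return k1, k2

k1, k2 = 0x1234, 0xBEEF
p1, p2 = 0x0001, 0x0002
print(mitm_double(p1, enc(k2, enc(k1, p1)), p2, enc(k2, enc(k1, p2))))
```

The cost is roughly two half-key searches plus table lookups, instead of one search over the full key pair, which is the freedom the paradigm exploits.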
ISBN:
(Print) 9783030147990
There is an increasing need to develop appropriate techniques for merging probabilistic knowledge bases (PKBs) in knowledge-based systems. To deal with merging problems, several approaches have been put forward. However, in the proposed models, the representation of the merged probabilistic knowledge base is not similar to the representation of the original knowledge bases. The drawback of these solutions is that the probabilistic constraints of the input knowledge bases must have the same structure, and there is no algorithm for implementing the merging process. In this paper, we propose two algorithms for merging probabilistic knowledge bases represented by various structures. To this aim, the method of constraint deduction is investigated, a set of mean merging operators is proposed, and several desirable logical properties are presented and discussed. These are the basis for building the algorithms. The complexity of the algorithms, as well as related propositions, is also analyzed and discussed.
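A rough sketch of the simplest "mean" merging idea, under a deliberately simplified stand-in representation (each PKB as a map from constraint labels to probabilities); the paper's structures and constraint-deduction method are richer than this:

```python
from collections import defaultdict

def mean_merge(knowledge_bases):
    """Merge PKBs given as {constraint: probability} dicts by
    averaging the probabilities each base assigns to a constraint.
    Constraints absent from a base simply do not vote."""
    sums, counts = defaultdict(float), defaultdict(int)
    for kb in knowledge_bases:
        for constraint, prob in kb.items():
            sums[constraint] += prob
            counts[constraint] += 1
    return {c: sums[c] / counts[c] for c in sums}

kb1 = {"bird -> flies": 0.9, "penguin -> flies": 0.1}
kb2 = {"bird -> flies": 0.8}
print(mean_merge([kb1, kb2]))
# {'bird -> flies': 0.85, 'penguin -> flies': 0.1}
```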
This work presents a novel cooperative merging algorithm for lane reduction scenarios using decentralized control. A smart choice of a suitable formation and sliding mode techniques are used to decrease the inter-vehi...
There exist many dataflow applications with timing constraints that require real-time guarantees on safe execution without violating their deadlines. Extraction of timing parameters (offsets, deadlines, periods) from these applications enables the use of real-time scheduling and analysis techniques, and provides guarantees on satisfying timing constraints. However, existing extraction techniques require the transformation of the dataflow application from highly expressive dataflow computational models, for example, Synchronous Dataflow (SDF) and Cyclo-Static Dataflow (CSDF), to Homogeneous Synchronous Dataflow (HSDF). This transformation can lead to an exponential increase in the size of the application graph, which significantly increases the runtime of the analysis. In this article, we address this problem by proposing an offline heuristic algorithm called slack-based merging. The algorithm is a novel graph reduction technique that helps in speeding up the process of timing parameter extraction and finding a feasible real-time schedule, thereby reducing the overall design time of the real-time system. It uses two main concepts: (a) the difference between the worst-case execution time of the SDF graph's firings and its timing constraints (slack), used to merge firings together and generate a reduced-size HSDF graph, and (b) a novel merge operation called safe merge, which we formally prove cannot cause a live HSDF graph to deadlock. The results show that the reduced graph (1) respects the throughput and latency constraints of the original application graph and (2) typically speeds up the process of extracting timing parameters and finding a feasible real-time schedule for real-time dataflow applications. They also show that when the throughput constraint is relaxed with respect to the maximal throughput of the graph, the merging algorithm achieves a larger reduction in graph size, which in turn results in a larger speedup of the real-time analysis.
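A highly simplified illustration of the slack idea on a single chain of firings (the paper's safe-merge condition applies to general HSDF graphs and is formally proven deadlock-free): consecutive firings are grouped while their combined worst-case execution time still fits a period-derived budget, so the downstream analysis sees fewer tasks.

```python
def slack_based_merge(wcets, period):
    """Greedily group consecutive firings so each merged task's
    total WCET stays within the period; a firing longer than the
    period simply keeps its own group."""
    groups, current, load = [], [], 0.0
    for w in wcets:
        if current and load + w > period:
            groups.append(current)
            current, load = [], 0.0
        current.append(w)
        load += w
    if current:
        groups.append(current)
    return groups

# Six firings reduced to three merged tasks under a period of 10.
print(slack_based_merge([4, 3, 5, 2, 2, 6], period=10))
# [[4, 3], [5, 2, 2], [6]]
```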
ISBN:
(Print) 9781628417647
Nowadays, high dynamic range (HDR) imaging is the subject of much research. The major problem lies in implementing the best algorithm to obtain the best video quality. The main constraint is to design an optimal fusion that can cope with the rapid movement between video frames; previously implemented merging algorithms have not been fast enough to reconstruct HDR video. In this paper, we review each of the previous existing works before detailing our algorithm and presenting results from the acquired HDR images, tone-mapped with various techniques. Our proposed algorithm provides a more enhanced and faster solution compared to existing ones. It calculates a saturation matrix related to the saturation rate of the neighboring pixels, and the computed coefficients are assigned to each of the tested pictures. This analysis provides faster and more efficient results in terms of quality and brightness. The originality of our work lies in its processing method, which accounts for pixel saturation across all captured pictures and combines them to obtain the images that best reveal all possible details. These parameters are computed for each zone depending on the contrast and luminosity of the current pixel and its neighborhood. The final HDR image's coefficients are calculated dynamically, balancing brightness and contrast values to ensure the best quality in the final image.
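A numpy sketch of the saturation-weighted fusion idea, with hypothetical parameter choices (neighborhood size, saturation thresholds): each exposure contributes per-pixel weights that vanish where its neighborhood is under- or over-saturated, and the merged frame is the normalized weighted average.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def fuse_exposures(frames, low=0.05, high=0.95):
    """Merge grayscale exposures in [0, 1]: weight each pixel by how
    far its 3x3 neighborhood mean stays from under/over-saturation."""
    stack = np.stack([f.astype(np.float64) for f in frames])
    neighborhood = np.stack([uniform_filter(f, size=3) for f in stack])
    # Weight is 0 at the saturation bounds, maximal mid-range.
    weights = np.clip((neighborhood - low) * (high - neighborhood), 0, None)
    weights += 1e-8  # avoid division by zero where all frames saturate
    return (weights * stack).sum(axis=0) / weights.sum(axis=0)

# Two hypothetical exposures of the same scene.
dark = np.random.rand(64, 64) * 0.4
bright = np.clip(dark * 2.5, 0, 1)
hdr = fuse_exposures([dark, bright])
```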
Simplicial meshes are useful as discrete approximations of continuous spaces in numerical simulations. In some applications, however, meshes need to be modified over time. Mesh update operations are often expensive and brittle, making the simulations unstable. In this paper we propose a framework for updating simplicial meshes that undergo geometric and topological changes. Instead of explicitly maintaining connectivity information, we keep a collection of weights associated with mesh vertices, using a Weighted Delaunay Triangulation (WDT). These weights implicitly define mesh connectivity and allow direct merging of triangulations. We propose two formulations for computing the weights and two techniques for merging triangulations, and we illustrate our results with examples in two and three dimensions.
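The weighted Delaunay (regular) triangulation at the heart of this scheme can be computed with the classical lifting map: lift each weighted point (p, w) to height |p|² − w and keep the lower convex hull. A minimal 2-D version using scipy's ConvexHull, not the authors' implementation:

```python
import numpy as np
from scipy.spatial import ConvexHull

def weighted_delaunay(points, weights):
    """2-D weighted Delaunay (regular) triangulation via the lifting
    map: lift (x, y) to z = x^2 + y^2 - w, keep lower-hull facets."""
    pts = np.asarray(points, dtype=float)
    z = (pts ** 2).sum(axis=1) - np.asarray(weights, dtype=float)
    hull = ConvexHull(np.column_stack([pts, z]))
    # A facet is on the lower hull iff its outward normal points down.
    lower = hull.equations[:, 2] < 0
    return hull.simplices[lower]

pts = np.random.rand(20, 2)
tri_unweighted = weighted_delaunay(pts, np.zeros(20))   # plain Delaunay
tri_weighted = weighted_delaunay(pts, np.random.rand(20) * 0.05)
```

With all weights zero this reduces to the ordinary Delaunay triangulation; changing a vertex's weight updates connectivity implicitly, which is the property the framework exploits.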
ISBN:
(Print) 9783642273360; 9783642273377
In this work we present an approach to color image segmentation based on pixel classification. Such methods rest on the assumption that meaningful regions are defined by homogeneous colors and give rise to compact clusters in the color space. Each cluster defines a class of pixels that share similar color properties. The pixel classes are constructed by detecting the modes of the color histogram of the image. To identify these modes, mathematical morphology techniques are used. Applying the watershed transform to the color histogram leads to an over-partitioning of the color plane, which can be processed by mode-merging algorithms based on analysis of the mode adjacency graph. Depending on the merging criterion, we present two merging algorithms in this paper: the first relies on the gravity centers of the modes as its merging criterion, while in the second we introduce a new merging criterion, the spatial-color compactness degree.
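A sketch of the first criterion (gravity centers) under an assumed representation: modes as {label: list of histogram-bin coordinates}, the adjacency graph as a set of label pairs; two adjacent modes merge while their gravity centers are closer than a threshold.

```python
import numpy as np

def merge_modes(modes, adjacency, threshold):
    """Iteratively merge adjacent histogram modes whose gravity
    centers (mean bin coordinates) are closer than `threshold`.
    `modes`: {label: [bin coordinates]}; `adjacency`: set of
    frozenset label pairs from the mode adjacency graph."""
    modes = {k: list(v) for k, v in modes.items()}
    changed = True
    while changed:
        changed = False
        for pair in sorted(adjacency, key=sorted):
            a, b = sorted(pair)
            ca, cb = np.mean(modes[a], axis=0), np.mean(modes[b], axis=0)
            if np.linalg.norm(ca - cb) < threshold:
                modes[a].extend(modes.pop(b))          # absorb mode b
                adjacency = {frozenset(a if x == b else x for x in e)
                             for e in adjacency}
                adjacency = {e for e in adjacency if len(e) == 2}
                changed = True
                break
    return modes

modes = {0: [(10, 10, 10)], 1: [(12, 11, 10)], 2: [(200, 30, 40)]}
adj = {frozenset({0, 1}), frozenset({1, 2})}
print(merge_modes(modes, adj, threshold=20).keys())  # modes 0, 1 merge
```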