ISBN (print): 9781479944910
One of the main challenges in algorithmics in general, and in Memetic Computing in particular, is the automatic design of search algorithms. A recent advance in this direction (for continuous problems) is the development of a software prototype that builds an algorithm from an analysis of the problem's separability. This prototype has been called the Separability Prototype for Automatic Memes (SPAM). This article modifies SPAM by incorporating into it an adaptive model used in hyper-heuristics for tackling optimization problems. This model, namely Adaptive Operator Selection (AOS), rewards the most promising heuristics/memes at run time so that they are more likely to be used in the following stages of the search process. The resulting framework, here referred to as SPAM-AOS, has been tested on various benchmark problems and compared with modern algorithms representing the state of the art in search for continuous problems. Numerical results show that the proposed SPAM-AOS is a promising framework that outperforms the original SPAM and other modern algorithms. Most importantly, this study shows that certain areas of Memetic Computing and Hyper-heuristics are very closely related topics, and that their combination can lead to the development of powerful algorithmic frameworks.
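As a rough illustration of the Adaptive Operator Selection idea, the following Python sketch keeps a credit value per meme/operator, rewards operators whose application improves the best-so-far solution, and selects the next operator by probability matching with a minimum exploration probability. The operator names, the reward rule, and the parameter values are assumptions made for the example; this is not the credit-assignment scheme used inside SPAM-AOS.

```python
import random

# Minimal sketch of Adaptive Operator Selection (probability matching).
# Operators, reward rule, and parameters are illustrative assumptions,
# not the mechanism of SPAM-AOS.

class AdaptiveOperatorSelection:
    def __init__(self, operators, p_min=0.05, decay=0.9):
        self.operators = list(operators)
        self.p_min = p_min          # minimum selection probability per operator
        self.decay = decay          # forgetting factor for old rewards
        self.credit = {op: 1.0 for op in self.operators}

    def select(self):
        total = sum(self.credit.values())
        k = len(self.operators)
        # Probability matching with a lower bound p_min to keep exploration alive.
        probs = [self.p_min + (1 - k * self.p_min) * self.credit[op] / total
                 for op in self.operators]
        return random.choices(self.operators, weights=probs, k=1)[0]

    def reward(self, op, improvement):
        # Exponentially discount old credit, then add the observed improvement.
        self.credit[op] = self.decay * self.credit[op] + max(improvement, 0.0)


# Usage sketch: operators are callables that perturb a candidate solution.
def local_step(x):   # hypothetical meme: small random perturbation
    return [xi + random.gauss(0, 0.1) for xi in x]

def global_step(x):  # hypothetical meme: large random perturbation
    return [xi + random.gauss(0, 1.0) for xi in x]

def sphere(x):       # toy objective to minimise
    return sum(xi * xi for xi in x)

aos = AdaptiveOperatorSelection([local_step, global_step])
best = [random.uniform(-5, 5) for _ in range(10)]
best_f = sphere(best)
for _ in range(1000):
    op = aos.select()
    cand = op(best)
    cand_f = sphere(cand)
    aos.reward(op, best_f - cand_f)      # reward = fitness improvement
    if cand_f < best_f:
        best, best_f = cand, cand_f
```

The lower bound p_min keeps every operator selectable, so a meme that performs poorly early in the run can still recover credit later, which is the usual motivation given for this family of selection rules.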
ISBN (print): 9781450328364
The most compelling ideas in systems are abstractions such as virtual memory, sockets, or packet scheduling. Algorithmics is the servant of abstraction, allowing system performance to approach that of the underlying hardware, sometimes by using efficient algorithms but often by simply leveraging other aspects of the system. I will survey the trajectory of network algorithmics, starting with a focus on speed and scale in the 1990s and moving to measurement and security in the 2000s. While doing so, I will reflect on my experiences in choosing problems and conducting research. I will conclude by describing my passion for the emerging field of network verification and its confluence with programming languages.
Varghese worked at DEC designing DECNET protocols before obtaining his Ph.D. in 1992 from MIT. He worked from 1993 to 1999 at Washington University and from 1999 to 2012 at UCSD, both times as a professor of computer science. He joined Microsoft Research as a Principal Researcher in 2012. He won the ONR Young Investigator Award in 1996 and was elected a Fellow of the Association for Computing Machinery (ACM) in 2002. He helped design the 40 Gbps forwarding engine for Procket Networks, subsequently acquired by Cisco Systems. His book "Network Algorithmics" was published in December 2004 by Morgan Kaufmann. He co-founded NetSift Inc. in May 2004; NetSift was acquired by Cisco in 2005. He was the 2014 winner of the IEEE Koji Kobayashi Computers and Communications Award.
The use of anonymity-based infrastructures and anonymisers is a plausible solution to mitigate privacy problems on the Internet. Tor (short for The Onion Router) is a popular low-latency anonymity system that can be installed as an end-user application on a wide range of operating systems to redirect traffic through a series of anonymising proxy circuits. The construction of these circuits determines both the latency and the degree of anonymity of the Tor system. While some circuit construction strategies lead to delays that are tolerable for activities like Web browsing, they can make the system vulnerable to linking attacks. In this paper we evaluate three classical strategies for the construction of Tor circuits with respect to their de-anonymisation risk and latency performance. We then develop a new circuit selection algorithm that considerably reduces the success probability of linking attacks while keeping a good level of performance. Finally, we conduct experiments on a real-world Tor deployment over PlanetLab. Our experimental results confirm the validity of our strategy and the performance improvement it brings to Web browsing. (c) 2013 Elsevier Ltd. All rights reserved.
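Purely for illustration, the sketch below shows one classical flavour of circuit construction: each relay of the circuit is chosen with probability proportional to its advertised bandwidth, while relays and /16 subnets already used in the circuit are excluded. The Relay fields, the three-hop length, and the constraints are simplifying assumptions; this is neither the selection algorithm proposed in the paper nor Tor's actual path-selection code.

```python
import random
from dataclasses import dataclass

# Illustrative sketch of bandwidth-weighted circuit selection.
# The Relay fields and the constraints below are simplifying assumptions.

@dataclass(frozen=True)
class Relay:
    nickname: str
    bandwidth: int      # advertised bandwidth (arbitrary units)
    subnet16: str       # first two octets of the IP address, e.g. "10.0"

def pick_weighted(candidates):
    # Probability of picking a relay is proportional to its bandwidth.
    weights = [r.bandwidth for r in candidates]
    return random.choices(candidates, weights=weights, k=1)[0]

def build_circuit(relays, length=3):
    """Pick `length` distinct relays, never reusing a /16 subnet in one circuit."""
    circuit = []
    used_subnets = set()
    for _ in range(length):
        candidates = [r for r in relays
                      if r not in circuit and r.subnet16 not in used_subnets]
        if not candidates:
            raise ValueError("not enough eligible relays")
        relay = pick_weighted(candidates)
        circuit.append(relay)
        used_subnets.add(relay.subnet16)
    return circuit

# Usage sketch with a toy consensus of relays.
consensus = [
    Relay("guard1", 8000, "1.2"), Relay("middle1", 3000, "5.6"),
    Relay("exit1", 5000, "9.9"), Relay("middle2", 1000, "7.7"),
]
print([r.nickname for r in build_circuit(consensus)])
```

Weighting by bandwidth illustrates the latency/anonymity trade-off discussed above: favouring fast relays reduces delay but concentrates traffic on fewer nodes, which is precisely what linking attacks can exploit.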
We consider a graph-theoretical formalization of the process of gene assembly in ciliates introduced in Ehrenfeucht et al. (2003) [3], where a gene is modeled as a signed graph. Gene assembly, based on only three types of operations, is then modeled as a graph reduction process (to the empty graph). Motivated by the robustness of the gene assembly process, the notions of parallel reduction and parallel complexity of signed graphs were considered in Harju et al. (2006) [7]. In this paper we describe an exact algorithm for computing the parallel complexity of a given signed graph and for finding an optimal parallel reduction for it. Checking the parallel applicability of a given set of operations and scanning all possible selections lead to a high computational complexity. We also briefly discuss a faster approximate algorithm that, however, cannot guarantee finding the optimal reduction. (C) 2010 Elsevier B.V. All rights reserved.
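The exact algorithm mentioned above has to enumerate sets of operations that are applicable in parallel. As a generic illustration of that search structure only, the Python sketch below runs a breadth-first search over graph states and returns the minimum number of parallel steps needed to reach the empty graph; the concrete gnr/gpr/gdr rules and their applicability checks are not implemented here and are replaced by a caller-supplied `parallel_steps` function plus a toy stand-in.

```python
from collections import deque
from itertools import chain, combinations

# Generic breadth-first search for the minimum number of parallel steps
# needed to reduce a state to the empty graph.  `parallel_steps` is an
# assumed placeholder: for the gene assembly model it would enumerate the
# sets of operations applicable in parallel and return the reduced graphs.

def parallel_complexity(start, parallel_steps, is_empty):
    """Return (number of parallel steps, list of intermediate states)."""
    queue = deque([(start, [start])])
    seen = {start}
    while queue:
        state, path = queue.popleft()
        if is_empty(state):
            return len(path) - 1, path
        for nxt in parallel_steps(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [nxt]))
    raise ValueError("no reduction to the empty graph was found")

# Toy usage: states are frozensets of vertices, and a "parallel step" may
# remove any non-empty subset (a stand-in for real applicability checks).
def toy_steps(state):
    subsets = chain.from_iterable(combinations(state, k)
                                  for k in range(1, len(state) + 1))
    return [state - frozenset(s) for s in subsets]

steps, path = parallel_complexity(frozenset({1, 2, 3}), toy_steps, lambda s: not s)
print(steps)   # 1 under the toy rule, since all vertices can go at once
```

Breadth-first search guarantees that the first time the empty graph is reached the number of steps is minimal, which mirrors both the exhaustive nature and the cost of the exact algorithm.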
Despite recent advances in streamlining high-throughput proteomic pipelines using tandem mass spectrometry (MS/MS), reliable identification of peptides and proteins on a large scale has remained a challenging task, still involving a considerable degree of user interaction. Recently, a number of papers have proposed computational strategies both for screening out poor MS/MS spectra prior to database search (pre-filtering) and for verifying the peptide identifications made by the search programs (post-filtering). Both of these filtering approaches can be very beneficial to the overall protein identification pipeline, since they can remove a substantial part of the time-consuming manual validation work and convert large sets of MS/MS spectra into more reliable and interpretable proteome information. The choice of filtering method depends both on the properties of the data and on the goals of the experiment. This review discusses the different pre- and post-filtering strategies available to researchers, together with their relative merits and potential pitfalls. We also highlight additional research topics, such as spectral denoising and statistical assessment of the identification results, which aim at further improving the coverage and accuracy of high-throughput protein identification studies.
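As a concrete, deliberately simplistic example of the pre-filtering idea, the sketch below discards MS/MS spectra that have too few peaks or too little total ion intensity before they would be submitted to a database search engine. The thresholds and the quality criterion are illustrative assumptions and do not correspond to any specific published filter discussed in the review.

```python
from dataclasses import dataclass
from typing import List, Tuple

# Simplistic MS/MS pre-filter: reject spectra with too few peaks or too
# little total intensity before database search.  Score and thresholds
# are illustrative assumptions only.

@dataclass
class Spectrum:
    scan_id: int
    peaks: List[Tuple[float, float]]   # (m/z, intensity) pairs

def passes_prefilter(spec: Spectrum, min_peaks: int = 20,
                     min_total_intensity: float = 1e4) -> bool:
    if len(spec.peaks) < min_peaks:
        return False
    total_intensity = sum(intensity for _, intensity in spec.peaks)
    return total_intensity >= min_total_intensity

def prefilter(spectra: List[Spectrum]) -> List[Spectrum]:
    """Keep only spectra that look informative enough to be worth searching."""
    return [s for s in spectra if passes_prefilter(s)]

# Usage sketch with two toy spectra.
good = Spectrum(1, [(100.0 + i, 1000.0) for i in range(50)])
poor = Spectrum(2, [(100.0, 5.0), (200.0, 3.0)])
print([s.scan_id for s in prefilter([good, poor])])   # -> [1]
```

Real pre-filters typically combine many more features (charge state, isotope patterns, peak spacing) and learn their thresholds from data, but the pipeline position of the filter is the same as in this sketch.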
ISBN (print): 9783642027093
Many ubiquitous computing systems and applications, including mobile learning ones, can make use of personalization procedures in order to support and improve universal usability. In our previous work, we created a GUI menu model for mobile device applications in which personalization capabilities are primarily derived from the use of adaptable and adaptive techniques. In this paper we analyze, from a theoretical point of view, the efficiency of the two adaptation approaches and of the related algorithms. A task simulation framework has been developed for comparing static and automatically adapted menus in the mobile application environment. Algorithm functionality is evaluated according to the adaptivity effects obtained in various menu configurations and within several classes of randomly generated navigation tasks. The simulation results support the use of adaptivity, which provides a valuable improvement in navigation efficiency within menu-based mobile interfaces.
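To make the adaptive side of the comparison concrete, the sketch below implements one very simple adaptive policy: items selected most often are promoted towards the top of the menu, with an exponential decay so that old selections gradually lose influence. The decay factor and the ordering rule are assumptions made for the example and do not reproduce the menu model or the algorithms evaluated in the paper.

```python
# Minimal sketch of an adaptive menu: frequently selected items are promoted
# towards the top.  The decay and ordering rule are illustrative assumptions.

class AdaptiveMenu:
    def __init__(self, items, decay=0.95):
        self.items = list(items)
        self.decay = decay
        self.usage = {item: 0.0 for item in self.items}

    def select(self, item):
        """Record a selection and let older selections gradually fade."""
        for key in self.usage:
            self.usage[key] *= self.decay
        self.usage[item] += 1.0

    def ordered(self):
        # Stable sort: ties keep the original (static) order.
        return sorted(self.items, key=lambda it: -self.usage[it])

# Usage sketch: after repeated use, "Messages" climbs above "Contacts".
menu = AdaptiveMenu(["Contacts", "Messages", "Calendar", "Settings"])
for _ in range(5):
    menu.select("Messages")
menu.select("Settings")
print(menu.ordered())   # ['Messages', 'Settings', 'Contacts', 'Calendar']
```

An adaptable menu, by contrast, would keep its order until the user rearranges it explicitly; the adaptive variant sketched here changes the order automatically, which is the behaviour whose navigation cost the simulation framework measures against a static baseline.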
Network calculus offers powerful tools to analyze performance in communication networks, in particular to obtain deterministic bounds. This theory rests on strong mathematical ground, notably the use of (min,+) algebra. However, the algorithmic aspects of this theory have not yet received much attention. This paper is an attempt to provide efficient algorithms implementing network calculus operations for some classical functions. Functions that are often used are the piecewise affine functions that ultimately have constant growth. As a first step towards algorithmic design, we present a class containing these functions and closed under the main network calculus operations (min, max, +, -, convolution, subadditive closure, deconvolution): the piecewise affine functions that are ultimately pseudo-periodic. They can be finitely described, which enables us to propose algorithms for each of the network calculus operations. We finally analyze their computational complexity.
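For reference, the operations listed above are built on two (min,+) constructions. Their standard textbook definitions, for functions f, g from the nonnegative reals to the reals extended with +infinity, are:

```latex
% Standard network calculus operations.
\begin{align*}
  (f \otimes g)(t) &= \inf_{0 \le s \le t} \bigl\{ f(s) + g(t - s) \bigr\}
      && \text{(min,+) convolution} \\
  (f \oslash g)(t) &= \sup_{s \ge 0} \bigl\{ f(t + s) - g(s) \bigr\}
      && \text{(min,+) deconvolution} \\
  f^{*} &= \inf_{n \ge 0} f^{(n)}, \qquad f^{(0)} = \delta_0,\ \ f^{(n+1)} = f^{(n)} \otimes f
      && \text{subadditive closure}
\end{align*}
```

Here delta_0 is the neutral element of convolution (delta_0(0) = 0 and delta_0(t) = +infinity for t > 0). A function f is ultimately pseudo-periodic when there exist T >= 0, d > 0 and c such that f(t + d) = f(t) + c for all t >= T; this pseudo-periodicity is what makes the finite description exploited by the algorithms possible.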
If perceptually relevant multimedia methods with guaranteed performance are not developed soon, there is no hope that the problem of multimedia information overload will be effectively solved. Research is needed to handle images, music, video, and 3D models with methods that guarantee robustness, invariance, efficiency, etc., and that are also perceptually and cognitively relevant. The invention of algorithms that provably satisfy such properties is a new field of research: multimedia algorithmics.
2D-gon tilings with parallelograms are a model used in physics to study quasicrystals, and they are also important in combinatorics for the study of aperiodic structures. In this paper, we study the graph induced by the adjacency relation between tiles. This relation can be used to encode 2D-gon tilings simply and efficiently for algorithmic manipulation. We show, for example, how it can be used to sample random 2D-gon tilings. (c) 2005 Elsevier B.V. All rights reserved.
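As a small, self-contained illustration of the adjacency relation itself (independent of the encoding and the sampling results of the paper), the Python sketch below builds the graph whose vertices are tiles and whose edges join tiles sharing a side; representing each tile by the set of its sides is an assumption made purely for this example.

```python
from itertools import combinations

# Build the adjacency graph between tiles: two tiles are adjacent when they
# share a side.  Representing a tile as a set of unordered endpoint pairs is
# an assumption made purely for this example.

def edge(p, q):
    """A side as an unordered pair of endpoints."""
    return frozenset({p, q})

def adjacency_graph(tiles):
    graph = {name: set() for name in tiles}
    for (a, sides_a), (b, sides_b) in combinations(tiles.items(), 2):
        if sides_a & sides_b:          # the two tiles share at least one side
            graph[a].add(b)
            graph[b].add(a)
    return graph

# Usage sketch: two unit-square tiles glued along one side.
tiles = {
    "t1": {edge((0, 0), (1, 0)), edge((1, 0), (1, 1)),
           edge((1, 1), (0, 1)), edge((0, 1), (0, 0))},
    "t2": {edge((1, 0), (2, 0)), edge((2, 0), (2, 1)),
           edge((2, 1), (1, 1)), edge((1, 1), (1, 0))},
}
print(adjacency_graph(tiles))   # t1 and t2 are adjacent via the side (1,0)-(1,1)
```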