This study explores fundamental aspects of probability theory within the framework of constructing random sequences and random number generators. We propose an interpretation of probability spaces and operations on formal events in the theory of formal languages, utilizing string manipulation techniques. As part of the research, we present an implementation of the discussed concepts in the form of a program that generates random numbers of the required length by processing signals from a sound card. Additionally, the problem of primality testing, which is particularly relevant to practical cryptographic applications, is addressed. We critically examine common misconceptions regarding the properties of Carmichael numbers and the application of Fermat's Little Theorem. Furthermore, we propose an efficient primality testing algorithm.
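Since the abstract stresses the Carmichael-number misconception, a minimal sketch of the Fermat test may be useful; the function name and round count below are illustrative and are not taken from the paper, whose actual algorithm the abstract does not specify:

```python
import math
import random

def fermat_probable_prime(n: int, rounds: int = 20) -> bool:
    """Fermat test: n passes if a^(n-1) ≡ 1 (mod n) for random bases a coprime to n."""
    if n < 4:
        return n in (2, 3)
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        if math.gcd(a, n) != 1:
            return False  # a shares a factor with n, so n is certainly composite
        if pow(a, n - 1, n) != 1:
            return False  # a is a Fermat witness to compositeness
    return True

# 561 = 3 * 11 * 17 is the smallest Carmichael number: composite, yet
# a^560 ≡ 1 (mod 561) for EVERY base a coprime to 561, so the test is
# fooled whenever the random bases happen to be coprime to 561.
print(all(pow(a, 560, 561) == 1 for a in range(2, 561) if math.gcd(a, 561) == 1))  # True
```

This is exactly why passing the Fermat test does not certify primality: Carmichael numbers pass for all coprime bases, which motivates stronger tests.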
ISBN:
(digital) 9783031479632
ISBN:
(print) 9783031479625;9783031479632
Kleene algebras are one of the basic algebraic structures used in computer science, involving iteration, or Kleene star. An important subclass of Kleene algebras is formed by *-continuous ones. In his 2002 paper, Dexter Kozen pinpointed the complexity of various logical theories for Kleene algebras, both in the general and in the *-continuous case. Those complexity results range from equational theories to Horn theories, or reasoning from hypotheses. In the middle, there are fragments of Horn theories, with restrictions on hypotheses. For the case when the hypotheses are commutativity conditions, i.e., commutation equations for designated pairs of atoms, however, Kozen mentioned the complexity result (Π⁰₁-completeness) only for the *-continuous case, while the general case remained an open question. This was the only gap in Kozen's table of results, and the present paper fills this gap. Namely, we prove that reasoning from commutativity conditions on the class of all Kleene algebras is Σ⁰₁-complete. In particular, this problem is undecidable.
Sorting by Genome Rearrangements is a classic problem in Computational Biology. Several models have been considered so far; each of them defines how a genome is modeled (for example, permutations when assuming no duplicated genes, strings if duplicated genes are allowed, and/or use of signs on each element when gene orientation is known) and which rearrangements are allowed. Recently, a new problem, called Sorting by Multi-Cut Rearrangements, was proposed. It uses the k-cut rearrangement, which cuts a permutation (or a string) at k >= 2 places and rearranges the generated blocks to obtain a new permutation (or string) of the same size. This new rearrangement may model chromoanagenesis, a phenomenon consisting of massive simultaneous rearrangements. Similarly to the Double-Cut-and-Join, this new rearrangement also generalizes several genome rearrangements such as reversals, transpositions, revrevs, transreversals, and block-interchanges. In this paper, we extend a previous work based on unsigned permutations and strings to signed permutations. We show the complexity of this problem for different values of k, and that the approximation algorithm proposed for unsigned permutations with any value of k can be adapted to signed permutations. We also show a 1.5-approximation algorithm for the specific case k = 4, as well as a generic approximation algorithm applicable for any k >= 5 that always achieves a constant ratio. The latter makes use of the cycle graph, a well-known structure in genome rearrangements. We implemented and tested the proposed algorithms on simulated data.
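The k-cut operation on signed permutations can be sketched as follows; the function `k_cut` and its argument conventions are illustrative, not the paper's notation. Cutting at k positions yields k+1 blocks, which are reassembled in a new order, optionally reversing (and sign-flipping) some blocks:

```python
def k_cut(seq, cuts, order, flips=()):
    """Apply a k-cut rearrangement: cut `seq` at the sorted positions in `cuts`
    (each strictly inside the sequence), yielding len(cuts)+1 blocks; then
    concatenate the blocks in the permutation `order`, reversing and
    sign-flipping every block whose index is in `flips` (signed model)."""
    bounds = [0] + sorted(cuts) + [len(seq)]
    blocks = [seq[bounds[i]:bounds[i + 1]] for i in range(len(bounds) - 1)]
    out = []
    for i in order:
        b = blocks[i]
        if i in flips:
            b = [-x for x in reversed(b)]  # reversal flips gene orientation
        out.extend(b)
    return out

# A signed reversal of the middle block, modeled as a 2-cut:
print(k_cut([1, -4, -3, -2, 5], cuts=[1, 4], order=[0, 1, 2], flips={1}))
# -> [1, 2, 3, 4, 5]
```

With `order` permuting the blocks and no flips, the same function expresses unsigned operations such as transpositions, matching the claim that k-cuts generalize the classical rearrangements.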
ISBN:
(print) 9798331541378
The problem of adopting quantum-resistant cryptographic network protocols or post-quantum cryptography (PQC) is critically important to democratizing quantum computing. The problem is urgent because practical quantum computers will break classical encryption in the next few decades. Past encrypted data has already been collected and can be decrypted in the near future. The main challenges of adopting post-quantum cryptography lie in algorithmic complexity and hardware/software/network implementation. The grand question of how existing cyberinfrastructure will support post-quantum cryptography remains unanswered. This paper describes: i) the design of a novel Post-Quantum Cryptography (PQC) network instrument placed at the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign and a part of the FABRIC testbed; ii) the latest results on PQC adoption rate across a wide spectrum of network protocols (Secure Shell - SSH, Transport Layer Security - TLS, etc.); iii) the current state of PQC implementation in key scientific applications (e.g., OpenSSH or SciTokens); iv) the challenges of being quantum-resistant; and v) discussion of potential novel attacks. This is the first large-scale measurement of PQC adoption at national-scale supercomputing centers and FABRIC testbeds. Our results show that only OpenSSH and Google Chrome have successfully implemented PQC and achieved an initial adoption rate of 0.029% (6,044 out of 20,556,816) for OpenSSH connections at NCSA coming from major Internet Service Providers or Autonomous Systems (ASes) such as OARNET, GTT, Google Fiber Webpass (U.S.) and Uppsala Lans Landsting (Sweden), with an overall increasing adoption rate year-over-year for 2023-2024. Our analyses identify pathways to migrate current applications to be quantum-resistant.
Continuous and discrete models, as well as algorithms (Bressan, 2007; Finbow and MacGillivray, 2009), for firefighting problems are well-studied in Theoretical Computer Science. We introduce a new, discrete, and more general framework based on a graph to study firefighting problems in various terrains. In the context of this model, we present two different firefighting problems called village protection problems, where the goal is to protect a specific set of locations, and provide efficient polynomial-time algorithms for their solutions. We also discuss extensions of the model. (c) 2021 Elsevier B.V. All rights reserved.
ISBN:
(print) 9781510673199;9781510673182
Childhood leukaemia demands meticulous blood cell analysis for diagnosis, focusing on morphological irregularities like asymmetry and abnormal cell counts. Traditional manual diagnosis via microscopic blood smear images suffers from limitations: reduced reliability, time intensiveness, and observer variability. Automated methods using Computer-Aided Diagnostic (CAD) systems address these challenges. Integrating real-time image preprocessing and segmentation ensures swift operation, reducing the CAD system's processing time and enhancing its overall effectiveness, enabling timely medical intervention and better patient outcomes. This study aims to simplify the overall algorithmic complexity of other state-of-the-art techniques using preprocessing steps, including bilateral filtering and Contrast-Limited Adaptive Histogram Equalization (CLAHE), alongside the segmentation stage involving morphological operations and the watershed algorithm. Since these operations add to the overall time complexity, an edge-based approach is proposed to enhance the segmentation mask. This work also describes a parallel implementation utilizing OpenMP and CUDA of the proposed method, evaluating its performance using Intersection over Union (IoU) and F1-score metrics along with computing time and algorithmic complexity. Furthermore, computing time is compared using two x86-64 and two ARM architectures, each with a different number of available CPU cores, to evaluate the behaviour of the proposed approach in different conditions. The study highlights the benefits of parallel processing in enhancing the efficiency and accuracy of White Blood Cell (WBC) Segmentation. It reports that the proposed approach has an average IoU of 0.9439 and an F1-score of 0.9697 with an image processing rate higher than 60 images per second.
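As a rough illustration of the contrast-enhancement idea in the pipeline, here is plain global histogram equalization, a deliberate simplification of the CLAHE step the paper actually uses (CLAHE adds tiling and clip limiting); this NumPy-only sketch is not the paper's implementation:

```python
import numpy as np

def hist_equalize(img: np.ndarray) -> np.ndarray:
    """Global histogram equalization of an 8-bit grayscale image: build the
    intensity CDF and use it as a lookup table to spread values over [0, 255]."""
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]  # first nonzero CDF value
    lut = np.round((cdf - cdf_min) / (img.size - cdf_min) * 255).astype(np.uint8)
    return lut[img]

# A low-contrast synthetic "smear": intensities squeezed into [100, 140].
rng = np.random.default_rng(0)
img = rng.integers(100, 141, size=(64, 64), dtype=np.uint8)
eq = hist_equalize(img)
print(img.min(), img.max(), "->", eq.min(), eq.max())  # range stretched to 0 .. 255
```

CLAHE applies this same mapping per tile with a clipped histogram, which avoids over-amplifying noise in nearly uniform regions of the smear.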
We propose several constructions for the original multiplication algorithm of D.V. and G.V. Chudnovsky in order to improve its scalar complexity. We highlight the set of generic strategies that underlie the optimization of the scalar complexity, according to parameterizable criteria. As an example, we apply this analysis to the construction of elliptic-type Chudnovsky² multiplication algorithms for small extensions. As a case study, we significantly improve the Baum-Shokrollahi construction for multiplication in F_256/F_4.
ISBN:
(print) 9783031645280;9783031645297
We develop an analogue of eigenvalue methods to construct solutions of systems of tropical polynomial equalities and inequalities. We show that solutions can be obtained by solving parametric mean payoff games, arising from appropriate linearizations of the systems using tropical Macaulay matrices. We implemented specific algorithms adapted to the large-scale parametric games that arise in this way, and present numerical experiments.
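The tropical linear algebra underlying such linearizations can be illustrated with a matrix product over the (min, +) semiring, where addition is replaced by minimum and multiplication by addition; `minplus_matmul` is a generic sketch, not the paper's implementation:

```python
import numpy as np

INF = float('inf')  # tropical "zero": the neutral element for min

def minplus_matmul(A, B):
    """Matrix product over the tropical (min, +) semiring:
    (A ⊗ B)[i][j] = min_k (A[i][k] + B[k][j])."""
    A, B = np.asarray(A, float), np.asarray(B, float)
    return (A[:, :, None] + B[None, :, :]).min(axis=1)

# In (min, +), powers of an edge-weight matrix compute shortest paths —
# the "linear" behaviour that tropical linearizations exploit.
W = np.array([[0, 1, INF],
              [INF, 0, 2],
              [5, INF, 0]], float)
W2 = minplus_matmul(W, W)
print(W2[0, 2])  # 3.0: the cheapest 2-step path 0 -> 1 -> 2 costs 1 + 2
```

The same semiring viewpoint connects tropical systems to mean payoff games, where cycle weights play the role of eigenvalues.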
Whether and how well people can behave randomly is of interest in many areas of psychological research. The ability to generate randomness is often investigated using random number generation (RNG) tasks, in which participants are asked to generate a sequence of numbers that is as random as possible. However, there is no consensus on how best to quantify the randomness of responses in human-generated sequences. Traditionally, psychologists have used measures of randomness that directly assess specific features of human behavior in RNG tasks, such as the tendency to avoid repetition or to systematically generate numbers that have not been generated in the recent choice history, a behavior known as cycling. Other disciplines have proposed measures of randomness that are based on a more rigorous mathematical foundation and are less restricted to specific features of randomness, such as algorithmic complexity. More recently, variants of these measures have been proposed to assess systematic patterns in short sequences. We report the first large-scale integrative study to compare measures of specific aspects of randomness with entropy-derived measures based on information theory and measures based on algorithmic complexity. We compare the ability of the different measures to discriminate between human-generated sequences and truly random sequences based on atmospheric noise, and provide a systematic analysis of how the usefulness of randomness measures is affected by sequence length. We conclude with recommendations that can guide the selection of appropriate measures of randomness in psychological research.
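Two of the measure families compared above are easy to sketch: a first-order entropy measure and a feature-specific repetition measure. The toy "cycling" sequence below (names and data are illustrative, not from the study) shows why entropy alone can miss systematic human patterns:

```python
from collections import Counter
import math

def shannon_entropy(seq):
    """First-order Shannon entropy of the symbol distribution, in bits per symbol."""
    counts = Counter(seq)
    n = len(seq)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def repetition_rate(seq):
    """Fraction of adjacent pairs that repeat the previous symbol — a classic
    feature-specific measure: humans typically repeat less often than chance."""
    return sum(a == b for a, b in zip(seq, seq[1:])) / (len(seq) - 1)

# Perfect cycling through 1..6: maximally "random" by entropy, yet a
# textbook example of systematic human behaviour in RNG tasks.
human_like = [1, 2, 3, 4, 5, 6, 1, 2, 3, 4, 5, 6]
print(round(shannon_entropy(human_like), 3))  # 2.585: log2(6), the maximum
print(repetition_rate(human_like))            # 0.0: repetition fully avoided
```

Algorithmic-complexity measures would flag this sequence as highly compressible, which is exactly the kind of discrimination the study's comparison probes.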
The Lambek calculus with the unit can be defined as the atomic theory (algebraic logic) of the class of residuated monoids. This calculus, being a theory of a broader class of algebras than Heyting ones, is weaker than intuitionistic logic. Namely, it lacks structural rules: permutation, contraction, and weakening. We consider two extensions of the Lambek calculus with modalities: the exponential, under which all structural rules are permitted, and the relevant modality, under which only the permutation and contraction rules are allowed. The Lambek calculus with a relevant modality is used in mathematical linguistics. Both calculi are algorithmically undecidable. We consider their fragments in which the modality is allowed to be applied only to formulas of Horn depth not greater than 1. We prove that these fragments are decidable and belong to the class NP. To show this, in the case of a relevant modality, we introduce a new notion of ℛ-total derivability in context-free grammars, i.e., the existence of a derivation in which each rule is used at least a given number of times. We show that the ℛ-totality problem is NP-hard for context-free grammars. We also pinpoint the algorithmic complexity of ℛ-total derivability for more general classes of generative grammars.
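The ℛ-totality notion can be illustrated by a naive exponential search over leftmost derivations in a toy grammar; `r_total_derivable` is a hypothetical sketch (its brute force matches the problem's NP-hardness), not the paper's construction:

```python
def r_total_derivable(rules, start, target, minimum, max_len=12):
    """Does some derivation of `target` from `start` use rule i at least
    minimum[i] times? Nonterminals are uppercase letters; `rules` is a list
    of (lhs, rhs) pairs indexed by rule number. Naive exponential search."""
    stack = [(start, {})]
    seen = set()
    while stack:
        s, used = stack.pop()
        if s == target and all(used.get(i, 0) >= m for i, m in minimum.items()):
            return True
        if len(s) > max_len:
            continue
        for pos, ch in enumerate(s):
            if not ch.isupper():
                continue
            for i, (lhs, rhs) in enumerate(rules):
                if lhs == ch:
                    t = s[:pos] + rhs + s[pos + 1:]
                    u = dict(used)
                    u[i] = u.get(i, 0) + 1
                    key = (t, tuple(sorted(u.items())))
                    if key not in seen:
                        seen.add(key)
                        stack.append((t, u))
            break  # expanding only the leftmost nonterminal suffices for CFGs
    return False

# S -> aS (rule 0) | a (rule 1): every derivation of "aaa" uses rule 0
# exactly twice, so it is derivable with minimum {0: 2} but not {0: 3}.
rules = [("S", "aS"), ("S", "a")]
print(r_total_derivable(rules, "S", "aaa", {0: 2}))  # True
print(r_total_derivable(rules, "S", "aaa", {0: 3}))  # False
```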