In this paper, the dynamics of heuristic algorithms for constructing small vertex covers (or independent sets) of finite-connectivity random graphs is analysed. In every algorithmic step, a vertex is chosen according to its degree. This vertex, and some neighbourhood of it, is covered and removed from the graph. This graph-reduction process can be described as a Markovian dynamics in the space of random graphs of arbitrary degree distribution. We discuss some solvable cases, including algorithms already analysed using different techniques, and develop approximation schemes for more complicated cases. The approximations are corroborated by numerical simulations.
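A minimal sketch of one degree-based heuristic of the kind described (this is an illustrative greedy variant, not the paper's exact analysed family; the function name and the max-degree tie-breaking are assumptions):

```python
def greedy_vertex_cover(adj):
    """Greedy heuristic: repeatedly pick a highest-degree vertex,
    add it to the cover, and remove it and its edges from the graph."""
    # Work on a copy so the caller's adjacency structure is untouched.
    adj = {v: set(nbrs) for v, nbrs in adj.items()}
    cover = set()
    while any(adj[v] for v in adj):
        # Choose the vertex of maximum current degree.
        v = max(adj, key=lambda u: len(adj[u]))
        cover.add(v)
        for u in adj[v]:
            adj[u].discard(v)
        adj[v] = set()
    return cover

# A 4-cycle 0-1-2-3-0: two opposite vertices suffice to cover all edges.
g = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
cover = greedy_vertex_cover(g)
```

Each iteration is one "algorithmic step" of the abstract: a vertex chosen by degree, covered, and deleted; the residual graph stays a random graph with an evolving degree distribution, which is what makes the Markovian description possible.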
The Watts-Strogatz algorithm for transforming the square lattice into a small-world network is modified by introducing preferential rewiring constrained by a connectivity demand. The evolution of the network is two-step: sequential preferential rewiring of edges controlled by a parameter p, followed by an update of the information about the changes made. The evolving system self-organizes into stationary states. A topological transition in the graph structure is observed with respect to p. A leafy phase, a graph formed by multiply connected vertices (the graph skeleton) with plenty of leaves attached to each skeleton vertex, emerges when p is small enough to mimic asynchronous evolution. A tangling phase, in which the edges of the graph circulate frequently among low-degree vertices, occurs when p is large. Under certain conditions the resulting stationary network ensemble provides networks whose degree distribution exhibits power-law decay over a large interval of degrees.
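The rewiring step can be sketched roughly as follows (a toy version under assumed details: the abstract does not specify the exact preferential rule, so here one end of a random edge is reattached to a vertex drawn with probability proportional to its current degree):

```python
import random

def preferential_rewire(edges, n, steps, seed=0):
    """Illustrative sketch of sequential preferential rewiring:
    detach one end of a randomly chosen edge and reattach it to a
    vertex chosen with probability proportional to its degree.
    (Rule details are assumed, not taken from the paper.)"""
    rng = random.Random(seed)
    edges = [tuple(e) for e in edges]
    for _ in range(steps):
        i = rng.randrange(len(edges))
        u, v = edges[i]
        # Recompute degrees from the current edge list.
        deg = [0] * n
        for a, b in edges:
            deg[a] += 1
            deg[b] += 1
        # +1 keeps isolated vertices reachable; duplicate edges are tolerated
        # in this sketch.
        w = rng.choices(range(n), weights=[d + 1 for d in deg])[0]
        if w not in (u, v):
            edges[i] = (u, w)
    return edges

# Start from a ring of 10 vertices and rewire for 200 steps.
ring = [(i, (i + 1) % 10) for i in range(10)]
rewired = preferential_rewire(ring, 10, steps=200, seed=1)
```

The number of edges is conserved, as in the original Watts-Strogatz construction; only their placement evolves, which is what lets the ensemble reach a stationary state.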
The computational paradigm represented by Cellular Neural/Nonlinear Networks (CNN) and the CNN Universal Machine (CNN-UM) as a Cellular Wave computer gives new perspectives for computational physics. Many numerical problems and simulations can be elegantly addressed on this fully parallel and analogic architecture. Here we study the possibility of performing stochastic simulations on this chip. First a realistic random number generator is implemented on the CNN-UM, and then, as an example, the two-dimensional Ising model is studied by Monte Carlo type simulations. The results obtained on an experimental version of the CNN-UM with 128 x 128 cells are in good agreement with those obtained on digital computers. Computational time measurements suggest that the developing trend of the CNN-UM chips, increasing the lattice size and the number of local logic memories, will assure an important advantage for the CNN-UM in the near future.
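For reference, the digital-computer baseline against which such chip results are compared is the standard single-spin-flip Metropolis simulation of the 2D Ising model (a generic textbook sketch, not the paper's CNN-UM implementation; lattice size and parameters here are arbitrary):

```python
import math
import random

def ising_metropolis(L, beta, sweeps, seed=0):
    """Single-spin-flip Metropolis for the 2D Ising model on an
    L x L periodic lattice (J = 1, zero field)."""
    rng = random.Random(seed)
    s = [[rng.choice((-1, 1)) for _ in range(L)] for _ in range(L)]
    for _ in range(sweeps * L * L):
        i, j = rng.randrange(L), rng.randrange(L)
        # Sum of the four nearest neighbours with periodic boundaries.
        nb = (s[(i + 1) % L][j] + s[(i - 1) % L][j]
              + s[i][(j + 1) % L] + s[i][(j - 1) % L])
        dE = 2 * s[i][j] * nb
        # Metropolis acceptance rule.
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            s[i][j] = -s[i][j]
    return s

spins = ising_metropolis(16, beta=1.0, sweeps=200, seed=1)
m = abs(sum(sum(row) for row in spins)) / 16**2  # magnetization per spin
```

The CNN-UM version replaces the serial loop with a fully parallel update wave over the cell array; agreement of observables such as m with this digital baseline is what the abstract reports.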
The problem of secure information transmission over networks in distributed systems is considered. It is shown that protective measures applied in such systems only at the network layer are insufficient. Analysis of the data packets transmitted between the system components has shown that, in systems with high information-security requirements, application-level security protocols should be used. It is also shown that security protocols should be made maximally independent of the remaining data-transmission protocols.
I have developed a simple model to study the effects of computer-aided detection (CADe) on screening mammography. The model incorporates the tumor growth rate, the sensitivity of radiologists and of the CADe scheme, how effectively the radiologist uses the CADe output, and the interval cancer rate. The model shows that the number of additional cancers detected when the radiologist uses CADe depends on how many cancers the radiologist misses without the computer aid, how many of the missed cancers the computer can detect, and what fraction of the computer-detected missed cancers the radiologist will correctly recognize as missed cancers. I also modeled the effect of CADe on the radiologist's false-detection rate. Unless the computer can preferentially detect cancers over benign lesions, the increase in the number of cancers detected when using CADe will be the same as the increase in the number of women recalled (the vast majority of whom will be false positives). This does not imply that CADe has no net effect. On the contrary, an equal increase in the cancer detection rate and the recall rate is consistent with radiologists operating on a higher ROC curve, implying that CADe is improving the radiologists' performance. However, even if CADe were able to help radiologists reduce their miss rate by 50%, there would be only a 10% increase in the cancer detection rate after implementation of CADe. This increase is difficult to detect in practice because of the variable growth rate of tumors. The number of cancers present in a screened population changes year by year, and this variation can mask the actual increase in the cancer detection rate when CADe is implemented. However, the cancers detected when using CADe are smaller than those detected by the radiologist when not using CADe. Educational Objectives: 1. Understand the factors affecting the clinical effectiveness of computer-aided detection. 2. Understand some of the difficulties of measuring that clinical effectiveness.
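The bookkeeping behind the abstract's detection-gain argument can be written out in a few lines (the parameter values below are illustrative assumptions, not figures from the paper; only the multiplicative structure follows the abstract):

```python
def cade_gain(radiologist_sens, cade_catch_frac, use_frac):
    """Extra cancers found with CADe = cancers the radiologist misses
    * fraction of those the computer marks
    * fraction of those marks the radiologist correctly acts on."""
    missed = 1.0 - radiologist_sens
    extra = missed * cade_catch_frac * use_frac
    relative_gain = extra / radiologist_sens
    return extra, relative_gain

# Hypothetical numbers: radiologist alone finds 83% of cancers, and CADe
# effectively halves the miss rate (catch_frac * use_frac = 0.5).
extra, gain = cade_gain(0.83, 1.0, 0.5)
```

With these assumed inputs the absolute gain is 0.085 and the relative gain is about 10%, consistent with the abstract's point that even a 50% miss-rate reduction yields only a modest increase in the detection rate.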
This Resource Letter provides a guide to print and electronic literature relevant to a computational physics course. The multidisciplinary aspect of computational physics and its relation to computational science is reflected in the sections Courses, Programs and Reviews, Journals, Conferences and Organizations, Books, Tools, Languages and Environments, Parallel Computing, and Digital Libraries. (c) 2008 American Association of Physics Teachers.
Purpose: One of the basic tenets of neurosurgical planning is the ability to generate an operative approach that minimizes the disruption of normal tissues while allowing the required access to target tissues. To aid the surgeon in his or her ability to appreciate the location of target tissues, as well as the relationship of the target to normal tissues, graphical workstations have been employed. The introduction of Image Guidance Systems (IGS) into the OR has brought along a host of new computers, infrared camera systems, and radio-frequency transmitters and receivers, all of which pose restrictions on the placement and operation of the other equipment needed for the operative procedure. In order to provide the advantages of IGS while avoiding the problems associated with the commercial equipment, we elected to investigate a mechanical alternative, one that did not require any of the above equipment to be present within the operating room. Method and Materials: Recently, a new generation of 3-dimensional printers has been developed. These systems are capable of fabricating OR-compatible objects within an hour of design. The goal of this project is to develop software that provides the surgeon with the ability to build a patient-specific 3D model from a diagnostic image dataset and then plan a surgical procedure. Utilizing the 3D patient-specific model, the software designs a patient-specific reference frame and fabricates the frame using rapid-prototyping technology. This reference frame incorporates all necessary trajectories, including mechanical referencing to the patient, guidance for the initial skin incision, and trajectory alignment to the target tissues, as well as providing a mechanical platform for mounting other surgical tools. Results and Conclusion: We have written and tested this new generation of software on phantom targets and are now engaged in an IRB surgical trial assessing the system's accuracy and precision.
Higher-order Markov sequences were constructed by a digital computer, converted into electrical pulse trains, and transduced to a high-speed auditory display by earphones. Under appropriate conditions, the depth of sequential information processing available to the auditory system is virtually without limit.
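Constructing a higher-order Markov sequence digitally can be sketched as follows (an illustrative order-2 binary chain with an assumed transition table; the paper's actual pulse-train conversion is analog hardware and is not modelled here):

```python
import random

def markov_sequence(order, transitions, length, seed=0):
    """Generate a sequence from an order-k Markov chain: the distribution
    of the next symbol depends on the previous k symbols."""
    rng = random.Random(seed)
    # Alphabet recovered from the transition table.
    symbols = sorted({s for ctx in transitions for s in transitions[ctx]})
    # Start from a random context of the required order.
    seq = [rng.choice(symbols) for _ in range(order)]
    while len(seq) < length:
        ctx = tuple(seq[-order:])
        dist = transitions[ctx]
        seq.append(rng.choices(list(dist), weights=list(dist.values()))[0])
    return seq

# Order-2 rule: the next bit equals the XOR of the last two bits with
# probability 0.9 -- a genuinely second-order dependency.
t = {(a, b): {a ^ b: 0.9, 1 - (a ^ b): 0.1} for a in (0, 1) for b in (0, 1)}
seq = markov_sequence(2, t, 1000, seed=3)
```

Raising `order` deepens the sequential structure a listener must track, which is the quantity the experiment probes.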
We consider the El Farol bar problem, also known as the minority game (W. B. Arthur, The American Economic Review, 84 (1994) 406; D. Challet and Y. C. Zhang, Physica A, 256 (1998) 514). We view it as an instance of the general problem of how to configure the nodal elements of a distributed dynamical system so that they do not "work at cross purposes", in that their collective dynamics avoids frustration and thereby achieves a provided global goal. We summarize a mathematical theory for such configuration, applicable when (as in the bar problem) the global goal can be expressed as minimizing a global energy function and the nodes can be expressed as minimizers of local free energy functions. We show that a system designed with that theory performs nearly optimally for the bar problem.
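For orientation, a bare-bones Challet-Zhang-style minority game looks like this (a standard textbook sketch, not the authors' configured system; all parameter values are illustrative):

```python
import random

def minority_game(n_agents, memory, n_strategies, rounds, seed=0):
    """Each agent holds a few fixed random strategies mapping the last
    `memory` outcomes to an action, and plays its best-scoring one;
    agents on the minority side win the round."""
    rng = random.Random(seed)
    n_hist = 2 ** memory
    # A strategy is a lookup table: history index -> action in {0, 1}.
    strategies = [[[rng.randrange(2) for _ in range(n_hist)]
                   for _ in range(n_strategies)] for _ in range(n_agents)]
    scores = [[0] * n_strategies for _ in range(n_agents)]
    history = 0
    attendance = []
    for _ in range(rounds):
        choices = []
        for a in range(n_agents):
            best = max(range(n_strategies), key=lambda s: scores[a][s])
            choices.append(strategies[a][best][history])
        ones = sum(choices)
        minority = 1 if ones < n_agents / 2 else 0
        # Reward every strategy that would have picked the minority side.
        for a in range(n_agents):
            for s in range(n_strategies):
                if strategies[a][s][history] == minority:
                    scores[a][s] += 1
        history = ((history << 1) | minority) % n_hist
        attendance.append(ones)
    return attendance

att = minority_game(101, memory=3, n_strategies=2, rounds=500, seed=7)
```

The frustration the abstract refers to shows up here as persistent fluctuations of `att` around n_agents/2; the authors' theory replaces these myopic score updates with nodes that minimize local free energy functions.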
Tsallis entropy, introduced in 1988, is considered to have opened new possibilities for constructing a generalized thermodynamical basis for statistical physics, extending classical Boltzmann-Gibbs thermodynamics to nonequilibrium states. During the last two decades this q-generalized theory has been successfully applied to a considerable number of physically interesting complex phenomena. The authors would like to present a new view on the problem of analysing the computational complexity of algorithms, using as an example a possible thermodynamical basis of the sorting process and its dynamical behavior. The classical approach to analysing the amount of resources needed for algorithmic computation is based on the assumption that the contact between the algorithm and the input data stream is a simple system, because only the worst-case time complexity is considered, in order to minimize the dependency on specific instances. Meanwhile, the article shows that this process can be governed by long-range dependencies with a thermodynamical basis expressed by the specific shapes of probability distributions. The classical approach does not allow one to describe all properties of the processes (especially the dynamical behavior of algorithms) that can appear during computer algorithmic processing, even if one takes average-case computational complexity into account. The importance of this problem is still neglected, especially if one realizes two things. First, computer systems nowadays also work in an interactive mode, and a proper thermodynamical basis is needed to understand their possible behavior. Second, computers are, from a mathematical point of view, Turing machines, but in reality they have physical implementations that need energy for processing, so the problem of entropy production appears. That is why a thermodynamical analysis of the possible behavior of the simple insertion sort algorithm is given here.
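The instance-dependent "work" done by insertion sort, whose distribution over inputs such a thermodynamic analysis would examine, can be counted directly (a standard implementation instrumented with a move counter; the counting choice is an illustrative assumption):

```python
def insertion_sort_count(a):
    """Insertion sort that also counts element moves -- a proxy for the
    physical work per input instance."""
    a = list(a)
    moves = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements right until key's slot is found.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
            moves += 1
        a[j + 1] = key
    return a, moves

# Reversed input is the worst case, needing n(n-1)/2 moves;
# already-sorted input needs none.
s, worst = insertion_sort_count(list(range(9, -1, -1)))
_, best = insertion_sort_count(list(range(10)))
```

Worst-case analysis keeps only `worst`; the abstract's point is that the full distribution of `moves` across instances and over time carries structure that a single worst-case (or even average-case) number discards.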