This paper addresses advanced Monte Carlo methods for realistic image creation and offers a new stratified approach for solving the rendering equation. We consider the numerical solution of the rendering equation by separation of the integration domain. The hemispherical integration domain is symmetrically partitioned into 16 parts. The first 8 sub‐domains are orthogonal spherical triangles of equal size; they are mutually symmetric and grouped with a common vertex around the normal vector to the surface. The hemispherical integration domain is completed with 8 more equal-size sub‐domains, spherical quadrangles that are likewise mutually symmetric. All sub‐domains have fixed vertices and computable parameters. Bijections from the unit square onto an orthogonal spherical triangle and onto a spherical quadrangle are derived and used to generate sampling points. A symmetric sampling scheme is then applied to distribute the sampling points over the hemispherical integration domain. The necessary transformations are made and the stratified Monte Carlo estimator is presented. The rate of convergence is obtained, showing that the algorithm is of super‐convergent type.
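As a concrete illustration of the stratified estimator summarized above, the sketch below stratifies the unit-square parameter domain of a hemisphere mapping into 16 equal cells and takes one sample per cell. It uses a standard uniform hemisphere mapping rather than the paper's spherical-triangle and spherical-quadrangle bijections, which are not reproduced here; the function names and the 4x4 stratification are illustrative assumptions.

```python
import numpy as np

def uniform_hemisphere(u, v):
    """Map a unit-square point to a direction on the unit hemisphere.
    Standard area-preserving mapping (NOT the paper's triangle/quadrangle bijections)."""
    cos_theta = u                              # uniform in [0, 1] => uniform solid angle
    sin_theta = np.sqrt(1.0 - cos_theta**2)
    phi = 2.0 * np.pi * v
    return np.array([sin_theta * np.cos(phi), sin_theta * np.sin(phi), cos_theta])

def stratified_estimate(f, strata_u=4, strata_v=4, rng=None):
    """Stratified Monte Carlo estimate of a hemispherical integral of f:
    the parameter square is split into strata_u * strata_v equal cells
    (16 by default, mirroring the paper's 16 sub-domains), one sample per cell."""
    rng = np.random.default_rng() if rng is None else rng
    total = 0.0
    for i in range(strata_u):
        for j in range(strata_v):
            u = (i + rng.random()) / strata_u
            v = (j + rng.random()) / strata_v
            total += f(uniform_hemisphere(u, v))
    # Uniform hemisphere pdf is 1/(2*pi), so the estimator is weighted by 2*pi.
    return 2.0 * np.pi * total / (strata_u * strata_v)

# Example: the cosine lobe, whose exact hemispherical integral is pi.
print(stratified_estimate(lambda w: w[2]))
```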
Parallel 3D reconstruction and partial retrieval from 2D image slices based on object contour structure are introduced. In this paper, 2D images are segmented into object contours. Related object contours on adjacent ...
ISBN (print): 9781586037383
The Tissue MicroArray (TMA) technique is assuming ever more importance. Digital image acquisition becomes fundamental to providing an automatic system for subsequent analysis. The accuracy of the results depends on the image resolution, which has to be very high in order to provide as many details as possible. Lossless formats are more suitable for preserving information, but data file size becomes a critical factor researchers have to deal with. This affects not only storage methods but also computing times and performance. Pathologists and researchers who work with biological tissues, in particular with the TMA technique, need to consider a large number of case studies to formulate and validate their hypotheses. The importance of image sharing between different institutes worldwide, to increase the amount of interesting data to work with, is therefore clear. In this context, preserving the security of sensitive data is a fundamental issue. In most cases, copying patient data to places other than the original database is forbidden by the owner institutes. Storage, computing and security are key problems of the TMA methodology. In our system we tackle all these aspects using the EGEE (Enabling Grids for E-sciencE) Grid infrastructure. The Grid platform provides good storage, performance in image processing and safety of sensitive patient information: this architecture offers hundreds of Storage and Computing Elements and enables users to handle images without copying them to physical disks other than those where they have been archived by the owner, giving back to end-users only the processed, anonymized images. The efficiency of the TMA analysis process is obtained by implementing algorithms based on functions provided by the parallel image processing Genoa Library (PIMA(GE)(2) Lib). The acquisition of remotely distributed TMA images is performed using specialized I/O functions based on the Grid File Access Library (GFAL) API. In our opinion this approach may represent an important contribution
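A minimal sketch of the access pattern described in this abstract is given below: images stay on the Grid Storage Elements and only processed, anonymized results come back to the end-user. The names grid_open, analyze_tma and anonymize are hypothetical placeholders standing in for the GFAL-based I/O and PIMA(GE)(2) Lib routines, which are not reproduced here.

```python
def process_remote_tma(image_url, grid_open, analyze_tma, anonymize):
    """Read a remote TMA image through a Grid I/O handle, analyze it in memory,
    and return only the anonymized result (nothing is copied to local disk)."""
    with grid_open(image_url, "rb") as remote_file:   # GFAL-style remote open (placeholder)
        raw_image = remote_file.read()                # bytes streamed from the Storage Element
    result = analyze_tma(raw_image)                   # parallel image-processing step (placeholder)
    return anonymize(result)                          # strip sensitive patient data before returning
```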
ISBN (print): 3540709517
The proceedings contain 23 papers. The topics discussed include: challenges for formal verification in industrial setting; an easy-to-use, efficient tool-chain to analyze the availability of telecommunication equipment; discovering symmetries; on combining partial order reduction with fairness assumptions; test coverage for loose timing annotations; heuristics for ioco-based test-based modelling; verified design of an automated parking garage; evaluating quality of service for service level agreements; simulation-based performance analysis of a medical image-processing architecture; a finite state modeling of AFDX frame management using spin; automated incremental synthesis of timed automata; SAT-based verification of LTL formulas; parallel SAT solving in bounded model checking; and parallel algorithms for finding SCCs in implicitly given graphs.
The convergence of computer systems and communication technologies is moving toward switched high-performance modular system architectures based on high-speed switched interconnections. Multi-core processors are becoming a more promising path to high-performance systems, and traditional parallel-bus system architectures (VME/VXI, cPCI/PXI) are moving to new higher-speed serial switched interconnections. The fundamentals of system architecture development are a compact modular component strategy, low-power processors, new serial high-speed interface chips on the board, and high-speed switched fabrics for SAN architectures. An overview of advanced modular concepts and new international standards for developing high-performance embedded and compact modular systems for real-time applications is given. (c) 2006 Elsevier B.V. All rights reserved.
Although soft classification analyses can reduce problems such as those associated with mixed pixels, which impact negatively on conventional hard classifications, their accuracy is often low. One approach to increasing the accuracy of soft classifications is the use of an ensemble of classifiers, an approach which has been successful for hard classifications but rarely applied to soft classifications. Four methods for combining soft classifications to increase soft classification accuracy were assessed. These methods were based on (i) the selection of the most accurate predictions on a class-specific basis, (ii) the average of the outputs of the individual classifications for each case, (iii) the direct combination of classifications using evidential reasoning and (iv) the adaptation of the outputs to enable the use of a conventional (hard classification) ensemble approach. These four approaches were assessed with classifications of National Oceanic and Atmospheric Administration (NOAA) Advanced Very High-Resolution Radiometer (AVHRR) imagery of Australia. The data were classified using two neural networks and a probabilistic classifier. All four ensemble approaches applied to the outputs of these three classifiers were found to increase classification accuracy. Relative to the most accurate individual classification, the increases in overall accuracy ranged from 2.20% to 4.45%, increases that were statistically significant at the 95% level of confidence. The results highlight that ensemble approaches may be used to significantly increase soft classification accuracy.
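Method (ii) above, per-case averaging of the individual soft outputs, is the simplest of the four combination schemes to illustrate. The sketch below is a minimal example of that idea, assuming each classifier returns per-pixel class-membership fractions; the array shapes and function name are illustrative only.

```python
import numpy as np

def average_soft_classifications(soft_outputs):
    """Combine soft classifications by per-case averaging (method (ii)).
    soft_outputs: array-like of shape (n_classifiers, n_pixels, n_classes)
    holding class-membership fractions; the combined rows are renormalized."""
    combined = np.mean(np.asarray(soft_outputs, dtype=float), axis=0)
    return combined / combined.sum(axis=1, keepdims=True)

# Example: two classifiers, three pixels, two classes.
clf_a = [[0.7, 0.3], [0.2, 0.8], [0.5, 0.5]]
clf_b = [[0.9, 0.1], [0.4, 0.6], [0.6, 0.4]]
print(average_soft_classifications([clf_a, clf_b]))
```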
Large-scale numerical simulation produces datasets with ever-growing size and complexity. In particular, unstructured meshes are encountered in many applications. Volume rendering provides a way to efficiently analyze...
A difference scheme for noise removal based on fourth-order partial differential equations is suggested. It can approximate the actual image while preserving edges and avoiding blocky effects in image processing. Numerical results demonstrate its efficiency and indicate a better choice of parameters.
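Fourth-order diffusion schemes of this kind typically follow the You-Kaveh form u_t = -Lap(c(|Lap u|) Lap u). The sketch below is a generic explicit difference scheme of that type, not necessarily the scheme proposed in the paper; the time step, edge-stopping constant k and boundary handling are assumptions.

```python
import numpy as np

def laplacian(u):
    """Five-point finite-difference Laplacian with replicated (Neumann) borders."""
    p = np.pad(u, 1, mode="edge")
    return p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:] - 4.0 * u

def fourth_order_denoise(u, steps=100, dt=0.02, k=10.0):
    """Explicit scheme for the fourth-order diffusion
    u_t = -Lap( c(|Lap u|) * Lap u ),  c(s) = 1 / (1 + (s/k)^2)."""
    u = u.astype(float)
    for _ in range(steps):
        lap = laplacian(u)
        c = 1.0 / (1.0 + (np.abs(lap) / k) ** 2)   # small c at strong edges preserves them
        u -= dt * laplacian(c * lap)
    return u

# Example: smooth a noisy ramp image.
noisy = np.tile(np.linspace(0, 255, 64), (64, 1)) + np.random.normal(0, 15, (64, 64))
denoised = fourth_order_denoise(noisy)
```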
The following topics were dealt with: information assurance; cryptography and network security; component-based software engineering; software specification and architecture; software testing; requirements engineering; user-centered design methods; embedded systems; operating systems; image, speech, and signal processing; data mining and knowledge discovery; Internet technology and applications; artificial intelligence; natural language processing; neural networks and genetic algorithms; parallel and distributed computing; voice-over-IP; communication systems and networks; mobile/wireless/ad-hoc networks; collaborative computing; algorithms; visual and multimedia computing; web-based applications; E-commerce and its applications.
Graphical models provide a powerful formalism for statistical signal processing. Due to their sophisticated modeling capabilities, they have found applications in a variety of fields such as computer vision, image processing, and distributed sensor networks. In this thesis we study two central signal processing problems involving Gaussian graphical models, namely modeling and estimation. The modeling problem involves learning a sparse graphical model approximation to a specified distribution. The estimation problem in turn exploits this graph structure to solve high-dimensional estimation problems very efficiently. We propose a new approach for learning a thin graphical model approximation to a specified multivariate probability distribution (e.g., the empirical distribution from sample data). The selection of sparse graph structure arises naturally in our approach through the solution of a convex optimization problem, which differentiates our procedure from standard combinatorial methods. In our approach, we seek the maximum entropy relaxation (MER) within an exponential family, which maximizes entropy subject to constraints that marginal distributions on small subsets of variables are close to the prescribed marginals in relative entropy. We also present a primal-dual interior point method that is scalable and tractable provided the level of relaxation is sufficient to obtain a thin graph. A crucial element of this algorithm is that we exploit sparsity of the Fisher information matrix in models defined on chordal graphs. The merits of this approach are investigated by recovering the graphical structure of some simple graphical models from sample data. Next, we present a general class of algorithms for estimation in Gaussian graphical models with arbitrary structure. These algorithms involve a sequence of inference problems on tractable subgraphs over subsets of variables. This framework includes parallel iterations such as Embedded Trees, serial iterations
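The estimation idea mentioned at the end of this abstract, solving the full Gaussian model through repeated inference on a tractable embedded subgraph, can be sketched as a simple Richardson-style iteration for the mean x = J^{-1} h. The sketch below is illustrative only (a dense solve stands in for fast tree/chordal inference) and is not the thesis implementation; the function name and edge-mask interface are assumptions.

```python
import numpy as np

def embedded_subgraph_iteration(J, h, mask, n_iter=100):
    """Iterative computation of the Gaussian mean x = J^{-1} h using an
    embedded tractable subgraph, in the spirit of Embedded Trees.
    J: information (inverse covariance) matrix; h: potential vector;
    mask: boolean matrix of the edges kept in the subgraph (diagonal always kept)."""
    J = np.asarray(J, dtype=float)
    keep = np.asarray(mask, dtype=bool) | np.eye(len(h), dtype=bool)
    J_sub = np.where(keep, J, 0.0)        # information matrix of the tractable subgraph
    K = J_sub - J                         # contribution of the cut (off-subgraph) edges
    x = np.zeros_like(h, dtype=float)
    for _ in range(n_iter):
        # In practice this solve is a fast inference step on the tree/chordal subgraph.
        x = np.linalg.solve(J_sub, h + K @ x)
    return x

# Example: a 3-node model where the subgraph drops the (0, 2) edge.
J = np.array([[2.0, 0.5, 0.3], [0.5, 2.0, 0.5], [0.3, 0.5, 2.0]])
h = np.array([1.0, 0.0, 1.0])
mask = np.array([[1, 1, 0], [1, 1, 1], [0, 1, 1]], dtype=bool)
print(embedded_subgraph_iteration(J, h, mask), np.linalg.solve(J, h))
```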