ISBN (print): 0879425598
The authors define a methodology for aiding the interpretation of medical images and their automatic analysis, with particular attention given to the case of bone scintigraphy. Three stages are considered necessary for the interpretation: segmentation and contour delineation; identification of the shapes relevant to the interpretation; and medical diagnosis. The last two stages implement logic tools that use the methodology of expert systems. Differences in interpretation (inter-observer differences) can also be simulated. With the proposed approach, the subjectivity of the observer can be taken into account, simulated, and used to express different points of view.
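The abstract gives no code; purely as a sketch of the three-stage architecture it describes (all function names, rules, and thresholds below are invented for illustration, not taken from the paper), the pipeline and the simulation of inter-observer subjectivity might look like:

```python
# Hypothetical sketch of the three-stage interpretation pipeline described
# in the abstract; names, rules, and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class Region:
    label: str            # anatomical/shape label assigned in stage 2
    mean_intensity: float

def segment(image):
    """Stage 1: segmentation and contour delineation (stubbed here)."""
    # A real system would return delineated contours; we fake two regions.
    return [Region("unknown", 0.9), Region("unknown", 0.3)]

# Stage 2: identification rules (expert-system style condition/label pairs).
IDENTIFICATION_RULES = [
    (lambda r: r.mean_intensity > 0.8, "hot spot"),
    (lambda r: r.mean_intensity <= 0.8, "normal uptake"),
]

# Stage 3: diagnosis rules; varying the threshold per observer simulates
# the inter-observer differences the abstract mentions.
def diagnose(regions, hot_spot_threshold=1):
    hot = sum(1 for r in regions if r.label == "hot spot")
    return "suspicious" if hot >= hot_spot_threshold else "unremarkable"

regions = segment(image=None)
for r in regions:
    r.label = next(lbl for cond, lbl in IDENTIFICATION_RULES if cond(r))
print(diagnose(regions))                         # strict observer
print(diagnose(regions, hot_spot_threshold=2))   # more conservative observer
```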
ISBN (print): 0879425598
The concepts and technology incorporated into the new Bales Scientific thermal image processor (TIP), a third-generation medical thermographic system, are discussed. The evolution of medical thermography is chronicled, with emphasis on the present clinical need for reliable high-resolution imaging systems capable of full image manipulation, analysis modes, and database management. It is shown how this system allows the user to select the frame rate, field of view, and image resolution and to optically zoom any area of interest to suit a specific application. The Bales TIP provides the user with extensive diagnostic tools, including high/low/average temperature, thermal gradients, histograms, fast Fourier transforms, real-time total image flip, and image subtraction. The patient database includes mini-image viewing and the ability to store normal-standards data for use in study comparisons.
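The abstract names the diagnostic operations but not their implementation; as an illustrative sketch only (this is not the TIP's code, and the temperature map below is synthetic), the core measurements reduce to simple array operations on a temperature image:

```python
# Illustrative numpy sketch of the measurements named in the abstract
# (high/low/average temperature, thermal gradients, histogram, image
# subtraction); not the Bales TIP implementation.
import numpy as np

frame = np.random.default_rng(0).normal(33.0, 1.5, (240, 320))  # degrees C
baseline = np.full_like(frame, 33.0)          # stored "normal standard"

roi = frame[100:140, 150:200]                 # region of interest
print(f"high {roi.max():.1f}  low {roi.min():.1f}  avg {roi.mean():.1f} C")

gy, gx = np.gradient(frame)                   # thermal gradients (C/pixel)
print(f"max gradient magnitude {np.hypot(gx, gy).max():.2f} C/pixel")

diff = frame - baseline                       # image subtraction vs baseline
hist, edges = np.histogram(roi, bins=16)      # temperature histogram
```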
In order to develop renewable energy sources technology in power supply plants, it is necessary to provide designers with tools capable of recognizing at short notice any areas of convenience in its utilization, either as a stand-alone or as a hybrid (with generator back-up) project. Typical renewable sources considered are solar and wind energy, but the results are applicable to any other energy source. The block diagram of the plant includes renewable generator(s), a storage battery, and a standby diesel engine generator. The authors first developed an analysis based on a mathematical model that included only one renewable random source. This model related environmental statistical factors to the sizing parameters and consequently to the costs of each block. However, since the sizing is not univocally defined, the model treats the battery autonomy as a variable and performs an optimization over it to minimize overall cost. The resulting output gives both the optimized sizing parameters and the costs of each block. A step-by-step optimization procedure is described, including: consideration of the annual cost vs battery autonomy function; a mathematical approach to the environmental influence; sizing criteria for the renewable generator; the database utilized and its analysis; model algorithms; comparison between experimental and theoretical results; and the two-renewable-generators model.
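The abstract describes the optimization only in outline; a minimal sketch of the idea, assuming a toy cost model (every coefficient below is invented for illustration), sweeps the battery autonomy and keeps the minimum-overall-cost sizing:

```python
# Minimal sketch of the cost-vs-battery-autonomy optimization outlined in
# the abstract; the cost coefficients are invented for illustration.
def annual_cost(autonomy_hours: float) -> float:
    battery_cost = 120.0 * autonomy_hours          # grows with autonomy
    # With more autonomy the diesel back-up runs less often, so its
    # fuel/maintenance share of the annual cost shrinks.
    diesel_cost = 5000.0 / (1.0 + 0.2 * autonomy_hours)
    renewable_cost = 1500.0                        # fixed generator share
    return battery_cost + diesel_cost + renewable_cost

# Sweep the 'variable' (battery autonomy) and keep the minimum overall cost.
candidates = [h / 2 for h in range(1, 97)]         # 0.5 h .. 48 h
best = min(candidates, key=annual_cost)
print(f"optimal autonomy {best:.1f} h, annual cost {annual_cost(best):.0f}")
```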
In the software reengineering discussion it is assumed that the system can be understood on the following four levels: the programming language level, the control structure level, the generic algorithm level, and the problem domain level. It is noted that it is now possible to build tools which understand systems on the first three levels. There have been considerable advances in system data analysis that will lead directly to identification of abstract data types and objects. It is suggested that future progress will critically depend on the ability to represent and reason about the problem domain. The reengineering systems of tomorrow will require knowledge not only of software engineering but, more important, of particular problem domains.
The authors describe the writing of a small, very specific software tool, Igor, that automates the creation and maintenance of many routine and repetitive code fragments used in a large software system. Igor is viewed as an application-specific application generator; it generates C source code from a higher-level specification as an application generator does, and it is designed to be used only for this one purpose in this one project. The data structure is described in a concise, declarative notation, and a special-purpose translator was written to process the description. The translator generates files of source code that implement the many simple declarations, manipulations, and interrogations of this data structure. The authors discuss experience using this paradigm to implement the intermediate format and how it contributed to solving the larger task of building the CAE (computer-aided engineering) system of which it is a part. The authors compare their solution with other approaches and examine what aspects of the paradigm may be applicable to other software development efforts.
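Igor's notation and output are not shown in the abstract; purely as a sketch of the paradigm (the spec syntax and emitted accessors below are invented, not Igor's), a declarative field list can be mechanically translated into C struct declarations plus accessor fragments:

```python
# Sketch of the application-generator paradigm described in the abstract:
# translate a tiny declarative data-structure spec into C source. The spec
# syntax and emitted code are invented for illustration; they are not Igor's.
SPEC = [("node", [("id", "int"), ("weight", "double")])]

def emit_c(spec):
    out = []
    for name, fields in spec:
        out.append("typedef struct {")
        out += [f"    {ctype} {fname};" for fname, ctype in fields]
        out.append(f"}} {name}_t;\n")
        for fname, ctype in fields:            # simple accessor fragments
            out.append(f"{ctype} {name}_get_{fname}(const {name}_t *p) "
                       f"{{ return p->{fname}; }}")
            out.append(f"void {name}_set_{fname}({name}_t *p, {ctype} v) "
                       f"{{ p->{fname} = v; }}")
    return "\n".join(out)

print(emit_c(SPEC))  # paste the output into generated .h/.c files
```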
The authors have been developing knowledge-based tools to support the evolutionary development of specifications. Evolution is accomplished by means of evolution transformations, which are meaning-changing transformations applied to formal specifications. A sizable library of evolution transformations has been developed for the specification language Gist. The authors assess the results of their previous work on evolution transformations. They then describe their current efforts to build a versatile, usable evolution transformation library. They have identified important dimensions along which to describe transformation functionality, so that it is possible to assess the coverage of a library along each dimension. The potential applicability of this formal evolution paradigm to other environments is assessed.
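Gist syntax is not reproduced in the abstract; as a toy illustration of what a meaning-changing evolution transformation does (the specification representation below is invented, not Gist), one can mechanically relax a constraint in a spec so that previously invalid states become valid:

```python
# Toy illustration of a meaning-changing "evolution transformation" on a
# formal specification; the spec representation is invented, not Gist.
spec = {
    "type": "Order",
    "fields": {"quantity": {"range": (1, 10)}},
    "invariants": ["quantity within range"],
}

def widen_range(spec, field, new_max):
    """Evolution transformation: relax an upper bound. This changes the
    specification's meaning -- states that were invalid become valid."""
    lo, hi = spec["fields"][field]["range"]
    assert new_max >= hi, "widening only"
    return {**spec,
            "fields": {**spec["fields"], field: {"range": (lo, new_max)}}}

evolved = widen_range(spec, "quantity", 100)
print(evolved["fields"]["quantity"]["range"])   # (1, 100)
```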
ISBN (print): 9061918936
Time-settlement data from more than 30 different sections of the 55 km long Bangna-Bangpakong Highway were studied by inverse analysis. The deformation parameters, namely the undrained modulus, Eu; the drained modulus, E′; and the coefficient of consolidation, Cv, were back-figured from the field performance of the highway embankment, and the following correlations were found: Eu/Suv = 150, E′/Suv = 15, and Cv(field)/Cv(lab) = 26, where Suv is the vane shear strength. It was also found that Cv values were overestimated by the method of Asaoka (1978) when the during-construction time-settlement curve was used, and best estimated from post-construction data. For prediction of construction settlements, the method of Cox (1981), which combines the method of D'Appolonia et al. (1971) for immediate settlements with that of Leroueil et al. (1978) for consolidation settlements, underpredicted settlements at some sections but yielded conservative estimates when secondary settlements were added from the beginning of construction. A good estimate of long-term settlement was obtained for firmer sections by the method of Skempton & Bjerrum (1957), while the method of Asaoka (1978) generally underpredicted. The elastic method of Davis & Poulos (1968) gave the best estimates of both construction and post-construction settlements when back-figured parameters were used.
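For readers unfamiliar with the Asaoka (1978) construction referenced above, a minimal sketch (with an invented settlement record) fits successive equal-interval settlements s_i against s_(i-1) and reads the ultimate settlement off the fixed point of the fitted line; the fitted slope is also what lets field consolidation rates be back-figured from such records:

```python
# Minimal sketch of Asaoka's (1978) method: plot settlement s_i against
# s_(i-1) at equal time intervals, fit the line s_i = b0 + b1*s_(i-1), and
# take the ultimate settlement as the fixed point b0/(1 - b1).
# The settlement record below is invented for illustration.
import numpy as np

s = np.array([0.30, 0.45, 0.56, 0.64, 0.70, 0.74])   # m, equal time steps
b1, b0 = np.polyfit(s[:-1], s[1:], 1)                # s_i vs s_(i-1)
s_ult = b0 / (1.0 - b1)
print(f"slope {b1:.3f}, ultimate settlement {s_ult:.2f} m")
```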
To deal with the sometimes overwhelming volume of performance and availability tools, busy and inexperienced network operations personnel desperately need assistance. Requirements for such assistance are discussed. A ...
ISBN (print): 0818606347
The availability of high-performance, low-cost microprocessors has removed one of the obstacles to the construction of multiprocessor systems which are cost-performance competitive from entry levels through large systems. Advances in the understanding of computer structure and a focus on multiprocessor design practices have removed the performance bottlenecks associated with multiprocessor systems of the past. A better understanding of multiprocessor scheduling and synchronization in the symmetric Pool Processor Architecture, built on the portable base of the UNIX system interface, has removed the obstacle to easy application of multiprocessors to single applications. Current research work has produced techniques and tools to facilitate, and in some cases automate, the decomposition of programs into parallel execution streams. A computer is described which exploits the removal of these barriers. The Balance 8000 is a multimicrocomputer system based on the National Semiconductor Series 32000 microprocessor. It runs DYNIX, a version of UNIX 4.2bsd, which exploits the power of from 2 to 12 coordinated microprocessors to achieve a range of performance which approaches the sum of the performance of the individual processors.
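The abstract does not detail the decomposition techniques; as a hedged modern analogue of the idea of splitting one program into parallel execution streams scheduled over a pool of processors (using Python's multiprocessing here, not DYNIX primitives), consider:

```python
# Modern analogue of the decomposition idea in the abstract: split one
# computation into independent streams and schedule them over a pool of
# processors (Python multiprocessing stands in for DYNIX facilities).
from multiprocessing import Pool

def partial_sum(bounds):
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

if __name__ == "__main__":
    n, workers = 1_000_000, 4
    step = n // workers
    chunks = [(k * step, (k + 1) * step) for k in range(workers)]
    with Pool(workers) as pool:               # one stream per processor
        total = sum(pool.map(partial_sum, chunks))
    print(total == sum(i * i for i in range(n)))   # True
```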
ISBN (digital): 9783642367427
ISBN (print): 9783642367410
This book constitutes the proceedings of the 19th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2013, held in Rome, Italy, in March 2013.
The 42 papers presented in this volume were carefully reviewed and selected from 172 submissions. They are organized in topical sections named: Markov chains; termination; SAT/SMT; games and synthesis; process algebra; pushdown; runtime verification and model checking; concurrency; learning and abduction; timed automata; security and access control; frontiers (graphics and quantum); functional programs and types; tool demonstrations; explicit-state model checking; Büchi automata; and competition on software verification.