A probabilistic approach is developed which allows the explicit and quantitative representation of the uncertainties inherent in innovative technologies. Probabilistic analyses provide insights into the uncertainties in process performance and cost not possible with conventional deterministic or sensitivity analysis. Applications of the approach are illustrated via analyses of the performance and cost of the fluidized bed copper oxide process, an advanced technology for the control of SO2 and NOx emissions from coal-fired power plants, and three integrated gasification combined cycle (IGCC) systems. Engineering performance and cost models of conceptual commercial-scale systems for each technology provide the basis for the analysis. For each technology evaluated, uncertainties in performance and cost parameters of the engineering models were explicitly characterized using probability distributions. Estimates of uncertainty were based on literature review, data analysis, and elicitation of the expert judgment of process engineers involved in technology development. The engineering models were exercised in probabilistic modeling environments to characterize the uncertainties in key measures of process performance and cost. The resulting uncertainties in performance and cost provide a quantitative measure of the risk of either poor performance or high cost associated with innovative process technologies. The key input uncertainties that drive uncertainty in performance and cost can be identified and prioritized. Thus, probabilistic analysis has direct implications for cost estimating, risk assessment, and research planning. Competing technologies are compared probabilistically to quantify the probability that an advanced technology will have higher performance and lower cost than conventional technology. Additional research is assumed to reduce the uncertainty in key input parameters. Therefore, the expected pay-off from additional research can be quantified.
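As a concrete illustration of the approach, the sketch below propagates assumed input uncertainties through a toy cost model by Monte Carlo sampling and computes the probability that the advanced process beats a conventional one on cost. All distributions, parameters, and cost relationships here are hypothetical placeholders, not the thesis's elicited values.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000  # Monte Carlo samples

# Illustrative input uncertainties (hypothetical, not the thesis's values):
removal_eff = rng.triangular(0.70, 0.90, 0.97, N)              # SO2 removal fraction
capital = rng.lognormal(mean=np.log(200), sigma=0.15, size=N)  # capital cost, $/kW
sorbent_cost = rng.normal(55.0, 8.0, N)                        # sorbent cost, $/ton

# Toy engineering cost model: levelized cost rises with capital cost and
# with sorbent use, which in turn rises with removal efficiency.
cost_advanced = 0.02 * capital + 0.4 * sorbent_cost * removal_eff

# Conventional technology with a narrower (better-known) cost distribution.
cost_conventional = rng.normal(30.0, 2.0, N)

print("mean advanced cost:", cost_advanced.mean())
print("90% interval:", np.percentile(cost_advanced, [5, 95]))
# Probability the advanced process beats the conventional one on cost:
print("P(advanced < conventional):", (cost_advanced < cost_conventional).mean())
```

The key-driver ranking the abstract mentions would follow by correlating each sampled input against the output cost and sorting by magnitude.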
There are a wide variety of analytical tools that can be employed in the validation of modal test data. These tools can be used to improve the quality of test-measured data and the subsequent correlation to analysis results. Several of these tools were applied to the modal test of the Atlas II payload fairing. A new Atlas II 11-foot-diameter payload fairing was designed for the new generation of General Dynamics Atlas launch vehicles. The required validation of the design for launch and separation events involved a modal test of the test article and correlation of a finite element model. The correlated finite element model will then be used in a series of analyses to verify important mission events. Modal tests of two configurations were performed: (1) a free-free test of a half payload fairing for verification of the model for separation and (2) a fixed-base test of the complete payload fairing for verification of the model for the launch and boost phases of flight. Modeling and testing two configurations allowed multiple verifications through the correlation process. Typical verification methods, using orthogonality and modal assurance criteria (MAC), were employed during the modal surveys. During one of the surveys, errors were detected in the test data using an improved mode shape comparison technique, a diagnostic local hybrid MAC (DLHMAC). This new approach proved superior to the Coordinate MAC (COMAC) in identifying errors in the test mode shapes, allowing subsequently improved correlation to the analysis data.
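For reference, the standard MAC compares a test mode shape with an analysis mode shape via a normalized inner product; the DLHMAC and COMAC variants discussed in the paper are not reproduced here. A minimal sketch, assuming real-valued mode shapes stored one per column:

```python
import numpy as np

def mac(phi_test: np.ndarray, phi_fem: np.ndarray) -> np.ndarray:
    """Modal Assurance Criterion matrix for real mode shapes.

    phi_test, phi_fem: (n_dof, n_modes) arrays, one mode shape per column.
    MAC[i, j] = |phi_test[:,i]^T phi_fem[:,j]|^2 /
                ((phi_test[:,i]^T phi_test[:,i]) * (phi_fem[:,j]^T phi_fem[:,j]))
    """
    num = np.abs(phi_test.T @ phi_fem) ** 2
    denom = np.outer(np.sum(phi_test**2, axis=0), np.sum(phi_fem**2, axis=0))
    return num / denom

# Example: nearly identical shape sets give MAC ~ 1 on the diagonal.
rng = np.random.default_rng(1)
phi = rng.standard_normal((50, 3))
print(np.round(mac(phi, phi + 0.01 * rng.standard_normal((50, 3))), 3))
```

Values near 1 indicate consistent test/analysis mode pairs; off-diagonal values near 0 indicate the expected independence between distinct modes.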
As part of DOE's Clean Coal Technology program, a field evaluation project is underway in Illinois to demonstrate Gas Reburning-Sorbent Injection (GR-SI) technology for controlling the emissions of the acid rain precursor species, NOx and SO2, from coal-fired utility boilers. GR-SI provides an attractive, cost-effective control technology that can be used in retrofit applications on coal-fired utility boilers. The project co-funders are the Gas Research Institute and the State of Illinois Department of Energy and Natural Resources, with Energy and Environmental Research Corporation as the prime contractor. GR controls the emission of NOx by staged fuel introduction, while SI consists of the injection of dry, calcium-based sorbents for SO2 capture. Two utility boilers representative of pre-NSPS design practices, a 71 MWe (net) tangentially fired unit and a 33 MWe (net) cyclone fired unit, serve as the host sites. The project is structured into three phases: (I) Design and Permitting, (II) Construction and Startup, and (III) Operation, Data Collection, Reporting and Disposition. Phase I of the project has been completed. Phase II has been completed at the tangentially fired site, where Phase III testing has begun. The Phase II construction outage at the cyclone fired site occurred in May and June 1991. Phase I process design specifications utilizing chemical kinetics, fluid mixing, and heat transfer modeling established the basis for detailed engineering designs of the GR-SI hardware systems. The process simulation work also showed that target emission reductions of 60% in NOx and 50% in SO2 would be achievable. Following design and installation, successful demonstration of the GR-SI technology has occurred through operation and testing at the tangentially fired unit, where emission reductions of 60% in NOx and 50% in SO2 have been attained. This paper will summarize the technical aspects of the demonstration, report on the schedule and progress of the project, and discuss the operating results.
Author:
T. Knapp, Institut für Regelungstechnik
Fachbereich Regelsystemtechnik und Prozeßautomatisierung, Technische Hochschule Darmstadt, Darmstadt, Germany
A description is given of how a PC can be used for the analysis, design, and test of digital control systems. Special emphasis is given to a program package called CADREG-PC, with which a control algorithm can be easily designed and tested without any time-consuming theoretical modeling.
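CADREG-PC itself is not documented here, but the workflow it supports — design a discrete control algorithm and test it against a simulated plant without detailed theoretical modeling — can be sketched as follows. The plant dynamics, sample time, and PID gains are illustrative assumptions:

```python
# Hypothetical first-order plant discretized with sample time T:
# y[k+1] = a*y[k] + b*u[k]   (a, b chosen for illustration only)
T, a, b = 0.1, 0.9, 0.1

# Discrete PID in positional form; gains are illustrative, not from CADREG-PC.
Kp, Ki, Kd = 2.0, 1.0, 0.05

def simulate(steps=100, setpoint=1.0):
    y, integ, prev_err = 0.0, 0.0, 0.0
    trace = []
    for _ in range(steps):
        err = setpoint - y
        integ += err * T                 # rectangular integration
        deriv = (err - prev_err) / T     # backward difference
        u = Kp * err + Ki * integ + Kd * deriv
        prev_err = err
        y = a * y + b * u                # plant update
        trace.append(y)
    return trace

out = simulate()
print("final output:", round(out[-1], 3))  # settles near the setpoint
```

Iterating on the gains against the simulated response is the kind of design-and-test loop the abstract describes.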
The concepts and techniques of object-oriented modeling are used to model both the assessment knowledge and the part definition data in order to automate manufacturability assessment. Each type of knowledge (i.e., assessment and control knowledge) is represented in terms of production rules and abstracted into objects to facilitate the management of knowledge and the modeling of the entire assessment process. Part definition data are represented in an object-oriented part model with the characteristics of data abstraction, adaptability, and modularization. The behavior of the assessment process is defined in terms of methods in both the part models and the assessment process objects. The approaches to conflict resolution used in the system are also considered.
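A minimal sketch of the idea, with production rules abstracted into objects that assess an object-oriented part model; all class names, attributes, and thresholds below are hypothetical, not the paper's:

```python
class Part:
    """Object-oriented part model (hypothetical attributes)."""
    def __init__(self, wall_thickness_mm, hole_diameters_mm):
        self.wall_thickness_mm = wall_thickness_mm
        self.hole_diameters_mm = hole_diameters_mm

class Rule:
    """A production rule as an object: a condition plus a conclusion."""
    def __init__(self, name, condition, message):
        self.name, self.condition, self.message = name, condition, message
    def fire(self, part):
        return self.message if self.condition(part) else None

rules = [
    Rule("min-wall", lambda p: p.wall_thickness_mm < 1.0,
         "Wall too thin to machine reliably"),
    Rule("min-hole", lambda p: any(d < 2.0 for d in p.hole_diameters_mm),
         "Hole below minimum drillable diameter"),
]

def assess(part):
    """Control knowledge, simplified: fire every rule, collect findings."""
    return [msg for rule in rules if (msg := rule.fire(part))]

print(assess(Part(wall_thickness_mm=0.8, hole_diameters_mm=[1.5, 5.0])))
```

Wrapping rules as objects, as here, is what lets the assessment process itself be managed and extended like any other part of the model.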
The authors propose a new knowledge acquisition method for solving planning problems. Because the original teaching data include ambiguous and faulty examples, the data distribution is first evaluated by discriminant analysis. Then, by introducing the concepts of confidence probability and a suspended region, effective data are selected and used as teaching data for the version space method. The proposed method can acquire strategic knowledge in concise descriptions and has been successfully installed in several scheduling and operation assignment systems. These applications confirm a reduction in the knowledge acquisition effort.
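The two-stage idea can be sketched as follows: a discriminant score screens out low-confidence examples (the role of the paper's suspended region), and the surviving positives feed a version-space learner. For brevity the learner below computes only the specific boundary (Find-S) of a conjunctive version space; the data, attributes, and thresholds are illustrative:

```python
import numpy as np

# Stage 1: screen noisy teaching data with a linear discriminant score and
# keep only examples outside the "suspended region" near the boundary.
rng = np.random.default_rng(2)
X = rng.normal(0, 1, (200, 2))
y = (X[:, 0] + X[:, 1] + rng.normal(0, 0.8, 200) > 0)  # noisy labels
score = X @ np.array([1.0, 1.0])                       # discriminant axis
confident = np.abs(score) > 1.0                        # outside suspended region
X_sel, y_sel = X[confident], y[confident]

# Stage 2: learn the maximally specific hypothesis (Find-S) over
# conjunctive attribute-value descriptions from the kept positives.
def find_s(examples):
    """examples: list of attribute tuples, all labeled positive."""
    h = list(examples[0])
    for ex in examples[1:]:
        h = [hi if hi == xi else "?" for hi, xi in zip(h, ex)]
    return h

# Discretize the selected positives into coarse attributes for the learner.
pos = [tuple("high" if v > 0 else "low" for v in row)
       for row, label in zip(X_sel, y_sel) if label]
print("most specific hypothesis:", find_s(pos))
```

The full version space method also maintains a general boundary; the filtering step is what keeps ambiguous examples from collapsing either boundary prematurely.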
The authors explore the possibility of using EEG (electroencephalographic) signals for automatic machine classification of a patient's level of anesthesia. EEG data obtained under different levels of anesthesia have been modeled as an AR (autoregressive) process for that purpose. It is shown that the AR model order, the AR power spectral density, and the second and fourth moments of the probability density function of the EEG signals can be used to classify the level of anesthesia as low, medium, or high.
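A sketch of the feature pipeline on a simulated signal: fit an AR model by Yule-Walker, evaluate its power spectral density, and take the second and fourth moments of the amplitude distribution. The model order and the synthetic signal are assumptions for illustration, not EEG data:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 2048, 6                       # samples, assumed AR order
x = rng.standard_normal(n)
for k in range(2, n):                # synthetic second-order resonance
    x[k] += 1.3 * x[k - 1] - 0.75 * x[k - 2]

def yule_walker(x, p):
    """Estimate AR(p) coefficients and innovation variance."""
    x = x - x.mean()
    r = np.correlate(x, x, "full")[len(x) - 1:len(x) + p] / len(x)
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    a = np.linalg.solve(R, r[1:p + 1])    # AR coefficients
    sigma2 = r[0] - a @ r[1:p + 1]        # innovation variance
    return a, sigma2

a, sigma2 = yule_walker(x, p)
freqs = np.linspace(0, 0.5, 256)          # normalized frequency
H = np.exp(-2j * np.pi * np.outer(freqs, np.arange(1, p + 1))) @ a
psd = sigma2 / np.abs(1 - H) ** 2         # AR power spectral density

m2 = np.mean((x - x.mean()) ** 2)         # second moment (variance)
m4 = np.mean((x - x.mean()) ** 4)         # fourth moment
print("AR coefs:", np.round(a, 3), "| peak freq:", freqs[psd.argmax()])
print("m2:", round(m2, 2), "m4:", round(m4, 2))
```

These per-epoch features would then be fed to any standard classifier to separate low, medium, and high anesthesia levels.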
Super Simulation Shell (SSS) is a simulation system based on KEE and SimKit from IntelliCorp. The purpose of SSS is to provide steel plant engineers with an environment in which they can easily build computer simulation models. These engineers have difficulty using existing systems to solve problems such as the determination of a transporter's potential, the number of machines, operation control, facility layout, and the estimation of in-process inventory and lead time. SSS is based on the technologies of object-oriented programming; some of these are provided by KEE and SimKit, and others come from the concepts and components of steel plants. SSS also includes functions to easily input attributes on screen, to generate animations for model validation, to suggest control strategies, and to collect output data for analysis.
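A minimal discrete-event sketch of the kind of steel-plant model SSS targets: jobs queue for a machine and are then transported, yielding lead-time estimates. Entity names, times, and counts are illustrative and unrelated to KEE/SimKit internals:

```python
import heapq

def simulate(n_jobs=5, process_time=4.0, transport_time=2.0):
    """Jobs arrive every 3.0 time units, share one machine, then ship."""
    events = [(i * 3.0, "arrive", i) for i in range(n_jobs)]  # (time, type, job)
    heapq.heapify(events)
    machine_free_at = 0.0
    finished = {}
    while events:
        t, kind, job = heapq.heappop(events)
        if kind == "arrive":
            start = max(t, machine_free_at)          # queue if machine busy
            machine_free_at = start + process_time
            heapq.heappush(events, (machine_free_at, "transport", job))
        elif kind == "transport":
            heapq.heappush(events, (t + transport_time, "done", job))
        else:
            finished[job] = t
    return finished

done = simulate()
lead_times = {j: done[j] - j * 3.0 for j in done}    # in-process time per job
print("completion times:", done)
print("lead times:", lead_times)
```

Systems like SSS layer object-oriented facility models, on-screen attribute entry, and animation on top of exactly this kind of event-driven core.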
The R&M 2000 Workstation, a microcomputer-based modeling and analysis environment, has been developed to evaluate weapon systems with respect to the five reliability and maintainability (R&M) 2000 goals. The workstation helps the logistician, engineer, or program manager identify relevant information and evaluate the potential effects of alternative actions in support of a particular logistics analysis. The workstation evaluates all issues against consistent assessment criteria, wherein 'workfiles' can be manipulated to quantify supportability assessments and to perform logistics trade-offs. By maintaining a 'control' using consistent, established assessment criteria throughout the analysis process, an audit trail is formed whereby all supportability decisions may be traced back through the decision process.
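The workfile-with-audit-trail idea can be sketched as a data structure that records every supportability decision against a fixed 'control' baseline; the field names and criteria below are hypothetical, not the workstation's:

```python
import datetime

class Workfile:
    """Tracks decisions against a fixed baseline so each can be traced."""
    def __init__(self, baseline):
        self.baseline = dict(baseline)   # the 'control': fixed criteria
        self.current = dict(baseline)
        self.audit = []                  # (timestamp, field, old, new, reason)

    def decide(self, field, new_value, reason):
        old = self.current[field]
        self.current[field] = new_value
        self.audit.append((datetime.datetime.now().isoformat(timespec="seconds"),
                           field, old, new_value, reason))

wf = Workfile({"spares_level": 100, "test_equipment": "common"})
wf.decide("spares_level", 80, "trade-off: reduced mobility footprint")
for entry in wf.audit:
    print(entry)
```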
Authors:
R.V. Iyer, S. Ghosh
LEMS, Division of Engineering, Brown University, Providence, RI, USA
In DARYN, the decision process for every train is executed by an on-board processor that negotiates for temporary ownership of tracks with the respective station controlling those tracks, through explicit processor-to-processor communication primitives. This processor then computes its own route utilizing the results of its negotiation, its knowledge of the track layout of the entire system, and its evaluation of the cost function. Every station's decision process is also executed by a dedicated processor that maintains absolute control over a given set of tracks and participates in the negotiation with the trains. Since the computational responsibility is distributed over all the logical entities of the system, DARYN offers the potential of superior performance over the traditional uniprocessor approach. The development of a realistic model of a railway network based on the DARYN approach and its implementation on a loosely coupled parallel processor system are reported. Experimental results indicate DARYN's feasibility, its significant superiority over the traditional approach, and, in general, its performance scalability.
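The negotiation protocol can be sketched in a single-process mock, standing in for the paper's processor-to-processor messages: a train requests temporary ownership of its next track from the controlling station and waits (or would reroute) on refusal. Names and behavior are illustrative:

```python
class Station:
    """Controls a set of tracks and grants temporary ownership."""
    def __init__(self, tracks):
        self.owner = {t: None for t in tracks}   # track -> train id or None
    def request(self, track, train_id):
        if self.owner[track] is None:
            self.owner[track] = train_id          # grant ownership
            return True
        return False                              # refuse: track occupied
    def release(self, track, train_id):
        if self.owner[track] == train_id:
            self.owner[track] = None

class Train:
    """On-board decision process: negotiate, then advance along its route."""
    def __init__(self, tid, route):
        self.tid, self.route = tid, list(route)
    def advance(self, station):
        nxt = self.route[0]
        if station.request(nxt, self.tid):
            self.route.pop(0)
            return f"train {self.tid} enters {nxt}"
        return f"train {self.tid} waits for {nxt}"  # or reroute via cost fn

station = Station(tracks=["T1", "T2"])
a, b = Train("A", ["T1", "T2"]), Train("B", ["T1"])
print(a.advance(station))   # A acquires T1
print(b.advance(station))   # B must wait: T1 owned by A
station.release("T1", "A")
print(b.advance(station))   # now B acquires T1
```

Because each train and each station owns its own decision logic, the same exchange maps directly onto one processor per entity, which is the source of DARYN's scalability claim.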