The transition from Java 1.4 to Java 1.5 has provided the programmer with more flexibility due to the inclusion of several new language constructs, such as parameterized types. This transition is expected to increase the number of class clusters exhibiting different combinations of class characteristics. In this paper we investigate how the number and distribution of clusters are expected to change during this transition. We present the results of an empirical study in which we analyzed applications written in both Java 1.4 and 1.5. In addition, we show how the variability of the combinations of class characteristics may affect the testing of class members. (C) 2008 Elsevier Inc. All rights reserved.
To maintain sensitivity to new physics in the coming years of Large Hadron Collider (LHC) operations, the A Toroidal LHC ApparatuS (ATLAS) collaboration has been working on upgrading a portion of the front-end (FE) electronics and replacing some parts of the detector with new devices that can operate under the much harsher background conditions of future LHC runs. The legacy FE of the ATLAS detector sent data to the data acquisition (DAQ) system via the so-called Read Out Drivers (RODs), custom-made VMEbus boards devoted to data processing, configuration, and control. The data were then received by the Read Out System (ROS), which was responsible for buffering them during the High-Level Trigger (HLT) processing. From Run 3 onward, all new trigger and detector systems will be read out using new components, replacing the combination of the ROD and the ROS. This new path will feature an application called the Software Read Out Driver (SW ROD), which will run on a commodity server receiving FE data via the Front-End Link eXchange (FELIX) system. The SW ROD will perform event fragment building and buffering as well as serving the data on request to the HLT. The SW ROD application has been designed as a highly customizable high-performance framework providing support for detector-specific event building and data processing algorithms. The implementation that will be used for Run 3 of the LHC is capable of building event fragments at a rate of 100 kHz from an input stream consisting of up to 120 MHz of individual data packets. This document will cover the design and the implementation of the SW ROD application and will present the results of performance measurements executed on the server models selected to host SW ROD applications during Run 3.
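The core task the abstract describes, assembling many per-link packets into one fragment per event, can be sketched as follows. This is a minimal illustration of the general pattern, not the actual ATLAS SW ROD code; the packet fields and fixed link count are assumptions for the example.

```python
from collections import defaultdict

def build_fragments(packets, links_per_event):
    """Group packets by event ID; emit a fragment once all links have arrived.

    Each packet is a (event_id, link_id, payload) tuple -- an assumed layout
    for illustration only.
    """
    pending = defaultdict(list)
    fragments = {}
    for event_id, link_id, payload in packets:
        pending[event_id].append((link_id, payload))
        if len(pending[event_id]) == links_per_event:
            # Order payloads by link so the fragment layout is deterministic.
            fragments[event_id] = b"".join(
                p for _, p in sorted(pending.pop(event_id))
            )
    return fragments

# Packets from two links arrive interleaved across two events.
packets = [
    (1, 0, b"aa"), (2, 0, b"cc"),
    (1, 1, b"bb"), (2, 1, b"dd"),
]
frags = build_fragments(packets, links_per_event=2)
print(frags)  # {1: b'aabb', 2: b'ccdd'}
```

A production implementation must additionally handle lost packets, timeouts, and buffering for the HLT; the sketch shows only the aggregation step.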
Soil profile data that characterize the physical and chemical properties of a soil are among the required set of inputs for ecological, crop, and other dynamic simulation models. A web-based soil information system often provides site-specific soil data whose formats are not readily compatible with crop models. The Soil daTA Retrieval Tool (START) was developed to automate a series of procedures for the preparation of soil input data, including the retrieval of soil profile data from the information system, reorganization of the data, estimation of soil parameters, and creation of input files for simulation models. In a case study, START was implemented to support the SoilGrids database operated by the International Soil Reference and Information Center. START required only about 0.33% of the time needed by manual preparation to create soil input files. These results suggest that START could provide an efficient approach for the preparation of soil input files, especially for sites where little soil information is available.
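The reorganization step the abstract mentions, reshaping layered profile data into a model-ready input file, can be sketched as below. The field names and the fixed-width output layout are illustrative assumptions, not the actual SoilGrids schema or a real crop-model format.

```python
# Hypothetical soil profile: one record per soil layer (assumed fields).
profile = [
    {"depth_cm": 5,  "clay_pct": 22.0, "sand_pct": 40.0, "ph": 6.1},
    {"depth_cm": 15, "clay_pct": 25.0, "sand_pct": 38.0, "ph": 6.3},
]

def to_model_input(layers):
    """Render layers as fixed-width text lines, one per soil horizon."""
    lines = ["DEPTH  CLAY  SAND   PH"]
    for lay in layers:
        lines.append(f"{lay['depth_cm']:>5d} {lay['clay_pct']:>5.1f} "
                     f"{lay['sand_pct']:>5.1f} {lay['ph']:>4.1f}")
    return "\n".join(lines)

print(to_model_input(profile))
```

A tool like START additionally estimates missing parameters (e.g. hydraulic properties) before writing the file; the sketch covers only the format conversion.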
One of the main drawbacks of the Element Free Galerkin (EFG) method is its dependence on moving least squares shape functions, which do not satisfy the Kronecker delta property; as a result, Dirichlet boundary conditions cannot be applied directly. The aim of the present paper is to discuss different aspects of three widely used methods for applying Dirichlet boundary conditions in the EFG method: Lagrange multipliers, the penalty method, and coupling with the finite element method. Numerical simulations are presented to compare the results of these methods from the perspective of accuracy, convergence, and computational expense. These methods have been implemented in an object-oriented programming environment, called INSANE, and the results are presented and compared with the analytical solutions.
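Of the three approaches, the penalty method is the simplest to illustrate: since the shape functions lack the Kronecker delta property, a nodal value cannot be fixed directly, so a large penalty term is added to drive the solution toward the prescribed value. The tiny two-unknown system below is an illustration of the technique only, not the paper's implementation.

```python
def apply_dirichlet_penalty(K, f, node, value, alpha=1e8):
    """Augment K u = f so that u[node] is driven toward `value`.

    Adds alpha to the diagonal entry and alpha*value to the load vector;
    as alpha grows, the constraint equation dominates that row.
    """
    Kp = [row[:] for row in K]
    fp = f[:]
    Kp[node][node] += alpha
    fp[node] += alpha * value
    return Kp, fp

def solve2x2(K, f):
    """Direct solution of a 2x2 system by Cramer's rule."""
    det = K[0][0] * K[1][1] - K[0][1] * K[1][0]
    return [(f[0] * K[1][1] - K[0][1] * f[1]) / det,
            (K[0][0] * f[1] - f[0] * K[1][0]) / det]

# Illustrative stiffness matrix and load vector.
K = [[2.0, -1.0], [-1.0, 2.0]]
f = [1.0, 0.0]

Kp, fp = apply_dirichlet_penalty(K, f, node=1, value=0.5)
u = solve2x2(Kp, fp)
print(u)  # u[1] is approximately 0.5, the prescribed boundary value
```

The trade-off the paper examines shows up even here: the constraint is only satisfied approximately, and too large a penalty degrades the conditioning of the system, whereas Lagrange multipliers enforce the condition exactly at the cost of extra unknowns.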
Because of the extensive use of components, the Component-Based Software Engineering (CBSE) process is quite different from that of the traditional waterfall approach. CBSE not only requires focus on system specification and development, but also requires additional consideration of the overall system context, the properties of individual components, and the component acquisition and integration process. The term component-based software development (CBD) refers to the process of building a system using components. The CBD life cycle consists of a set of phases, namely, identifying and selecting components based on stakeholder requirements, integrating and assembling the selected components, and updating the system as components evolve over time with newer versions. This work presents an indicative literature survey of techniques proposed for different phases of the CBD life cycle. The aim of this survey is to help provide a better understanding of the different CBD techniques for each of these areas.
This paper proposes a methodology to solve contingency load transfer of distribution systems by applying an object-oriented expert system. With this method, the faulted area is isolated and restored, and the unfaulted but out-of-service area is restored effectively. Since fault restoration has to be performed within a short time period and the affected area has to be constrained to a small area, enhancing system reliability has become a critical issue for distribution system operation. The knowledge rule base with object-oriented programming can be designed to support distribution contingency management in an effective manner. The distribution facilities are designed as classes in the database, and the distribution operation rules are created to form the knowledge base. To demonstrate the effectiveness of the proposed method, one of the Taipower distribution systems is selected for computer simulation. It is concluded that the contingency load transfer of distribution systems can be solved efficiently by identifying the proper switching operations to solve the distribution contingency problem. (C) 1997 Elsevier Science S.A.
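The switching logic such a rule base encodes can be sketched in miniature: open the switches adjacent to the faulted section to isolate it, then close a tie switch to re-energize the healthy but out-of-service downstream area. The feeder layout, switch names, and rules below are hypothetical, not Taipower data or the paper's rule base.

```python
# Hypothetical three-section feeder: the source feeds S1, and TIE3
# connects S3 to a neighboring feeder for back-feed restoration.
sections = ["S1", "S2", "S3"]
switches = {"SW12": ("S1", "S2"), "SW23": ("S2", "S3")}
tie = "TIE3"

def restoration_plan(faulted):
    """Rule-based plan: isolate the faulted section, then back-feed via tie."""
    # Rule 1: open every switch adjacent to the faulted section.
    open_ops = sorted(sw for sw, ends in switches.items() if faulted in ends)
    # Rule 2: close the tie to restore the downstream out-of-service area
    # (no downstream area exists when the tie-side end section is faulted).
    close_ops = [tie] if faulted != "S3" else []
    return {"open": open_ops, "close": close_ops}

plan = restoration_plan("S2")
print(plan)  # {'open': ['SW12', 'SW23'], 'close': ['TIE3']}
```

A real expert system would also check feeder loading limits before closing a tie; the sketch shows only the isolate-then-transfer rule pattern.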
Real-time and on-line optimisation methods for control of large-scale systems use real-time sensor data to optimise the system's performance. The data need to be pre-processed for faults to safeguard against inappropriate control actions. Conventional real-time signal processing techniques are likely to fail because of the system's complexity and the processing speed required. A generic, real-time knowledge-based system-FLASH (FLexible Al for Signal Handling in real time)-has been developed to detect, diagnose and replace faulty sensor readings. Its effectiveness in processing signals for control in a complex, sensor-rich environment has been demonstrated in real-time environmental control of greenhouse microclimate. (C) 1998 Elsevier Science B.V. All rights reserved.
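The detect-diagnose-replace pattern described for FLASH can be sketched as below. The thresholds and the replacement rule (hold the last good value) are illustrative assumptions, not the actual FLASH knowledge base.

```python
# Assumed plausibility rules for a greenhouse temperature sensor (degC).
PLAUSIBLE = (0.0, 50.0)   # physically plausible reading range
MAX_STEP = 5.0            # maximum credible change between samples

def filter_signal(samples):
    """Flag out-of-range or implausibly fast-changing readings as faulty
    and replace them with the last good value."""
    cleaned, last_good = [], None
    for x in samples:
        in_range = PLAUSIBLE[0] <= x <= PLAUSIBLE[1]
        smooth = last_good is None or abs(x - last_good) <= MAX_STEP
        if in_range and smooth:
            last_good = x
        cleaned.append(last_good)   # faulty readings are replaced
    return cleaned

print(filter_signal([20.0, 21.0, 99.0, 22.0]))  # [20.0, 21.0, 21.0, 22.0]
```

A knowledge-based system like FLASH goes further, diagnosing *why* a sensor failed and estimating replacements from related sensors rather than simply holding the last value.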
Aspect-oriented programming (AOP), now practically a consolidated academic discipline, has yet to build more solid industrial foundations, especially in the realms of the .NET platform. It is believed that this situation is caused by the lack of a robust and user-friendly AOP tool for .NET comparable with the Java-based AspectJ. This work investigates the basic infrastructure required for building such a tool: aspect-oriented weaving within the common language runtime (CLR) environment. In this regard, a classification schema is built, analysing the attributes a hypothetical aspect weaver for .NET might have. It assesses the different classes of weavers that can be built on top of the CLR today and investigates what extensions to the platform would be needed in order to enable more sophisticated weaving technologies. Typical use cases for the resulting AOP tools are presented, and the attributes a corresponding weaver would need in order to fulfil these requirements are classified. Finally, two existing aspect weaver implementations are analysed in terms of these very same attributes.
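The weaving idea the paper classifies can be shown in a language-neutral way (Python is used here only to keep the example runnable; the paper itself targets the .NET CLR): advice from an aspect is woven around a join point so the base code never mentions the cross-cutting concern.

```python
import functools

calls = []  # the cross-cutting concern: a call trace

def trace_aspect(fn):
    """Weave tracing advice around a join point (a function call)."""
    @functools.wraps(fn)
    def woven(*args, **kwargs):
        calls.append(f"before {fn.__name__}")   # before advice
        result = fn(*args, **kwargs)
        calls.append(f"after {fn.__name__}")    # after advice
        return result
    return woven

@trace_aspect
def transfer(amount):
    return amount * 2   # base logic, unaware of the tracing concern

result = transfer(21)
print(result, calls)  # 42 ['before transfer', 'after transfer']
```

This is runtime wrapping, the lightest weaver class; the paper's classification also covers static bytecode weaving and load-time weaving, which on the CLR would manipulate assemblies or intercept class loading rather than wrap calls.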
This paper describes the development of an intelligent process-control system for the production of fiber-reinforced organic polymeric composites. The composite material consists of a polymer matrix (polyamide resin F174) and high-modulus quartz-fiber reinforcement. This composite material has good mechanical properties at high temperatures, and possesses a low dielectric constant, making it suitable for applications in missile radomes. The problem is that the raw materials are chemically reactive, and the process-control system must enable adaptation to variations in the temperature-time exposure of the raw materials and/or variations which may occur in the materials received from different suppliers. The uniqueness of the control system lies in that it is self-directing, and relies on information derived from sensors (laser fiber-optic probes and dielectric sensors) placed within the material. In addition, a materials-transformation model based on the chemical kinetics of the polymerization process calculates a number of key polymer parameters, such as degree of imidization, degree of cure, molecular weight distribution, and polydispersity ratio in situ. The collective ability to collect high-quality sensor information, to run sophisticated but robust process models in real time, to make complex decisions using artificial intelligence (AI), and to implement these decisions for controlling the structure of the actual material being processed represents a significant breakthrough in materials and process capability in this field. The focus of this work, measuring and controlling the physical and chemical properties of the material, rather than the physical attributes of the processing machinery, is an important paradigm shift. (C) 1998 Elsevier Science Ltd. All rights reserved.
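The materials-transformation model mentioned above tracks quantities such as the degree of cure over time. As a simple illustration of that kind of state variable, the sketch below integrates a first-order cure rate law, da/dt = k(1 - a), with explicit Euler steps; the rate law and constants are generic textbook assumptions, not the paper's chemical-kinetics model.

```python
def degree_of_cure(k, dt, steps, a0=0.0):
    """Integrate da/dt = k * (1 - a) with explicit Euler steps.

    a is the degree of cure, rising from a0 toward full cure (a -> 1).
    """
    a = a0
    history = [a]
    for _ in range(steps):
        a += dt * k * (1.0 - a)   # explicit Euler update
        history.append(a)
    return history

curve = degree_of_cure(k=0.1, dt=1.0, steps=30)
print(round(curve[-1], 3))  # ~0.958, approaching full cure
```

In the process-control system described, such model predictions are reconciled in real time with in situ sensor readings to decide on control actions.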
The Unified Modeling Language (UML) is the de facto standard for object-oriented software analysis and design modeling. However, few empirical studies exist which investigate the costs and evaluate the benefits of using UML in realistic contexts. Such studies are needed so that the software industry can make informed decisions regarding the extent to which they should adopt UML in their development practices. This is the first controlled experiment that investigates the costs of maintaining and the benefits of using UML documentation during the maintenance and evolution of a real nontrivial system, using professional developers as subjects, working with a state-of-the-art UML tool during an extended period of time. The subjects in the control group had no UML documentation. In this experiment, the subjects in the UML group had, on average, a practically and statistically significant 54 percent increase in the functional correctness of changes (p = 0.03) and an insignificant 7 percent overall improvement in design quality (p = 0.22), though a much larger improvement was observed on the first change task (56 percent), at the expense of an insignificant 14 percent increase in development time caused by the overhead of updating the UML documentation (p = 0.35).