This paper is organized as follows. Part A presents the context of reconfiguring transitic systems and the main idea in implementing the decision step. It comprises sections 1 to 3. Section 3 presents an example that illustrates the concepts presented in the next sections. Parts B and C express the models and principles used to simulate transitic systems, the result of which will be helpful for choosing the new configuration. Part B focuses mainly on models. It comprises sections 4 to 6. Part C focuses mainly on simulation principles. It comprises sections 7 to 10.
In this paper we present an approach to integrating optimisation tools with simulation software to achieve user-defined objectives. A formal protocol is presented, including specific definitions of the requirements of the simulator, optimiser, and design interface. A discussion of the mathematical issues and the efficiency of different approaches for computing sensitivities is given. We then discuss the source code modifications necessary to accomplish the integration of simulation and optimisation. (C) 1996 by Elsevier Science Inc.
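The abstract above mentions weighing the efficiency of different approaches for computing sensitivities when coupling a simulator to an optimiser. A minimal sketch of the trade-off, using a hypothetical quadratic `simulate` function as a stand-in for an expensive simulation run (not the paper's actual protocol or code):

```python
import numpy as np

def simulate(x):
    # Stand-in for an expensive simulation run; a hypothetical
    # quadratic response surface used purely for illustration.
    return (x[0] - 1.0) ** 2 + 10.0 * (x[1] + 0.5) ** 2

def forward_diff_grad(f, x, h=1e-6):
    # Forward differences: n + 1 simulation runs, O(h) accurate.
    f0 = f(x)
    g = np.zeros_like(x)
    for i in range(len(x)):
        xp = x.copy()
        xp[i] += h
        g[i] = (f(xp) - f0) / h
    return g

def central_diff_grad(f, x, h=1e-6):
    # Central differences: 2n simulation runs, O(h^2) accurate.
    g = np.zeros_like(x)
    for i in range(len(x)):
        xp = x.copy(); xp[i] += h
        xm = x.copy(); xm[i] -= h
        g[i] = (f(xp) - f(xm)) / (2 * h)
    return g

x0 = np.array([0.0, 0.0])
print(forward_diff_grad(simulate, x0))  # approx [-2, 10]
print(central_diff_grad(simulate, x0))  # approx [-2, 10]
```

The choice between the two is exactly the kind of efficiency/accuracy question the paper discusses: forward differences halve the number of simulation calls per gradient at the cost of one order of accuracy.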
OpenMAC is presented in this study as an innovative reconfigurable platform which overcomes the limitations of state-of-the-art experimental tools to test medium access control (MAC) protocols. The purpose of the OpenMAC platform is to simplify the prototyping process by enabling the implementation of MAC protocols designed in C++, relieving the protocol designer from the hardware and timing aspects, and thus avoiding the need to code optimised C/assembly or hardware description language (HDL). Aiming to reduce the hardware design and implementation costs with respect to custom hardware solutions, the OpenMAC platform has been implemented on an inexpensive off-the-shelf reconfigurable field programmable gate array (FPGA)-based development board with a processor embedded in the FPGA. The challenge presented by the proposed platform is to fulfil strict MAC time-constraints with straightforward, clean compiled C++ code. For this purpose, the OpenMAC platform introduces an innovative hardware/software partitioning concept for MAC protocol implementation which is based on a shared-memory architecture. Measurements carried out on an FPGA board demonstrate that this platform meets the short inter-frame space (SIFS) specification of the IEEE 802.11 standard, hence enabling field testing of prototyped MAC protocols.
We present an object-oriented framework, named DOOLINES , for non-linear static and dynamic analyses of slender marine structures which often appear in offshore structures employed in the petroleum and gas industries as, among others, flexible risers, steel catenary risers, umbilicals, floating hoses, and mooring lines. DOOLINES allows the rapid development of tailored, modular, reusable and extensible large-size systems, being itself extensible. These properties, along with the ease of use of our framework, are assessed by means of case studies. Code examples are provided.
This research examines the structural complexity of software and, specifically, the potential interaction of the two dominant dimensions of structural complexity, coupling and cohesion. Analysis based on an information processing view of developer cognition results in a theoretically driven model with cohesion as a moderator for a main effect of coupling on effort. An empirical test of the model was devised in a software maintenance context utilizing both procedural and object-oriented tasks, with professional software engineers as participants. The results support the model in that there was a significant interaction effect between coupling and cohesion on effort, even though there was no main effect for either coupling or cohesion. The implication of this result is that, when designing, implementing, and maintaining software to control complexity, both coupling and cohesion should be considered jointly, instead of independently. By providing guidance on structuring software for software professionals and researchers, these results enable software to continue as the solution of choice for a wider range of richer, more complex problems.
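The model described above, with cohesion moderating a main effect of coupling on effort, is a standard moderated regression. A minimal sketch on synthetic data (the data-generating process and coefficient values are illustrative assumptions, not the paper's dataset):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200
coupling = rng.uniform(0, 1, n)
cohesion = rng.uniform(0, 1, n)
# Hypothetical data in which effort is driven by the interaction term,
# mirroring the paper's finding of an interaction effect.
effort = 5.0 + 3.0 * coupling * (1.0 - cohesion) + rng.normal(0, 0.1, n)

# Moderation model: centre the predictors, then include their product.
c = coupling - coupling.mean()
h = cohesion - cohesion.mean()
X = np.column_stack([np.ones(n), c, h, c * h])
beta, *_ = np.linalg.lstsq(X, effort, rcond=None)

# beta[3] estimates the coupling x cohesion interaction coefficient;
# for this generating process it should be close to -3.
print(beta)
```

A significant `beta[3]` with small main-effect coefficients is the pattern the study reports: coupling and cohesion matter jointly rather than independently.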
While early knowledge-based systems suffered the frequent criticism of having little relevance to the real world, an increasing number of current applications deal with complex, real-world problems. Due to the complexity of real-world situations, no one general software technique can produce adequate results in different problem domains, and artificial intelligence usually needs to be integrated with conventional paradigms for efficient solutions. The complexity and diversity of real-world applications have also forced researchers in the AI field to focus more on the integration of diverse knowledge representation and reasoning techniques for solving challenging, real-world problems. Our development environment, BEST (Blackboard-based Expert Systems Toolkit), aims to provide the ability to produce large-scale, evolvable, heterogeneous intelligent systems. BEST incorporates the best of multiple programming paradigms in order to avoid restricting users to a single way of expressing either knowledge or data. It combines rule-based programming, object-oriented programming, logic programming, procedural programming and blackboard modelling in a single architecture for knowledge engineering, so that the user can tailor a style of programming to his application, using any or arbitrary combinations of methods to provide a complete solution. The deep integration of all these techniques yields a toolkit more effective even for a specific single application than any technique in isolation or collections of multiple techniques less fully integrated. Within the basic, knowledge-based programming paradigm, BEST offers a multiparadigm language for representing complex knowledge, including incomplete and uncertain knowledge. Its problem-solving facilities include truth maintenance, inheritance over arbitrary relations, temporal and hypothetical reasoning, opportunistic control, automatic partitioning and scheduling, and both blackboard and distributed problem-solving paradigms.
With increasing use of component-based development (CBD), the process for selecting software from repositories is a critical concern for quality systems development. As support for developers blending in-house and third party software, the context-driven component evaluation (CdCE) process provides a three-phase approach to software selection: filtering to a short list, functional evaluation and ranking. The process was developed through iterative experimentation on real-world data. CdCE has tool support to generate classifier models, shortlists and test cases as artefacts that provide for a repeatable, transparent process that can be reused as the system evolves. Although developed for software component selection, the CdCE process framework can be easily modified for other selection tasks by substituting templates, tools, evaluation criteria and/or repositories. In this article the authors describe the CdCE process and its development, the CdCE framework as a reusable pattern for software selection and provide a case study where the process is applied.
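The three-phase shape of CdCE (filtering to a shortlist, functional evaluation, ranking) can be sketched as a small pipeline. The candidate records, criteria, and weights below are illustrative assumptions, not the paper's actual tooling or templates:

```python
# Hypothetical candidate components from a repository.
candidates = [
    {"name": "libA", "license_ok": True,  "tests_passed": 18, "docs": 0.9},
    {"name": "libB", "license_ok": False, "tests_passed": 20, "docs": 0.8},
    {"name": "libC", "license_ok": True,  "tests_passed": 15, "docs": 0.6},
]

# Phase 1: filter to a shortlist on hard criteria.
shortlist = [c for c in candidates if c["license_ok"]]

# Phase 2: functional evaluation (here, a pass rate over 20 test cases).
for c in shortlist:
    c["score_func"] = c["tests_passed"] / 20.0

# Phase 3: weighted ranking combining the evaluation results.
weights = {"score_func": 0.7, "docs": 0.3}
ranked = sorted(shortlist,
                key=lambda c: sum(w * c[k] for k, w in weights.items()),
                reverse=True)
print([c["name"] for c in ranked])  # -> ['libA', 'libC']
```

Swapping the filter predicate, evaluation step, and weights corresponds to the substitution of templates, tools, and evaluation criteria that makes the process reusable for other selection tasks.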
This study introduces a method developed for a smart computer-aided design/manufacturing (CAD/CAM) system, in which layout design, process planning, and comprehensive computerized numerical control (CNC) code generation can be implemented to support laser cutting of holes, tapping, irregular and complicated profile processing, engraving, and burr back-scraping. The smart CAD/CAM (SCAM) system is developed as a commercial software application and first applied to a flexible sheet metal machining center (BGL 130R). In this study, a formal modeling method involving Petri nets and first-order predicate logic is proposed to develop the smart manufacturing system. High-level Petri nets are employed to achieve the formal application architecture design of data flow for various functions, and the first-order logic used to represent the process plan is defined and deduced according to the machining methods. The developed system possesses the following characteristics: (1) a sound and complete deductive system to implement various types of trajectory planning, automatic generation, and validation of the CNC code; (2) a convenient design input environment and readiness for re-design and modification by adding specific design functions and using standard design procedures on a widely used CAD/CAM package; (3) help for designers in sheet metal layout design, layout interference detection, process planning validation, pre-process manufacturing operation of CNC code generation, and auto-definition of storable file names; and (4) formal and simple human-computer interaction, automatic and intelligent process operations, and satisfaction of the requirements of the flexible sheet metal machining center (BGL 130R).
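The Petri-net modelling the abstract describes can be illustrated with a toy executor: transitions consume tokens from pre-places and produce them in post-places. The places and transitions below are hypothetical process-plan steps, not the paper's actual SCAM model:

```python
# Marking: token count per place.
marking = {"sheet_loaded": 1, "holes_cut": 0, "engraved": 0}

# Each transition maps to (pre-places consumed, post-places produced).
transitions = {
    "cut_holes": ({"sheet_loaded": 1}, {"holes_cut": 1}),
    "engrave":   ({"holes_cut": 1},    {"engraved": 1}),
}

def enabled(name):
    # A transition is enabled when every pre-place holds enough tokens.
    pre, _ = transitions[name]
    return all(marking[p] >= n for p, n in pre.items())

def fire(name):
    # Firing consumes the pre-set tokens and produces the post-set tokens.
    assert enabled(name), f"{name} is not enabled"
    pre, post = transitions[name]
    for p, n in pre.items():
        marking[p] -= n
    for p, n in post.items():
        marking[p] += n

fire("cut_holes")
fire("engrave")
print(marking)  # {'sheet_loaded': 0, 'holes_cut': 0, 'engraved': 1}
```

Checking `enabled` before `fire` is what lets a Petri-net model validate a process plan: an operation sequenced before its preconditions are met simply cannot fire.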
Purpose The purpose of this paper is to evaluate the capabilities of the generalized finite element method (GFEM) in the context of geometrically nonlinear analysis. The effect of large displacements and deformations, typical of such analysis, induces significant distortion of the element mesh, penalizing the quality of the standard finite element method approximation. The main concern here is to identify how the enrichment strategy of GFEM, which usually makes the method less susceptible to mesh distortion, may be used under the total and updated Lagrangian formulations. Design/methodology/approach An existing computational environment that allows linear and nonlinear analysis has been used to implement geometrically nonlinear analysis by GFEM, using different polynomial enrichments. Findings Geometrically nonlinear analyses using the total and updated Lagrangian formulations are carried out in GFEM. Classical problems are numerically simulated, and the accuracy and robustness of the GFEM are highlighted. Originality/value This study presents a novel investigation of GFEM analysis using a complete polynomial space to enrich the approximation in geometrically nonlinear analysis under the total and updated Lagrangian formulations. This strategy guarantees good precision of the analysis at higher levels of mesh distortion in the case of the total Lagrangian formulation. In the updated Lagrangian approach, on the other hand, the need to update the degrees of freedom during the incremental and iterative solution is identified and discussed here for the first time.
Empirical validation of code metrics has a long history of success. Many metrics have been shown to be good predictors of external features, such as correlation to bugs. Our study provides an alternative explanation to such validation, attributing it to the confounding effect of size. In contradiction to received wisdom, we argue that the validity of a metric can be explained by its correlation to the size of the code artifact. In fact, this work came about in view of our failure in the quest of finding a metric that is both valid and free of this confounding effect. Our main discovery is that, with the appropriate (non-parametric) transformations, the validity of a metric can be accurately (with R-squared values being at times as high as 0.97) predicted from its correlation with size. The reported results are with respect to a suite of 26 metrics, which includes the well-known Chidamber and Kemerer metrics. Concretely, it is shown that the more a metric is correlated with size, the more able it is to predict external feature values, and vice versa. We consider two methods for controlling for size, by linear transformations. As it turns out, controlling metrics for size tends to eliminate their predictive capabilities. We also show that the famous Chidamber and Kemerer metrics are no better than other metrics in our suite. Overall, our results suggest code size is the only "unique" valid metric.
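The confounding effect described above can be demonstrated on synthetic data: a metric that is only a noisy proxy for size appears to predict bugs, and the apparent validity collapses once size is controlled for. The generating process below is an illustrative assumption, not the paper's dataset, and regression residuals are only one of the size-control methods the paper considers:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 300
size = rng.lognormal(5, 1, n)               # lines of code per artifact
metric = 0.8 * size + rng.normal(0, 40, n)  # a size-confounded metric
bugs = 0.01 * size + rng.normal(0, 2, n)    # external feature to predict

def corr(a, b):
    return np.corrcoef(a, b)[0, 1]

# Raw "validity": the metric appears to predict bugs...
print(corr(metric, bugs))

# ...but after controlling for size (residuals of regressing the
# metric on size), most of the predictive power disappears.
X = np.column_stack([np.ones(n), size])
beta, *_ = np.linalg.lstsq(X, metric, rcond=None)
resid = metric - X @ beta
print(corr(resid, bugs))
```

The residuals are orthogonal to size by construction, so whatever correlation with bugs survives is the metric's contribution beyond size; here, essentially none.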