The paper presents how we solved the mediation challenge in a model-driven, service-oriented fashion, how we verify properties of the mediator via model checking in the jABC, and how to systematically export jABC/jETI-orchestrated services as Web services. Due to the limited maturity of the involved environments and external components, the latter task is harder than expected, and the solutions possible today are less stable than one would expect from these technologies.
Data flow processing is a common task of embedded systems and is usually modeled as a pipeline. Errors in one block of this pipeline can propagate through it, leading to unexpected and erroneous behaviors. For safety-related applications, the pipeline has to be able to identify and react to failures. The DMOSES model-driven development method uses deterministic UML activities to describe and implement data flow processing, ensuring deterministic behavior of concurrent processing. Design by Contract defines formal, precise, and verifiable interfaces for software components. We propose a development method for safe data flow processing based on integrating this concept into deterministic UML activities. The integration allows errors to be identified by detecting contract violations. This paper presents an extension of the DMOSES tool for contract verification at the model level and contract monitoring at runtime.
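The abstract gives no code; as an illustrative sketch only, the Python fragment below shows the general Design by Contract idea applied to a pipeline block: a precondition is checked on each block's input and a postcondition on its output, so a violation localizes the faulty block. The names (contracted, scale, limit) are hypothetical and do not correspond to the DMOSES tool's actual interface.

```python
# Minimal sketch (not the DMOSES API) of contract monitoring around pipeline blocks.

class ContractViolation(Exception):
    """Raised when a block's precondition or postcondition fails."""


def contracted(requires, ensures):
    """Wrap a block's processing function with runtime contract checks."""
    def decorator(process):
        def wrapper(sample):
            if not requires(sample):
                raise ContractViolation(f"{process.__name__}: precondition failed for {sample!r}")
            result = process(sample)
            if not ensures(sample, result):
                raise ContractViolation(f"{process.__name__}: postcondition failed for {result!r}")
            return result
        return wrapper
    return decorator


# Hypothetical pipeline blocks: a scaling stage followed by a limiter.
@contracted(requires=lambda x: x is not None, ensures=lambda x, y: y == x * 2)
def scale(sample):
    return sample * 2


@contracted(requires=lambda x: isinstance(x, (int, float)), ensures=lambda x, y: 0 <= y <= 100)
def limit(sample):
    return max(0, min(100, sample))


if __name__ == "__main__":
    for raw in (10, 30, None):            # None violates the first precondition
        try:
            print(limit(scale(raw)))
        except ContractViolation as err:
            print("error localized:", err)
```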
ISBN (digital): 9781728132143
ISBN (print): 9781728132150
The article describes a procedure for analyzing the information resources of a system for dynamic integration of weakly structured data in the web environment, in order to determine the common features of the information resources and to identify links between them. A model of shared resource definitions and a language describing the rules of access to resources are presented, and the process of creating an object adapter that determines the common features of the information resources and identifies relationships between them using the "black box" method is examined. The process of determining the structural-dynamic model of the domain of the Mashup application is described.
Non-deterministically behaving test cases cause developers to lose trust in their regression test suites and to eventually ignore failures. Detecting flaky tests is therefore a crucial task in maintaining code quality, as it builds the necessary foundation for any form of systematic response to flakiness, such as test quarantining or automated debugging. Previous research has proposed various methods to detect flakiness, but when trying to deploy these in an industrial context, their reliance on instrumentation, test reruns, or language-specific artifacts was inhibitive. In this paper, we therefore investigate the prediction of flaky tests without such requirements on the underlying programming language, CI, build or test execution framework. Instead, we rely only on the most commonly available artifacts, namely the tests’ outcomes and durations, as well as basic information about the code evolution to build predictive models capable of detecting flakiness. Furthermore, our approach does not require additional reruns, since it gathers this data from existing test executions. We trained several established classifiers on the suggested features and evaluated their performance on a large-scale industrial software system, from which we collected a data set of 100 flaky and 100 non-flaky test- and code-histories. The best model was able to achieve an F1-score of 95.5% using only 3 features: the tests’ flip rates, the number of changes to source files in the last 54 days, as well as the number of changed files in the most recent pull request.
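As a rough illustration of the approach (not the paper's actual pipeline or data), the sketch below trains an off-the-shelf classifier on the three features the abstract highlights: a test's flip rate, the number of recent source-file changes, and the number of files changed in the latest pull request. The synthetic data and the choice of a random forest are assumptions made for the example.

```python
# Illustrative sketch: language-agnostic flakiness prediction from test outcomes,
# durations-independent flip rates, and basic code-evolution counts.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)

# Synthetic stand-in for the 100 flaky / 100 non-flaky histories in the study.
n = 200
flip_rate = np.concatenate([rng.uniform(0.05, 0.6, n // 2), rng.uniform(0.0, 0.05, n // 2)])
src_changes = rng.poisson(8, n)          # changes to source files in a recent window
pr_files = rng.poisson(5, n)             # files changed in the most recent pull request
X = np.column_stack([flip_rate, src_changes, pr_files])
y = np.array([1] * (n // 2) + [0] * (n // 2))   # 1 = flaky, 0 = non-flaky

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("F1 on held-out split:", f1_score(y_te, clf.predict(X_te)))
```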
A multi-language-based data interface system for heterogeneous distributed processing is introduced. A prototyped environment based on this system is discussed, and an evaluation of the prototyped system is presented. It is shown that by keeping the syntax of the specification language flexible and close to existing high-level languages, a user can learn the interface language quickly. Semantically, this data interface views structured data as consisting of two parts: the data values themselves and the representation of the structure among the data values. Through this separation, it is possible to have pipelined data type checking and data conversion operations.
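The following toy sketch (hypothetical names, not the paper's interface language) illustrates the separation the abstract describes: data values travel apart from a structure description, so each record can be type-checked and converted as it arrives, in a pipelined fashion.

```python
# Toy illustration of separating data values from their structure description.
STRUCTURE = [("id", int), ("name", str), ("score", float)]   # structure of the data
VALUES = [["1", "alice", "3.5"],                             # raw values, e.g. from another host
          ["2", "bob", "4.25"]]

def check_and_convert(structure, rows):
    """Generator: validate and convert each record as it arrives (pipelined)."""
    for row in rows:
        if len(row) != len(structure):
            raise ValueError(f"record {row!r} does not match structure")
        yield {name: typ(value) for (name, typ), value in zip(structure, row)}

for record in check_and_convert(STRUCTURE, VALUES):
    print(record)
```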
ISBN (digital): 9781728132143
ISBN (print): 9781728132150
The article describes the design of a system for dynamic integration of weakly structured data using Mash-Up technology. Functional requirements for the system of dynamic data integration based on Mash-Up technology are characterized. An algorithm for constructing an ontological model of all integrated systems and an algorithm for obtaining information resources from an integrated system are proposed. The architecture and principles of the Shares and Discounts Mashup for dynamic integration of weakly structured data are considered.
Hydrographic survey performed by the U.S. Army Corps of Engineers (USACE) on 350 miles of the Mississippi River is discussed. It is observed that the river loses velocity, dropping silt, sand, and sediment at each of its mouths. USACE uses differential global positioning system (DGPS), computer-aided design (CAD), geospatial, wireless, and internet technologies to preprocess survey data in the field. The survey data is helpful in dredging operations on the lower Mississippi to ensure navigable waterways.
ISBN (print): 0818620811
Robot applications, represented as plans, are used to outline the viewpoint that robustness needs to be emphasized in two areas: in the plan representation and in the underlying system software. Robot applications are inherently distributed, since the hardware usually comprises a set of independent actuators and sensors, with the robot programs acting as links between them. A special model of distributed computation, the RS (Robot Schemas) model, has been designed to handle the issues of robot plan representation, and an overview of the model is presented. An initial implementation of the model with minimal execution support demonstrated that the domain-dependent aspect of robustness on its own was not sufficient for robust behavior. Consequently, the OS has been augmented with real-time scheduling and monitoring facilities.
An adaptive algorithm for LD converter steel refining control is presented. Instead of the traditionally used balance approach, real-time estimation and control of the process parameters is applied. A Recursive Least Squares method is used and a minimum variance control strategy is realized. The original idea is that the converter steelmaking process is treated as a continuous process that is sampled, with one heat time considered as one sample time. The process model parameters are thus adjusted after each heat, and the algorithm adapts very quickly to changes in the technological conditions. A distributed two-level PS/XT computer control system is realized. Successful industrial results have been obtained.
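As a schematic illustration only (the model structure, variables, and numbers are hypothetical, not the paper's), the sketch below treats each heat as one sample of a simple linear model, updates its parameters by recursive least squares after every heat, and chooses the next input so the predicted output reaches the target, a rough stand-in for the minimum variance strategy.

```python
# Schematic sketch: recursive least squares estimation with one heat per sample.
import numpy as np

theta = np.zeros(2)            # parameter estimates [a, b] of y = a*u + b
P = np.eye(2) * 1000.0         # covariance of the estimates
lam = 0.98                     # forgetting factor, lets estimates track changing conditions

def rls_update(theta, P, phi, y):
    """One recursive least squares step for measurement y with regressor phi."""
    k = P @ phi / (lam + phi @ P @ phi)          # gain vector
    theta = theta + k * (y - phi @ theta)        # correct estimates by prediction error
    P = (P - np.outer(k, phi @ P)) / lam         # update covariance
    return theta, P

true_a, true_b = 0.8, 2.0      # "true" process behavior, unknown to the controller
target = 10.0                  # desired end-of-blow value (illustrative)
u = 5.0                        # initial control input (illustrative)

for heat in range(20):
    y = true_a * u + true_b + np.random.normal(0, 0.1)   # outcome of this heat
    phi = np.array([u, 1.0])
    theta, P = rls_update(theta, P, phi, y)
    a_hat, b_hat = theta
    if abs(a_hat) > 1e-6:
        u = (target - b_hat) / a_hat             # pick next input from the current model
    print(f"heat {heat:2d}: y = {y:6.2f}, estimates a = {a_hat:5.2f}, b = {b_hat:5.2f}")
```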