The automotive industry seeks to include ever more features in its vehicles. To this end, the necessary shift towards multi-core technology is in full swing. Fully exploiting the extra processing power, however, requires considerable additional effort to cope with the vastly increased complexity, largely due to the elaborate parallelization process, which spans a vast search space. Consequently, there is a strong need for innovative methods and appropriate tools for the migration of legacy single-core software. We use the results of a data-dependency analysis performed on AUTOSAR system descriptions to determine advantageous partitions as well as initial task-to-core mappings. The extracted information then serves as input for simulation within a multi-core timing tool suite, where the initial solution is evaluated with respect to proper scheduling and metrics such as cross-core communication rates, communication latencies, and core load distribution. A subsequent optimization process improves the initial solution and enables a comparative assessment. To demonstrate the benefit, we substantially expand a previous case study by applying our approach to two complex engine management systems and by showing the advantages over a parallelization process without a preceding dependency analysis and initial partition/mapping suggestions.
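The mapping step described above can be illustrated with a minimal, hypothetical sketch: a greedy heuristic that places tasks on cores so as to balance load while penalizing cross-core communication. The task runtimes, communication rates, and the `comm_weight` parameter below are illustrative assumptions, not AUTOSAR data or the paper's actual algorithm.

```python
# Hypothetical sketch of a task-to-core mapping heuristic: balance core
# load while penalizing communication with tasks placed on other cores.
# All inputs (runtimes, communication rates, weight) are made up for
# illustration; the paper's tool chain works on AUTOSAR descriptions.

def map_tasks(runtimes, comm, n_cores, comm_weight=1.0):
    """Greedily assign each task to the core minimizing the combined
    cost of current core load plus cross-core communication.

    runtimes: list of task execution times
    comm: dict mapping (i, j) with i < j to a communication rate
    """
    mapping = {}
    load = [0.0] * n_cores
    # Place heavy tasks first so the balancing heuristic has room to work.
    for t in sorted(range(len(runtimes)), key=lambda i: -runtimes[i]):
        best_core, best_cost = None, float("inf")
        for c in range(n_cores):
            # Communication to already-placed tasks on *other* cores
            # would become cross-core traffic if t lands on core c.
            cross = sum(comm.get((min(t, u), max(t, u)), 0.0)
                        for u, cu in mapping.items() if cu != c)
            cost = load[c] + runtimes[t] + comm_weight * cross
            if cost < best_cost:
                best_core, best_cost = c, cost
        mapping[t] = best_core
        load[best_core] += runtimes[t]
    return mapping, load
```

With two heavily communicating tasks, the heuristic keeps them on the same core even at the price of a slightly less even load, which mirrors the trade-off between core load distribution and cross-core communication rate mentioned in the abstract.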
Structural Health Monitoring aims to identify damage in engineering structures by monitoring changes in their vibration response. Unsupervised learning algorithms can be used to obtain a model of the undamaged condition...
A characteristic feature of many practical decision-making tasks is their multicriteria nature, which complicates information processing when finding a solution. Failure to consider many criteria can lead to i...
ISBN: (Print) 3540666451
The proceedings contain 68 papers. The special focus in this conference is on Rough Computing, Rough Set Theory and Its Applications. The topics include: decision rules, Bayes' rule and rough sets; from computation with measurements to computation with perceptions; on text mining techniques for personalization; a road to discovery science; approximate distributed synthesis and granular semantics for computing with words; discovery of rules about complications; rough genetic algorithms; a rough-fuzzy neural computational approach; toward spatial reasoning in the framework of rough mereology; an algorithm for finding equivalence relations from tables with non-deterministic information; on the extension of rough sets under incomplete information; an alternative formulation; formal rough concept analysis; noise reduction in telecommunication channels using rough sets and neural networks; rough set analysis of an electrostimulation test database for the prediction of post-operative profits in cochlear implanted patients; a rough set-based approach to text classification; modular rough fuzzy MLP; correspondence and complexity results; handling missing values in rough set analysis of multi-attribute and multi-criteria decision problems; the generic rough set inductive logic programming model and motifs in strings; rough problem settings for inductive logic programming; using rough sets with heuristics for feature selection; the discretization of continuous attributes based on compatibility rough set and genetic algorithm; level cut conditioning approach to the necessity measure specification; four c-regression methods and classification functions; and context-free fuzzy sets in a data mining context.
Biomedical signals are often corrupted by noise during acquisition or transmission, lowering the signal-to-noise ratio (SNR) and hampering subsequent biomedical signal processing. Suppressing noise and improving the SNR effectively is therefore an essential procedure and a key issue in biomedical signal processing research. In this paper, we propose a novel multi-model fast denoising method based on wavelet-transform threshold denoising. The proposed scheme not only mitigates the pseudo-Gibbs phenomenon to filter the signal effectively but also preserves the signal details to retain the diagnostic information. Meanwhile, a summed data processing method is introduced to achieve fast denoising. Simulation experiments on electrocardiogram (ECG) signals indicate that the proposed method can effectively and quickly separate signal from noise.
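The underlying wavelet threshold idea can be sketched in a few lines. The example below is a minimal, assumed illustration using a single-level Haar transform with soft thresholding; the paper's multi-model fast scheme is considerably more elaborate, and the `haar_denoise` function and its threshold value are hypothetical.

```python
import numpy as np

# Minimal sketch of wavelet soft-threshold denoising using one level of
# the Haar transform. This illustrates the general technique only; it is
# not the multi-model fast method described in the abstract.

def haar_denoise(signal, threshold):
    """Decompose with the Haar wavelet, soft-threshold the detail
    coefficients (where most noise lives), then reconstruct."""
    x = np.asarray(signal, dtype=float)
    assert x.size % 2 == 0, "single-level Haar needs an even-length signal"
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass coefficients
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)   # high-pass coefficients
    # Soft thresholding: shrink coefficients toward zero, zero out small ones.
    detail = np.sign(detail) * np.maximum(np.abs(detail) - threshold, 0.0)
    # Inverse Haar transform.
    even = (approx + detail) / np.sqrt(2)
    odd = (approx - detail) / np.sqrt(2)
    out = np.empty_like(x)
    out[0::2], out[1::2] = even, odd
    return out
```

With a threshold of zero the reconstruction is exact, while a large threshold removes the high-frequency detail entirely; in practice the threshold is chosen from an estimate of the noise level.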
To help produce accurate and consistent maritime hazard products, the National Tsunami Hazard Mitigation Program organized a benchmarking workshop to evaluate the numerical modeling of tsunami currents. Thirteen teams of international researchers, using a set of tsunami models currently utilized for hazard mitigation studies, presented results for a series of benchmarking problems; these results are summarized in this paper. Comparisons focus on physical situations where the currents are shear and separation driven, and are thus de-coupled from the incident tsunami waveform. In general, we find that models of increasing physical complexity provide better accuracy, and that low-order three-dimensional models are superior to high-order two-dimensional models. Inside separation zones and in areas strongly affected by eddies, the magnitude of both model-data errors and inter-model differences can be of the same order as the magnitude of the mean flow. We therefore argue for an ensemble modeling approach in areas affected by large-scale turbulent eddies, where deterministic simulation may be misleading. As a result of the analyses presented herein, we expect that tsunami modelers now have a better awareness of their ability to accurately capture the physics of tsunami currents, and therefore a better understanding of how to use these simulation tools for hazard assessment and mitigation efforts. (C) 2017 Elsevier Ltd. All rights reserved.
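The ensemble argument above can be made concrete with a small, assumed sketch: given current-speed fields from several model runs, compute the ensemble mean and spread, and flag cells where the inter-model spread rivals the mean flow, the regime in which a single deterministic run may mislead. The `ensemble_stats` helper and the synthetic fields are illustrative, not the workshop's actual data or methodology.

```python
import numpy as np

# Illustrative sketch of the ensemble idea from the abstract: where the
# spread across model runs is comparable to the mean current speed, a
# deterministic forecast from any one model is unreliable.
# The input fields here are synthetic stand-ins, not tsunami model output.

def ensemble_stats(runs):
    """runs: array-like of shape (n_models, ny, nx) current-speed fields.

    Returns the ensemble mean, the inter-model spread (standard
    deviation), and a boolean mask of cells where spread >= mean.
    """
    stack = np.asarray(runs, dtype=float)
    mean = stack.mean(axis=0)
    spread = stack.std(axis=0)
    unreliable = spread >= mean
    return mean, spread, unreliable
```

In flagged cells a hazard product would report the ensemble statistics (e.g. mean plus spread) rather than a single model's value.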