Indoor positioning systems (IPS) have recently emerged as a crucial technology in the Internet of Things (IoT), with widespread applications in smart cities and homes. Radio frequency-based fingerprinting, enabling lo...
This research investigates the application of Blockchain technology to enhance the security, transparency, and efficiency of biomedical data management. By analyzing existing data management practices in the biomedica...
ISBN: (Print) 9798350369458; 9798350369441
In this paper, we introduce a complementary and straightforward mechanism for anomaly detection tailored for smart city infrastructures, utilizing a combination of regression algorithms. Our methodology employs two distinct regression models to generate future predictions from a given dataset. The primary model is crafted to yield high-fidelity predictions, while the secondary model is purposefully designed to introduce a degree of noise. Both models work together as a defense against flooding attacks through the detection of abnormal levels of data inflow (detection of outliers). We calculate the alignment cost, or Euclidean distance, between the predictions from these two models, establishing a threshold against which real future traffic can be evaluated. The alignment cost, or Euclidean distance, of the actual traffic is computed in relation to the high-quality predictions and then compared with the established threshold to pinpoint anomalies. Through experimentation with various regression algorithms, including linear regression, support vector regression, and decision trees, we identified an optimal combination for peak performance. Our assessments, grounded in comprehensive smart city datasets, center on the process of transforming complex non-linear data into a more appropriate form to detect anomalous data points. In conclusion, the dual-model anomaly detection framework we propose stands out as an invaluable tool in defending smart city infrastructures from data irregularities and potential threats, highlighting the criticality of bespoke solutions in contemporary urban digital environments.
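The dual-model scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes synthetic traffic data, uses plain least-squares fits in place of the paper's model combination, and all variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic traffic-volume series standing in for smart-city sensor data.
t = np.arange(100, dtype=float)
traffic = 50 + 0.3 * t + rng.normal(0, 2, size=t.size)

# Primary model: ordinary least-squares fit (high-fidelity predictions).
coef = np.polyfit(t, traffic, deg=1)
pred_hi = np.polyval(coef, t)

# Secondary model: the same predictions, deliberately perturbed with noise.
pred_noisy = pred_hi + rng.normal(0, 2, size=t.size)

# Threshold: Euclidean distance (alignment cost) between the two streams.
threshold = np.linalg.norm(pred_hi - pred_noisy)

# Incoming traffic window: a flooding attack inflates the tail sharply.
actual = np.polyval(coef, t) + rng.normal(0, 2, size=t.size)
actual[80:] += 40  # abnormal data inflow

# Flag an anomaly when actual traffic deviates from the high-fidelity
# predictions by more than the learned threshold.
dist = np.linalg.norm(actual - pred_hi)
is_anomaly = bool(dist > threshold)
```

The noisy secondary model effectively calibrates how much deviation is "normal"; traffic that strays farther from the high-fidelity predictions than the noise does is treated as an outlier.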
ISBN: (Print) 9798350339864
The increasing volume and velocity of science data necessitate the frequent movement of enormous data volumes as part of routine research activities. As a result, limited wide-area bandwidth often leads to bottlenecks in research progress. However, in many cases, consuming applications (e.g., for analysis, visualization, and machine learning) can achieve acceptable performance on reduced-precision data, and thus researchers may wish to compromise on data precision to reduce transfer and storage costs. Error-bounded lossy compression presents a promising approach as it can significantly reduce data volumes while preserving data integrity based on user-specified error bounds. In this paper, we propose a novel data transfer framework called Ocelot that integrates error-bounded lossy compression into the Globus data transfer infrastructure. We note four key contributions: (1) Ocelot is the first integration of lossy compression in Globus to significantly improve scientific data transfer performance over wide area networks (WAN). (2) We propose an effective machine-learning-based lossy compression quality estimation model that can predict the quality of error-bounded lossy compressors, which is fundamental to ensure that transferred data are acceptable to users. (3) We develop optimized strategies to reduce the compression time overhead, counter the compute-node waiting time, and improve transfer speed for compressed files. (4) We perform evaluations using many real-world scientific applications across different domains and distributed Globus endpoints. Our experiments show that Ocelot can improve dataset transfer performance substantially, and the quality of lossy compression (time, ratio, and data distortion) can be predicted accurately for the purpose of quality assurance.
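The core trade-off behind compressing before a WAN transfer can be sketched with a simple cost model. This is an illustrative sketch only, not Ocelot's actual logic or API; the function names, throughput figures, and the predicted compression ratio are all hypothetical assumptions.

```python
def transfer_time_s(size_bytes: float, bandwidth_bps: float) -> float:
    """Wall-clock seconds to move size_bytes over a link of the given bit rate."""
    return size_bytes * 8 / bandwidth_bps


def should_compress(size_bytes: float, wan_bps: float, est_ratio: float,
                    comp_bytes_per_s: float, decomp_bytes_per_s: float) -> bool:
    """Compress only if compress + send + decompress beats sending raw data."""
    raw = transfer_time_s(size_bytes, wan_bps)
    compressed = (
        size_bytes / comp_bytes_per_s                         # compress at source
        + transfer_time_s(size_bytes / est_ratio, wan_bps)    # send reduced data
        + (size_bytes / est_ratio) / decomp_bytes_per_s       # decompress at sink
    )
    return compressed < raw


# 100 GiB dataset, 1 Gbps WAN, predicted 10x ratio, ~300/500 MB/s (de)compression.
GIB = 1024 ** 3
decision = should_compress(100 * GIB, 1e9, 10.0, 300e6, 500e6)
```

A quality-estimation model like the one the abstract describes would supply `est_ratio` (and a distortion prediction) per compressor and error bound, letting the framework pick compression only when it actually shortens end-to-end transfer time.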
With the development of 3D laser scanning technology, point cloud data has been widely used in various fields. As the most critical link after the point cloud data collection, the denoising process of point cloud data...
The use of big data systems has become prevalent across sensitive domains, including health, defense, and finance, among others. These big data systems are often complex, and with complexity often comes vulnerabilities...
mdx II is an Infrastructure-as-a-Service (IaaS) cloud platform designed to accelerate data science research and foster cross-disciplinary collaborations among universities and research institutions in Japan. Unlike tr...
Image-guided point cloud completion task aims to utilize image information to address the uncertainties in point cloud completion inference. Although acquiring 2D image data is relatively simpler than 3D data, it is s...
The paper proposes an automated data exploration and analysis method based on the Attribute Frequency Statistical Feature Ratio (AFSFR). It integrates AutoVis and data preprocessing methods to design and develop AutoEDA-S...
The development of intelligent networked vehicle technology and the testing of related algorithms require a large number of datasets as the foundation. The existing datasets are mainly collected from foreign traffic s...