This paper investigates the impact of data valuation metrics (variability and coefficient of variation) on feature importance in classification models. Data valuation is an emerging topic in the fields of data sci...
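Since the abstract does not spell out how the two metrics are computed, here is a minimal sketch assuming the standard definitions: variability as the per-feature sample standard deviation and the coefficient of variation as CV = σ/|μ|. The function name and toy data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def coefficient_of_variation(X):
    """Per-feature coefficient of variation, CV = sigma / |mu|.

    X is an (n_samples, n_features) array; features with a near-zero
    mean get CV = inf, since the ratio is undefined there.
    """
    mu = X.mean(axis=0)
    sigma = X.std(axis=0, ddof=1)           # sample standard deviation
    cv = np.full_like(mu, np.inf, dtype=float)
    nz = np.abs(mu) > 1e-12
    cv[nz] = sigma[nz] / np.abs(mu[nz])
    return cv

# Hypothetical usage: a low-CV feature is stable across samples,
# a high-CV feature is highly variable.
rng = np.random.default_rng(0)
X = rng.normal(loc=[1.0, 5.0, 0.5], scale=[0.1, 2.0, 0.4], size=(200, 3))
print(coefficient_of_variation(X))          # roughly [0.1, 0.4, 0.8]
```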
Decentralized Finance (DeFi) aims to use advancements in both computation and cryptography to tackle economic problems. Therefore, it must operate within the intersection of constraints from both the computer science and economic domains. We explore a foundational question at the junction of those fields: Can we synthesize variable market-clearing risk-free yield for native tokens via smart contracts? We use a stylized model representing a large class of decentralized consensus algorithms to show this is impossible. This undecidability result places bounds on what decentralized financial products can be built and constrains the shape of future developments in DeFi. Among other limitations, our results reveal that markets in DeFi are incomplete.
The data-intensive applications of today's big data era often produce a large memory footprint. As a result, a significant volume of data needs to travel from memory to the CPU under the traditional von Neumann computing paradigm. Near-memory processing (NMP) or processing-in-memory (PIM) is a potential alternative computing framework in which a computation unit is placed near the memory (or inside it) and a portion of an application is executed on it (termed computation offloading), aiming to reduce the amount of data movement and its consequences. Although a few computation offloading strategies have been proposed in recent times, the existing approaches consider neither the data locality offered by the last-level cache (LLC) nor the overall execution time of the application while designing their policies. In this paper, we propose a data locality-aware computation offloading strategy for a hybrid computing system comprising the host processor and NMP-enabled 3D memory. After the application code is instrumented using the LLVM compiler framework, the strategy offloads a portion of an application to NMP if its estimated overall execution time there is lower. An extensive simulation performed on a set of standard simulators for a set of large graph-based application benchmarks demonstrates the effectiveness of the proposed strategy, achieving maximum speedups of 40% and 11.8% compared to the host-only configuration and the state-of-the-art policy, respectively. The proposed strategy also reduces off-chip data transfer and energy consumption by a significant margin compared to the host-only configuration (avg. 27%) and the state-of-the-art policy (avg. 28%). Further, the proposed policy reduces the LLC miss rate by 57% compared to the state-of-the-art policy.
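The paper's cost model is not given here, so the following is a hedged sketch of the core offloading decision it describes: estimate a code region's execution time on the host (crediting LLC locality) and on the NMP unit, and offload only when the NMP estimate is lower. The `RegionProfile` fields and all latency constants are assumptions, not the authors' parameters.

```python
from dataclasses import dataclass

@dataclass
class RegionProfile:
    # Hypothetical per-region counters an LLVM instrumentation pass might emit.
    instructions: int
    mem_accesses: int
    llc_hit_rate: float      # data locality observed at the last-level cache

# Illustrative latency constants (cycles); real values are machine-specific.
CPU_CPI, LLC_HIT, DRAM_MISS = 1.0, 40, 200
NMP_CPI, NMP_MEM = 4.0, 60   # slower near-memory core, but memory is close by

def host_cycles(r: RegionProfile) -> float:
    mem = r.mem_accesses * (r.llc_hit_rate * LLC_HIT +
                            (1 - r.llc_hit_rate) * DRAM_MISS)
    return r.instructions * CPU_CPI + mem

def nmp_cycles(r: RegionProfile) -> float:
    # NMP bypasses the cache hierarchy: every access pays the near-memory cost.
    return r.instructions * NMP_CPI + r.mem_accesses * NMP_MEM

def should_offload(r: RegionProfile) -> bool:
    # Offload only when the estimated overall time on NMP is lower.
    return nmp_cycles(r) < host_cycles(r)

# A cache-unfriendly graph kernel benefits; a cache-friendly one does not.
print(should_offload(RegionProfile(10_000, 8_000, llc_hit_rate=0.1)))  # True
print(should_offload(RegionProfile(10_000, 8_000, llc_hit_rate=0.9)))  # False
```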
For many years, smartphones have been utilised for human activity recognition (HAR), important healthcare recommendations, and telemedicine. Deep learning (DL) and machine learning techniques are commonly employed in studie...
Semi-supervised learning, a system dedicated to making networks less dependent on labeled data, has become a popular paradigm due to its strong performance. A common approach is to use pseudo-labels with unlabeled dat...
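As a concrete illustration of the pseudo-label approach the abstract mentions, here is a minimal sketch of one confidence-thresholded pseudo-labelling round using scikit-learn; the threshold value, model choice, and toy data are arbitrary assumptions, not taken from the paper.

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.linear_model import LogisticRegression

def pseudo_label_round(model, X_lab, y_lab, X_unlab, threshold=0.95):
    """One round of pseudo-labelling: fit on labelled data, then adopt
    high-confidence predictions on unlabelled data as extra labels."""
    model.fit(X_lab, y_lab)
    proba = model.predict_proba(X_unlab)
    keep = proba.max(axis=1) >= threshold       # trust only confident predictions
    y_pseudo = model.classes_[proba[keep].argmax(axis=1)]
    X_new = np.vstack([X_lab, X_unlab[keep]])
    y_new = np.concatenate([y_lab, y_pseudo])
    return model.fit(X_new, y_new), X_unlab[~keep]  # retrained model, leftovers

# Toy usage: 20 labelled points, 280 treated as unlabelled.
X, y = make_blobs(n_samples=300, centers=2, random_state=0)
model, remaining = pseudo_label_round(LogisticRegression(), X[:20], y[:20], X[20:])
print(f"{280 - len(remaining)} unlabelled samples were pseudo-labelled")
```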
In this paper, experimental studies on increasing the load processing efficiency in distributed data centers of communication operators are presented. Communication operators widely use up-to-date virtualization technologies such as NFV, SDN, Network Slicing, Edge computing, and bDDN, which require a significant amount of computation and accordingly consume significant amounts of energy during load processing in data centers. Existing approaches do not provide simultaneous improvements in energy efficiency and performance while meeting SLA requirements. To ensure energy-efficient data processing and improve processing performance while maintaining SLA quality requirements, an approach is proposed whose essence is to take the daily load into account and allocate data processing resources so as to minimize energy consumption without losing processing performance. Compared with Backfill load planning, the proposed approach improved energy efficiency indicators by up to 9.953%, and by up to 26.382% compared with Round Robin.
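The paper's allocator is not specified in detail here, so below is a hedged sketch of the idea it describes: use the daily load profile to activate only as many servers as an SLA utilization cap requires each hour, so lightly loaded hours consume less energy than peak-provisioned operation. The load profile, capacities, and linear power model are all illustrative assumptions.

```python
import math

def servers_needed(load_rps, capacity_rps=1000, sla_util=0.7):
    """Fewest servers that keep per-server utilization under the SLA cap."""
    return max(1, math.ceil(load_rps / (capacity_rps * sla_util)))

# Hypothetical daily load profile (requests/s for each hour of the day).
daily_load = [200, 150, 120, 100, 150, 400, 900, 1500, 2200, 2600, 2800, 2700,
              2500, 2600, 2400, 2300, 2100, 2200, 2000, 1700, 1200, 800, 500, 300]

IDLE_W, PEAK_W = 100.0, 300.0  # illustrative per-server power draw

def hourly_energy_wh(load, n_servers, capacity_rps=1000):
    util = load / (n_servers * capacity_rps)
    # Linear power model: idle power plus a utilization-proportional part.
    return n_servers * (IDLE_W + (PEAK_W - IDLE_W) * util)

static_n = servers_needed(max(daily_load))   # provision for the peak all day
static_wh = sum(hourly_energy_wh(l, static_n) for l in daily_load)
adaptive_wh = sum(hourly_energy_wh(l, servers_needed(l)) for l in daily_load)
print(f"static: {static_wh:.0f} Wh, daily-load-aware: {adaptive_wh:.0f} Wh")
```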
In this paper, we present an experimental validation of a photovoltaic/electrolysis system dedicated to supplying an alternating load and producing hydrogen. The system uses a new way to produce hydrogen by adapting the...
Single Ended Line Testing (SELT) can provide diagnostic data on telecommunications access networks such as VDSL without an electrical cable connection to the local exchange building. This information includes Uncalibrated Echo Response data, which may be used to perform baselining and fingerprinting on a tested line. We present novel algorithms to detect changes in the electrical state of a line, terminations within a DSL cabinet, modem power state changes, and in-premises loop length changes. Together these produce a highly specific model of long-term physical line conditions, provide business benefit through improved knowledge for engineer dispatch, and offer a frequency-domain alternative to existing time-domain methods without requiring additional equipment.
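As a rough illustration of baselining on echo response data, the sketch below compares a newly measured frequency-domain echo magnitude against a stored baseline and flags a change when the deviation grows; the thresholds, synthetic responses, and change labels are assumptions, not the paper's algorithms.

```python
import numpy as np

def echo_change_score(baseline_db, current_db):
    """RMS deviation (dB) between a stored baseline echo response and a
    newly measured one, both sampled on the same frequency grid."""
    return float(np.sqrt(np.mean((current_db - baseline_db) ** 2)))

def classify_change(score, flat_thresh=1.0, event_thresh=2.5):
    # Illustrative thresholds: small drift is measurement noise; a large
    # shift suggests a termination or wiring change on the line.
    if score < flat_thresh:
        return "no change"
    return "line event" if score >= event_thresh else "drift"

freqs = np.linspace(0.1e6, 17.6e6, 512)           # VDSL-like band, Hz
baseline = -20 - 10 * np.log10(freqs / 1e6)       # synthetic echo magnitude, dB
new_reflection = baseline + 4 * np.sin(freqs / 8e5)  # ripple from a new echo
print(classify_change(echo_change_score(baseline, baseline + 0.2)))   # no change
print(classify_change(echo_change_score(baseline, new_reflection)))   # line event
```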
Satellite imagery analysis is crucial for various applications such as environmental monitoring, land-use management, and disaster management. However, traditional image processing methods often struggle to accurately classify different land use types due to variations in texture, shape, and pattern. This research work proposes a deep learning-based approach for land use classification in satellite imagery data. In this work, a Convolutional Neural Network (CNN) model has been trained on a large dataset of satellite images and corresponding land use labels, achieving high performance in land use classification as measured by metrics such as the Jaccard index and accuracy, with training driven by Dice and focal loss functions. The proposed approach offers several advantages compared to traditional ones. The trained model can recognize subtle differences in land use types that might be challenging to discern using traditional methods. The model is highly efficient, capable of processing large volumes of data in a relatively short time. It can be easily adapted to different satellite imagery datasets and land use classification tasks, making it a flexible and versatile approach. With the proposed methodology, analysing images across six classes yields a land cover classification accuracy of 81.47%.
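For readers unfamiliar with the quantities named above, here is a minimal NumPy sketch of the Dice loss, focal loss, and Jaccard index for a six-class land cover task; it shows only the standard definitions of these functions and is not the authors' CNN pipeline.

```python
import numpy as np

def dice_loss(pred, target, eps=1e-6):
    """Soft Dice loss; pred holds class probabilities, target is one-hot."""
    inter = (pred * target).sum()
    return 1 - (2 * inter + eps) / (pred.sum() + target.sum() + eps)

def focal_loss(pred, target, gamma=2.0, eps=1e-6):
    """Focal loss: down-weights easy pixels so hard/rare classes dominate."""
    p_t = (pred * target).sum(axis=-1).clip(eps, 1 - eps)  # prob of true class
    return float(np.mean(-((1 - p_t) ** gamma) * np.log(p_t)))

def jaccard_index(pred_labels, true_labels, n_classes=6):
    """Mean intersection-over-union across classes (the Jaccard index)."""
    ious = []
    for c in range(n_classes):
        p, t = pred_labels == c, true_labels == c
        union = np.logical_or(p, t).sum()
        if union:
            ious.append(np.logical_and(p, t).sum() / union)
    return float(np.mean(ious))

# Toy 8x8 "image" labelled with 6 land-cover classes.
rng = np.random.default_rng(1)
truth = rng.integers(0, 6, size=(8, 8))
pred = truth.copy()
pred[0, :4] = (pred[0, :4] + 1) % 6                   # inject a few wrong pixels
print(jaccard_index(pred, truth))                     # < 1.0 due to the errors
print(dice_loss(np.eye(6)[truth], np.eye(6)[truth]))  # ~0 for perfect probs
```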
This paper innovatively proposes a database access information security management method under the big data platform. The paper first designs and implements a network information security management system based on the B/S architecture. The system consists of a security management platform center, a Web console, an agent center, and a processing center, and covers five sub-modules: security monitoring management, security policy management, event analysis and processing, security incident response, and system management. Secondly, the paper constructs an attribute-based multi-authority encryption system, which reduces the number of matching operations and improves the efficiency of cryptographic operations. Then, the paper builds a hierarchical user-role key tree in the big data platform's user role authentication center and describes the relationship between platform user roles and the encryption keys of extremely sensitive information. Finally, simulation experiments are carried out. The results show that the encryption system enhances the security of web page access information and achieves secure management of platform access information under the big data platform.
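The paper's key tree construction is not detailed here, so the following is a hedged sketch of one common way to realize a hierarchical role key tree: derive each child role's key from its parent's via a one-way hash, so higher roles can derive (and thus decrypt for) lower roles but not the reverse. The role names and root secret are hypothetical, and a production system would use a proper KDF such as HKDF.

```python
import hashlib

def derive_child_key(parent_key: bytes, child_role: str) -> bytes:
    """One-way derivation: a parent can compute its children's keys, but a
    child cannot recover its parent's key from its own."""
    return hashlib.sha256(parent_key + child_role.encode()).digest()

def role_key(root_key: bytes, role_path):
    """Walk the role tree, e.g. ('admin', 'dept_head', 'analyst')."""
    key = root_key
    for role in role_path:
        key = derive_child_key(key, role)
    return key

root = hashlib.sha256(b"platform-root-secret").digest()  # hypothetical root key
analyst_key = role_key(root, ("admin", "dept_head", "analyst"))
# The analyst-level key encrypts analyst-level data; recovering dept_head's
# key from analyst_key would require inverting SHA-256.
print(analyst_key.hex())
```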