Large-scale DL on HPC systems like Frontier and Summit uses distributed node-local caching to address scalability and performance challenges. However, as these systems grow more complex, the risk of node failures incr...
详细信息
Identifying influence-maximizing (IM) nodes is an effective strategy for designing efficient communication networks or for containing the spread of rumors and epidemics with a limited budget. With the growi...
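As generic background on how IM seed selection is commonly framed (not the approach of this particular paper), the sketch below implements the classic greedy heuristic under an independent-cascade model with Monte Carlo spread estimation; the toy graph, activation probability, and trial count are all hypothetical illustration values.

```python
# Generic greedy influence-maximization sketch under the independent-cascade
# model (illustrative only; not the method described in the listed paper).
import random

def simulate_spread(graph, seeds, p=0.1, trials=200):
    """Average number of nodes activated when `seeds` start a cascade."""
    total = 0
    for _ in range(trials):
        active, frontier = set(seeds), list(seeds)
        while frontier:
            nxt = []
            for u in frontier:
                for v in graph.get(u, []):
                    if v not in active and random.random() < p:
                        active.add(v)
                        nxt.append(v)
            frontier = nxt
        total += len(active)
    return total / trials

def greedy_im(graph, budget):
    """Pick `budget` seeds, each time adding the node with the largest marginal gain."""
    seeds = []
    for _ in range(budget):
        best = max((n for n in graph if n not in seeds),
                   key=lambda n: simulate_spread(graph, seeds + [n]))
        seeds.append(best)
    return seeds

# Toy directed graph as adjacency lists (hypothetical example data).
g = {0: [1, 2], 1: [3], 2: [3, 4], 3: [5], 4: [5], 5: []}
print(greedy_im(g, budget=2))
```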
In recent years, with advances in distributed photovoltaic technology and falling costs, the installed capacity of distributed photovoltaics has grown rapidly. Due to the high average photovoltaic penetration...
ISBN (digital): 9781665471770
ISBN (print): 9781665471770
Split learning (SL) is a distributed deep-learning approach that enables individual data owners to train a shared model over their joint data without exchanging it with one another. SL has been the subject of much research in recent years, leading to the development of several versions for facilitating distributed learning. However, the majority of this work mainly focuses on optimizing the training process while largely ignoring the design and implementation of practical tool support. To fill this gap, we present our automated software framework for training deep neural networks from decentralized data based on our extended version of SL, termed Blind Learning. Specifically, we shed light on the underlying optimization algorithm, explain the design and implementation details of our framework, and present our preliminary evaluation results. We demonstrate that Blind Learning is 65% more computationally efficient than SL and can produce better-performing models. Moreover, we show that running the same job in our framework is at least 4.5x faster than PySyft. Our goal is to spur the development of proper tool support for distributed deep learning.
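For readers unfamiliar with the split-learning setup this abstract builds on, the following is a minimal single-process sketch of vanilla SL (not the paper's Blind Learning protocol or framework): the network is cut into a client part and a server part, and only the cut-layer activations ("smashed data") and their gradients cross the boundary. The module names, layer sizes, and toy data are all hypothetical.

```python
# Minimal single-process sketch of vanilla split learning (illustrative only).
import torch
import torch.nn as nn

torch.manual_seed(0)

# Hypothetical toy model split at a "cut layer".
client_net = nn.Sequential(nn.Linear(20, 32), nn.ReLU())                     # held by a data owner
server_net = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 2))   # held by the server

client_opt = torch.optim.SGD(client_net.parameters(), lr=0.1)
server_opt = torch.optim.SGD(server_net.parameters(), lr=0.1)
loss_fn = nn.CrossEntropyLoss()

# Fake private data that never leaves the client.
x = torch.randn(8, 20)
y = torch.randint(0, 2, (8,))

for step in range(3):
    client_opt.zero_grad()
    server_opt.zero_grad()

    # Client forward pass up to the cut layer; the "smashed data" is sent onward.
    smashed = client_net(x)
    sent = smashed.detach().requires_grad_(True)   # what the server receives

    # Server completes the forward pass and computes the loss.
    logits = server_net(sent)
    loss = loss_fn(logits, y)

    # Server backward pass; the gradient at the cut layer is returned to the client.
    loss.backward()
    smashed.backward(sent.grad)

    client_opt.step()
    server_opt.step()
    print(f"step {step}: loss={loss.item():.4f}")
```

In an actual deployment the two halves run in separate processes and the tensors crossing the cut are serialized over the network; the sketch collapses that into one process to keep the data flow visible.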
This study presents Weighted Sampled Split Learning (WSSL), an innovative framework tailored to bolster privacy, robustness, and fairness in distributed machine learning systems. Unlike traditional approaches, WSSL di...
Addressing the complexities of querying unstructured graphs such as knowledge graphs and social networks, this paper introduces DKWS, a novel distributed keyword search system. Leveraging a monotonic property, we ens...
Cloud-based smart agriculture systems struggle with real-time processing and connectivity in remote areas. This study integrates edge computing with WSNs to create a real-time crop monitoring and auto-irrigation syste...
This work proposes a real-time sentiment analysis pipeline on customer feedback using Yelp, addressing the problem of processing high-volume, dynamic user-generated content. The proposal integrates state-of-the-art mac...
Accurate and timely hyperlocal weather predictions are essential for various applications, ranging from agriculture to disaster management. In this paper, we propose a novel approach that combines hyperlocal weather p...
ISBN (digital): 9781665488792
ISBN (print): 9781665488792
The InterPlanetary File System (IPFS) is a popular decentralized peer-to-peer network for exchanging data. While there are many use cases for IPFS, the success of these use cases depends on the network. In this paper, we provide a passive measurement study of the IPFS network, investigating peer dynamics and curiosities of the network. With the help of our measurement, we estimate the network size and confirm the results of previous active measurement studies.
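The paper's actual crawler and network-size estimator are not described here. Purely as an illustration of passively observing peer dynamics, the sketch below (assuming a locally running kubo/go-ipfs daemon with the `ipfs` CLI on PATH) periodically samples the daemon's connected peers and counts distinct peer IDs, which gives only a crude lower bound on the observable network size.

```python
# Loose illustrative sketch (not the authors' methodology): repeatedly sample
# the peers a local IPFS daemon is connected to via `ipfs swarm peers` and
# count distinct peer IDs seen over time.
import subprocess
import time

def sample_connected_peers():
    """Return the set of peer IDs the local daemon is currently connected to."""
    out = subprocess.run(["ipfs", "swarm", "peers"],
                         capture_output=True, text=True, check=True).stdout
    peers = set()
    for line in out.splitlines():
        # Each line is a multiaddr such as /ip4/1.2.3.4/tcp/4001/p2p/<peerID>.
        if line.strip():
            peers.add(line.rstrip("/").rsplit("/", 1)[-1])
    return peers

if __name__ == "__main__":
    seen = set()
    for _ in range(10):                     # ten samples, one minute apart
        seen |= sample_connected_peers()
        print(f"distinct peer IDs observed so far: {len(seen)}")
        time.sleep(60)
```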