Search over encrypted data is a hot topic. In this paper, we propose a secure scheme for searching encrypted servers. Such a scheme enables the authorised user to search multiple servers with multi-keyword queries a...
The growing demand for and dependence upon cloud services have been accompanied by an increasing level of threat to user data and security. Some of these critical web and cloud platforms have become constant targets for persistent malicious attacks that attempt to breach security protocols and access user data and information in an unauthorized manner. While some of these security compromises may result from insider data and access leaks, a substantial proportion remains attributable to security flaws in the core web technologies with which such critical infrastructure and services are developed. This paper explores the direct impact and significance of security in the Software Development Life Cycle (SDLC) through a case study covering some 70 public-domain web and cloud platforms within Saudi Arabia. Additionally, the major sources of security vulnerabilities within the target platforms, as well as the major factors that drive and influence them, are presented and discussed through experimental evaluation. The paper identifies some of the core sources of security flaws within such critical infrastructure using automated security auditing and manual static code analysis. The work also proposes effective approaches, both automated and manual, through which security can be ensured throughout the SDLC and user data integrity safeguarded within the cloud.
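The abstract does not disclose the audit tooling. Purely as an illustrative sketch of the kind of automated static check such a review could include (the rule list and variable names below are assumptions, not the paper's method), the following Python snippet walks a source tree with the standard ast module and flags two common weaknesses: dynamic eval calls and string literals assigned to credential-like names.

    import ast
    import pathlib

    # Variable names that suggest a hard-coded credential (illustrative list).
    SECRET_NAMES = {"password", "passwd", "secret", "api_key", "token"}

    def audit_file(path: pathlib.Path) -> list[str]:
        """Return human-readable findings for one Python source file."""
        findings = []
        tree = ast.parse(path.read_text(encoding="utf-8"), filename=str(path))
        for node in ast.walk(tree):
            # Flag dynamic evaluation of strings, a frequent injection vector.
            if (isinstance(node, ast.Call)
                    and isinstance(node.func, ast.Name)
                    and node.func.id == "eval"):
                findings.append(f"{path}:{node.lineno}: use of eval()")
            # Flag string constants assigned to credential-like names.
            if isinstance(node, ast.Assign):
                for target in node.targets:
                    if (isinstance(target, ast.Name)
                            and target.id.lower() in SECRET_NAMES
                            and isinstance(node.value, ast.Constant)
                            and isinstance(node.value.value, str)):
                        findings.append(
                            f"{path}:{node.lineno}: possible hard-coded secret '{target.id}'")
        return findings

    if __name__ == "__main__":
        for source in pathlib.Path(".").rglob("*.py"):
            for finding in audit_file(source):
                print(finding)

Such lightweight checks complement, but do not replace, the manual static code analysis the study describes.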
The growing accuracy and robustness of Deep Neural Network (DNN) models are accompanied by growing model capacity (going deeper or wider). However, the high memory requirements of those models make it difficult to execute th...
Background: Improving the accessibility of screening diabetic kidney disease (DKD) and differentiating isolated diabetic nephropathy from non-diabetic kidney disease (NDKD) are two major challenges in the field of diabetes care. We aimed to develop and validate an artificial intelligence (AI) deep learning system to detect DKD and isolated diabetic nephropathy from retinal fundus images. Methods: In this population-based study, we developed a retinal image-based AI deep learning system, DeepDKD, pretrained using 734 084 retinal fundus images. First, for DKD detection, we used 486 312 retinal images from 121 578 participants in the Shanghai Integrated Diabetes Prevention and Care system for development and internal validation, and ten multi-ethnic datasets from China, Singapore, Malaysia, Australia, and the UK (65 406 participants) for external validation. Second, to differentiate isolated diabetic nephropathy from NDKD, we used 1068 retinal images from 267 participants for development and internal validation, and three multi-ethnic datasets from China, Malaysia, and the UK (244 participants) for external validation. Finally, we conducted two proof-of-concept studies: a prospective real-world study with 3 months' follow-up to evaluate the effectiveness of DeepDKD in screening DKD; and a longitudinal analysis of the effectiveness of DeepDKD in differentiating isolated diabetic nephropathy from NDKD on renal function changes with 4·6 years' follow-up. Findings: For detecting DKD, DeepDKD achieved an area under the receiver operating characteristic curve (AUC) of 0·842 (95% CI 0·838–0·846) on the internal validation dataset and AUCs of 0·791–0·826 across external validation datasets. For differentiating isolated diabetic nephropathy from NDKD, DeepDKD achieved an AUC of 0·906 (0·825–0·966) on the internal validation dataset and AUCs of 0·733–0·844 across external validation datasets. In the prospective study, compared with the metadata model, DeepDKD could detect DKD wit
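The findings above are reported as AUCs with 95% confidence intervals. As a hedged sketch, not the authors' pipeline and using purely synthetic labels and scores, the snippet below shows how such a point estimate and a percentile-bootstrap interval are commonly computed with scikit-learn and NumPy.

    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)

    # Synthetic stand-ins for ground-truth DKD labels and model scores.
    y_true = rng.integers(0, 2, size=5000)
    y_score = np.clip(y_true * 0.3 + rng.normal(0.4, 0.25, size=5000), 0, 1)

    auc = roc_auc_score(y_true, y_score)

    # Percentile bootstrap over resampled cases for an approximate 95% CI.
    boot = []
    for _ in range(1000):
        idx = rng.integers(0, len(y_true), size=len(y_true))
        if len(np.unique(y_true[idx])) < 2:  # AUC needs both classes present
            continue
        boot.append(roc_auc_score(y_true[idx], y_score[idx]))
    ci_low, ci_high = np.percentile(boot, [2.5, 97.5])

    print(f"AUC {auc:.3f} (95% CI {ci_low:.3f}-{ci_high:.3f})")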
ISBN: (Print) 9781538641293
Privacy-preserving data aggregation has been extensively studied in the past decades. However, most existing works target specific aggregation functions, such as additive or multiplicative ones. Moreover, they assume there exists a trusted authority that facilitates the distribution of keys and other information. In this paper, we aim to devise a communication-efficient and privacy-preserving protocol that can exactly compute arbitrary data aggregation functions without a trusted authority. In our model, there exist one untrusted aggregator and n participants. We assume that all communication channels are insecure and subject to eavesdropping attacks. Our protocol is designed under the semi-honest model, and it can tolerate k (k ≤ n-2) collusive adversaries. Our protocol achieves (n-k)-source anonymity: for each collected datum apart from those of the colluding participants, the aggregator learns only that it comes from one of the (n-k) non-colluding participants. Compared with recent work [1], which computes arbitrary aggregation functions by collecting all participants' data through a trusted authority, our protocol increases computation time and communication cost by at most a factor of O((log n / log log n)^2). The key to our protocol is a set of algorithms that efficiently assign unique sequence numbers to the participants without a trusted authority.
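The abstract does not spell out the construction. As an illustration of aggregation without a trusted authority, the sketch below uses standard additive secret sharing over a prime field; this is a generic textbook technique, not the paper's protocol, it covers only the additive case rather than arbitrary functions, and all names and parameters are assumptions. Each participant splits its value into n random shares that sum to it, sends one share to every peer, and the aggregator only ever sees per-participant sums of shares, from which it recovers the total but no individual input.

    import secrets

    PRIME = 2**61 - 1  # field modulus, large enough for the illustrative sums below

    def make_shares(value: int, n: int) -> list[int]:
        """Split value into n additive shares modulo PRIME (shares sum to value)."""
        shares = [secrets.randbelow(PRIME) for _ in range(n - 1)]
        shares.append((value - sum(shares)) % PRIME)
        return shares

    def aggregate(inputs: list[int]) -> int:
        n = len(inputs)
        # Row i holds participant i's shares; column j is what participant j receives.
        share_matrix = [make_shares(v, n) for v in inputs]
        # Each participant forwards only the sum of the shares it received.
        masked = [sum(share_matrix[i][j] for i in range(n)) % PRIME for j in range(n)]
        # The untrusted aggregator sums the masked values to recover the total.
        return sum(masked) % PRIME

    if __name__ == "__main__":
        readings = [12, 7, 30, 5]
        assert aggregate(readings) == sum(readings) % PRIME
        print("aggregate =", aggregate(readings))

The paper's contribution, anonymous assignment of unique sequence numbers to support arbitrary aggregation functions, is not attempted here.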
Authors:
Li, He; Jin, Hai
Services Computing Technology and System Lab, Cluster and Grid Computing Lab, School of Computer Science and Technology, Huazhong University of Science and Technology, Wuhan 430074, China
MapReduce is a popular programming model and an associated implementation for processing big data in parallel in distributed environments. Since large-scale MapReduce data centers usually provide services to many use...
Authors:
Huang, Yu
Services Computing Technology and System Lab, Big Data Technology and System Lab, Cluster and Grid Computing Lab, School of Computer Science and Technology, Huazhong University of Science and Technology, Wuhan 430074, China
Time-evolving stream datasets exist ubiquitously in many real-world applications, where their inherent hot keys often evolve over time. Nevertheless, few existing solutions can provide efficient load balance on thes...
Code offloading is a promising way to accelerate mobile applications and save the energy of mobile devices by shifting some computation to the cloud. However, existing code offloading systems suffer from a long communication delay ...
Authors:
Chen, Hanhua; Zhang, Fan; Jin, Hai
Cluster and Grid Computing Lab, Services Computing Technology and System Lab, Big Data Technology and System Lab, School of Computer Science and Technology, Huazhong University of Science and Technology, Wuhan 430074, China
Real-world stream data with a skewed distribution raises unique challenges for distributed stream processing systems. Existing stream workload partitioning schemes usually use a 'one size fits all' design, which ...
Exploitation of covert channels in smartphone operating systems may lead to furtive data transmission between applications with different permissions, which might threaten users' privacy. Restricting the access to...