In the IoT setting, many resource-constrained devices outsource their collected data to the cloud. To ensure the outsourced data has not been lost, these devices need some mechanism to check the integrity of their data. Furthermore, in some settings, ad hoc devices need to act as a group, and any member of this group may need to verify the integrity of the outsourced cloud storage. Aiming at this problem, the concept of group-based proofs of storage (GPoS) was first proposed at ASIACCS'15. In GPoS, a group manager authorizes data owners as group members; these members can then outsource files to the cloud storage server, and later any member can verify the integrity of the outsourced cloud storage. A concrete construction of GPoS was also given. Unfortunately, in this paper we find that their scheme is not secure. A dynamic group-based integrity auditing protocol for outsourced cloud storage was also proposed recently; we show that this scheme is not secure either. Finally, we give an improved scheme and a rough analysis of its security and performance.
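As a rough illustration of the roles in such a scheme (group manager, group members, and the cloud server), the sketch below shows a simplified challenge-response audit in which a manager-distributed key lets any member verify blocks outsourced by another member. The HMAC-based tags and all function names are illustrative assumptions, not the GPoS construction analyzed in the paper.
```python
# Minimal sketch of a group-based storage-audit flow (not the paper's GPoS
# construction): the group manager distributes a key to authorized members,
# members tag file blocks before outsourcing, and any member can run a
# challenge-response audit against the server.
import hmac, hashlib, os, random

GROUP_KEY = os.urandom(32)  # assumed: handed out by the group manager to members

def tag_blocks(blocks):
    """Member side: compute a per-block authenticator before outsourcing."""
    return [hmac.new(GROUP_KEY, str(i).encode() + b, hashlib.sha256).digest()
            for i, b in enumerate(blocks)]

def server_prove(blocks, tags, challenge):
    """Server side: return the challenged blocks and their stored tags."""
    return [(i, blocks[i], tags[i]) for i in challenge]

def member_verify(proof):
    """Any group member: recompute the tags for the challenged blocks."""
    return all(hmac.compare_digest(
                   hmac.new(GROUP_KEY, str(i).encode() + b, hashlib.sha256).digest(), t)
               for i, b, t in proof)

blocks = [os.urandom(64) for _ in range(100)]          # outsourced file blocks
tags = tag_blocks(blocks)
challenge = random.sample(range(len(blocks)), 10)       # member's random challenge
print(member_verify(server_prove(blocks, tags, challenge)))  # True while data is intact
```
A real GPoS construction uses homomorphic authenticators so the server can return a short aggregated proof instead of the challenged blocks themselves; the sketch only conveys the role separation.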
Online examination is used more and more widely, and test paper composition is its core function. An online examination system requires that test papers be generated quickly and flexibly, and that the selection of questions be reasonable and random. A test paper should not only meet the specific needs of users and ensure fairness and impartiality, but also be accurate and efficient. This paper studies an intelligent test paper composition algorithm based on a genetic algorithm. The method automatically organizes the structure of the test paper and composes the examination content, and proposals are put forward for examining students' mastery of knowledge on this basis. The algorithm can automatically compose test papers according to constraints such as difficulty degree, knowledge coverage, and the proportion of question types, making it suitable for online assessment. The practice of an online testing system based on this algorithm shows that the algorithm is scientific and effective. On the premise of guaranteeing test paper quality, it greatly improves the efficiency and pertinence of online testing, which benefits students' learning efficiency and the intellectualization of education.
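A minimal sketch of the general idea, assuming an illustrative question pool, fitness weights, and GA parameters rather than the paper's exact algorithm, could look like this:
```python
# Rough genetic-algorithm sketch for automatic test paper composition under
# the constraints named above (target difficulty and knowledge coverage).
# The pool, fitness weights, and GA parameters are illustrative assumptions.
import random

POOL = [{"id": i, "difficulty": random.uniform(0.2, 0.9),
         "topic": random.randrange(5)} for i in range(200)]
PAPER_SIZE, TARGET_DIFF, TOPICS = 20, 0.6, set(range(5))

def fitness(paper):
    """Reward papers close to the target difficulty that cover many topics."""
    avg_diff = sum(q["difficulty"] for q in paper) / len(paper)
    coverage = len({q["topic"] for q in paper}) / len(TOPICS)
    return -abs(avg_diff - TARGET_DIFF) + coverage

def crossover(a, b):
    """One-point crossover on question lists, avoiding obvious duplicates."""
    cut = random.randrange(1, PAPER_SIZE)
    head = a[:cut]
    seen = {q["id"] for q in head}
    child = head + [q for q in b if q["id"] not in seen]
    while len(child) < PAPER_SIZE:          # pad if the parents overlapped heavily
        child.append(random.choice(POOL))
    return child[:PAPER_SIZE]

def mutate(paper):
    """Occasionally swap one question for a random one from the pool."""
    if random.random() < 0.3:
        paper[random.randrange(PAPER_SIZE)] = random.choice(POOL)
    return paper

population = [random.sample(POOL, PAPER_SIZE) for _ in range(50)]
for _ in range(100):                        # evolve for a fixed number of generations
    population.sort(key=fitness, reverse=True)
    parents = population[:10]
    population = parents + [mutate(crossover(*random.sample(parents, 2)))
                            for _ in range(40)]

best = max(population, key=fitness)
print("avg difficulty:", round(sum(q["difficulty"] for q in best) / PAPER_SIZE, 2),
      "topics covered:", len({q["topic"] for q in best}))
```
In practice the fitness function would also encode the proportion of question types and per-question score weights mentioned above, but the selection/crossover/mutation loop stays the same.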
In an IEEE 802.11 WLAN, as the number of nodes increases, collisions increase accordingly, channel utilization decreases, and the total throughput drops rather than grows, degrading system performance. This paper designs a simple and effective scheme aimed at the root cause of the problem: a dynamic time slot algorithm based on the number of nodes (DSTBNN). In-depth theoretical analysis then proves that the method can effectively improve system throughput and network performance. Finally, several simulation experiments were carried out in various scenarios using the NS2 simulator. Analysis of the experimental results confirms that the algorithm is simple and effective: network performance can be optimized according to the number of nodes, system throughput is improved, and overall network performance improves significantly.
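The sketch below illustrates the underlying intuition of a node-count-aware backoff: scale the contention window with the estimated number of contending stations so the collision probability stays low as the network grows. The linear scaling rule, the constants, and the per-slot model are assumptions for illustration, not the exact DSTBNN algorithm from the paper.
```python
# Toy per-slot model: each node independently picks a backoff in [0, CW) and
# transmits when it hits 0; a busy slot with two or more transmitters is a
# collision. Compare a static minimum window with a node-count-scaled one.
import random

CW_MIN, CW_MAX = 16, 1024

def contention_window(n_nodes, scale=8):
    """Grow the window with the node count, clamped to a DCF-like range (assumed rule)."""
    return max(CW_MIN, min(CW_MAX, scale * n_nodes))

def collision_prob(n_nodes, cw, slots=20000):
    """Fraction of busy slots in which two or more nodes transmit."""
    busy = collided = 0
    for _ in range(slots):
        tx = sum(1 for _ in range(n_nodes) if random.randrange(cw) == 0)
        if tx:
            busy += 1
            if tx > 1:
                collided += 1
    return collided / busy if busy else 0.0

for n in (5, 20, 50):
    cw = contention_window(n)
    print(f"n={n:3d}  fixed CW={CW_MIN}: {collision_prob(n, CW_MIN):.2f}  "
          f"adaptive CW={cw}: {collision_prob(n, cw):.2f}")
```
With the static window the conditional collision probability climbs steeply as nodes are added, while the adaptive window keeps it roughly flat, which is the effect the scheme exploits to protect throughput.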
Cloud features significantly affect the reconstruction of extensive air showers, and their characterization plays an important role in atmospheric monitoring. A multi-directional characterization of the cloud pattern ...
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading big-data-driven scientific exploration. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe. Computing models for the High Luminosity LHC era anticipate storage needs growing by at least orders of magnitude, which will require new approaches to data storage organization and data handling. In our project we address the fundamental problem of designing an architecture that integrates distributed heterogeneous disk resources for LHC experiments and other data-intensive science applications and provides access to data from heterogeneous computing facilities. We have prototyped a federated storage for the Russian T1 and T2 centers located in Moscow, St. Petersburg and Gatchina, as well as a Russian/CERN federation. We have conducted extensive tests of the underlying network infrastructure and storage endpoints with synthetic performance-measurement tools as well as with HENP-specific workloads, including ones running on supercomputing, cloud computing and Grid platforms for the ALICE and ATLAS experiments. We present our current accomplishments in running LHC data analysis remotely and locally to demonstrate our ability to efficiently use federated data storage experiment-wide within national academic facilities for High Energy and Nuclear Physics, as well as for other data-intensive science applications such as bio-informatics.