A capacity-using covert channel mitigation method (CUCCMM) was proposed. This method uses capacity as the metric for measuring a channel's danger, in accordance with the Trusted Computer System Evaluation Criteria (TCSEC), and a multiple-probabilities-based protocol selection policy (MPBPSP) to guide the application of the secure concurrency control protocol. Algorithms for measuring a channel's capacity and computing the policy's parameters are also presented. Experimental results show that CUCCMM enforces the restriction criterion on channel capacity effectively and accurately, and that the MPBPSP significantly reduces the impact of the restriction operation on real-time performance.
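The abstract does not spell out the paper's capacity-measure algorithm, but covert channels are commonly modeled as discrete memoryless channels whose capacity is the maximum mutual information between input and output. As a hedged sketch, assuming a binary symmetric channel model with a given crossover (noise) probability, the per-second capacity compared against a TCSEC-style threshold could look like:

```python
import math

def binary_entropy(p):
    """Shannon entropy (bits) of a Bernoulli(p) source."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(error_prob, symbol_rate):
    """Capacity (bits/s) of a covert channel modeled as a binary
    symmetric channel with crossover probability `error_prob`,
    signaling `symbol_rate` binary symbols per second.
    C = symbol_rate * (1 - H2(error_prob))."""
    return symbol_rate * (1.0 - binary_entropy(error_prob))

# A noiseless binary channel at 10 symbols/s carries 10 bits/s;
# at crossover 0.5 the channel conveys no information at all.
print(bsc_capacity(0.0, 10))  # 10.0
print(bsc_capacity(0.5, 10))  # 0.0
```

The BSC model and the `symbol_rate` parameter are illustrative assumptions, not the paper's own measurement algorithm; a mitigation policy such as MPBPSP would compare a value like this against a capacity restriction criterion before selecting a protocol.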
This paper summarizes the technical activities of a three-year-long IEEE Task Force (TF) on State Estimation (SE) for Integrated Energy Systems (IES). It presents the formal definition and characteristics of IES, alon...
This letter proposes a secure beamforming design for downlink non-orthogonal multiple access (NOMA) systems utilizing fluid antenna systems (FAS). We consider a setup where a base station (BS) with M fluid antennas (F...
In this paper, we consider a mixed boundary value problem with a double phase partial differential operator, an obstacle effect and a multivalued reaction convection term. Under very general assumptions, an existence ...
This paper studies the affine frequency division multiplexing (AFDM)-empowered sparse code multiple access (SCMA) system, referred to as AFDM-SCMA, for supporting massive connectivity in high-mobility environments. Fi...
Data is one of the most valuable assets; however, raw data is often problematic and not suitable for training algorithm models. To cope with this, dirty data can be processed with cleaning systems [1] to obtain standard clean data for data statistics, data mining, and other uses. Instead of manually modifying data, writing SQL, or other cumbersome methods that are currently popular ways to clean data, this article proposes an approach based on the Hadoop big data platform that supports massive data volumes and the cleaning of multiple heterogeneous data sources. Moreover, our system prototype supports custom rules and algorithms and can export results to a specified database, greatly reducing the workload of data cleaning personnel. Based on the system design and theoretical verification presented in this paper, the author implemented a data cleaning tool on the big data platform. A typical data cleaning process shows that data cleaning can be achieved and user operations can be simplified on the basis of the theory proposed in this paper.
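The abstract mentions custom cleaning rules applied over dirty records but does not show their form. A minimal sketch, assuming dict-shaped records and user-supplied rule functions (the record fields, rule names, and `clean` driver below are illustrative, not the prototype's actual API), of the per-record logic a Hadoop mapper might run:

```python
def normalize_phone(record):
    """Example custom rule: keep only digits in the phone field."""
    if record.get("phone"):
        record["phone"] = "".join(ch for ch in record["phone"] if ch.isdigit())
    return record

def drop_if_missing(field):
    """Rule factory: discard records lacking a required field."""
    def rule(record):
        return record if record.get(field) else None
    return rule

def clean(records, rules):
    """Apply each rule in order; a rule returning None drops the record.
    In the Hadoop setting, this logic would run inside each mapper
    so the rule set scales across heterogeneous data sources."""
    for record in records:
        for rule in rules:
            record = rule(record)
            if record is None:
                break
        if record is not None:
            yield record

dirty = [{"name": "Ann", "phone": "555-01 23"}, {"name": "", "phone": "x"}]
cleaned = list(clean(dirty, [drop_if_missing("name"), normalize_phone]))
print(cleaned)  # [{'name': 'Ann', 'phone': '5550123'}]
```

Keeping rules as plain functions over one record at a time is what makes them embarrassingly parallel, which is the property a Hadoop-based cleaner exploits.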
The deterministic generation of robust soliton combs is of great significance for the wide application of optical frequency combs. As a novel form of microcomb, the soliton crystal holds the advantages...