ISBN (digital): 9798350364729
ISBN (print): 9798350364736
Technological advancements in the 21st century have led to the rise of "big data," characterized by datasets so vast and complex that traditional database systems struggle to manage them. As big data continues to evolve, it introduces significant challenges in data processing, storage, and analysis. Quantum-inspired algorithms, which draw on principles from quantum computing, present a promising solution to these challenges. Techniques such as Quantum Annealing, Quantum Circuits and Gates, and Quantum Parallelism leverage quantum principles like superposition and entanglement to improve the efficiency and speed of data processing. These methods offer significant improvements over traditional tools such as Hadoop, MapReduce, RapidMiner, and R, which are used for distributed storage, processing, and analytics of large datasets. By integrating quantum-inspired algorithms with existing data processing techniques, this study aims to address the limitations of classical methods and advance the field of big data analytics.
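Since the abstract names Quantum Annealing among the quantum-inspired techniques, a minimal classical sketch of the idea may help: the snippet below runs simulated (quantum-inspired) annealing over a toy QUBO objective. It is a generic illustration under assumed names and parameters (qubo_energy, anneal, the cooling schedule), not the study's own implementation.

    # Illustrative quantum-inspired (simulated) annealing over a tiny QUBO.
    # Generic sketch only -- not the algorithms evaluated in the paper.
    import math
    import random

    def qubo_energy(x, Q):
        """Energy x^T Q x of a binary vector x under QUBO matrix Q."""
        n = len(x)
        return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

    def anneal(Q, steps=5000, t_start=2.0, t_end=0.01, seed=0):
        rng = random.Random(seed)
        n = len(Q)
        x = [rng.randint(0, 1) for _ in range(n)]
        e = qubo_energy(x, Q)
        best, best_e = x[:], e
        for step in range(steps):
            # Geometric cooling from t_start down to t_end.
            t = t_start * (t_end / t_start) ** (step / max(steps - 1, 1))
            i = rng.randrange(n)
            x[i] ^= 1                                  # propose flipping one bit
            e_new = qubo_energy(x, Q)
            # Accept downhill moves always, uphill moves with Boltzmann probability.
            if e_new <= e or rng.random() < math.exp((e - e_new) / t):
                e = e_new
                if e < best_e:
                    best, best_e = x[:], e
            else:
                x[i] ^= 1                              # reject: undo the flip
        return best, best_e

    # Toy objective x0 + x1 - 2*x0*x1, minimised at [0, 0] or [1, 1].
    Q = [[1.0, -2.0],
         [0.0,  1.0]]
    print(anneal(Q))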
ProVerif over-approximates the attacker’s power to enable verification of processes under replication. Unfortunately, this results in ProVerif finding false attacks. This problem is particularly common in protocols w...
ISBN (digital): 9798350360165
ISBN (print): 9798350360172
Accomplishing time-consuming and complex tasks has become increasingly difficult with the growing number of Internet of Things (IoT) devices and the big data they generate, along with mounting concerns about latency and power consumption. Fog computing, a dispersed form of computing, is a potential solution to these issues. However, because fog devices have limited processing capacity, IoT-Fog applications struggle to meet time constraints while reducing service latency and energy consumption. Based on this, this paper proposes an approach that uses fuzzy logic together with Dynamic Integer Linear Programming to reduce the workload in a clinical decision support system through efficient allocation of fog-layer resources to IoT devices, while accounting for system constraints and resource availability. The proposed task-priority scheme is integrated into the clinical decision support system and reduces latency and energy consumption in the fog nodes. Experiments show that the proposed method outperforms benchmark methods in terms of energy consumption and service delay.
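As a rough picture of the kind of pipeline the abstract describes, the sketch below scores tasks with two hand-written fuzzy membership functions (deadline urgency and CPU demand) and assigns them greedily to fog nodes. It is a simplified stand-in: the paper couples fuzzy logic with Dynamic Integer Linear Programming, whereas this illustration replaces the ILP with a greedy heuristic, and all field names and numbers are hypothetical.

    # Simplified sketch: fuzzy task priority + greedy fog-node allocation.
    # The paper uses fuzzy logic with dynamic integer linear programming;
    # this illustration swaps the ILP for a greedy heuristic.
    from dataclasses import dataclass

    @dataclass
    class Task:
        name: str
        deadline_ms: float   # time budget for the task
        cpu_demand: float    # required capacity units

    @dataclass
    class FogNode:
        name: str
        capacity: float      # spare capacity units
        latency_ms: float    # network latency to the IoT layer

    def urgency(deadline_ms, lo=50.0, hi=500.0):
        """Fuzzy membership in 'urgent': 1 for tight deadlines, 0 for loose ones."""
        if deadline_ms <= lo:
            return 1.0
        if deadline_ms >= hi:
            return 0.0
        return (hi - deadline_ms) / (hi - lo)

    def heaviness(cpu_demand, hi=10.0):
        """Fuzzy membership in 'heavy': grows with CPU demand, capped at 1."""
        return min(cpu_demand / hi, 1.0)

    def priority(task):
        # Simple fuzzy aggregation: weighted average of the two memberships.
        return 0.7 * urgency(task.deadline_ms) + 0.3 * heaviness(task.cpu_demand)

    def allocate(tasks, nodes):
        """Assign tasks in priority order to the lowest-latency node that fits."""
        plan = {}
        for task in sorted(tasks, key=priority, reverse=True):
            feasible = [n for n in nodes if n.capacity >= task.cpu_demand]
            if not feasible:
                plan[task.name] = "cloud"          # fall back to the cloud layer
                continue
            node = min(feasible, key=lambda n: n.latency_ms)
            node.capacity -= task.cpu_demand
            plan[task.name] = node.name
        return plan

    tasks = [Task("ecg_alert", 80, 2.0), Task("daily_report", 2000, 6.0)]
    nodes = [FogNode("fog-1", 5.0, 4.0), FogNode("fog-2", 8.0, 9.0)]
    print(allocate(tasks, nodes))   # urgent task goes to the low-latency node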
ISBN (digital): 9798350363609
ISBN (print): 9798350363616
Manifold learning is a nonlinear dimensionality reduction technique that reveals the essential features and structure of data. It has substantial theoretical and industrial value in fields such as data visualization, denoising, and anomaly detection. Locally linear embedding (LLE) is a classic manifold learning algorithm that projects high-dimensional data into a low-dimensional space while preserving its local linear structure. However, LLE's optimization objective measures the linear approximation error with the L2 norm, which can disproportionately amplify large errors. Therefore, this paper uses the L1 norm instead of the L2 norm to overcome this deficiency, at the cost of making the objective function non-smooth. To address this issue, a derivative-free optimization method, the Niederreiter-sequence-initialized Ali Baba and the forty thieves (NSAFT) algorithm, is proposed, and its effectiveness in finding the optimal solution of the LLE objective function is demonstrated through numerical experiments.
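LLE's core step, computing each point's linear-reconstruction weights from its neighbours, can be summarised in a few lines; the sketch below implements the standard squared-L2 version in NumPy and only notes, in a comment, where the paper's L1 objective and derivative-free NSAFT search would replace the closed-form solve. Parameter choices (k, the regulariser) are illustrative.

    # Minimal sketch of LLE's reconstruction-weight step (standard L2 version).
    import numpy as np

    def lle_weights(X, k=5, reg=1e-3):
        """Return the (n, n) weight matrix W with each row summing to 1."""
        n = X.shape[0]
        W = np.zeros((n, n))
        for i in range(n):
            d = np.linalg.norm(X - X[i], axis=1)
            nbrs = np.argsort(d)[1:k + 1]              # k nearest neighbours (skip self)
            Z = X[nbrs] - X[i]                         # centre neighbours on x_i
            G = Z @ Z.T                                # local Gram matrix
            G += reg * np.trace(G) * np.eye(k)         # regularise for stability
            w = np.linalg.solve(G, np.ones(k))         # closed-form L2 solution
            W[i, nbrs] = w / w.sum()                   # enforce the sum-to-one constraint
        return W

    # In the paper's variant, each row's weights instead minimise the non-smooth
    # L1 error ||x_i - sum_j w_ij x_j||_1 subject to sum_j w_ij = 1, which is why
    # a derivative-free search (NSAFT) is used there rather than a linear solve.

    X = np.random.default_rng(0).normal(size=(50, 3))
    W = lle_weights(X)
    print(np.allclose(W.sum(axis=1), 1.0))             # rows sum to 1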
In the Internet environment, software is gradually moving from a closed, static, and controllable state towards an open, dynamic, and unpredictable one. Proposing a suitable software theory for such adaptive software has become a challenging issue for computer science and technology. An applicable formal theoretical basis is one sign that a software technology has reached maturity, yet existing mobile and concurrency theories provide inadequate support for describing, analyzing, and verifying adaptive software architecture. Although software architecture technology has entered a golden era of development, many issues remain to be resolved, one of which is the need for an effective mechanism to describe, analyze, and verify software architectures. Bigraph theory emphasizes two aspects of computation, position (placing) and connection (linking), on the basis of existing theories, and establishes a relatively complete and extensible theoretical framework. Bigraph theory has now begun to be applied, and studies are gradually being carried out on extensions and variants of its foundations, descriptions of concurrency theory, bigraph logics, modeling of pervasive computing systems, and the BPL programming language for bigraphs. Hence, bigraph theory can provide a solid foundation for formal methods for adaptive software architecture.
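Because the abstract's central point is that a bigraph separates where entities sit (the place graph) from how they are connected (the link graph), a toy encoding of those two structures is sketched below for intuition only; it is not the BPL language or any formal bigraph tooling, and all names in it are hypothetical.

    # Toy encoding of a bigraph's two structures: a place graph (nesting of
    # nodes inside regions or other nodes) and a link graph (hyperedges over
    # node ports). Illustrative only; real bigraph tools define far richer terms.
    from dataclasses import dataclass, field

    @dataclass
    class Bigraph:
        parent: dict = field(default_factory=dict)   # place graph: node -> parent
        links: dict = field(default_factory=dict)    # link graph: edge -> set of ports

        def place(self, node, parent):
            self.parent[node] = parent

        def link(self, edge, *ports):
            self.links.setdefault(edge, set()).update(ports)

    # A room containing an agent and a sensor, with the agent and the sensor
    # sharing a communication link -- locality and connectivity kept separate.
    b = Bigraph()
    b.place("room", "region0")
    b.place("agent", "room")
    b.place("sensor", "room")
    b.link("chan", ("agent", 0), ("sensor", 0))
    print(b.parent, b.links)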