ISBN (Print): 9781509053179
The latest video coding standard, High Efficiency Video Coding (HEVC), achieves significantly better compression efficiency than the previous standard, H.264/Advanced Video Coding (AVC), but at the cost of a tremendous increase in encoding computation. Recently, a Bayesian-model-based transform unit (TU) depth decision approach was designed to accelerate the TU depth decision, but it requires numerous variance computations. In this work, a novel relevant-feature-based Bayesian model is proposed for fast TU depth decision. Experimental results demonstrate that the best performance is achieved when the depths of the upper TU, left TU, and co-located TU are all taken into consideration. Moreover, compared with previous research, the proposed algorithm reduces encoding computation considerably more while keeping video quality and compression efficiency largely intact.
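As a rough illustration of the idea, the sketch below models the split decision as a posterior learned from the three neighbouring depths the abstract names. The class structure, Laplace smoothing, and threshold are assumptions made for illustration, not the paper's model.

```python
# Minimal sketch of a feature-based Bayesian TU split decision; the
# probability model and threshold are illustrative placeholders.
from collections import defaultdict

class BayesianTUDecision:
    def __init__(self):
        # counts[(feature_tuple, split)] collected from training frames
        self.counts = defaultdict(int)
        self.totals = defaultdict(int)

    def observe(self, upper, left, colocated, did_split):
        key = (upper, left, colocated)
        self.counts[(key, did_split)] += 1
        self.totals[key] += 1

    def p_split(self, upper, left, colocated):
        # posterior P(split | neighbour depths) with Laplace smoothing
        key = (upper, left, colocated)
        return (self.counts[(key, True)] + 1) / (self.totals[key] + 2)

    def skip_split_check(self, upper, left, colocated, threshold=0.1):
        # terminate the TU split search early when the posterior is small
        return self.p_split(upper, left, colocated) < threshold

model = BayesianTUDecision()
model.observe(1, 1, 1, did_split=False)
print(model.skip_split_check(1, 1, 1))
```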
ISBN (Print): 9781509006243
With this paper, we contribute to the growing research area of feature-based analysis of bio-inspired computing, in which problem instances are classified, according to different features of the underlying problem, in terms of how difficult they are for a particular algorithm to solve. We investigate the impact of different sets of evolved instances on building prediction models for algorithm selection. Building on the work of Poursoltan and Neumann [1], [2], we consider how evolved instances can be used to predict the best-performing algorithm for constrained continuous optimisation from a set of bio-inspired computing methods, namely high-performing variants of differential evolution, particle swarm optimisation, and evolution strategies. Our experimental results show that instances evolved with a multi-objective approach, combined with random instances of the underlying problem, allow us to build a model that accurately predicts the best-performing algorithm for a wide range of problem instances.
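The selection step might look like the following sketch: a classifier trained on instance features to predict the best of the three solver families. The feature matrices, labels, and the use of scikit-learn's RandomForestClassifier are placeholders; the paper's actual feature set and model are not reproduced here.

```python
# Hypothetical algorithm-selection sketch: predict the best-performing
# solver from instance features. All data below are random stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# stand-in feature matrices for evolved and random instances (5 features each)
X_evolved = rng.normal(size=(200, 5))
X_random = rng.normal(size=(200, 5))
X = np.vstack([X_evolved, X_random])

# stand-in labels: index of the best-performing algorithm per instance
y = rng.integers(0, 3, size=len(X))  # 0 = DE, 1 = PSO, 2 = ES

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, y)
print(model.predict(X[:5]))
```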
ISBN (Print): 9781509026135
Image segmentation is an important step in image processing in which an image is divided into several parts, each carrying a certain type of information for the user. It is a difficult step that aims at extracting information from the image, and clustering is commonly used to perform it. Clustering algorithms are data mining algorithms that group data into a given number of clusters such that points within a cluster are as similar as possible to each other and as different as possible from points in other clusters. The proposed algorithm combines the k-means algorithm with the firefly algorithm to cluster image pixels into k clusters for segmentation: since k-means clustering tends to get trapped in local optima, it is optimised using the firefly algorithm. The firefly algorithm is a swarm intelligence based method with many applications in solving optimisation problems, and it and its hybridised versions have been used to solve various problems successfully, though applying it to a wide range of problems often requires modifying it or integrating it with other algorithms. Metaheuristics of this kind currently play an important role and are very efficient at solving NP-hard problems.
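A minimal sketch of such a hybrid follows, assuming each firefly encodes a candidate set of k centroids and brightness is the negative clustering error; the parameter values (beta0, gamma, alpha) are illustrative rather than the paper's settings.

```python
# Firefly-optimised k-means sketch: dimmer fireflies (higher SSE) move
# toward brighter ones; the best centroid set is returned.
import numpy as np

def sse(pixels, centroids):
    # sum of squared distances from each pixel to its nearest centroid
    d = np.linalg.norm(pixels[:, None, :] - centroids[None, :, :], axis=2)
    return np.sum(np.min(d, axis=1) ** 2)

def firefly_kmeans(pixels, k, n_fireflies=10, iters=20,
                   beta0=1.0, gamma=1.0, alpha=0.1, seed=0):
    rng = np.random.default_rng(seed)
    dim = pixels.shape[1]
    # initialise each firefly as a random subset of pixels
    flies = np.array([pixels[rng.choice(len(pixels), k, replace=False)]
                      for _ in range(n_fireflies)], dtype=float)
    for _ in range(iters):
        cost = np.array([sse(pixels, f) for f in flies])
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if cost[j] < cost[i]:  # j is brighter (lower SSE)
                    r = np.linalg.norm(flies[i] - flies[j])
                    beta = beta0 * np.exp(-gamma * r ** 2)
                    flies[i] += (beta * (flies[j] - flies[i])
                                 + alpha * rng.normal(size=(k, dim)))
                    cost[i] = sse(pixels, flies[i])
    return flies[np.argmin(cost)]  # best centroid set found

pixels = np.random.default_rng(1).random((500, 3))  # e.g. RGB values in [0, 1]
print(firefly_kmeans(pixels, k=4).shape)            # (4, 3)
```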
ISBN (Print): 9781509040940
The skyline query is a useful tool for finding attractive products. However, it does little to help select the product combinations with the maximum discount rate. Motivated by this, we identify an interesting problem, the most preferential skyline product (MPSP) combination discovery problem, which is NP-hard, for the first time in the literature. The problem aims to report all skyline product combinations having the maximum discount rate. Since the exact algorithm for MPSP does not scale to large or high-dimensional datasets, we design an incremental greedy algorithm. Experimental results demonstrate the efficiency and effectiveness of the proposed algorithm.
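One hedged reading of the incremental-greedy idea is sketched below: repeatedly add the skyline product that most improves the combination's overall discount rate. The objective and the (original_price, discounted_price) representation are assumptions for illustration; the paper's exact formulation may differ.

```python
# Incremental greedy sketch for building a high-discount product combination.
def greedy_combination(products, size):
    """products: list of (original_price, discounted_price) tuples."""
    def discount_rate(combo):
        orig = sum(p for p, _ in combo)
        disc = sum(d for _, d in combo)
        return 1 - disc / orig if orig else 0.0

    chosen, remaining = [], list(products)
    while len(chosen) < size and remaining:
        # pick the product that most improves the combination's discount rate
        best = max(remaining, key=lambda p: discount_rate(chosen + [p]))
        chosen.append(best)
        remaining.remove(best)
    return chosen, discount_rate(chosen)

combo, rate = greedy_combination(
    [(100, 70), (50, 45), (80, 40), (120, 96)], size=2)
print(combo, rate)  # [(80, 40), (100, 70)] 0.388...
```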
With the rapid development of computer technology, image processing technology has improved significantly. In this paper, a method for moving target tracking based on computer vision is studied. Drawing on the results of Marr's work on vision, we first introduce the framework of computer vision theory via a bottom-up visual tracking processing method. Taking a rocket as a concrete example and applying a segmentation algorithm followed by a tracking algorithm, we find that both the accuracy of separation and the real-time performance of image processing can be enhanced, which makes a significant contribution to target tracking technology.
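A toy bottom-up pipeline in the spirit of this description is sketched below, segmenting by frame differencing and then tracking the object centroid; the threshold and the synthetic frames are placeholders, not the paper's method.

```python
# Segment-then-track sketch: frame differencing followed by centroid tracking.
import numpy as np

def segment_moving(prev_frame, frame, threshold=25):
    # binary mask of pixels that changed significantly between frames
    diff = np.abs(frame.astype(int) - prev_frame.astype(int))
    return diff > threshold

def track_centroid(mask):
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# synthetic example: a bright 5x5 "rocket" moving right across a dark frame
frames = []
for t in range(3):
    f = np.zeros((64, 64), dtype=np.uint8)
    f[30:35, 10 + 5 * t:15 + 5 * t] = 255
    frames.append(f)

for prev, cur in zip(frames, frames[1:]):
    print(track_centroid(segment_moving(prev, cur)))
```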
ISBN (Print): 9781509045662
High-Efficiency Video Coding (HEVC) is the latest video coding standard of the Joint Collaborative Team on Video Coding (JCT-VC). HEVC noticeably improves compression performance compared with previous standards such as H.264, providing a major leap forward in video compression technology. However, this improvement comes at the cost of increased encoding complexity. In this paper, an optimised CU size decision algorithm is proposed to reduce HEVC's computational cost by means of temporal homogeneity classification, which is applied directly to the input image using a GPU.
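The classification step could plausibly be sketched as follows: measure frame-to-frame activity per CTU and map it to a maximum CU depth. The SAD thresholds are invented for illustration, and a CPU pass with NumPy only stands in for the paper's GPU implementation.

```python
# Temporal-homogeneity sketch: low frame-to-frame activity in a 64x64 CTU
# suggests larger CUs (shallower maximum depth).
import numpy as np

def max_cu_depth(prev_frame, frame, ctu=64, thresholds=(500, 5000, 50000)):
    h, w = frame.shape
    depths = np.zeros((h // ctu, w // ctu), dtype=int)
    for by in range(h // ctu):
        for bx in range(w // ctu):
            sl = (slice(by * ctu, (by + 1) * ctu),
                  slice(bx * ctu, (bx + 1) * ctu))
            sad = np.abs(frame[sl].astype(int) - prev_frame[sl].astype(int)).sum()
            # more temporal activity -> allow deeper CU partitioning
            depths[by, bx] = sum(sad > t for t in thresholds)
    return depths

rng = np.random.default_rng(0)
prev = rng.integers(0, 256, (128, 128), dtype=np.uint8)
cur = prev.copy()
cur[:64, :64] = rng.integers(0, 256, (64, 64), dtype=np.uint8)  # one busy CTU
print(max_cu_depth(prev, cur))  # deep split only where the frame changed
```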
ISBN (Print): 9781509044603
Due to the increasing size of datasets, instance selection techniques have been applied to reduce the data to a manageable volume, lowering the computational resources needed for the learning process. Instance selection algorithms can also be applied to remove useless, erroneous, or noisy instances before applying learning algorithms. In recent years, several approaches for instance selection have been proposed; however, most of them have high time complexity and therefore cannot be used with large datasets. In this paper, we present an algorithm called CDIS, which can be viewed as an improvement of a recently proposed density-based approach for instance selection. The main contribution of this paper is a formal characterization of the novel density function adopted by the CDIS algorithm. CDIS evaluates the instances of each class separately and keeps only the densest instances in a given (arbitrary) neighbourhood, which ensures a reasonably low time complexity. Our approach was evaluated on 20 well-known datasets, and its performance was compared with that of 6 state-of-the-art algorithms on three measures: accuracy, reduction, and effectiveness. To evaluate the accuracy achieved on the datasets produced by the algorithms, we applied the KNN algorithm. The results show that our approach achieves a performance, in terms of the balance between accuracy and reduction, that is better than or comparable to that of the other algorithms considered in the evaluation.
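A sketch in the spirit of such density-based selection is given below. The density function (inverse mean distance to the m nearest same-class neighbours) and the radius rule are stand-ins, not the formal definition the paper characterises.

```python
# Per-class density-based instance selection sketch: score instances by
# local density and keep only the densest one within each neighbourhood.
# Assumes each class has more than m instances.
import numpy as np

def select_instances(X, y, m=5, radius=1.0):
    keep = np.zeros(len(X), dtype=bool)
    for cls in np.unique(y):
        idx = np.where(y == cls)[0]
        pts = X[idx]
        dists = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
        # density: inverse mean distance to the m nearest neighbours (skip self)
        density = 1.0 / (np.sort(dists, axis=1)[:, 1:m + 1].mean(axis=1) + 1e-9)
        for order in np.argsort(-density):  # densest first
            # keep only if no already-kept class instance lies within radius
            if not keep[idx].any() or dists[order][keep[idx]].min() > radius:
                keep[idx[order]] = True
    return X[keep], y[keep]

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(4, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
Xs, ys = select_instances(X, y)
print(len(Xs), "of", len(X), "instances kept")
```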
ISBN (Print): 9781509012701
This paper focuses on the time analysis of different matrix multiplication algorithms. The methods invented by Karatsuba and Strassen are analysed and implemented, and their theoretical and measured running times are calculated. The Karatsuba and Strassen algorithms are then merged, and by combining the two a new approach is designed that can be considered a method for reducing time complexity.
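For reference, a standard Strassen implementation (the matrix analogue of Karatsuba's trade of one multiplication for extra additions) is sketched below; the paper's merged Karatsuba/Strassen variant itself is not reproduced here. It assumes square matrices whose size is a power of two.

```python
# Strassen's algorithm: 7 recursive multiplications instead of 8.
import numpy as np

def strassen(A, B, cutoff=32):
    n = A.shape[0]
    if n <= cutoff:               # fall back to ordinary multiplication
        return A @ B
    h = n // 2
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
    M1 = strassen(A11 + A22, B11 + B22, cutoff)
    M2 = strassen(A21 + A22, B11, cutoff)
    M3 = strassen(A11, B12 - B22, cutoff)
    M4 = strassen(A22, B21 - B11, cutoff)
    M5 = strassen(A11 + A12, B22, cutoff)
    M6 = strassen(A21 - A11, B11 + B12, cutoff)
    M7 = strassen(A12 - A22, B21 + B22, cutoff)
    C = np.empty_like(A)
    C[:h, :h] = M1 + M4 - M5 + M7
    C[:h, h:] = M3 + M5
    C[h:, :h] = M2 + M4
    C[h:, h:] = M1 - M2 + M3 + M6
    return C

A = np.random.rand(64, 64)
B = np.random.rand(64, 64)
assert np.allclose(strassen(A, B), A @ B)
```

The recurrence T(n) = 7 T(n/2) + O(n^2) gives the familiar O(n^log2(7)) ≈ O(n^2.807) bound, just as Karatsuba's 3 half-size multiplications give O(n^log2(3)) for integers.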
ISBN (Print): 9781509053827
In this paper, we analyse how multiple competing cloud platforms set effective service prices between Web service providers and consumers. We propose a novel economic framework to model this problem. Cloud platforms run double auction mechanisms, in which Web services are the commodity traded by service providers (sellers) and service consumers (buyers), and multiple cloud platforms compete against each other to attract providers and consumers. Specifically, we use game theory to analyse the pricing policies of competing cloud platforms, where service providers and consumers can choose to participate in any of the platforms and bid or ask for the Web service. The platform selection and bidding strategies of service providers and consumers are affected by the pricing policies and vice versa, so we propose a co-learning algorithm based on fictitious play to analyse this problem. In more detail, we investigate a setting with two competing cloud platforms that can adopt either an equilibrium k-pricing policy or a discriminatory k-pricing policy. We find that when both cloud platforms use the same type of pricing policy, they can co-exist in equilibrium, and they show an extreme bias toward service providers or consumers when setting k. When the platforms adopt different types of policies, we find that all service providers and consumers converge to the discriminatory k-pricing policy, and so the two competing platforms can no longer co-exist.
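The two pricing rules can be sketched as follows, assuming a standard double-auction matching of sorted bids and asks; the sketch illustrates only the rules themselves, not the platforms' learned strategies or the fictitious-play dynamics.

```python
# Double-auction k-pricing sketch: k in [0, 1] splits the trade surplus
# between buyer (bid) and seller (ask).
def match(bids, asks):
    # greedy matching: highest bids against lowest asks while bid >= ask
    bids, asks = sorted(bids, reverse=True), sorted(asks)
    return [(b, a) for b, a in zip(bids, asks) if b >= a]

def discriminatory_k(bids, asks, k):
    # each matched pair trades at its own price between ask and bid
    return [k * b + (1 - k) * a for b, a in match(bids, asks)]

def equilibrium_k(bids, asks, k):
    # all matched pairs trade at one price set by the marginal pair
    pairs = match(bids, asks)
    if not pairs:
        return []
    b_m, a_m = pairs[-1]          # least competitive matched pair
    price = k * b_m + (1 - k) * a_m
    return [price] * len(pairs)

print(discriminatory_k([10, 8, 6], [4, 7, 9], k=0.5))  # [7.0, 7.5]
print(equilibrium_k([10, 8, 6], [4, 7, 9], k=0.5))     # [7.5, 7.5]
```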
In this paper, the security issue is investigated for networked control systems (NCSs) in which the physical plant is controlled by a remote observer-based controller. The communication channel from the system measurement to the remote control centre is vulnerable to attacks from malicious adversaries; here, false data injection (FDI) attacks are considered. The aim is to find so-called insecurity conditions under which the NCS is insecure, in the sense that there exist FDI attacks that can bypass the anomaly detector yet still destabilize the overall system. In particular, a new necessary and sufficient condition for insecurity is derived when the communication channel is compromised by the adversary. Moreover, a specific algorithm is proposed with which the NCS is shown to be insecure. A simulation example demonstrates the usefulness of the proposed conditions and algorithms for the secure control problem.
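A toy example of the insecurity notion is sketched below: a stealthy FDI attack built on an unstable eigenvector of A makes the corrupted output consistent with a shifted plant state, so the detector residual decays like a benign transient while the estimation error diverges. The matrices, gains, and attack construction are illustrative assumptions, not the paper's specific condition.

```python
# Stealthy FDI sketch: inject a_k = C z_k with z_{k+1} = A z_k, z_0 along
# the unstable eigenvector of A. The residual decays while ||x - xhat|| grows.
import numpy as np

A = np.array([[1.1, 0.0], [0.1, 0.9]])   # open-loop unstable mode at 1.1
C = np.array([[1.0, 0.0]])
L = np.array([[0.5], [0.1]])             # observer gain; (A - LC) is stable

eigvals, eigvecs = np.linalg.eig(A)
v = eigvecs[:, np.argmax(eigvals.real)].real  # unstable eigenvector

x = np.zeros(2)        # true plant state (zero input for simplicity)
xhat = np.zeros(2)     # observer estimate
z = v.copy()           # attacker's internal state, grows along v

for k in range(31):
    a = C @ z                              # injected false data
    residual = (C @ x + a) - C @ xhat      # what the anomaly detector monitors
    xhat = A @ xhat + L @ residual         # observer assimilates the attack
    x = A @ x
    z = A @ z
    if k % 10 == 0:
        print(k, abs(residual.item()), np.linalg.norm(x - xhat))
```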