ISBN (print): 9781467345729
A brain-computer interface (BCI) system translates a person's brain activity into useful control or communication signals. In this paper, an effective P300-based BCI identification algorithm using median filtering and a Bayesian classifier is proposed to improve the classification accuracy and computational efficiency of P300-based BCI. Median filtering is first applied to remove noise, and Bayesian Linear Discriminant Analysis (BLDA) is then employed for classification. Testing on the P300 speller paradigm in dataset II of BCI Competition III (2004), we show that a 90% average classification accuracy can be achieved, with the highest accuracy reaching 100%. The proposed method is also computationally efficient, making it a practical implementation for human-computer communication and control, especially for on-line applications.
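As a rough illustration of the pipeline described above (not the authors' exact implementation), the sketch below median-filters single-channel epochs and trains a Bayesian linear model; scikit-learn's BayesianRidge is used as a stand-in for BLDA, and the epoch data and labels are placeholders:

    import numpy as np
    from scipy.signal import medfilt
    from sklearn.linear_model import BayesianRidge  # stand-in for BLDA

    def preprocess(epochs, kernel=5):
        # epochs: (n_trials, n_samples) single-channel EEG; odd kernel width.
        return np.array([medfilt(e, kernel_size=kernel) for e in epochs])

    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 64))           # placeholder P300 epochs
    y = np.where(rng.random(200) < 0.2, 1, -1)   # placeholder target labels

    clf = BayesianRidge().fit(preprocess(X), y.astype(float))
    scores = clf.predict(preprocess(X))   # in a speller, one would pick the
    print((np.sign(scores) == y).mean())  # row/column whose epochs score highest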
In this paper, we analyze dual-pivot Quicksort, a variant of the standard Quicksort algorithm in which two pivots are used for partitioning the array. We solve recurrences for the expected number of key comparisons and exchanges performed by the algorithm, obtaining the exact and asymptotic average values that contribute to its time complexity. Further, we compute the average number of partitioning stages and the variance of the number of key comparisons. In terms of mean values, dual-pivot Quicksort does not appear to be faster than the ordinary algorithm.
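The paper analyzes recurrences rather than giving code; for reference, a standard Yaroslavskiy-style dual-pivot partition looks roughly like the sketch below (an in-place Python version, not the paper's exact formulation):

    def dual_pivot_quicksort(a, lo=0, hi=None):
        # Sort a in place using two pivots p <= q.
        if hi is None:
            hi = len(a) - 1
        if lo >= hi:
            return
        if a[lo] > a[hi]:
            a[lo], a[hi] = a[hi], a[lo]
        p, q = a[lo], a[hi]
        lt, gt, i = lo + 1, hi - 1, lo + 1
        while i <= gt:
            if a[i] < p:                      # belongs to the left part
                a[i], a[lt] = a[lt], a[i]
                lt += 1
            elif a[i] > q:                    # belongs to the right part
                while a[gt] > q and i < gt:
                    gt -= 1
                a[i], a[gt] = a[gt], a[i]
                gt -= 1
                if a[i] < p:
                    a[i], a[lt] = a[lt], a[i]
                    lt += 1
            i += 1
        lt -= 1
        gt += 1
        a[lo], a[lt] = a[lt], a[lo]           # move pivots into place
        a[hi], a[gt] = a[gt], a[hi]
        dual_pivot_quicksort(a, lo, lt - 1)   # three-way recursion
        dual_pivot_quicksort(a, lt + 1, gt - 1)
        dual_pivot_quicksort(a, gt + 1, hi)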
Background: The double-cut-and-join (DCJ) is a model that can efficiently sort one genome into another, generalizing the typical mutations (inversions, fusions, fissions, translocations) to which genomes are subject, but allowing circular chromosomes to exist at intermediate steps. In the general model, many circular chromosomes can coexist at some intermediate step. However, when the compared genomes are linear, it is more plausible to use the so-called restricted DCJ model, in which a circular chromosome is reincorporated immediately after its creation. These two consecutive DCJ operations, which create and reincorporate a circular chromosome, mimic a transposition or a block interchange. When the compared genomes have the same content, the genomic distance under the restricted DCJ model is known to equal the distance under the general model. If the genomes have unequal content, it is necessary to consider indels, i.e., insertions and deletions of DNA segments, in addition to DCJ operations. Linear-time algorithms have been proposed to compute the distance and to find a sorting scenario in a general, unrestricted DCJ-indel model that considers both DCJ operations and indels. Results: In the present work we consider the restricted DCJ-indel model for sorting linear genomes with unequal content. We allow DCJ operations and indels with the following constraint: if a circular chromosome is created by a DCJ, it has to be reincorporated in the next step (no other DCJ or indel can be applied between the creation and the reincorporation of a circular chromosome). We then develop a sorting algorithm and give a tight upper bound for the restricted DCJ-indel distance. Conclusions: We have given a tight upper bound for the restricted DCJ-indel distance. Whether this bound can be reduced so that the general and restricted DCJ-indel distances are equal remains an open question.
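As a minimal illustration of the DCJ operation itself (not the paper's restricted sorting algorithm), a genome can be represented as a set of adjacencies between gene extremities ("2t" and "2h" denote the tail and head of gene 2); one DCJ cuts two adjacencies and rejoins the same four extremities differently. In the restricted model, a DCJ that excises a circular chromosome must be immediately followed by the DCJ that reincorporates it:

    def dcj(adjacencies, cut1, cut2, join1, join2):
        """Apply one DCJ: cut adjacencies cut1 and cut2, then rejoin the
        same four extremities as join1 and join2."""
        assert set(cut1) | set(cut2) == set(join1) | set(join2)
        adj = set(adjacencies)
        adj.discard(frozenset(cut1))
        adj.discard(frozenset(cut2))
        adj.add(frozenset(join1))
        adj.add(frozenset(join2))
        return adj

    # Inversion of gene 2 inside the linear chromosome (1 2 3),
    # yielding (1 -2 3):
    genome = {frozenset({"1h", "2t"}), frozenset({"2h", "3t"})}
    genome = dcj(genome, ("1h", "2t"), ("2h", "3t"),
                 ("1h", "2h"), ("2t", "3t"))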
ISBN (print): 9781467352345
Multiclass cancer classification is still a challenging task in the field of machine learning. In this work, a novel multiclass approach is proposed as a combination of multiple binary classifiers. It is an instance of the Error-Correcting Output Codes (ECOC) family of algorithms, which apply coding techniques from data transmission to improve classification by combining binary classifiers. The proposed method combines the One Against All (OAA) approach with a set of classifiers separating each class pair from the rest, called Pair Against All (PAA). The OAA+PAA approach has been tested on seven publicly available datasets and compared with the common OAA approach and with state-of-the-art alternatives. The results show that the OAA+PAA algorithm consistently improves on the OAA results, unlike other ECOC algorithms presented in the literature.
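A hedged sketch of the OAA+PAA idea viewed as an ECOC code matrix: one one-against-all column per class plus one pair-against-all column per class pair, each trained as a binary classifier (LogisticRegression is an arbitrary stand-in here), with nearest-codeword Hamming decoding. The paper's actual base classifiers and decoding rule may differ:

    from itertools import combinations
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def fit_oaa_paa(X, y):
        classes = np.unique(y)
        # OAA columns (singletons) followed by PAA columns (class pairs).
        columns = [(c,) for c in classes] + list(combinations(classes, 2))
        code = np.array([[1 if c in col else -1 for col in columns]
                         for c in classes])
        models = [LogisticRegression(max_iter=1000)
                  .fit(X, np.isin(y, col).astype(int)) for col in columns]
        return classes, code, models

    def predict_oaa_paa(X, classes, code, models):
        bits = np.column_stack([2 * m.predict(X) - 1 for m in models])
        # Pick the class whose codeword has minimal Hamming distance.
        dist = (bits[:, None, :] != code[None, :, :]).sum(-1)
        return classes[dist.argmin(axis=1)]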
ISBN (print): 9781467312882
Traditional packet classification algorithms in Gigabit Intrusion Detection Systems (GIDS) focus on static characteristics of the signatures and ignore traffic characteristics entirely. In this paper we argue that the efficiency of a classification algorithm depends on how the current traffic visits the classification tree: the more evenly the tree partitions the traffic, the more efficient it is. We therefore exploit optimization methods based on dynamic traffic characteristics. Our contributions are threefold. First, a best classification tree is formally defined as the one minimizing the visit cost of the traffic in a time slot, and optimization methods are developed on this basis. Second, Packet Feature Entropy is proposed to measure how efficiently a packet field can partition the traffic, and the 14 popular packet fields used in Snort are investigated in detail using a 10 Gbps backbone trace and NetFlow data. Finally, adaptive updating strategies are discussed by analyzing the experimental results.
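The notion of Packet Feature Entropy, as described, can be read as the empirical Shannon entropy of a field's value distribution over a time slot; a rough sketch (the field name and data below are illustrative, not from the paper):

    from collections import Counter
    from math import log2

    def field_entropy(values):
        """Empirical Shannon entropy (bits) of one packet field in a slot."""
        counts = Counter(values)
        n = len(values)
        return -sum(c / n * log2(c / n) for c in counts.values())

    # Example: destination ports observed in one time slot.
    ports = [80, 80, 443, 53, 80, 22, 443, 80]
    print(field_entropy(ports))  # higher => the field splits traffic more evenly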
ISBN (print): 9781467317139
Document classification is a well-known problem focused on assigning predefined labels or categories to the documents in a searched collection. Many classical algorithms have been developed to solve this problem, but they usually have high time complexity, and as the number of documents grows it becomes necessary to find algorithms that can produce a solution in reasonable time. Such algorithms are often inspired by biological processes. Yet even these meta-heuristic algorithms become too slow when the number of documents is very large, and they must be optimized for faster processing. This paper describes a document classification algorithm based on Particle Swarm Optimization, with implementations on one and two GPUs.
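For reference, a generic CPU-side PSO skeleton with the standard inertia-weight velocity and position updates; the paper's GPU implementation and its document-classification fitness function are not reproduced here, so `fitness` is a placeholder to be minimized:

    import numpy as np

    def pso(fitness, dim, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5):
        rng = np.random.default_rng(0)
        x = rng.uniform(-1, 1, (n_particles, dim))   # particle positions
        v = np.zeros((n_particles, dim))             # particle velocities
        pbest, pbest_f = x.copy(), np.array([fitness(p) for p in x])
        gbest = pbest[pbest_f.argmin()]
        for _ in range(iters):
            r1, r2 = rng.random((2, n_particles, dim))
            # Inertia term plus cognitive (pbest) and social (gbest) pulls.
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = x + v
            f = np.array([fitness(p) for p in x])
            better = f < pbest_f
            pbest[better], pbest_f[better] = x[better], f[better]
            gbest = pbest[pbest_f.argmin()]
        return gbest

    best = pso(lambda p: np.sum(p ** 2), dim=10)  # toy sphere objective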
ISBN (print): 9781467351416
Many applications need to process streams, analyzing and monitoring their data to infer useful information. Database researchers are building Data Stream Management Systems (DSMS) so that applications can submit queries to obtain timely information from streams. Unlike a traditional database system, managing and processing a stream database raises several challenges. In this paper, we present a new load-shedding scheme for queries consisting of one or more aggregate operators with sliding windows. We use different types of window aggregate functions to decide which tuples to drop from the data stream. The method is aware of the window properties of the aggregate operators in the query plan: it logically divides the input stream into windows and probabilistically decides which tuples to drop based on the window function. This decision is then encoded into the tuples by marking the ones that are disallowed from starting new windows. Unlike previous methods, ours preserves the consistency of windows throughout the query plan and always delivers subsets of the original query answers with negligible degradation in result quality.
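One plausible reading of the marking scheme described above, sketched for a single stream (this is an interpretation, not the paper's algorithm): drop tuples at random, and bar the tuple that follows a drop from opening a new window, so every window that is actually emitted downstream is built from a consistent subset:

    import random

    def shed(stream, drop_prob=0.3, seed=42):
        """Yield (tuple, may_start_window) pairs; dropped tuples are skipped,
        and the first survivor after a drop is barred from opening a window."""
        rng = random.Random(seed)
        barred = False
        for t in stream:
            if rng.random() < drop_prob:
                barred = True      # the drop invalidates the next boundary
                continue
            yield t, not barred
            barred = False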
ISBN (print): 9781467311564
The goal of this paper is to identify primary (licensed) users while the secondary (non-licensed) system is transmitting in the same frequency band without a Quiet Period. The paper considers LTE operating in TV channels, where DVB-T and PMSE transmitters are regarded as primary users by default. In this context, the LTE system acts as a secondary system that i) first verifies whether a TV channel is available for opportunistic access, and ii) then starts using the TV channel if the band has been identified as free. However, since a primary user may start transmitting at any time, identification becomes very difficult if the LTE system continues to transmit, because a secondary receiver could detect its own secondary system rather than the primary user. This paper therefore assumes that the LTE system embeds classification capabilities, allowing primary users to be discriminated from LTE transmissions. Finally, simulation results show the efficiency of the proposed classification algorithm with respect to regulatory requirements.
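The abstract does not specify the classifier, so the sketch below shows only a generic energy detector, a common baseline for primary-user detection, with a threshold from the Gaussian approximation of the test statistic under the noise-only hypothesis (all names and parameters are illustrative):

    import numpy as np
    from scipy.stats import norm

    def energy_detect(samples, noise_power, pfa=0.01):
        """Decide primary-user presence from complex baseband sample energy."""
        n = len(samples)
        energy = np.mean(np.abs(samples) ** 2)
        # Threshold calibrated for a target false-alarm probability pfa.
        threshold = noise_power * (1.0 + norm.ppf(1.0 - pfa) / np.sqrt(n))
        return energy > threshold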
ISBN (print): 9781467310680
This paper presents a binary classification algorithm based on minimizing the energy of the slack variables, called the Mean Squared Slack (MSS). A novel kernel extension is proposed that retains only the subset of input patterns that are misclassified during training. The latter leads to a time- and memory-efficient system that converges in a few iterations. Two datasets are used for performance evaluation, namely the Adult and the Vertebral Column datasets. Experimental results demonstrate the effectiveness of the proposed algorithm with respect to computation time and scalability, while accuracy remains high: 84.951% on the Adult dataset and 91.935% on the Vertebral Column dataset, outperforming state-of-the-art methods.
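A hedged sketch of the linear (non-kernel) MSS objective as described: with slack s_i = max(0, 1 - y_i(w.x_i + b)), minimize the mean of s_i^2. Plain gradient descent is used here for illustration; the paper's optimizer and kernel extension are not reproduced:

    import numpy as np

    def fit_mss(X, y, lr=0.1, iters=500):
        """Minimize mean squared slack; y must be in {-1, +1}."""
        n, d = X.shape
        w, b = np.zeros(d), 0.0
        for _ in range(iters):
            s = np.maximum(0.0, 1.0 - y * (X @ w + b))   # slack variables
            # d/dw mean(s^2) = (-2/n) * X^T (s * y); zero-slack terms vanish.
            w -= lr * (-2.0 / n) * (X.T @ (s * y))
            b -= lr * (-2.0 / n) * np.sum(s * y)
        return w, b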
ISBN (print): 9781467317139
In this paper, working toward a brain-computer interface for information retrieval, we performed preliminary experiments on the post-saccadic event-related potential (ERP) during web browsing. A two-channel electroencephalogram was analyzed for five subjects during simple string-retrieval tasks in a web browser involving active (saccadic) eye movements. After the saccadic eye movements, ERP signals were observed for target string retrievals but not for standard string retrievals. A classification algorithm was applied, and the average decoding performance for single-shot post-saccadic ERPs reached 78.2% recall and 77.6% precision. Eye tracking was also studied, and the gaze points were confirmed to be on and around the target strings when the ERP was elicited. These results suggest that the post-saccadic ERP combined with eye tracking could be used in a brain-computer interface for information retrieval.
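The abstract leaves the pipeline details open; a minimal sketch of the general approach, assuming saccade-end timestamps are available from the eye tracker, would cut fixed-length post-saccadic epochs and feed them to a linear classifier (function and variable names here are hypothetical):

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    def epochs_after_saccades(eeg, saccade_ends, fs, length_s=0.6):
        """Cut fixed-length windows from (n_channels, n_samples) EEG,
        starting at each saccade-end sample index."""
        n = int(length_s * fs)
        return np.stack([eeg[:, t:t + n].ravel() for t in saccade_ends
                         if t + n <= eeg.shape[1]])

    # X = epochs_after_saccades(eeg, saccade_ends, fs=256)
    # clf = LinearDiscriminantAnalysis().fit(X, labels)  # target vs. standard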