With the rapid development of Internet technology, crowdsourcing has begun to receive more and more attention as a flexible, effective, and low-cost problem-solving method, and using crowdsourcing to evaluate the quality of linked data has become a research hotspot. This paper proposes the concept of the Domain Specialization Test (DST), which uses domain-specific test tasks (DSTs) to evaluate the professionalism of workers, and improves the EM algorithm with the idea of Mini-batch Gradient Descent (MBGD), yielding the MBEM algorithm for efficient and accurate evaluation of task results. The experimental results show that the proposed method can screen out appropriate workers for linked-data crowdsourcing tasks and improve the accuracy and iteration efficiency of the results.
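As a rough illustration of the idea (not the paper's exact formulation), the sketch below runs a Dawid-Skene-style EM loop for crowdsourced label aggregation in which both the E-step and the worker-reliability update touch only a random mini-batch of tasks per iteration; the function name mini_batch_em, the single-accuracy worker model, and all parameter defaults are assumptions for illustration.

```python
import numpy as np

def mini_batch_em(labels, n_workers, n_tasks, n_classes=2,
                  batch_size=64, n_iters=50, lr=0.5, seed=0):
    """EM-style label aggregation whose updates use mini-batches of tasks.

    labels : list of (task_id, worker_id, answer) tuples with integer answers.
    Returns (task_posteriors, worker_accuracy) -- a simplified MBEM-like sketch.
    """
    rng = np.random.default_rng(seed)
    # Initialize task posteriors by majority vote and worker accuracy uniformly.
    votes = np.zeros((n_tasks, n_classes))
    for t, w, a in labels:
        votes[t, a] += 1
    post = votes / np.maximum(votes.sum(axis=1, keepdims=True), 1)
    acc = np.full(n_workers, 0.7)          # prior belief in worker accuracy

    by_task = [[] for _ in range(n_tasks)]
    by_worker = [[] for _ in range(n_workers)]
    for t, w, a in labels:
        by_task[t].append((w, a))
        by_worker[w].append((t, a))

    for _ in range(n_iters):
        # E-step: recompute posteriors for a random mini-batch of tasks only.
        batch = rng.choice(n_tasks, size=min(batch_size, n_tasks), replace=False)
        for t in batch:
            logp = np.zeros(n_classes)
            for w, a in by_task[t]:
                p_ok = np.clip(acc[w], 1e-3, 1 - 1e-3)
                for c in range(n_classes):
                    logp[c] += np.log(p_ok if a == c
                                      else (1 - p_ok) / (n_classes - 1))
            logp -= logp.max()
            post[t] = np.exp(logp) / np.exp(logp).sum()

        # M-step: damped partial update of worker accuracies, using only the
        # labels that touch the current mini-batch (the MBGD-flavoured step).
        batch_set = set(batch.tolist())
        for w in range(n_workers):
            obs = [(t, a) for t, a in by_worker[w] if t in batch_set]
            if not obs:
                continue
            est = np.mean([post[t][a] for t, a in obs])   # expected fraction correct
            acc[w] = (1 - lr) * acc[w] + lr * est         # mini-batch nudge

    return post, acc
```

The damping factor plays the role of the learning rate in MBGD: each mini-batch nudges the worker-reliability estimates instead of recomputing them from the full label set, which is what makes the iterations cheaper.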
With the extensive application of knowledge bases (KBs), how to complete them is a hot topic on the Semantic Web. However, many problems come with big data, and event matching is one of them: finding the entities that refer to the same thing in the real world, which is also the key step in the extension process. To enrich the emergency knowledge base (E-SKB) we constructed before, we need to filter the news gathered from several web pages and identify identical news items to avoid data redundancy. In this paper, we propose a hierarchical blocking method that reduces the number of comparisons and narrows down the scope by extracting news properties as blocking keys; the method transforms the event matching problem into a clustering problem. Experimental results show that the proposed method is superior to existing text clustering algorithms, with higher precision and fewer comparisons.
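A minimal sketch of hierarchical blocking is given below, assuming the news properties (e.g., date and place) have already been extracted; the function hierarchical_blocking and the two-level key choice are illustrative, not the paper's exact scheme.

```python
from collections import defaultdict
from itertools import combinations

def hierarchical_blocking(news_items, key_funcs):
    """Group items by a sequence of increasingly specific blocking keys,
    so that pairwise comparison only happens inside the finest blocks.

    news_items : list of dicts with extracted properties (e.g. date, place).
    key_funcs  : list of functions, coarse to fine, each mapping an item to a key.
    Yields candidate pairs (i, j) of item indices to compare.
    """
    blocks = {(): list(range(len(news_items)))}
    for key_fn in key_funcs:
        refined = defaultdict(list)
        for prefix, idxs in blocks.items():
            for i in idxs:
                refined[prefix + (key_fn(news_items[i]),)].append(i)
        blocks = refined
    for idxs in blocks.values():
        yield from combinations(idxs, 2)

# Hypothetical usage: only items agreeing on both date and place are compared.
items = [{"date": "2020-01-01", "place": "Beijing"},
         {"date": "2020-01-01", "place": "Beijing"},
         {"date": "2020-01-02", "place": "Shanghai"}]
pairs = list(hierarchical_blocking(items,
                                   [lambda x: x["date"], lambda x: x["place"]]))
```

Only items that agree on every blocking key are compared pairwise, which is what shrinks the candidate set before the clustering step.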
Feature selection based on information theory plays an important role in classification algorithms due to its computational efficiency and independence from the classification method, and it is widely used in application areas such as data mining, bioinformatics, and machine learning. A drawback of these methods is that they neglect feature interaction and overestimate feature significance because of the limitations of their goal-function criteria. To address this problem, we propose a new feature goal function, RJMIM. The method employs joint mutual information and interaction information, which alleviates the overestimation of feature significance, as demonstrated both theoretically and experimentally. Experiments were conducted to verify the performance of the proposed method, comparing it with four well-known feature selection methods on three publicly available datasets from UCI; average classification accuracy with the C4.5 classifier is used to assess the effectiveness of the RJMIM method.
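Since the exact RJMIM goal function is not reproduced here, the sketch below uses a JMIM-style stand-in as an assumption: the first feature maximizes I(f; Y), and each subsequent feature maximizes the minimum joint mutual information min_s I(f, s; Y) over already-selected features, which is one way to account for feature interaction; all function names and the discretized-input assumption are illustrative.

```python
import numpy as np
from collections import Counter

def _H(*cols):
    """Empirical joint Shannon entropy (bits) of one or more discrete columns."""
    joint = list(zip(*cols))
    n = len(joint)
    return -sum((c / n) * np.log2(c / n) for c in Counter(joint).values())

def _I(a, b):
    """Mutual information I(a; b) = H(a) + H(b) - H(a, b)."""
    return _H(a) + _H(b) - _H(a, b)

def _joint_I(a, b, y):
    """Joint mutual information I(a, b; y) = H(a, b) + H(y) - H(a, b, y)."""
    ab = list(zip(a, b))
    return _H(ab) + _H(y) - _H(ab, y)

def select_features(X, y, k):
    """Greedy forward selection with a JMIM-style criterion.

    X : 2-D array of discretized features (n_samples, n_features); y : labels.
    Returns the indices of the k selected features.
    """
    remaining = set(range(X.shape[1]))
    selected = [max(remaining, key=lambda f: _I(X[:, f], y))]
    remaining -= {selected[0]}
    while len(selected) < k and remaining:
        # Keep the feature whose weakest joint MI with any selected feature is largest.
        best = max(remaining,
                   key=lambda f: min(_joint_I(X[:, f], X[:, s], y) for s in selected))
        selected.append(best)
        remaining.remove(best)
    return selected
```

Taking the minimum over already-selected features penalizes candidates that are only informative in combination with features already chosen, which is the intuition behind criteria of this family.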
Information entropy and its extensions, which are important generalizations of entropy, have been applied in many research domains today. In this paper, a novel generalized relative entropy is constructed to avoid some ...
Some open theoretical questions are addressed on how the mind and brain represent and process concepts, particularly as they are instantiated in particular human languages. Recordings of neuroimaging data should provi...
Authors: Beierle, Felix; Aizawa, Akiko; Beel, Joeran
Affiliations: Service-centric Networking, Technische Universität Berlin / Telekom Innovation Laboratories, Berlin, Germany; Digital Content and Media Sciences Research Division, Tokyo, Japan; Trinity College Dublin, School of Computer Science and Statistics, Intelligent Systems Discipline, Knowledge and Data Engineering Group, ADAPT Centre, Dublin, Ireland
We investigate the problem of choice overload - the difficulty of making a decision when faced with many options - when displaying related-article recommendations in digital libraries. So far, research regarding to ho...
Objectives The purpose of this study was to evaluate the accuracy of noninvasive reconstructions of epicardial potentials, electrograms, activation and recovery isochrones, and beat origins by simultaneously performing electrocardiographic imaging (ECGI) and invasive epicardial electrography in intact animals. Background Noninvasive imaging of electrical potentials at the epicardium, known as ECGI, is increasingly applied in patients to assess normal and abnormal cardiac electrical activity. Methods Body-surface potentials and epicardial potentials were recorded in normal anesthetized dogs. Computed tomography scanning provided a torso-heart geometry that was used to reconstruct epicardial potentials from body-surface potentials. Results Electrogram reconstructions attained a moderate accuracy compared with epicardial recordings (median correlation coefficient: 0.71), but with considerable variation (interquartile range: 0.36 to 0.86). This variation could be explained by a spatial mismatch (overall resolution was <20 mm) that was most apparent in regions with electrographic transition. More accurate derivation of activation times (Pearson R: 0.82), recovery times (R: 0.73), and the origin of paced beats (median error: 10 mm; interquartile range: 7 to 17 mm) was achieved by a spatiotemporal approach that incorporates the characteristics of the respective electrogram and neighboring electrograms. Reconstruction of beats from repeated single-site pacing showed a stable localization of origin. Cardiac motion, currently ignored in ECGI, correlates negatively with reconstruction accuracy. Conclusions ECGI shows a decent median accuracy, but variability in electrogram reconstruction can be sizable. At present, therefore, clinical interpretations of ECGI should not be made on the basis of single electrograms only. Incorporating local spatiotemporal characteristics allows for accurate reconstruction of epicardial activation and recovery patterns, and beat origin localization.
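As a hedged illustration of such a spatiotemporal refinement (not the study's actual algorithm), the sketch below first takes each site's activation time as the instant of steepest negative deflection (minimum dV/dt) of the reconstructed electrogram, and then re-estimates sites that disagree strongly with the median of their neighbors by searching within a window around that neighborhood consensus; the function activation_times, the neighbor structure, and the 20-ms window are assumptions.

```python
import numpy as np

def activation_times(egms, neighbors, fs=1000.0, window_ms=20.0):
    """Estimate activation times from reconstructed unipolar electrograms.

    egms      : (n_sites, n_samples) array of electrograms.
    neighbors : list of index lists, neighbors[i] = sites adjacent to site i.
    fs        : sampling rate in Hz.
    Returns per-site activation times in milliseconds.
    """
    dvdt = np.diff(egms, axis=1)
    t_act = np.argmin(dvdt, axis=1) / fs * 1000.0         # steepest negative slope, ms

    half_win = window_ms / 2.0
    refined = t_act.copy()
    for i, nbrs in enumerate(neighbors):
        if not nbrs:
            continue
        ref = np.median(t_act[nbrs])                       # neighborhood consensus
        if abs(t_act[i] - ref) > half_win:                 # outlier vs. neighbors
            lo = max(int((ref - half_win) * fs / 1000.0), 0)
            hi = min(int((ref + half_win) * fs / 1000.0), dvdt.shape[1])
            if hi > lo:
                # Re-search for the local min dV/dt near the neighbors' estimate.
                refined[i] = (lo + np.argmin(dvdt[i, lo:hi])) / fs * 1000.0
    return refined
```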
In order to retrieve unlabeled images by textual queries, cross-media similarity computation is a key ingredient. Although novel methods are continuously introduced, little has been done to evaluate these methods to...
Cross-modal semantic mapping and cross-media retrieval are key problems of the multimedia search field. This study analyzes the hierarchy, the functionality, and the structure of the visual and auditory sensations of the cognitive system, and establishes a brain-like cross-modal semantic mapping framework based on cognitive computing of visual and auditory sensations. The mechanisms of visual-auditory multisensory integration, selective attention in the thalamo-cortical circuit, emotional control in the limbic system, and memory enhancement in the hippocampus were considered in the framework, and on this basis the algorithms of cross-modal semantic mapping were designed. Results show that the framework can be effectively applied to cross-modal semantic mapping, and it is also of significance for brain-like computing with non-von Neumann structures.
On September 5, 2015, the State Council of the Chinese Government, China's cabinet, formally announced its Action Framework for Promoting Big Data (***, 2015). This is a milestone for China in catching up with the global wave o...