Text summarization is an essential area in text mining, which has procedures for text extraction. In natural language processing, text summarization maps the documents to a representative set of descriptive keywords; the objective of text extraction is to attain reduced expressive contents from the text documents. Text summarization has two main areas, abstractive and extractive summarization. Extractive text summarization has two further approaches, in which the first approach applies the sentence score algorithm and the second approach follows the word embedding method. However, such text extractions have limitations in providing the basic theme of the underlying documents. In this paper, we have employed text summarization by TF-IDF with PageRank keywords, the sentence score algorithm, and Word2Vec word embedding. The study compared these forms of text summarization with the actual text by calculating cosine similarity. Furthermore, TF-IDF-based PageRank keywords are extracted from the other two extractive summarizations. An intersection over these three types of TF-IDF keywords is computed to generate a more representative set of keywords for each text document. This technique generates variable-length keywords according to document diversity instead of selecting fixed-length keywords for each document. This form of abstractive summarization improves metadata similarity to the original text compared to all other forms of summarized text. It also solves the issue of deciding the number of representative keywords for a specific text document. To evaluate the technique, the study used a sample of more than eighteen hundred text documents. The abstractive summarization follows the principles of deep learning to create uniform similarity of the extracted words with the actual text and all other forms of text summarization. The proposed technique provides a stable measure of similarity compared to existing forms of text summarization.
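As a rough illustration of the pipeline sketched in this abstract (not the authors' code), the snippet below ranks a document's words with PageRank over a sentence-level co-occurrence graph, intersects several keyword sets to obtain a variable-length representative set, and scores a summary against the original text with TF-IDF cosine similarity. It uses scikit-learn and networkx; the sample document and the two stand-in keyword sets (in place of the sentence-score and Word2Vec summaries) are purely illustrative.

```python
# Minimal sketch of the keyword-intersection idea described above.
from itertools import combinations

import networkx as nx
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def pagerank_keywords(text, top_k=None):
    """Rank words of one document by PageRank over co-occurrence of words
    appearing in the same sentence."""
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    graph = nx.Graph()
    for sentence in sentences:
        words = set(sentence.lower().split())
        graph.add_edges_from(combinations(words, 2))
    scores = nx.pagerank(graph)
    ranked = sorted(scores, key=scores.get, reverse=True)
    return ranked if top_k is None else ranked[:top_k]

def summary_similarity(original, summary):
    """Cosine similarity between TF-IDF vectors of the original text and a summary."""
    matrix = TfidfVectorizer().fit_transform([original, summary])
    return cosine_similarity(matrix[0], matrix[1])[0, 0]

document = ("Text summarization reduces a document to its most descriptive words. "
            "Extractive summarization selects sentences, while abstractive "
            "summarization generates new descriptive keywords.")

# Variable-length keyword set: intersect PageRank keywords with keywords taken
# from two (here, dummy) extractive summaries, as the abstract describes.
keywords_full = set(pagerank_keywords(document))
keywords_sentence_score = set(pagerank_keywords(document, top_k=10))   # stand-in
keywords_word2vec = set(pagerank_keywords(document, top_k=15))         # stand-in
representative = keywords_full & keywords_sentence_score & keywords_word2vec

print(sorted(representative))
print(summary_similarity(document, " ".join(representative)))
```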
Artificial Intelligence (AI) is a vast field that allows the development of programs capable of simulating human intelligence. One of the most used AI techniques that is very important is the preparation of raw data whic...
Purpose: Grid computing, cloud computing (CC), utility computing and software as a service are emerging technologies predicted to result in massive consolidation as meta-level computing services of everything beneath ...
Refining 3D aorta segmentation is essential for clinical aorta analysis. The small tubular diameter of the aorta branches and the discontinuity of neighboring information make it difficult to get a continuous semantic...
An improved swarm optimization method based on particle swarm optimization (PSO) and simplified swarm optimization (SSO) is proposed to adjust the weights in an artificial neural network. This method is a modification of t...
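The abstract is truncated, so the sketch below is not the proposed PSO-SSO hybrid; it only illustrates the general setup it builds on: a plain particle swarm optimizer adjusting the weights of a tiny one-hidden-layer network. The network size, PSO coefficients, and toy data are all assumptions.

```python
# Plain PSO tuning the weights of a 2-4-1 feedforward network (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
X = rng.random((50, 2))
y = (X.sum(axis=1) > 1.0).astype(float)            # toy target

N_HIDDEN = 4
DIM = 2 * N_HIDDEN + N_HIDDEN + N_HIDDEN + 1        # weights + biases of the 2-4-1 net

def mse(weights):
    """Unpack a flat weight vector into the network and return its mean squared error."""
    w1 = weights[:2 * N_HIDDEN].reshape(2, N_HIDDEN)
    b1 = weights[2 * N_HIDDEN:3 * N_HIDDEN]
    w2 = weights[3 * N_HIDDEN:4 * N_HIDDEN]
    b2 = weights[-1]
    hidden = np.tanh(X @ w1 + b1)
    out = hidden @ w2 + b2
    return np.mean((out - y) ** 2)

# Standard PSO update: v = w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x)
n_particles, inertia, c1, c2 = 30, 0.7, 1.5, 1.5
pos = rng.uniform(-1, 1, (n_particles, DIM))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([mse(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)]

for _ in range(200):
    r1, r2 = rng.random((2, n_particles, DIM))
    vel = inertia * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([mse(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)]

print("best MSE:", pbest_val.min())
```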
This work presents an efficient implementation of affinity propagation (AP) on clusters of graphical processing units (GPUs). AP is a state-of-the-art method for finding exemplars in data sets described by similarity ...
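The following is not the GPU-cluster implementation the abstract describes; it is a small CPU example using scikit-learn's AffinityPropagation, included only to show what "finding exemplars in data sets described by similarities" looks like in practice. The synthetic blobs are illustrative.

```python
# CPU affinity propagation example: exemplars are chosen from the data points themselves.
import numpy as np
from sklearn.cluster import AffinityPropagation

rng = np.random.default_rng(0)
# Three blobs; AP should pick roughly one exemplar (cluster centre) per blob.
points = np.vstack([rng.normal(loc, 0.3, size=(30, 2)) for loc in (0.0, 3.0, 6.0)])

ap = AffinityPropagation(random_state=0).fit(points)
print("exemplar indices:", ap.cluster_centers_indices_)
print("number of clusters:", len(ap.cluster_centers_indices_))
```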
Lucky imaging is a high-resolution astronomical image recovery technique with two classic implementation algorithms: image selecting, shifting and adding in image space, and data selecting and image synthesizing in Fourier space. This paper proposes a novel lucky imaging algorithm in which, with the space-domain and frequency-domain selection rates as a link, the two classic algorithms are combined successfully, making each algorithm a proper subset of the novel hybrid algorithm. Experimental results show that, with the same experimental dataset and platform, the high-resolution image obtained by the proposed algorithm is superior to that obtained by the two classic algorithms. The paper also proposes a new lucky image selection and storage scheme, which can greatly save computer memory and enable the lucky imaging algorithm to be implemented on a common desktop or laptop with small memory and to process astronomical images with more frames and larger sizes. In addition, through simulation analysis, this paper discusses the binary star detection limits of the novel lucky imaging algorithm and the traditional ones under different atmospheric conditions.
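For readers unfamiliar with the first of the two classic algorithms, here is a minimal sketch (not the paper's implementation) of image-space lucky imaging: rank short-exposure frames by a sharpness proxy (peak brightness), keep the best fraction given by the selection rate, shift each kept frame so its peak aligns with the image centre, and average. The random frames below stand in for real speckle data.

```python
# Classic select-shift-add lucky imaging, sketched with NumPy.
import numpy as np

def lucky_shift_and_add(frames, selection_rate=0.1):
    """frames: array of shape (n_frames, h, w); returns the stacked image."""
    n_frames, h, w = frames.shape
    peaks = frames.reshape(n_frames, -1).max(axis=1)     # sharpness proxy per frame
    n_keep = max(1, int(n_frames * selection_rate))
    best = np.argsort(peaks)[::-1][:n_keep]              # indices of the sharpest frames

    stacked = np.zeros((h, w))
    for idx in best:
        r, c = np.unravel_index(np.argmax(frames[idx]), (h, w))
        # shift the frame so its brightest speckle lands at the image centre
        stacked += np.roll(frames[idx], (h // 2 - r, w // 2 - c), axis=(0, 1))
    return stacked / n_keep

rng = np.random.default_rng(0)
frames = rng.random((200, 64, 64))                       # stand-in for short exposures
image = lucky_shift_and_add(frames, selection_rate=0.05)
print(image.shape)
```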
In this work we propose to initialize rectifier neural networks with random projection matrices. We focus on Convolutional Neural Networks and fully-connected networks with pretraining. Our results show that in convo...
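A minimal sketch of the idea, not the authors' exact scheme: a fully connected rectifier layer whose weight matrix is a scaled Gaussian random projection. The layer sizes and scaling below are illustrative assumptions.

```python
# Gaussian random projection used as a weight initializer for a ReLU layer.
import numpy as np

def random_projection_init(n_in, n_out, seed=0):
    """Gaussian random projection matrix used as the initial weight matrix."""
    rng = np.random.default_rng(seed)
    # i.i.d. N(0, 1/n_out) entries approximately preserve pairwise distances
    # (Johnson-Lindenstrauss style), which is the intuition behind this initializer.
    return rng.normal(0.0, 1.0 / np.sqrt(n_out), size=(n_in, n_out))

W = random_projection_init(784, 256)
x = np.random.default_rng(1).random((32, 784))     # a batch of flattened inputs
hidden = np.maximum(x @ W, 0.0)                    # rectifier (ReLU) activation
print(hidden.shape)                                # (32, 256)
```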
A novel hybrid algorithm, Quantum Immune (QI), which combines the Quantum Algorithm (QA) and the Immune Clonal Selection (ICS) Algorithm, has been presented for dealing with multi-extremum and multi-parameter problems based on AB...
In recent years, information extraction from tweets has been challenging for researchers in the fields of knowledge discovery and data mining. Unlike formal text, such as news articles and pieces of longer content, tw...