The rapid development of artificial intelligence has brought with it both the artificial intelligence threat theory and the problem of how to evaluate the intelligence level of intelligent products. Both need to find a q...
ISBN:
(Print) 9781450335317
Large-scale matrix computation has become essential for many data applications, and hence the problem of sketching a matrix with small space and high precision has received extensive study in the past few years. This problem is often considered in the row-update streaming model, where the data set is a matrix A ∈ R^(n×d) and the processor receives one row (1 × d) of A at each timestamp. The goal is to maintain a smaller matrix (termed the approximation matrix, or simply approximation) B ∈ R^(ℓ×d) with ℓ ≪ n as an approximation to A, such that the covariance error ‖AᵀA − BᵀB‖ is small. This paper studies continuous tracking of approximations to the matrix defined by a sliding window of the most recent rows. We consider both sequence-based and time-based windows. We show that maintaining AᵀA exactly requires linear space in the sliding window model, as opposed to O(d²) space in the streaming model. With this observation, we present three general frameworks for matrix sketching on sliding windows. The sampling techniques give random samples of the rows in the window according to their squared norms. The Logarithmic Method converts a mergeable streaming matrix sketch into a matrix sketch on time-based sliding windows. The Dyadic Interval framework converts an arbitrary streaming matrix sketch into a matrix sketch on sequence-based sliding windows. In addition to proving all algorithmic properties theoretically, we also conduct an extensive empirical study with real data sets to demonstrate the efficiency of these algorithms.
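The sampling framework above keeps rows with probability proportional to their squared norms. A minimal sketch of that sampling primitive (outside any windowing logic; the function name and toy data are illustrative, not from the paper):

```python
import random

def sample_rows_by_squared_norm(rows, k, seed=0):
    """Draw k rows (with replacement) with probability proportional to
    their squared Euclidean norms -- the norm-based sampling primitive
    underlying the sampling framework."""
    rng = random.Random(seed)
    weights = [sum(x * x for x in row) for row in rows]
    return rng.choices(rows, weights=weights, k=k)

rows = [[1.0, 0.0], [0.0, 3.0], [0.1, 0.1]]
sample = sample_rows_by_squared_norm(rows, k=4)
```

Rows with larger squared norm (here [0.0, 3.0], weight 9.0) dominate the sample, which is what makes the resulting sketch a good estimator of AᵀA.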
In this paper, an application mode and method of knowledge management in remanufacturing engineering management is established based on Nonaka's SECI model. The relationships between knowledge transfer, knowledge sharing and remanufacturing engineering management are highlighted. Notably, remanufacturing engineering involves a great deal of knowledge transfer and sharing activities, which can continually improve the performance of remanufacturing engineering management.
In this study, we describe the intersection of three conics based on the singularities of their corresponding Jacobian curve. In particular, we show that certain singular points and sub-lines of the Jacobian curve are precisely the common points and common tangent lines of the conics, respectively. Based on our results, these common points and tangent lines can be computed as singularities of the Jacobian curve. These results facilitate investigations of the relationships between a net of conics and its Jacobian curve. (C) 2015 Elsevier Ltd. All rights reserved.
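The Jacobian curve of three conics Q_i(p) = pᵀM_i p is the vanishing locus of the determinant of their three gradients. A small numeric check of the relationship the abstract describes, using three conics chosen for illustration (these example conics are not from the paper): a common point of all three conics lies on the Jacobian curve.

```python
def grad(M, p):
    # Gradient of the quadratic form p^T M p is 2 M p (M symmetric).
    return [2 * sum(M[i][j] * p[j] for j in range(3)) for i in range(3)]

def det3(a, b, c):
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
          - a[1] * (b[0] * c[2] - b[2] * c[0])
          + a[2] * (b[0] * c[1] - b[1] * c[0]))

# Three conics in P^2: x^2 - z^2, y^2 - z^2, xy - z^2 (illustrative).
M1 = [[1, 0, 0], [0, 0, 0], [0, 0, -1]]
M2 = [[0, 0, 0], [0, 1, 0], [0, 0, -1]]
M3 = [[0, 0.5, 0], [0.5, 0, 0], [0, 0, -1]]

def conic(M, p):
    return sum(p[i] * M[i][j] * p[j] for i in range(3) for j in range(3))

def jacobian_curve(p):
    # Determinant of the three gradients: zero iff p is on the Jacobian curve.
    return det3(grad(M1, p), grad(M2, p), grad(M3, p))

p = [1.0, 1.0, 1.0]   # a common point of all three conics
val = jacobian_curve(p)
```

At p = (1 : 1 : 1) all three conics vanish, the three gradients are linearly dependent, and the Jacobian determinant is zero, consistent with common points appearing as points of the Jacobian curve.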
In this paper, we develop an extended model for the project portfolio selection problem over a planning horizon with multiple time periods. The model incorporates the factors of project divisibility and interdependency at the same time for real-life applications. Project divisibility is treated as a strategy for choosing the best execution schedule for the projects, not as an unfortunate event as in the literature, and the classical concept of "project interdependencies" among fully executed projects is then extended to portions of executed projects. Additional constraints covering reinvestment, setup cost, cardinality restriction, precedence relationship and scheduling are also included in the model. For efficient computation, an equivalent mixed integer linear programming representation of the proposed model is derived. Numerical examples under four scenarios are presented to highlight the characteristics of the proposed model. In particular, the positive effects of project divisibility are shown for the first time.
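The benefit of divisibility can be seen even in a drastically simplified setting with a single budget and no interdependencies (a toy illustration, not the paper's MILP model): allowing partial execution can only raise the attainable value.

```python
from itertools import combinations

def best_integral(projects, budget):
    """Best total value over all-or-nothing project selections (exhaustive)."""
    best = 0.0
    for r in range(len(projects) + 1):
        for combo in combinations(projects, r):
            cost = sum(c for c, _ in combo)
            if cost <= budget:
                best = max(best, sum(v for _, v in combo))
    return best

def best_divisible(projects, budget):
    """With divisible projects this reduces to fractional knapsack,
    where greedy by value density is optimal."""
    best, remaining = 0.0, budget
    for cost, value in sorted(projects, key=lambda p: p[1] / p[0], reverse=True):
        take = min(1.0, remaining / cost)   # fraction of the project executed
        best += take * value
        remaining -= take * cost
        if remaining <= 0:
            break
    return best

projects = [(6.0, 9.0), (5.0, 7.0), (4.0, 5.0)]   # (cost, value), hypothetical
budget = 10.0
```

Here the indivisible optimum is 14.0 (projects 1 and 3), while executing project 1 fully and 80% of project 2 yields 14.6, the kind of gain from divisibility the paper's richer model captures alongside interdependencies and scheduling.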
Authors:
Shen, Jiangjian; Long, Wen
Chinese Acad Sci, Res Ctr Fictitious Econ & Data Sci, Beijing 100190, Peoples R China
UCAS, Sch Econ & Management, Beijing 100190, Peoples R China
Chinese Acad Sci, Key Lab Big Data Min & Knowledge Management, Beijing 100190, Peoples R China
This article uses the Markov regime-switching ARCH (SWARCH) model to study the volatility of Chinese stock market industry sectors, finding that all industry sectors can be significantly divided into two regimes: a high-volatility regime and a low-volatility regime. Based on their regime-transition behavior, all sectors can be classified into three categories. The article further analyzes the regime characteristics of the industry sectors. The results show that the correlation coefficient in the high-volatility regime is higher than that in the low-volatility regime. (C) 2016 Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license.
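The data-generating process behind SWARCH can be sketched as an ARCH variance scaled by a regime factor driven by a two-state Markov chain. A simplified simulator (all parameter values are illustrative, not estimates from the article):

```python
import random, math

def simulate_swarch(n, p_stay=0.95, a0=0.1, a1=0.3, g=(1.0, 4.0), seed=42):
    """Simulate a simplified two-regime SWARCH process: ARCH(1) variance
    h_t = g[s_t] * (a0 + a1 * e_{t-1}^2), where the regime s_t follows a
    two-state Markov chain that stays put with probability p_stay."""
    rng = random.Random(seed)
    s, e_prev = 0, 0.0
    returns, regimes = [], []
    for _ in range(n):
        if rng.random() > p_stay:          # regime switch
            s = 1 - s
        h = g[s] * (a0 + a1 * e_prev ** 2)
        e = math.sqrt(h) * rng.gauss(0.0, 1.0)
        returns.append(e)
        regimes.append(s)
        e_prev = e
    return returns, regimes

returns, regimes = simulate_swarch(2000)
```

With g[1] four times g[0], the sample variance of returns generated in regime 1 exceeds that of regime 0, mirroring the high/low volatility regimes the article identifies in sector returns.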
In this paper we study the problem of learning discriminative features (segments), often referred to as shapelets [Ye and Keogh, 2009], of time series from unlabeled time series data. Discovering shapelets for time series classification has been widely studied, and many search-based algorithms have been proposed to efficiently scan and select segments from a pool of candidates. However, such search-based algorithms may incur high time cost when the segment candidate pool is large. Alternatively, a recent work [Grabocka et al., 2014] uses regression learning to directly learn, instead of searching for, shapelets from time series. Motivated by the above observations, we propose a new Unsupervised Shapelet Learning Model (USLM) to efficiently learn shapelets from unlabeled time series data. The corresponding learning function integrates the strengths of pseudo-class labels, spectral analysis, a shapelet regularization term and regularized least-squares to simultaneously learn shapelets, pseudo-class labels and classification boundaries. A coordinate descent algorithm is used to iteratively solve the learning function. Experiments show that USLM outperforms search-based algorithms on real-world time series data.
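The core primitive shared by both the search-based and learning-based approaches above is the distance between a time series and a candidate shapelet: the minimum distance over all sliding windows. A minimal version (the hard minimum; learning-based methods like [Grabocka et al., 2014] use a smooth approximation so it can be differentiated):

```python
def shapelet_distance(series, shapelet):
    """Distance between a time series and a shapelet: the minimum
    length-normalized squared Euclidean distance over all sliding
    windows of the series."""
    L = len(shapelet)
    best = float("inf")
    for start in range(len(series) - L + 1):
        d = sum((series[start + j] - shapelet[j]) ** 2 for j in range(L)) / L
        best = min(best, d)
    return best

series = [0.0, 0.0, 1.0, 2.0, 1.0, 0.0, 0.0]
bump = [1.0, 2.0, 1.0]    # pattern that occurs in the series
flat = [5.0, 5.0, 5.0]    # pattern that does not
```

A shapelet that matches a subsequence exactly gets distance 0; discriminative shapelets are those whose distances separate the (pseudo-)classes well, which is exactly what USLM's objective optimizes over.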
In large-scale learning problems, the scalability of learning algorithms is usually the key factor affecting practical performance, determined by both the time complexity of the learning algorithm and the amount of supervision information (i.e., labeled data). Learning with label proportions (LLP) is a new kind of machine learning problem which has drawn much attention in recent years. Different from well-known supervised learning, LLP can estimate a classifier from groups of weakly labeled data, where only the positive/negative class proportions of each group are known. Due to its weak requirements on the input data, LLP has a variety of real-world applications in almost all fields involving anonymized data, such as computer vision, fraud detection and spam filtering. However, even though the amount of required labeled data is very small, LLP still suffers from long execution times due to the high time complexity of the learning algorithm itself. In this paper, we propose a very fast learning method based on inverting the output scaling process and extreme learning machines, namely the Inverse Extreme Learning Machine (IELM), to address the above issues. IELM can speed up the training process by orders of magnitude for large datasets, while achieving classification accuracy highly competitive with existing methods. Extensive experiments demonstrate the significant speedup of the proposed method. We also demonstrate the feasibility of IELM with a case study in a real-world setting: modeling image attributes based on the ImageNet Object Attributes dataset.
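IELM's speed comes from building on the extreme learning machine, whose training is a single least-squares solve rather than iterative optimization. A sketch of that standard ELM step (this is the building block only, not the paper's output-scaling inversion; the toy problem and parameters are illustrative):

```python
import numpy as np

def train_elm(X, y, n_hidden=20, seed=0):
    """Basic Extreme Learning Machine: random, fixed input weights,
    tanh hidden layer, output weights fitted by least squares."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                      # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # closed-form output weights
    return W, b, beta

def predict_elm(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy binary problem: label is the sign of x0 + x1.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
W, b, beta = train_elm(X, y)
acc = np.mean((predict_elm(X, W, b, beta) > 0.5) == (y > 0.5))
```

Because the only fitted parameters are the linear output weights, training cost is dominated by one least-squares solve, which is why ELM-style methods scale to large datasets.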
Geological disaster recognition, especially landslide recognition, is of vital importance in disaster prevention, disaster monitoring and other applications. As more and more optical remote sensing images have become available in recent years, landslide recognition on optical remote sensing images is in demand. Therefore, in this paper, we propose a deep learning based landslide recognition method for optical remote sensing images. In order to capture more distinct features hidden in landslide images, a particular wavelet transformation is used as the preprocessing method. Next, a corrupting-and-denoising method is proposed to enhance the robustness of the model in recognizing landslide features. Then, a deep auto-encoder network with multiple hidden layers is proposed to learn the high-level features and representations of each image. A softmax classifier is used for class prediction. Experiments are conducted on remote sensing images from Google Earth. The experimental results indicate that the proposed wav DAE method outperforms state-of-the-art classifiers in both efficiency and accuracy.
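To make the wavelet preprocessing step concrete: a wavelet transform splits a signal into a coarse approximation and local-change details, which is what exposes edge-like landslide features to the auto-encoder. One level of the 1-D Haar transform, the simplest instance (the paper's particular 2-D wavelet is not specified, so this is illustration only):

```python
import math

def haar_1d(signal):
    """One level of the 1-D Haar wavelet transform: pairwise scaled
    averages (approximation) and differences (detail).  The 1/sqrt(2)
    scaling makes the transform orthonormal (energy preserving)."""
    assert len(signal) % 2 == 0
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

approx, detail = haar_1d([4.0, 4.0, 2.0, 0.0])
```

The flat pair (4, 4) yields zero detail while the edge pair (2, 0) yields a nonzero detail coefficient; in 2-D the same idea highlights texture and boundary changes in the imagery.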
Authors:
Yang, Zhuofan; Shi, Yong
Chinese Acad Sci, Management Sch, Univ Chinese Acad Sci, Res Ctr Fictitious Econ & Data Sci, Beijing, Peoples R China
Chinese Acad Sci, Res Ctr Fictitious Econ & Data Sci, Key Res Lab Big Data Min & Knowledge Management, Beijing, Peoples R China
ISBN:
(Print) 9781509047710
This paper examines the importance of website quality indicators in e-commerce efficiency evaluation. It applies Data Envelopment Analysis (DEA) models and traditional production theory to account for how scale affects efficiency in pure e-commerce firms. The results of the DEA models indicate that website quality, user scale and user experience each contribute to improving operations efficiency, but scale expansion often leads to overall operations inefficiency. Due to excessive scale expansion, all inefficient Decision Making Units exhibit decreasing returns to scale. These findings may help managers resize their operations. This research enriches scale economy theory and improves the indicator system for e-commerce efficiency evaluation.
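The DEA efficiency score compares each Decision Making Unit against the best observed input-to-output performance. In the degenerate single-input, single-output case this needs no linear program and reduces to a ratio comparison (the general multi-factor CCR model solved in the paper requires an LP per unit; the firm data below is hypothetical):

```python
def ccr_efficiency_single(inputs, outputs):
    """CCR-style efficiency scores for the single-input, single-output
    case: each unit's output/input ratio divided by the best ratio, so
    efficient units score 1.0 and inefficient units score below 1.0."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical e-commerce firms: input = operating cost, output = sales.
scores = ccr_efficiency_single([10.0, 20.0, 15.0], [5.0, 8.0, 9.0])
```

Firm 3 (ratio 0.6) defines the frontier and scores 1.0; firm 2's low score reflects the kind of scale-driven inefficiency the abstract describes.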