Probabilistic Data Association (PDA) is a Bayesian approach to deriving a likelihood function for all measurements reported by a radar. Tracking a launch vehicle with a skin-mode radar involves processing multiple returns from the rocket body, separated stages, and rocket plumes, which makes track-to-measurement association more uncertain. Using PDA, we reduce this uncertainty by computing a posterior probability for every measurement reported by the radar in real time. The measurement with the highest posterior probability is selected and given as input to the processing filter. In this paper we analyze PDA performance during tracking of a launch vehicle by a multi-object tracking radar.
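As a simplified sketch of the posterior computation this abstract describes, assuming Gaussian measurement likelihoods, a single target, and an illustrative clutter term (the parameter values and function name are assumptions, not the paper's):

```python
import numpy as np

def pda_posteriors(z_pred, S, measurements, P_D=0.9, clutter=1e-4):
    """Posterior association probabilities for a single target.

    z_pred:       predicted measurement (innovation mean)
    S:            innovation covariance
    measurements: measurement vectors from the current radar scan
    Returns an array: index 0 is the 'no measurement originated from the
    target' hypothesis, index i is the posterior for measurement i.
    """
    S_inv = np.linalg.inv(S)
    norm = 1.0 / np.sqrt((2 * np.pi) ** len(z_pred) * np.linalg.det(S))
    # Gaussian likelihood of each measurement given the predicted state
    lik = np.array([norm * np.exp(-0.5 * (z - z_pred) @ S_inv @ (z - z_pred))
                    for z in measurements])
    # prepend the clutter/missed-detection hypothesis, then normalize
    weights = np.concatenate(([clutter * (1.0 - P_D)], P_D * lik))
    return weights / weights.sum()
```

Selecting the measurement with the highest posterior, as the abstract describes, is then just an `argmax` over the returned array.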
Food wastage, and capturing lineage from production to consumption, is a growing concern. Cultivation, storage, and transportation have evolved considerably alongside manufacturing and automation, leading to technical advances in the food processing industry. Even so, losses occur in crop production; some are minimal and ignored, but in other cases they are large and pose a threat to both producers and consumers. Here we consider data on dairy products and analyse production losses, especially those incurred while processing them in the treating unit. A review of the relevant parameters, and the associated data analysis in graphical form, are provided in the appropriate sections of the paper. Linear regression and correlation were applied with a view to incorporating machine learning techniques for understanding production losses. Karl Pearson's correlation quantifies the association between parameters, which should ideally be weakly coupled when the proposed newer methodology is employed.
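As a minimal sketch of the regression-and-correlation analysis described above, using invented figures (the paper's actual dairy data and parameter names are not reproduced here):

```python
import numpy as np

# Hypothetical figures: processing temperature in a dairy treating unit vs.
# production loss (%). Values are illustrative, not the paper's data.
temperature = np.array([62.0, 65.0, 68.0, 71.0, 74.0, 77.0])
loss_pct = np.array([1.1, 1.4, 1.9, 2.4, 3.0, 3.5])

# Karl Pearson's correlation coefficient between the two parameters
r = np.corrcoef(temperature, loss_pct)[0, 1]

# Ordinary least-squares linear regression: loss ~ slope * temperature + intercept
slope, intercept = np.polyfit(temperature, loss_pct, 1)
```

A correlation near 1 indicates tightly coupled parameters; the abstract argues that, under the proposed methodology, the parameters should ideally be weakly coupled.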
This paper proposes a security and privacy protection strategy for digital campus cloud data storage in the Internet of Things environment. Campus users first apply for an attribute key from the trust center and then outsource their encrypted private data to the cloud server. An authorized user generates keyword trapdoors from the attribute key; only when the user's attributes satisfy the specified access control tree is the user allowed to search the encrypted cloud data through this trapdoor. A homomorphic encryption algorithm based on ECC is proposed, exploiting its high security and low computational complexity, and is applied to cloud-storage ciphertext processing and to privacy protection of cloud computing data aggregation. The results show that the scheme can not only ensure the security of data storage but also protect the identity privacy of campus users.
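As a toy illustration of the additive homomorphism an ECC-based scheme of this kind relies on, here is textbook EC-ElGamal over a tiny curve; the curve, keys, and function names are illustrative only and far too small to be secure, and the paper's actual construction is not shown:

```python
P_MOD, A = 17, 2          # toy curve y^2 = x^3 + 2x + 2 over GF(17) -- NOT secure
G = (5, 1)                # base point; its order on this curve is 19

def _inv(x):
    return pow(x, P_MOD - 2, P_MOD)

def ec_add(P, Q):
    # group law on the curve; None represents the point at infinity
    if P is None: return Q
    if Q is None: return P
    if P[0] == Q[0] and (P[1] + Q[1]) % P_MOD == 0:
        return None
    if P == Q:
        lam = (3 * P[0] * P[0] + A) * _inv(2 * P[1]) % P_MOD
    else:
        lam = (Q[1] - P[1]) * _inv(Q[0] - P[0]) % P_MOD
    x = (lam * lam - P[0] - Q[0]) % P_MOD
    return (x, (lam * (P[0] - x) - P[1]) % P_MOD)

def ec_mul(k, P):
    # double-and-add scalar multiplication
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

d = 7                     # secret key
Qpub = ec_mul(d, G)       # public key

def enc(m, k):
    # additively homomorphic EC-ElGamal: Enc(m) = (kG, mG + k*Qpub)
    return (ec_mul(k, G), ec_add(ec_mul(m, G), ec_mul(k, Qpub)))

def add_ct(c1, c2):
    # component-wise point addition yields an encryption of m1 + m2
    return (ec_add(c1[0], c2[0]), ec_add(c1[1], c2[1]))

def dec(c):
    # recover mG = C2 - d*C1, then solve the (tiny) discrete log by search
    S = ec_mul(d, c[0])
    negS = None if S is None else (S[0], (P_MOD - S[1]) % P_MOD)
    mG = ec_add(c[1], negS)
    return next(m for m in range(19) if ec_mul(m, G) == mG)
```

Adding two ciphertexts and decrypting yields the sum of the plaintexts, which is the property that makes privacy-preserving data aggregation in the cloud possible.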
The growing availability of smart meter data from households is contributing to the digitalization of the energy sector and driving its transformation into new business concepts, such as flexibility markets. This data...
Nowadays, people's lives are increasingly inseparable from big data. The development of big data has a profound impact on human society, and the generation of massive data has driven a transformation of data processing methods. Cloud computing emerged from this trend: it provides online cloud services to network users through a large shared platform, achieving on-demand allocation of resources. Data centers are the core infrastructure of cloud computing, so the performance of the data center network determines the quality of service that cloud computing can provide. However, traditional data center networks have long been unable to meet society's needs for large data centers. This paper surveys the new data center network structures proposed in recent years, grouped as switch-centric, server-centric, wireless, and hybrid wired-wireless designs; introduces their advantages in performance, fault-tolerant routing efficiency, and other respects; and looks ahead to the future development of data center network structures.
iTrace is a community eye-tracking infrastructure that enables conducting eye-tracking studies within an Integrated Development Environment (IDE). It consists of a set of tools for gathering eye-tracking data on large, real software projects within an IDE during studies on source code. Once the raw eye-tracking data is collected, processing is necessary before it can be used for analysis. Rather than providing the raw data for researchers to analyze with their own custom scripts, we introduce iTrace-Toolkit, a suite of tools that assists with combining the different data files generated by iTrace and its IDE plugins (namely Visual Studio, Atom, and Eclipse). iTrace-Toolkit also provides the crucial mapping of valid raw eye-tracking data to source code tokens and finally generates fixations (an important metric in eye-tracking for comprehension) using three commonly used algorithms based on distance and velocity of eye movements. iTrace-Toolkit keeps track of all participant data and tasks during a given study and produces a complete, lightweight database of the raw, mapped, and fixation data that is standardized and ready to be used by statistical tools. A simple GUI is provided for quickly filtering the data after an eye-tracking study. iTrace-Toolkit also allows exporting the data, or a subset of it, to text formats for further statistical processing. YouTube Video: https://***/watchv9j20sOANh8w
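One widely used dispersion-based fixation algorithm of the kind the abstract mentions is I-DT; the sketch below is a generic textbook version with illustrative thresholds, not iTrace-Toolkit's actual implementation or defaults:

```python
def _dispersion(pts):
    # dispersion of a window: horizontal spread plus vertical spread
    xs = [p[0] for p in pts]
    ys = [p[1] for p in pts]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def idt_fixations(gaze, max_dispersion=35.0, min_samples=5):
    """gaze: list of (x, y) gaze points sampled at a fixed rate.
    Returns (start_index, end_index, centroid) for each detected fixation."""
    fixations, i = [], 0
    while i + min_samples <= len(gaze):
        j = i + min_samples
        if _dispersion(gaze[i:j]) <= max_dispersion:
            # grow the window while dispersion stays under the threshold
            while j < len(gaze) and _dispersion(gaze[i:j + 1]) <= max_dispersion:
                j += 1
            pts = gaze[i:j]
            cx = sum(p[0] for p in pts) / len(pts)
            cy = sum(p[1] for p in pts) / len(pts)
            fixations.append((i, j - 1, (cx, cy)))
            i = j
        else:
            i += 1
    return fixations
```

Each fixation centroid can then be mapped to the source code token under it, which is the raw-to-token mapping step the toolkit automates.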
For the adversarial multi-armed bandit problem with delayed feedback, we consider that the delayed feedback results are from multiple users and are unrestricted on internal distribution. As the player picks an arm, fe...
Blockchain can serve as a new kind of distributed database system and can be applied in areas such as carbon footprint tracking. However, its practical application generally faces problems such as limited query functionality and low query performance due to the limitations of its data storage model. Therefore, for blockchain applications in carbon footprint tracking, query performance optimization is a research topic that cannot be ignored. In this paper, we first introduce the currently popular blockchain query schemes. Then we adopt a query optimization method based on MySQL, an external database, to support more complex and efficient queries. Finally, we verify the effectiveness and feasibility of the proposed scheme through experiments.
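A minimal sketch of the external-database pattern the abstract adopts: each block's records are mirrored into a relational index at commit time so that range and aggregate queries run in SQL instead of traversing the chain. SQLite stands in for MySQL here so the example is self-contained, and the table and field names are invented:

```python
import hashlib
import json
import sqlite3

# relational index mirroring the chain's carbon-footprint records
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE carbon_records (
    block_hash TEXT, product_id TEXT, stage TEXT, co2_kg REAL)""")

def append_block(prev_hash, records):
    # hash-chain the block, then mirror its records into the SQL index
    payload = json.dumps(records, sort_keys=True)
    block_hash = hashlib.sha256((prev_hash + payload).encode()).hexdigest()
    db.executemany("INSERT INTO carbon_records VALUES (?, ?, ?, ?)",
                   [(block_hash, r["product_id"], r["stage"], r["co2_kg"])
                    for r in records])
    db.commit()
    return block_hash

h = append_block("genesis", [
    {"product_id": "P1", "stage": "production", "co2_kg": 12.5},
    {"product_id": "P1", "stage": "transport", "co2_kg": 3.1},
])
h = append_block(h, [{"product_id": "P1", "stage": "retail", "co2_kg": 0.9}])

# an aggregate query served by SQL rather than by chain traversal
total = db.execute("SELECT SUM(co2_kg) FROM carbon_records"
                   " WHERE product_id = 'P1'").fetchone()[0]
```

The trade-off is that the external index must be kept consistent with the chain, which is where such schemes concentrate their verification effort.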
With the digital transformation of vertical industries, represented by the power grid, in the 5G era, meeting the communication needs of new services in power grid application scenarios, and adapting to the comprehensive perception and intelligent interconnection of the future power grid, urgently require improved intelligence in the power Internet of Things with cloud-edge collaboration. Under the 5G architecture, effectively protecting the business data of communication terminals has become a focus for developers in the security industry. To address illegal device intrusion and secure data transmission, this paper designs a 5G-based big data security access processing device. In this device, the access terminal authentication module authenticates user identity through audio/video, QR codes, and other technologies; the data trust verification module confirms data security through whitelisting, data scanning, and other methods; and the data software and hardware encryption module encrypts data using an encryption chip, the IPsec protocol, and related technologies. Combining these technologies realizes the secure transmission of network data. In addition, throughput tests of the encryption algorithm show that the smaller the key size, the greater the throughput.
Plant disease detection is a crucial task in agriculture for ensuring healthy crop production, and identifying plant diseases early is vital to averting economic and environmental damage. In this work, a machine learning approach including deep learning techniques for identifying plant disease was developed using grape leaf images. Inception v3, SqueezeNet, VGG-16, and VGG-19 are used as embedders in the suggested method to convert raw image data into a fixed-length vector representation, also known as an "embedding" or "feature vector." Our research concludes that the CNN stands out with exceptional accuracy above 99.70% compared to classifiers such as Naive Bayes, Logistic Regression, and SVM. Additionally, Inception v3 proved to be an efficient feature extractor, enhancing feature representation and reducing processing time.
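The embed-then-classify pipeline described above can be sketched as follows. A pretrained CNN such as Inception v3 would map each grape-leaf image to a fixed-length feature vector; here synthetic vectors with class-dependent means stand in for real embeddings, and a nearest-centroid rule stands in for the paper's classifiers, so the example runs without image data or model weights:

```python
import numpy as np

rng = np.random.default_rng(0)
n, dim, n_classes = 100, 64, 4      # Inception v3 actually emits 2048-d vectors
# synthetic "embeddings": one Gaussian cluster per stand-in disease class
X = np.vstack([rng.normal(loc=c, size=(n, dim)) for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n)

# 75/25 train/test split by index
train = np.arange(len(y)) % 4 != 0

# simple downstream classifier: nearest class centroid in embedding space
centroids = np.array([X[train & (y == c)].mean(axis=0)
                      for c in range(n_classes)])
dists = np.linalg.norm(X[~train][:, None, :] - centroids[None], axis=2)
accuracy = (dists.argmin(axis=1) == y[~train]).mean()
```

The same split of responsibilities applies with real data: the embedder fixes the representation once, and swapping the downstream classifier (Naive Bayes, Logistic Regression, SVM, or a small CNN head) is cheap by comparison.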