Details
ISBN (digital): 9798350376425
ISBN (print): 9798350376432
In the modern era, the internet is an emerging technology with a rapidly growing user base. Uploaders of sensitive content treat the platform as a way to transmit disturbing visual content, such as animated cartoon videos, and to circulate unsuitable material among youngsters. Previous research shows that interaction with online sensitive information has substantial offline effects on historically impoverished communities. As a result, it is strongly recommended that social media platforms adopt an automatic, real-time sensitive-content filtering method for detecting contextual information. In the proposed model, a novel deep-learning framework is developed for detecting sensitive content from contextual information. The suggested architecture operates in three stages: (i) text collection, (ii) preprocessing, and (iii) detection. Initially, the required texts from the contextual information are gathered from a standard available data resource. The collected text is then subjected to preprocessing, where it is cleaned and prepared for further analysis. Finally, in the detection phase, the developed Transformer Residual Bidirectional Long Short-Term Memory (TR-BiLSTM) network learns an effective representation from the preprocessed text and performs the sensitive-content detection. Furthermore, performance comparisons are carried out between the proposed framework and other standard approaches. The developed architecture achieved better results in detecting sensitive content from text.
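The preprocessing stage (stage ii) described above can be sketched in a few lines. This is a minimal, illustrative version under common assumptions (lowercasing, URL and punctuation removal, stopword filtering); the stopword list and cleaning rules are hypothetical, not taken from the paper.

```python
import re
import string

# Illustrative stopword list; a real pipeline would use a fuller one.
STOPWORDS = {"the", "a", "an", "is", "are", "to", "of", "and"}

def preprocess(text: str) -> list[str]:
    """Clean raw contextual text and return a token list for the detector."""
    text = text.lower()
    text = re.sub(r"https?://\S+", " ", text)                        # drop URLs
    text = text.translate(str.maketrans("", "", string.punctuation))  # drop punctuation
    return [t for t in text.split() if t not in STOPWORDS]

tokens = preprocess("The clip is posted to http://example.com and IS disturbing!")
print(tokens)  # ['clip', 'posted', 'disturbing']
```

The resulting token list would then be mapped to embeddings and fed to the TR-BiLSTM detector.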
An automatic vending machine is designed to supply people with a variety of items, such as snacks, beverages, newspapers, and tickets without any human intervention. According to the money that is deposited into a ven...
Details
The paper investigates incorporating and implementing RPA and AI technologies within NFS to improve efficiency and boost service quality. Robotic Process Automation enables the streamlining of repetitive processes: it simplifies work processes and releases personnel for more critical projects. AI, in turn, strengthens NFS providers through data-informed decision-making, and it facilitates anticipatory maintenance and preemptive network administration. The paper explores particular RPA and cognitive-computing use cases, including AI-powered network issue resolution and AI-driven preventive maintenance for network hardware. Furthermore, it examines the advantages and constraints of adopting robotic process automation and artificial intelligence within the NFS framework, considering aspects such as data handling, growth potential, and ethical implications. Future prospects for RPA and AI within the NFS domain are also analyzed, foreseeing self-governing network management, sophisticated predictive analytics, and convergence with IoT and edge-computing technologies. The article highlights the transformative capability of RPA and AI in the field-service sector, aiming for increased efficiency, reliability, and a customer-centric approach, and it emphasizes embracing these technologies to remain competitive in a dynamic business sector.
RPA and ERP systems are merging their strengths to generate a dynamic collaboration. By examining the integration between RPA (Robotic Process Automation) and ERP (Enterprise Resource Planning) systems, we can identify their shared potential for enhancing process flow. Combining automation with ERP integration makes RPA a powerful tool for process management across multiple industries. The paper examines the basics of RPA and its ERP integration, highlighting the advantages of enhanced productivity and fewer errors, while also addressing security, compliance, and change-management issues. In addition to top-down and bottom-up methods, process analysis and the selection of suitable processes are covered. Practical examples offer insightful observations on this integration's success. The union of RPA and ERP creates a powerful combination that can elevate operational effectiveness, facilitate strategic insights, and reshape the workplace.
There are numerous distinct strategies and techniques that fall under the broad class of neural-network-based deep learning in machine learning. These strategies revolve around developing and training artificial neural networks, which are computational models that mimic the structure and function of the human brain, to analyze huge datasets and make predictions or decisions based on the data. One of the key challenges in real-time data analysis is being able to process incoming data quickly and efficiently, as well as to adapt to changing data patterns or sudden events. Traditional machine learning algorithms are often limited in their ability to handle these challenges, which is where deep learning techniques come in. Deep learning techniques involve training neural networks with multiple layers, each of which processes different aspects of the data. This allows for more complex and accurate analysis of the data, as well as the ability to handle larger and more diverse datasets. One popular technique in deep learning is known as convolutional neural networks (CNNs). These networks are particularly well suited to analyzing visual data, such as images, by breaking the data down into smaller parts and analyzing them at different levels of abstraction. This allows for the identification of patterns and features in the data, which can then be used for classification or prediction tasks.
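The core CNN idea above, detecting local features by sliding a small filter over an image, can be sketched with a plain 2D convolution. This is a textbook illustration, not any specific network: the vertical-edge filter and toy image are assumptions for demonstration.

```python
import numpy as np

def conv2d(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Valid (no-padding) 2D cross-correlation of a kernel over an image."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Each output value responds to one small local patch.
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A 4x5 "image" with a vertical edge between columns 2 and 3.
img = np.array([[1, 1, 1, 0, 0]] * 4, dtype=float)
edge_filter = np.array([[1, 0, -1]] * 3, dtype=float)  # fires on vertical edges
print(conv2d(img, edge_filter))  # strong responses where the edge sits
```

In a real CNN, many such filters are learned from data, and stacking layers yields the higher levels of abstraction mentioned above.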
This article presents the results of an investigation and analysis of information on emerging technologies, with two purposes. The first is to create three reference models: an IoT reference model, an AMI reference model, and a DA reference model, applicable to the implementation of solutions to different real-life problems. The second is to present the proposal of a single integrated reference model, called Quysqua, which synergistically combines the three models mentioned above.
Details
In the current era of technology, the Internet and web technologies have become the central source of information. Due to the huge amount of content, one of the main challenges of modern information technology is how to reduce and manage information in a structured way while mobilizing users toward similar kinds of relevant information. Hence, any intelligent system should be able to understand a person's interest in a particular type of information and automatically mobilize him or her toward similar available information sources. The idea of high-level Activity Streams, along with its standardized format, can play a vital role in solving this problem in the broader sense. This paper introduces a novel system called CoASGen (Consolidation and Activity Streams Generator), which is able to automatically generate high-level Activity Streams after aggregating and consolidating activities from different independent systems (e.g., in a software company context: version management systems, wikis, bug trackers, etc.). It retrieves lifetime information as heterogeneous web feeds by sensing user activities from those independent systems and then transforms several similar types of atomic activities into high-level Activity Streams using semantic technologies along with their specific standardized format. Finally, it shows these high-level Activity Streams in the user interface, which is able to automatically motivate users to find relevant information easily without either missing any data or losing valuable time. The system solves the "data silos" problem by reducing and managing information in a structured way.
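The consolidation step behind a system like CoASGen can be sketched simply: several atomic activities of the same kind from one actor and source are merged into a single high-level stream entry. The field names below follow the general actor/verb/object convention of Activity Streams; the grouping rule is illustrative, not the paper's exact semantics.

```python
from collections import defaultdict

def consolidate(atomic_activities: list[dict]) -> list[dict]:
    """Merge atomic activities sharing (actor, verb, source) into high-level entries."""
    groups = defaultdict(list)
    for act in atomic_activities:
        groups[(act["actor"], act["verb"], act["source"])].append(act["object"])
    return [
        {"actor": a, "verb": v, "source": s, "objects": objs, "count": len(objs)}
        for (a, v, s), objs in groups.items()
    ]

# Heterogeneous feed sensed from independent systems (hypothetical data).
feed = [
    {"actor": "alice", "verb": "commit", "source": "version-control", "object": "fix-1"},
    {"actor": "alice", "verb": "commit", "source": "version-control", "object": "fix-2"},
    {"actor": "bob", "verb": "edit", "source": "wiki", "object": "page-A"},
]
streams = consolidate(feed)
print(streams)  # two high-level entries: alice's 2 commits merged, bob's edit
```

A real implementation would additionally map each entry to the standardized Activity Streams format and enrich it with semantic metadata.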
Authors:
M. Feemster, D.M. Dawson, A. Behal, W. Dixon
Matthew Feemster received the B.S. degree in Electrical Engineering from Clemson University, Clemson, South Carolina, in December 1994. Upon graduation he remained at Clemson University and received the M.S. degree in Electrical Engineering in 1997. During this time he also served as a research/teaching assistant. His research work focused on the design and implementation of various nonlinear control algorithms, with emphasis on the induction motor and mechanical systems with friction present. He is currently working toward his Ph.D. degree in Electrical Engineering at Clemson University.
Darren M. Dawson was born in 1962 in Macon, Georgia. He received an Associate Degree in Mathematics from Macon Junior College in 1982 and a B.S. Degree in Electrical Engineering from the Georgia Institute of Technology in 1984. He then worked for Westinghouse as a control engineer from 1985 to 1987. In 1987 he returned to the Georgia Institute of Technology, where he received the Ph.D. Degree in Electrical Engineering in March 1990. During this time he also served as a research/teaching assistant. In July 1990 he joined the Electrical and Computer Engineering Department and the Center for Advanced Manufacturing (CAM) at Clemson University, where he currently holds the position of Professor. Under the CAM director's supervision he currently leads the Robotics and Manufacturing Automation Laboratory, which is jointly operated by the Electrical and Mechanical Engineering departments. His main research interests are in the fields of nonlinear-based robust, adaptive, and learning control with application to electro-mechanical systems, including robot manipulators, motor drives, magnetic bearings, flexible cables, flexible beams, and high-speed transport systems.
Aman Behal was born in India in 1973. He received his Masters Degree in Electrical Engineering from the Indian Institute of Technology, Bombay, in 1996. He is currently working towards a Ph.D. in Controls and Robotics at Clemson University. His research focuses on the control of no
Details
In this paper, we extend the observer/control strategies previously published in [25] to an n-link, serially connected, direct-drive, rigid-link, revolute robot operating in the presence of nonlinear friction effects modeled by the LuGre model. In addition, we present a new adaptive control technique for compensating for the nonlinearly parameterizable Stribeck effects. Specifically, an adaptive observer/controller scheme is developed which contains a feedforward approximation of the Stribeck effects. This feedforward approximation is used in a composite controller/observer strategy which forces the average square integral of the position tracking error to an arbitrarily small value. Experimental results are included to illustrate the performance of the proposed controllers.
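For reference, the LuGre friction model mentioned above captures friction through an internal bristle-deflection state. The sketch below gives its standard form from the literature; the symbols follow common usage and are not necessarily the paper's own notation:

\[
\dot{z} = v - \frac{|v|}{g(v)}\, z, \qquad F = \sigma_0 z + \sigma_1 \dot{z} + \sigma_2 v,
\]

where \(v\) is the relative velocity, \(z\) the internal friction state, and \(\sigma_0, \sigma_1, \sigma_2\) the bristle stiffness, bristle damping, and viscous coefficients. The Stribeck effect that the paper's adaptive scheme compensates enters through

\[
\sigma_0\, g(v) = F_c + (F_s - F_c)\, e^{-(v/v_s)^2},
\]

with \(F_c\) the Coulomb friction level, \(F_s\) the stiction level, and \(v_s\) the Stribeck velocity. The nonlinear dependence of \(g(v)\) on \(v_s\) is what makes the Stribeck term only nonlinearly parameterizable.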
Details
Aboard current ships, such as the DDG 51, engineering control and damage control activities are manpower intensive. It is anticipated that, for future combatants, the workload demand arising from operation of systems under conditions of normal steaming and during casualty response will need to be markedly reduced via automated monitoring, autonomous control, and other technology initiatives. Current DDG 51 class ships can be considered a manpower baseline: under Condition III, typical engineering control involves seven to eight watchstanders at manned stations in the Central Control Station, the engine rooms, and other machinery spaces. In contrast to this manning level, initiatives such as DD 21 and the integrated engineering plant (IEP) envision a partnership between the operator and the automation system, with more and more of the operator's functions shifted to the automation system as manning levels decrease. This paper describes human systems integration studies of the workload demand reduction, and consequently the manning reduction, that can be achieved through application of several advanced technology concepts. Advanced system concept studies in relation to workload demand are described and reviewed, including: piecemeal applications of diverse automation and remote-control technology concepts to selected high-driver tasks in current DDG 51 activities, and development of the reduced ship's crew by virtual presence system, which will provide automated monitoring and display to operators of machinery health, compartment conditions, and personnel health. The IEP envisions the machinery control system as a provider of resources that are used by various consumers around the ship. Resource needs and consumer priorities are at all times dependent upon the ship's current mission and the availability of equipment.
Authors:
Leite, M.J.; Mensh, D.R.
Michael J. Leite is a Principal Engineer with PRC Inc., a division of Litton Industries. He supports combat system engineering for theater air and missile defense. His other tasks have included the command and control for the AEGIS shipbuilding program, systems engineering for the 21st Century Surface Combatant, combat system survivability, and the development of NATO standardization agreements for naval ordnance. He was previously a Senior Engineer with San Diego Gas & Electric, with responsibility for its energy application and lighting programs. Prior to joining SDG&E, Mr. Leite was a commissioned officer in the U.S. Navy, where he served in operations and engineering assignments. Following active duty he accepted a Naval Reserve commission and has retired with the rank of Captain. His assignments included command, operational, and engineering tours. Mr. Leite has also served as an expert witness in admiralty and engineering matters. He is a graduate of the University of California, Berkeley, with a Bachelor of Science Degree in Engineering and also holds a Masters Degree in Business Administration from National University in San Diego. Mr. Leite is a Registered Professional Engineer in the States of California and Minnesota. Mr. Leite is a member of ASNE, ASCE, MORS, the Illuminating Engineering Society, and the U.S. Naval Institute.
Dennis Roy Mensh is a Senior Engineer with PRC Inc., a division of Litton Industries, in Crystal City, VA, where he supports modeling and simulation tasking for combat systems. He received B.S. and M.S. degrees in Applied Physics from Loyola College in Baltimore, MD, and the American University in Washington, DC. He has also completed the course work towards a Ph.D. degree in computer science, specializing in the fields of Operations Research Analysis, Systems Analysis, and Computer Modeling and Simulation. Previously he was employed at the White Oak Laboratory of the Naval Surface Warfare Center in Silver Spring, MD, where he worked in the areas of naval sensor and weapon system analysis.
Details
This paper defines, develops, and examines a set of generic analysis tools that can be applied to models and simulations at the systems-engineering level of fidelity. The tools examine the performance and effectiveness of Sensors; Weapons; and Battle Management, Command, Control, Communications, Computers, and Intelligence (BMC4I) systems and equipment. The Measures of Performance (MOPs), Measures of Effectiveness (MOEs), and Measures of Force Effectiveness (MOFEs) were extracted from the Modular Command and Control Structure Paradigm, which was developed at the Naval Postgraduate School. The paradigm provides for the development of evaluation criteria (MOPs, MOEs, and MOFEs) in a framework that ensures the traceability of system performance and effectiveness to the system operational requirements as specified in the Operational Requirements Document (ORD). The analysis tools also provide insight into, and valid estimates of, numerical measures of the defined system functionality threads, which represent the system's operational requirements as specified in the ORD. The tools are directly transferable and applicable to test and evaluation exercise events conducted in support of the development and acquisition of systems and equipment. Once the levels of system performance have been defined, the Paradigm generates a quantitative database that becomes a useful tool in system tradeoffs and selection. Once the alternative system suites have been defined, the suites can be analyzed in terms of system functionality threads and their corresponding performance capabilities versus cost.