ISBN (Print): 9781728139258
Digital twins are software implementations of their physical counterparts. These software representations act through application programming interfaces (APIs) on the physical devices they monitor, engage with, and possibly control. Here a development model is proposed to ensure the desired reliability and performance of the API, which is critical to the overall success of the digital twin.
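To make the monitor/control relationship concrete, here is a minimal sketch of a twin mirroring a device over a REST-style API. The endpoint, field names, and command set are hypothetical illustrations, not the paper's development model.

```python
# Minimal sketch of a digital twin mirroring a device over a REST API.
# The base URL, telemetry fields, and command schema are assumptions.
import requests

class PumpTwin:
    def __init__(self, base_url: str):
        self.base_url = base_url
        self.state = {}  # the twin's mirror of device telemetry

    def sync(self) -> None:
        """Monitor: pull the latest telemetry into the twin's state."""
        resp = requests.get(f"{self.base_url}/telemetry", timeout=5)
        resp.raise_for_status()
        self.state = resp.json()

    def set_speed(self, rpm: float) -> None:
        """Control: push a command back through the same API."""
        resp = requests.post(f"{self.base_url}/commands",
                             json={"set_speed_rpm": rpm}, timeout=5)
        resp.raise_for_status()

twin = PumpTwin("http://device.example.com/api")  # hypothetical endpoint
```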
ISBN (Digital): 9781624105784
ISBN (Print): 9781624105784
In this work, we present an unmanned aerial vehicle (UAV) simulation-based hardware and software development and verification architecture structured around the Robot Operating System (ROS). One of the key expectations of such a system is that architectural and computational complexity grow gracefully as the number of vehicles and vehicle complexity increase. In addition, the system is expected to allow algorithms to be tested and verified at both the software and hardware level before real flight operations. This requirement is coupled with the flexibility to update the models and algorithms based on results from real operations. As such, the designed architecture allows joint simulation and testing at both the hardware and software layers for multi-vehicle and swarm operations. Specifically, the architecture consists of distinct, networked layers in which hardware elements such as autopilot systems (e.g., Pixhawk, ArduPilot), ground stations, and external motion capture/localization systems (e.g., Vicon, Otus Tracker) are integrated around the ROS simulation shell. In addition, the dynamics, sensor models, motion planning, and other features can be driven by highly parallel MATLAB/Simulink models. Visualization and visual sensing are obtained by linking virtual reality with simulation environments such as Gazebo and AirSim. This highly reconfigurable architecture allows research teams to work on multidisciplinary areas such as modeling, control, computer vision, artificial intelligence, and machine learning within the same simulation and test environment.
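One concrete layer in this kind of architecture is a thin ROS node bridging an external localization system into the autopilot's expected input. The sketch below assumes ROS 1 with rospy; the Vicon-style source topic and the MAVROS vision-pose topic are illustrative choices, not taken from the paper.

```python
# Sketch: relay motion-capture poses to the autopilot's vision-pose input.
# Topic names are assumptions; adapt to the actual mocap and MAVROS setup.
import rospy
from geometry_msgs.msg import PoseStamped

def relay(msg, pub):
    msg.header.stamp = rospy.Time.now()  # restamp in the local clock
    pub.publish(msg)

if __name__ == "__main__":
    rospy.init_node("mocap_relay")
    pub = rospy.Publisher("/uav0/mavros/vision_pose/pose",
                          PoseStamped, queue_size=10)
    rospy.Subscriber("/vicon/uav0/pose", PoseStamped, relay,
                     callback_args=pub)
    rospy.spin()
```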
Purpose: This paper reports upon the further development of a hybrid application programming interface (API) plug-in to building information modelling (BIM) entitled the confined spaces safety monitoring system ("CoSMoS"). Originally designed to engineer-out environmental hazards associated with working in a building's confined spaces (during the construction phase of the building's life-cycle), this second-generation version is expanded to use archival records to proactively learn from data generated within a sensor network during the building's operations and maintenance (O&M) phase of asset management (AM).

Design/methodology/approach: An applied research methodology adopted a two-phase process. In phase one, a conceptual model was created to provide a "blueprint map" for integrating BIM, sensor-based networks and data analytics (DA) into one integral system. A literature review provided the basis for the conceptual model's further development. In phase two, the conceptual model was transposed into the prototype's development environment as a proof of concept, using primary data accrued from a large educational building.

Findings: An amalgamation of BIM, historical sensor data and the application of DA demonstrates that CoSMoS provides an opportunity for the facilities management (FM) team to monitor pertinent environmental conditions and human behaviour within buildings that may impact upon occupant/worker safety. Although working in confined spaces is used to demonstrate the inherent potential of CoSMoS, the system could readily be expanded to analyse a sensor-based network's historical data on other areas of building performance, maintenance and safety.

Originality/value: This novel prototype has automated safety applications for FM during the O&M phase of a building's AM. Future work is proposed in several key areas, namely developing instantaneous indicators of current safety performance within a building.
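As a rough illustration of the kind of analytics such a system can layer over archived sensor data, the sketch below flags readings that breached safe environmental limits. The thresholds, record layout, and sample values are hypothetical, not taken from CoSMoS.

```python
# Hypothetical sketch: flag archived confined-space sensor readings that
# fall outside assumed safe environmental limits.
SAFE_LIMITS = {"o2_pct": (19.5, 23.5), "co_ppm": (0, 25)}  # assumed limits

def unsafe_readings(history):
    """Yield (timestamp, channel, value) for every out-of-range sample."""
    for rec in history:
        for channel, (lo, hi) in SAFE_LIMITS.items():
            value = rec[channel]
            if not lo <= value <= hi:
                yield rec["timestamp"], channel, value

history = [
    {"timestamp": "2019-03-01T09:00", "o2_pct": 20.9, "co_ppm": 3},
    {"timestamp": "2019-03-01T09:05", "o2_pct": 18.7, "co_ppm": 40},
]
for ts, channel, value in unsafe_readings(history):
    print(f"{ts}: {channel}={value} outside safe range")
```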
In the aftermath of the Cambridge Analytica controversy, social media platform providers such as Facebook and Twitter have severely restricted access to platform data via their application programming interfaces (APIs). This has had a particularly critical effect on the ability of social media researchers to investigate phenomena such as abuse, hate speech, trolling, and disinformation campaigns, and to hold the platforms to account for the role that their affordances and policies might play in facilitating such dysfunction. Alternative data access frameworks, such as Facebook's partnership with the controversial Social Science One initiative, represent an insufficient replacement for fully functional APIs, and the platform providers' actions in responding to the Cambridge Analytica scandal raise suspicions that they have instrumentalised it to actively frustrate critical, independent, public-interest scrutiny by scholars. Building on a critical review of Facebook's public statements through its own platforms and the mainstream media, and of the scholarly responses these have drawn, this article outlines the societal implications of the 'APIcalypse' and reviews potential options for scholars in responding to it.
ISBN (Digital): 9781624105890
ISBN (Print): 9781624105890
Model-Based Systems Engineering (MBSE) has been increasingly embraced by both industry and government to keep track of system complexity. MBSE allows the engineer to represent the system in a comprehensive computer model, allowing for better traceability, tracking, and information consistency. The vision and promise of MBSE is one where system models and analyses are tightly integrated in an automated, collaborative, easily accessible, and secure framework. However, the current state of the art falls short of this promise due to a significant gap between MBSE tools and their integration with analysis tools. Phoenix Integration proposes to develop and prototype a framework that would help realize the vision and promise of MBSE. This prototype framework will be web-based and will use existing tools and frameworks already deployed and in use by industry, leveraging available existing technology as well as commercial products currently under development. At the center of the framework is the connection between the No Magic Teamwork Cloud Server and ModelCenter® MBSE. Teamwork Cloud Server is a web-based MBSE collaboration platform, while ModelCenter® MBSE is a next-generation MBSE analysis integration tool currently under commercial development at Phoenix Integration. The framework will be connected to distributed or high-performance computing resources for quick analysis execution, as well as to a continuous integration server for automated execution in response to a model change. In addition to interacting with the systems model through a web environment, the user would be able to execute the associated analyses and workflows using information from the systems model. Automatic requirements verification can be performed through automated analysis execution whenever a change in the systems model is detected. Results can be displayed on a web-enabled dashboard, together with interactive charts and plots to help visualize results…
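The "analysis on model change" loop can be pictured as a small webhook service: a change notification arrives, the linked analysis runs, and the result is checked against a requirement. The sketch below is a schematic stand-in (Flask, a hypothetical route, and a toy mass requirement), not the Teamwork Cloud/ModelCenter integration itself.

```python
# Hypothetical sketch of automatic requirements verification triggered by
# a model-change webhook. Route, payload, and requirement are assumptions.
from flask import Flask, request, jsonify

app = Flask(__name__)

REQUIREMENTS = {"max_mass_kg": 1200.0}  # toy requirement

def run_analysis(model: dict) -> dict:
    """Placeholder for executing the linked analysis workflow."""
    return {"mass_kg": model.get("mass_kg", 0.0)}

@app.route("/model-changed", methods=["POST"])
def model_changed():
    model = request.get_json(force=True)
    results = run_analysis(model)
    verified = results["mass_kg"] <= REQUIREMENTS["max_mass_kg"]
    # A dashboard would render these results; here we just return them.
    return jsonify({"results": results, "requirement_met": verified})

if __name__ == "__main__":
    app.run(port=8080)
```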
ISBN (Print): 9781450362771
Approximate computing is necessary to meet deadlines in some compute-intensive applications such as simulation. Building such applications requires a high level of expertise from the application designers as well as a significant development effort. Some application programming interfaces greatly facilitate their design, but they still rely heavily on the developer's domain-specific knowledge and require many modifications to successfully generate an approximate version of the program. In this paper we present new techniques to semi-automatically discover relevant approximate computing parameters. We believe that better compiler-user interaction is the key to improved productivity. After pinpointing the region of interest to optimize, the developer is guided by the compiler in making the best implementation choices. Static analysis and runtime monitoring are used to infer approximation parameter values for the application. We evaluated these techniques on multiple application kernels that support approximation and show that, with the help of our method, we achieve performance similar to a non-assisted, hand-tuned version while requiring minimal intervention from the user.
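To make "approximation parameters" concrete, the sketch below shows loop perforation, a standard approximate-computing technique in which a tunable stride skips iterations, together with the kind of runtime monitoring that can pick a stride under an error tolerance. The kernel and harness are illustrative, not the paper's compiler.

```python
# Illustrative approximation parameter: a perforation stride that skips
# loop iterations, trading accuracy for speed.
def mean_brightness(pixels, stride=1):
    sampled = pixels[::stride]
    return sum(sampled) / len(sampled)

# Simple runtime monitoring: measure the error each stride introduces,
# so a tool (or developer) can keep the largest acceptable stride.
pixels = [(i * 37) % 256 for i in range(100_000)]
exact = mean_brightness(pixels, stride=1)
for stride in (2, 4, 8, 16):
    approx = mean_brightness(pixels, stride)
    rel_err = abs(approx - exact) / exact
    print(f"stride={stride:2d}  relative error={rel_err:.4%}")
```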
Authors: Kazuki Nakamae, Hidemasa Bono
Affiliations: Laboratory of BioDX, PtBio Co-Creation Research Center, Genome Editing Innovation Center, Hiroshima University, 3-10-23 Kagamiyama, Higashi-Hiroshima 739-0046, Japan; Laboratory of Genome Informatics, Graduate School of Integrated Sciences for Life, Hiroshima University, 3-10-23 Kagamiyama, Higashi-Hiroshima 739-0046, Japan
Bioinformatics has become an indispensable technology in molecular biology for genome editing. In this review, we outline various bioinformatic techniques necessary for genome editing research. We first review state-of-the-art computational tools developed for genome editing studies. We then introduce a bio-digital transformation (BioDX) approach, which fully utilizes existing databases for biological innovation and uses publicly available bibliographic full-text data and transcriptome data to survey genome editing target genes in model organism species, where substantial genomic information and annotation are readily available. We also discuss genome editing attempts in species with almost no genomic information. The transcriptome data, sequenced genomes, and functional annotations for these species are described, with a primary focus on the bioinformatic tools used for these analyses. Finally, we conclude by emphasizing the need to maintain a database of genome editing resources for the future development of genome editing research. Our review shows that the integration and maintenance of useful resources remain a challenge for bioinformatics research in genome editing, and that it is crucial for the research community to work together to create and maintain such databases in the future.
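As one concrete example of the database-driven surveys a BioDX approach relies on, the sketch below queries NCBI Gene through Biopython's Entrez module to retrieve candidate target-gene records. The query term and contact email are placeholders, and this is only one of many ways such a survey could be scripted.

```python
# Sketch: querying NCBI Gene for candidate genome-editing targets via
# Biopython's Entrez interface. Query term and email are placeholders.
from Bio import Entrez

Entrez.email = "you@example.org"  # NCBI requires a contact address

handle = Entrez.esearch(db="gene",
                        term="myostatin[Gene Name] AND Takifugu[Organism]")
record = Entrez.read(handle)
handle.close()

print(record["IdList"])  # Gene IDs to inspect as candidate editing targets
```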
According to the World Health Organization, around 7 million people die every year from diseases caused by air pollution. With the improvements in the Internet of Things in recent years, environmental sensing systems have started to gain importance. By using technologies such as cloud computing, RFID, wireless sensor networks, and open application programming interfaces, it has become easier to collect data for visualization on different platforms. However, the collected data need to be represented efficiently for better understanding and analysis, which requires the design of data visualization tools. The GreenIoT initiative aims to provide open data with its infrastructure for sustainable city development in Uppsala. An environmental web application is presented within this thesis project, which visualizes the gathered environmental data to help municipal organizations implement new policies for sustainable urban planning, and citizens gain more knowledge for making sustainable decisions in their daily lives. The application has been developed using the 4Dialog API, which provides data from a dedicated cloud storage for visualization purposes. According to the evaluation presented in this thesis, further development is needed to improve performance (for a faster and more reliable service) as well as accessibility (to promote openness and social inclusion).
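The fetch-and-visualize pattern such an application builds on can be sketched in a few lines. The endpoint URL, query parameters, and JSON layout below are hypothetical stand-ins, since the 4Dialog API's actual routes are not described here.

```python
# Hypothetical sketch: pull air-quality measurements from a REST endpoint
# and plot them. URL, parameters, and JSON field names are assumptions.
import requests
import matplotlib.pyplot as plt

resp = requests.get("https://api.example.org/v1/measurements",
                    params={"sensor": "pm2_5", "city": "uppsala"},
                    timeout=10)
resp.raise_for_status()
points = resp.json()  # assumed shape: [{"time": "...", "value": ...}, ...]

times = [p["time"] for p in points]
values = [p["value"] for p in points]
plt.plot(times, values)
plt.xlabel("time")
plt.ylabel("PM2.5 (µg/m³)")
plt.title("Air quality over time")
plt.show()
```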
Interoperability in healthcare has traditionally been focused on data exchange between business entities, for example, different hospital systems. However, there has been a recent push towards patient-driven interoperability, in which health data exchange is patient-mediated and patient-driven. Patient-centered interoperability, however, brings with it new challenges and requirements around security and privacy, technology, incentives, and governance that must be addressed for this type of data sharing to succeed at scale. In this paper, we look at how blockchain technology might facilitate this transition through five mechanisms: (1) digital access rules, (2) data aggregation, (3) data liquidity, (4) patient identity, and (5) data immutability. We then look at barriers to blockchain-enabled patient-driven interoperability, specifically clinical data transaction volume, privacy and security, patient engagement, and incentives. We conclude by noting that while patient-driven interoperability is an exciting trend in healthcare, given these challenges, it remains to be seen whether blockchain can facilitate the transition from institution-centric to patient-centric data sharing. © 2018 The Authors. Published by Elsevier B.V. on behalf of the Research Network of Computational and Structural Biotechnology. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
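To illustrate two of these mechanisms together (digital access rules recorded with data immutability), here is a deliberately simplified Python sketch of a hash-chained ledger of patient-issued access grants. It is purely conceptual: no consensus, no distribution, and not the paper's design; a real system would use an actual distributed ledger.

```python
# Toy illustration of patient-issued access rules on an append-only,
# hash-chained ledger (mechanisms 1 and 5 above). Not a real blockchain.
import hashlib
import json

ledger = []

def append_entry(entry: dict) -> None:
    """Chain each entry to the previous one so history cannot be edited."""
    prev_hash = ledger[-1]["hash"] if ledger else "0" * 64
    payload = json.dumps(entry, sort_keys=True) + prev_hash
    digest = hashlib.sha256(payload.encode()).hexdigest()
    ledger.append({"entry": entry, "hash": digest})

def has_access(patient: str, grantee: str) -> bool:
    """Replay the ledger; the latest grant/revoke entry wins."""
    state = False
    for block in ledger:
        e = block["entry"]
        if e["patient"] == patient and e["grantee"] == grantee:
            state = e["action"] == "grant"
    return state

append_entry({"patient": "p123", "grantee": "clinic-A", "action": "grant"})
append_entry({"patient": "p123", "grantee": "clinic-A", "action": "revoke"})
print(has_access("p123", "clinic-A"))  # False after the revoke
```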
Clinical decision support tools for risk prediction are readily available but typically require workflow interruptions and manual data entry, so they are rarely used. New data interoperability standards for electronic health records (EHRs) make other options available. As a clinical case study, we sought to build a scalable, web-based system that would automate calculation of kidney failure risk and display clinical decision support to users in primary care practices. We developed a single-page application, web server, database, and application programming interface to calculate and display kidney failure risk. Data were extracted from the EHR using the Consolidated Clinical Document Architecture interoperability standard for Continuity of Care Documents (CCDs). EHR users were presented with a noninterruptive alert on the patient's summary screen and a hyperlink to details and recommendations provided through a web application. Clinic schedules and CCDs were retrieved using existing application programming interfaces to the EHR, and we provided a clinical decision support hyperlink to the EHR as a service. After debugging a series of terminology and technical issues, the application was validated with data from 255 patients and subsequently deployed to 10 primary care clinics where, over the course of one year, 569,533 CCD documents were processed. We validated the use of interoperable documents and open-source components to develop a low-cost tool for automated clinical decision support. Since Consolidated Clinical Document Architecture-based data extraction extends to any certified EHR, this demonstrates a successful modular approach to clinical decision support.
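A minimal sketch of the extract-and-score step such a pipeline performs: pull lab values out of a CCD-style XML document and pass them to a risk function. The XML snippet is heavily simplified relative to a real C-CDA, the LOINC codes are illustrative, and `kidney_failure_risk` is a placeholder rather than the validated equation the deployed system used.

```python
# Sketch: extract lab values from a (heavily simplified) CCD-like XML
# document and compute a placeholder risk score. Real C-CDA documents are
# far more deeply nested and use fully coded observations.
import xml.etree.ElementTree as ET

CCD_SNIPPET = """
<document>
  <observation code="egfr" value="38"/>
  <observation code="acr" value="120"/>
</document>
"""

def extract_labs(xml_text: str) -> dict:
    root = ET.fromstring(xml_text)
    return {obs.get("code"): float(obs.get("value"))
            for obs in root.iter("observation")}

def kidney_failure_risk(egfr: float, acr: float) -> float:
    """Placeholder: a real deployment would use a validated equation."""
    return 0.5 if egfr < 45 and acr > 100 else 0.1

labs = extract_labs(CCD_SNIPPET)
print(kidney_failure_risk(labs["egfr"], labs["acr"]))
```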