The aim of this study was to evaluate obstetric electronic data processing (EDP) in Austria and to analyse its problems, advantages and acceptance in a single large obstetric department. We sent questionnaires to every obstetric department in the country. The overall response rate was 77% (73 departments). Only 24 (33%) were using computer-aided documentation, but these covered 63% of deliveries in Austria. The proportionate times spent on documentation were 57% for physicians and 43% for midwives, with physicians playing a bigger role in larger departments using electronic documentation. Sixty-five percent of physicians and 31% of midwives readily accepted computerization. We also studied an obstetric department with over 3000 births per year. Twenty-five percent of the medical staff did not believe that computerization saved time, although they appreciated its value to administration and for producing printouts. Advantages in completeness (92%) and accuracy (76%) were recognized. After 6 months' use, acceptance of EDP documentation improved significantly.
Currently, the use of electronic scales is increasing rapidly, which is not surprising considering their accuracy, ease of use and improved compliance. The value of Visual Analogue Scales as a means of objectifying subjective variables has long been recognised. The current study aimed to validate the electronic Visual Analogue Scale of Anxiety (eVAAS). Seventy-one subjects, control subjects (n = 46) and Panic Disorder patients (n = 25), filled out the paper VAAS (pVAAS) and the eVAAS in randomised order. Panic was provoked using 35% CO2 inhalation, allowing us to include maximal scores in our analyses. The correlation between the eVAAS and the pVAAS was very strong and highly significant (r = 0.98, p < 0.001). pVAAS scores were slightly higher than eVAAS scores (p < 0.001), but this difference is clinically unimportant. The VAAS administered on a tablet PC is a useful and valid measure of anxiety and holds intrinsic benefits for anxiety assessment.
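For readers who want to see the shape of this kind of agreement analysis, here is a minimal sketch in Python. The data are simulated and all names are hypothetical; this is not the study's code or data, only an illustration of a paired correlation plus paired-difference test.

```python
# Hypothetical sketch of a paired validation analysis (not the study's code).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated paired scores on a 0-100 visual analogue scale.
pvaas = rng.uniform(0, 100, size=71)             # paper VAAS (simulated)
evaas = pvaas - 1.5 + rng.normal(0, 3, size=71)  # electronic VAAS, slightly lower
evaas = np.clip(evaas, 0, 100)

r, p_corr = stats.pearsonr(pvaas, evaas)   # agreement between the two forms
t, p_diff = stats.ttest_rel(pvaas, evaas)  # paired test of the mean difference

print(f"Pearson r = {r:.2f} (p = {p_corr:.3g})")
print(f"Mean paper-minus-electronic difference = {np.mean(pvaas - evaas):.2f} "
      f"(paired t-test p = {p_diff:.3g})")
```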
Information processing is a main feature of daily work in internal medicine, even in the analog world. This perspective helps to better understand the importance of human skills and the possibilities of information technology.
On behalf of the German Federal Ministry for Research and Technology, we investigated the use of electronic data processing for clinical toxicological purposes. Initially, sufficient funds were available for a comprehensive approach to the problem and programs covering the following areas were established:
Due to a subsequent lack of funds, it was necessary to develop a partial solution: what we call the Index Line. The Index Line (limited data on poisonings) should enable the user to receive information by telex from the German Institute for Medical Documentation and Information, i.e., information on "who has what, where". As a first step, continuous Index Line registration of all cases of poisoning recorded at the poison control centers in Munich, Freiburg, Hamburg, and Nürnberg, as well as at the State Institute for Food, Pharmaceutical and Forensic Chemistry in Berlin, was initiated in 1975. Complete instructions are necessary in order to participate in the program.
ISBN (digital): 9781118840962
ISBN (print): 9781118342329
Create a competitive advantage with data quality. Data is rapidly becoming the powerhouse of industry, but low-quality data can actually put a company at a disadvantage. To be used effectively, data must accurately reflect the real-world scenario it represents, and it must be in a form that is usable and accessible. Quality data involves asking the right questions, targeting the correct parameters, and having an effective internal management, organization, and access system. It must be relevant, complete, and correct, while falling in line with pervasive regulatory oversight programs. Competing with High Quality Data: Concepts, Tools, and Techniques for Building a Successful Approach to Data Quality takes a holistic approach to improving data quality, from collection to usage. Author Rajesh Jugulum is globally recognized as a major voice in the data quality arena, with a high-level background in international corporate finance. In the book, Jugulum provides a roadmap to data quality innovation, covering topics such as: the four-phase approach to data quality control; a methodology that produces data sets for different aspects of a business; streamlined data quality assessment and issue resolution; and a structured, systematic, disciplined approach to effective data gathering. The book also contains real-world case studies illustrating how companies across a broad range of sectors have employed data quality systems, whether or not they succeeded, and what lessons were learned. High-quality data increases value throughout the information supply chain, and the benefits extend to the client, employee, and shareholder. Competing with High Quality Data provides the information and guidance necessary to formulate and activate an effective data quality plan today.
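To give a concrete flavor of what a data quality assessment can look like, here is a small sketch that profiles a dataset for completeness and validity. The column names, reference data, and rules are hypothetical illustrations, not the book's methodology.

```python
# Hypothetical data-quality profiling sketch (not taken from the book).
import pandas as pd

df = pd.DataFrame({
    "customer_id": [1, 2, 3, 4, None],
    "country":     ["US", "DE", "XX", "FR", "US"],
    "balance":     [120.5, -3.0, 88.0, None, 42.1],
})

VALID_COUNTRIES = {"US", "DE", "FR"}  # assumed reference data

# Completeness: share of non-null values per column.
completeness = df.notna().mean()

# Validity: share of values passing simple correctness rules.
validity = pd.Series({
    "country": df["country"].isin(VALID_COUNTRIES).mean(),
    "balance": (df["balance"].dropna() >= 0).mean(),
})

print("Completeness per column:\n", completeness, sep="")
print("\nValidity per rule:\n", validity, sep="")
```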
ISBN (print): 9781509064625
Heterogeneous multicores provide alternative core types and potentially multiple voltage-frequency levels to execute workloads more efficiently. One fundamental obstacle to capitalizing on their potential performance and energy gains is identifying the most appropriate configuration (core type and voltage-frequency pair) for executing the computations at hand. In this paper, we analyze an ARM big.LITTLE architecture and show that the most efficient configuration is not always the expected one. We study the performance and energy tradeoffs of the big and the LITTLE ARM cores at different voltage and frequency levels. To do so, we use various workloads and observe the overheads and benefits of using one configuration over another. Subsequently, we investigate how the workload characteristics and their execution on a particular core type affect energy consumption. We develop a lightweight energy model, suitable for runtime use, that accurately captures the above tradeoffs. Our model uses as input parameters only the instructions per cycle (IPC) and the instruction mix. We evaluate the accuracy of the model across the two core types, different frequencies and various benchmarks. The model is able to predict the changes in the energy consumption of a program when moving from one configuration to another with an average error of 4.7%. Moreover, it is able to rank correctly 96% of the configurations across all benchmarks by their energy consumption. Finally, our energy model can correctly predict, for 22 out of 26 benchmarks, the configuration that minimizes the energy-delay product (EDP); in the remaining four benchmarks the increase in EDP is less than 2.46%.
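A model of this kind can be as simple as a per-configuration linear regression on IPC and instruction-mix fractions. The sketch below is a minimal illustration under that assumption; the features, coefficients, and measurements are made up, not the paper's fitted model.

```python
# Hypothetical sketch of a lightweight linear energy model (not the paper's code).
import numpy as np

# Per-configuration training samples gathered from profiling runs (made up).
# Features: [IPC, fraction of memory instructions, fraction of branch instructions]
X = np.array([
    [1.8, 0.25, 0.10],
    [0.9, 0.45, 0.15],
    [1.2, 0.30, 0.20],
    [2.1, 0.15, 0.05],
])
y = np.array([3.2, 5.1, 4.0, 2.8])  # measured energy in joules (made up)

# Fit E ~ w0 + w1*IPC + w2*mem_frac + w3*branch_frac by least squares.
A = np.hstack([np.ones((len(X), 1)), X])
w, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict_energy(ipc, mem_frac, branch_frac):
    """Predict a workload's energy on the configuration this model was fit for."""
    return float(w @ np.array([1.0, ipc, mem_frac, branch_frac]))

# At runtime, a scheduler can compare configurations by predicted
# energy-delay product (EDP = energy * runtime) and pick the minimum.
e = predict_energy(1.5, 0.30, 0.10)
runtime_s = 0.8  # assumed runtime estimate for this configuration
print(f"predicted energy = {e:.2f} J, EDP = {e * runtime_s:.2f} J*s")
```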
ISBN (digital): 9781119682127
ISBN (print): 9781119681953
This book covers the most essential techniques for designing and building dependable distributed systems, from traditional fault tolerance to blockchain technology. Topics include checkpointing and logging, recovery-oriented computing, replication, distributed consensus, Byzantine fault tolerance, as well as blockchain. The book intentionally includes traditional fault tolerance techniques so that readers can better appreciate the huge benefits brought by blockchain technology and why it has been touted as a disruptive technology that some even regard as being on the same level as the Internet. The book also raises a grave concern about using traditional consensus algorithms in blockchain: given the limited scalability of such algorithms, the primary benefits of using blockchain in the first place, such as decentralization and immutability, could easily be lost under cyberattacks.
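To make the first of these techniques concrete, here is a toy checkpoint-and-logging recovery scheme of the kind such books cover: operations are durably logged before being applied, and recovery replays the log tail on top of the last snapshot. The file names and state layout are hypothetical, not an example from this book.

```python
# Hypothetical checkpoint-and-logging sketch (illustrative only).
import json
import os

CHECKPOINT = "state.ckpt"
LOG = "ops.log"

def apply_op(state, op):
    """Apply one operation to the in-memory state."""
    state[op["key"]] = op["value"]

def execute(state, op):
    """Durably log the operation before applying it (write-ahead logging)."""
    with open(LOG, "a") as f:
        f.write(json.dumps(op) + "\n")
        f.flush()
        os.fsync(f.fileno())
    apply_op(state, op)

def checkpoint(state):
    """Persist a snapshot, then truncate the log it subsumes."""
    with open(CHECKPOINT, "w") as f:
        json.dump(state, f)
    open(LOG, "w").close()

def recover():
    """Rebuild state from the last checkpoint plus the logged tail."""
    state = {}
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            state = json.load(f)
    if os.path.exists(LOG):
        with open(LOG) as f:
            for line in f:
                apply_op(state, json.loads(line))
    return state

state = recover()
execute(state, {"key": "x", "value": 1})
checkpoint(state)
execute(state, {"key": "y", "value": 2})
print(recover())  # {'x': 1, 'y': 2} after a simulated crash/restart
```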
ISBN (digital): 9781118705957
ISBN (print): 9780470688991
Distributed source coding (DSC) is one of the key enablers for efficient cooperative communication. The potential applications range from wireless sensor networks, ad-hoc networks, and surveillance networks to robust low-complexity video coding, stereo/multiview video coding, HDTV, hyperspectral and multispectral imaging, and biometrics. The book is divided into three sections: theory, algorithms, and applications. Part one covers the background of information theory with an emphasis on DSC; part two discusses designs of algorithmic solutions for the three most important DSC problems: Slepian-Wolf, Wyner-Ziv, and multiterminal (MT) source coding; and part three is dedicated to a variety of potential DSC applications. Key features include: clear explanation of DSC theory and algorithms, including both lossless and lossy designs; rich applications of DSC, covering multimedia communication and data security; and self-contained content for beginners, from basic information theory to practical code implementation. The book provides the fundamental knowledge engineers and computer scientists need to approach the topic of distributed source coding. It is also suitable for senior undergraduate and first-year graduate students in electrical engineering, computer engineering, signal processing, image/video processing, and information theory and communications.
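To give a flavor of the algorithmic side, the sketch below illustrates Slepian-Wolf coding via syndrome binning with a (7,4) Hamming code: the encoder transmits only the 3-bit syndrome of a 7-bit source block, and the decoder recovers the block from correlated side information that differs in at most one bit. This is a standard textbook construction, not code taken from this book.

```python
# Toy Slepian-Wolf coding via Hamming(7,4) syndrome binning (illustrative).
import numpy as np

# Parity-check matrix of the (7,4) Hamming code; column j is the binary
# representation of j (row 0 = least significant bit).
H = np.array([
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
])

def syndrome(v):
    return (H @ v) % 2

def encode(x):
    """Encoder: send only the 3-bit syndrome of the 7-bit source block."""
    return syndrome(x)

def decode(s, y):
    """Decoder: recover x from syndrome s and side information y,
    assuming x and y differ in at most one position."""
    diff = (syndrome(y) + s) % 2  # syndrome of the error pattern x XOR y
    if not diff.any():
        return y                  # x == y
    # For Hamming(7,4), the syndrome equals the 1-based index of the flipped bit.
    pos = int("".join(map(str, diff[::-1])), 2) - 1
    e = np.zeros(7, dtype=int)
    e[pos] = 1
    return (y + e) % 2

x = np.array([1, 0, 1, 1, 0, 0, 1])  # source block
y = x.copy()
y[4] ^= 1                            # correlated side information (one bit flipped)
s = encode(x)                        # 3 bits transmitted instead of 7
assert np.array_equal(decode(s, y), x)
print("recovered:", decode(s, y))
```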
Objectives. - Medication errors are common at the time of administration. To prevent them, technologies allowing bedside consistency checks by barcode have been developed. Our study focuses on the evaluation of a BarCode Medication Administration (BCMA) system called EASYSCAN, coupled with an electronic Medication Administration Record (e-MAR), to verify both the patient's identity and the medication to be administered.
Method. - A prospective observational study was conducted over seven weeks in a French medicine ward. The performance of the system was evaluated by the success rate of BCMA and by the average time for administration with and without EASYSCAN. A satisfaction questionnaire about BCMA was offered to the nurses.
Results. - We observed 182 administrations, including 87 (48%) with EASYSCAN. Verification of the patient's identity was successful in 77% of administrations, and 65% of the drugs were scanned successfully. The main causes of check failures were the lack of a datamatrix on the drug (81%), error messages (14%) and missing system functionality (5%). The average time for administration per patient was significantly increased: 4.68 min/patient with versus 2.87 min/patient without EASYSCAN.
Conclusion. - The study shows EASYSCAN's performance in its first version. Material and software evolutions and improvements in nurses' practices will be necessary to continue the experimentation of this system, which has not yet been reported in France.
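As a rough sketch of what such a bedside consistency check does, the code below matches a scanned patient wristband and drug datamatrix against an e-MAR prescription list. The data structures and identifiers are hypothetical; the abstract does not describe EASYSCAN's actual implementation.

```python
# Hypothetical sketch of a bedside barcode consistency check (not EASYSCAN's code).
from dataclasses import dataclass

@dataclass(frozen=True)
class Prescription:
    patient_id: str
    drug_code: str  # e.g. the code read from the drug's datamatrix
    dose: str

# Assumed e-MAR content for the ward (made-up identifiers).
EMAR = [
    Prescription("PAT-001", "03400934056781", "500 mg"),
    Prescription("PAT-001", "03400921034567", "10 mg"),
    Prescription("PAT-002", "03400934056781", "250 mg"),
]

def check_administration(scanned_patient: str, scanned_drug: str) -> str:
    """Return the matching prescription, or an alert explaining the failure."""
    for rx in EMAR:
        if rx.patient_id == scanned_patient and rx.drug_code == scanned_drug:
            return f"OK: give {rx.dose} of {rx.drug_code} to {rx.patient_id}"
    if not any(rx.patient_id == scanned_patient for rx in EMAR):
        return "ALERT: unknown patient identity"
    return "ALERT: drug not prescribed for this patient"

print(check_administration("PAT-001", "03400934056781"))  # OK
print(check_administration("PAT-002", "03400921034567"))  # ALERT: not prescribed
```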