ISBN (digital): 9798350387025
ISBN (print): 9798350387032
The Electrocardiogram (ECG) provides a detailed representation of the heart's electrical activity and has emerged as a crucial resource for continuous cardiac health monitoring. Recent advances in Artificial Intelligence (AI) techniques have revolutionized ECG signal processing, creating new possibilities for everyday health monitoring. The exploitation of these AI technologies has driven a growing interest in wearable devices, where the challenge is to implement these functionalities on hardware with limited memory resources. In this scenario, the Edge Computing paradigm, in which computation occurs near the data source rather than in a remote data center, emerges as a promising solution. This article proposes an efficient approach for Myocardial Infarction (MI) detection based on Deep Learning (DL) methods using spectrograms and a 1D Convolutional Neural Network (1D-CNN). The aim is to strike a balance between computational efficiency and accuracy, enabling practical application on wearable devices. In the presented work, a study of the impact of the spectrogram parameters on the 1D-CNN's results was conducted. The final training phase yielded a remarkable accuracy of 95.94%, showcasing the efficacy of the proposed approach. Notably, the trained model was successfully deployed on a 32-bit microcontroller featuring an ARM Cortex-M4 architecture, underscoring the feasibility of real-world implementation for embedded systems in healthcare applications.
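As a rough illustration of the spectrogram-plus-1D-CNN pipeline, the following NumPy sketch computes a magnitude spectrogram of a toy signal and passes it through a single untrained 1D convolution layer. The window size, hop length, and kernel shapes are arbitrary assumptions for illustration; the abstract does not disclose the authors' architecture or parameters.

```python
import numpy as np

def spectrogram(signal, win_len=64, hop=32):
    """Magnitude spectrogram via short-time FFT.
    Returns an array of shape (n_frames, win_len // 2 + 1)."""
    window = np.hanning(win_len)
    n_frames = 1 + (len(signal) - win_len) // hop
    frames = np.stack([signal[i * hop : i * hop + win_len] * window
                       for i in range(n_frames)])
    return np.abs(np.fft.rfft(frames, axis=1))

def conv1d(x, kernels, bias):
    """Valid-mode 1D convolution over the time axis, followed by ReLU.
    x: (n_frames, in_ch); kernels: (out_ch, k, in_ch); bias: (out_ch,)."""
    out_ch, k, _ = kernels.shape
    n_out = x.shape[0] - k + 1
    out = np.empty((n_out, out_ch))
    for t in range(n_out):
        for c in range(out_ch):
            out[t, c] = np.sum(x[t:t + k] * kernels[c]) + bias[c]
    return np.maximum(out, 0.0)

rng = np.random.default_rng(0)
# Toy periodic signal standing in for an ECG trace (1.3 Hz over 4 s).
ecg = np.sin(2 * np.pi * 1.3 * np.linspace(0, 4, 1024))
spec = spectrogram(ecg)  # 31 frames x 33 frequency bins
feat = conv1d(spec, rng.standard_normal((8, 5, spec.shape[1])), np.zeros(8))
print(spec.shape, feat.shape)  # (31, 33) (27, 8)
```

On a microcontroller deployment, such frequency-bin channels would typically be quantized and the convolution loop replaced by a vendor kernel, but the data flow stays the same: frame the signal, transform each frame, convolve along time.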
ISBN (digital): 9798350363203
ISBN (print): 9798350363210
Finding relevant information in a vast and growing amount of data has become significant since the arrival of the internet. An Information Retrieval System is described as one that searches for and retrieves a list of documents, such as web pages or other items, in response to a user query. There are many weighting schemes, such as Boolean, Term Frequency (TF), and Term Frequency-Inverse Document Frequency (TFIDF). In the Boolean scheme, a '1' indicates the presence of a term in the document, whereas a '0' indicates its absence. In TF, the term is represented by its number of occurrences in a document. TFIDF is the most popular one: it combines TF with an inverse document frequency (IDF), which gives less weight to terms that occur frequently in many documents. According to the information retrieval literature, TFIDF gives more accurate results at the cost of greater computational expense. Term Frequency is less expensive than TFIDF, and the least expensive is the Boolean weighting scheme. In this paper, we investigated the effect of different weighting schemes and found that, in many cases, Boolean and TF performed the same as TFIDF and, in a few cases, outperformed it. The experiments in this paper were conducted on many datasets of varying sizes and types. The dataset documents were indexed using TFIDF, TF, and Boolean. Then, using the queries that come with each dataset, we computed precision and recall by comparing the results of the different weighting schemes; the queries were also indexed using the TFIDF, TF, and Boolean weighting schemes. The objective is to show that using the Boolean or TF weighting scheme, which is computationally inexpensive, instead of TFIDF, which is very computationally expensive, does not significantly affect the results and, in a few cases, gives better results.
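The three weighting schemes can be sketched on a toy corpus as follows. This is a minimal illustration; the whitespace tokenization and the log(N/df) variant of IDF are assumptions, since the abstract does not specify the exact formulas used.

```python
import math
from collections import Counter

docs = [
    "information retrieval with boolean weights",
    "term frequency weighting for retrieval",
    "inverse document frequency weighting",
]
tokenized = [d.split() for d in docs]
vocab = sorted({t for d in tokenized for t in d})
N = len(tokenized)
# Document frequency: number of documents containing each term.
df = {t: sum(t in d for d in tokenized) for t in vocab}

def boolean(doc):
    """1 if the term occurs in the document, else 0."""
    return {t: 1 if t in doc else 0 for t in vocab}

def tf(doc):
    """Raw occurrence count of each term in the document."""
    counts = Counter(doc)
    return {t: counts[t] for t in vocab}

def tfidf(doc):
    """TF scaled by log(N / df): rarer terms weigh more."""
    counts = Counter(doc)
    return {t: counts[t] * math.log(N / df[t]) for t in vocab}

# A term unique to one document gets the largest IDF boost.
print(tfidf(tokenized[0])["information"])  # 1 * log(3) ≈ 1.0986
```

Ranking with any of the three schemes is then a matter of scoring document vectors against an identically weighted query vector, which is what makes the cheap Boolean and TF variants drop-in substitutes for TFIDF.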
The IEEE International Symposium on Biomedical Imaging (ISBI) is a scientific conference dedicated to mathematical, algorithmic, and computational aspects of biological and biomedical imaging, across all scales of observation. It fosters knowledge transfer among different imaging communities and contributes to an integrative approach to biomedical imaging. ISBI is a joint initiative of the IEEE Signal Processing Society (SPS) and the IEEE Engineering in Medicine and Biology Society (EMBS). The 2018 meeting will include tutorials and a scientific program composed of plenary talks, invited special sessions, challenges, and oral and poster presentations of peer-reviewed papers. High-quality papers are requested containing original contributions to the topics of interest, including image formation and reconstruction, computational and statistical image processing and analysis, dynamic imaging, visualization, image quality assessment, and physical, biological, and statistical modeling. Accepted 4-page regular papers will be published in the symposium proceedings by IEEE and included in IEEE Xplore. To encourage attendance by a broader audience of imaging scientists and offer additional presentation opportunities, ISBI 2018 will continue to have a second track featuring posters selected from 1-page abstract submissions without subsequent archival publication.
A novel, computationally efficient and robust scheme for multiple initial point prediction is proposed in this paper. A combination of spatial and temporal predictors is used for initial motion vector prediction, determination of the magnitude and direction of motion, and search pattern selection. Initially, three predictors from the spatio-temporal neighbouring blocks are selected. If all these predictors point to the same quadrant, then a simple search pattern based on the direction and magnitude of the final predicted motion vector is selected. However, if the predictors belong to different quadrants, then the search starts from multiple initial points to get a clearer idea of the location of the minimum point; in this case, a small rood search pattern is selected. The predictive search center is closer to the global minimum, which reduces the effect of the monotonic error surface assumption and its impact on the motion field. An additional advantage is that moving the search closer to the global minimum increases computation speed. Further computational speed-up is obtained by applying a zero-motion threshold to no-motion blocks and by using a specialized rood search pattern. The image quality, measured in terms of PSNR, also shows good results.
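The quadrant-agreement decision described above can be sketched as follows. This is a hypothetical illustration: the component-wise median start point and the sign convention for axis-aligned vectors are assumptions for the sketch, not the paper's exact rules.

```python
def quadrant(v):
    """Map a motion vector (dx, dy) to a quadrant label by component
    sign; zero components are treated as positive here (assumption)."""
    dx, dy = v
    return (dx < 0, dy < 0)

def select_search(predictors):
    """predictors: three (dx, dy) motion vectors taken from the
    spatio-temporal neighbouring blocks.
    Returns (initial_points, search_pattern_name)."""
    quads = {quadrant(v) for v in predictors}
    if len(quads) == 1:
        # All predictors agree on direction: start from the
        # component-wise median and use a simple directed pattern
        # along the predicted motion.
        xs = sorted(v[0] for v in predictors)
        ys = sorted(v[1] for v in predictors)
        return [(xs[1], ys[1])], "directional"
    # Predictors disagree: probe every distinct candidate start
    # with a small rood pattern to locate the minimum region.
    return list(dict.fromkeys(predictors)), "small_rood"

print(select_search([(2, 3), (1, 5), (4, 2)]))    # one start, directional
print(select_search([(2, 3), (-1, 5), (4, 2)]))   # three starts, small rood
```

The zero-motion threshold mentioned in the abstract would sit in front of this routine: if the block-matching cost at (0, 0) is already below the threshold, the block is declared stationary and the search is skipped entirely.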
International benchmarking competitions have become fundamental for the comparative performance assessment of image analysis methods. However, little attention has been given to investigating what can be learnt from these competitions. Do they really generate scientific progress? What are common and successful participation strategies? What makes a solution superior to a competing method? To address this gap in the literature, we performed a multicenter study with all 80 competitions that were conducted in the scope of IEEE ISBI 2021 and MICCAI 2021. Statistical analyses performed based on comprehensive descriptions of the submitted algorithms, linked to their rank as well as the underlying participation strategies, revealed common characteristics of winning solutions. These typically include the use of multi-task learning (63%) and/or multi-stage pipelines (61%), and a focus on augmentation (100%), image preprocessing (97%), data curation (79%), and post-processing (66%). The "typical" lead of a winning team is a computer scientist with a doctoral degree, five years of experience in biomedical image analysis, and four years of experience in deep learning. Two core general development strategies stood out for highly ranked teams: the reflection of the metrics in the method design and a focus on analyzing and handling failure cases. According to the organizers, 43% of the winning algorithms exceeded the state of the art, but only 11% completely solved the respective domain problem. The insights of our study could help researchers (1) improve algorithm development strategies when approaching new problems, and (2) focus on open research questions revealed by this work.