Modern radar must adapt to changing environments, and changepoint detection is one method for doing so. Many radar systems employ a prediction-action cycle to proactively determine transmission modes while spectrum sharing. This method constructs and implements a model of the environment to predict unused frequencies, and subsequently transmits in the predicted available bands. For these selection strategies to succeed, the underlying environmental models must be robust to change. Changepoint detection provides this robustness. Changepoint detection is the identification of sudden changes, or changepoints, in the distribution from which data are drawn. This information allows the models to discard "garbage" data from a previous distribution, which bears no relation to the current state of the environment. For spectrum sharing applications, these changepoints may represent interferers leaving and entering the spectral environment, or changing their spectrum access patterns. In this work, we demonstrate the effectiveness of adding changepoint detection to spectrum sharing algorithms in changing environments. Bayesian online changepoint detection (BOCD) is applied to the sense-and-predict algorithm to increase the accuracy of its models and improve its performance. Changepoint detection allows for dynamic and robust spectrum sharing even as interference patterns change suddenly and dramatically. BOCD is especially advantageous because it operates online, allowing models to be updated continuously as data are collected. This strategy can also be applied to other predictive algorithms that create and maintain models in changing environments.
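As a concrete illustration of the BOCD recursion this abstract relies on, the sketch below runs the Adams-MacKay run-length update over a binary channel-occupancy stream. The Beta-Bernoulli observation model, hazard rate, and priors are illustrative assumptions, not parameters from the paper's sense-and-predict integration.

```python
import numpy as np

def bocd_bernoulli(xs, hazard=1 / 100, a0=1.0, b0=1.0):
    """Run-length posterior for Bayesian online changepoint detection
    (Adams & MacKay, 2007) with a Beta-Bernoulli occupancy model."""
    T = len(xs)
    R = np.zeros((T + 1, T + 1))     # R[t, r] = P(run length r at step t)
    R[0, 0] = 1.0
    alpha, beta = np.array([a0]), np.array([b0])  # one pair per run length
    for t, x in enumerate(xs, start=1):
        p1 = alpha / (alpha + beta)               # predictive P(x=1 | run)
        pred = p1 if x else 1.0 - p1
        R[t, 1:t + 1] = R[t - 1, :t] * pred * (1 - hazard)  # run grows
        R[t, 0] = np.sum(R[t - 1, :t] * pred * hazard)      # changepoint
        R[t, :t + 1] /= R[t, :t + 1].sum()
        alpha = np.concatenate(([a0], alpha + x))  # update sufficient stats
        beta = np.concatenate(([b0], beta + (1 - x)))
    return R

# A synthetic occupancy trace whose busy rate flips at t=100
rng = np.random.default_rng(0)
xs = np.concatenate([rng.binomial(1, 0.8, 100), rng.binomial(1, 0.1, 100)])
R = bocd_bernoulli(xs)
print("most likely run length at t=120:", int(np.argmax(R[120])))
```

After the simulated interference pattern changes at t=100, the posterior mass collapses onto short run lengths, which is exactly the signal a predictive model can use to discard pre-changepoint data.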
Underfrequency (UF) load shedding schemes are traditionally implemented in two ways: One approach is based on manual load shedding, with system operators requesting loads to be shed ahead of anticipated stressful oper...
Breast cancer is a prevalent and diverse type of cancer that exhibits unique clinicopathologic characteristics, making the correct identification of its subtype critical to providing targeted treatment and increasing survival rates. This identification process involves testing for the presence of four key molecular biomarkers, namely estrogen receptor (ER), progesterone receptor (PR), human epidermal growth factor receptor 2 (HER2), and antigen Ki67. Accurate diagnosis requires the expertise of a pathologist and immunohistochemistry. To overcome this diagnostic challenge, we present a novel approach based on a deep learning pipeline for automated classification, which can distinguish tumoral from non-tumoral regions in HER2-stained tissue. Our deep learning framework comprises a Dense Convolutional Network (DenseNet) that processes whole slide images (WSIs) of breast tissue, dividing them into patches for input into the network. Moreover, our approach provides both patchwise and pixelwise classification and is evaluated on ten WSIs of breast cancer histology. The proposed approach generates an image map that classifies slide images at the pixel level, detecting the status of the HER2 hormone receptor as either positive or negative. The obtained results show that our deep learning-based approach has the potential to enhance the pathologist's capabilities in diagnosing histopathological images with automated classification.
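A minimal sketch of the patch-level stage described above: a torchvision DenseNet backbone with a two-class head labels tiles cut from a WSI, and the labels are stitched back into a coarse slide map. The patch size, tile grid, and binary head are assumptions made for illustration; the paper's trained weights and pixelwise stage are not reproduced here.

```python
import torch
import torch.nn as nn
from torchvision import models

# DenseNet backbone with a binary head (HER2-positive vs. HER2-negative).
model = models.densenet121(weights="IMAGENET1K_V1")
model.classifier = nn.Linear(model.classifier.in_features, 2)
model.eval()

def classify_patches(patches):
    """patches: (N, 3, 224, 224) tensor of tiles cut from a WSI.
    Returns one HER2 label per patch (0 = negative, 1 = positive)."""
    with torch.no_grad():
        logits = model(patches)
    return logits.argmax(dim=1)

# Stitch patch labels back into a coarse image map of the slide
patches = torch.randn(16, 3, 224, 224)   # stand-in for real stained tiles
label_map = classify_patches(patches).reshape(4, 4)
print(label_map)
```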
In this paper, we propose a digital twin (DT)-assisted resource demand prediction scheme to enhance prediction accuracy for multicast short video streaming. In particular, we first construct user DTs (UDTs) for collecti...
ISBN (digital): 9798350351507
ISBN (print): 9798350363067
This work is a full research-to-practice paper that describes a predictive method to improve the prediction of student test scores. Predicting student test scores is difficult. However, doing so can greatly improve education by improving advising, scheduling, tutoring assignment, and other educational processes. This research extends previous research by using a domain space reduction technique to improve accuracy. Factor Analysis is used to reduce the number of domain attributes to improve the accuracy of a Neural Network that predicts student test scores. In this research, datasets of high school student test scores in Mathematics and Language were used. Test scores were predicted using a Neural Network, with the Mean Absolute Error computed as the measurement of accuracy. The datasets have 30 domain attributes each. Factor Analysis was used to reduce the domain to sizes ranging from 1 to 29, each time training the Neural Network on the reduced data. Because the Mean Absolute Error may vary depending upon which records in the dataset are used for training versus testing, 50 trials were executed for each domain size, producing an Average Mean Absolute Error for each domain size. A statistical test was used to show statistical significance between the Neural Network without Factor Analysis and the Neural Network with varying domain sizes using Factor Analysis. Results were very promising and correspond to previous research that used Principal Component Analysis. Numerous domain sizes had significantly better Average Mean Absolute Errors than the Neural Network without Factor Analysis. This research shows that reducing the domain size using Factor Analysis can greatly improve the accuracy of Neural Networks when predicting student test scores. The best improvements occurred when domain sizes were very small, ranging from 2 to 6. Domain reduction techniques, such as Factor Analysis, have been shown to improve predictive models for student test score prediction. Future research...
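The evaluation loop the abstract outlines can be sketched as follows: project the 30 attributes onto k factors, train a neural network on the reduced data, and average the MAE over repeated random splits. Synthetic regression data stands in for the Mathematics and Language datasets, and the network size and split ratio here are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.decomposition import FactorAnalysis
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Placeholder for the student datasets: 30 domain attributes per record
X, y = make_regression(n_samples=400, n_features=30, noise=5.0, random_state=0)

def avg_mae(k, trials=50):
    """Average MAE over repeated random splits; k=None skips Factor Analysis."""
    maes = []
    for seed in range(trials):
        Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3,
                                              random_state=seed)
        if k is not None:
            fa = FactorAnalysis(n_components=k, random_state=seed).fit(Xtr)
            Xtr, Xte = fa.transform(Xtr), fa.transform(Xte)
        net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000,
                           random_state=seed).fit(Xtr, ytr)
        maes.append(mean_absolute_error(yte, net.predict(Xte)))
    return np.mean(maes)

print("baseline (no FA):", avg_mae(None, trials=5))
for k in (2, 4, 6):                     # the range the paper found best
    print(f"k={k} factors:", avg_mae(k, trials=5))
```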
In the wake of disasters, rapid and efficient search and rescue operations are essential. Unmanned aerial vehicles (UAVs) have become instrumental in such scenarios, providing real-time video streaming that can be use...
Sim-to-real transfer, which trains RL agents in the simulated environments and then deploys them in the real world, has been widely used to overcome the limitations of gathering samples in the real world. Despite the ...
ISBN (digital): 9798350367560
ISBN (print): 9798350367577
The COVID-19 outbreak demonstrated the significance of computer models in comprehending the complex dynamics of disease transmission. Epidemiological models have played a crucial role in understanding the complexities of COVID-19, developing effective strategies for its containment, and supporting theories regarding its transmission. The SIR (Susceptible-Infectious-Recovered) paradigm classifies people into three separate categories: susceptible, infectious, and recovered. Agent-Based Models (ABMs) are advanced frameworks that depict the actions and relationships of people within a community, including their various demographics and habits. These models have been utilized to analyze the impact of COVID-19 on specific locations and healthcare systems, and the efficacy of various treatments. However, they may face difficulties in specific situations due to the need for significant computing power and data for optimization. Overall, computer models have been useful in studying disease transmission and creating effective antiviral therapies.
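To make the SIR paradigm concrete, the sketch below integrates the three compartments with a simple forward-Euler step. The population size, transmission rate beta, and recovery rate gamma are illustrative values, not fitted COVID-19 parameters.

```python
import numpy as np

def sir(S0, I0, R0, beta, gamma, days, dt=0.1):
    """Forward-Euler integration of the SIR compartments."""
    N = S0 + I0 + R0
    S, I, R = float(S0), float(I0), float(R0)
    out = []
    for _ in range(int(days / dt)):
        new_inf = beta * S * I / N * dt   # susceptible -> infectious
        new_rec = gamma * I * dt          # infectious -> recovered
        S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
        out.append((S, I, R))
    return np.array(out)

# Illustrative run: beta/gamma = 3, so the epidemic peaks and burns out
traj = sir(S0=9990, I0=10, R0=0, beta=0.3, gamma=0.1, days=160)
print("peak infectious:", int(traj[:, 1].max()))
```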
We present VBPI-Mixtures, an algorithm designed to enhance the accuracy of phylogenetic posterior distributions, particularly for tree-topology and branch-length approximations. Despite the Variational Bayesian Phylog...
This paper investigates the usage of hybrid automatic repeat request (HARQ) protocols for power-efficient and reliable communications over free space optical (FSO) links. By exploiting the large coherence time of the FSO channel, the proposed transmission schemes combat turbulence-induced fading by retransmitting failed packets within the same coherence interval. To assess the performance of the presented HARQ technique, we derive a theoretical framework for the outage performance. In more detail, a closed-form expression for the outage probability (OP) is reported, and an approximation for the high signal-to-noise ratio (SNR) region is derived. Building upon the theoretical framework, we formulate a transmission power allocation problem over the retransmission rounds. This optimization problem is solved numerically through an iterative algorithm. In addition, the average throughput of the HARQ schemes under consideration is examined. Simulation results validate the theoretical analysis under different turbulence conditions and demonstrate the performance improvement, in terms of both OP and throughput, of the proposed HARQ schemes compared to fixed transmit power HARQ benchmarks.
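A Monte Carlo sketch of the outage metric the abstract analyzes: the fading gain is drawn once and held fixed across retransmission rounds (one coherence interval), and outage is declared when the mutual information accumulated over the rounds falls short of the rate target. Log-normal turbulence, the rate, and the power schedules are stand-in assumptions; the paper's closed-form OP expression is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

def outage_prob(powers, rate=2.0, sigma=0.5, n=200_000):
    """HARQ-IR outage over n fading realizations. The gain h is constant
    across all rounds of one packet (large FSO coherence time)."""
    h = rng.lognormal(mean=-sigma**2 / 2, sigma=sigma, size=n)  # unit-mean gain
    acc = sum(np.log2(1.0 + p * h) for p in powers)  # accumulated bits/use
    return np.mean(acc < rate)

# Same total power across three rounds, two different allocations
print("equal power OP :", outage_prob([1.0, 1.0, 1.0]))
print("ramped power OP:", outage_prob([0.5, 1.0, 1.5]))
```

Because the channel does not change between rounds, back-loading power into later retransmissions only spends the extra energy when earlier rounds have already failed, which is the intuition behind the paper's power allocation problem.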