Interactive Whiteboard (IWB) systems are increasingly replacing traditional whiteboards in educational activities, making them more efficient and engaging for both teachers and students. However, no studies have investigated how age, educational level, and employment status affect the adoption and use of IWB systems in educational settings. This study analyzes faculty members' acceptance of IWB systems in teaching, and the factors influencing their behavioral intention and usage behavior, using a modified Unified Theory of Acceptance and Use of Technology 2 (UTAUT2) model. A total of 135 faculty members from various colleges and departments of Batangas State University ARASOF-Nasugbu Campus were surveyed, and the data were analyzed with Partial Least Squares Structural Equation Modeling (PLS-SEM) in SmartPLS to test the hypotheses. The results indicate that faculty members' behavioral intention to use IWB systems in teaching is positively influenced by performance expectancy and social influence, while facilitating conditions and habit positively affect usage behavior. The study therefore concludes that performance expectancy and social influence play significant roles in, and age significantly moderates, the behavioral intention to use IWB systems in education. Furthermore, it confirms that effort expectancy, hedonic motivation, and employment status, as moderating variables, significantly influence faculty members' usage behavior with IWB systems in teaching.
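As a rough illustration of the hypothesis testing described above, here is a minimal sketch of structural-path estimation with bootstrapped significance. The construct scores are synthetic placeholders (pe, si, and bi stand for performance expectancy, social influence, and behavioral intention), and ordinary least squares stands in for the inner (structural) model that SmartPLS estimates with full PLS-SEM.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 135                                   # sample size reported in the abstract
    pe = rng.normal(size=n)                   # performance expectancy (placeholder scores)
    si = rng.normal(size=n)                   # social influence (placeholder scores)
    bi = 0.4 * pe + 0.3 * si + rng.normal(scale=0.8, size=n)   # behavioral intention

    X = np.column_stack([pe, si])

    def path_coefs(X, y):
        # OLS stand-in for the inner (structural) model of PLS-SEM.
        Xc = X - X.mean(axis=0)
        return np.linalg.lstsq(Xc, y - y.mean(), rcond=None)[0]

    est = path_coefs(X, bi)
    # Bootstrap the path coefficients to approximate their sampling distribution.
    boot = np.array([path_coefs(X[idx], bi[idx])
                     for idx in (rng.integers(0, n, n) for _ in range(2000))])
    se = boot.std(axis=0)
    for name, b, s in zip(["PE -> BI", "SI -> BI"], est, se):
        print(f"{name}: beta = {b:.3f}, t = {b / s:.2f}")   # |t| > 1.96 ~ significant at 5%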
One of the new tools included in the AV1 video codec is the adaptive filtering scheme used in the sample interpolation process. This scheme includes three filter families, called Regular, Sharp, and Smooth, offering high flexibility for motion estimation (ME) and motion compensation (MC). However, the large number of interpolation filters also leads to greater complexity and energy consumption, since generating samples at sub-pixel positions is a costly process. This paper proposes a low-power, high-throughput hardware accelerator for the AV1 interpolation filters, called the Multiversion Interpolation Processor (MVIP). The accelerator implements all three AV1 interpolation filter families, with versions that employ operand isolation to reduce power in unused filters. It includes a precise MVIP version for the MC scenario, as well as two approximate versions that reduce cost in the ME scenario. The proposed design can process 8K video at 50 fps in MC and 2,656.14 Msamples/s in ME, with a power dissipation of 41.30 mW.
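For context, sub-pixel interpolation in AV1 applies separable 8-tap filters such as the families named above. The sketch below shows the arithmetic of one horizontal half-pel filter; the tap values are illustrative placeholders, not the normative AV1 Regular/Sharp/Smooth coefficient tables.

    import numpy as np

    TAPS = np.array([-1, 4, -11, 72, 72, -11, 4, -1])   # placeholder half-pel taps, sum = 128
    SHIFT = 7                                            # normalization: 128 = 2**7

    def interp_half_pel(row, x):
        # Half-pel sample between row[x] and row[x + 1] from 8 integer neighbours.
        window = row[x - 3 : x + 5].astype(np.int32)     # 3 samples left, 4 right
        acc = int(np.dot(window, TAPS))
        return int(np.clip((acc + (1 << (SHIFT - 1))) >> SHIFT, 0, 255))

    row = np.array([10, 12, 20, 40, 80, 120, 140, 150, 152, 150], dtype=np.uint8)
    print(interp_half_pel(row, 4))   # sample halfway between positions 4 and 5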
The agent learns to organize decision behavior to achieve a behavioral goal, such as reward maximization, and reinforcement learning is often used for this optimization. Learning an optimal behavioral strategy is diff...
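As general background on reward-maximizing strategy learning, here is a textbook tabular Q-learning sketch on a toy chain environment; it is a generic illustration, not the model studied in the abstract above.

    import numpy as np

    rng = np.random.default_rng(1)
    n_states, n_actions = 5, 2
    Q = np.zeros((n_states, n_actions))
    alpha, gamma, eps = 0.1, 0.9, 0.1     # learning rate, discount, exploration

    def step(s, a):
        # Toy chain environment: action 1 moves right; reward at the far end.
        s2 = min(s + 1, n_states - 1) if a == 1 else max(s - 1, 0)
        return s2, float(s2 == n_states - 1)

    for _ in range(2000):
        s = 0
        for _ in range(20):
            a = rng.integers(n_actions) if rng.random() < eps else int(Q[s].argmax())
            s2, r = step(s, a)
            Q[s, a] += alpha * (r + gamma * Q[s2].max() - Q[s, a])
            s = s2

    print(Q.argmax(axis=1))   # learned policy: move right in every state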
Introduction: The intensive care of neonates is associated with the entry of a large volume of data in their medical records. The treatment of this data can be done through Machine Learning: a tool capable of assistin...
Mitochondrial division inhibitor 1 (Mdivi-1) is a well-known synthetic compound aimed at inhibiting dynamin-related protein 1 (Drp1) to suppress mitochondrial fission, making it a valuable tool for studying mitochondr...
The Versatile Video Coding (VVC) standard introduced several novel encoding tools for intra-frame prediction, increasing encoder complexity compared to previous standards. Among these novelties is the Matrix-based Intra Prediction (MIP) tool, whose acceleration has not yet been tackled in the literature. Therefore, an efficient parallelization of MIP prediction targeting GPU platforms is proposed. The presented technique uses alternative reference samples and computes the distortion in an approximate manner to expose and exploit massive parallelism. Moreover, the prediction scheduling and memory communication were tailored to the GPUs' architecture and memory hierarchy. Compared with a CPU execution, this work accelerates MIP prediction by up to 105 times at the cost of a negligible coding-efficiency loss of 0.284% BD-BR.
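One plausible way to evaluate many candidates with an approximate distortion, in the spirit described above, is to subsample the rows of a SAD computation across all candidates at once. In this sketch, numpy vectorization stands in for GPU thread-level parallelism, and row subsampling is one illustrative approximation; the paper's exact distortion scheme may differ.

    import numpy as np

    rng = np.random.default_rng(2)
    block = rng.integers(0, 256, (16, 16))        # original 16x16 block
    preds = rng.integers(0, 256, (32, 16, 16))    # 32 candidate predictions

    def approx_sad(block, preds, row_step=2):
        # SAD over every other row: cheaper, and usually rank-preserving.
        diff = np.abs(preds[:, ::row_step, :] - block[::row_step, :])
        return diff.sum(axis=(1, 2))              # one cost per candidate

    costs = approx_sad(block, preds)
    print("best candidate:", int(costs.argmin()))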
Sudden cardiac arrest (SCA) poses a significant health challenge, necessitating accurate predictions of neurological outcomes in comatose patients, where good outcomes are defined as the recovery of most cognitive fun...
Population diversity management is crucial for the quality of solutions in Evolutionary Algorithms. Many techniques require assistance to handle diverse problem characteristics and may prematurely converge in local op...
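As background, a common population-diversity measure for evolutionary algorithms is the mean pairwise distance between individuals. The sketch below is generic textbook material, not the management technique proposed in the abstract above.

    import numpy as np

    def mean_pairwise_distance(pop):
        # pop: (n_individuals, n_genes) array of real-coded genomes.
        diff = pop[:, None, :] - pop[None, :, :]
        d = np.sqrt((diff ** 2).sum(-1))
        n = len(pop)
        return d.sum() / (n * (n - 1))    # average over ordered pairs, excluding self-pairs

    rng = np.random.default_rng(3)
    print(mean_pairwise_distance(rng.normal(size=(50, 10))))   # diverse population
    print(mean_pairwise_distance(np.ones((50, 10))))           # fully converged: 0.0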
Basement relief gravimetry is a key application in geophysics, particularly important for oil exploration and mineral prospecting. It involves solving an inverse geophysical problem, in which the parameters of a geological model are inferred from observed data. In this context, the geological model consists of the depths of constant-density prisms representing the basement relief, and the data correspond to the gravitational anomalies caused by these prisms. Inverse geophysical problems are typically ill-posed in the sense of Hadamard, meaning that small perturbations in the data can produce large variations in the solutions. To address this instability, regularization techniques, such as those proposed by Tikhonov, are employed to stabilize the solutions. This study presents a comparative analysis of several regularization techniques applied to the gravimetric inversion problem: Smoothness Constraints, Total Variation, the Discrete Cosine Transform (DCT), and the Discrete Wavelet Transform (DWT) with Daubechies D4 wavelets. Optimization methods are commonly used in inverse geophysical problems because of their ability to find the parameters that minimize the objective function, in this case the depths of the prisms that best explain the observed gravitational anomalies. The Genetic Algorithm (GA) was selected as the optimization technique. The GA is based on Darwinian evolutionary theory, specifically the principle of natural selection, whereby the fittest individuals in a population are selected to pass on their traits; in optimization, this translates to selecting the solutions that most effectively minimize the objective function. The results, evaluated using fit metrics and cumulative error analysis, demonstrate the effectiveness of all the regularization techniques and of the Genetic Algorithm. Among the methods tested, the Smoothness constraint was the most effective for the first and second synthetic models. For the third model, which was based on re...
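A minimal sketch of a GA minimizing a smoothness-regularized (first-order Tikhonov) objective for the prism depths follows. The forward model is a deliberate simplification (a per-station Bouguer slab, g = 2*pi*G*drho*h, reported in mGal), and the regularization weight lam is illustrative; the study's actual forward model sums the full prism responses.

    import numpy as np

    rng = np.random.default_rng(4)
    G, DRHO = 6.674e-11, 400.0                  # gravitational constant (SI); contrast in kg/m^3
    n_prisms, lam = 30, 1e-5                    # lam: illustrative regularization weight

    def forward(h):
        # Per-station Bouguer-slab anomaly in mGal for depths h (metres).
        return 2 * np.pi * G * DRHO * h / 1e-5

    true_h = 500 + 300 * np.sin(np.linspace(0, np.pi, n_prisms))
    g_obs = forward(true_h)

    def objective(h):
        misfit = np.sum((g_obs - forward(h)) ** 2)
        smooth = np.sum(np.diff(h) ** 2)         # first-order smoothness constraint
        return misfit + lam * smooth

    pop = rng.uniform(100.0, 1500.0, (80, n_prisms))
    for _ in range(500):
        fit = np.array([objective(ind) for ind in pop])
        elite = pop[fit.argmin()].copy()         # elitism: keep the best solution
        a, b = rng.integers(0, 80, (2, 80))      # binary tournament selection
        parents = np.where((fit[a] < fit[b])[:, None], pop[a], pop[b])
        w = rng.random((80, 1))                  # arithmetic (blend) crossover
        pop = w * parents + (1 - w) * parents[::-1]
        pop += rng.normal(scale=5.0, size=pop.shape)   # Gaussian mutation
        pop[0] = elite
    best = pop[np.array([objective(ind) for ind in pop]).argmin()]
    print("max depth error (m):", np.abs(best - true_h).max())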
This paper aims to investigate the mathematical problem-solving capabilities of Chat Generative Pre-Trained Transformer (ChatGPT) in the case of Bayesian reasoning. The study draws inspiration from Zhu & Gigerenzer...
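For reference, the kind of task Zhu & Gigerenzer studied contrasts the probability format of Bayes' theorem with the natural-frequency format. The worked numbers below are the classic mammography problem (1% base rate, 80% sensitivity, 9.6% false-alarm rate), not data from this paper.

    # Probability format: apply Bayes' theorem directly.
    base_rate, sensitivity, false_alarm = 0.01, 0.80, 0.096
    posterior = (sensitivity * base_rate) / (
        sensitivity * base_rate + false_alarm * (1 - base_rate))

    # Natural-frequency format: of 1000 people, 10 are sick and 8 of them
    # test positive; of the 990 healthy, about 95 test positive anyway.
    sick_pos = 1000 * base_rate * sensitivity
    healthy_pos = 1000 * (1 - base_rate) * false_alarm
    posterior_nf = sick_pos / (sick_pos + healthy_pos)

    print(f"P(sick | positive) = {posterior:.3f} = {posterior_nf:.3f}")   # ~0.078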