ISBN (print): 9781450394215
Competence Assessment by Chunk Hierarchy Evaluation with Transcription-tasks (CACHET) was proposed by Cheng [14]. It analyses micro-behaviors captured during cycles of stimulus viewing and copying in order to probe chunk structures in memory. This study extends CACHET by applying it to the domain of graphs and charts. Since drawing strategies are diverse, a new interactive stimulus presentation method is introduced: Transcription with Incremental Presentation of the Stimulus (TIPS). TIPS aims to reduce the strategy variations that mask the chunking signal by giving users manual element-by-element control over the display of the stimulus. The potential of TIPS is shown by an analysis of six participants' transcriptions of stimuli at different levels of familiarity and complexity, which reveals clear signals of chunking. To understand how chunk size and individual differences drive TIPS measurements, a CPM-GOMS model was constructed to formalize the cognitive processes involved in stimulus comprehension and chunk creation.
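To make the chunking idea concrete, here is a minimal sketch of boundary detection from inter-action pauses. It assumes timestamped transcription events and a single pause threshold; the `Event` fields, the threshold value, and `segment_chunks` are all illustrative simplifications, not Cheng's actual measures.

```python
from dataclasses import dataclass

@dataclass
class Event:
    element: str    # stimulus element drawn or revealed (hypothetical field)
    t_start: float  # seconds
    t_end: float

def segment_chunks(events, pause_threshold=1.0):
    """Group a transcription event stream into putative chunks.

    A pause between consecutive events longer than `pause_threshold`
    is taken as a chunk boundary -- a crude stand-in for the
    micro-behavior measures CACHET actually uses.
    """
    chunks, current = [], []
    for prev, nxt in zip(events, events[1:]):
        current.append(prev.element)
        if nxt.t_start - prev.t_end > pause_threshold:
            chunks.append(current)
            current = []
    current.append(events[-1].element)
    chunks.append(current)
    return chunks

events = [Event("axis-x", 0.0, 0.8), Event("axis-y", 1.0, 1.7),
          Event("bar-1", 3.5, 4.2), Event("bar-2", 4.4, 5.0)]
print(segment_chunks(events))  # [['axis-x', 'axis-y'], ['bar-1', 'bar-2']]
```

In this toy stream, the long gap before `bar-1` is read as a retrieval pause, splitting the transcription into two putative chunks.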
We present a novel methodology for crafting effective public messages by combining large language models (LLMs) and conjoint analysis. Our approach personalizes messages for diverse personas – context-specific archet...
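As a rough illustration of combining conjoint analysis with LLM generation, the sketch below enumerates a full factorial design over message attributes and crosses it with personas. The attribute levels, personas, and `generate_message` (a stand-in for an LLM call) are all hypothetical, not the paper's setup.

```python
from itertools import product

# Conjoint-style attribute levels for a public message (illustrative).
ATTRIBUTES = {
    "tone":     ["urgent", "reassuring"],
    "framing":  ["gain", "loss"],
    "evidence": ["statistic", "anecdote"],
}

PERSONAS = ["risk-averse retiree", "busy young parent"]

def generate_message(persona, profile):
    """Hypothetical stand-in for an LLM call that drafts a message
    for `persona` using the attribute levels in `profile`."""
    return (f"[{persona}] tone={profile['tone']}, "
            f"framing={profile['framing']}, evidence={profile['evidence']}")

# Full factorial design: every combination of attribute levels, crossed
# with every persona, yields the candidate set that a conjoint experiment
# would then ask respondents to rate.
profiles = [dict(zip(ATTRIBUTES, levels))
            for levels in product(*ATTRIBUTES.values())]
candidates = [(p, prof, generate_message(p, prof))
              for p in PERSONAS for prof in profiles]
print(len(candidates))  # 2 personas x 8 profiles = 16 candidates
```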
ISBN (print): 9781450394215
While there has been rapid growth in smart home research from a technical perspective - focusing on home automation, devices, software, and protocols - few review papers examine the human-centered perspective. A human-centered focus is crucial for achieving the goals of providing natural, convenient, comfortable, friendly, and safe user experiences in the smart home. To understand key innovations in human-centered smart home research, we analyzed keyword changes over time across 19,091 papers from 2000 to 2022, then selected 55 papers from high-impact venues in the last five years and summarized them through a combination of qualitative and quantitative methods. Our analysis revealed five research trends with unique characteristics and interdependence. Drawing on this review, we elaborate on the future of smart home design research with respect to multidisciplinary development, stakeholder involvement, and the shift of design implications.
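The keyword-trend analysis the abstract describes reduces to counting keyword frequencies per year. A minimal sketch, assuming a toy corpus of (year, keywords) records in place of the real 19,091-paper dataset:

```python
from collections import Counter, defaultdict

# Toy stand-in for the paper corpus: (year, keywords) records.
papers = [
    (2001, ["home automation", "protocols"]),
    (2015, ["home automation", "voice assistant"]),
    (2021, ["voice assistant", "privacy"]),
    (2022, ["privacy", "older adults"]),
]

by_year = defaultdict(Counter)
for year, keywords in papers:
    by_year[year].update(keywords)

# Relative frequency per year highlights rising and fading topics.
for year in sorted(by_year):
    total = sum(by_year[year].values())
    trend = {k: round(c / total, 2) for k, c in by_year[year].items()}
    print(year, trend)
```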
Energy-efficient computing uses power management techniques such as frequency scaling to save energy. Implementing energy-efficient techniques on large-scale computing systems is challenging for several reasons. While...
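For intuition about frequency scaling, here is a textbook-style sketch (not from the paper): runtime scales as work/f, dynamic power roughly as f cubed, and the lowest-energy frequency that still meets a deadline wins. The constants `p_static_w` and `k` are assumed values.

```python
def best_frequency(freqs_ghz, work_gcycles, deadline_s,
                   p_static_w=2.0, k=1.5):
    """Pick the frequency minimizing energy under a deadline.

    Illustrative model: runtime t = work / f, dynamic power ~ k * f^3,
    so energy E(f) = (p_static + k * f^3) * work / f.
    """
    best = None
    for f in freqs_ghz:
        t = work_gcycles / f
        if t > deadline_s:
            continue  # this frequency misses the deadline
        energy = (p_static_w + k * f ** 3) * t
        if best is None or energy < best[1]:
            best = (f, energy)
    return best

# The slowest deadline-feasible frequency is often, but not always, best:
# static power makes running too slowly waste energy too.
print(best_frequency([1.0, 1.5, 2.0, 2.5], work_gcycles=10, deadline_s=8))
```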
ISBN (print): 9781450392365
Big data distributed computing systems such as Apache Hadoop must process massive amounts of data to support business and research applications. Thus, it is critical to ensure the cyber security of such systems. To better defend against advanced cyber attacks that pose threats even to well-protected enterprises, system-auditing-based techniques have been adopted for monitoring system activities and assisting attack investigation. In this demo, we build a system that collects system auditing logs from a big data system and performs data analysis to understand how system auditing can be used more effectively to assist attack investigation on big data systems. We also built a demo application that detects unexpected file deletions and presents the root causes of those deletions.
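Root-cause analysis over audit logs is commonly framed as backward traversal of a provenance graph. A minimal sketch, assuming a simplified (subject, action, object) event format rather than any actual auditd schema; the process names are invented:

```python
from collections import defaultdict

# Simplified audit events: (subject process, action, object).
events = [
    ("sshd",       "fork",   "bash"),
    ("bash",       "fork",   "cleanup.sh"),
    ("cleanup.sh", "unlink", "/data/hdfs/block_1042"),
]

parents = defaultdict(list)   # object -> subjects that acted on it
for subj, action, obj in events:
    parents[obj].append((subj, action))

def backtrack(target):
    """Walk causal edges backwards from a suspicious object
    (e.g. a deleted file) toward its root causes."""
    chain, frontier, seen = [], [target], {target}
    while frontier:
        node = frontier.pop()
        for subj, action in parents.get(node, []):
            chain.append((subj, action, node))
            if subj not in seen:     # guard against cycles in real logs
                seen.add(subj)
                frontier.append(subj)
    return chain

for step in backtrack("/data/hdfs/block_1042"):
    print(step)
# ('cleanup.sh', 'unlink', '/data/hdfs/block_1042')
# ('bash', 'fork', 'cleanup.sh')
# ('sshd', 'fork', 'bash')
```

Starting from the unexpected deletion, the trace walks back through the script that issued the unlink to the shell and remote session that launched it.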
Computer-mediated collaboration often relies on symmetrical interactions between users, where all the collaborators use identical devices. However, in some cases, either due to constraints (e.g. users in different env...
ISBN (print): 9781450392051
General-purpose GPUs have become common in modern computing systems to accelerate applications in many domains, including machine learning, high-performance computing, and autonomous driving. However, inefficiencies abound in GPU-accelerated applications, which prevent them from obtaining bare-metal performance. Performance tools play an important role in understanding performance inefficiencies in complex code bases. Many GPU performance tools pinpoint time-consuming code and provide high-level performance insights but overlook one important class of performance issue: value-related inefficiencies, which exist in many GPU code bases. In this paper, we present VALUEEXPERT, a novel tool to pinpoint value-related inefficiencies in GPU applications. VALUEEXPERT monitors application execution to capture values produced and used by each load and store operation in GPU kernels, recognizes multiple value patterns, and provides intuitive optimization guidance. We address systemic challenges in collecting, maintaining, and analyzing voluminous performance data from many GPU threads to make VALUEEXPERT applicable to complex applications. We evaluate VALUEEXPERT on a wide range of well-tuned benchmarks and applications, including PyTorch, Darknet, LAMMPS, Castro, and many others. VALUEEXPERT identifies previously unknown performance issues and suggests nontrivial performance improvements, typically requiring fewer than five lines of code changes. We verified our optimizations with application developers and upstreamed fixes to their repositories.
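Two classic value patterns such tools flag are silent stores (rewriting a value already in memory) and redundant loads (re-reading an unchanged value). A minimal sketch over an invented (op, address, value) trace format; real tools intercept every load and store inside GPU kernels:

```python
# Toy memory trace: (op, address, value). The format is invented.
trace = [
    ("store", 0x100, 3.0),
    ("store", 0x100, 3.0),   # silent store: rewrites the same value
    ("load",  0x104, 7.0),
    ("load",  0x104, 7.0),   # redundant load: same address, same value
]

def find_redundant(trace):
    """Flag back-to-back operations that see the same value at the
    same address -- candidates for elimination."""
    last = {}     # (op, addr) -> last value seen
    issues = []
    for i, (op, addr, val) in enumerate(trace):
        if last.get((op, addr)) == val:
            kind = "silent store" if op == "store" else "redundant load"
            issues.append((i, kind, hex(addr)))
        last[(op, addr)] = val
    return issues

print(find_redundant(trace))
# [(1, 'silent store', '0x100'), (3, 'redundant load', '0x104')]
```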
Analyzing data subgroups is a common data science task to build intuition about a dataset and identify areas to improve model performance. However, subgroup analysis is prohibitively difficult in datasets with many fe...
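The core operation of a subgroup analysis is computing a model-quality metric per data slice. A minimal sketch, assuming toy rows of (feature dict, per-example model error); the features and values are invented:

```python
from collections import defaultdict

# Toy stand-in for a real dataset: (feature dict, model error) rows.
rows = [
    ({"age": "18-30", "region": "urban"}, 0.10),
    ({"age": "18-30", "region": "rural"}, 0.35),
    ({"age": "60+",   "region": "rural"}, 0.40),
    ({"age": "60+",   "region": "urban"}, 0.12),
]

def subgroup_errors(rows, feature):
    """Mean model error per subgroup defined by one feature --
    the basic unit of a subgroup analysis."""
    groups = defaultdict(list)
    for feats, err in rows:
        groups[feats[feature]].append(err)
    return {k: sum(v) / len(v) for k, v in groups.items()}

# Scanning each feature surfaces slices where the model underperforms;
# with many features, the number of candidate slices explodes, which is
# exactly what makes subgroup analysis hard at scale.
for feature in ("age", "region"):
    print(feature, subgroup_errors(rows, feature))
# age {'18-30': 0.225, '60+': 0.26}
# region {'urban': 0.11, 'rural': 0.375}
```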
Recent advances in automatic code generation have made tools like GitHub Copilot attractive for programmers, as they allow for the creation of code blocks by simply providing descriptive prompts to the AI. While resea...
Tensor decomposition is a fundamental multi-dimensional data analysis tool for many data-driven applications. However, rapidly growing data volumes require an efficient distributed dynamic tensor decomposition without...
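As background only, here is a plain single-node CP decomposition via alternating least squares (CP-ALS); the paper's distributed dynamic setting (streaming updates across workers) is well beyond this sketch, and none of this code is the paper's algorithm.

```python
import numpy as np

def khatri_rao(a, b):
    """Column-wise Khatri-Rao product; rows indexed (i, j), j fastest."""
    return np.einsum("ir,jr->ijr", a, b).reshape(a.shape[0] * b.shape[0], -1)

def cp_als(X, rank, iters=50, seed=0):
    """Rank-R CP decomposition of a 3-way tensor by alternating
    least squares: X[i,j,k] ~ sum_r A[i,r] * B[j,r] * C[k,r]."""
    I, J, K = X.shape
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((n, rank)) for n in (I, J, K))
    X1 = X.transpose(0, 2, 1).reshape(I, -1)   # mode-1 unfolding
    X2 = X.transpose(1, 2, 0).reshape(J, -1)   # mode-2 unfolding
    X3 = X.transpose(2, 1, 0).reshape(K, -1)   # mode-3 unfolding
    for _ in range(iters):
        A = X1 @ khatri_rao(C, B) @ np.linalg.pinv((C.T @ C) * (B.T @ B))
        B = X2 @ khatri_rao(C, A) @ np.linalg.pinv((C.T @ C) * (A.T @ A))
        C = X3 @ khatri_rao(B, A) @ np.linalg.pinv((B.T @ B) * (A.T @ A))
    return A, B, C

# Sanity check: recover an exactly rank-2 tensor.
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((n, 2)) for n in (4, 5, 6))
X = np.einsum("ir,jr,kr->ijk", A0, B0, C0)
A, B, C = cp_als(X, rank=2)
Xhat = np.einsum("ir,jr,kr->ijk", A, B, C)
print(np.linalg.norm(X - Xhat) / np.linalg.norm(X))  # small if ALS converged
```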