Persistent homology barcodes and diagrams are a cornerstone of topological data analysis that capture the "shape" of a wide range of complex data structures, such as point clouds, networks, and functions. However, their use in statistical settings is challenging due to their complex geometric structure. In this paper, we revisit the persistent homology rank function, which is mathematically equivalent to a barcode and persistence diagram, as a tool for statistics and machine learning. Rank functions, being functions, enable the direct application of the statistical theory of functional data analysis (FDA), a domain of statistics adapted for data in the form of functions. A key challenge they present over barcodes in practice, however, is their lack of stability, a property that is crucial to validate their use as a faithful representation of the data and therefore a viable summary statistic. In this paper, we fill this gap by deriving two stability results for persistent homology rank functions under a suitable metric for FDA integration. We then study the performance of rank functions in functional inferential statistics and machine learning on real data applications, in both single and multiparameter persistent homology. We find that the use of persistent homology captured by rank functions offers a clear improvement over existing non-persistence-based approaches.
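For concreteness, the sketch below (a minimal illustration, not the paper's implementation) shows how a persistence diagram can be discretised into a rank function on a grid so it can be fed to standard FDA tooling: r(a, b), for a ≤ b, counts the features born by a that are still alive at b. The toy diagram, grid, and function name are assumptions; in practice the diagram would come from a persistent homology library.

```python
# Minimal sketch: evaluate the rank function of a persistence diagram on a grid.
# The toy diagram below is hard-coded; in practice it would be computed from data
# by a persistent homology library.
import numpy as np

def rank_function(diagram, grid):
    """r(a, b) = #{(birth, death) in diagram : birth <= a and death > b}, for a <= b."""
    births, deaths = diagram[:, 0], diagram[:, 1]
    n = len(grid)
    r = np.zeros((n, n))
    for i, a in enumerate(grid):
        for j, b in enumerate(grid):
            if a <= b:
                # features born no later than a and still alive strictly after b
                r[i, j] = np.sum((births <= a) & (deaths > b))
    return r

# Toy diagram with one long-lived and one short-lived feature.
toy_diagram = np.array([[0.1, 0.9], [0.3, 0.4]])
grid = np.linspace(0.0, 1.0, 50)
R = rank_function(toy_diagram, grid)   # a 50 x 50 discretised function
idx = grid.searchsorted(0.35)
print(R[idx, idx])                     # both features are alive near 0.35 -> 2.0
```

Once evaluated on a common grid, such rank functions can be treated as ordinary functional observations and passed to standard FDA methods.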
ISBN (digital): 9798350364606
ISBN (print): 9798350364613
While parallel programming, particularly on graphics processing units (GPUs), and numerical optimization hold immense potential to tackle real-world computational challenges across disciplines, their inherent complexity and technical demands often act as daunting barriers to entry. This, unfortunately, limits accessibility and diversity within these crucial areas of computer science. To combat this challenge and ignite excitement among undergraduate learners, we developed an application-driven course that uses robotics as a lens to demystify the intricacies of these topics, making them tangible and engaging. The course's prerequisites are limited to the required undergraduate introductory core curriculum, opening doors for a wider range of students. The course also features a large final-project component that connects theoretical learning to applied practice. In our first offering of the course, we attracted 27 students without prior experience in these topics and found that an overwhelming majority of them felt they learned both technical and soft skills and were prepared for future study in these fields.
The articles in this special section focus on data physicalization. The practice of representing data in physical form has existed for thousands of years, yet it has only become an area of investigation and exploration for scientists, designers, and artists much more recently [3]. Advances in areas such as digital fabrication, actuated tangible interfaces, and shape-changing displays have spurred an emerging area of research now called Data Physicalization [1]. This Special Issue of IEEE Computer Graphics and Applications presents four articles spanning a wide breadth of current data physicalization research, from theory to practice.
The paper considers the use of a neural network for the binary classification of magnetic resonance imaging (MRI) images to detect possible COVID-19 disease. Processing of input data and their reduction to one forma...
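As a purely generic illustration (the paper's actual architecture, preprocessing, and data are not described here), the PyTorch sketch below shows the shape of a binary image-classification pipeline of the kind the abstract refers to; the TinyCNN class, the 128x128 grayscale input size, and the random stand-in batch are all assumptions.

```python
# Generic binary image-classification sketch (illustrative only, not the paper's model).
import torch
import torch.nn as nn

class TinyCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(16 * 32 * 32, 1)   # assumes 128x128 grayscale input

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))   # raw logit for the positive class

model = TinyCNN()
criterion = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Stand-in batch: four 128x128 grayscale "scans" with random binary labels.
images = torch.randn(4, 1, 128, 128)
labels = torch.randint(0, 2, (4, 1)).float()

loss = criterion(model(images), labels)   # one training step on the stand-in batch
loss.backward()
optimizer.step()
print(float(loss))
```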
This paper considers a problem in the automated proving of MTP inequalities f(x) > 0 over [0, π/2] at boundary points. It suggests an algorithm for symbolically checking whether f(0) = 0 or f(π/2) = 0 without use of numer...
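As a small sketch of the kind of exact boundary check described above, assuming SymPy as the symbolic engine (the paper's own algorithm is not reproduced, and the example function f is hypothetical): substitute x = 0 and x = π/2 and simplify symbolically, with no numerical evaluation.

```python
# Exact symbolic check of boundary values on [0, pi/2], with no numerics.
import sympy as sp

x = sp.symbols('x')
f = sp.sin(x) - x * sp.cos(x)            # hypothetical example with f(0) = 0

for point in (sp.Integer(0), sp.pi / 2):
    value = sp.simplify(f.subs(x, point))    # exact substitution and simplification
    print(f"f({point}) = {value}, zero: {value == 0}")
```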
Today, computer graphics and graphic image processing techniques are widely used in daily life and industrial production. With the development of computers, computer graphics has brought greater convenience to our daily lives. To make full use of the value of computers, this paper takes Hakka paper-cut art, with its distinctive local characteristics, as its starting point. It first examines the art form's development history, artistic characteristics, compositional forms, expression techniques, cultural connotations, paper-cut patterns, and the symbolic meanings of its folk customs, and then designs a computer-graphics-based visualization system for the paper-cut works of the Gannan Hakka. In addition, the system provides a solution for integrating Gannan Hakka paper-cut art into the packaging design of Jiangxi native products and offers a reference for the theory and practice of modern native-product packaging design.
ISBN (print): 9781728186719
Boolean Matrix Factorization (BMF) aims to find an approximation of a given binary matrix as the Boolean product of two low-rank binary matrices. Binary data is ubiquitous in many fields, and representing data by binary matrices is common in medicine, natural language processing, bioinformatics, and computer graphics, among many others. Factorizing a matrix into low-rank matrices is used to gain more information about the data, such as discovering relationships between features and samples, roles and users, or topics and articles. In many applications, the binary nature of the factor matrices can enormously increase the interpretability of the data. Unfortunately, BMF is computationally hard, and heuristic algorithms are used to compute Boolean factorizations. Very recently, a theoretical breakthrough was obtained independently by two research groups: Ban et al. (SODA 2019) and Fomin et al. (Trans. Algorithms 2020) show that BMF admits an efficient polynomial-time approximation scheme (EPTAS). However, despite its theoretical importance, the double-exponential dependence of the running times on the rank makes these algorithms unimplementable in practice. The primary research question motivating our work is whether these theoretical advances on BMF can lead to practical algorithms. The main conceptual contribution of our work is the following: while the EPTAS for BMF is a purely theoretical advance, the general approach behind these algorithms can serve as the basis for designing better heuristics. We also use this strategy to develop new algorithms for the related F_p-Matrix Factorization problem. Here, given a matrix A over a finite field GF(p), where p is a prime, and an integer r, our objective is to find a matrix B over the same field with GF(p)-rank at most r that minimizes some norm of A - B. Our empirical research on synthetic and real-world data demonstrates the advantage of the new algorithms over previous works on BMF and F_p-Matrix Factorization.
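To make the objective concrete, here is a minimal sketch of the Boolean product and the reconstruction error that BMF minimises over binary factors of a given rank; it is neither the EPTAS of Ban et al. and Fomin et al. nor one of the paper's algorithms, and the toy matrices are illustrative only.

```python
# Boolean matrix product and BMF reconstruction error (illustrative sketch).
import numpy as np

def boolean_product(B, C):
    """(B o C)_ij = OR_k (B_ik AND C_kj) for 0/1 matrices B (n x r) and C (r x m)."""
    return (B.astype(int) @ C.astype(int) > 0).astype(int)

def bmf_error(A, B, C):
    """Number of entries where the Boolean product B o C disagrees with A."""
    return int(np.sum(A != boolean_product(B, C)))

# Toy instance: A is exactly the Boolean product of rank-2 binary factors.
B = np.array([[1, 0], [1, 0], [0, 1], [1, 1]])
C = np.array([[1, 1, 0, 0], [0, 0, 1, 1]])
A = boolean_product(B, C)
print(bmf_error(A, B, C))   # 0: this factorization reproduces A exactly
```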
Ray tracing is an inherent part of photorealistic image synthesis algorithms. The problem of ray tracing is to find the nearest intersection of a given ray with the scene. Although this geometric operation is relatively simple, in practice we have to evaluate billions of such operations, as the scene consists of millions of primitives and the image synthesis algorithms require a high number of samples to produce a plausible result. Thus, scene primitives are commonly arranged in spatial data structures to accelerate the search. In the last two decades, the bounding volume hierarchy (BVH) has become the de facto standard acceleration data structure for ray tracing-based rendering algorithms in offline and, more recently, also in real-time applications. In this report, we review the basic principles of bounding volume hierarchies as well as advanced state-of-the-art methods, with a focus on construction and traversal. Furthermore, we discuss industrial frameworks, specialized hardware architectures, other applications of bounding volume hierarchies, best practices, and related open problems.
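As a small illustration of the primitive at the heart of BVH traversal, the sketch below implements the ray versus axis-aligned bounding box "slab" test; it is not a full BVH builder or traversal loop, and the function and variable names are illustrative.

```python
# Ray / axis-aligned bounding box intersection via the slab test.
import numpy as np

def ray_aabb_intersect(origin, inv_dir, box_min, box_max):
    """Return True if the ray origin + t * dir (with inv_dir = 1/dir componentwise)
    hits the box for some t >= 0."""
    t1 = (box_min - origin) * inv_dir
    t2 = (box_max - origin) * inv_dir
    t_near = np.max(np.minimum(t1, t2))   # latest entry time across the three slabs
    t_far = np.min(np.maximum(t1, t2))    # earliest exit time across the three slabs
    return t_far >= max(t_near, 0.0)

origin = np.array([0.0, 0.0, -5.0])
direction = np.array([0.05, 0.05, 1.0])   # chosen with no zero components for simplicity
inv_dir = 1.0 / direction                  # precomputed once per ray in a real traversal
box_min, box_max = np.array([-1.0, -1.0, -1.0]), np.array([1.0, 1.0, 1.0])
print(ray_aabb_intersect(origin, inv_dir, box_min, box_max))   # True: the ray hits the unit box
```

During traversal, this test is applied to the bounding boxes of interior nodes so that entire subtrees whose boxes the ray misses can be skipped.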