ISBN:
(Print) 9781457702150
The traditional method of estimating an Event-Related Potential (ERP) is to take the average of signal epochs time-locked to a set of similar experimental events. This averaging method is useful as long as the experimental procedure can sufficiently isolate the brain or non-brain process of interest. However, if responses from multiple cognitive processes, time-locked to multiple classes of closely spaced events, overlap in time with varying inter-event intervals, averaging will most likely fail to identify the individual response time courses. For this situation, we study estimation of responses to all recorded events in an experiment by a single model using standard linear regression (the rERP technique). Applied to data collected during a Rapid Serial Visual Presentation (RSVP) task, our analysis shows: (1) the rERP technique accounts for more variance in the data than averaging when individual event responses are highly overlapping; (2) the variance accounted for by the estimates is concentrated into fewer ICA components than raw EEG channel signals.
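The regression idea in the rERP abstract above can be sketched on simulated data: build a design matrix with one time-lagged indicator column per (event class, lag) pair, then solve a single least-squares problem to recover both response time courses jointly even when they overlap. All names, sizes and kernels below are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical simulation: a continuous signal containing overlapping
# responses to two event classes with different response kernels.
n_samples, kernel_len = 2000, 50
true_kernels = {0: np.hanning(kernel_len), 1: -0.5 * np.hanning(kernel_len)}
onsets = rng.choice(n_samples - kernel_len, size=40, replace=False)
events = [(int(t), int(rng.integers(2))) for t in onsets]

signal = rng.normal(0.0, 0.1, n_samples)
for t, c in events:
    signal[t:t + kernel_len] += true_kernels[c]

# rERP design matrix: one time-lagged indicator column per (class, lag)
# pair, so a single least-squares fit estimates both time courses jointly.
X = np.zeros((n_samples, 2 * kernel_len))
for t, c in events:
    for lag in range(kernel_len):
        X[t + lag, c * kernel_len + lag] = 1.0

beta, *_ = np.linalg.lstsq(X, signal, rcond=None)
est0, est1 = beta[:kernel_len], beta[kernel_len:]
```

Averaging epochs would smear the two overlapping responses into each other; the joint regression attributes each sample's amplitude to the correct event and lag.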
Recent work has constructed economic mechanisms that are both truthful and differentially private. In these mechanisms, privacy is treated separately from truthfulness; it is not incorporated in players' utilit...
The objective of this work is to build a high performance computing framework for simulating, analyzing and visualizing oil spill trajectories driven by winds and ocean currents. We adopt a particle model for oil and ...
ISBN:
(Print) 9781627480031
Simple Gaussian Mixture Models (GMMs) learned from pixels of natural image patches have recently been shown to be surprisingly strong performers in modeling the statistics of natural images. Here we provide an in-depth analysis of this simple yet rich model. We show that such a GMM is able to compete with even the most successful models of natural images in log-likelihood scores, denoising performance and sample quality. We provide an analysis of what such a model learns from natural images as a function of the number of mixture components, including covariance structure, contrast variation and intricate structures such as textures, boundaries and more. Finally, we show that the salient properties of the GMM learned from natural images can be derived from a simplified Dead Leaves model which explicitly models occlusion, explaining its surprising success relative to other models.
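The basic experiment described in this abstract, fitting GMMs to image patches and comparing log-likelihood as the number of mixture components grows, can be sketched as follows. The synthetic "image" stands in for the real natural images the paper uses, and all sizes and parameters are illustrative.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Illustrative stand-in for natural image data: 8x8 patches cut from a
# smooth synthetic field (the paper uses real natural image patches).
image = rng.normal(size=(64, 64)).cumsum(axis=0).cumsum(axis=1)
patches = np.array([image[i:i + 8, j:j + 8].ravel()
                    for i in range(0, 56, 4) for j in range(0, 56, 4)])
patches -= patches.mean(axis=1, keepdims=True)  # remove per-patch DC component

# Fit GMMs with an increasing number of mixture components and compare
# the mean per-patch log-likelihood on the training patches.
scores = []
for k in (1, 2, 5):
    gmm = GaussianMixture(n_components=k, covariance_type="full",
                          random_state=0).fit(patches)
    scores.append(gmm.score(patches))  # mean log-likelihood per patch
```

With full covariances, each added component can capture a distinct regime of patch structure (e.g. different contrasts or orientations), which is why the log-likelihood typically improves with more components.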
We describe a computational model for studying the complexity of self-assembled structures with active molecular components. Our model captures notions of growth and movement ubiquitous in biological systems. The mode...
High-speed networks have characteristics of high bandwidth, long queuing delay, and high burstiness, which make it difficult to address issues such as fairness, low queuing delay and high link utilization. Current high-speed networks carry heterogeneous TCP flows, which makes it even more challenging to address these issues. Since sender-centric approaches do not meet these challenges, there have been several proposals to address them at the router level via queue management (QM) schemes. These QM schemes have been fairly successful in addressing either fairness issues or large queuing delay, but not both at the same time. We propose a new QM scheme called Approximated-Fair and Controlled-Delay (AFCD) queuing for high-speed networks that aims to meet the following design goals: approximated fairness, controlled low queuing delay, high link utilization and simple implementation. The design of AFCD utilizes a novel synergistic approach by forming an alliance between approximated fair queuing and controlled delay queuing. It uses a very small amount of state information for estimating the sending rates of flows and makes drop decisions based on a target delay for each individual flow. Through experimental evaluation in a 10 Gbps high-speed networking environment, we show that AFCD meets our design goals by maintaining an approximated fair share of bandwidth among flows and ensuring a controlled, very low queuing delay with comparable link utilization.
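The combination the abstract describes, small per-flow state for rate estimation plus a per-flow delay target driving drop decisions, can be illustrated with a minimal sketch. This is not the paper's algorithm: the EWMA rate estimator, the fair-share-scaled target, and all parameter values are our own illustrative choices.

```python
class DelayTargetQueue:
    """Minimal sketch of a delay-target drop rule in the spirit of AFCD.

    Per-flow state is a single EWMA rate estimate; flows sending above
    their fair share get a tighter delay target. Names and parameters
    are illustrative, not taken from the paper.
    """

    def __init__(self, capacity_bps, base_target_s=0.005, alpha=0.9):
        self.capacity = capacity_bps
        self.base_target = base_target_s
        self.alpha = alpha
        self.rate = {}      # flow id -> EWMA of arrival rate (bits/s)
        self.backlog = 0    # queued bytes

    def on_arrival(self, flow, pkt_bytes, interarrival_s):
        # Tiny per-flow state: one exponentially weighted rate estimate.
        inst = 8 * pkt_bytes / max(interarrival_s, 1e-9)
        self.rate[flow] = (self.alpha * self.rate.get(flow, inst)
                           + (1 - self.alpha) * inst)
        # Flows above their fair share get a tighter delay target.
        fair = self.capacity / max(len(self.rate), 1)
        target = self.base_target * min(1.0, fair / self.rate[flow])
        queuing_delay = 8 * self.backlog / self.capacity
        if queuing_delay > target:
            return False    # drop: this flow's delay target is exceeded
        self.backlog += pkt_bytes
        return True         # enqueue

    def on_departure(self, pkt_bytes):
        self.backlog = max(0, self.backlog - pkt_bytes)
```

Under this rule a flow sending far above link capacity quickly starts seeing drops, while the backlog (and hence queuing delay) stays bounded near the base target.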
Graph partitioning algorithms play a central role in data analysis and machine learning. Most useful graph partitioning criteria correspond to optimizing a ratio between the cut and the size of the partitions, this ra...
Graph partitioning algorithms play a central role in data analysis and machine learning. Most useful graph partitioning criteria correspond to optimizing a ratio between the cut and the size of the partitions; this ratio leads to an NP-hard problem that is only solved approximately. This makes it difficult to know whether failures of the algorithm are due to failures of the optimization or to the criterion being optimized. In this paper we present a framework that seeks and finds the optimal solution of several NP-hard graph partitioning problems. We use a classical approach to ratio problems where we repeatedly ask whether the optimal solution is greater than or less than some constant λ. Our main insight is the equivalence between this "λ question" and performing inference in a graphical model with many local potentials and one high-order potential. We show that this specific form of the high-order potential is amenable to message-passing algorithms and how to obtain a bound on the optimal solution from the messages. Our experiments show that in many cases our approach yields the global optimum and improves on the popular spectral solution.
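The "λ question" reduction in this abstract rests on a standard identity for ratio objectives: min_S cut(S)/|S| ≤ λ if and only if min_S [cut(S) − λ·|S|] ≤ 0, so the ratio can be found by bisection on λ. The toy sketch below uses the simplified criterion cut(S)/|S| and brute-force subset enumeration in place of the paper's message-passing inference; all function names are ours.

```python
import itertools
import numpy as np

def cut_value(W, S, n):
    # Total weight of edges crossing from subset S to its complement.
    T = [v for v in range(n) if v not in S]
    return sum(W[i, j] for i in S for j in T)

def lambda_question(W, lam):
    # "Is min_S cut(S)/|S| <= lam?"  <=>  "Is min_S cut(S) - lam*|S| <= 0?"
    n = len(W)
    best = min(cut_value(W, S, n) - lam * len(S)
               for r in range(1, n)
               for S in itertools.combinations(range(n), r))
    return best <= 0

def min_ratio(W, iters=40):
    # Bisection on lambda; each step asks one lambda question.
    lo, hi = 0.0, float(W.sum())
    for _ in range(iters):
        mid = (lo + hi) / 2
        if lambda_question(W, mid):
            hi = mid
        else:
            lo = mid
    return hi
```

On a graph made of two triangles joined by one weak edge, the bisection recovers the exact optimum, cutting the weak edge, which is the kind of ground truth the paper's experiments compare against.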
ISBN:
(Print) 9781627480031
Although human object recognition is supposedly robust to viewpoint, much research on human perception indicates that there is a preferred or "canonical" view of objects. This phenomenon was discovered more than 30 years ago, but the canonical views of only a small number of categories have been validated experimentally. Moreover, the explanation for why humans prefer the canonical view over other views remains elusive. In this paper we ask: can we use Internet image collections to learn more about canonical views? We start by manually finding the most common view in the results returned by Internet search engines when queried with the objects used in psychophysical experiments. Our results clearly show that the most likely view in the search engine results corresponds to the same view preferred by human subjects in experiments. We also present a simple method to find the most likely view in an image collection and apply it to hundreds of categories. Using the newly collected data, we present strong evidence against the two most prominent formal theories of canonical views and provide novel constraints for new theories.