
Refine Results

Document Type

  • 9 journal articles
  • 2 conference papers

Collection

  • 11 electronic documents
  • 0 print holdings

Date Distribution

Subject Classification

  • 10 Engineering
    • 7 Computer Science and Technology...
    • 4 Electrical Engineering
    • 4 Information and Communication Engineering
    • 4 Software Engineering
    • 2 Control Science and Engineering
    • 2 Cyberspace Security
  • 3 Science
    • 3 Mathematics
    • 2 Statistics (degrees awardable in Science,...
    • 1 Chemistry
    • 1 Biology
  • 2 Management
    • 2 Library, Information and Archives Mana...
  • 1 Medicine
    • 1 Basic Medicine (degrees awardable in Medici...

Topics

  • 2 distribution tra...
  • 1 associative stor...
  • 1 differential pri...
  • 1 antibodies
  • 1 structured query...
  • 1 linear transform...
  • 1 deep reinforceme...

Institutions

  • 9 department of st...
  • 6 department of co...
  • 6 center for found...
  • 5 simons institute...
  • 2 center for found...
  • 2 department of co...
  • 1 ensemble ai ca 9...
  • 1 maynooth interna...
  • 1 department of co...
  • 1 ensemble ai ca u...
  • 1 center for found...
  • 1 department of ma...
  • 1 center for found...
  • 1 department of ph...
  • 1 school of mathem...
  • 1 center for found...
  • 1 simons institute...
  • 1 center for found...
  • 1 department of in...
  • 1 department of ph...

Authors

  • 11 liu han
  • 11 hu jerry yao-chi...
  • 6 song zhao
  • 5 wu weimin
  • 2 liu erzhi
  • 2 wu dennis
  • 1 wen yibo
  • 1 gilani ammar
  • 1 zhang lichen
  • 1 chen minshuo
  • 1 xu chenwei
  • 1 lee yi-chen
  • 1 su maojiang
  • 1 wang wei-po
  • 1 liu hude
  • 1 li chenyang
  • 1 pi sophia
  • 1 chen hong-yu
  • 1 huang yu-chao
  • 1 li zhuoru

Language

  • 7 English
  • 4 Other
检索条件"机构=Center for Foundation Models and Generative AI"
11 records; results 1-10 shown below
Provably Optimal Memory Capacity for Modern Hopfield Models: Transformer-Compatible Dense Associative Memories as Spherical Codes
38th Conference on Neural Information Processing Systems, NeurIPS 2024
Authors: Hu, Jerry Yao-Chieh; Wu, Dennis; Liu, Han. Affiliations: Center for Foundation Models and Generative AI, United States; Department of Computer Science, United States; Department of Statistics and Data Science, Northwestern University, Evanston, IL 60208, United States
We study the optimal memorization capacity of modern Hopfield models and Kernelized Hopfield models (KHMs), a transformer-compatible class of Dense Associative Memories. We present a tight analysis by establishing a c...
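For context on the model class named in the title, here is the commonly cited energy and one-step retrieval update of modern (dense) Hopfield models, stated in their standard form rather than as this paper's exact construction:

```latex
% Modern (dense) Hopfield model with stored patterns \Xi = (\xi_1, \dots, \xi_M)
% and query x: log-sum-exp energy and softmax retrieval update.
E(x) = -\beta^{-1} \log \sum_{\mu=1}^{M} \exp\big(\beta\, \xi_\mu^\top x\big)
       + \tfrac{1}{2} \lVert x \rVert^2,
\qquad
x^{\mathrm{new}} = \Xi\, \operatorname{softmax}\big(\beta\, \Xi^\top x\big).
```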
Differentially Private Kernel Density Estimation
arXiv
arXiv, 2024
Authors: Liu, Erzhi; Hu, Jerry Yao-Chieh; Reneau, Alex; Song, Zhao; Liu, Han. Affiliations: Center for Foundation Models and Generative AI, Department of Computer Science, Northwestern University, Evanston, IL 60208, United States; Ensemble AI, San Francisco, CA 94133, United States; Simons Institute for the Theory of Computing, UC Berkeley, Berkeley, CA 94720, United States; Center for Foundation Models and Generative AI, Department of Computer Science, Department of Statistics and Data Science, Northwestern University, Evanston, IL 60208, United States
We introduce a refined differentially private (DP) data structure for kernel density estimation (KDE) with ℓ1, ℓ2, and ℓp kernels. This new DP data structure offers not only an improved privacy-utility tradeoff but also bet...
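The refined data structure itself is not shown in this listing; as a point of reference for what a DP KDE release involves, here is a minimal sketch of the textbook Laplace-mechanism baseline (function and parameter names are hypothetical, and this is not the paper's method):

```python
import numpy as np

def dp_kde_query(data, query, bandwidth, eps, rng):
    """Epsilon-DP KDE value at one query point via the Laplace mechanism.

    The Gaussian kernel is bounded in [0, 1], so replacing one of the n
    data points shifts the empirical average by at most 1/n; Laplace noise
    with scale 1/(n * eps) therefore gives eps-DP for this single query.
    """
    n = len(data)
    dists = np.linalg.norm(data - query, axis=1)       # distances to all points
    kde = np.mean(np.exp(-(dists / bandwidth) ** 2))   # unnormalized Gaussian KDE
    return kde + rng.laplace(scale=1.0 / (n * eps))

# Hypothetical usage:
rng = np.random.default_rng(0)
data = rng.normal(size=(1000, 2))
print(dp_kde_query(data, np.zeros(2), bandwidth=0.5, eps=1.0, rng=rng))
```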
Provably Optimal Memory Capacity for Modern Hopfield Models: Transformer-Compatible Dense Associative Memories as Spherical Codes
arXiv
arXiv, 2024
Authors: Hu, Jerry Yao-Chieh; Wu, Dennis; Liu, Han. Affiliations: Center for Foundation Models and Generative AI, United States; Department of Computer Science, United States; Department of Statistics and Data Science, Northwestern University, Evanston, IL 60208, United States
We study the optimal memorization capacity of modern Hopfield models and Kernelized Hopfield models (KHMs), a transformer-compatible class of Dense Associative Memories. We present a tight analysis by establishing a c...
On Statistical Rates and Provably Efficient Criteria of Latent Diffusion Transformers (DiTs)
38th Conference on Neural Information Processing Systems, NeurIPS 2024
Authors: Hu, Jerry Yao-Chieh; Wu, Weimin; Li, Zhuoru; Pi, Sophia; Song, Zhao; Liu, Han. Affiliations: Center for Foundation Models and Generative AI, Northwestern University, Evanston, IL 60208, United States; Department of Computer Science, Northwestern University, Evanston, IL 60208, United States; Department of Statistics and Data Science, Northwestern University, Evanston, IL 60208, United States; Simons Institute for the Theory of Computing, UC Berkeley, Berkeley, CA 94720, United States
We investigate the statistical and computational limits of latent Diffusion Transformers (DiTs) under the low-dimensional linear latent space assumption. Statistically, we study the universal approximation and sample ...
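The "low-dimensional linear latent space assumption" is not spelled out in this listing; the following is the standard form such an assumption usually takes, offered as context rather than a quotation from the paper:

```latex
% Standard low-dimensional linear latent assumption: data x in R^D is an
% orthonormal embedding of a latent z in R^d with d much smaller than D.
x = U z, \qquad U \in \mathbb{R}^{D \times d}, \quad U^\top U = I_d, \quad d \ll D.
```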
On Differentially Private String Distances
arXiv
arXiv, 2024
Authors: Hu, Jerry Yao-Chieh; Liu, Erzhi; Liu, Han; Song, Zhao; Zhang, Lichen. Affiliations: Ensemble AI, San Francisco, CA, United States; Center for Foundation Models and Generative AI, Department of Computer Science, Northwestern University, Evanston, IL, United States; Center for Foundation Models and Generative AI, Department of Computer Science, Department of Statistics and Data Science, Northwestern University, Evanston, IL, United States; Simons Institute for the Theory of Computing, UC Berkeley, Berkeley, CA, United States; Department of Mathematics & Computer Science and Artificial Intelligence Laboratory, MIT, Cambridge, MA, United States
Given a database of bit strings A_1, ..., A_m ∈ {0, 1}^n, a fundamental data structure task is to estimate the distances between a given query B ∈ {0, 1}^n and all the strings in the database. In addition, one might ...
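As context for the task the abstract sets up (the paper's own data structure is not shown here), a naive Laplace-mechanism baseline for privately releasing all m Hamming distances; names and the neighboring relation are assumptions for illustration:

```python
import numpy as np

def dp_hamming_distances(database, query, eps, rng):
    """Eps-DP release of Hamming distances from a query to every row.

    If neighboring databases differ in a single bit of a single string,
    the distance vector changes in one coordinate by at most 1 (l1
    sensitivity 1), so Laplace noise of scale 1/eps per coordinate
    suffices for eps-DP.
    """
    dists = np.sum(database != query, axis=1).astype(float)
    return dists + rng.laplace(scale=1.0 / eps, size=len(dists))

# Hypothetical usage on random bit strings:
rng = np.random.default_rng(0)
A = rng.integers(0, 2, size=(5, 16))   # database of m=5 strings, n=16 bits
B = rng.integers(0, 2, size=16)        # query string
print(dp_hamming_distances(A, B, eps=1.0, rng=rng))
```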
On Statistical Rates and Provably Efficient Criteria of Latent Diffusion Transformers (DiTs)
arXiv
arXiv, 2024
Authors: Hu, Jerry Yao-Chieh; Wu, Weimin; Song, Zhao; Liu, Han. Affiliations: Center for Foundation Models and Generative AI, Department of Computer Science, Department of Statistics and Data Science, Northwestern University, Evanston, IL 60208, United States; Simons Institute for the Theory of Computing, UC Berkeley, Berkeley, CA 94720, United States
We investigate the statistical and computational limits of latent Diffusion Transformers (DiTs) under the low-dimensional linear latent space assumption. Statistically, we study the universal approximation and sample ...
AlignAb: Pareto-Optimal Energy Alignment for Designing Nature-Like Antibodies
arXiv
arXiv, 2024
Authors: Wen, Yibo; Xu, Chenwei; Hu, Jerry Yao-Chieh; Liu, Han. Affiliations: Center for Foundation Models and Generative AI, Northwestern University, Evanston, IL 60208, United States; Department of Computer Science, Northwestern University, Evanston, IL 60208, United States; Department of Statistics and Data Science, Northwestern University, Evanston, IL 60208, United States
We present a three-stage framework for training deep learning models specializing in antibody sequence-structure co-design. We first pre-train a language model on millions of antibody sequences. Then, we employ...
Universal Approximation with Softmax Attention
arXiv
arXiv, 2025
Authors: Hu, Jerry Yao-Chieh; Liu, Hude; Chen, Hong-Yu; Wu, Weimin; Liu, Han. Affiliations: Center for Foundation Models and Generative AI, Northwestern University, Evanston, IL 60208, United States; Department of Computer Science, Northwestern University, Evanston, IL 60208, United States; School of Mathematical Sciences, Fudan University, Shanghai 200433, China; Department of Statistics and Data Science, Northwestern University, Evanston, IL 60208, United States
We prove that with linear transformations, both (i) two-layer self-attention and (ii) one-layer self-attention followed by a softmax function are universal approximators for continuous sequence-to-sequence functions o...
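For readers unfamiliar with the architecture class in question, a minimal sketch of one softmax self-attention layer, the object whose approximation power the abstract analyzes (an illustration of the architecture, not of the proof; all names are hypothetical):

```python
import numpy as np

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)   # numerical stabilization
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """One softmax self-attention layer; rows of X are tokens."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])    # scaled dot-product scores
    return softmax(scores, axis=-1) @ V

# Hypothetical shapes: 4 tokens, model width 8.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)    # (4, 8)
```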
Transformers are Deep Optimizers: Provable In-Context Learning for Deep Model Training
arXiv
arXiv, 2024
Authors: Wu, Weimin; Su, Maojiang; Hu, Jerry Yao-Chieh; Song, Zhao; Liu, Han. Affiliations: Center for Foundation Models and Generative AI, Northwestern University, Evanston, IL 60208, United States; Department of Computer Science, Northwestern University, Evanston, IL 60208, United States; Department of Information and Computing Science, USTC, Hefei, Anhui 230026, China; Simons Institute for the Theory of Computing, UC Berkeley, Berkeley, CA 94720, United States; Department of Statistics and Data Science, Northwestern University, Evanston, IL 60208, United States
We investigate the transformer's capability for in-context learning (ICL) to simulate the training process of deep models. Our key contribution is providing a positive example of using a transformer to train a deep n...
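To make the "simulate the training process" claim concrete, here is a sketch of the kind of explicit gradient descent on in-context examples that this line of work shows a transformer forward pass can emulate; the linear least-squares setup is illustrative, not the paper's deep-model setting:

```python
import numpy as np

def gd_on_context(X, y, steps=100, lr=0.1):
    """Plain gradient descent on in-context least-squares examples.

    The ICL-as-optimization literature (which this abstract extends from
    linear models to deep ones) shows a transformer can emulate update
    steps of exactly this kind on the prompt's (X, y) pairs.
    """
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)   # least-squares gradient
        w -= lr * grad
    return w

# Hypothetical usage: recover a planted linear rule from 32 context pairs.
rng = np.random.default_rng(0)
X = rng.normal(size=(32, 4))
w_true = rng.normal(size=4)
print(gd_on_context(X, X @ w_true))   # approaches w_true
```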
On Statistical Rates of Conditional Diffusion Transformers: Approximation, Estimation and Minimax Optimality
arXiv
arXiv, 2024
Authors: Hu, Jerry Yao-Chieh; Wu, Weimin; Lee, Yi-Chen; Huang, Yu-Chao; Chen, Minshuo; Liu, Han. Affiliations: Center for Foundation Models and Generative AI, Northwestern University, Evanston, IL 60208, United States; Department of Computer Science, Northwestern University, Evanston, IL 60208, United States; Department of Physics, National Taiwan University, Taipei 106319, Taiwan; Physics Division, National Center for Theoretical Sciences, Taipei 106319, Taiwan; Department of Industrial Engineering & Management Sciences, Northwestern University, Evanston, IL 60208, United States; Department of Statistics and Data Science, Northwestern University, Evanston, IL 60208, United States
We investigate the approximation and estimation rates of conditional diffusion transformers (DiTs) with classifier-free guidance. We present a comprehensive analysis for "in-context" conditional DiTs under f...
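Classifier-free guidance, named in the abstract, combines conditional and unconditional denoiser outputs in a standard way; here is a minimal sketch of that combination (names are hypothetical, and this is not code from the paper):

```python
import numpy as np

def cfg_noise_estimate(eps_cond, eps_uncond, w):
    """Classifier-free guidance: push the conditional denoiser output
    away from the unconditional one by guidance weight w."""
    return (1.0 + w) * eps_cond - w * eps_uncond

# Hypothetical denoiser outputs for one latent:
rng = np.random.default_rng(0)
e_c, e_u = rng.normal(size=16), rng.normal(size=16)
print(cfg_noise_estimate(e_c, e_u, w=2.0).shape)   # (16,)
```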