
Refine Results

Document Type

  • 9 journal articles
  • 4 conference papers

Collection Scope

  • 13 electronic documents
  • 0 print holdings

Date Distribution

Subject Classification

  • 10 Engineering
    • 7 Computer Science and Technology...
    • 4 Electrical Engineering
    • 4 Information and Communication Engineering
    • 4 Software Engineering
    • 2 Control Science and Engineering
    • 2 Cyberspace Security
  • 3 Science
    • 3 Mathematics
    • 2 Statistics (conferrable in Science, ...
    • 1 Chemistry
    • 1 Biology
  • 2 Management
    • 2 Library, Information and Archives Man...
  • 1 Medicine
    • 1 Basic Medicine (conferrable in Medici...

Topics

  • 2 篇 distribution tra...
  • 1 篇 deep neural netw...
  • 1 篇 associative stor...
  • 1 篇 differential pri...
  • 1 篇 antibodies
  • 1 篇 structured query...
  • 1 篇 linear transform...

Institutions

  • 9 篇 department of st...
  • 6 篇 department of co...
  • 6 篇 center for found...
  • 5 篇 simons institute...
  • 2 篇 center for found...
  • 2 篇 center for found...
  • 2 篇 department of co...
  • 2 篇 center for found...
  • 2 篇 department of co...
  • 1 篇 ensemble ai ca 9...
  • 1 篇 maynooth interna...
  • 1 篇 department of co...
  • 1 篇 ensemble ai ca u...
  • 1 篇 center for found...
  • 1 篇 department of ma...
  • 1 篇 center for found...
  • 1 篇 simons institute...
  • 1 篇 department of ph...
  • 1 篇 school of mathem...
  • 1 篇 department of st...

Authors

  • 11 篇 liu han
  • 11 篇 hu jerry yao-chi...
  • 6 篇 song zhao
  • 5 篇 wu weimin
  • 2 篇 jerry yao-chieh ...
  • 2 篇 liu erzhi
  • 2 篇 wu dennis
  • 2 篇 han liu
  • 1 篇 zhao song
  • 1 篇 wen yibo
  • 1 篇 gilani ammar
  • 1 篇 zhuoru li
  • 1 篇 zhang lichen
  • 1 篇 chen minshuo
  • 1 篇 xu chenwei
  • 1 篇 dennis wu
  • 1 篇 lee yi-chen
  • 1 篇 su maojiang
  • 1 篇 weimin wu
  • 1 篇 wang wei-po

Language

  • 7 Other
  • 6 English

Search criteria: Institution = "Center for Foundation Models and Generative AI and Department of Computer Science"
13 records; showing 1-10
Provably Optimal Memory Capacity for Modern Hopfield Models: Transformer-Compatible Dense Associative Memories as Spherical Codes
38th Conference on Neural Information Processing Systems, NeurIPS 2024
Authors: Hu, Jerry Yao-Chieh; Wu, Dennis; Liu, Han (Center for Foundation Models and Generative AI; Department of Computer Science; Department of Statistics and Data Science, Northwestern University, Evanston, IL 60208, United States)
We study the optimal memorization capacity of modern Hopfield models and Kernelized Hopfield models (KHMs), a transformer-compatible class of Dense Associative Memories. We present a tight analysis by establishing a c...
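As a hedged illustration of the retrieval dynamics behind these dense associative memories (not the paper's capacity analysis), the modern Hopfield update is the attention-like map x ← Ξ softmax(β Ξᵀx). A minimal NumPy sketch, with all names, sizes, and parameters illustrative assumptions:

```python
import numpy as np

def hopfield_retrieve(patterns, query, beta=8.0, steps=3):
    # One modern-Hopfield update is the attention-like map
    #   x <- Xi @ softmax(beta * Xi^T x),
    # where Xi stacks the M stored patterns as columns.
    x = query.astype(float)
    for _ in range(steps):
        scores = beta * (patterns.T @ x)
        p = np.exp(scores - scores.max())   # stable softmax
        p /= p.sum()
        x = patterns @ p
    return x

rng = np.random.default_rng(0)
Xi = rng.choice([-1.0, 1.0], size=(64, 10))        # 10 random bipolar memories in R^64
noisy = Xi[:, 3] + 0.3 * rng.standard_normal(64)   # corrupted probe of memory 3
out = hopfield_retrieve(Xi, noisy)
print(int(np.argmax(Xi.T @ out)))                  # prints 3: memory 3 is retrieved
```

With well-separated random patterns and a moderately large β, the softmax concentrates on the nearest stored pattern, which is the cleanup behavior whose capacity limits the paper analyzes.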
Provably optimal memory capacity for modern Hopfield models: transformer-compatible dense associative memories as spherical codes
Proceedings of the 38th International Conference on Neural Information Processing Systems
Authors: Jerry Yao-Chieh Hu; Dennis Wu; Han Liu (Center for Foundation Models and Generative AI and Department of Computer Science, Northwestern University, Evanston, IL; Department of Computer Science, Northwestern University, Evanston, IL; Center for Foundation Models and Generative AI and Department of Computer Science and Department of Statistics and Data Science, Northwestern University, Evanston, IL)
We study the optimal memorization capacity of modern Hopfield models and Kernelized Hopfield models (KHMs), a transformer-compatible class of Dense Associative Memories. We present a tight analysis by establishing a c...
On statistical rates and provably efficient criteria of latent diffusion transformers (DiTs)
Proceedings of the 38th International Conference on Neural Information Processing Systems
Authors: Jerry Yao-Chieh Hu; Weimin Wu; Zhuoru Li; Sophia Pi; Zhao Song; Han Liu (Center for Foundation Models and Generative AI and Department of Computer Science, Northwestern University, Evanston, IL; Department of Statistics and Data Science, Northwestern University, Evanston, IL; Department of Computer Science, Northwestern University, Evanston, IL; Simons Institute for the Theory of Computing, UC Berkeley, Berkeley, CA; Center for Foundation Models and Generative AI and Department of Computer Science and Department of Statistics and Data Science, Northwestern University, Evanston, IL)
We investigate the statistical and computational limits of latent Diffusion Transformers (DiTs) under the low-dimensional linear latent space assumption. Statistically, we study the universal approximation and sample ...
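The low-dimensional linear latent space assumption can be made concrete with a toy sketch (illustrative only; dimensions and variable names are assumptions): data x = Uz lies on a d-dimensional subspace of R^D spanned by orthonormal columns U, so decoding with Uᵀ recovers the latent exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
D, d = 64, 8                                       # ambient vs. latent dimension
U, _ = np.linalg.qr(rng.standard_normal((D, d)))   # orthonormal columns: U^T U = I_d
z = rng.standard_normal((1000, d))                 # latent samples
x = z @ U.T                                        # data on a d-dimensional subspace of R^D
z_hat = x @ U                                      # linear decoding recovers the latent
print(np.allclose(z_hat, z))                       # prints True
```

Under this assumption, rates can depend on the latent dimension d rather than the ambient dimension D, which is the kind of gain the abstract refers to.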
On Statistical Rates and Provably Efficient Criteria of Latent Diffusion Transformers (DiTs)
38th Conference on Neural Information Processing Systems, NeurIPS 2024
Authors: Hu, Jerry Yao-Chieh; Wu, Weimin; Li, Zhuoru; Pi, Sophia; Song, Zhao; Liu, Han (Center for Foundation Models and Generative AI, Northwestern University, Evanston, IL 60208, United States; Department of Computer Science, Northwestern University, Evanston, IL 60208, United States; Department of Statistics and Data Science, Northwestern University, Evanston, IL 60208, United States; Simons Institute for the Theory of Computing, UC Berkeley, Berkeley, CA 94720, United States)
We investigate the statistical and computational limits of latent Diffusion Transformers (DiTs) under the low-dimensional linear latent space assumption. Statistically, we study the universal approximation and sample ...
Provably Optimal Memory Capacity for Modern Hopfield Models: Transformer-Compatible Dense Associative Memories as Spherical Codes
arXiv, 2024
Authors: Hu, Jerry Yao-Chieh; Wu, Dennis; Liu, Han (Center for Foundation Models and Generative AI; Department of Computer Science; Department of Statistics and Data Science, Northwestern University, Evanston, IL 60208, United States)
We study the optimal memorization capacity of modern Hopfield models and Kernelized Hopfield models (KHMs), a transformer-compatible class of Dense Associative Memories. We present a tight analysis by establishing a c...
Differentially Private Kernel Density Estimation
arXiv, 2024
Authors: Liu, Erzhi; Hu, Jerry Yao-Chieh; Reneau, Alex; Song, Zhao; Liu, Han (Center for Foundation Models and Generative AI, Department of Computer Science, Northwestern University, Evanston, IL 60208, United States; Ensemble AI, San Francisco, CA 94133, United States; Simons Institute for the Theory of Computing, UC Berkeley, Berkeley, CA 94720, United States; Center for Foundation Models and Generative AI, Department of Computer Science, Department of Statistics and Data Science, Northwestern University, Evanston, IL 60208, United States)
We introduce a refined differentially private (DP) data structure for kernel density estimation (KDE) with ℓ1, ℓ2, and ℓp^p kernels. This new DP data structure offers not only an improved privacy-utility tradeoff but also bet...
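As a baseline illustration of DP KDE (the classic Laplace mechanism, not the refined data structure studied here; the function and parameter names are assumptions): with a kernel bounded in [0, 1], swapping one data point moves the averaged KDE value by at most 1/n, so Laplace noise of scale 1/(nε) gives ε-DP per query.

```python
import numpy as np

def dp_kde_query(data, query, bandwidth=0.5, eps=1.0, rng=None):
    # eps-DP release of a 1-D Gaussian-kernel KDE value at `query`
    # via the plain Laplace mechanism (naive baseline, not the paper's method).
    rng = rng or np.random.default_rng()
    n = len(data)
    k = np.exp(-((data - query) ** 2) / (2.0 * bandwidth ** 2))  # kernel values in [0, 1]
    # One data point changes k.mean() by at most 1/n (the sensitivity),
    # so Laplace(scale = 1/(n * eps)) suffices for eps-DP per query.
    return k.mean() + rng.laplace(scale=1.0 / (n * eps))

rng = np.random.default_rng(0)
data = rng.standard_normal(10_000)
val = dp_kde_query(data, 0.0, eps=1.0, rng=rng)    # close to 1/sqrt(5) ~ 0.447 here
```

The privacy-utility tradeoff the abstract mentions is visible directly: noise scale shrinks as 1/(nε), so larger datasets or looser ε budgets give more accurate releases.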
On Differentially Private String Distances
arXiv, 2024
Authors: Hu, Jerry Yao-Chieh; Liu, Erzhi; Liu, Han; Song, Zhao; Zhang, Lichen (Ensemble AI, San Francisco, CA, United States; Center for Foundation Models and Generative AI, Department of Computer Science, Northwestern University, Evanston, IL, United States; Center for Foundation Models and Generative AI, Department of Computer Science, Department of Statistics and Data Science, Northwestern University, Evanston, IL, United States; Simons Institute for the Theory of Computing, UC Berkeley, Berkeley, CA, United States; Department of Mathematics & Computer Science and Artificial Intelligence Laboratory, MIT, Cambridge, MA, United States)
Given a database of bit strings A_1, . . ., A_m ∈ {0, 1}^n, a fundamental data structure task is to estimate the distances between a given query B ∈ {0, 1}^n and all the strings in the database. In addition, one might ...
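The exact version of this task reduces to XOR-and-count per database string; a minimal sketch, with a naive Laplace-noised variant as an assumed stand-in for a DP release (the paper's data structure is far more refined; names and the sensitivity-1 assumption are illustrative):

```python
import numpy as np

def query_distances(db, query, eps=None, rng=None):
    # db: (m, n) array of 0/1 bits; query: (n,) array of 0/1 bits.
    # Exact Hamming distances via XOR; if eps is given, add Laplace noise
    # as a naive DP stand-in (assuming sensitivity 1 per distance).
    d = np.logical_xor(db, query).sum(axis=1).astype(float)
    if eps is not None:
        rng = rng or np.random.default_rng()
        d = d + rng.laplace(scale=1.0 / eps, size=len(d))
    return d

db = np.array([[0, 0, 1], [1, 1, 1]])
print(query_distances(db, np.array([0, 0, 0])))    # prints [1. 3.]
```

The exact path answers a query in O(mn) bit operations; the DP setting trades accuracy (noise per released distance) for privacy of the database strings.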
On Statistical Rates and Provably Efficient Criteria of Latent Diffusion Transformers (DiTs)
arXiv, 2024
Authors: Hu, Jerry Yao-Chieh; Wu, Weimin; Song, Zhao; Liu, Han (Center for Foundation Models and Generative AI, Department of Computer Science, Department of Statistics and Data Science, Northwestern University, Evanston, IL 60208, United States; Simons Institute for the Theory of Computing, UC Berkeley, Berkeley, CA 94720, United States)
We investigate the statistical and computational limits of latent Diffusion Transformers (DiTs) under the low-dimensional linear latent space assumption. Statistically, we study the universal approximation and sample ...
AlignAb: Pareto-Optimal Energy Alignment for Designing Nature-Like Antibodies
arXiv, 2024
Authors: Wen, Yibo; Xu, Chenwei; Hu, Jerry Yao-Chieh; Liu, Han (Center for Foundation Models and Generative AI, Northwestern University, Evanston, IL 60208, United States; Department of Computer Science, Northwestern University, Evanston, IL 60208, United States; Department of Statistics and Data Science, Northwestern University, Evanston, IL 60208, United States)
We present a three-stage framework for training deep learning models specializing in antibody sequence-structure co-design. We first pre-train a language model on millions of antibody sequences. Then, we employ...
In-Context Deep Learning via Transformer Models
arXiv, 2024
Authors: Wu, Weimin; Su, Maojiang; Hu, Jerry Yao-Chieh; Song, Zhao; Liu, Han (Center for Foundation Models and Generative AI, Northwestern University, Evanston, IL 60208, United States; Department of Computer Science, Northwestern University, Evanston, IL 60208, United States; Simons Institute for the Theory of Computing, UC Berkeley, Berkeley, CA 94720, United States; Department of Statistics and Data Science, Northwestern University, Evanston, IL 60208, United States)
We investigate the transformer’s capability to simulate the training process of deep models via in-context learning (ICL), i.e., in-context deep learning. Our key contribution is providing a positive example of using...
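One classical way to make "simulating training in context" concrete is the one-step linear view from prior ICL work, not this paper's deeper construction (all names and sizes below are illustrative): a single gradient-descent step on in-context least squares from w = 0 gives w1 = η Xᵀy, a map a linear-attention layer can realize.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 32, 4
X = rng.standard_normal((n, d))     # in-context inputs x_i
w_true = rng.standard_normal(d)
y = X @ w_true                      # in-context labels y_i
eta = 0.01
# One gradient step on 0.5 * ||X w - y||^2 starting from w = 0:
#   w1 = w0 - eta * X^T (X w0 - y) = eta * X^T y
w1 = eta * (X.T @ y)
x_query = rng.standard_normal(d)
pred = x_query @ w1                 # the "in-context" prediction for a new query
```

Stacking such steps across layers is the sense in which a forward pass can emulate an iterative training procedure on the in-context examples.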