Refine Search Results

Document Type

  • 14,558 Conference papers
  • 663 Journal articles
  • 101 Books
  • 40 Theses and dissertations
  • 1 Technical report

Collection

  • 15,362 Electronic documents
  • 1 Print holding

Date Distribution

Discipline Classification

  • 11,025 Engineering
    • 10,359 Computer Science and Technology...
    • 5,436 Software Engineering
    • 1,474 Information and Communication Engineering
    • 963 Electrical Engineering
    • 925 Control Science and Engineering
    • 446 Biological Engineering
    • 223 Cyberspace Security
    • 220 Chemical Engineering and Technology
    • 187 Mechanical Engineering
    • 175 Biomedical Engineering (conferrable in...
    • 144 Electronic Science and Technology (conferrable in...
    • 102 Instrument Science and Technology
    • 99 Safety Science and Engineering
  • 2,494 Natural Sciences
    • 1,163 Mathematics
    • 655 Physics
    • 520 Biology
    • 395 Statistics (conferrable in science...
    • 241 Systems Science
    • 235 Chemistry
  • 2,427 Management
    • 1,755 Library, Information and Archives Management...
    • 760 Management Science and Engineering (conferrable in...
    • 241 Business Administration
    • 106 Public Administration
  • 1,761 Literature
    • 1,709 Foreign Languages and Literatures
    • 184 Chinese Language and Literature
  • 514 Medicine
    • 303 Clinical Medicine
    • 284 Basic Medicine (conferrable in...
    • 113 Public Health and Preventive Medicine...
  • 278 Law
    • 249 Sociology
  • 238 Education
    • 225 Education
  • 100 Agriculture
  • 98 Economics
  • 9 Art
  • 7 Philosophy
  • 4 Military Science

Subject

  • 3,557 natural language...
  • 1,786 natural language...
  • 953 computational li...
  • 740 semantics
  • 682 machine learning
  • 613 deep learning
  • 520 natural language...
  • 352 computational mo...
  • 343 accuracy
  • 339 training
  • 335 large language m...
  • 335 sentiment analys...
  • 325 feature extracti...
  • 312 data mining
  • 290 speech processin...
  • 260 speech recogniti...
  • 256 transformers
  • 236 neural networks
  • 218 iterative method...
  • 212 support vector m...

Institution

  • 85 carnegie mellon ...
  • 52 university of ch...
  • 46 tsinghua univers...
  • 45 carnegie mellon ...
  • 43 zhejiang univers...
  • 43 national univers...
  • 38 nanyang technolo...
  • 36 university of sc...
  • 36 university of wa...
  • 35 univ chinese aca...
  • 34 carnegie mellon ...
  • 33 gaoling school o...
  • 33 stanford univers...
  • 32 school of artifi...
  • 32 alibaba grp peop...
  • 29 tsinghua univ de...
  • 28 harbin institute...
  • 26 microsoft resear...
  • 26 language technol...
  • 26 peking universit...

Author

  • 55 zhou guodong
  • 50 neubig graham
  • 46 liu yang
  • 39 sun maosong
  • 36 zhang min
  • 34 liu qun
  • 33 smith noah a.
  • 28 schütze hinrich
  • 27 liu zhiyuan
  • 26 wen ji-rong
  • 26 lapata mirella
  • 24 chang kai-wei
  • 23 zhou jie
  • 23 yang diyi
  • 23 zhao hai
  • 23 zhao wayne xin
  • 21 chua tat-seng
  • 20 dredze mark
  • 18 biemann chris
  • 18 fung pascale

Language

  • 14,282 English
  • 966 Other
  • 113 Chinese
  • 18 French
  • 14 Turkish
  • 2 German
  • 2 Spanish
  • 2 Russian
Search query: Any field = "Conference on empirical methods in natural language processing"
15,363 records; showing 751-760
Losing Visual Needles in Image Haystacks: Vision Language Models are Easily Distracted in Short and Long Contexts
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Sharma, Aditya; Saxon, Michael; Wang, William Yang (University of California, Santa Barbara, United States)
We present LOCOVQA, a dynamic benchmark generator for evaluating long-context extractive reasoning in vision language models (VLMs). LOCOVQA augments test examples for mathematical reasoning, VQA, and character recogn...
How to Leverage Demonstration Data in Alignment for Large Language Model? A Self-Imitation Learning Perspective
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Xiao, Teng; Li, Mingxiao; Yuan, Yige; Zhu, Huaisheng; Cui, Chao; Honavar, Vasant G. (Artificial Intelligence Research Laboratory, Pennsylvania State University, United States; Tencent AI Lab, China; Institute of Computing Technology, Chinese Academy of Sciences, China; Tsinghua University, China)
This paper introduces a novel generalized self-imitation learning (GSIL) framework, which effectively and efficiently aligns large language models with offline demonstration data. We develop GSIL by deriving a surroga...
Scaling Parameter-Constrained Language Models with Quality Data
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Chang, Ernie; Paltenghi, Matteo; Li, Yang; Lin, Pin-Jie; Zhao, Changsheng; Huber, Patrick; Liu, Zechun; Rabatin, Rastislav; Shi, Yangyang; Chandra, Vikas (AI at Meta, United States; Iowa State University, United States; Virginia Tech, United States)
Scaling laws in language modeling traditionally quantify training loss as a function of dataset size and model parameters, providing compute-optimal estimates but often neglecting the impact of data quality on model g...
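For orientation, scaling-law analyses of this kind commonly start from a parametric loss model such as the Chinchilla form from earlier work (the abstract above is truncated, so this paper's exact formulation, in particular how it accounts for data quality, may differ):

    L(N, D) = E + \frac{A}{N^{\alpha}} + \frac{B}{D^{\beta}}

where N is the number of model parameters, D the number of training tokens, and E, A, B, \alpha, \beta are fitted constants; compute-optimal choices of N and D are then obtained by minimizing L under a fixed compute budget, roughly C \approx 6ND.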
Unveiling Multi-level and Multi-modal Semantic Representations in the Human Brain using Large Language Models
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Nakagi, Yuko; Matsuyama, Takuya; Koide-Majima, Naoko; Yamaguchi, Hiroto Q.; Kubo, Rieko; Nishimoto, Shinji; Takagi, Yu (Osaka University, Japan; National Institute of Information and Communications Technology, Japan; National Institute of Informatics, Japan)
In recent studies, researchers have used large language models (LLMs) to explore semantic representations in the brain; however, they have typically assessed different levels of semantic content, such as speech, object...
A Unified Framework and Dataset for Assessing Societal Bias in Vision-Language Models
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Sathe, Ashutosh; Jain, Prachi; Sitaram, Sunayana (Microsoft Research India (MSR))
Vision-language models (VLMs) have gained widespread adoption in both industry and academia. In this study, we propose a unified framework for systematically evaluating gender, race, and age biases in VLMs with respec...
Word-Conditioned 3D American Sign Language Motion Generation
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Dong, Lu; Wang, Xiao; Nwogu, Ifeoma (University at Buffalo, SUNY, United States)
Sign words are the building blocks of any sign language. In this work, we present wSignGen, a word-conditioned 3D American Sign Language (ASL) generation model dedicated to synthesizing realistic and grammatically accurate mot...
Can LMs Generalize to Future Data? An Empirical Analysis on Text Summarization
Conference on Empirical Methods in Natural Language Processing (EMNLP)
Authors: Cheang, Chi Seng; Chan, Hou Pong; Wong, Derek F.; Liu, Xuebo; Li, Zhaocong; Sun, Yanming; Liu, Shudong; Chao, Lidia S. (Univ Macau, Dept Comp & Informat Sci, NLP2CT Lab, Macau, Peoples R China; Univ Macau, Inst Collaborat Innovat, Macau, Peoples R China; Harbin Inst Technol, Inst Comp & Intelligence, Shenzhen, Peoples R China)
Recent pre-trained language models (PLMs) achieve promising results in existing abstractive summarization datasets. However, existing summarization benchmarks overlap in time with the standard pre-training corpora and...
Tasneef: A Fast and Effective Hybrid Representation Approach for Arabic Text Classification
IEEE ACCESS, 2024, Vol. 12, pp. 120804-120826
Authors: Louail, Maroua; Hamdi-Cherif, Chafia Kara-Mohamed; Hamdi-Cherif, Aboubekeur (Ferhat Abbas Univ Setif 1, Comp Sci Dept, LRSD Lab, Setif 19000, Algeria; Ferhat Abbas Univ Setif 1, Comp Sci Dept, Setif 19000, Algeria)
The role of the Arabic language in current global affairs calls for sophisticated natural language processing techniques, especially in text classification. This paper presents Tasneef as a novel hybrid approach to tackle compu...
Distilling Instruction-following Abilities of Large Language Models with Task-aware Curriculum Planning
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Yue, Yuanhao; Wang, Chengyu; Huang, Jun; Wang, Peng (School of Computer Science, Fudan University, Shanghai, China; Alibaba Cloud Computing, Hangzhou, China)
Instruction tuning aims to align large language models (LLMs) with open-domain instructions and human-preferred responses. While several studies have explored autonomous approaches to distilling and annotating instruc...
Providing a Natural Language Processing App for Language Teachers
26th International Conference on Interactive Collaborative Learning (ICL) - Towards a Hybrid, Flexible and Socially Engaged Higher Education / 52nd IGIP International Conference on Engineering Pedagogy
Authors: Posekany, Alexandra; Dolezal, Dominik (TU Vienna Univ Technol, Vienna, Austria; TGM Vienna Inst Technol, Vienna, Austria; Univ Vienna, Vienna, Austria)
Natural language processing (NLP) is a common application of Artificial Intelligence. The goal is to provide language teachers with a simple-to-apply tool for topic model analyses to integrate into their classroom. T...
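As a rough illustration of what a topic-model analysis of classroom texts involves (a generic sketch, not the authors' app; the `documents` list and the choice of scikit-learn are assumptions for illustration):

    # Generic topic-model sketch (illustrative only; not the tool described above).
    # Assumes scikit-learn is installed; `documents` stands in for a teacher's class texts.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    documents = [
        "The water cycle moves water between oceans, air, and land.",
        "Evaporation and condensation drive clouds and rain.",
        "Plants use photosynthesis to turn sunlight into energy.",
        "Leaves absorb light and release oxygen during photosynthesis.",
    ]

    # Bag-of-words counts, then a two-topic LDA model.
    vectorizer = CountVectorizer(stop_words="english")
    doc_term = vectorizer.fit_transform(documents)
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(doc_term)

    # Print the top words per topic so a teacher can label the themes.
    terms = vectorizer.get_feature_names_out()
    for idx, topic in enumerate(lda.components_):
        top = [terms[i] for i in topic.argsort()[-5:][::-1]]
        print(f"Topic {idx}: {', '.join(top)}")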