
Refine Search Results

Document Type

  • 14,463 conference papers
  • 653 journal articles
  • 101 books
  • 40 theses
  • 1 technical report

Collection Scope

  • 15,257 electronic documents
  • 1 print holding


Subject Classification

  • 10,943 Engineering
    • 10,283 Computer Science and Technology
    • 5,409 Software Engineering
    • 1,461 Information and Communication Engineering
    • 953 Electrical Engineering
    • 879 Control Science and Engineering
    • 446 Biological Engineering
    • 221 Cyberspace Security
    • 220 Chemical Engineering and Technology
    • 186 Mechanical Engineering
    • 174 Biomedical Engineering (degree awardable in Engineering or Medicine)
    • 141 Electronic Science and Technology (degree awardable in Engineering or Science)
    • 100 Instrument Science and Technology
    • 100 Safety Science and Engineering
  • 2,473 Science
    • 1,150 Mathematics
    • 649 Physics
    • 518 Biology
    • 391 Statistics (degree awardable in Science or Economics)
    • 241 Systems Science
    • 232 Chemistry
  • 2,417 Management
    • 1,748 Library, Information and Archives Management
    • 758 Management Science and Engineering (degree awardable in Management or Engineering)
    • 240 Business Administration
    • 104 Public Administration
  • 1,761 Literature
    • 1,709 Foreign Languages and Literatures
    • 184 Chinese Language and Literature
  • 510 Medicine
    • 299 Clinical Medicine
    • 282 Basic Medicine (degree awardable in Medicine or Science)
    • 112 Public Health and Preventive Medicine
  • 277 Law
    • 249 Sociology
  • 237 Education
    • 224 Education
  • 100 Agriculture
  • 97 Economics
  • 9 Art
  • 7 Philosophy
  • 4 Military Science

Topics

  • 3,534 natural language...
  • 1,768 natural language...
  • 952 computational li...
  • 741 semantics
  • 680 machine learning
  • 609 deep learning
  • 520 natural language...
  • 347 computational mo...
  • 336 training
  • 333 accuracy
  • 331 sentiment analys...
  • 329 large language m...
  • 320 feature extracti...
  • 311 data mining
  • 290 speech processin...
  • 261 speech recogniti...
  • 252 transformers
  • 235 neural networks
  • 217 iterative method...
  • 212 support vector m...

Institutions

  • 85 Carnegie Mellon ...
  • 51 University of Ch...
  • 45 Tsinghua Univers...
  • 45 Carnegie Mellon ...
  • 43 Zhejiang Univers...
  • 43 National Univers...
  • 38 Nanyang Technolo...
  • 36 University of Wa...
  • 35 Univ Chinese Aca...
  • 34 University of Sc...
  • 34 Carnegie Mellon ...
  • 33 Stanford Univers...
  • 32 Gaoling School o...
  • 32 School of Artifi...
  • 32 Alibaba Grp Peop...
  • 29 Tsinghua Univ De...
  • 28 Harbin Institute...
  • 27 Language Technol...
  • 27 Peking Universit...
  • 26 Microsoft Resear...

Authors

  • 55 Zhou Guodong
  • 50 Neubig Graham
  • 46 Liu Yang
  • 39 Sun Maosong
  • 36 Zhang Min
  • 34 Liu Qun
  • 33 Smith Noah A.
  • 28 Schütze Hinrich
  • 27 Liu Zhiyuan
  • 27 Lapata Mirella
  • 26 Wen Ji-Rong
  • 24 Chang Kai-Wei
  • 23 Zhou Jie
  • 23 Yang Diyi
  • 23 Zhao Hai
  • 23 Zhao Wayne Xin
  • 21 Chua Tat-Seng
  • 20 Dredze Mark
  • 18 Biemann Chris
  • 18 Fung Pascale

Language

  • 14,663 English
  • 481 other
  • 105 Chinese
  • 18 French
  • 15 Turkish
  • 2 Spanish
  • 2 Russian
Search query: "Any field = Conference on empirical methods in natural language processing"
15,258 records; results 301-310 are shown below.
Knowledge Graph Enhanced Large Language Model Editing
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Zhang, Mengqi; Ye, Xiaotian; Liu, Qiang; Ren, Pengjie; Wu, Shu; Chen, Zhumin (School of Computer Science and Technology, Shandong University, China; School of Computer Science, Beijing University of Posts and Telecommunications, China; Institute of Automation, Chinese Academy of Sciences, China)
Large language models (LLMs) are pivotal in advancing natural language processing (NLP) tasks, yet their efficacy is hampered by inaccuracies and outdated knowledge. Model editing emerges as a promising solution to ad...
Retrofitting Light-weight Language Models for Emotions using Supervised Contrastive Learning
Conference on Empirical Methods in Natural Language Processing (EMNLP)
Authors: Shah, Sapan; Reddy, Sreedhar; Bhattacharyya, Pushpak (Tata Consultancy Services, TCS Research, Pune, Maharashtra, India; Indian Institute of Technology, Mumbai, Maharashtra, India)
We present a novel retrofitting method to induce emotion aspects into pre-trained language models (PLMs) such as BERT and RoBERTa. Our method updates pre-trained network weights using contrastive learning so that the ...
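
The abstract above names the key ingredients: a pre-trained encoder, emotion labels, and a contrastive objective that reshapes the embedding space. As a rough sketch only, this is what a standard supervised contrastive loss over sentence embeddings looks like; the batch construction, the temperature, and the assumption that this matches the paper's exact objective are all mine.

    import torch
    import torch.nn.functional as F

    def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
        """Standard supervised contrastive loss: same-label embeddings are
        pulled together, different-label ones pushed apart."""
        z = F.normalize(embeddings, dim=1)               # (N, d) unit vectors
        sim = z @ z.T / temperature                      # (N, N) scaled cosine sims
        self_mask = torch.eye(z.size(0), dtype=torch.bool, device=z.device)
        sim = sim.masked_fill(self_mask, float("-inf"))  # never contrast with self
        log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
        pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
        pos_counts = pos_mask.sum(dim=1)
        # mean log-probability of each anchor's positives; anchors without
        # any positive in the batch are dropped from the average
        loss = -(log_prob.masked_fill(~pos_mask, 0.0).sum(dim=1)
                 / pos_counts.clamp(min=1))
        return loss[pos_counts > 0].mean()

    # Toy usage: four "sentence embeddings" with emotion labels 0/0/1/1.
    emb = torch.randn(4, 8, requires_grad=True)
    print(supervised_contrastive_loss(emb, torch.tensor([0, 0, 1, 1])))

Backpropagating this loss into the encoder, rather than training a separate head, is what makes the approach a retrofit of the PLM's weights.
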
Communicating with Speakers and Listeners of Different Pragmatic Levels
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Naszádi, Kata; Oliehoek, Frans A.; Monz, Christof (Language Technology Lab, University of Amsterdam, Netherlands; Delft University of Technology, Netherlands)
This paper explores the impact of variable pragmatic competence on communicative success through simulating language learning and conversing between speakers and listeners with different levels of reasoning abilities....
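
The abstract does not name its formalism, but graded "levels" of speaker and listener reasoning are commonly made precise with the Rational Speech Acts recursion; the toy reference game below is an illustrative assumption, not necessarily this paper's setup.

    import numpy as np

    # Rows = utterances, columns = referents. "hat" is true of both friends,
    # "glasses" only of the second: the classic ambiguous reference game.
    truth = np.array([[1.0, 1.0],    # "my friend with the hat"
                      [0.0, 1.0]])   # "my friend with the glasses"

    def normalize(m, axis):
        s = m.sum(axis=axis, keepdims=True)
        return np.divide(m, s, out=np.zeros_like(m), where=s > 0)

    L0 = normalize(truth, axis=1)    # literal listener: P(referent | utterance)
    S1 = normalize(L0, axis=0)       # level-1 speaker:  P(utterance | referent)
    L1 = normalize(S1, axis=1)       # level-1 listener inverts the speaker

    print(L1[0])   # hearing "hat": [0.75 0.25], favoring the hat-only friend

Hearing the ambiguous "hat", the level-1 listener shifts probability toward the referent for whom no better utterance existed; deeper levels of competence iterate the same normalize-and-invert step.
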
Homophone Disambiguation Reveals Patterns of Context Mixing in Speech Transformers
Conference on Empirical Methods in Natural Language Processing (EMNLP)
Authors: Mohebbi, Hosein; Chrupala, Grzegorz; Zuidema, Willem; Alishahi, Afra (Tilburg University, CSAI, Tilburg, Netherlands; University of Amsterdam, ILLC, Amsterdam, Netherlands)
Transformers have become a key architecture in speech processing, but our understanding of how they build up representations of acoustic and linguistic structure is limited. In this study, we address this gap by inves...
TelBench: A Benchmark for Evaluating Telco-Specific Large Language Models
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Lee, Sunwoo; Arya, Dhammiko; Cho, Seung-Mo; Han, Gyoung-Eun; Hong, Seokyoung; Jang, Wonbeom; Lee, Seojin; Park, Sohee; Sek, Sereimony; Song, Injee; Yoon, Sungbin; Davis, Eric (SK Telecom, Republic of Korea)
The telecommunications industry, characterized by its vast customer base and complex service offerings, necessitates a high level of domain expertise and proficiency in customer service center operations. Consequently...
Empowering Multi-step Reasoning across Languages via Program-Aided Language Models
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Ranaldi, Leonardo; Pucci, Giulia; Haddow, Barry; Birch, Alexandra (School of Informatics, University of Edinburgh, United Kingdom; Department of Computing Science, University of Aberdeen, United Kingdom)
In-context learning methods are commonly employed as inference strategies, where Large Language Models (LLMs) are elicited to solve a task by leveraging provided demonstrations without requiring parameter updates. Amo...
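
Program-aided language models answer by emitting executable code, with the final answer coming from running that code rather than from free-form generation. A minimal sketch of the loop; `llm_generate` is a stub standing in for any real model call, and its hard-coded return value is the kind of program such a prompt is meant to elicit.

    PROMPT = (
        "Q: A pack has 12 pens. Ali buys 3 packs and gives away 7 pens. "
        "How many pens does he have left?\n"
        "Answer with Python that stores the result in `answer`."
    )

    def llm_generate(prompt):
        # Stand-in for a real text-generation backend.
        return "answer = 3 * 12 - 7"

    def solve(prompt):
        code = llm_generate(prompt)   # 1) the model writes a program
        scope = {}
        exec(code, {}, scope)         # 2) the interpreter, not the model, computes
        return scope["answer"]        # 3) read off the executed result

    print(solve(PROMPT))              # -> 29

Delegating the arithmetic to the interpreter is what makes this style robust for multi-step reasoning: the model only has to get the program right, not the computation.
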
AI for Science in the Era of Large Language Models
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Bi, Zhenyu; Xu, Minghao; Tang, Jian; Wang, Xuan (Department of Computer Science, Virginia Tech, United States; Mila - Quebec AI Institute, Canada)
The capabilities of AI in the realm of science span a wide spectrum, from the atomic level, where it solves partial differential equations for quantum systems, to the molecular level, predicting chemical or protein st...
Deciphering Stereotypes in Pre-Trained Language Models
Conference on Empirical Methods in Natural Language Processing (EMNLP)
Authors: Ma, Weicheng; Scheible, Henry; Wang, Brian; Veeramachaneni, Goutham; Chowdhary, Pratim; Sung, Alan; Koulogeorge, Andrew; Wang, Lili; Yang, Diyi; Vosoughi, Soroush (Dartmouth College, Department of Computer Science, Hanover, NH 03755, USA; Stanford University, Department of Computer Science, Stanford, CA 94305, USA)
Warning: This paper discusses content that could potentially trigger discomfort due to the presence of stereotypes. This paper addresses the issue of demographic stereotypes present in Transformer-based pre-trained la...
Learning from Mistakes: Iterative Prompt Relabeling for Text-to-Image Diffusion Model Training
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Chen, Xinyan; Ge, Jiaxin; Zhang, Tianjun; Liu, Jiaming; Zhang, Shanghang (State Key Laboratory of Multimedia Information Processing, School of Computer Science, Peking University, China; University of Science and Technology of China, China; UC Berkeley, United States)
Diffusion models have shown impressive performance in many domains. However, the model's capability to follow natural language instructions (e.g., spatial relationships between objects, generating complex scenes) ...
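
The abstract is cut off before the method, so the following is only a schematic of the iterate-check-relabel loop the title suggests: sample from the model, test whether the output follows the prompt, and relabel failed samples with a description of what was actually generated so they still form usable training pairs. Every component below is a toy stand-in, not the paper's procedure.

    import random

    def sample_image(prompt):                 # stand-in for diffusion sampling
        return "<image for: " + prompt + ">"

    def follows_instructions(image, prompt):  # stand-in for a VLM/detector check
        return random.random() > 0.5

    def caption(image):                       # stand-in for recaptioning the output
        return image[len("<image for: "):-1]

    def relabel_round(prompts):
        """Keep successful (prompt, image) pairs as-is; for failures, swap in a
        caption of what was actually generated, so the sample still forms a
        correctly labeled training pair for the next finetuning round."""
        pairs = []
        for p in prompts:
            img = sample_image(p)
            label = p if follows_instructions(img, p) else caption(img)
            pairs.append((label, img))
        return pairs

    print(relabel_round(["a red cube to the left of a blue ball"]))
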
Self-Detoxifying Language Models via Toxification Reversal
Conference on Empirical Methods in Natural Language Processing (EMNLP)
Authors: Leong, Chak Tou; Cheng, Yi; Wang, Jiashuo; Wang, Jian; Li, Wenjie (Department of Computing, The Hong Kong Polytechnic University, Hong Kong, China)
Language model detoxification aims to minimize the risk of generating offensive or harmful content in pretrained language models (PLMs) for safer deployment. Existing methods can be roughly categorized as finetuning-b...
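
As a generic illustration of prompting-based self-detoxification (a simple logit-steering contrast, not this paper's attention-level "toxification reversal"): compute next-token logits with and without a toxifying prefix, then steer generation away from the induced shift. The prefix, the GPT-2 checkpoint, and the steering strength alpha are all assumptions.

    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tok = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2").eval()

    context = "The people in the comment section are"
    toxifier = "Say something rude and disrespectful: "  # assumed negative prefix

    with torch.no_grad():
        plain = model(**tok(context, return_tensors="pt")).logits[0, -1]
        toxed = model(**tok(toxifier + context, return_tensors="pt")).logits[0, -1]

    alpha = 1.0                                # steering strength (assumed)
    steered = plain - alpha * (toxed - plain)  # move away from the toxic shift
    print(tok.decode(steered.argmax().item()))

The appeal of this family of methods, as the abstract notes, is that no finetuning or external classifier is required: the same frozen PLM supplies both the toxified and the detoxified behavior.
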