Refine Search Results

Document Type

  • 14,463 conference papers
  • 653 journal articles
  • 101 books
  • 40 theses
  • 1 technical report

Collection Scope

  • 15,257 electronic documents
  • 1 print holding

Date Distribution

Subject Classification

  • 10,943 Engineering
    • 10,283 Computer Science and Technology
    • 5,409 Software Engineering
    • 1,461 Information and Communication Engineering
    • 953 Electrical Engineering
    • 879 Control Science and Engineering
    • 446 Bioengineering
    • 221 Cyberspace Security
    • 220 Chemical Engineering and Technology
    • 186 Mechanical Engineering
    • 174 Biomedical Engineering (degrees in...)
    • 141 Electronic Science and Technology (degrees in...)
    • 100 Instrument Science and Technology
    • 100 Safety Science and Engineering
  • 2,473 Science
    • 1,150 Mathematics
    • 649 Physics
    • 518 Biology
    • 391 Statistics (degrees in...)
    • 241 Systems Science
    • 232 Chemistry
  • 2,417 Management
    • 1,748 Library, Information and Archives Management...
    • 758 Management Science and Engineering (degrees in...)
    • 240 Business Administration
    • 104 Public Administration
  • 1,761 Literature
    • 1,709 Foreign Languages and Literature
    • 184 Chinese Language and Literature
  • 510 Medicine
    • 299 Clinical Medicine
    • 282 Basic Medicine (degrees in...)
    • 112 Public Health and Preventive Medicine...
  • 277 Law
    • 249 Sociology
  • 237 Education
    • 224 Education
  • 100 Agriculture
  • 97 Economics
  • 9 Art
  • 7 Philosophy
  • 4 Military Science

Topics

  • 3,534 natural language...
  • 1,768 natural language...
  • 952 computational li...
  • 741 semantics
  • 680 machine learning
  • 609 deep learning
  • 520 natural language...
  • 347 computational mo...
  • 336 training
  • 333 accuracy
  • 331 sentiment analys...
  • 329 large language m...
  • 320 feature extracti...
  • 311 data mining
  • 290 speech processin...
  • 261 speech recogniti...
  • 252 transformers
  • 235 neural networks
  • 217 iterative method...
  • 212 support vector m...

Institutions

  • 85 Carnegie Mellon ...
  • 51 University of Ch...
  • 45 Tsinghua Univers...
  • 45 Carnegie Mellon ...
  • 43 Zhejiang Univers...
  • 43 National Univers...
  • 38 Nanyang Technolo...
  • 36 University of Wa...
  • 35 Univ Chinese Aca...
  • 34 University of Sc...
  • 34 Carnegie Mellon ...
  • 33 Stanford Univers...
  • 32 Gaoling School o...
  • 32 School of Artifi...
  • 32 Alibaba Grp Peop...
  • 29 Tsinghua Univ De...
  • 28 Harbin Institute...
  • 27 Language Technol...
  • 27 Peking Universit...
  • 26 Microsoft Resear...

Authors

  • 55 Zhou Guodong
  • 50 Neubig Graham
  • 46 Liu Yang
  • 39 Sun Maosong
  • 36 Zhang Min
  • 34 Liu Qun
  • 33 Smith Noah A.
  • 28 Schütze Hinrich
  • 27 Liu Zhiyuan
  • 27 Lapata Mirella
  • 26 Wen Ji-rong
  • 24 Chang Kai-wei
  • 23 Zhou Jie
  • 23 Yang Diyi
  • 23 Zhao Hai
  • 23 Zhao Wayne Xin
  • 21 Chua Tat-seng
  • 20 Dredze Mark
  • 18 Biemann Chris
  • 18 Fung Pascale

Language

  • 14,663 English
  • 481 Other
  • 105 Chinese
  • 18 French
  • 15 Turkish
  • 2 Spanish
  • 2 Russian

Search query: "Any field = Conference on empirical methods in natural language processing"
15,258 records; showing results 571-580
Pre-training Language Models for Comparative Reasoning

Conference on Empirical Methods in Natural Language Processing (EMNLP)
Authors: Yu, Mengxia; Zhang, Zhihan; Yu, Wenhao; Jiang, Meng (Univ Notre Dame, Notre Dame, IN 46556, USA; Tencent AI Seattle Lab, Seattle, WA, USA)
Comparative reasoning is a process of comparing objects, concepts, or entities to draw conclusions, which constitutes a fundamental cognitive ability. In this paper, we propose a novel framework to pre-train language ...
An Effective Deployment of Diffusion LM for Data Augmentation in Low-Resource Sentiment Classification

2024 Conference on Empirical Methods in Natural Language Processing (EMNLP 2024)
Authors: Chen, Zhuowei; Wang, Lianxi; Wu, Yuben; Liao, Xinfeng; Tian, Yujia; Zhong, Junyang (Guangdong University of Foreign Studies, Guangzhou, China; Guangzhou Key Laboratory of Multilingual Intelligent Processing, Guangzhou, China)
Sentiment classification (SC) often suffers from low-resource challenges such as domain-specific contexts, imbalanced label distributions, and few-shot scenarios. The potential of the diffusion language model (LM) for ...
ITINERA: Integrating Spatial Optimization with Large Language Models for Open-domain Urban Itinerary Planning

2024 Conference on Empirical Methods in Natural Language Processing (EMNLP 2024)
Authors: Tang, Yihong; Wang, Zhaokai; Qu, Ao; Yan, Yihao; Wu, Zhaofeng; Zhuang, Dingyi; Kai, Jushi; Hou, Kebing; Guo, Xiaotong; Zhao, Jinhua; Zhao, Zhan; Ma, Wei (Tutu AI; University of Hong Kong, Hong Kong; Shanghai Jiao Tong University, China; Massachusetts Institute of Technology, United States; The Hong Kong Polytechnic University, Hong Kong)
Citywalk, a recently popular form of urban travel, requires genuine personalization and understanding of fine-grained requests compared to traditional itinerary planning. In this paper, we introduce the novel task of ...
Conditional Language Policy: A General Framework for Steerable Multi-Objective Finetuning

2024 Conference on Empirical Methods in Natural Language Processing (EMNLP 2024)
Authors: Wang, Kaiwen; Kidambi, Rahul; Sullivan, Ryan; Agarwal, Alekh; Dann, Christoph; Michi, Andrea; Gelmi, Marco; Li, Yunxuan; Gupta, Raghav; Dubey, Avinava; Ramé, Alexandre; Ferret, Johan; Cideron, Geoffrey; Hou, Le; Yu, Hongkun; Ahmed, Amr; Mehta, Aranyak; Hussenot, Léonard; Bachem, Olivier; Leurent, Edouard (Google, United States)
Reward-based finetuning is crucial for aligning language policies with intended behaviors (e.g., creativity and safety). A key challenge is to develop steerable language models that trade off multiple (conflicting) ob...
Enhancing Advanced Visual Reasoning Ability of Large Language Models

2024 Conference on Empirical Methods in Natural Language Processing (EMNLP 2024)
Authors: Li, Zhiyuan; Liu, Dongnan; Zhang, Chaoyi; Wang, Heng; Xue, Tengfei; Cai, Weidong (School of Computer Science, The University of Sydney, Australia)
Recent advancements in Vision-Language (VL) research have sparked new benchmarks for complex visual reasoning, challenging models' advanced reasoning ability. Traditional Vision-Language Models (VLMs) perform well ...
Pushdown Layers: Encoding Recursive Structure in Transformer Language Models

Conference on Empirical Methods in Natural Language Processing (EMNLP)
Authors: Murty, Shikhar; Sharma, Pratyusha; Andreas, Jacob; Manning, Christopher D. (Stanford Univ, Dept Comp Sci, Stanford, CA 94305, USA; MIT CSAIL, Cambridge, MA, USA)
Recursion is a prominent feature of human language, and fundamentally challenging for self-attention due to the lack of an explicit recursive-state tracking mechanism. Consequently, Transformer language models poorly ...
Make Some Noise: Unlocking Language Model Parallel Inference Capability through Noisy Training

2024 Conference on Empirical Methods in Natural Language Processing (EMNLP 2024)
Authors: Wang, Yixuan; Luo, Xianzhen; Wei, Fuxuan; Liu, Yijun; Zhu, Qingfu; Zhang, Xuanyu; Yang, Qing; Xu, Dongliang; Che, Wanxiang (Harbin Institute of Technology, Harbin, China; Science Technology Co. Ltd., China)
Existing speculative decoding methods typically require additional model structure and training processes to assist the model with draft token generation. This makes the migration of acceleration methods to the new mod...
GPTAraEval: A Comprehensive Evaluation of ChatGPT on Arabic NLP

Conference on Empirical Methods in Natural Language Processing (EMNLP)
Authors: Khondaker, Md Tawkat Islam; Waheed, Abdul; Nagoudi, El Moatez Billah; Abdul-Mageed, Muhammad (Univ British Columbia, Deep Learning & Nat Language Proc Grp, Vancouver, BC, Canada; MBZUAI, Dept Nat Language Proc, Abu Dhabi, U Arab Emirates; MBZUAI, Dept Machine Learning, Abu Dhabi, U Arab Emirates)
ChatGPT's emergence heralds a transformative phase in NLP, particularly demonstrated through its excellent performance on many English benchmarks. However, the model's efficacy across diverse linguistic contex...
Efficiently Computing Susceptibility to Context in Language Models

2024 Conference on Empirical Methods in Natural Language Processing (EMNLP 2024)
Authors: Liu, Tianyu; Du, Kevin; Sachan, Mrinmaya; Cotterell, Ryan (ETH Zurich, Switzerland)
One strength of modern language models is their ability to incorporate information from a user-input context when answering queries. However, they are not equally sensitive to subtle changes in that context. To qu...
Language Models in Dialogue: Conversational Maxims for Human-AI Interactions

2024 Conference on Empirical Methods in Natural Language Processing (EMNLP 2024)
Authors: Miehling, Erik; Nagireddy, Manish; Sattigeri, Prasanna; Daly, Elizabeth M.; Piorkowski, David; Richards, John T. (IBM Research, United States)
Modern language models, while sophisticated, exhibit some inherent shortcomings, particularly in conversational settings. We claim that many of the observed shortcomings can be attributed to violation of one or more c...