
Refine Search Results

Document Type

  • 14,463 conference papers
  • 654 journal articles
  • 101 books
  • 40 theses and dissertations
  • 1 technical report

Collection Scope

  • 15,258 electronic documents
  • 1 print holding

Date Distribution

Subject Classification

  • 10,944 Engineering
    • 10,283 Computer Science and Technology...
    • 5,408 Software Engineering
    • 1,463 Information and Communication Engineering
    • 954 Electrical Engineering
    • 880 Control Science and Engineering
    • 446 Bioengineering
    • 221 Cyberspace Security
    • 220 Chemical Engineering and Technology
    • 186 Mechanical Engineering
    • 174 Biomedical Engineering (degree awardable...
    • 142 Electronic Science and Technology (degree awardable...
    • 101 Instrument Science and Technology
    • 99 Safety Science and Engineering
  • 2,473 Science
    • 1,150 Mathematics
    • 649 Physics
    • 518 Biology
    • 391 Statistics (degree awardable in Science,...
    • 241 Systems Science
    • 232 Chemistry
  • 2,416 Management
    • 1,748 Library, Information and Archives Management...
    • 757 Management Science and Engineering (degree awardable...
    • 239 Business Administration
    • 104 Public Administration
  • 1,761 Literature
    • 1,709 Foreign Languages and Literatures
    • 184 Chinese Language and Literature
  • 510 Medicine
    • 299 Clinical Medicine
    • 283 Basic Medicine (degree awardable in Medicine...
    • 111 Public Health and Preventive Medicine...
  • 276 Law
    • 248 Sociology
  • 237 Education
    • 224 Education
  • 100 Agriculture
  • 96 Economics
  • 9 Art
  • 7 Philosophy
  • 4 Military Science

Topics

  • 3,535 篇 natural language...
  • 1,768 篇 natural language...
  • 952 篇 computational li...
  • 740 篇 semantics
  • 681 篇 machine learning
  • 609 篇 deep learning
  • 520 篇 natural language...
  • 347 篇 computational mo...
  • 338 篇 training
  • 333 篇 accuracy
  • 331 篇 sentiment analys...
  • 329 篇 large language m...
  • 321 篇 feature extracti...
  • 311 篇 data mining
  • 290 篇 speech processin...
  • 260 篇 speech recogniti...
  • 252 篇 transformers
  • 235 篇 neural networks
  • 217 篇 iterative method...
  • 212 篇 support vector m...

Institutions

  • 85 篇 carnegie mellon ...
  • 51 篇 university of ch...
  • 45 篇 tsinghua univers...
  • 45 篇 carnegie mellon ...
  • 43 篇 zhejiang univers...
  • 43 篇 national univers...
  • 38 篇 nanyang technolo...
  • 36 篇 university of wa...
  • 35 篇 univ chinese aca...
  • 34 篇 university of sc...
  • 34 篇 carnegie mellon ...
  • 33 篇 stanford univers...
  • 32 篇 gaoling school o...
  • 32 篇 school of artifi...
  • 32 篇 alibaba grp peop...
  • 29 篇 tsinghua univ de...
  • 28 篇 harbin institute...
  • 27 篇 language technol...
  • 27 篇 peking universit...
  • 26 篇 microsoft resear...

Authors

  • 55 篇 zhou guodong
  • 50 篇 neubig graham
  • 46 篇 liu yang
  • 39 篇 sun maosong
  • 36 篇 zhang min
  • 34 篇 liu qun
  • 33 篇 smith noah a.
  • 28 篇 schütze hinrich
  • 27 篇 liu zhiyuan
  • 27 篇 lapata mirella
  • 26 篇 wen ji-rong
  • 24 篇 chang kai-wei
  • 23 篇 zhou jie
  • 23 篇 yang diyi
  • 23 篇 zhao hai
  • 23 篇 zhao wayne xin
  • 21 篇 chua tat-seng
  • 20 篇 dredze mark
  • 18 篇 biemann chris
  • 18 篇 fung pascale

Language

  • 14,662 English
  • 482 Other
  • 106 Chinese
  • 18 French
  • 15 Turkish
  • 2 Spanish
  • 2 Russian
Search criteria: "Any Field = Conference on empirical methods in natural language processing"
15,259 records; results 641-650 are shown below
GOME: Grounding-based Metaphor Binding With Conceptual Elaboration For Figurative Language Illustration
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Zhang, Linhao Liu, Jintao Jin, Li Wang, Hao Wei, Kaiwen Xu, Guangluan Aerospace Information Research Institute Chinese Academy of Sciences China School of Electronic Electrical and Communication Engineering University of Chinese Academy of Sciences China North China University of Technology China Chongqing University China
The illustration or visualization of figurative language, such as linguistic metaphors, is an emerging challenge for existing Large Language Models (LLMs) and multimodal *** to their comparison of seemingly unrelated ...
Learning Knowledge-Enhanced Contextual Language Representations for Domain Natural Language Understanding
Conference on Empirical Methods in Natural Language Processing (EMNLP)
Authors: Zhang, Taolin Xu, Ruyao Wang, Chengyu Duan, Zhongjie Chen, Cen Qiu, Minghui Cheng, Dawei He, Xiaofeng Qian, Weining East China Normal Univ Shanghai Peoples R China Alibaba Grp Hangzhou Peoples R China Tongji Univ Shanghai Peoples R China
Knowledge-Enhanced Pre-trained Language Models (KEPLMs) improve the performance of various downstream NLP tasks by injecting knowledge facts from large-scale Knowledge Graphs (KGs). However, existing methods for pre-t...
Unleashing the Power of Large Language Models in Zero-shot Relation Extraction via Self-Prompting
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Liu, Siyi Li, Yang Li, Jiang Yang, Shan Lan, Yunshi EPFL Lausanne Switzerland MYBank Ant Group Beijing China Beihang University Beijing China East China Normal University Beijing China
Recent research in zero-shot Relation Extraction (RE) has focused on using Large Language Models (LLMs) due to their impressive zero-shot capabilities. However, current methods often perform suboptimally, mainly due t...
LM-Interview: An Easy-to-use Smart Interviewer System via Knowledge-guided Language Model Exploitation
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Li, Hanming Yu, Jifan Li, Ruimiao Hao, Zhanxin Yan, Xuan Yuan, Jiaxin Xu, Bin Li, Juanzi Liu, Zhiyuan Department of Computer Science and Technology BNRist Tsinghua University China Tsinghua University China
Semi-structured interviews are a crucial method of data acquisition in qualitative research. Typically controlled by the interviewer, the process progresses through a question-and-answer format, aimed at eliciting inf...
Nearest Neighbor Machine Translation is Meta-Optimizer on Output Projection Layer
Conference on Empirical Methods in Natural Language Processing (EMNLP)
Authors: Gao, Ruize Zhang, Zhirui Du, Yichao Liu, Lemao Wang, Rui Shanghai Jiao Tong Univ Shanghai Peoples R China Tencent AI Lab Shenzhen Peoples R China Univ Sci & Technol China Hefei Peoples R China
Nearest Neighbor Machine Translation (kNN-MT) has achieved great success in domain adaptation tasks by integrating pre-trained Neural Machine Translation (NMT) models with domain-specific token-level retrieval. Howeve...
Goal-Driven Explainable Clustering via Language Descriptions
Conference on Empirical Methods in Natural Language Processing (EMNLP)
Authors: Wang, Zihan Shang, Jingbo Zhong, Ruiqi
Unsupervised clustering is widely used to explore large corpora, but existing formulations neither consider the users' goals nor explain clusters' meanings. We propose a new task formulation, "Goal-Driven...
Navigating the Grey Area: How Expressions of Uncertainty and Overconfidence Affect Language Models
Conference on Empirical Methods in Natural Language Processing (EMNLP)
Authors: Zhou, Kaitlyn Jurafsky, Dan Hashimoto, Tatsunori Stanford Univ Stanford CA 94305 USA
The increased deployment of LMs for real-world tasks involving knowledge and facts makes it important to understand model epistemology: what LMs think they know, and how their attitudes toward that knowledge are affec...
ExpertEase: A Multi-Agent Framework for Grade-Specific Document Simplification with Large Language Models
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Mo, Kaijie Hu, Renfen School of International Chinese Language Education Beijing Normal University China
Text simplification is crucial for making texts more accessible, yet current research primarily focuses on sentence-level simplification, neglecting document-level simplification and the different reading levels of ta...
IndoCL: Benchmarking Indonesian Language Development Assessment
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Lin, Nankai Wu, Hongyan Zheng, Weixiong Liao, Xingming Jiang, Shengyi Yang, Aimin Xiao, Lixian School of Computer Science and Technology Guangdong University of Technology China College of Computer National University of Defense Technology China School of Information Technology and Engineering Guangzhou College of Commerce China School of Computer Science and Intelligence Education Lingnan Normal University China Faculty of Asian Languages and Cultures Guangdong University of Foreign Studies China
Recently, the field of language acquisition (LA) has significantly benefited from natural language processing technologies. A crucial task in LA involves tracking the evolution of language learners' competence, na...
Document-Level Machine Translation with Large Language Models
Conference on Empirical Methods in Natural Language Processing (EMNLP)
Authors: Wang, Longyue Lyu, Chenyang Ji, Tianbo Zhang, Zhirui Yu, Dian Shi, Shuming Tu, Zhaopeng Tencent AI Lab Shenzhen Guangdong Peoples R China MBZUAI Abu Dhabi U Arab Emirates Dublin City Univ Dublin Ireland
Large language models (LLMs) such as ChatGPT can produce coherent, cohesive, relevant, and fluent answers for various natural language processing (NLP) tasks. Taking document-level machine translation (MT) as a testbed...