
Refine Search Results

Document Type

  • 14,558 conference papers
  • 663 journal articles
  • 101 books
  • 40 theses
  • 1 technical report

Holdings

  • 15,362 electronic documents
  • 1 print title

Date Distribution

Subject Classification

  • 11,025 Engineering
    • 10,359 Computer Science and Technology...
    • 5,436 Software Engineering
    • 1,474 Information and Communication Engineering
    • 963 Electrical Engineering
    • 925 Control Science and Engineering
    • 446 Bioengineering
    • 223 Cyberspace Security
    • 220 Chemical Engineering and Technology
    • 187 Mechanical Engineering
    • 175 Biomedical Engineering (...
    • 144 Electronic Science and Technology (...
    • 102 Instrument Science and Technology
    • 99 Safety Science and Engineering
  • 2,494 Science
    • 1,163 Mathematics
    • 655 Physics
    • 520 Biology
    • 395 Statistics (...
    • 241 Systems Science
    • 235 Chemistry
  • 2,427 Management
    • 1,755 Library, Information and Archives Management...
    • 760 Management Science and Engineering (...
    • 241 Business Administration
    • 106 Public Administration
  • 1,761 Literature
    • 1,709 Foreign Languages and Literatures
    • 184 Chinese Language and Literature
  • 514 Medicine
    • 303 Clinical Medicine
    • 284 Basic Medicine (...
    • 113 Public Health and Preventive Medicine...
  • 278 Law
    • 249 Sociology
  • 238 Education
    • 225 Education
  • 100 Agronomy
  • 98 Economics
  • 9 Art
  • 7 Philosophy
  • 4 Military Science

Topics

  • 3,557 natural language...
  • 1,786 natural language...
  • 953 computational li...
  • 740 semantics
  • 682 machine learning
  • 613 deep learning
  • 520 natural language...
  • 352 computational mo...
  • 343 accuracy
  • 339 training
  • 335 large language m...
  • 335 sentiment analys...
  • 325 feature extracti...
  • 312 data mining
  • 290 speech processin...
  • 260 speech recogniti...
  • 256 transformers
  • 236 neural networks
  • 218 iterative method...
  • 212 support vector m...

Institutions

  • 85 carnegie mellon ...
  • 52 university of ch...
  • 46 tsinghua univers...
  • 45 carnegie mellon ...
  • 43 zhejiang univers...
  • 43 national univers...
  • 38 nanyang technolo...
  • 36 university of sc...
  • 36 university of wa...
  • 35 univ chinese aca...
  • 34 carnegie mellon ...
  • 33 gaoling school o...
  • 33 stanford univers...
  • 32 school of artifi...
  • 32 alibaba grp peop...
  • 29 tsinghua univ de...
  • 28 harbin institute...
  • 26 microsoft resear...
  • 26 language technol...
  • 26 peking universit...

Authors

  • 55 zhou guodong
  • 50 neubig graham
  • 46 liu yang
  • 39 sun maosong
  • 36 zhang min
  • 34 liu qun
  • 33 smith noah a.
  • 28 schütze hinrich
  • 27 liu zhiyuan
  • 26 wen ji-rong
  • 26 lapata mirella
  • 24 chang kai-wei
  • 23 zhou jie
  • 23 yang diyi
  • 23 zhao hai
  • 23 zhao wayne xin
  • 21 chua tat-seng
  • 20 dredze mark
  • 18 biemann chris
  • 18 fung pascale

Languages

  • 14,282 English
  • 966 Other
  • 113 Chinese
  • 18 French
  • 14 Turkish
  • 2 German
  • 2 Spanish
  • 2 Russian
Search query: Any Field = "Conference on empirical methods in natural language processing"
15,363 records; showing results 921-930

Automatic Transcription of Handwritten Old Occitan Language
Conference on Empirical Methods in Natural Language Processing (EMNLP)
Authors: Arias, Esteban Garces; Pai, Vallari; Schoeffel, Matthias; Heumann, Christian; Assenmacher, Matthias (LMU Dept Stat, Munich, Germany; Bavarian Acad Sci BAdW, Munich, Germany; LMU Munich Ctr Machine Learning MCML, Munich, Germany)
While existing neural network-based approaches have shown promising results in Handwritten Text Recognition (HTR) for high-resource languages and standardized/machine-written text, their application to low-resource lan...

MADNet: Maximizing Addressee Deduction Expectation for Multi-Party Conversation Generation
Conference on Empirical Methods in Natural Language Processing (EMNLP)
Authors: Gu, Jia-Chen; Tan, Chao-Hong; Chu, Caiyuan; Ling, Zhen-Hua; Tao, Chongyang; Liu, Quan; Liu, Cong (Univ Sci & Technol China, Natl Engn Res Ctr Speech & Language Informat Proc, Hefei, Peoples R China; iFLYTEK Res, Hefei, Peoples R China; Peking Univ, Beijing, Peoples R China; State Key Lab Cognit Intelligence, Beijing, Peoples R China)
Modeling multi-party conversations (MPCs) with graph neural networks has been proven effective at capturing complicated and graphical information flows. However, existing methods rely heavily on the necessary addresse...

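For readers unfamiliar with the graph formulation this abstract refers to, the indented sketch below shows one generic message-passing step over a small conversation graph, where utterances are nodes and reply-to links are edges. The graph, feature size, and mean-aggregation update are invented for illustration and are not the MADNet architecture.

    # Minimal sketch of one message-passing step over a conversation graph.
    # The edges, feature size, and update rule are illustrative assumptions,
    # not the MADNet model itself.
    import numpy as np

    num_utterances, hidden = 4, 8
    rng = np.random.default_rng(0)

    # Node features: one vector per utterance (e.g., an utterance encoding).
    h = rng.normal(size=(num_utterances, hidden))

    # Edge (i, j) means utterance j replies to / addresses utterance i.
    edges = [(0, 1), (0, 2), (1, 3)]
    adj = np.zeros((num_utterances, num_utterances))
    for i, j in edges:
        adj[i, j] = adj[j, i] = 1.0  # treat the graph as undirected for the sketch

    # One GNN layer: average neighbor states, apply a linear map and ReLU.
    w = rng.normal(size=(hidden, hidden)) * 0.1
    deg = adj.sum(axis=1, keepdims=True).clip(min=1.0)
    h_next = np.maximum((adj @ h / deg) @ w, 0.0)

    print(h_next.shape)  # (4, 8): updated representation per utterance
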
Appraising the Potential Uses and Harms of Large Language Models for Medical Systematic Reviews
Conference on Empirical Methods in Natural Language Processing (EMNLP)
Authors: Yun, Hye Sun; Marshall, Iain J.; Trikalinos, Thomas A.; Wallace, Byron C. (Northeastern Univ, Boston, MA 02115, USA; Kings Coll London, London, England; Brown Univ, Providence, RI 02912, USA)
Medical systematic reviews play a vital role in healthcare decision making and policy. However, their production is time-consuming, limiting the availability of high-quality and up-to-date evidence summaries. Recent a...

Editing the Mind of Giants: An In-Depth Exploration of Pitfalls of Knowledge Editing in Large Language Models
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Hsueh, Cheng-Hsun; Huang, Paul Kuo-Ming; Lin, Tzu-Han; Liao, Che-Wei; Fang, Hung-Chieh; Huang, Chao-Wei; Chen, Yun-Nung (National Taiwan University, Taipei, Taiwan)
Knowledge editing is a rising technique for efficiently updating factual knowledge in large language models (LLMs) with minimal alteration of ***, recent studies have identified side effects, such as knowledge distort...

QUITE: Quantifying Uncertainty in Natural Language Text in Bayesian Reasoning Scenarios
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Schrader, Timo Pierre; Lange, Lukas; Razniewski, Simon; Friedrich, Annemarie (Bosch Center for Artificial Intelligence, Renningen, Germany; University of Augsburg, Augsburg, Germany; ScaDS.AI, TU Dresden, Dresden, Germany)
Reasoning is key to many decision making processes. It requires consolidating a set of rule-like premises that are often associated with degrees of uncertainty and observations to draw conclusions. In this work, we ad...

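The probabilistic consolidation described here can be made concrete with a one-step Bayesian update. The premise and observation probabilities in the sketch below are invented for illustration and are not drawn from the QUITE benchmark.

    # Worked example of combining an uncertain rule-like premise with an
    # observation via Bayes' rule. All numbers are illustrative, not from QUITE.

    p_h = 0.30          # prior: P(hypothesis), e.g. "the patient has the flu"
    p_e_given_h = 0.80  # premise: P(evidence | hypothesis), e.g. "fever if flu"
    p_e_given_not_h = 0.10

    # Marginal probability of the evidence.
    p_e = p_e_given_h * p_h + p_e_given_not_h * (1.0 - p_h)

    # Posterior after observing the evidence.
    p_h_given_e = p_e_given_h * p_h / p_e
    print(round(p_h_given_e, 3))  # 0.774: the observation raises P(hypothesis)
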
When LLMs Meet Acoustic Landmarks: An Efficient Approach to Integrate Speech into Large Language Models for Depression Detection
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Zhang, Xiangyu; Liu, Hexin; Xu, Kaishuai; Zhang, Qiquan; Liu, Daijiao; Ahmed, Beena; Epps, Julien (The University of New South Wales, Australia; Nanyang Technological University, Singapore; The Hong Kong Polytechnic University, Hong Kong)
Depression is a critical concern in global mental health, prompting extensive research into AI-based detection methods. Among various AI technologies, Large Language Models (LLMs) stand out for their versatility in me...

DL-QAT: Weight-Decomposed Low-Rank Quantization-Aware Training for Large Language Models
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Ke, Wenjin; Li, Zhe; Li, Dong; Tian, Lu; Barsoum, Emad (Advanced Micro Devices Inc., Beijing, China)
Improving the efficiency of inference in Large Language Models (LLMs) is a critical area of research. Post-training Quantization (PTQ) is a popular technique, but it often faces challenges at low-bit levels, particula...

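As background on the low-bit setting this abstract mentions, the sketch below applies plain symmetric 4-bit weight quantization to a random matrix and measures the round-trip error; it is generic PTQ-style rounding, not the DL-QAT scheme.

    # Minimal sketch of symmetric 4-bit weight quantization (generic PTQ-style
    # rounding, not DL-QAT). Shows the round-trip error low-bit quantization
    # introduces.
    import numpy as np

    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.05, size=(256, 256))  # a fake weight matrix

    bits = 4
    qmax = 2 ** (bits - 1) - 1  # 7 for signed int4

    scale = np.abs(w).max() / qmax                        # per-tensor scale
    w_q = np.clip(np.round(w / scale), -qmax - 1, qmax)   # integer codes
    w_hat = w_q * scale                                   # dequantized weights

    mse = float(np.mean((w - w_hat) ** 2))
    print(f"quantization MSE at {bits} bits: {mse:.2e}")
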
Enhancing Discourse Dependency Parsing with Sentence Dependency Parsing: A Unified Generative Method Based on Code Representation
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Shen, Zizhuo; Shao, Yanqiu; Li, Wei (Beijing Language and Culture University, China)
Due to the high complexity of Discourse Dependency Parsing (DDP) tasks, their existing annotation resources are relatively scarce compared to other NLP tasks, and different DDP tasks also have significant differences ...

Is ChatGPT a Financial Expert? Evaluating Language Models on Financial Natural Language Processing
Conference on Empirical Methods in Natural Language Processing (EMNLP)
Authors: Guo, Yue; Xu, Zian; Yang, Yi (Hong Kong Univ Sci & Technol, Hong Kong, Peoples R China)
The emergence of Large Language Models (LLMs), such as ChatGPT, has revolutionized general natural language processing (NLP) tasks. However, their expertise in the financial domain lacks a comprehensive evaluation....

PERSONALIZED PIECES: Efficient Personalized Large Language Models through Collaborative Efforts
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Tan, Zhaoxuan; Liu, Zheyuan; Jiang, Meng (University of Notre Dame, United States)
Personalized large language models (LLMs) aim to tailor interactions, content, and recommendations to individual user preferences. While parameter-efficient fine-tuning (PEFT) methods excel in performance and generali...

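For context on PEFT, the sketch below illustrates one widely used PEFT technique, a LoRA-style low-rank update added to a frozen weight matrix; the dimensions and rank are arbitrary, and this is not the PERSONALIZED PIECES mechanism.

    # Sketch of a LoRA-style parameter-efficient update: the base weight W is
    # frozen and only the low-rank correction B @ A would be trained.
    # Dimensions and rank are arbitrary; not the PERSONALIZED PIECES method.
    import numpy as np

    d_in, d_out, rank = 512, 512, 8
    rng = np.random.default_rng(0)

    W = rng.normal(scale=0.02, size=(d_out, d_in))   # frozen pretrained weight
    A = rng.normal(scale=0.01, size=(rank, d_in))    # trainable, tiny
    B = np.zeros((d_out, rank))                      # trainable, starts at zero

    def forward(x):
        # Effective weight is W + B @ A; only A and B would receive gradients.
        return x @ (W + B @ A).T

    x = rng.normal(size=(1, d_in))
    print(forward(x).shape)                          # (1, 512)

    full, lora = W.size, A.size + B.size
    print(f"trainable params: {lora} vs {full} ({lora / full:.1%})")
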