Refine Search Results

Document Type

  • 14,549 conference papers
  • 662 journal articles
  • 101 books
  • 40 theses
  • 1 technical report

Collection Scope

  • 15,352 electronic documents
  • 1 print holding

Date Distribution

Subject Classification

  • 11,015 Engineering
    • 10,349 Computer Science and Technology...
    • 5,460 Software Engineering
    • 1,467 Information and Communication Engineering
    • 956 Electrical Engineering
    • 892 Control Science and Engineering
    • 447 Bioengineering
    • 221 Cyberspace Security
    • 220 Chemical Engineering and Technology
    • 186 Mechanical Engineering
    • 177 Biomedical Engineering (...
    • 141 Electronic Science and Technology (...
    • 101 Instrument Science and Technology
    • 100 Safety Science and Engineering
  • 2,486 Science
    • 1,156 Mathematics
    • 654 Physics
    • 520 Biology
    • 394 Statistics (...
    • 241 Systems Science
    • 232 Chemistry
  • 2,427 Management
    • 1,756 Library, Information and Archives Man...
    • 759 Management Science and Engineering (...
    • 241 Business Administration
    • 106 Public Administration
  • 1,762 Literature
    • 1,710 Foreign Languages and Literatures
    • 184 Chinese Language and Literature
  • 515 Medicine
    • 303 Clinical Medicine
    • 286 Basic Medicine (...
    • 113 Public Health and Preventive Med...
  • 279 Law
    • 249 Sociology
  • 239 Education
    • 226 Education
  • 100 Agronomy
  • 96 Economics
  • 10 Art
  • 7 Philosophy
  • 4 Military Science

Topics

  • 3,552 篇 natural language...
  • 1,789 篇 natural language...
  • 953 篇 computational li...
  • 741 篇 semantics
  • 683 篇 machine learning
  • 612 篇 deep learning
  • 520 篇 natural language...
  • 352 篇 computational mo...
  • 343 篇 accuracy
  • 339 篇 training
  • 334 篇 large language m...
  • 334 篇 sentiment analys...
  • 325 篇 feature extracti...
  • 312 篇 data mining
  • 290 篇 speech processin...
  • 260 篇 speech recogniti...
  • 255 篇 transformers
  • 236 篇 neural networks
  • 218 篇 iterative method...
  • 212 篇 support vector m...

Institutions

  • 85 篇 carnegie mellon ...
  • 51 篇 university of ch...
  • 46 篇 tsinghua univers...
  • 45 篇 carnegie mellon ...
  • 43 篇 zhejiang univers...
  • 43 篇 national univers...
  • 38 篇 nanyang technolo...
  • 36 篇 university of sc...
  • 36 篇 university of wa...
  • 35 篇 univ chinese aca...
  • 34 篇 carnegie mellon ...
  • 33 篇 stanford univers...
  • 32 篇 gaoling school o...
  • 32 篇 alibaba grp peop...
  • 31 篇 school of artifi...
  • 29 篇 tsinghua univ de...
  • 28 篇 harbin institute...
  • 27 篇 peking universit...
  • 26 篇 microsoft resear...
  • 26 篇 language technol...

Authors

  • 55 篇 zhou guodong
  • 50 篇 neubig graham
  • 46 篇 liu yang
  • 39 篇 sun maosong
  • 36 篇 zhang min
  • 34 篇 liu qun
  • 33 篇 smith noah a.
  • 28 篇 schütze hinrich
  • 26 篇 wen ji-rong
  • 26 篇 liu zhiyuan
  • 26 篇 lapata mirella
  • 24 篇 chang kai-wei
  • 23 篇 zhou jie
  • 23 篇 yang diyi
  • 23 篇 zhao hai
  • 23 篇 zhao wayne xin
  • 21 篇 chua tat-seng
  • 20 篇 dredze mark
  • 18 篇 biemann chris
  • 18 篇 fung pascale

Language

  • 14,307 English
  • 930 Other
  • 114 Chinese
  • 18 French
  • 14 Turkish
  • 2 German
  • 2 Spanish
  • 2 Russian
Search condition: "Any field = Conference on empirical methods in natural language processing"
15,353 records; showing 1231-1240
Finer: Investigating and Enhancing Fine-Grained Visual Concept Recognition in Large Vision Language Models
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Kim, Jeonghwan; Ji, Heng. University of Illinois Urbana-Champaign, United States
Recent advances in instruction-tuned Large Vision-Language Models (LVLMs) have imbued the models with the ability to generate high-level, image-grounded explanations with ease. While such capability is largely attribu...
FROM WORDS TO WIRES: Generating Functioning Electronic Devices from Natural Language Descriptions
Conference on Empirical Methods in Natural Language Processing (EMNLP)
Authors: Jansen, Peter. Univ Arizona, Tucson, AZ 85721, USA
In this work, we show that contemporary language models have a previously unknown skill - the capacity for electronic circuit design from high-level textual descriptions, akin to code generation. We introduce two benc...
A Comprehensive Survey of Scientific Large Language Models and Their Applications in Scientific Discovery
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Zhang, Yu; Chen, Xiusi; Jin, Bowen; Wang, Sheng; Ji, Shuiwang; Wang, Wei; Han, Jiawei. University of Illinois Urbana-Champaign, United States; University of California Los Angeles, United States; University of Washington, Seattle, United States; Texas A&M University, United States
In many scientific fields, large language models (LLMs) have revolutionized the way text and other modalities of data (e.g., molecules and proteins) are handled, achieving superior performance in various applications ...
MIND: Multimodal Shopping Intention Distillation from Large Vision-Language Models for E-commerce Purchase Understanding
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Xu, Baixuan; Wang, Weiqi; Shi, Haochen; Ding, Wenxuan; Jing, Huihao; Fang, Tianqing; Bai, Jiaxin; Liu, Xin; Yu, Changlong; Li, Zheng; Luo, Chen; Yin, Qingyu; Yin, Bing; Chen, Long; Song, Yangqiu. Department of Computer Science and Engineering, HKUST, Hong Kong; *** Inc., Palo Alto, CA, United States
Improving user experience and providing personalized search results in E-commerce services heavily rely on understanding purchase intention. However, existing methods for acquiring large-scale intentions bank on disti...
Injecting structural hints: Using language models to study inductive biases in language learning
Conference on Empirical Methods in Natural Language Processing (EMNLP)
Authors: Papadimitriou, Isabel; Jurafsky, Dan. Stanford Univ, Comp Sci Dept, Stanford, CA 94305, USA
Both humans and large language models are able to learn language without explicit structural supervision. What inductive biases make this learning possible? We address this fundamental cognitive question by leveraging...
Knowledge-Augmented Language Model Verification
Conference on Empirical Methods in Natural Language Processing (EMNLP)
Authors: Baek, Jinheon; Jeong, Soyeong; Kang, Minki; Park, Jong C.; Hwang, Sung Ju. Korea Adv Inst Sci & Technol, Daejeon, South Korea
Recent Language Models (LMs) have shown impressive capabilities in generating texts with the knowledge internalized in parameters. Yet, LMs often generate factually incorrect responses to the given queries, since ...
Construction and Application of Text Classification Model under Natural Language Processing
International Conference on Modeling, Natural Language Processing and Machine Learning (CMNM)
Authors: Sun, Zhongnuo; Gao, Pan. Technol Vocat Coll Dezhou, Dezhou Electromech Engn Sch, Yucheng 251200, Shandong, Peoples R China
With the prevalence of the Internet and various types of social media, our daily life is surrounded by a huge amount of text information, which can provide us with the convenience of accessing information and communic...
Query-OPT: Optimizing Inference of Large Language Models via Multi-Query Instructions in Meeting Summarization
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Laskar, Md Tahmid Rahman; Khasanova, Elena; Fu, Xue-Yong; Chen, Cheng; Bhushan, Shashi T.N. Dialpad Inc., Vancouver, BC, Canada
This work focuses on the task of query-based meeting summarization, in which the summary of a context (meeting transcript) is generated in response to a specific query. When using Large Language Models (LLMs) for this...
Fisher Information-based Efficient Curriculum Federated Learning with Large Language Models
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Liu, Ji; Ren, Jiaxiang; Jin, Ruoming; Zhang, Zijie; Zhou, Yang; Valduriez, Patrick; Dou, Dejing. HiThink Research, Hangzhou, Zhejiang, China; Auburn University, Auburn, United States; Kent State University, Kent, United States; University of Texas at San Antonio, San Antonio, United States; Inria, University of Montpellier, CNRS, LIRMM, France; LNCC, Petropolis, Brazil; Fudan University, Shanghai, China; BEDI Cloud, Beijing, China
As a promising paradigm to collaboratively train models with decentralized data, Federated Learning (FL) can be exploited to fine-tune Large Language Models (LLMs). While LLMs are huge in size, the scale of the...
Are Large Language Models Good Classifiers? A Study on Edit Intent Classification in Scientific Document Revisions
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Ruan, Qian; Kuznetsov, Ilia; Gurevych, Iryna. Technical University of Darmstadt, Germany
Classification is a core NLP task with many potential applications. While large language models (LLMs) have brought substantial advancements in text generation, their potential for enhancing classificatio...