
Refine Search Results

Document Type

  • 14,549 conference papers
  • 662 journal articles
  • 101 books
  • 40 theses/dissertations
  • 1 technical report

Collection Scope

  • 15,352 electronic documents
  • 1 print holding

Date Distribution

Discipline Classification

  • 11,015 papers · Engineering
    • 10,349 papers · Computer Science and Technology...
    • 5,460 papers · Software Engineering
    • 1,467 papers · Information and Communication Engineering
    • 956 papers · Electrical Engineering
    • 892 papers · Control Science and Engineering
    • 447 papers · Bioengineering
    • 221 papers · Cyberspace Security
    • 220 papers · Chemical Engineering and Technology
    • 186 papers · Mechanical Engineering
    • 177 papers · Biomedical Engineering (degree conferrable...
    • 141 papers · Electronic Science and Technology (degree conferrable...
    • 101 papers · Instrument Science and Technology
    • 100 papers · Safety Science and Engineering
  • 2,486 papers · Science
    • 1,156 papers · Mathematics
    • 654 papers · Physics
    • 520 papers · Biology
    • 394 papers · Statistics (degree conferrable in Science,...
    • 241 papers · Systems Science
    • 232 papers · Chemistry
  • 2,427 papers · Management
    • 1,756 papers · Library, Information and Archives Manage...
    • 759 papers · Management Science and Engineering (degre...
    • 241 papers · Business Administration
    • 106 papers · Public Administration
  • 1,762 papers · Literature
    • 1,710 papers · Foreign Languages and Literatures
    • 184 papers · Chinese Language and Literature
  • 515 papers · Medicine
    • 303 papers · Clinical Medicine
    • 286 papers · Basic Medicine (degree conferrable in Medi...
    • 113 papers · Public Health and Preventive Medic...
  • 279 papers · Law
    • 249 papers · Sociology
  • 239 papers · Education
    • 226 papers · Education
  • 100 papers · Agronomy
  • 96 papers · Economics
  • 10 papers · Art
  • 7 papers · Philosophy
  • 4 papers · Military Science

Topics

  • 3,552 papers · natural language...
  • 1,789 papers · natural language...
  • 953 papers · computational li...
  • 741 papers · semantics
  • 683 papers · machine learning
  • 612 papers · deep learning
  • 520 papers · natural language...
  • 352 papers · computational mo...
  • 343 papers · accuracy
  • 339 papers · training
  • 334 papers · large language m...
  • 334 papers · sentiment analys...
  • 325 papers · feature extracti...
  • 312 papers · data mining
  • 290 papers · speech processin...
  • 260 papers · speech recogniti...
  • 255 papers · transformers
  • 236 papers · neural networks
  • 218 papers · iterative method...
  • 212 papers · support vector m...

Institutions

  • 85 papers · carnegie mellon ...
  • 51 papers · university of ch...
  • 46 papers · tsinghua univers...
  • 45 papers · carnegie mellon ...
  • 43 papers · zhejiang univers...
  • 43 papers · national univers...
  • 38 papers · nanyang technolo...
  • 36 papers · university of sc...
  • 36 papers · university of wa...
  • 35 papers · univ chinese aca...
  • 34 papers · carnegie mellon ...
  • 33 papers · stanford univers...
  • 32 papers · gaoling school o...
  • 32 papers · alibaba grp peop...
  • 31 papers · school of artifi...
  • 29 papers · tsinghua univ de...
  • 28 papers · harbin institute...
  • 27 papers · peking universit...
  • 26 papers · microsoft resear...
  • 26 papers · language technol...

Authors

  • 55 papers · zhou guodong
  • 50 papers · neubig graham
  • 46 papers · liu yang
  • 39 papers · sun maosong
  • 36 papers · zhang min
  • 34 papers · liu qun
  • 33 papers · smith noah a.
  • 28 papers · schütze hinrich
  • 26 papers · wen ji-rong
  • 26 papers · liu zhiyuan
  • 26 papers · lapata mirella
  • 24 papers · chang kai-wei
  • 23 papers · zhou jie
  • 23 papers · yang diyi
  • 23 papers · zhao hai
  • 23 papers · zhao wayne xin
  • 21 papers · chua tat-seng
  • 20 papers · dredze mark
  • 18 papers · biemann chris
  • 18 papers · fung pascale

Language

  • 14,307 papers · English
  • 930 papers · Other
  • 114 papers · Chinese
  • 18 papers · French
  • 14 papers · Turkish
  • 2 papers · German
  • 2 papers · Spanish
  • 2 papers · Russian
Search condition: "Any field = Conference on empirical methods in natural language processing"
15,353 records in total; showing results 1271-1280
The Comprehensive Analysis of the Effect of Chinese Word Segmentation on Fuzzy-Based Classification Algorithms for Agricultural Questions
INTERNATIONAL JOURNAL OF FUZZY SYSTEMS, 2024, Vol. 26, No. 8, pp. 2726-2749
Authors: Zhao, Xinyue Huang, Jianing Zhang, Jing Song, Yunsheng Shandong Agr Univ Sch Informat Sci & Engn Daizong St Tai An 271018 Shandong Peoples R China Shandong Agr Univ Key Lab Huang Huai Hai Smart Agr Technol Minist Agr & Rural Affairs Daizong St Tai An 271018 Shandong Peoples R China
Fuzzy logic is the core method for handling uncertainty and vagueness of information in agricultural natural language processing, and it also plays a crucial role in word segmentation and text classification algorithm... (see the illustrative sketch below)
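Below is a minimal sketch of the preprocessing pipeline this entry studies: Chinese word segmentation followed by text classification. It uses jieba and scikit-learn with an ordinary linear classifier rather than the paper's fuzzy-based algorithms, and the toy questions and labels are invented for illustration.

```python
# A minimal sketch (not the paper's fuzzy method): segment Chinese
# agricultural questions with jieba, then train a plain linear classifier.
# The toy questions and labels below are invented for illustration.
import jieba
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

questions = [
    "小麦赤霉病怎么防治",      # wheat disease control
    "玉米苗期需要施什么肥",    # corn fertilization
    "苹果树叶子发黄是什么病",  # apple tree disease
    "水稻移栽后如何灌溉",      # rice irrigation
]
labels = ["病害", "施肥", "病害", "灌溉"]  # disease / fertilization / irrigation

def segment(text: str) -> str:
    # Whitespace-join the segmented tokens so TfidfVectorizer can split them back.
    return " ".join(jieba.lcut(text))

model = make_pipeline(
    TfidfVectorizer(token_pattern=r"(?u)\S+"),  # treat each segmented token as a term
    LogisticRegression(max_iter=1000),
)
model.fit([segment(q) for q in questions], labels)
print(model.predict([segment("番茄叶片出现斑点怎么办")]))
```

Comparing the same classifier with and without the segmentation step is the kind of controlled comparison the entry's title describes.
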
FuxiTranyu: A Multilingual Large Language Model Trained with Balanced Data
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Sun, Haoran Jin, Renren Xu, Shaoyang Pan, Leiyu Supryadi Cui, Menglong Du, Jiangcun Lei, Yikun Yang, Lei Shi, Ling Xiao, Juesi Zhu, Shaolin Xiong, Deyi TJUNLP Lab College of Intelligence and Computing Tianjin University China
Large language models (LLMs) have demonstrated prowess in a wide range of tasks. However, many LLMs exhibit significant performance discrepancies between high- and low-resource languages. To mitigate this challenge, w... (see the data-balancing sketch below)
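The entry mentions balancing training data across high- and low-resource languages. A common recipe for this in multilingual pretraining generally is temperature-based sampling; the sketch below illustrates only that idea and is not claimed to be FuxiTranyu's actual data strategy. The corpus sizes are invented.

```python
# Temperature-based sampling: a widely used way to rebalance multilingual
# training data. Shown only as an illustration of "balanced data";
# not claimed to be FuxiTranyu's exact scheme. Corpus sizes are invented.
corpus_sizes = {"en": 1_000_000, "zh": 400_000, "sw": 20_000, "yo": 5_000}

def sampling_probs(sizes: dict, alpha: float = 0.3) -> dict:
    # p_i proportional to n_i ** alpha; alpha < 1 up-weights low-resource languages.
    weights = {lang: n ** alpha for lang, n in sizes.items()}
    total = sum(weights.values())
    return {lang: w / total for lang, w in weights.items()}

print(sampling_probs(corpus_sizes))
# Lower alpha flattens the distribution; alpha = 1 recovers proportional sampling.
```
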
On the Effectiveness of Pre-Trained Language Models for Legal Natural Language Processing: An Empirical Study
IEEE ACCESS, 2022, Vol. 10, pp. 75835-75858
Authors: Song, Dezhao Gao, Sally He, Baosheng Schilder, Frank Thomson Reuters Eagan MN 55123 USA Thomson Reuters New York NY 10036 USA Meta Platforms Inc Menlo Pk CA 94025 USA
We present the first comprehensive empirical evaluation of pre-trained language models (PLMs) for legal natural language processing (NLP) in order to examine their effectiveness in this domain. Our study covers eight ... (see the probing sketch below)
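The entry evaluates pre-trained language models on legal NLP tasks. As a minimal hedged illustration of probing a legal-domain PLM, the sketch below runs a masked-token query with Hugging Face transformers; the checkpoint name nlpaueb/legal-bert-base-uncased is an assumption (a publicly released legal-domain BERT) and is not necessarily among the models covered by this study.

```python
# Minimal probe of a legal-domain PLM via masked-token prediction.
# The model ID is an assumption (a publicly released legal-domain BERT),
# not necessarily one of the checkpoints evaluated in the cited study.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="nlpaueb/legal-bert-base-uncased")
for pred in fill_mask("The lessee shall [MASK] the premises in good condition."):
    print(f"{pred['token_str']:>12}  score={pred['score']:.3f}")
```
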
PerturbScore: Connecting Discrete and Continuous Perturbations in NLP
Conference on Empirical Methods in Natural Language Processing (EMNLP)
Authors: Li, Linyang Ren, Ke Shao, Yunfan Wang, Pengyu Qiu, Xipeng Fudan Univ Sch Comp Sci Shanghai Peoples R China Fudan Univ Shanghai Key Lab Intelligent Informat Proc Shanghai Peoples R China
With the rapid development of neural network applications in NLP, the model robustness problem is gaining more attention. Different from computer vision, the discrete nature of texts makes it more challenging to explore r... (see the toy sketch below)
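The entry is about relating discrete text perturbations to continuous ones. The toy sketch below measures how far a single word substitution moves a sentence in a continuous embedding space; it illustrates only the general idea and is not the PerturbScore method. The model name and example sentences are assumptions.

```python
# Toy illustration (not the PerturbScore method): measure how far a single
# discrete word substitution moves a sentence in a continuous embedding space.
# The model name and example sentences are assumptions for illustration.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

original = "the movie was a great experience"
perturbed = "the movie was a terrible experience"  # one-word discrete perturbation

emb_orig, emb_pert = model.encode([original, perturbed])
continuous_shift = np.linalg.norm(emb_orig - emb_pert)  # L2 distance in embedding space
print(f"embedding-space shift for a one-word edit: {continuous_shift:.4f}")
```
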
FastMem: Fast Memorization of Prompt Improves Context Awareness of Large Language Models
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Zhu, Junyi Liu, Shuochen Yu, Yu Tang, Bo Yan, Yibo Li, Zhiyu Xiong, Feiyu Xu, Tong Blaschko, Matthew B. ESAT-PSI KU Leuven Belgium University of Science and Technology of China China Institute for Advanced Algorithms Research Shanghai China National University of Singapore Singapore
Large language models (LLMs) excel in generating coherent text, but they often struggle with context awareness, leading to inaccuracies in tasks requiring faithful adherence to provided information. We introduce FastM...
Developing a Pragmatic Benchmark for Assessing Korean Legal Language Understanding in Large Language Models
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Kim, Yeeun Choi, Jinhwan Choi, Young Rok Park, Hai Jin Choi, Eunkyung Hwang, Wonseok University of Seoul Korea Republic of LBox Korea Republic of Hanyang University Korea Republic of
Large language models (LLMs) have demonstrated remarkable performance in the legal domain, with GPT-4 even passing the Uniform Bar Exam in the U.S. However, their efficacy remains limited for non-standardized tasks and...
LAMBDA: Large Language Model-Based Data Augmentation for Multi-Modal Machine Translation
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Wang, Yusong Li, Dongyuan Shen, Jialun Xu, Yicheng Xu, Mingkun Funakoshi, Kotaro Okumura, Manabu Tokyo Institute of Technology Tokyo Japan Guangdong Institute of Intelligence Science and Technology Hengqin Guangdong Zhuhai519031 China
Multi-modal machine translation (MMT) can reduce ambiguity and semantic distortion compared with traditional machine translation (MT) by utilizing auxiliary information such as images. However, current MMT methods fac...
Ask-before-Plan: Proactive Language Agents for Real-World Planning
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Zhang, Xuan Deng, Yang Ren, Zifeng Ng, See-Kiong Chua, Tat-Seng National University of Singapore Singapore Singapore Management University Singapore
The evolution of large language models (LLMs) has enhanced the planning capabilities of language agents in diverse real-world scenarios. Despite these advancements, the potential of LLM-powered agents to comprehend am... (see the toy sketch below)
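The entry concerns agents that clarify ambiguous requests before planning. The toy sketch below shows the bare idea with hand-written slot checking rather than an LLM; the slots, request fields, and wording are invented and do not reflect the paper's framework.

```python
# Toy sketch of the "ask before plan" idea: check a user request for missing
# details and return a clarifying question instead of a plan when needed.
# The slots, request, and wording are invented; this is not the paper's framework.
REQUIRED_SLOTS = ("destination", "dates", "budget")

def ask_or_plan(request: dict) -> str:
    missing = [slot for slot in REQUIRED_SLOTS if not request.get(slot)]
    if missing:
        return f"Before I plan this trip, could you tell me your {', '.join(missing)}?"
    return (f"Planning a trip to {request['destination']} "
            f"on {request['dates']} within {request['budget']}.")

print(ask_or_plan({"destination": "Kyoto"}))                # asks a clarifying question
print(ask_or_plan({"destination": "Kyoto",
                   "dates": "May 3-7", "budget": "$1500"}))  # proceeds to plan
```
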
Transformer Working Memory Enables Regular Language Reasoning and Natural Language Length Extrapolation
Conference on Empirical Methods in Natural Language Processing (EMNLP)
Authors: Chi, Ta-Chung Fan, Ting-Han Rudnicky, Alexander I. Ramadge, Peter J. Carnegie Mellon Univ Pittsburgh PA 15213 USA Princeton Univ Princeton NJ 08544 USA
Conventional wisdom has it that, unlike recurrent models, Transformers cannot perfectly model regular languages. Inspired by the notion of working memory, we propose a new Transformer variant named RegularGPT. With its... (see the parity-probe sketch below)
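The entry deals with regular-language reasoning and length extrapolation in Transformers. The sketch below only generates data for PARITY, a classic regular-language probe, with short training strings and longer evaluation strings; the task choice and lengths are illustrative assumptions, not the authors' protocol or the RegularGPT model.

```python
# Data-generation sketch for a regular-language probe (PARITY):
# label each bit string by whether it contains an odd number of 1s.
# Training on short strings and evaluating on longer ones is a common way to
# test length extrapolation; the lengths here are illustrative assumptions.
import random
from typing import Tuple

def parity_example(length: int) -> Tuple[str, int]:
    bits = [random.randint(0, 1) for _ in range(length)]
    return "".join(map(str, bits)), sum(bits) % 2

random.seed(0)
train = [parity_example(random.randint(2, 16)) for _ in range(1000)]              # short strings
test_extrapolate = [parity_example(random.randint(32, 64)) for _ in range(200)]   # longer, unseen lengths

print(train[0])             # a (bit_string, parity_label) pair
print(test_extrapolate[0])  # longer string, same regular-language rule
```
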
GAMA: A Large Audio-Language Model with Advanced Audio Understanding and Complex Reasoning Abilities
2024 Conference on Empirical Methods in Natural Language Processing, EMNLP 2024
Authors: Ghosh, Sreyan Kumar, Sonal Seth, Ashish Evuru, Chandra Kiran Reddy Tyagi, Utkarsh Sakshi, S. Nieto, Oriol Duraiswami, Ramani Manocha, Dinesh University of Maryland College Park United States Adobe United States
Perceiving and understanding non-speech sounds and non-verbal speech is essential to making decisions that help us interact with our surroundings. In this paper, we propose GAMA, a novel General-purpose Large AudioLan...