
Refine Search Results

Document Type

  • 26,206 conference papers
  • 406 books
  • 373 journal articles
  • 3 dissertations
  • 1 news item

Collection Scope

  • 26,989 electronic documents
  • 2 print holdings

Date Distribution

Subject Classification

  • 16,309 Engineering
    • 14,939 Computer Science and Technology...
    • 11,116 Software Engineering
    • 2,698 Information and Communication Engineering
    • 1,672 Control Science and Engineering
    • 1,472 Electrical Engineering
    • 1,408 Biological Engineering
    • 614 Biomedical Engineering (can confer...
    • 609 Chemical Engineering and Technology
    • 557 Mechanical Engineering
    • 475 Cyberspace Security
    • 430 Electronic Science and Technology (can...
    • 389 Instrument Science and Technology
    • 365 Optical Engineering
    • 276 Safety Science and Engineering
  • 5,060 Science
    • 2,060 Mathematics
    • 1,467 Biology
    • 1,465 Physics
    • 776 Statistics (can confer Science,...
    • 620 Systems Science
    • 617 Chemistry
  • 4,691 Management
    • 3,144 Library, Information and Archives Mana...
    • 1,819 Management Science and Engineering (can...
    • 760 Business Administration
  • 825 Medicine
    • 597 Clinical Medicine
    • 557 Basic Medicine (can confer Medicine...
    • 380 Public Health and Preventive Medi...
    • 250 Pharmacy (can confer Medicine, Sci...
  • 674 Law
    • 592 Sociology
  • 573 Literature
    • 401 Foreign Languages and Literatures
  • 440 Education
    • 429 Education
  • 249 Economics
  • 176 Agriculture
  • 19 Art
  • 17 Military Science
  • 1 Philosophy

Topics

  • 5,603 natural language...
  • 1,694 natural language...
  • 1,537 semantics
  • 1,441 natural language...
  • 1,181 machine learning
  • 1,162 sentiment analys...
  • 1,113 deep learning
  • 906 accuracy
  • 897 training
  • 802 data mining
  • 774 feature extracti...
  • 758 computational mo...
  • 753 artificial intel...
  • 544 knowledge engine...
  • 519 speech recogniti...
  • 494 large language m...
  • 487 transformers
  • 471 data models
  • 462 information retr...
  • 460 support vector m...

Institutions

  • 57 chitkara univers...
  • 51 lovely professio...
  • 49 school of comput...
  • 33 university of ch...
  • 31 school of comput...
  • 31 centre of interd...
  • 31 xi'an university...
  • 30 school of comput...
  • 28 tsinghua univers...
  • 28 school of comput...
  • 27 school of inform...
  • 26 zhejiang univers...
  • 26 microsoft resear...
  • 24 institute of inf...
  • 24 school of comput...
  • 23 school of inform...
  • 22 beijing univ pos...
  • 22 school of inform...
  • 22 institute of inf...
  • 21 college of compu...

Authors

  • 29 liu yang
  • 27 wan wanggen
  • 20 tiejun zhao
  • 18 zhou guodong
  • 18 wang lei
  • 16 zhang tao
  • 16 zhao hai
  • 16 yang yang
  • 15 sabetzadeh mehrd...
  • 15 lei li
  • 15 liu ying
  • 14 li sheng
  • 14 wu jin
  • 14 zhang lei
  • 14 liu jun
  • 13 li yang
  • 13 f. ren
  • 12 li peng
  • 12 wang jing
  • 12 chang yi

Language

  • 26,127 English
  • 763 Other
  • 149 Chinese
  • 5 Turkish
  • 2 Spanish
  • 1 French
  • 1 Portuguese
Search criteria: "Any field = International Conference on Natural Language Processing and Knowledge Engineering"
26,991 records; the following are results 11-20

Evaluation and Analysis of the Chinese Semantic Dependency Understanding Ability of Large Language Models
13th International Conference on Natural Language Processing and Chinese Computing
Authors: Shen, Zizhuo; Li, Wei; Shao, Yanqiu (Beijing Language & Culture Univ, Beijing, Peoples R China)
Semantic Dependency Graph is a framework for representing deep semantic knowledge through flexible graph structures. While recent works indicate that large language models (LLMs) have impressive language and knowledge...
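
The record above describes semantic dependency graphs, i.e. sentences encoded as labeled directed graphs whose nodes are words and whose edges carry semantic relations. A minimal sketch of that data structure is below; the example sentence, the relation labels, and the class name are illustrative assumptions, not the paper's annotation scheme.

```python
from dataclasses import dataclass, field

@dataclass
class SemanticDependencyGraph:
    """A sentence as a labeled directed graph: nodes are tokens,
    edges are (head index, dependent index, semantic relation label)."""
    tokens: list
    edges: list = field(default_factory=list)

    def add_edge(self, head: int, dep: int, label: str) -> None:
        self.edges.append((head, dep, label))

    def relations(self):
        # Yield human-readable triples such as ("eats", "cat", "Agt").
        for head, dep, label in self.edges:
            yield (self.tokens[head], self.tokens[dep], label)

# Toy example; the labels follow no particular annotation guideline.
g = SemanticDependencyGraph(tokens=["The", "cat", "eats", "fish"])
g.add_edge(2, 1, "Agt")  # "cat" is the agent of "eats"
g.add_edge(2, 3, "Pat")  # "fish" is the patient of "eats"
print(list(g.relations()))
```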

Evaluating Class Membership Relations in Knowledge Graphs Using Large Language Models
21st International Conference on The Semantic Web (ESWC)
Authors: Allen, Bradley P.; Groth, Paul T. (Univ Amsterdam, Amsterdam, Netherlands)
A backbone of knowledge graphs is their class membership relations, which assign entities to a given class. As part of the knowledge engineering process, we propose a new method for evaluating the quality of these re...
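
The record above is about checking whether the class a knowledge graph assigns to an entity is actually correct. One plausible zero-shot formulation is sketched below; the prompt wording, the toy membership pairs, and the ask_llm stub are assumptions for illustration, not the authors' method.

```python
# Class membership assertions as (entity, class) pairs; values are invented.
memberships = [
    ("Marie Curie", "Physicist"),
    ("Paris", "Person"),  # deliberately wrong, to show a negative case
]

PROMPT = (
    "Answer yes or no. Is the following class assignment correct?\n"
    "Entity: {entity}\nClass: {cls}\nAnswer:"
)

def ask_llm(prompt: str) -> str:
    """Placeholder for a call to some language model API; returns a canned
    answer here so the sketch runs without network access."""
    return "yes" if "Marie Curie" in prompt else "no"

for entity, cls in memberships:
    verdict = ask_llm(PROMPT.format(entity=entity, cls=cls))
    print(f"({entity}, rdf:type, {cls}) -> {verdict}")
```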

Research on large language model news recommendation technology based on mixed prompt
3rd International Conference on Artificial Intelligence and Intelligent Information Processing, AIIIP 2024
Authors: Wei, Han; Qu, Dan; Xu, Minchen; Peng, Sisi; Guo, Zhigang; Chen, Gang (Information Engineering University, Zhengzhou, Henan, China)
With massive amounts of news emerging every day and causing information overload, it is difficult for people to pick out the content they are really interested in from so many news articles. The language knowledge and stro...

Knowledge-Enhanced Utterance Domain Classification with Keywords-Assisted Concept Denoising Network
13th International Conference on Natural Language Processing and Chinese Computing
Authors: Huang, Peijie; Huang, Boxi; Xu, Yuhong; Chen, Weiting; Li, Jia (South China Agr Univ, Coll Math & Informat, Guangzhou, Peoples R China)
Utterance Domain Classification (UDC) is essential for Spoken Language Understanding (SLU), a task analogous to short text classification. Short texts are often challenging to understand due to their lack of context, ...
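
The record above frames utterance domain classification as short-text classification. The sketch below is only a generic word-overlap baseline for that framing, not the keywords-assisted concept denoising network named in the title; the domains and utterances are made up.

```python
from collections import Counter

# Toy training utterances per domain; purely illustrative.
TRAIN = {
    "weather": ["will it rain tomorrow", "what is the temperature today"],
    "music":   ["play some jazz", "skip this song"],
}

def overlap(utterance: str, examples: list) -> int:
    # Count word overlap between the utterance and a domain's examples.
    words = Counter(utterance.lower().split())
    vocab = Counter(w for ex in examples for w in ex.split())
    return sum(min(words[w], vocab[w]) for w in words)

def classify(utterance: str) -> str:
    return max(TRAIN, key=lambda d: overlap(utterance, TRAIN[d]))

print(classify("play a jazz song"))     # -> music
print(classify("is it going to rain"))  # -> weather
```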

RAVL: A Retrieval-Augmented Visual Language Model Framework for Knowledge-Based Visual Question Answering
13th International Conference on Natural Language Processing and Chinese Computing
Authors: Chai, Naiquan; Zou, Dongsheng; Liu, Jiyuan; Wang, Hao; Yang, Yuming; Song, Xinyi (Chongqing Univ, Sch Comp Sci, Chongqing, Peoples R China)
Knowledge-based visual question answering (VQA) requires external knowledge in addition to the image content to answer questions. Recent studies convert images to text descriptions and then generate answers or acquire...
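
The record above mentions the common knowledge-based VQA pattern of turning the image into text and pairing it with retrieved external knowledge before answering. A hedged outline of that generic pipeline with stub components follows; the function names, the tiny knowledge list, and the fixed caption are assumptions, not the RAVL implementation.

```python
# Toy external knowledge; in practice this would be a large corpus or graph.
KNOWLEDGE = [
    "The Eiffel Tower is located in Paris, France.",
    "Bananas are a good source of potassium.",
]

def caption_image(image_path: str) -> str:
    """Stub for an image captioning model; fixed output so the sketch runs."""
    return "a photo of the Eiffel Tower at sunset"

def retrieve(query: str, k: int = 1) -> list:
    # Naive lexical retrieval: rank snippets by word overlap with the query.
    q = set(query.lower().split())
    ranked = sorted(KNOWLEDGE,
                    key=lambda s: len(q & set(s.lower().split())),
                    reverse=True)
    return ranked[:k]

def build_vqa_prompt(question: str, image_path: str) -> str:
    caption = caption_image(image_path)
    facts = retrieve(caption + " " + question)
    # A real system would pass this prompt to a visual/text language model.
    return f"Context: {caption}. Facts: {' '.join(facts)} Question: {question}"

print(build_vqa_prompt("Which city is this landmark in?", "tower.jpg"))
```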

ParaFusion-Extended: Large Scale Paraphrase Dataset Integrating Lexico-Phrasal Knowledge
30th International Conference on Computational and Experimental Engineering and Sciences
Authors: Jayawardena, Lasal; Yapa, Prasan (Robert Gordon Univ, Garthdee House, Garthdee Rd, Aberdeen AB10 7AQ, Scotland; Inst Informat Technol, 57 Ramakrishna Rd, Colombo 00600, Sri Lanka)
Paraphrasing, the art of rephrasing text while retaining its original meaning, lies at the core of natural language understanding and generation. With the rise of demand for more domain-specialized models, high-qualit...

ProSide: Knowledge Projector and Sideway for Pre-trained Language Models
13th International Conference on Natural Language Processing and Chinese Computing
Authors: He, Chaofan; Lu, Gewei; Shen, Liping (Shanghai Jiao Tong Univ, AI Inst, Dept Comp Sci & Engn, X-LANCE Lab, MoE Key Lab Artificial Intelligence, Shanghai, Peoples R China)
Research indicates that incorporating external knowledge into pre-trained language models (PLMs) can enhance their performance on knowledge-driven downstream tasks. However, most approaches either require retraining t...
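
The record above is about adding external knowledge to a frozen pre-trained model without retraining it. The PyTorch sketch below shows one generic way to do that: a small trainable projector plus gate fused into frozen hidden states. The dimensions and the module design are illustrative assumptions, not the ProSide architecture.

```python
import torch
import torch.nn as nn

class KnowledgeSideway(nn.Module):
    """Small trainable side module that fuses knowledge vectors into the
    hidden states of a frozen backbone (generic sketch, not ProSide itself)."""
    def __init__(self, hidden_dim: int, knowledge_dim: int):
        super().__init__()
        self.project = nn.Linear(knowledge_dim, hidden_dim)  # knowledge projector
        self.gate = nn.Linear(hidden_dim * 2, 1)

    def forward(self, hidden: torch.Tensor, knowledge: torch.Tensor) -> torch.Tensor:
        # hidden: (batch, seq, hidden_dim); knowledge: (batch, seq, knowledge_dim)
        k = self.project(knowledge)
        g = torch.sigmoid(self.gate(torch.cat([hidden, k], dim=-1)))
        return hidden + g * k  # frozen-path output plus gated knowledge signal

# Shapes are arbitrary stand-ins for a frozen PLM and aligned knowledge embeddings.
side = KnowledgeSideway(hidden_dim=768, knowledge_dim=200)
h = torch.randn(2, 16, 768)
k = torch.randn(2, 16, 200)
print(side(h, k).shape)  # torch.Size([2, 16, 768])
```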

LaiDA: Linguistics-Aware In-Context Learning with Data Augmentation for Metaphor Components Identification
13th International Conference on Natural Language Processing and Chinese Computing
Authors: Liu, Hongde; He, Chenyuan; Meng, Feiyang; Niu, Changyong; Jia, Yuxiang (Zhengzhou Univ, Sch Comp & Artificial Intelligence, Zhengzhou, Peoples R China)
Metaphor Components Identification (MCI) contributes to enhancing machine understanding of metaphors, thereby advancing downstream natural language processing tasks. However, the complexity, diversity, and dependency ...

Sparse Mixture of Experts Language Models Excel in Knowledge Distillation
13th International Conference on Natural Language Processing and Chinese Computing
Authors: Xu, Haiyang; Liu, Haoxiang; Gong, Wei; Wang, Hai; Deng, Xianjun (Univ Sci & Technol China, Hefei 230026, Anhui, Peoples R China; Alibaba Grp, Hangzhou, Peoples R China; Huazhong Univ Sci & Technol, Sch Cyber Sci & Engn, Wuhan, Peoples R China; Southeast Univ, Sch Comp Sci & Engn, Nanjing, Peoples R China)
Knowledge distillation is an effective method for reducing the computational overhead of large language models. However, recent optimization efforts in distilling large language models have primarily focused on loss f...
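
Knowledge distillation, named in the record above, usually trains a compact student to match a larger teacher's temperature-softened output distribution while still fitting the hard labels. A standard, generic distillation loss is sketched below; the temperature, weighting, and random tensors are illustrative and say nothing about the paper's mixture-of-experts setup.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Soft KL term on temperature-softened distributions plus the usual
    cross-entropy on hard labels (classic knowledge distillation objective)."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Toy tensors: a batch of 4 examples over a 10-way label/vocabulary space.
student = torch.randn(4, 10, requires_grad=True)
teacher = torch.randn(4, 10)
labels = torch.randint(0, 10, (4,))
print(distillation_loss(student, teacher, labels))
```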

Deep Reasoning of Large Models Based on Knowledge Graph
2024 16th International Conference on Graphics and Image Processing, ICGIP 2024
Authors: Chenghao, Cao; Du, Zhenlong; Li, Xiaoli (College of Computer and Information Engineering, Nanjing Tech University, China)
Although large language models (LLMs) have achieved significant success in various tasks, they often struggle with hallucination issues in scenarios requiring deep reasoning. Incorporating external knowledge into LLM ...
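
The record above argues for grounding LLM reasoning in a knowledge graph to reduce hallucination. A minimal sketch of the usual pattern, retrieving triples about entities mentioned in the question and serializing them into the prompt, is below; the toy graph and question are invented, and the model call itself is left out.

```python
# Toy knowledge graph as (subject, relation, object) triples; values invented.
TRIPLES = [
    ("Ada Lovelace", "collaborated_with", "Charles Babbage"),
    ("Charles Babbage", "designed", "Analytical Engine"),
    ("Ada Lovelace", "born_in", "London"),
]

def retrieve_triples(question: str) -> list:
    # Keep triples whose subject or object is mentioned in the question.
    q = question.lower()
    return [t for t in TRIPLES if t[0].lower() in q or t[2].lower() in q]

def build_prompt(question: str) -> str:
    facts = retrieve_triples(question)
    lines = [f"{s} {r.replace('_', ' ')} {o}." for s, r, o in facts]
    return "Known facts:\n" + "\n".join(lines) + f"\nQuestion: {question}\nAnswer:"

# The prompt would normally be sent to an LLM; here we just print it.
print(build_prompt("Where was Ada Lovelace born, and with whom did she collaborate?"))
```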