
Refine Search Results

Document Type

  • 2 journal articles
  • 2 conference papers

Holdings

  • 4 electronic documents
  • 0 print holdings

Date Distribution

Subject Classification

  • 4 Engineering
    • 4 computer science and technology...
    • 2 software engineering
  • 1 Literature
    • 1 Chinese language and literature
    • 1 foreign languages and literature

Topic

  • 3 篇 computational li...
  • 1 篇 machine learning

Institution

  • 2 篇 seoul natl univ ...
  • 2 篇 samsung sds
  • 2 篇 data science & a...
  • 2 篇 asri inmc aiis s...
  • 1 篇 deptment of ece ...
  • 1 篇 seoul natl univ ...
  • 1 篇 seoul natl univ ...
  • 1 篇 deptment of ece ...
  • 1 篇 seoul natl univ ...
  • 1 篇 seoul natl univ ...
  • 1 篇 samsung sds kore...

Author

  • 4 篇 song jongyoon
  • 4 篇 yoon sungroh
  • 3 篇 joe seongho
  • 3 篇 hwang bongkyu
  • 3 篇 park nohil
  • 3 篇 gwon youngjune l...
  • 2 篇 yung jaewoong
  • 1 篇 yun jaewoong
  • 1 篇 yu sangwon

Language

  • 3 English
  • 1 other

Search condition: Institution = "Deptment of ECE and Interdisciplinary Program in AI"
4 records, showing 1-10
Model Intrinsic Features of Fine-tuning based Text Summarization Models for Factual Consistency
61st Annual Meeting of the Association-for-Computational-Linguistics (ACL)
Authors: Song, Jongyoon Park, Nohil Hwang, Bongkyu Yung, Jaewoong Joe, Seongho Gwon, Youngjune L. Yoon, Sungroh Seoul Natl Univ Data Sci & AI Lab Seoul South Korea Samsung SDS Seoul South Korea Seoul Natl Univ Deptment ECE Seoul South Korea Seoul Natl Univ Interdisciplinary Program AI Seoul South Korea
In this study, we analyze the model intrinsic features of a summarization model by varying the fine-tuning objectives and datasets. We fine-tune BART models combining three fine-tuning objectives (negative log-likelih...
Entity-level Factual Adaptiveness of Fine-tuning based Abstractive Summarization Models
18th Conference of the European-Chapter of the Association-for-Computational-Linguistics (EACL)
Authors: Song, Jongyoon Park, Nohil Hwang, Bongkyu Yung, Jaewoong Joe, Seongho Gwon, Youngjune L. Yoon, Sungroh Seoul Natl Univ Data Sci & AI Lab Seoul South Korea Samsung SDS Seoul South Korea Seoul Natl Univ Deptment ECE & Interdisciplinary Program Ai Seoul South Korea Seoul Natl Univ ASRI INMC & AIIS Seoul South Korea
Abstractive summarization models often generate factually inconsistent content, particularly when the parametric knowledge of the model conflicts with the knowledge in the input document. In this paper, we analyze ...
Large Language Models are Skeptics: False Negative Problem of Input-conflicting Hallucination
arXiv 2024
Authors: Song, Jongyoon Yu, Sangwon Yoon, Sungroh Data Science & AI Laboratory Seoul National University Korea Republic of Deptment of ECE Interdisciplinary Program in AI Seoul National University Korea Republic of ASRI INMC AIIS Seoul National University Korea Republic of
In this paper, we identify a new category of bias that induces input-conflicting hallucinations, where large language models (LLMs) generate responses inconsistent with the content of the input context. This issue we ...
Entity-level Factual Adaptiveness of Fine-tuning based Abstractive Summarization Models
arXiv 2024
Authors: Song, Jongyoon Park, Nohil Hwang, Bongkyu Yun, Jaewoong Joe, Seongho Gwon, Youngjune L. Yoon, Sungroh Data Science & AI Laboratory Seoul National University Korea Republic of Samsung SDS Korea Republic of Deptment of ECE and Interdisciplinary Program in AI Seoul National University Korea Republic of ASRI INMC AIIS Seoul National University Korea Republic of
Abstractive summarization models often generate factually inconsistent content, particularly when the parametric knowledge of the model conflicts with the knowledge in the input document. In this paper, we analyze the ...