Refine Search Results

Document Type

  • 14 journal articles
  • 11 conference papers

Collection Scope

  • 25 electronic documents
  • 0 print holdings

Date Distribution

Subject Classification

  • 22 Engineering
    • 21 Computer Science and Technology...
    • 8 Electrical Engineering
    • 3 Control Science and Engineering
    • 2 Information and Communication Engineering
    • 2 Software Engineering
    • 1 Instrument Science and Technology
  • 3 Medicine
    • 3 Clinical Medicine
    • 2 Basic Medicine (may confer Medicine...
  • 3 Management
    • 3 Management Science and Engineering (may...

Topics

  • 25 data-free knowle...
  • 5 data models
  • 4 knowledge distil...
  • 3 knowledge engine...
  • 3 catastrophic for...
  • 3 continual learni...
  • 3 model compressio...
  • 3 generators
  • 3 federated learni...
  • 3 training
  • 2 generative adver...
  • 2 contrastive lear...
  • 2 servers
  • 2 data heterogenei...
  • 2 federated learni...
  • 1 covid-19
  • 1 derived gap
  • 1 source-free doma...
  • 1 data generation
  • 1 fault diagnosis

Institutions

  • 2 sun yat sen univ...
  • 1 college of compu...
  • 1 college of cyber...
  • 1 sun yat sen univ...
  • 1 fudan univ eye &...
  • 1 indian institute...
  • 1 univ munich muni...
  • 1 hefei univ techn...
  • 1 astar inst infoc...
  • 1 zhejiang univ sh...
  • 1 astar inst infoc...
  • 1 natl inst techno...
  • 1 monash univ melb...
  • 1 jiangnan univ sc...
  • 1 zhejiang univ pe...
  • 1 southeast univ s...
  • 1 zhejiang univ bl...
  • 1 sungkyunkwan uni...
  • 1 univ melbourne m...
  • 1 zhejiang univ co...

Authors

  • 3 li jingru
  • 3 yu zhi
  • 2 li liangcheng
  • 2 zhou sheng
  • 2 wang haishuai
  • 2 bu jiajun
  • 1 wang shipeng
  • 1 chang qing
  • 1 yu xinlei
  • 1 jiang chenyang
  • 1 lin luojun
  • 1 zhao rui
  • 1 tang xuan
  • 1 yang jun
  • 1 zhang wenqiang
  • 1 liu yang
  • 1 xing lingkai
  • 1 bao yuru
  • 1 xie runshan
  • 1 tianyi zhou joey

Language

  • 23 English
  • 2 Other
Search query: Subject = "Data-Free Knowledge Distillation"
25 records; results 1-10 are shown below.
Data-free Knowledge Distillation based on GNN for Node Classification
29th International Conference on Database Systems for Advanced Applications (DASFAA)
Authors: Zeng, Xinfeng; Liu, Tao; Zeng, Ming; Wu, Qingqiang; Wang, Meihong (Xiamen Univ Sch Informat, Xiamen, Peoples R China; Xiamen Univ Sch Film, Xiamen, Peoples R China)
Data-free knowledge distillation (KD) circumvents the limitation of knowledge extraction from original training data by utilizing generated data. Data-free KD has made good progress in models for processing grid data. ...
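The abstract above describes the generic recipe these results share: synthesize inputs instead of using the original data, then distill the teacher's outputs on them. Below is a minimal PyTorch sketch of that generic recipe, not the GNN-specific method of the DASFAA paper; the architectures, generator, and hyperparameters are assumptions chosen only to keep the example self-contained and runnable.

import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative stand-ins: any teacher/student pair with matching output size works.
teacher = nn.Sequential(nn.Flatten(), nn.Linear(32 * 32, 256), nn.ReLU(), nn.Linear(256, 10)).eval()
for p in teacher.parameters():
    p.requires_grad_(False)
student = nn.Sequential(nn.Flatten(), nn.Linear(32 * 32, 64), nn.ReLU(), nn.Linear(64, 10))
generator = nn.Sequential(nn.Linear(100, 512), nn.ReLU(), nn.Linear(512, 32 * 32), nn.Tanh())

opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)
opt_s = torch.optim.Adam(student.parameters(), lr=1e-3)
T = 4.0  # temperature for softening the class probabilities

for step in range(200):
    # (1) Generator step: synthesize inputs on which the frozen teacher is
    # confident (low prediction entropy), a common proxy for in-distribution data.
    x = generator(torch.randn(64, 100)).view(-1, 1, 32, 32)
    p_t = F.softmax(teacher(x), dim=1)
    loss_g = (-p_t * torch.log(p_t + 1e-8)).sum(dim=1).mean()
    opt_g.zero_grad()
    loss_g.backward()
    opt_g.step()

    # (2) Student step: distill on a fresh synthetic batch by matching the
    # teacher's softened outputs with a temperature-scaled KL divergence.
    with torch.no_grad():
        x = generator(torch.randn(64, 100)).view(-1, 1, 32, 32)
        t_logits = teacher(x)
    s_logits = student(x)
    loss_s = F.kl_div(F.log_softmax(s_logits / T, dim=1),
                      F.softmax(t_logits / T, dim=1),
                      reduction="batchmean") * T * T
    opt_s.zero_grad()
    loss_s.backward()
    opt_s.step()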
Effective and efficient conditional contrast for data-free knowledge distillation with low memory
JOURNAL OF SUPERCOMPUTING, 2025, Vol. 81, No. 4, pp. 1-21
Authors: Jiang, Chenyang; Li, Zhendong; Yang, Jun; Wu, Yiqiang; Li, Shuai (Ningxia Univ Sch Informat Engn, 489 Helan Mt West Rd, Yinchuan 750021, Ningxia, Peoples R China; Key Lab Internet Water & Digital Water Governance, Yinchuan 750021, Ningxia, Peoples R China)
Data-free knowledge distillation has recently gained significant attention in the field of model compression, as it enables knowledge transfer from a trained teacher model to a smaller student model without requiring ...
Data-Free Knowledge Distillation via Contrastive Inversion and Cluster Alignment
2025 International Conference on Electrical Automation and Artificial Intelligence, ICEAAI 2025
Authors: Zheng, Zhaoming; Du, Zhibin (School of Artificial Intelligence, South China Normal University, Foshan, China)
Data-free knowledge distillation (DFKD) aims to transfer the knowledge from a large teacher network to a lightweight student network without accessing the original training data. However, existing DFKD methods face th...
Unpacking the Gap Box Against Data-Free Knowledge Distillation
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2024, Vol. 46, No. 9, pp. 6280-6291
Authors: Wang, Yang; Qian, Biao; Liu, Haipeng; Rui, Yong; Wang, Meng (Hefei Univ Technol Sch Comp Sci & Informat Engn, Hefei 230002, Anhui, Peoples R China; Lenovo Res, Beijing 100094, Peoples R China)
Data-free knowledge distillation (DFKD) improves the student model (S) by mimicking the class probability from a pre-trained teacher model (T) without training data. Under such a setting, an ideal scenario is that T can ...
FedAlign: Federated Model Alignment via Data-Free Knowledge Distillation for Machine Fault Diagnosis
IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2024, Vol. 73, p. 1
Authors: Sun, Wenjun; Yan, Ruqiang; Jin, Ruibing; Zhao, Rui; Chen, Zhenghua (Southeast Univ Sch Instrument Sci & Engn, Nanjing 210096, Jiangsu, Peoples R China; Xi An Jiao Tong Univ Sch Mech Engn, Xian 710049, Shaanxi, Peoples R China; ASTAR Inst Infocomm Res, Singapore 138632, Singapore; Pluang Tech, Singapore 049145, Singapore)
Due to privacy issues, the data-island problem is widespread in real-world machine fault diagnosis. Federated learning (FL) has received much attention as a decentralized machine-learning paradigm that learns a ...
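To make the role of data-free distillation in this federated setting concrete, here is a rough, generic sketch of server-side model alignment without raw data sharing. It is not the FedAlign algorithm itself; the toy diagnosis models, synthetic-signal shapes, and hyperparameters are assumptions for illustration only.

import torch
import torch.nn as nn
import torch.nn.functional as F

def make_model():
    # Toy fault-diagnosis classifier: a 1-D signal of length 256 -> 4 fault classes.
    return nn.Sequential(nn.Flatten(), nn.Linear(256, 64), nn.ReLU(), nn.Linear(64, 4))

# Client models are assumed to have been trained locally and uploaded; their
# private vibration data never leaves the clients.
clients = [make_model().eval() for _ in range(3)]
global_model = make_model()
opt = torch.optim.Adam(global_model.parameters(), lr=1e-3)

for _ in range(50):
    # Server-side, data-free alignment: synthetic signals stand in for real data.
    x_syn = torch.randn(32, 1, 256)
    with torch.no_grad():
        # The ensemble of client predictions serves as the distillation target.
        target = torch.stack([F.softmax(c(x_syn), dim=1) for c in clients]).mean(dim=0)
    loss = F.kl_div(F.log_softmax(global_model(x_syn), dim=1), target,
                    reduction="batchmean")
    opt.zero_grad()
    loss.backward()
    opt.step()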
Variational Data-Free Knowledge Distillation for Continual Learning
IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, Vol. 45, No. 10, pp. 12618-12634
Authors: Li, Xiaorong; Wang, Shipeng; Sun, Jian; Xu, Zongben (Xi An Jiao Tong Univ Sch Math & Stat, Xian 710049, Shaanxi, Peoples R China)
Deep neural networks suffer from catastrophic forgetting when trained on sequential tasks in continual learning. Various methods rely on storing data of previous tasks to mitigate catastrophic forgetting, which is pro...
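The shared idea behind such data-free continual-learning methods is to regularize the current model toward a frozen copy of its previous-task self instead of replaying stored data. The sketch below shows that idea in its plainest form, i.e. synthetic inputs plus a distillation penalty; it is a generic illustration rather than the variational method of this paper, and every module name, shape, and constant is an assumption.

import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU(), nn.Linear(128, 10))
old_model = copy.deepcopy(model).eval()   # frozen snapshot kept after the previous task
for p in old_model.parameters():
    p.requires_grad_(False)

opt = torch.optim.Adam(model.parameters(), lr=1e-3)
lambda_kd = 1.0  # weight of the anti-forgetting distillation term

def new_task_batch():
    # Placeholder for a real mini-batch of the *current* task.
    return torch.randn(64, 1, 28, 28), torch.randint(0, 10, (64,))

for step in range(100):
    x_new, y_new = new_task_batch()
    loss_new = F.cross_entropy(model(x_new), y_new)

    # Data-free replay: old-task behaviour is preserved by matching the frozen
    # model's outputs on synthetic inputs, because old data is no longer stored.
    x_syn = torch.randn(64, 1, 28, 28)
    loss_kd = F.kl_div(F.log_softmax(model(x_syn), dim=1),
                       F.softmax(old_model(x_syn), dim=1),
                       reduction="batchmean")

    loss = loss_new + lambda_kd * loss_kd
    opt.zero_grad()
    loss.backward()
    opt.step()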
Data-free knowledge distillation via generator-free data generation for Non-IID federated learning
NEURAL NETWORKS, 2024, Vol. 179, Article 106627
Authors: Zhao, Siran; Liao, Tianchi; Fu, Lele; Chen, Chuan; Bian, Jing; Zheng, Zibin (Sun Yat Sen Univ Sch Comp Sci & Engn, Guangzhou, Peoples R China; Sun Yat Sen Univ Sch Software Engn, Zhuhai, Peoples R China; Sun Yat Sen Univ Sch Syst Sci & Engn, Guangzhou, Peoples R China)
Data heterogeneity (Non-IID) in Federated Learning (FL) is a widely recognized problem that leads to local model drift and performance degradation. Because of the advantages of knowledge distillation, it ha...
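"Generator-free" data generation is commonly realized by optimizing the synthetic inputs themselves rather than training a separate generator network. The model-inversion-style sketch below shows that general pattern only; it is not claimed to be this paper's algorithm, and the teacher, input shapes, pseudo-labels, and loss weights are assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Flatten(), nn.Linear(32 * 32, 128), nn.ReLU(), nn.Linear(128, 10)).eval()
for p in teacher.parameters():
    p.requires_grad_(False)

# Optimize a batch of input tensors directly -- no generator network is involved.
x_syn = torch.randn(64, 1, 32, 32, requires_grad=True)
y_target = torch.randint(0, 10, (64,))   # desired pseudo-labels for the batch
opt = torch.optim.Adam([x_syn], lr=0.05)

for step in range(300):
    logits = teacher(x_syn)
    loss = F.cross_entropy(logits, y_target)   # push the teacher toward y_target
    loss = loss + 1e-4 * x_syn.pow(2).mean()   # mild prior keeping values bounded
    opt.zero_grad()
    loss.backward()
    opt.step()

# x_syn.detach() can then serve as a transfer set for distilling a student model.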
Data-free knowledge distillation in neural networks for regression
EXPERT SYSTEMS WITH APPLICATIONS, 2021, Vol. 175, Article 114813
Authors: Kang, Myeonginn; Kang, Seokho (Sungkyunkwan Univ Dept Ind Engn, 2066 Seobu Ro, Suwon 16419, South Korea)
Knowledge distillation has been used successfully to compress a large neural network (teacher) into a smaller neural network (student) by transferring the knowledge of the teacher network with its original training da...
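For regression, the class-probability matching used in the classification sketches above is replaced by matching the teacher's continuous outputs directly, typically with a mean-squared-error loss. The short illustration below is again generic, with assumed shapes and plain Gaussian noise as the synthetic input, and is not the paper's exact procedure.

import torch
import torch.nn as nn
import torch.nn.functional as F

teacher = nn.Sequential(nn.Linear(8, 64), nn.ReLU(), nn.Linear(64, 1)).eval()
for p in teacher.parameters():
    p.requires_grad_(False)
student = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
opt = torch.optim.Adam(student.parameters(), lr=1e-3)

for step in range(500):
    x = torch.randn(128, 8)                    # synthetic inputs replace the unavailable data
    with torch.no_grad():
        y_teacher = teacher(x)                 # continuous targets instead of class scores
    loss = F.mse_loss(student(x), y_teacher)   # regression distillation objective
    opt.zero_grad()
    loss.backward()
    opt.step()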
De-confounded Data-free Knowledge Distillation for Handling Distribution Shifts
IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Authors: Wang, Yuzheng; Yang, Dingkang; Chen, Zhaoyu; Liu, Yang; Liu, Siao; Zhang, Wenqiang; Zhang, Lihua; Qi, Lizhe (Fudan Univ Acad Engn & Technol, Shanghai Engn Res Ctr AI & Robot, Shanghai, Peoples R China; Fudan Univ Acad Engn & Technol, Engn Res Ctr AI & Robot, Minist Educ, Shanghai, Peoples R China; Green Ecol Smart Technol Sch Enterprise Joint Res, Shanghai, Peoples R China)
Data-free knowledge distillation (DFKD) is a promising way to train high-performance small models for practical deployment without relying on the original training data. Existing methods commonly avoid relying on...
Towards Effective Data-Free Knowledge Distillation via Diverse Diffusion Augmentation
32nd ACM International Conference on Multimedia, MM 2024
Authors: Li, Muquan; Zhang, Dongyang; He, Tao; Xie, Xiurui; Li, Yuan-Fang; Qin, Ke (University of Electronic Science and Technology of China, Chengdu, Sichuan, China; Monash University, Melbourne, Australia)
Data-free knowledge distillation (DFKD) has emerged as a pivotal technique in the domain of model compression, substantially reducing the dependency on the original training data. Nonetheless, conventional DFKD method...