Data Parallel Training Optimization Method for Deep Learning Models Based on Reservoir Computing

Authors: 黎祥远, 徐胜超, 吕峻闽
Affiliation: Experimental Teaching and Network Technology Management Center, Guangzhou Huashang College


    Abstract:

    Traditional training methods for deep learning models can no longer meet the training demands of class-diverse data, which leads to low training efficiency and poor convergence. To address this, a data parallel training optimization method for deep learning models based on reservoir computing is studied. The method first balances and partitions the training data. It then constructs a reservoir and uses the moth algorithm to find the optimal values of the inter-neuron connection density and the weight parameters of the connection matrix. Taking the balanced and partitioned training data as input, the constructed reservoir computation is used to optimize the deep learning model and carry out parallel training. The results show that the studied method yields a smaller model loss and its loss curve levels off earlier, indicating that the model is more stable and converges to a better solution faster. In addition, CPU utilization during training remains consistently high with small fluctuations, and the F1 score is significantly higher, indicating that the method uses CPU resources more effectively and has good generalization ability, which demonstrates the training effectiveness of the method.
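    To make the reservoir-construction step in the abstract concrete, the sketch below is a minimal illustration: it builds a sparse recurrent connection matrix with a given inter-neuron connection density and weight scale (the two quantities the abstract says are tuned with the moth algorithm), and substitutes a tiny grid search with a placeholder objective for that optimizer. All names, parameter values, and the objective are illustrative assumptions rather than the authors' implementation; the data balancing/partitioning and the parallel training loop described in the abstract are not shown.

```python
# Minimal sketch (assumption, not the paper's code): an echo-state-style reservoir
# whose connection matrix has a given inter-neuron connection density and weight scale.
import numpy as np

def build_reservoir(n_neurons, density, weight_scale, rng):
    """Sparse recurrent weight matrix with the given connection density,
    rescaled so its spectral radius equals weight_scale."""
    mask = rng.random((n_neurons, n_neurons)) < density        # keep roughly `density` of the edges
    W = rng.uniform(-1.0, 1.0, (n_neurons, n_neurons)) * mask  # random weights on the kept edges
    radius = np.max(np.abs(np.linalg.eigvals(W)))
    if radius > 0:
        W *= weight_scale / radius                             # enforce the target spectral radius
    return W

def run_reservoir(W, W_in, inputs, leak=0.3):
    """Drive the reservoir with an input sequence and collect its states."""
    state = np.zeros(W.shape[0])
    states = []
    for u in inputs:
        state = (1 - leak) * state + leak * np.tanh(W @ state + W_in @ u)
        states.append(state.copy())
    return np.asarray(states)

rng = np.random.default_rng(0)
n_neurons, n_inputs = 200, 8
W_in = rng.uniform(-0.5, 0.5, (n_neurons, n_inputs))
inputs = rng.standard_normal((100, n_inputs))                  # stand-in for one partition of training data

# Stand-in for the moth-algorithm search over (density, weight_scale): a tiny grid
# search with a placeholder objective; the paper would score candidates by the
# downstream training loss instead.
best = None
for density in (0.05, 0.1, 0.2):
    for weight_scale in (0.8, 0.9, 0.95):
        W = build_reservoir(n_neurons, density, weight_scale, rng)
        states = run_reservoir(W, W_in, inputs)
        score = float(np.mean(np.abs(states)))                 # placeholder: reward non-vanishing activity
        if best is None or score > best[0]:
            best = (score, density, weight_scale)
print("selected connection density and weight scale:", best[1], best[2])
```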

Cite this article

黎祥远, 徐胜超, 吕峻闽. Data Parallel Training Optimization Method for Deep Learning Models Based on Reservoir Computing [J]. Computer Measurement & Control, 2025, 33(7): 227-233.

History
  • Received: 2025-02-05
  • Revised: 2025-03-12
  • Accepted: 2025-03-13
  • Published online: 2025-07-16