ARTICLE

Volume 1, Issue 6

20 July 2025

Tensor Neural Network Models with an L2 Penalty and Their Applications

Keyu Xiang 1, Jingxiang Huang 1, Zhuoxi Yu 1, Congting Sun 2
1 School of Mathematics and Statistics, Liaoning University, China
2 College of Environment, Liaoning University, China
ASDS 2025, 1(5), 89–96; https://doi.org/10.61369/ASDS.2025050019
© 2025 by the Author. Licensee Art and Design, USA. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0) (https://creativecommons.org/licenses/by-nc/4.0/)
Abstract

Conventional convolutional neural networks consist of convolutional, pooling, flattening, and fully connected layers. To preserve the original linear structure while reducing overfitting and improving generalization, this paper adds an L2 penalty term to the training of tensor-train regression network layers, improving the model's generalization ability and stability. We apply this method to three case studies; the experimental results show that the tensor-train network with the penalty term achieves a lower mean squared error (MSE) on the test set than the one without it, improving the robustness of the model. Finally, we apply the model to predicting breast cancer from chest cancer CT scans; the model exhibits fast training, which indicates that the proposed method is effective.
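The core idea in the abstract, adding an L2 penalty on the weights to the MSE training objective, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the tensor-train regression layer is stood in for by a plain weight vector `W`, and the penalty strength `lam` is an illustrative hyperparameter.

```python
import numpy as np

def l2_penalized_mse(W, X, y, lam):
    """MSE loss plus an L2 penalty lam * ||W||^2 on the weights."""
    residual = X @ W - y
    mse = np.mean(residual ** 2)
    penalty = lam * np.sum(W ** 2)
    return mse + penalty

rng = np.random.default_rng(0)
X = rng.normal(size=(32, 4))
W = rng.normal(size=(4,))
y = X @ W  # noise-free targets, so the plain MSE at W is exactly zero

# With lam = 0 the objective reduces to the plain MSE; with lam > 0 the
# penalty discourages large weights, which is what curbs overfitting.
loss_plain = l2_penalized_mse(W, X, y, lam=0.0)
loss_pen = l2_penalized_mse(W, X, y, lam=0.1)
```

In practice the same effect is obtained in deep-learning frameworks via weight decay on the optimizer; the explicit loss term above just makes the objective visible.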

Keywords
Machine learning
Tensor neural networks
Tensor-train (TT) decomposition
Convolutional neural networks
Medical image processing
