Dynamic Unstructured Pruning Neural Network Image Super-resolution Reconstruction
Abstract
Many deep learning-based image super-resolution reconstruction algorithms improve the overall feature expression ability of a network by extending its depth. However, excessively deepening the network makes the model over-parameterized and complicated, and the redundant parameters increase the instability of feature expression. To address this issue, this paper builds on unstructured pruning by dynamically updating the weight parameters and adopting a balanced learning strategy, and proposes a neural network pruning algorithm suited to image super-resolution reconstruction tasks, called the dynamic unstructured pruning algorithm. Without changing the network structure or increasing the computational complexity, the algorithm improves the overall feature expression ability of the network by searching for an optimal yet sparse sub-network of the original network, which excludes the influence of redundant parameters and maximizes the ability to capture fine-grained, richer features with limited parameters. Experimental results on the Set5, Set14 and BSD100 test sets show that, compared with the original network model and the unstructured pruning algorithm, the reconstructed images obtained by the dynamic unstructured pruning algorithm achieve higher PSNR and SSIM, with richer detail features and clearer overall and local contours.
DOI: https://doi.org/10.31449/inf.v48i7.5332
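As an illustration of the idea described in the abstract, the following is a minimal sketch of dynamic unstructured (magnitude-based) pruning with periodically recomputed masks, assuming a PyTorch super-resolution model. The function names, the sparsity value and the update interval are illustrative assumptions, not the paper's exact criterion or schedule.

```python
# Sketch of dynamic unstructured pruning: magnitude-based masks that are
# rebuilt during training, so the sparse sub-network is re-searched as the
# weights evolve. Assumed setup, not the paper's exact method.
import torch
import torch.nn as nn


def update_masks(model: nn.Module, sparsity: float) -> dict:
    """Rebuild a binary mask per Conv2d layer, keeping the largest-magnitude
    weights. Because masks are recomputed as training proceeds, weights that
    were pruned earlier can be reactivated if their magnitude grows back,
    which is what makes the pruning 'dynamic'."""
    masks = {}
    for name, module in model.named_modules():
        if isinstance(module, nn.Conv2d):
            w_abs = module.weight.data.abs()
            n_prune = int(w_abs.numel() * sparsity)
            if n_prune == 0:
                masks[name] = torch.ones_like(w_abs)
                continue
            # Threshold = n_prune-th smallest magnitude; weights below it are cut.
            threshold = w_abs.flatten().kthvalue(n_prune).values
            masks[name] = (w_abs > threshold).float()
    return masks


def apply_masks(model: nn.Module, masks: dict) -> None:
    """Zero out pruned weights in place; the network topology is unchanged."""
    for name, module in model.named_modules():
        if name in masks:
            module.weight.data.mul_(masks[name])


# Hypothetical use inside a training loop: re-search the sparse sub-network
# every `interval` optimizer steps, and keep pruned weights at zero otherwise.
#
#   if step % interval == 0:
#       masks = update_masks(model, sparsity=0.5)
#   apply_masks(model, masks)
```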
License
Authors retain copyright in their work. By submitting to and publishing with Informatica, authors grant the publisher (Slovene Society Informatika) the non-exclusive right to publish, reproduce, and distribute the article and to identify itself as the original publisher.
All articles are published under the Creative Commons Attribution license CC BY 3.0. Under this license, others may share and adapt the work for any purpose, provided appropriate credit is given and changes (if any) are indicated.
Authors may deposit and share the submitted version, accepted manuscript, and published version, provided the original publication in Informatica is properly cited.