Abstract: To address the problems of redundant feature propagation and inefficient feature deconstruction in traditional capsule networks, this paper proposes an attention-based capsule network with shared parameters. The merits of the network lie mainly in two aspects: 1) a dynamic routing method based on an attention mechanism is proposed. This method computes the correlations among low-level capsules to preserve the spatial information of features and pays more attention to feature information with high correlation, thereby completing forward propagation; 2) a shared transformation matrix is introduced in the dynamic routing layer. High-level capsules are activated according to the voting consistency of the low-level capsules, and the transformation matrix with shared parameters reduces the number of model parameters and improves the robustness of the capsule network. Comparative classification experiments on five public datasets show that the proposed capsule network achieves the best classification results, with error rates of 5.17%, 3.67%, and 9.35% on the Fashion-MNIST, SVHN, and CIFAR10 datasets, respectively, and exhibits significant robustness against white-box adversarial attacks. In addition, transformation experiments on the smallNORB and affNIST public datasets show that the proposed capsule network is clearly robust to input transformations. Finally, experiments on computational efficiency show that the proposed capsule network with shared parameters reduces the parameters of a traditional capsule network by 4.9% without adding floating-point operations, giving it a clear advantage in computation.
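
To make the two mechanisms named above concrete, the following is a minimal sketch, not the authors' implementation: it combines attention-weighted routing (capsule votes re-weighted by the correlations among low-level capsules) with a single transformation matrix shared by all low-level capsules. All class, parameter, and tensor names here are hypothetical illustrations under these assumptions.

```python
import torch
import torch.nn as nn

class SharedAttentionRouting(nn.Module):
    """Illustrative capsule routing layer (hypothetical, PyTorch)."""
    def __init__(self, in_dim, out_caps, out_dim):
        super().__init__()
        # Shared transformation matrix: one (in_dim -> out_caps*out_dim) map
        # reused by every low-level capsule, instead of one matrix per capsule.
        self.W = nn.Parameter(0.01 * torch.randn(in_dim, out_caps * out_dim))
        self.out_caps, self.out_dim = out_caps, out_dim

    def forward(self, u):                       # u: (batch, in_caps, in_dim)
        b, n, d = u.shape
        # Votes of every low-level capsule for every high-level capsule.
        u_hat = (u @ self.W).view(b, n, self.out_caps, self.out_dim)
        # Attention: scaled dot-product correlation between low-level capsules,
        # so features that agree with many others receive more weight.
        attn = torch.softmax(u @ u.transpose(1, 2) / d ** 0.5, dim=-1)
        u_hat = torch.einsum('bij,bjkd->bikd', attn, u_hat)
        # Aggregate the re-weighted votes; squash keeps capsule lengths < 1,
        # activating high-level capsules whose incoming votes are consistent.
        s = u_hat.mean(dim=1)                   # (batch, out_caps, out_dim)
        norm = s.norm(dim=-1, keepdim=True)
        return (norm ** 2 / (1 + norm ** 2)) * s / (norm + 1e-8)
```

Under this sketch, the shared matrix holds in_dim * out_caps * out_dim parameters, whereas a conventional per-capsule transformation would hold in_caps times as many; sharing of this kind is the mechanism behind the parameter reduction the abstract reports.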