Transformationally invariant processors constructed from transformed input vectors or operators have been proposed for many applications. In this study, we show that transformationally identical processing that combines the results of all sub-processes under corresponding transformations, whether at the final processing step or at the input step, yields equivalent results through a special algebraic operation property. This technique can be applied to most convolutional neural network (CNN) systems. Specifically, a transformationally identical CNN system can be constructed by running internally symmetric operations in parallel over the same transformation family, followed by a flatten layer that shares weights among corresponding transformation elements. Such a CNN produces the same output for any transformed input vector of the family. Interestingly, we found that this type of transformationally identical CNN system, which combines symmetric operations at the flatten layer, is mathematically equivalent in its forward propagation to an ordinary CNN that combines all transformed versions of the input vector at the input layer. However, their backpropagation processes take different routes because the two CNN structures differ greatly.
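The equivalence between combining transformed sub-process outputs at the end and combining transformed inputs at the start can be illustrated for the linear core of a CNN layer. The sketch below is not the paper's implementation; it is a minimal NumPy example assuming the four 90-degree rotations as the transformation family and a plain valid-mode cross-correlation standing in for a convolutional layer. By linearity, summing the correlation outputs over all rotated inputs equals correlating the sum of the rotated inputs, and that summed input is itself rotation-invariant, so any rotated input of the family yields the same output.

```python
import numpy as np

def conv2d(x, k):
    """Valid-mode 2-D cross-correlation (the linear core of a CNN layer)."""
    h, w = k.shape
    H, W = x.shape
    out = np.zeros((H - h + 1, W - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + h, j:j + w] * k)
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((6, 6))   # square input so rotations preserve shape
k = rng.standard_normal((3, 3))

# Transformation family: the four 90-degree rotations (cyclic group C4).
family = [lambda a, r=r: np.rot90(a, r) for r in range(4)]

# (a) Run a sub-process per transformation and combine the results at the end.
out_late = sum(conv2d(t(x), k) for t in family)
# (b) Combine the transformed input vectors first, then run one process.
out_early = conv2d(sum(t(x) for t in family), k)
assert np.allclose(out_late, out_early)  # equal by linearity

# The combined input is invariant under the family, so a rotated input
# produces the identical output.
y_rot = conv2d(sum(t(np.rot90(x)) for t in family), k)
assert np.allclose(out_early, y_rot)
```

Note that this only demonstrates the linear stage; the paper's claim concerns full CNN systems with internally symmetric operations, where the same combination property is argued to hold through the flatten layer.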