On the Continuity of Rotation Representations in Neural Networks
- 1 June 2019
- conference paper
- Published by Institute of Electrical and Electronics Engineers (IEEE)
- pp. 5738-5746
- https://doi.org/10.1109/cvpr.2019.00589
Abstract
In neural networks, it is often desirable to work with various representations of the same space. For example, 3D rotations can be represented with quaternions or Euler angles. In this paper, we advance a definition of a continuous representation, which can be helpful for training deep neural networks. We relate this to topological concepts such as homeomorphism and embedding. We then investigate which representations of 2D, 3D, and n-dimensional rotations are continuous and which are discontinuous. We demonstrate that for 3D rotations, all representations are discontinuous in real Euclidean spaces of four or fewer dimensions. Thus, widely used representations such as quaternions and Euler angles are discontinuous and difficult for neural networks to learn. We show that the 3D rotations have continuous representations in 5D and 6D, which are more suitable for learning. We also present continuous representations for the general case of the n-dimensional rotation group SO(n). While our main focus is on rotations, we also show that our constructions apply to other groups such as the orthogonal group and similarity transforms. We finally present empirical results, which show that our continuous rotation representations outperform discontinuous ones for several practical problems in graphics and vision, including a simple autoencoder sanity test, a rotation estimator for 3D point clouds, and an inverse kinematics solver for 3D human poses.
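The 6D representation described in the abstract maps two 3D vectors to a rotation matrix by Gram–Schmidt orthonormalization. A minimal NumPy sketch of this construction (the function name `rotation_from_6d` is illustrative, not from the paper):

```python
import numpy as np

def rotation_from_6d(a):
    """Map a 6D vector to a 3D rotation matrix via Gram-Schmidt.

    The 6D input is read as two 3D vectors: the first is normalized,
    the second is orthonormalized against it, and the third column
    is completed with a cross product, giving det(R) = +1.
    """
    a1, a2 = a[:3], a[3:]
    b1 = a1 / np.linalg.norm(a1)
    b2 = a2 - np.dot(b1, a2) * b1          # remove component along b1
    b2 = b2 / np.linalg.norm(b2)
    b3 = np.cross(b1, b2)                  # completes a right-handed frame
    return np.stack([b1, b2, b3], axis=1)  # columns b1, b2, b3

# Any 6D vector whose two halves are not parallel yields a proper rotation.
R = rotation_from_6d(np.array([1.0, 2.0, 0.5, -1.0, 0.3, 2.0]))
```

Because the map is defined (and continuous) on a dense open subset of R^6 and is surjective onto SO(3), a network can regress the 6D vector freely without the discontinuities that quaternions or Euler angles introduce.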
This publication has 17 references indexed in Scilit:
- DeMoN: Depth and Motion Network for Learning Monocular Stereo, Published by Institute of Electrical and Electronics Engineers (IEEE), 2017
- PoseNet: A Convolutional Network for Real-Time 6-DOF Camera Relocalization, Published by Institute of Electrical and Electronics Engineers (IEEE), 2015
- Deep learning, Nature, 2015
- Uniform Sampling of Rotations for Discrete and Continuous Learning of 2D Shape Models, Published by IGI Global, 2013
- Constructive Approximation of Discontinuous Functions by Neural Networks, Neural Processing Letters, 2008
- Simultaneous Lp-approximation order for neural networks, Neural Networks, 2005
- The essential order of approximation for neural networks, Science China Information Sciences, 2004
- Practical Parameterization of Rotations Using the Exponential Map, Journal of Graphics Tools, 1998
- Universal approximation bounds for superpositions of a sigmoidal function, IEEE Transactions on Information Theory, 1993
- Approximation capabilities of multilayer feedforward networks, Neural Networks, 1991