Despite the successful application of deep learning to many problems involving jet substructure, typical approaches represent jets either as lists of four-vectors or as 2D images. This is mainly due to the compatibility of these structures with existing architectures, such as recurrent or convolutional networks. However, these networks fail to exhibit equivariance with respect to symmetries inherent to jet physics, such as rotations and boosts. Recent work in the field of representation learning has shown that equivariant (i.e., symmetry-respecting) architectures generally improve learning, allowing networks to perform better with fewer parameters. Using the example of boson tagging, we demonstrate the importance of equivariance, particularly with respect to boosts, for jet observables. We propose representing jets as functions on the 2-sphere, and construct learnable feature-matching kernels using spherical harmonics. We then demonstrate a network whose layers compute generalized convolution operations over the desired symmetry groups, automatically yielding equivariant representations throughout the network.
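The abstract does not specify the implementation details, but the first step it describes, representing a jet as a function on the 2-sphere and expanding it in spherical harmonics, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the function name, the energy weighting, and the choice of `l_max` are assumptions. Under a rotation, the resulting coefficients transform by Wigner D-matrices, which is what makes this representation a natural starting point for rotation-equivariant layers.

```python
import numpy as np
from scipy.special import sph_harm


def jet_to_sphere_coeffs(constituents, l_max=2):
    """Expand a jet's energy distribution on the 2-sphere in spherical
    harmonics Y_l^m up to degree l_max (illustrative sketch).

    constituents: iterable of (E, px, py, pz) four-vectors.
    Returns a dict mapping (l, m) -> complex coefficient.
    """
    constituents = np.asarray(constituents, dtype=float)
    E = constituents[:, 0]
    p = constituents[:, 1:]
    r = np.linalg.norm(p, axis=1)
    # Direction of each constituent as angles on the 2-sphere.
    theta = np.arccos(np.clip(p[:, 2] / r, -1.0, 1.0))  # polar angle
    phi = np.arctan2(p[:, 1], p[:, 0])                  # azimuthal angle

    coeffs = {}
    for l in range(l_max + 1):
        for m in range(-l, l + 1):
            # scipy's sph_harm signature is (m, l, azimuth, polar).
            Y = sph_harm(m, l, phi, theta)
            # Energy-weighted projection of the jet onto Y_l^m,
            # treating the jet as a sum of delta functions on the sphere.
            coeffs[(l, m)] = np.sum(E * np.conj(Y))
    return coeffs


# Toy jet: two equal-energy massless constituents along +z and -z.
jet = [(1.0, 0.0, 0.0, 1.0),
       (1.0, 0.0, 0.0, -1.0)]
c = jet_to_sphere_coeffs(jet, l_max=1)
```

For this symmetric toy jet, the monopole coefficient `c[(0, 0)]` equals the total energy times Y_0^0, while the dipole coefficients cancel. A learnable kernel in this picture would be a set of trainable weights on the same (l, m) basis, so that layer outputs inherit the harmonics' transformation behavior.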