Motion relations in the world are highly structured: the velocities of visible objects are often nested within other moving reference frames. Does our visual system exploit these hierarchical relations when perceiving dynamic scenes? We studied the use of hierarchical priors in two tasks our visual system faces all the time: tracking multiple objects and predicting their future trajectories. Knowing the motion relations among objects provides critical information for mastering both tasks. We designed a new class of stochastic stimuli that let us distinguish participants' use of structured priors from stimulus-intrinsic effects. Via Bayesian model comparison, we further identified features of the employed motion priors from human error patterns. If you are an aficionado/a of probabilistic psychophysics, or you simply enjoy watching wildly rotating dots, check out our paper published in Proceedings of the National Academy of Sciences.
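To make the idea of hierarchically nested velocities concrete, here is a minimal sketch (not the authors' actual stimulus code; all names and noise parameters such as `sigma_group` and `sigma_obj` are illustrative assumptions): each object's observed velocity is the sum of a shared, stochastically drifting group velocity and its own individual random walk.

```python
import numpy as np

# Illustrative toy model of hierarchically nested stochastic motion;
# parameter names and values are assumptions, not the paper's stimuli.
rng = np.random.default_rng(0)

n_objects, n_steps, dt = 3, 200, 0.05
sigma_group, sigma_obj = 1.0, 0.5  # noise scales (assumed)

# Shared (group) velocity: a 2-D random walk that all objects inherit.
v_group = np.cumsum(
    rng.normal(0.0, sigma_group * np.sqrt(dt), (n_steps, 2)), axis=0
)

# Individual velocities: each object's own random walk on top of the group's.
v_ind = np.cumsum(
    rng.normal(0.0, sigma_obj * np.sqrt(dt), (n_steps, n_objects, 2)), axis=0
)

# Observed velocity = shared group component + individual component.
v_obs = v_group[:, None, :] + v_ind

# Integrate velocities over time to obtain positions.
pos = np.cumsum(v_obs * dt, axis=0)

print(pos.shape)  # (n_steps, n_objects, 2): time x objects x (x, y)
```

Because the group component is shared, the objects' trajectories are correlated; an observer with a hierarchical motion prior could exploit exactly this structure when tracking or extrapolating.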