Sensorimotor integration, the transformation of a pattern of sensory input into a motor output, is a core neural operation carried out by all nervous systems. Analyses of the neural mechanisms of visually guided behavior in non-human primates have identified a set of transformations carried out by networks of neurons in the visual dorsal stream. A basic property of neurons within this pathway is that they display functional specializations for the goals of eye, arm, or hand movements. More recently, a similar architecture has been proposed for the organization of auditory-motor transformations for speech in humans. In this talk, I will examine transformations for looking, reaching, and speaking. I will first present data from single-unit and local field potential recordings acquired simultaneously in different regions of the posterior parietal cortex of monkeys performing coordinated look-and-reach movements. The results show that neurons participating in coherent patterns of neuronal activity play a privileged role in selecting and preparing coordinated movements of the eye and arm. I will then consider how auditory inputs are transformed into spoken utterances, a process for which a dorsal auditory stream has been proposed. I will present ECoG recordings from a cohort of epilepsy patients performing a set of tasks that allow us to identify the locus of auditory-motor transformations for speech. We find that sensorimotor transformations for speech reside in the dorsal stream but occur bilaterally, contradicting prevailing theories, which propose that these transformations are lateralized. Taken together, these results demonstrate a circuit-based approach to understanding neural computations and indicate that the neural mechanisms for speech in humans may be shared with those for other movements in non-human animals.