Characterizing the computations performed by high-level sensory regions of the brain remains difficult because many nonlinear signal transformations separate the input sensory stimuli from the neural responses. To produce interpretable models of these computations, dimensionality reduction techniques can be employed to describe the neural computation in terms of a relevant, multicomponent subspace of the stimulus space. While a number of such techniques have been devised, many rely on computing second-order moments of the stimulus/response distribution, leading to models with many more parameters than are ultimately necessary to capture the relevant subspace. For high-level sensory neurons in particular, these models are prone to overfitting due to the low effective sampling of the stimulus space by natural stimuli. To address this, we reformulated a maximum entropy method as a low-rank matrix factorization problem. With the principled application of regularization, the low-rank method yielded better prediction accuracy and more accurate estimation of the relevant subspace than prior methods. We applied the low-rank method to neurons from high-level regions of the songbird brain, yielding multiple relevant components spanning each neuron's receptive field. The relevant components were then transformed using logical OR and logical AND operations, highlighting potential differences in how brain regions and sensory systems process sensory information.
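To make the reformulation concrete, the sketch below illustrates one common way a second-order maximum-entropy (MNE-style) model can be written with its quadratic term factorized into a small number of components, so that the parameter count scales with the rank K rather than with the square of the stimulus dimension D. This is a minimal illustration under assumed conventions, not the authors' implementation; all function and variable names (predict_spike_prob, penalized_neg_log_lik, lam, K) are hypothetical.

```python
import numpy as np

# Assumed second-order form: P(spike | s) = 1 / (1 + exp(a + h.s + s^T J s)),
# with the quadratic kernel factorized as J ~ sum_k w_k u_k u_k^T.
# Parameters then number 1 + D + K*D + K instead of 1 + D + D*(D+1)/2.

def predict_spike_prob(s, a, h, U, w):
    """s: (D,) stimulus; a: bias; h: (D,) linear filter;
    U: (D, K) low-rank components; w: (K,) signed component weights."""
    quad = ((s @ U) ** 2) @ w            # equals s^T (U diag(w) U^T) s
    return 1.0 / (1.0 + np.exp(a + h @ s + quad))

def penalized_neg_log_lik(a, h, U, w, S, y, lam):
    """Bernoulli negative log-likelihood over stimuli S (N, D) and spike
    labels y (N,), plus an L2 penalty on the components (one simple choice
    of 'principled regularization'; the paper's exact penalty may differ)."""
    p = np.array([predict_spike_prob(s, a, h, U, w) for s in S])
    eps = 1e-12
    nll = -np.mean(y * np.log(p + eps) + (1.0 - y) * np.log(1.0 - p + eps))
    return nll + lam * np.sum(U ** 2)
```

Minimizing such an objective with respect to (a, h, U, w) for a chosen rank K and penalty weight lam is the sense in which the second-order model becomes a regularized low-rank factorization problem.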