Deep neural networks can now achieve human-like performance on tasks such as visual categorization, and they are increasingly viewed as viable computational models of brain function. In this talk I will present recent work from my lab comparing deep neural networks with both behavioral and neuroimaging experiments (fMRI and MEG) investigating object and scene perception. While deep neural networks show a correspondence with both neuroimaging and behavioral data, our results reveal a complex relationship among the three domains. Given these findings, a key question is how we can move beyond establishing mere correspondences between models and brain data toward generating truly novel insight into the sensory representations underlying adaptive behavior.
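
As background for how such model-brain comparisons are typically made: one common approach in this literature is representational similarity analysis (RSA), in which the pairwise dissimilarity structure of a model layer's activations is correlated with that of brain responses to the same stimuli. The sketch below is illustrative only and is not a description of the specific analyses in this talk; the stimulus counts, layer sizes, and names such as `model_layer` and `brain_region` are assumptions.

```python
# Minimal RSA sketch: compare the representational geometry of a model
# layer with that of a brain region, assuming both were measured for
# the same set of stimuli. All shapes and data here are placeholders.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

def rdm(responses):
    """Representational dissimilarity matrix (condensed upper triangle).

    responses: (n_stimuli, n_features) array, where features are
    network units or fMRI voxels / MEG sensors.
    """
    # Correlation distance (1 - Pearson r) for every stimulus pair.
    return pdist(responses, metric="correlation")

# Illustrative random data standing in for real measurements:
# responses of a model layer and a brain region to 92 object images.
rng = np.random.default_rng(0)
model_layer = rng.normal(size=(92, 4096))   # e.g. a fully connected layer
brain_region = rng.normal(size=(92, 500))   # e.g. voxels in visual cortex

# Correlate the two RDMs: a high Spearman rho means the model and the
# brain region treat the same stimulus pairs as similar or dissimilar.
rho, p = spearmanr(rdm(model_layer), rdm(brain_region))
print(f"model-brain RDM correlation: rho={rho:.3f}, p={p:.3g}")
```

A rank correlation (Spearman) is used here because RDM dissimilarities from models and from neuroimaging data live on different scales; only their ordinal structure is compared.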