We have a remarkable ability to interpret incoming sensory stimuli and plan task-appropriate behavioral responses. This talk will present parallel experimental and computational approaches aimed at understanding the circuit mechanisms and computations underlying flexible perceptual and categorical decisions. In particular, our work examines how visual feature encoding in upstream sensory cortical areas is transformed across the cortical hierarchy into more flexible, task-related encoding in the parietal and prefrontal cortices. The experimental studies use multielectrode recordings to monitor the activity of neuronal populations, as well as reversible cortical inactivation, during the performance of visually based decision-making tasks. In parallel, our computational work employs machine learning approaches to train recurrent artificial neural networks on the same tasks used in the experimental studies, allowing a deeper investigation of the putative circuit mechanisms that both artificial and biological networks use to solve cognitively demanding behavioral tasks.
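
To give a concrete flavor of the RNN-training approach mentioned above, the following is a minimal, hypothetical sketch, not the models or tasks used in this work: it trains a vanilla recurrent network (in PyTorch) to categorize a noisy, time-varying synthetic stimulus into one of two arbitrary categories, with all task parameters and names invented for illustration.

```python
import torch
import torch.nn as nn

# Hypothetical task parameters (illustrative only): a scalar "stimulus
# direction" is presented as a noisy time series and must be assigned
# to one of two categories defined by a boundary at zero.
SEQ_LEN, BATCH, HIDDEN = 50, 64, 128

def make_batch():
    # Sample a direction in [-1, 1]; the category label is its sign.
    direction = torch.rand(BATCH, 1) * 2 - 1
    labels = (direction > 0).long().squeeze(1)
    # Present the direction as a noisy sequence of shape (SEQ_LEN, BATCH, 1).
    inputs = direction.expand(SEQ_LEN, BATCH, 1) + 0.5 * torch.randn(SEQ_LEN, BATCH, 1)
    return inputs, labels

class CategoryRNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.rnn = nn.RNN(input_size=1, hidden_size=HIDDEN, nonlinearity="tanh")
        self.readout = nn.Linear(HIDDEN, 2)  # two-alternative categorical choice

    def forward(self, x):
        states, _ = self.rnn(x)          # hidden states: (SEQ_LEN, BATCH, HIDDEN)
        return self.readout(states[-1])  # read out the decision at the final time step

model = CategoryRNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(500):
    inputs, labels = make_batch()
    loss = loss_fn(model(inputs), labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

After training, the network's hidden-state trajectories can be analyzed with the same population-level methods applied to recorded neural data, which is what makes this class of model useful for comparing putative circuit mechanisms in artificial and biological networks.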