OVERVIEW
In everyday life, objects rarely appear in isolation; they typically occur alongside other objects in particular contexts. How do humans learn and use contextual information to guide attention, perception, and action? In the lab, we use psychophysical techniques to elucidate the effects of context on human behavior. Our findings demonstrate that contextual information can be rapidly derived from visual and social scenes and used to modulate visual memory retrieval, object recognition, attentional deployment, visual consciousness, and goal-directed hand movements.