The study investigated how the primate cerebral cortex controls natural facial gestures, combining functional magnetic resonance imaging with recordings of individual neurons.[1][3] Both lateral and medial "face motor" areas were found to participate in all gesture types and to encode both voluntary and emotional expressions, challenging the classical model of two separate control systems.[1][3] Each of these areas contained both broadly tuned neurons and neurons highly selective for a particular gesture.[1]

Population-level analysis showed that the gesture category could be decoded from neural activity not only during the movement itself but also well before its onset.[1] Activity, particularly in the motor and somatosensory cortices, predicted the time course of facial muscle movement during gesture production.[1] The researchers also found a hierarchy of temporal coding strategies: the lateral motor and somatosensory cortices used highly dynamic, rapidly changing codes; the medial cingulate cortex used a stable code that persisted before and during movement; and the premotor cortex showed intermediate stability.[1]

Overall, the study shows that the production of facial gestures is supported by a distributed cortical network, organized hierarchically by its temporal dynamics, that combines dynamic and stable codes in the control of facial expressions.[1][3]
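The distinction between dynamic and stable population codes is commonly assessed with cross-temporal decoding: a classifier trained on activity at one time point is tested at another, and a stable code generalizes across time while a dynamic one decodes well only near the training time. The sketch below illustrates this logic on synthetic data with a nearest-centroid decoder; it is a minimal illustration under assumed toy parameters (trial counts, noise level, three hypothetical gesture categories), not the authors' actual analysis pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_neurons, n_times, n_cats = 60, 40, 5, 3
labels = rng.integers(0, n_cats, n_trials)  # hypothetical gesture categories

# Stable code: each category keeps one fixed population pattern at all times.
stable_pat = rng.normal(size=(n_cats, n_neurons))
stable = stable_pat[labels][:, :, None] + 0.3 * rng.normal(size=(n_trials, n_neurons, n_times))

# Dynamic code: the category patterns are redrawn at every time point.
dyn_pat = rng.normal(size=(n_cats, n_neurons, n_times))
dynamic = dyn_pat[labels] + 0.3 * rng.normal(size=(n_trials, n_neurons, n_times))

def cross_temporal_accuracy(data, labels, t_train, t_test):
    """Train a nearest-centroid decoder at t_train, evaluate at t_test."""
    train, test = data[:, :, t_train], data[:, :, t_test]
    centroids = np.stack([train[labels == c].mean(axis=0) for c in range(n_cats)])
    dists = ((test[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return (dists.argmin(axis=1) == labels).mean()

# Same trials for train/test keeps the toy example short; a real analysis
# would cross-validate across trials.
print(cross_temporal_accuracy(stable, labels, 0, 4))   # stable code: generalizes across time
print(cross_temporal_accuracy(dynamic, labels, 0, 4))  # dynamic code: near chance across time
print(cross_temporal_accuracy(dynamic, labels, 4, 4))  # dynamic code: decodes at matched times
```

On this toy data, the stable code yields high accuracy even when training and testing times are far apart, whereas the dynamic code decodes only when training and testing times match, mirroring the cingulate-versus-lateral contrast described above.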