On an intuitive level, our emotions seem to surface automatically, like reflexes waiting to be triggered by the right external stimuli. If emotions are reliably evoked in this manner, each one may be mappable to a specific set of brain areas, in the same way that the sharp pain elicited by a pinprick can predictably be traced to activity in local nerve terminals. This line of thought underpinned the Classical Theory of Emotion, which dominated the field of emotion research during the 20th century.
Framing emotions as primitive psychological outputs that simply ‘rush over us’, it proposed that each one might possess a fixed biological correlate, or neural fingerprint: a stereotypic, discriminable pattern of brain activity (involving specific brain regions) underpinning each emotion and separating it from the others. By extension, emotions would also be expected to elicit specific, measurable alterations in behaviour and physiology1.
Can different emotions be mapped to specific brain regions?
A 2010 meta-analysis by Vytal & Hamann, encompassing thirty PET and fMRI whole-brain studies, concluded that happiness, anger, disgust, sadness, and fear exhibit distinct activation patterns within various brain regions. Notably, fear correlates with activation in the amygdala and insula, distinguishing it from other emotions, while anger is characterised by activity in both the parahippocampal gyrus and inferior frontal gyrus2. Recent studies have corroborated these correlations between specific emotions and neural activity patterns3.
However, it's important to recognise that these observed ‘fingerprints’ serve as averages, not proof that each emotion is reliably generated by the same brain areas. These averages aid in the differentiation of emotional categories but aren’t infallible predictors1.
Similarly, studies examining neural activation at the voxel level (e.g., Saarimäki et al.4) reveal recognisable patterns of brain voxel activation corresponding to different emotions, but these patterns do not represent fixed circuitry. Each one is merely a statistical average across many occurrences, with individual episodes of emotion unlikely to engage all the brain voxels associated with them, and potentially involving none of them5.
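To see why an average pattern need not describe any individual instance, consider the toy simulation below (invented data, not neuroimaging results; the region labels are arbitrary): each simulated episode of an emotion recruits a different random subset of candidate regions, yet the average across episodes implicates all of them.

```python
# Toy illustration (invented data, not neuroimaging results): each simulated
# episode of an emotion recruits a different random subset of candidate regions,
# yet averaging across episodes implicates every region, producing a
# "fingerprint" that no individual episode actually matches.
import random

regions = ["amygdala", "insula", "ACC", "PAG", "OFC"]
episodes = [set(random.sample(regions, k=random.randint(1, 3))) for _ in range(100)]

# Average "activation" per region across all episodes
mean_activation = {r: sum(r in ep for ep in episodes) / len(episodes) for r in regions}
print("average pattern:", mean_activation)

# How many individual episodes actually engage every region in the average pattern?
full_matches = sum(ep == set(regions) for ep in episodes)
print(f"episodes matching the full average pattern: {full_matches} / {len(episodes)}")
```

Every region shows up in the average, yet no single episode engages them all, which is exactly the gap between a statistical summary and a neural fingerprint.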
The theory of constructed emotion
Considering these findings, neuroscientist Lisa Feldman Barrett proposed the Theory of Constructed Emotion (ToCE) in 2017, arguing that the brain works adaptively to construct emotional instances that cannot be justifiably grouped together based on the neural circuitry, facial expressions, or behaviour that they involve1.
The amygdalae constitute a prime example of this flexibility in action. These small, almond-shaped nuclei in the temporal lobes have long been implicated in fear production. Fear does tend to align with increased activity of amygdala neurons in mammals, as neuroimaging studies demonstrate. However, humans with bilateral amygdala lesions, which render both amygdalae non-functional, are still capable of dreading hair-raising outcomes in the real world and panicking when threatened.
In their 2014 review, Guillory and Bujarski concluded that emotions can be elicited through the electrical stimulation of a variety of brain regions. For instance, contrary to previous assumptions, joy can be induced independently through stimulation of the hippocampus, frontal gyrus, or supplementary motor area7.
Emotions can be generated flexibly, thanks to the brain’s design
The brain's ability to generate emotions flexibly underscores a concept known as degeneracy—a system property wherein various components can yield the same outcome. This trait enhances the adaptability of biological systems, a feature prominently observed throughout the brain.
Within individual neurons, numerous excitable sites (dendrites) act as independent units, each with its own responsiveness. Together, these units can sum excitatory inputs in various ways to trigger action potentials. And within a given brain region, neurons can utilise different neurotransmitters and signalling cascades to achieve similar activity patterns8.
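To make degeneracy concrete, here is a minimal Python sketch (purely illustrative: the synaptic 'weights' and firing threshold are invented numbers, not measurements) that counts how many different combinations of inputs push the same simple threshold unit past its firing threshold.

```python
# Toy illustration of degeneracy (not a biophysical model): many different
# combinations of synaptic inputs can push a threshold unit past the same
# firing threshold, so the same outcome (a "spike") arises from different parts.
from itertools import combinations

THRESHOLD = 1.0  # arbitrary firing threshold

# Hypothetical synaptic weights for five independent excitatory inputs
inputs = {"a": 0.4, "b": 0.35, "c": 0.3, "d": 0.45, "e": 0.25}

def fires(active):
    """Return True if the summed input of the active synapses crosses threshold."""
    return sum(inputs[name] for name in active) >= THRESHOLD

# Enumerate every distinct subset of inputs that produces the same outcome
spiking_combinations = [
    combo
    for size in range(1, len(inputs) + 1)
    for combo in combinations(inputs, size)
    if fires(combo)
]

print(f"{len(spiking_combinations)} different input combinations all trigger a spike")
```

Many distinct subsets produce the same spike; knowing that the unit fired tells you little about which inputs were responsible, just as knowing that fear occurred tells you little about which exact neural ensemble produced it.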
Given the evolutionary significance of emotions, their lack of fixed brain circuitry isn't surprising. Instead, emotions emerge through diverse combinations of brain areas, showcasing the brain's remarkable capacity for adaptive response.
Emotions also lack predictable physiological and behavioural effects
Studies indicate that facial expressions do not universally represent specific emotions, either. Emotion expression is known to vary across cultures; East Asians, for instance, rely primarily on the eye region, whereas Western Caucasians rely more on the mouth9. Naturally, this makes us worse at inferring other people’s emotions than we often think we are. This is true even within the same culture; for example, research on the Trobriander community of Papua New Guinea found low accuracy in emotion recognition from images, even with multiple-choice cues10.
This variability is also seen in the autonomic nervous system's responses to emotions, which appear neither consistent nor specific. People of the same cultural background are more likely to respond to the same situations with similar changes in parameters such as skin conductance, heart rate, and blood pressure1. But a meta-analysis by Quigley & Barrett revealed significant variation in the physiological changes that any single emotion can induce, suggesting that bodily responses aren't reliable indicators of specific emotions either.
Your brain’s overarching goal: to meet your body’s physiological needs
Ensuring that you stay alive is a balancing act for your brain. It needs to be aware of the physiological status of your entire body at all times and learn to anticipate what it will require before you dip into a detrimental negative energy balance.
Homeostasis refers to the tendency of all biological systems to self-regulate by defending various ‘set points’; your brain generally resists marked changes to your blood glucose levels, fat percentage, and energy expenditure, modulating processes like sleep and appetite as it deems necessary. However, in 1988, biologists Sterling and Eyer proposed a more nuanced model of bodily regulation called allostasis.
Unlike homeostasis, which portrays the brain as reactive—intervening only after a need arises—allostasis operates proactively. Drawing on past experiences, the brain pre-emptively initiates adjustments in the body’s status to meet its needs. This active process often entails departing from traditional set points, such as elevating blood pressure in anticipation of stress.
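The difference can be caricatured in a few lines of code. The toy sketch below (entirely illustrative: the set point, units, and 'demand schedule' are invented, and real physiology is vastly more complex) compares a controller that only reacts once a deviation has occurred with one that adjusts in advance using a prediction learned from past experience.

```python
# Toy contrast between reactive (homeostasis-like) and anticipatory
# (allostasis-like) regulation of a single made-up variable.

SET_POINT = 100.0
demands = [0, 0, 20, 20, 20, 0, 0]    # an anticipated stressor hits at step 2
predicted = [0, 0, 20, 20, 20, 0, 0]  # allostasis: demand profile learned from past experience

def simulate(anticipatory):
    output = 0.0
    deviations = []
    for step, demand in enumerate(demands):
        if anticipatory:
            output = predicted[step]          # adjust *before* the demand arrives
        value = SET_POINT + output - demand   # regulated variable after this step
        deviations.append(round(abs(value - SET_POINT), 1))
        if not anticipatory:
            output = demand                   # correction chosen only after the deviation, applied next step
    return deviations

print("reactive (homeostasis-like)   :", simulate(anticipatory=False))
print("anticipatory (allostasis-like):", simulate(anticipatory=True))
```

The reactive controller lags at the onset and offset of the demand, whereas the anticipatory one keeps the variable on target throughout, which is precisely the advantage attributed to allostasis.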
As Feldman Barrett explains in her book How Emotions Are Made: The Secret Life of the Brain, emotional states can also be viewed as allostatic responses because they heavily influence our behaviour. By dynamically producing emotions, the brain prompts behaviours that suit the body's requirements. This can include active responses, like feeling motivated to communicate during social interactions, which can reduce cortisol levels. Conversely, emotions can also inspire inaction by prompting conflict avoidance or encouraging much-needed rest.
The fundamental pillars of emotion: allostasis, interoception and affect
Allostasis is primarily orchestrated by the anterior insular cortex and the anterior, mid, and posterior cingulate cortices. These areas, termed 'limbic' cortical regions, differ from the neocortex's more intricate structures, lacking a granular cell layer IV entirely or possessing only a rudimentary one. They modulate key components of the dopaminergic mesolimbic pathway (namely, the nucleus accumbens and the ventral tegmental area, crucial for reward processing), as well as the central nucleus of the amygdala, the periaqueductal grey, the hypothalamus, and the basal forebrain.
Since emotions can be considered allostatic responses, these same brain regions are responsible for emotion generation. To effectively coordinate allostatic responses—whether evoking emotions like envy or initiating the salivation response before eating—the brain constantly monitors the body's physiological status. This monitoring process, known as interoception, involves the continuous transmission of bodily information to the brain1.
Interoception encompasses a broad spectrum of inputs—visceral, nociceptive, and thermosensitive—as well as proprioceptive signals from muscles, tendons, and joints. These signals are conveyed to the brain via sensory neurons of the vagus nerve and the lamina I spinothalamic tract, informing the brain about the body's condition and requirements.
Importantly, interoception is subjectively experienced as affect—a fundamental facet of human experience. Unlike the intricate nature of emotions, affect unfolds along three primary dimensions: hedonic valence (pleasant/unpleasant), arousal (subduing/energising), and intensity. It may manifest as a primal need, like thirst, or underlie an enduring mood. Affect acts as a barometer for interoceptive input, with fluctuations within the body’s internal organs affecting it the most strongly.
Despite being a simpler construct than emotion, affect serves as the foundation for every emotion we experience. Unlike emotions, affect appears to be universally significant: all languages contain terms describing hedonic valence, as well as diverse levels of energy and intensity.
Predictive processing: at the core of human perception
From the brain’s perspective in the skull, the body is an extension of the external world. In order to appropriately orchestrate movement and perception, the brain must, therefore, run a model representation of the ‘body in the world’, including interoceptive signals and external input. These ‘embodied simulations’ are achieved by intrinsic neural activity within large networks, including the default mode network (DMN). Cumulatively, they account for 20% of total bodily energy consumption.
Our conscious experience of emergent phenomena like emotions is only possible because of the active, predictive nature of the brain. Rather than waiting to passively receive interoceptive input from the body, the brain ceaselessly issues predictions concerning the body's energy requirements and the external circumstances at play.
The brain accomplishes this through intrinsic activity in the limbic cortices and the DMN, which is implicated in many facets of high-level cognition, including autobiographical memory retrieval. These simulations are constantly contrasted with real ascending sensory information from the body, and any discrepancies between the two flows give rise to prediction-error signals that prompt the brain to amend its predictions so that they better match the real-life scenario. This coalescence of predictions onto incoming interoceptive input is the basis of your ability to feel affective states.
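The loop described above can be sketched in a few lines (a toy illustration only, not a model of any actual brain circuit; the signal values and learning rate are arbitrary): a prediction is compared with noisy incoming interoceptive input, and the resulting prediction error is used to amend the prediction.

```python
# Minimal sketch of a prediction-error loop: predict, compare with the incoming
# signal, and nudge the prediction towards what was actually sensed.
import random

prediction = 0.5              # the brain's current guess about a bodily signal (arbitrary units)
learning_rate = 0.3           # how strongly errors amend the prediction (assumed value)

for step in range(10):
    sensed = 0.8 + random.gauss(0, 0.05)   # noisy ascending interoceptive input
    error = sensed - prediction            # discrepancy between prediction and input
    prediction += learning_rate * error    # amend the prediction to better match reality
    print(f"step {step}: sensed={sensed:.2f}, error={error:+.2f}, prediction={prediction:.2f}")
```

Over a few iterations the prediction converges on the incoming signal and the error shrinks, mirroring how the brain's simulations settle onto interoceptive input.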
Concepts: from affect to clear emotional states
Predictive models play a pivotal role not only in stable sensory perception but also in the construction of emotions. Interoceptive stimuli from the body's periphery, intrinsically linked to allostasis and survival, manifest in consciousness as feelings of affect. Affect likely transforms into a higher-dimensional emotion when we encounter concepts associated with specific affective states and their societal implications. In other words, translating affect into familiar, discernible emotions that we can label, like wonder, melancholy, or ennui, requires the integration of conceptual frameworks1.
Concepts are best viewed as patterns of neuronal activation ('brain states'), rather than fixed, static stored memories. The medial prefrontal cortex (mPFC) plays a necessary role in concept construction, being reciprocally connected with the hippocampus and intricately involved in memory consolidation and retrieval. Concepts with features in common (such as ‘things that fly’ and ‘winged animals’) can be encoded as rudimentary memory traces, or ‘engrams’, by overlapping mPFC neuronal populations1. These traces are then integrated with input from the sensory and motor regions that densely innervate the mPFC, allowing fully formed concepts to be dynamically and actively constructed in the moment, much like emotions.
Language is crucial in the process of emotion construction
The brain can only translate sensory input into meaningful perception if you have previously assimilated a concept relating to the stimuli involved, and language plays a vital role in this process. For example, we associate palatable food with adjectives such as ‘delicious’ early in life. And as we gain more experience, our brains learn to attribute different qualities like ‘rich’ and 'sweet’ to flavours that induce different but advantageous changes in our energy budget. Without prior exposure to these concepts, you might not discern between flavours as precisely, and you might feel overwhelmed when trying new foods.
Language is even more critical to emotion generation. Emotions emerge from affect when we have been exposed to specific situations in which the same affective state has been associated with words, whether we have absorbed them as children or intentionally learnt them in a foreign language. This allows emotions to be felt as emergent states, effectively detached from the bodily constituents of their underlying affective states.
Words provide the brain with powerful categorisation capabilities; a German speaker may seamlessly and invariably be overcome by ‘Schadenfreude’ upon experiencing certain stimuli in the right social context. In contrast, someone unaware of the word will be limited to feeling either a mixed affective state or the diluted form of a similar, poorer-fitting emotion.
Since language is recruited to assign meaning to affect, your vocabulary influences your emotional granularity. While learning new words is unlikely to entirely transform the valence of unpleasant emotions, it may prompt your brain to reframe them and distinguish between them with more finesse1.
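As a rough illustration of how vocabulary might sharpen emotional granularity, the sketch below (the concept labels, their coordinates on valence/arousal axes, and the nearest-neighbour rule are all invented for the example, not part of the theory's formal machinery) categorises the same affective state twice: once with a sparse emotion vocabulary and once with a richer one.

```python
# Toy sketch: a richer emotion vocabulary yields finer-grained categorisation
# of the same affective state, represented here as a (valence, arousal) point.
import math

def nearest_concept(affect, concepts):
    """Label an affective state with the closest learned emotion concept."""
    return min(concepts, key=lambda name: math.dist(affect, concepts[name]))

small_vocabulary = {"good": (0.7, 0.5), "bad": (-0.7, 0.5)}
larger_vocabulary = {
    **small_vocabulary,
    "schadenfreude": (0.4, 0.6),
    "melancholy": (-0.4, 0.2),
    "awe": (0.6, 0.9),
}

affective_state = (0.45, 0.65)   # mildly pleasant, fairly energised
print(nearest_concept(affective_state, small_vocabulary))   # -> "good"
print(nearest_concept(affective_state, larger_vocabulary))  # -> "schadenfreude"
```

With only coarse concepts available, the state is filed under a generic 'good'; with a richer repertoire, the same bodily signal is resolved into a finer-grained category.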
Does the theory of constructed emotion make evolutionary sense?
One major criticism of the ToCE could be its perceived lack of alignment with evolutionary principles. If emotions aren’t passive, easily triggered responses, but instead require language to be generated, how did our ancestors stay alive? Given the crucial role deeply ingrained emotions play in fundamental survival tasks like securing food, protecting offspring, and evading predators, such a language-dependent mechanism for generating them would seem evolutionarily unstable and likely to have been eradicated from the human gene pool.
However, just as there is a difference between basic affect (e.g., tiredness) and emotions, there are probably meaningful qualitative differences between different types of emotions. It’s highly likely that primitive emotions are somewhat ‘hardwired’ into the brain from birth in healthy people: not in the sense that they possess specific hardwired brain circuitry (we’ve seen that fear is possible without the amygdala, thanks to the brain’s inherent flexibility), but in the sense that the brain is ready to quickly generate them even in the absence of named concepts or similar past experiences.
Examples of primitive emotions are fear, affiliative drives (e.g., siblings looking out for each other or a mother being driven to care for her offspring), rage, and the drive to explore new places. Feldman Barrett's ToCE is probably more relevant to the range of emotional states beyond these primal emotions: ones that are more sophisticated, nuanced, and less directly tied to immediate survival needs. These emotions, such as the inspiration that motivates us to work, the complex mixture of joy and sorrow during bittersweet experiences, or the subtle nuances of jealousy and resentment, likely rely heavily on our accumulated experiences, mental constructs, and range of vocabulary.
Summary: emotion generation is an active process
The ToCE doesn't imply that every emotion necessitates a named word to be felt. Babies can experience fear prior to learning the associated vocabulary; even if they perceive it more as a bodily sensation than a distinct mental 'experience' of the emotion, it still fulfils a comparable function to adult human fear.
However, the ToCE lifts the lid on a nebulous area of human psychology that was previously oversimplified, highlighting that:
Emotions (basic or sophisticated) probably do not correspond to specific neural circuits. These findings could hold significant implications for psychiatry. Psychiatric symptoms may be produced flexibly, like emotions, and the future of interventions like transcranial magnetic stimulation may lie in more effectively addressing broad networks implicated in specific disorders.
Sophisticated emotions are strongly influenced by language. The human brain uses predictive computation to make sense of the current physiological status of the body, stabilising the flux of incoming inputs into coherent states that are brought to your consciousness as affect. When a particular affective state has been associated with a concept linked to a word or phrase, you can feel it as an emergent emotion instead3.
Emotions are highly linked to our survival, whether primitive or not. Therefore, they are always influenced by our goals, our energy requirements, and the social context that we are in. This is why instances of ‘joy’ rarely coincide with threatening experiences or strong homeostatic impulses like hunger; the brain’s role is to orchestrate appropriate responses, not to always keep your mood as elevated as possible.
On the other hand, finishing an interview, accurately estimating the number of sweets in a jar at a fair or hearing that a murderer has been given a life sentence could all trigger ‘joy’. While these scenarios are very different, the brain could deem the goals underlying them (to perform well, to win prize money, to see a potential threat be removed from society) similar enough, in terms of the body’s requirements, to evoke the same emotional state.
Broadening your vocabulary can enrich your emotional gamut
Languages arise in response to what the inhabitants of a place deem worthy of description. However, they also actively mould their speakers’ perspectives and emotional gamuts by locking them into communication paradigms that the brain uses to decode information from the body and construct more ‘advanced’ human emotions. For example:
The Portuguese word ‘saudade’ describes a sense of homesickness that is melancholy but pleasant.
The Dutch word ‘gezellig’ describes pleasant, cosy, or convivial social situations. It can refer to people, places, or just the underlying ‘warmth’ of the social situation.
Japanese likewise includes words for diffuse, everyday states, such as ‘shouganai’, which conveys a peaceful acceptance of what is beyond one's control.
Experiencing novel things and assimilating new words, both in the languages you already speak and in foreign ones, increases your brain’s bank of encoded concepts. This may be an effective way to entice our brains into assigning slightly different meanings to affective states, enriching the emotional gamut that we can access and helping us reframe negative emotions as more neutral.
References
1 Barrett, L. F. (2017). The theory of constructed emotion: An active inference account of interoception and categorization. Social Cognitive and Affective Neuroscience, 12(1), 1–23.
2 Vytal, K., & Hamann, S. (2010). Neuroimaging support for discrete neural correlates of basic emotions: A voxel-based meta-analysis. Journal of Cognitive Neuroscience, 22(12), 2864–2885.
3 Kragel, P. A., & LaBar, K. S. (2015). Multivariate neural biomarkers of emotional states are categorically distinct. Social Cognitive and Affective Neuroscience, 10(11), 1437–1448.
4 Saarimäki, H., Gotsopoulos, A., Jääskeläinen, I. P., Lampinen, J., Vuilleumier, P., Hari, R., Sams, M., & Nummenmaa, L. (2016). Discrete neural signatures of basic emotions. Cerebral Cortex, 26(6), 2563–2573.
5 Clark-Polner, E., Johnson, T. D., & Barrett, L. F. (2016). Multivoxel pattern analysis does not provide evidence to support the existence of basic emotions. Cerebral Cortex, 26(5), 1875–1878.
6 Craig, A. D. (2015). How Do You Feel?: An Interoceptive Moment with Your Neurobiological Self. Princeton University Press.
7 Guillory, S. A., & Bujarski, K. A. (2014). Exploring emotions using invasive methods: Review of 60 years of human intracranial electrophysiology. Social Cognitive and Affective Neuroscience, 9(12), 1880–1889.
8 Sardi, S., Vardi, R., Sheinin, A., Goldental, A., & Kanter, I. (2017). New types of experiments reveal that a neuron functions as multiple independent threshold units. Scientific Reports, 7, 18036.
9 Jack, R. E., Garrod, O. G. B., Yu, H., & Schyns, P. G. (2012). Facial expressions of emotion are not culturally universal. Proceedings of the National Academy of Sciences, 109(19), 7241–7244.
10 Crivelli, C., Russell, J. A., Jarillo, S., & Fernández-Dols, J.-M. (2016). Recognizing spontaneous facial expressions of emotion in a small-scale society of Papua New Guinea. Emotion, 16(8), 1102–1113.