What does it mean to feel an emotion? What does that mean in the brain?
I talked about predictive processing and active inference before, and there is an important dimension that I didn’t mention (I could say it was for space reasons, but it was actually because I didn’t really think about it at the time). That dimension was the inside of your body.
Take your brain. If we buy into the predictive processing model, and on this blog we buy into that model like 17th century Dutch people bought into tulips, then your brain is constantly generating a predictive model of the world, and then checking it against streams of sensory data.1
Let’s expand on that. The idea is that your brain has a model of what it is seeing, and it has some electrical data trickling through from your eyes; it checks that data against the model, and it updates the model if it thinks the model needs updating. Often, when people think about predictive processing, they stop there. This is already a complex model, and focusing on one sensory modality is enough to get at the core principles, while also allowing neuroscientists to control for variables effectively.
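If it helps to see that loop as code, here is a minimal sketch (made-up numbers, and the “model” is just a single running estimate rather than anything brain-like): each new sample from the eyes is checked against the prediction, and the prediction moves part of the way toward it.

```python
# A toy version of the predict-check-update loop: the "model" is one estimate
# of a visual quantity (say, how bright the room is), and each new sample
# nudges it part of the way. All numbers are made up for illustration.

brightness_estimate = 0.5       # the brain's current prediction
trust_in_new_data = 0.2         # how far a single sample can move the model

incoming_samples = [0.52, 0.55, 0.61, 0.58]   # noisy data trickling in from the eyes

for sample in incoming_samples:
    prediction_error = sample - brightness_estimate              # check the data against the model
    brightness_estimate += trust_in_new_data * prediction_error  # update, but only partially
    print(f"saw {sample:.2f}, model now predicts {brightness_estimate:.3f}")
```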
But your brain also has electrical data trickling through from the ears. And that affects what you see. If you hear a loud crash to your left, your generative model should issue a bunch of predictions that your cousin is not carrying your carefully packed porcelain as well as he could be. You turn your head and expect to see certain visual data - namely, your cousin rifling through the broken china your grandmother gave you and looking apologetically at you.
There is also data coming from your nose, and your skin, and taste buds, and - everywhere. The brain is constantly combining all of this data in order to refine its model of the world. It has to integrate all of these sources of information into its model, and in doing so, it has to issue a second set of predictions about the quality of that data. These second-order predictions are about how precise the data is.
I’ve touched on this before, indirectly, but as your brain combines data about the outside world into its model, it will update not only based on the data, but also on the quality of the data. If you are currently in a snowstorm, your brain is going to treat visual data as less precise, and is less likely to update based on it. If you are currently in a nightclub, auditory data is less likely to be precise. If you are currently being crushed by a trash compactor, your tactile data is less likely - you get the drift.
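Here is the same idea as a toy sketch in code (the estimates and the precision values are made up): each sense reports an estimate plus a precision, the brain’s combined estimate is a precision-weighted average, and context just changes how much precision a modality gets.

```python
def combine(estimates, precisions):
    """Precision-weighted average: noisier senses count for less."""
    weighted = sum(p * e for p, e in zip(precisions, estimates))
    return weighted / sum(precisions)

# Where did the crash come from? (made-up units: -1 = hard left, +1 = hard right)
visual, auditory, tactile = -0.9, -0.6, 0.1

clear_day = combine([visual, auditory, tactile], [10.0, 5.0, 1.0])
snowstorm = combine([visual, auditory, tactile], [0.5, 5.0, 1.0])  # vision downweighted

print(clear_day)   # ~ -0.74, dominated by the visual estimate
print(snowstorm)   # ~ -0.52, drifts toward the auditory estimate
```

On a clear day the combined estimate hugs the visual data; in the snowstorm it drifts toward the ears, without the visual data being thrown away entirely.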
MIXING IT TOGETHER
So we have a bunch of predictions from different sources which are combined into a new model. And the most important of these predictions are the interoceptive ones.
At the risk of sounding like a badly-written Best Man speech: Webster’s defines interoception as “the collection of senses providing information to the organism about the internal state of the body.”
Your visceral bits and bobs - your heart, your lungs, and the rest of your innards - all send signals to your brain that count as interoception, whether directly or indirectly.
Your body is keeping you alive. If you like being alive, that is important.
The balance between interoceptive signals and exteroceptive signals is an interesting one. Allen and Tsakiris make the argument that interoceptive data is actually much more important than exteroceptive data, because interoceptive data relates to things like keeping your heart pumping and making sure your blood is oxygenated. If these functions go wrong, you die. They argue that the brain thus treats interoceptive signals as having very high precision.
Here emerges an important nuance in the predictive model. Your models are designed to allow you to function in the world. They are not designed to be an accurate representation of it. The former does not entail the latter.
Obviously, it mostly helps to have an accurate view of the outside world, but there are plenty of cases where accuracy is either not that helpful or not strictly necessary - you can get by just fine without an accurate model, and accordingly there isn’t much need to update towards one.
So the brain may update its predictions in the way that these charts suggest:

The first chart shows the basic model of predictive processing, which we covered here. The brain starts with its prior, and updates in response to new data (the likelihood). It doesn’t update all the way to the new data, because it has learned a bunch of statistical regularities in the environment which are helpful and unlikely to be erased by a single new data point.
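In code, assuming Gaussian beliefs and made-up numbers, that first chart is just a precision-weighted average of the prior and the data: the posterior lands between them, closer to whichever is more precise.

```python
# Gaussian beliefs combine by precision: the posterior mean is a
# precision-weighted average of the prior and the data, so the model moves
# toward the data, but not all the way. Numbers are made up.

prior_mean, prior_precision = 0.0, 4.0   # learned regularities: fairly confident
data_mean, data_precision = 1.0, 1.0     # a single noisy observation

posterior_precision = prior_precision + data_precision
posterior_mean = (prior_precision * prior_mean
                  + data_precision * data_mean) / posterior_precision

print(posterior_mean)       # 0.2 -- moved toward the data, but nowhere near all the way
print(posterior_precision)  # 5.0 -- and the model is now a little more confident
```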
In the second chart, we break down that likelihood into visual data, tactile data, and interoceptive (or visceral) data. The brain will treat the interoceptive data as inherently more precise than other forms of data, because that data is more important. You need to act in ways that maintain your homeostatic and allostatic mechanisms (a fancy way of talking about the processes that keep you alive).
So if the interoceptive data is shifted left, the brain should update based on that data rather than based on its exteroceptive data, as in the example below:
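Since the chart itself won’t run on your machine, here is the same toy sketch again, with the likelihood broken into visual, tactile, and interoceptive streams and the interoceptive stream given a much higher (made-up) precision:

```python
# Same precision-weighted averaging, but with the likelihood broken into
# visual, tactile, and interoceptive streams. The interoceptive stream is
# shifted left and given much higher precision, so it drags the posterior
# with it. All means and precisions are made up for illustration.

prior = (0.0, 2.0)                    # (mean, precision)
likelihoods = {
    "visual":        (0.5, 1.0),
    "tactile":       (0.4, 1.0),
    "interoceptive": (-1.0, 8.0),     # shifted left, and trusted heavily
}

total_precision = prior[1] + sum(p for _, p in likelihoods.values())
posterior_mean = (prior[0] * prior[1]
                  + sum(m * p for m, p in likelihoods.values())) / total_precision

print(round(posterior_mean, 3))   # ~ -0.59: pulled onto the interoceptive side,
                                  # even though everything else points the other way
```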

DO GET EMOTIONAL
This doesn’t mean the interoceptive data is necessarily accurate. It is just afforded a high precision. In her book How Emotions Are Made, Lisa Feldman Barrett argues that some interoceptive data is notoriously inaccurate - your brain may radically misinterpret signals from your body, or the parts of the body issuing signals may be operating on significant time lags.
So what are emotions then? In this model they are constructs of feeling that allow you to navigate the world. Embarrassment is a tool to solve a certain problem - you realise you have spaghetti sauce on your face on a date, and a complex set of reactions ensues in your body. Your heart rate quickens. You blush like a baboon’s butt. You laugh like a hyena.
None of this is specific to an ‘embarrassment network’ in the brain. Emotion is a label that you and others in a given social context give to a certain pattern of bodily behaviour.
The same circuits that can produce anger can also produce happiness, merely with a different label. Schachter and Singer in the 1960s found that if you injected people with adrenaline without them knowing, the resulting set of emotions - whether that was anger or happiness - was shaped by the context they were in.
Remember, everything that you feel and sense is a simulation created by your brain. This means that when you feel your heart pound in your chest, or when you feel your lungs fill up with air, these are simulations of the interoceptive system by your brain.
All of these things aren’t real, and yet they are very real. They matter for our interactions with the world. As W.I. Thomas said, if we define situations as real, they are real in their consequences.
1. Tulip mania has been largely overstated, but that’s less fun as a story.