by Sophia Virkar
art by Jillian Smith
Introduction
Humans make thousands of decisions every day. We are constantly deciding, whether it's snoozing our alarms or choosing where to invest our money. While neuroscientists have yet to fully understand the brain's decision-making processes, it is generally accepted that our brains are wired to avoid risk and seek reward. However, decisions are rarely that simple; we must also weigh factors such as rationality, emotion, morality, and intuition. Knowing this, how do we find a balance between these competing drives? To make better decisions, it helps to understand how and why we make them.
Understanding Decision-Making
Decision-making is too complex to reduce to a single pathway or even a specific set of structures. In fact, there are many different types of decisions, each engaging different brain circuits and producing different outcomes. Value-based decision-making involves deliberation: a conscious assessment of all the variables associated with a particular decision [1]. An example would be deciding whether to eat a salad or a hamburger. In contrast, perception-based decision-making is considered automatic and effortless, like making the decision to stop at a red light [1].
Although it is difficult to classify decision-making into a single process or specific brain regions, the biological processes at play can be simplified to follow the stimulus-response pathway. Let’s walk through the steps of making the value-based decision to make a salad. First, we receive sensory information from our environment like seeing and smelling the lunch options. Then, sensory neurons receive that stimulus information and relay it to the brain. Here, structures such as the prefrontal cortex and the ventral striatum may become involved. Lastly, motor neurons relay instructions to produce a motor output, like scooping a plate of salad for yourself; however, not all decisions require a motor output [2].
The reward system, also known as the mesolimbic system, plays a significant role in the processes that drive us to make certain decisions. A reward can be defined as a stimulus that the brain associates with positive or desirable outcomes. Recognition of a reward triggers a coordinated release of many different neurotransmitters, primarily dopamine, which initiate feelings of pleasure that guide behavior and motivation [3]. The mesolimbic system involves projections of dopamine neurons from the midbrain, which are important in transmitting information to the prefrontal cortex, hippocampus, amygdala, and many other structures [4]. This means that when we experience a reward stimulus, the mesolimbic ‘reward’ system is activated, causing a release of dopamine [4].
Rational Decision-Making
Rational choice theory is the classical view that we make decisions based on the outcome that will benefit us the most; it is often used to explain economic and social behavior [5]. On this view, people make decisions that maximize self-interest and utility. Say you were given the choice between receiving $0 or $5. Rational choice theory would predict that a rational person picks the higher-value option. But if you've ever seen the question "Take $5, or double it and give it to the next person?", you know that decision-making is much more complicated than that. Still, rational choice theory might be able to explain this situation if someone believes altruism serves their self-interest more than the monetary reward does.
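To make this concrete, here is a minimal sketch of rational choice as utility maximization. Everything in it is an illustrative assumption: the linear utility function, the altruism_weight parameter, and the payoff values are invented for this example and are not taken from the cited literature.

```python
# A minimal sketch of rational choice as utility maximization.
# The utility function, weights, and payoffs are illustrative
# assumptions, not taken from the cited literature.

def utility(payoff_to_self, payoff_to_others, altruism_weight=0.0):
    """Score an option as self-interest plus optionally weighted altruism."""
    return payoff_to_self + altruism_weight * payoff_to_others

options = {
    "take $5": utility(payoff_to_self=5, payoff_to_others=0),
    "take $0": utility(payoff_to_self=0, payoff_to_others=0),
    # Doubling and passing it on pays the chooser nothing directly,
    # but an altruism weight above 0.5 makes it the 'rational' pick.
    "double it and pass it on": utility(0, 10, altruism_weight=0.6),
}

best = max(options, key=options.get)
print(best)  # -> 'double it and pass it on' under this weighting
```

The point of the sketch is that "rational" depends entirely on what the utility function counts: once altruism carries weight, passing the money on maximizes utility too.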
In reality, decisions are made based on a variety of factors, including prior knowledge, the relative merit of the options, risks and rewards associated with their outcomes, and other costs associated with gathering evidence (e.g., the cost of elapsed time) [6]. While not exhaustive, this list covers many types of decisions, both simple and complex [6].
To keep up with the numerous decisions we make each day, our brains have developed shortcuts known as heuristics, which are often, but not always, effective ways to approach a decision [7]. Heuristics stand in contrast to algorithms: sequences of defined rules that guarantee a solution but take longer to apply [2]. For example, if you are reading this article word for word, your brain is using an algorithm; if you are skimming each section for what seems important, you are using a heuristic. Heuristics, however, introduce gaps in decision-making, illustrated by phenomena like the anchoring bias, in which individuals rely too heavily on initial information [8]. For example, if you see a nice sweater priced at $100 with a 70% off tag, you'd be far more likely to purchase it than one originally priced at $30. Another instance is the gambler's fallacy, where people assume the outcome of an independent event is more or less likely based on previous events [9]. If someone flipped a coin three times and got heads each time, it would be easy to guess that the next outcome would likely be tails; in reality, the next flip is still a 50/50 chance of heads or tails. The prevalence of heuristics and biases in decision-making reveals how rational thinking is, more often than not, violated in the real world [10].
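A quick simulation makes the gambler's fallacy tangible. This hypothetical sketch flips a fair coin many times, keeps only the sequences that start with three heads, and checks the fourth flip; the estimated probability comes out around 0.5, just as independence predicts.

```python
# Simulating the gambler's fallacy: after three heads in a row,
# the next flip of a fair coin is still 50/50. Illustrative only.
import random

random.seed(0)
streaks = 0
heads_after_streak = 0

for _ in range(1_000_000):
    flips = [random.random() < 0.5 for _ in range(4)]
    if all(flips[:3]):            # condition on a streak of three heads
        streaks += 1
        heads_after_streak += flips[3]

print(f"P(heads | three heads) ~ {heads_after_streak / streaks:.3f}")  # ~0.500
```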
Emotional Decision-Making
Emotions can influence decision-making both positively and negatively. They tend to steer us away from negative feelings like guilt or regret and toward positive feelings like pride or satisfaction. Once the outcomes of our decisions materialize, we can feel new emotions, like guilt or surprise, that guide our future decisions. Either way, our emotions and decisions go hand in hand.
In the ultimatum game, a classic economics experiment, two participants engage in a simple negotiation [11]. Player One is given $10 to split with Player Two. Player One can offer any portion of the money, from $0 to $10, and keep the rest. The catch is that if Player Two rejects the offer, neither player keeps any money. By simple logic, Player Two should accept any offer, no matter how small, because receiving some money is better than nothing; however, this isn't typically the case. When Player One offers only a few dollars, Player Two consistently turns down the offer, forfeiting the money. When asked why, players explained that they did it to make their 'greedy' partner mad, showing how emotions play a role in decision-making [11].
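The tension between cold logic and emotional rejection can be sketched as two responder rules. This is a toy model, not the experiment's actual analysis: the fairness_threshold of 30% of the pot is an illustrative assumption, and real players' rejection points vary.

```python
# A toy model of the ultimatum game contrasting a purely 'rational'
# responder with one whose emotions impose a fairness threshold.
# The threshold value is an illustrative assumption.

POT = 10

def rational_responder(offer: int) -> bool:
    # Any positive amount beats nothing, so always accept.
    return offer > 0

def emotional_responder(offer: int, fairness_threshold: float = 0.3) -> bool:
    # Reject 'insulting' offers below a fraction of the pot,
    # even at a personal cost, mirroring the behavior reported in [11].
    return offer >= fairness_threshold * POT

for offer in (1, 3, 5):
    print(offer, rational_responder(offer), emotional_responder(offer))
# A $1 offer: the rational responder accepts, the emotional one rejects;
# both accept $3 and $5.
```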
Cognitive neuroscientist Alan Sanfey and his colleagues used functional magnetic resonance imaging (fMRI), a neuroimaging technique that detects changes in blood flow caused by brain activity, to observe people's brains while they played the ultimatum game [12]. The results showed that activity in the anterior insula, a region involved in emotional expression, increased when Player Two received unfair offers [12]. The scans also showed activation in the prefrontal cortex, a region associated with planning and goal orientation, among many other functions; in this context, the goal is to make money. Tracking these two regions, Sanfey observed what appeared to be a balancing act between emotion and reason as players made decisions. When the anterior insula was more active than the prefrontal cortex, Player Two tended to reject the offer; when the prefrontal cortex was more active, they tended to accept it. This balancing act shows how emotions can sway a decision about whether to take a risk [12].
Emotions can be categorized into two types that affect our decision-making in different ways. The first is integral emotions, which arise from the decision at hand and strongly and routinely shape our choices [13]. For example, a person who feels suspicious about taking the subway after dark is more likely to choose the bus. Integral emotions can operate at both conscious and unconscious levels. The second type is incidental emotions, which carry over from one situation into another, unrelated one [13]. For example, if you are angry about missing your train, your anger might automatically lead you to blame others for your being late, even if they have nothing to do with the original source of your anger. This carryover of incidental emotions usually occurs unconsciously.
Moral Decision-Making
Emotions are not the only personal factor influencing decisions; our morals and ethics also play a large role. Moral principles vary among individuals, rooted in distinct sets of fundamental priorities, and can impact decision-making in various ways. A classic example of moral decision-making is the famous trolley dilemma. In this problem, people are told a trolley is barreling down a track toward five people and are given the choice to either (1) pull a lever and switch the track, killing only one person, or (2) do nothing and watch five people get killed [14]. One side argues it is their duty to save five lives over one, but others say the intentional killing of that one person is even worse.
Van Baar and colleagues sought to explore the neural correlates of moral strategies in a less dramatic context [15]. They examined moral strategies in reciprocity decisions (decisions about whether to repay another person's trust or generosity) and proposed two moral motives to explain reciprocity behavior: inequity aversion and guilt aversion. An inequity-averse player aims to ensure an even distribution of rewards; a guilt-averse player aims to match the expectations of the other player. The researchers adapted the "Hidden Multiplier Trust Game," which they renamed the "Investor Game," and identified the brain signatures associated with different moral strategies [15].
In this experiment, participants played the Investor Game while undergoing fMRI scans [15]. On each trial, an anonymous investor could send any number of their 10 game tokens to the trustee (the participant in the fMRI scanner) and keep the remainder. The investor believed the investment would be multiplied by a fixed factor of 4 before being transferred to the trustee; however, the trustee knew the actual multiplier varied across trials (×2, ×4, or ×6). Crucially, the trustee knew that the investor was unaware of the actual multiplier. After the transfer, the trustee could choose to return any number of tokens from the multiplied investment to the investor, but did not have to. The tokens were redeemed for real money at the end [15].
The trustees' varied decisions revealed distinct moral strategies [15]. A guilt-averse trustee returned the number of tokens the investor expected based on the believed ×4 multiplier, irrespective of the trial's actual multiplier. An inequity-averse trustee instead based their decision on the total number of tokens actually received (which depended on the true multiplier). Some participants exhibited a third strategy, greed, keeping as many tokens as possible. Lastly, a group of participants switched strategies from trial to trial based on whichever was most beneficial to them, a pattern termed moral opportunism. Activity patterns in specific brain regions were more similar among participants who shared a moral strategy than among those with differing strategies. These findings indicate that specific moral strategies are encoded in specific brain regions [15].
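A small sketch shows how the four strategies diverge only when the true multiplier departs from the believed ×4. The "return half the pot" rule used here is a simplifying assumption for illustration, not the paper's fitted behavioral model, and the example fixes the investment at all 10 tokens.

```python
# A sketch of the four trustee strategies in the Investor Game [15].
# The 'return half the relevant pot' rule is an illustrative
# simplification; see the paper for the actual behavioral model.

def tokens_returned(invested: int, true_multiplier: int, strategy: str) -> int:
    believed_pot = invested * 4               # investor believes a fixed x4
    actual_pot = invested * true_multiplier   # trustee knows the truth
    if strategy == "guilt_averse":
        return believed_pot // 2              # match the investor's expectation
    if strategy == "inequity_averse":
        return actual_pot // 2                # split the real pot evenly
    if strategy == "greedy":
        return 0                              # keep everything
    if strategy == "opportunist":
        # Follow whichever moral rule leaves the trustee with more tokens.
        return min(believed_pot // 2, actual_pot // 2)
    raise ValueError(strategy)

for m in (2, 4, 6):
    print(m, [tokens_returned(10, m, s)
              for s in ("guilt_averse", "inequity_averse", "greedy", "opportunist")])
# With x2 the opportunist acts inequity-averse (returns 10, not 20);
# with x6 they act guilt-averse (returns 20, not 30).
```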
Intuitive Decision-Making
Decisions aren't always reached through emotional or moral deliberation. Sometimes it is just easier to go with your gut. This could be a "gut feeling" about someone you just met, or a "drop in your stomach" before something goes wrong. Intuition is defined as judgments that arise through unconscious, quick, and holistic associations that combine information [16]. For example, in the trolley problem, your rational thought might have been to spare five lives over one, but your intuition might have said that intentionally killing that one person would be morally worse, and that it is not up to you to decide who lives.
Intuition speeds up the process of decision-making and complements analysis by providing a holistic perspective [16]. Trusting your gut can be controversial: some people make big decisions, like which college to attend, based on a gut feeling, while others dismiss gut feelings as completely unreliable. Research shows that going with your gut can be effective in building distress tolerance and regulating emotions such as fear and stress [17]. For example, raising your hand more often in class, even when you are unsure of your answer, may gradually increase your comfort in making quick decisions. While trusting your gut isn't a perfect method, your intuition can still be a powerful tool for making quick decisions [18, 19].
Conclusion
Human decisions are complex and involve many regions of the brain, such as the prefrontal cortex, amygdala, and hippocampus. These regions work together dynamically to facilitate decisions while assessing risk and reward through dopamine pathways. Factors such as emotions, rationality, morals, and intuition all influence our decisions, as we have seen through experiments like the ultimatum game and the Investor Game. So when making your next decision, financial or not, remember that following your gut instincts, morals, and even your emotions can actually help you make good decisions!
REFERENCES:
1. Loued-Khenissi, L., Pfeuffer, A., Einhäuser, W., & Preuschoff, K. (2020). Anterior insula reflects surprise in value-based decision-making and perception. NeuroImage, 210, 116549. https://doi.org/10.1016/j.neuroimage.2020.116549
2. Schacter, D. L., Gilbert, D. T., & Nock, M. (2023). Psychology (6th ed.). New York: Worth Publishers, Macmillan Learning.
3. Berridge, K. C., & Kringelbach, M. L. (2015). Pleasure systems in the brain. Neuron, 86(3), 646–664. https://doi.org/10.1016/j.neuron.2015.02.018
4. Lewis, R. G., Florio, E., Punzo, D., & Borrelli, E. (2021). The Brain’s Reward System in Health and Disease. Advances in experimental medicine and biology, 1344, 57–69. https://doi.org/10.1007/978-3-030-81147-1_4
5. Zey, M. A. (2015). Rational Choice and Organization Theory. In J. D. Wright (Ed.), International Encyclopedia of the Social & Behavioral Sciences (Second Edition) (pp. 892–895). Oxford: Elsevier. https://doi.org/10.1016/B978-0-08-097086-8.73109-6
6. Shadlen, M., & Roskies, A. (2012). The Neurobiology of Decision-Making and Responsibility: Reconciling Mechanism and Mindedness. Frontiers in Neuroscience, 6. Retrieved from https://www.frontiersin.org/articles/10.3389/fnins.2012.00056
7. Frimodig, B. (2022, November 3). Heuristics In Psychology: Definition & Examples. Retrieved from https://www.simplypsychology.org/what-is-a-heuristic.html
8. Bojke, L., Soares, M., Claxton, K., Colson, A., Fox, A., Jackson, C., … Taylor, A. (2021). Reviewing the evidence: heuristics and biases. In Developing a reference protocol for structured expert elicitation in health-care decision-making: a mixed-methods study. NIHR Journals Library. Retrieved from https://www.ncbi.nlm.nih.gov/books/NBK571047/
9. Matarazzo, O., Carpentieri, M., Greco, C., & Pizzini, B. (2019). The gambler’s fallacy in problem and non-problem gamblers. Journal of Behavioral Addictions, 8(4), 754–769. https://doi.org/10.1556/2006.8.2019.66
10. Loock, M., & Hinnen, G. (2015). Heuristics in organizations: A review and a research agenda. Journal of Business Research, 68(9), 2027–2036. https://doi.org/10.1016/j.jbusres.2015.02.016
11. Houser, D., & McCabe, K. (2014). Chapter 2 - Experimental Economics and Experimental Game Theory. In P. W. Glimcher & E. Fehr (Eds.), Neuroeconomics (Second Edition) (pp. 19–34). San Diego: Academic Press. https://doi.org/10.1016/B978-0-12-416008-8.00002-4
12. Sanfey, A. G., Rilling, J. K., Aronson, J. A., Nystrom, L. E., & Cohen, J. D. (2003). The Neural Basis of Economic Decision-Making in the Ultimatum Game. Science, 300(5626), 1755–1758. https://doi.org/10.1126/science.1082976
13. Lerner, J. S., Li, Y., Valdesolo, P., & Kassam, K. S. (2015). Emotion and Decision Making. Annual Review of Psychology, 66(1), 799–823. https://doi.org/10.1146/annurev-psych-010213-115043
14. Trolley problem. (2023, October 19). In Encyclopaedia Britannica. Retrieved October 31, 2023, from https://www.britannica.com/topic/trolley-problem
15. van Baar, J. M., Chang, L. J., & Sanfey, A. G. (2019). The computational and neural substrates of moral strategies in social decision-making. Nature Communications, 10, 1483. https://doi.org/10.1038/s41467-019-09161-6
16. Kopalle, P. K., Kuusela, H., & Lehmann, D. R. (2023). The role of intuition in CEO acquisition decisions. Journal of Business Research, 167, 114139. https://doi.org/10.1016/j.jbusres.2023.114139
17. Arici-Ozcan, N., Cekici, F., & Arslan, R. (2019). The Relationship between Resilience and Distress Tolerance in College Students: The Mediator Role of Cognitive Flexibility and Difficulties in Emotion Regulation. International Journal of Educational Methodology, 5(4), 525–533. https://doi.org/10.12973/ijem.5.4.525
18. Lufityanto, G., Donkin, C., & Pearson, J. (2016). Measuring Intuition: Nonconscious Emotional Information Boosts Decision Accuracy and Confidence. Psychological Science, 27(5), 622–634. https://doi.org/10.1177/0956797616629403
19. Yip, J. A., Stein, D. H., Côté, S., & Carney, D. R. (2020). Follow your gut? Emotional intelligence moderates the association between physiologically measured somatic markers and risk-taking. Emotion (Washington, D.C.), 20(3), 462–472. https://doi.org/10.1037/emo0000561