Black Friday is here! Perhaps you set an alarm for some unearthly hour this morning to purchase that "once in a lifetime special priced" item. Or you're currently online providing your credit card details to grab that 50% discounted smartwatch. With today's crazy sales popping up via SMS and on radio stations everywhere I am reminded of the psychological nudges at play on the build-up to this day and on the day itself.
First a little background...
Daniel Kahneman (Nobel Prize laureate 2002), in his best-seller "Thinking, Fast and Slow", describes two systems of thinking in the brain:
System 1: Think of it as the rapid-response, reflexive part of the brain - the fight-or-flight response. It is the instinctive part of the brain that comes into play when you slam the brake pedal or swerve out of the way to avoid a collision with pedestrians dashing across the highway. It's also the part of the brain that makes you turn up your nose when you detect a foul odour. The muscle-memory stuff.
System 2: This is the slow, deliberate thinking associated with the prefrontal cortex. It is the part of the brain that aids memory recall, evaluates algorithms and performs complex calculations - the part that works through the slow, deliberate steps when you first acquire a new skill.
To adapt and survive in our everyday surroundings, our brains have become incredible pattern-recognition engines. We are faced with masses of data and spend valuable processing time separating the value from the noise. We develop heuristics that file our thoughts into boxes, making snap decisions about what information to file or discard.
What we file, or discard is largely influenced by our families, the communities that raised us, our schooling, our values, some level of genetic predisposition and so on. When faced with the same information set, two individuals may decide to file or discard very different parts of that set, potentially resulting in two very different decisions or outcomes.
We use these same heuristics to "fill in the blanks". Think of those social media posts with paragraphs in which only the first and last letters of each intended word are in position, with a jumble of characters in between. We are astounded by our ability to make sense of the paragraph, and feel highly complimented by the false statistic that only a small percentage of specially gifted people can decipher it.
No surprise there! We are superb pattern-recognition machines with a strong bias towards story-telling or, put differently, towards creating a cohesive narrative. This is a variation on the phenomenon of recognising elephants and angels in the clouds (the clustering illusion).
We talk about context switching because we recognise the mental drain that frequent switching between mental tasks places on us, especially the more complex tasks. We understand from the work of Kahneman and others that we have a finite concentration span, and that the switching between System 1 and System 2 responses exposes us to cognitive faults that impede our ability to think rationally.
At a management retreat a few years ago, a colleague played a YouTube video of what has become known as the Invisible Gorilla described in the book by the same title. The basic plot: Two teams pass a basketball between them. The viewer objective: Count the number of times the ball was passed.
We were so absorbed in counting the passes - developing a kind of tunnel vision and blocking out all other visual stimuli - that we completely missed the person in a gorilla suit appearing in the video.
Daniel Kahneman, in reference to the same video, concludes that we are sometimes blind to the obvious, and most importantly that we are blind to our blindness.
Our ability to make rational decisions is impeded by a range of cognitive biases - predictable, systematic blind spots in our thought processes. Here I will define a rational decision as one where, given the same set of input parameters, we make the same choices in achieving an outcome. For example, if I am willing to drive for 10 minutes to save R5 on a cup of coffee, I should be equally willing to drive for 10 minutes to save R5 on the purchase of a microwave oven - the saving is R5 in either scenario.
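To make the coffee-versus-microwave example concrete, here is a minimal sketch (the prices are hypothetical, chosen only for illustration) showing that the absolute saving is identical even though the percentage discount makes one deal "feel" far better:

```python
# Hypothetical prices illustrating the rational-decision example:
# the absolute saving is R5 in both cases, yet the percentage
# discount makes the coffee deal feel far more attractive.

def saving(original, discounted):
    """Return (absolute saving, percentage saving) for a price drop."""
    absolute = original - discounted
    percent = 100 * absolute / original
    return absolute, percent

coffee = saving(30.00, 25.00)          # R30 coffee reduced to R25
microwave = saving(1500.00, 1495.00)   # R1500 microwave reduced to R1495

print(f"Coffee:    save R{coffee[0]:.2f} ({coffee[1]:.1f}% off)")
print(f"Microwave: save R{microwave[0]:.2f} ({microwave[1]:.1f}% off)")
# Coffee:    save R5.00 (16.7% off)
# Microwave: save R5.00 (0.3% off)
```

A rational decision-maker would treat the two R5 savings identically; the bias lies in weighting the 16.7% far more heavily than the 0.3%.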
These blind spots, of which there are many, are predictable and impact our decision-making in ways of which we are often blissfully unaware.
So why mention all of this in a data-driven world?
We have worked on the premise that automating our routine activities would elevate our decision-making and enable us to focus on higher-value business activity. Change Management 101 sold this to employees whose roles were candidates for automation.
My view on change - not my original idea, but a model that works well for me - is that change is a two-step process: first, why do I need to change? Then, once I understand the need for change, how can I make that change? Once both questions have been answered, we can overcome the initial change inertia and create the momentum for the transition. (More on this topic in future blogs.)
We now understand that more choices do not necessarily mean better decision-making and selection. More choice helps to a large degree, but a subjective tipping point is reached where additional options actually start to impair our ability to decide, leading us into some form of 'analysis paralysis.'
High-stakes decisions have a similar impact. How do we react? Well, predictably. Sometimes we stick to the default option; at other times, because we are social creatures, we defer to our social circle (think of the "Like" button on Facebook and the impact it has on your decision-making).
Marketers amongst others understand to some extent what we are cognitively blind to, and since time immemorial have used techniques to exploit and manipulate this blindness.
Richard Thaler (Nobel Prize laureate 2017), in "Nudge", the book he co-authored with Cass Sunstein, effectively highlights this, further introducing concepts like choice architecture.
For a real-life application, look at the most recent US Presidential Election and the use of data collection and analytics techniques to dissect and "nudge" the US voter base in Trump's direction.
In this era of digitalisation known as the 4th Industrial Revolution, we have become obsessed with data collection. We talk about wearables, IoT devices, edge computing, machine learning, deep learning, predictive analytics, bots and connected vehicles. Social media platforms vie for your data as there is quantifiable business value in it. Real-time streams of data are available; exabytes of the stuff are persisted and accessible at the touch of a button, or at the command of your voice (my vote for the user interface of the future).
Never has the voice of the consumer been so powerful, yet never have we felt so helpless - so much so that we are roping in AI and deferring our decisions to bots.
What is the endgame? What will be the impact of these powerful influences on our decision making over time, both individually and as corporates? More discussion for another day.
Going back to our original theme, I would like to talk through some cognitive biases specifically with Black Friday in mind.
Lie down on the couch and let's start session one.
When you are online making your purchases, chances are at least 3 biases are being exploited:
- Anchoring Bias
- Availability Bias
- Contrast Effect
My favourite definition of the anchoring bias is one I gleaned from the Harvard Law School site.
Anchoring is a cognitive bias that describes the common human tendency to rely too heavily on the first piece of information offered (the "anchor") when making decisions.
This is visible in most forms of negotiation, where the initial price mentioned becomes the basis for bargaining, and the agreed price seldom differs materially from the initial anchored price.
Anchoring has implications in a business setting too.
The Availability Bias
This refers to how the easy availability of information plays a disproportionately large role in shaping our view of the world. The information does not have to be current, nor does it have to be accurate - just easily accessible. As Rolf Dobelli puts it, we would prefer wrong information to no information at all.
At DVT, we consult with clients of all sizes, building solutions from starter BI kits to enterprise data and analytics solutions.
The dashboards, cubes or reports that we compile for consumption have an intended audience, but most importantly they need to drive certain behaviour that meets identified organisational objectives. Examples of these could be new business development, product cross-selling objectives, profit optimisation and the list goes on.
With Anchoring and the Availability Bias in mind, the information we present, and how we present this information, can and should strongly drive the intended business behaviour. We work closely with our clients to understand key business drivers and to ensure that they are supported in the solutions we implement.
Every metric competes for screen real estate, and typically the discussion revolves around ranking the metrics by their impact in driving that behaviour, constantly checking that each one contributes to the intended business narrative.
Rolf Dobelli states in his book 'The Art of Thinking Clearly' that we struggle to make absolute judgements and need a crutch as a basis for comparison. When making that high-stakes car purchase, we seldom settle on the first vehicle we test drive. At the very least we will test drive one other to calibrate like-for-like features and establish our basis for comparison.
How often in software projects with a waterfall delivery approach have we heard the following?
"I thought I needed this, but actually", or
"Can make the following change", even though you stuck to the letter of the specification document in your waterfall project. Rework costs were significant and you had to defend against scope creep at all costs.
In the absence of a crutch, we notice only dramatic changes or departures from what we intended. In the waterfall context, "this is exactly what we did not want".
The Agile software delivery approach understands the importance of building out "proofs of concept" (POCs) as a mechanism to play back customer requirements and validate that, as solution providers, we are on the correct path. Invariably, changes flow from the POC discussion. In effect, the POC acts as the psychological crutch, the baseline for the discussion.
Agile, through its iterative nature, practically builds out a crutch at each release, with every release serving as an evolutionary basis for comparison - effectively giving us a mechanism to pick up smaller, incremental changes.
Looking ahead to the day.
Thinking in the context of Black Friday, the first information shared with us is that product X was R5000, now down to R3500 - a 30% discount. This is fundamental to the discounting business model. When we see this, we think: what a bargain! (anchoring bias). A similar cognitive response is evoked when we walk into a store and see an "on promotion" sticker. Immediately the mind registers it as a great deal, even if we don't know what the original price was.
We are inherently lazy about digging deep for the right information, so we use the first "map" that comes our way to make our decision (availability bias). It doesn't matter if it is the wrong map, as long as it is a map. In this example, the map is the "was" price or the "on promotion" sticker.
You can guard against having your cognitive biases exploited in the following ways:
Ask yourself the question: Does it really matter what the price was at that retailer/ e-tailer?
Dig deeper and baseline the prices for the items you tend to buy at price comparison sites like Pricecheck.co.za. Understand the going price, and what, in your mind makes a good deal.
Be wary of terms like "on promotion". Do the hard work and compare the prices. Also be wary of percentages. Percentages help to simplify more complex calculations, but 20% off a small base price may not amount to much in Rand terms.
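A quick sketch (with hypothetical deals, chosen purely for illustration) of why a headline percentage needs translating into Rand terms before you let it impress you:

```python
# Hypothetical deals: a big percentage off a small base price can
# save you less actual money than a small percentage off a big one.

def rand_saving(price, percent_off):
    """Absolute Rand saving for a given percentage discount."""
    return price * percent_off / 100

deals = [
    ("Phone cover: 20% off R100", rand_saving(100, 20)),
    ("Smart TV:     5% off R8000", rand_saving(8000, 5)),
]

for label, saved in deals:
    print(f"{label} -> you save R{saved:.0f}")
# Phone cover: 20% off R100 -> you save R20
# Smart TV:     5% off R8000 -> you save R400
```

The headline 20% sounds four times better than 5%, yet the TV deal puts twenty times more Rand back in your pocket.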
If, based on your research in the build-up to Black Friday, you see a good deal now at a price you are willing to pay, go ahead and buy. Don't be afraid of buyer's remorse. (A talk for another blog.)