Even hectic, stress-inducing airports can nudge calm behaviour. The Long Beach Airport in California uses quiet areas, plenty of greenery, and a playful environment
In my last post, I introduced the concept of the nudge. Applied to SafeGrowth, this concept from behavioral economics suggests that human choices and decisions are triggered as much by physical design as by education, legislation, or enforcement.
While there are scientific disputes about nudge theory, there is little doubt the concept has mileage in contemporary community-building, whether the community is physical or virtual. This will become an increasing concern as augmented and virtual reality devices spread.
The concept of CPTED (crime prevention through environmental design) has always been built on a form of nudging. We see it in 1st Generation CPTED: improve the beauty, cleanliness, and image of an area and you nudge offenders away from that location by making it difficult for them to offend with impunity!
In 2nd Generation CPTED, you engage neighbours in local cultural, art, and music festivals, and nudge them towards caring more for their community as they get to know each other, something sociologists call collective efficacy. Research shows how collective efficacy can nudge crime out and positive activities in.
Portland enhances collective efficacy by nudging people outdoors to common street gardens where they can socialize and meet neighbours in a positive way
NUDGING IN THE WILD WEST OF CYBERSPACE
If nudging works in the physical environment, what about cyberspace? I recently attended Calgary’s Cyber Summit with law enforcement agencies and cybersecurity companies from all over the world. Since it is mostly humans who drive social media, can the concept be applied online?
Popular opinion holds that social media is a cesspool! It is destructive to mental health and can lead to crime. It remains a Wild West when it comes to regulation, and the damage filters into the real world.
Those developing social media do not always understand the psychology of impulse behavior and its negative effects. Or perhaps their motives are only about sales or power and not about the harmful impact on daily life? Either way, the damage has been done. That means it is up to us to decide carefully how we should behave.
Can we use the concept of nudge to influence online behaviour? Cybersecurity experts have been attempting to nudge their clients toward safer habits, like using 2-step verification to protect their identities or installing security updates on a regular schedule.
Cybersecurity experts are constantly attempting to nudge safer online habits
In other cases, it is possible to add “trigger words” to the preference settings of your apps to filter what you see. Those filter lists are probably already full of swear words, but you can also choose to screen out words that commonly show up in online scams.
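For readers curious about what happens behind the scenes, here is a minimal, purely illustrative sketch of how such a muted-word filter might work. The word list, function name, and sample feed are hypothetical examples for illustration only, not any platform’s actual code or settings.

```python
# Illustrative sketch of a "muted words" filter of the kind many social
# apps offer in their preference settings. Everything here is hypothetical.

MUTED_WORDS = {"crypto giveaway", "wire transfer", "act now"}  # example scam-related triggers

def should_hide(post_text: str, muted_words=MUTED_WORDS) -> bool:
    """Return True if the post contains any muted word or phrase."""
    text = post_text.lower()
    return any(word in text for word in muted_words)

# Example feed: the second post would be quietly filtered out.
feed = [
    "Community garden meetup this Saturday!",
    "Exclusive crypto giveaway - act now to claim your prize",
]
visible = [post for post in feed if not should_hide(post)]
print(visible)
```

Real platforms are far more sophisticated than this sketch, but the nudge is the same: a small preference setting that quietly steers what reaches you.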
CYBER POWERBROKERS
What else might we do to limit the impact of negative nudging and enhance civil behavior online? Perhaps we can assemble a community of informed online users who advocate filtering out trigger words that contribute to harmful behaviors? That would prevent negative nudging before it begins. Perhaps we can prevent cyber threats like spoofing, deepfakes, and phishing by populating our preferences to avoid risky trigger words?
We are still waiting for lawmakers to regulate cyberspace, or for platform owners and powerbrokers to comply voluntarily. People should take the powerbrokers out of this equation and democratically decide how to govern each other online.
There are obviously pros and cons to filtering words, but we need to have this important discussion about how cyber nudging can help, or hinder, our online experience.
It is practically impossible to find any social space where people are not fixated on their phones or their computers - online behaviour matters in the real world
NUDGING A SAFER, MORE CIVIL WAY?
Are we capable of producing online communities that exist for the community and not for profit? These would be communities that do not profit from your personal information.
We should start with stronger guardrails from our favorite online platforms. Imagine if each of us could adhere to the simple principle of being the positive change we want to see in the online world. If the owners of a platform put up their own guardrails, the community would not need to call out bad behaviour. Perhaps we can find responsible platform owners and showcase how they set the example of what we all want to see in our online world?
I am not advocating for a perfect, everyone-should-be-happy scenario. I am advocating for careful attention to the cyber nudges that cause harmful behaviour and for responsible guardrails to protect the online community. What about more serious consequences for those who do harm for profit (scammers, deepfake fraudsters, ID thieves)?
An online community of civil discourse. Showcasing responsible platform owners.
How about more information about the negative triggers that nudge behaviour in harmful directions (like suicide and harassment)? We already label the harms of tobacco. What about the online harms?
How about creating a savvy user base that participates in civil discourse? If that user base ever comes to pass, we can offer up a quotation attributed to both Nobel laureate writer George Bernard Shaw and US Supreme Court Justice Ruth Bader Ginsburg. It is a concept we also teach in our SafeGrowth classes: it is possible to disagree without being disagreeable. That is the basis of a civil society. Otherwise, we end up with what we see online every day.