The shape of digital thinking - not always what it seems - Stockphoto
Techno-solutionism
(noun) tech·no·so·lu·tion·ism | ˈtek-nō-sə-ˈlü-shə-ˌni-zəm | 1: The idea that every social problem has a technical fix; a mindset that often bypasses community input, local culture, and lived experience in favor of data-driven shortcuts.
by Gregory Saville
Do you know what’s happening in Las Vegas next month?
More gambling and fantasy — except this time, it’s not about blackjack or slot machines. It’s about our future.
The Ai4 Conference is next month — one of the biggest gatherings of AI developers, investors, and corporate clients in the world. It’s packed with major institutional players: banks, financial giants, data brokers, and tech developers. They’re not there to sound alarms. They’re there to map the future — one algorithm at a time.
But not everyone is buying in.
One of the voices at this year’s event is Geoffrey Hinton, widely regarded as the godfather of deep learning. He’s a recipient of the 2024 Nobel Prize in Physics, and he resigned from Google in May 2023 — not in protest of bad engineering, but in alarm over what his own research might unleash.
He’s not there to sell. He’s there to warn.
And if even Hinton is nervous about the direction of AI — especially when it comes to ethics, autonomy, and control — we should all be paying attention.
That includes those of us working in urban safety and crime prevention.
Geoffrey Hinton at his Nobel Prize ceremony - Photo by Jennifer 8. Lee, CC BY-SA 4.0, via Wikimedia Commons
AI AND CPTED - THE SILENT CONVERGENCE
Artificial Intelligence is already reshaping how cities think about safety. I described this convergence in detail during my keynote address on AI and CPTED at the 2021 CPTED Conference in Sweden. It’s happening in ways both subtle and systemic: facial recognition networks quietly expanding in public places; real-time surveillance feeding data into centralized dashboards; algorithms determining who is suspicious and who is "safe".
We’ve raised alarms about these trends before on the SafeGrowth blog — in posts like:
- What 1980s Weather Models Taught Me About AI
- Summoning the Demon: AI in Law Enforcement
- The Pros and Cons of Using AI to Prevent Crime
My keynote at the 2021 conference was titled Artificial Intelligence, Smart Cities, and CPTED: A Threat to the ICA.
Much of this unfolds behind the scenes. And yet, it touches our daily lives, our neighborhoods, and the very public spaces that CPTED seeks to protect.
The problem? These AI tools are often built through a corporate techno-optimist lens. One that:
- Prioritizes efficiency over ethics
- Amplifies surveillance capitalism
- Ignores spatial injustice
- Replicates systemic bias in code, causing algorithmic harm
If we’re not careful, we risk building cities that feel more like open-air data farms than thriving communities.
Marriott Hotel at the Las Vegas Convention Center - location of the 2025 AI conference - photo by Marriott
THE RESPONSE? ETHICS BEFORE ALGORITHMS
That’s why the International CPTED Association (ICA) has convened a new AI and CPTED Subcommittee — a global collaboration of criminologists, computer scientists, planners, ethicists, and practitioners (myself included) working to confront these challenges head-on.
I’m chairing this subcommittee, and after months of research and reflection, the ethical questions have crept into my consciousness like invasive roots spreading beneath the roadways of our communities. You don’t see them right away. But they’re there — widening the cracks.
We’re now finalizing a white paper to guide the CPTED field — a direction-setting document for how to responsibly navigate this new terrain.
RECLAIMING SAFETY IN THE AGE OF AI
This isn’t just a matter for data scientists or tech firms. There is a message shaping up for CPTED practitioners and residents alike. To me, that message goes like this:
We must call for oversight, ethical scrutiny, and community-driven approaches over digital and predictive models that claim certainty where only uncertainty and probability exist.
And above all, techno-solutionism must not prevail. Cities are not math problems to be solved by code. They are ecosystems of people, stories, and space, deserving of care, not automation. Despite what some experts claim, sustainable crime prevention is not a simple matter of quick situational fixes. Livability matters just as much.
We’ll be sharing more soon in our ICA white paper.
For now, the dice may be rolling in Vegas. But the rest of us? We’re not gambling with our future.