IS TECHNOLOGY WORKING FOR US, OR AGAINST US?

THE SAPPERS: SOLDIERS WHO UNDERMINE THE FORTRESSES OF OUR MINDS!

That’s what I’d like to talk about today. But as you know, it’s my habit to always base our conversations on a text and cite sources. Otherwise, I wouldn’t be able to stay grounded, and you wouldn’t be able to trust me.

I picked this up while listening to Friday sermons at the mosque. Our neighborhood imam at Kızıl Minare Mosque, Mahmut Bayram Hocaefendi (may he rest in peace), would always explain which verse from which surah of the Qur’an he was quoting, which hadith of the Prophet (peace be upon him) he was citing, and what topic he would be addressing. If he quoted from other books, he’d even mention the title and page number. The attentive ones in the congregation would take careful notes.

Now, the book I want to reference here is titled NeuroMined: Triumphing Over Technological Tyranny. Translating “NeuroMined” directly into Turkish is no easy task. It’s a wordplay. “Neuro” as in nervous system, “Mind” as in the brain, and by adding just an e, it transforms into “mined”, as in explosive or landmine. To me, “mined” here sounds like the Ottoman Turkish term lağım, meaning a subterranean tunnel used to blow up fortress walls. In fact, in the past, the army had a special unit called lağımcılar—sappers. When laying siege to a fortress, before the days of heavy artillery, they would dig tunnels under the walls, plant explosives, and collapse the structure from within. That’s what gave the besieging force the upper hand.

The book’s cover shows two hemispheres of the brain embedded inside a QR code, with a red underline beneath the “e” in “Mined.” That’s what triggered this whole association for me. The technologies that surround us are, in fact, tunneling beneath our consciousness. We think we’re being smart, productive, and rational when we’re being steered by their enticing allure.

In NeuroMined, authors Robert Grant, a tech company owner, and Michael Ashley, a Disney-affiliated screenwriter and author of 35 books, lay out the risks associated with modern technologies and innovations. (*)

The book is 186 pages, told largely through personal stories, and Ashley’s storytelling skills are on full display. But I’ve stripped the narrative down for you to highlight the core issues they raise: Smartphones, Wikipedia, ESG (Environmental, Social, and Governance), Online Reputation and Perception, Education, Finance, Autonomous Vehicles, the Music Industry, Disinformation, and Surveillance Capitalism.

 

Mobile phones, or smartphones as we now call them, dominate our time. It’s incredible how much of our time is spent staring at those tiny screens, isn’t it? We often complain about how small they are, yet we go as far as to pinch and zoom with our fingers just to see things more clearly. What we see is too tempting to ignore!

 

But is that intelligence, or an addiction? We’re under the constant spell of that screen for every waking moment. These devices are no longer mere phones; they are pocket-sized computers. With endless streams of appealing content and countless apps, we are increasingly tethered to the charm of staying online, driven by FOMO, the fear of missing out.

 

Smartphones

The book’s first chapter focuses on the dangers of smartphone addiction and how these devices are increasingly becoming tools of control:

 

Smartphones are capturing our attention in ways we’ve never experienced before. Many individuals devote 3–4 hours a day to them, with some spending as much as 17 hours.

 

Despite their convenience, smartphones can be regarded as “modern shackles” that subtly control us without our awareness. They track our locations and, while providing easy access to services, they also overwhelm us with stimuli to trigger consumption and keep us hooked.

 

Most people appreciate the conveniences smartphones provide, yet they fail to recognize how they are being steered. I allow all cookies and tracking, since it improves the services I receive. Online shopping, however, is where I draw the line.

 

Technological developments like digital government services have made smartphone use essential for everyone. Apps for tickets, tax receipts, payments, and health credentials like vaccine certificates become inaccessible if you don’t adhere to the prescribed rules.

 

The book suggests that our increasing reliance on these conveniences and dopamine-driven interactions is “guiding us, like sleepwalkers, into an unseen prison.” It cautions us to be more aware of where this allure is ultimately leading us.

 

In short, while smartphones offer immense value, the book reminds us that if misused by authorities, they have the subtle potential to limit personal freedoms and establish new forms of public control.

 

Wikipedia

 

In the second chapter, the authors underscore that while Wikipedia is an invaluable resource, concentrating all knowledge within a single platform creates significant risks. They raise a few concerns:

Excessive power: When a single entity or platform becomes the definitive authority on information, it amasses disproportionate influence, setting the stage for totalitarian control over information.

Definition manipulation: Online content can be silently edited at any moment, which opens the door to manipulation of facts and definitions.

Narrative control: Picture a handful of Wikipedia editors and administrators deciding what the public should know—and whose reputations should be protected or destroyed. That’s an immense level of control.

Metaverse vulnerabilities: As AR and VR make the internet ever more immersive, centralized platforms like Wikipedia will wield unprecedented power to shape users’ experiences and beliefs.

 

While the authors acknowledge Wikipedia’s many advantages, they caution against letting any single institution become the sole arbiter of truth. An ecosystem of diverse sources is essential to check centralized power and prevent manipulation. After all, we may not have a “Ministry of Truth” controlling knowledge and history like in Orwell’s 1984, but modern institutions claiming to “combat disinformation” still exist.

 

ESG

 

The third chapter dives into ESG, which stands for Environmental, Social, and Governance. It’s a scoring system used to rate companies based on their sustainability practices and social impact. ESG funds have grown rapidly in recent years, reaching $360 billion in the U.S. alone.

 

However, critics argue that ESG ratings are arbitrary and politically charged. Companies aren’t necessarily rewarded for genuine environmental or social impact, but rather for virtue signaling or alignment with neoliberal ideals. While I believe this isn’t always the case, misuse does happen.

 

What’s the buzz? ESG scores are allegedly assigned by “gatekeepers” who favor companies that align with their own political beliefs. For example, despite producing electric vehicles, Tesla receives a low ESG score, supposedly because the assessors have issues with Elon Musk. Meanwhile, a weapons manufacturer like Lockheed Martin can earn a higher rating.

 

Online Reputation and Perception

 

The fourth chapter discusses the risks of centralized control over online reputation and perception, and how these systems can be manipulated.

 

The authors imagine a dystopia where people rate each other online, with those scores carrying real weight in their lives. In such a society, every aspect of daily existence is scored from one to five stars, determining one’s social status, and this dynamic breeds individuals who behave solely to win approval.

The chapter also explores how social media shapes our self-expression, encouraging us to project curated versions of ourselves, fit in, and earn societal validation. It paints a dystopian picture in which social networks tighten their grip on individuals, driving personal and societal uniformity until all differences disappear, and in which artificial, superficial relationships come to shape the very fabric of society. The authors then demonstrate how this scenario could easily become a reality.

 

In China, a social credit system is already in place. People are tracked, and their behavior directly affects their score. Reckless driving, smoking in restricted areas, or spreading “fake news” can lower your rating—and with it, your access to public life. The authors argue that similar centralized control systems are technically feasible in the West, enabled by smartphones, AI, and modern surveillance tools.
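The mechanics the authors describe, where recorded infractions lower a score and a central threshold gates access to public life, can be sketched as a toy model. Everything here is hypothetical for illustration: the infraction names, penalty values, and threshold are my own inventions, not details of any actual system.

```python
# Toy model of a centralized "social credit" scheme as described above.
# All names, penalty values, and the threshold are hypothetical.

PENALTIES = {
    "reckless_driving": 50,
    "smoking_in_restricted_area": 20,
    "spreading_fake_news": 100,
}

ACCESS_THRESHOLD = 600  # below this, access to "public life" is restricted


def apply_infraction(score: int, infraction: str) -> int:
    """Deduct the penalty for a recorded infraction from a citizen's score."""
    return score - PENALTIES.get(infraction, 0)


def has_access(score: int) -> bool:
    """A single centrally set threshold decides access to services."""
    return score >= ACCESS_THRESHOLD


score = 650
score = apply_infraction(score, "spreading_fake_news")
print(score, has_access(score))  # one infraction is enough to cross the line
```

The unsettling point the sketch makes concrete is how little machinery is needed: one lookup table and one threshold, both controlled by a central authority.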

 

They discuss the concept of the Panopticon, a prison design that allowed guards to monitor all inmates simultaneously, and argue that today’s surveillance technologies function as a modern “techno-panopticon.”

 

The Panopticon was designed in the 18th century by British philosopher and legal scholar Jeremy Bentham. The term comes from the Greek words pan (everything) and optikon (to observe).

 

Bentham intended the Panopticon for prisons but argued it could also be used in schools, hospitals, and factories. Its basic principle is a circular arrangement of cells around a central watchtower, so that inmates never know when they are being watched but always feel they might be. Ultimately, this psychological pressure leads to self-policing behavior without constant intervention. (**)

 

The takeaway? Online reputation and perception systems, especially when tied to centralized authority and surveillance, are ripe for dangerous misuse.

 

Education

 

In the fifth chapter, the authors turn their attention to data collection in education and the increasing surveillance of students. While the stated goal is to improve learning outcomes, they warn of serious unintended consequences.

 

Schools are now collecting vast amounts of data on students, such as grades, attendance, even biometric data, and behavioral feedback from learning software. The logic is simple: analyze the data to identify students’ needs, personalize instruction, and deliver better results.

 

But the authors draw a direct parallel to the work of B.F. Skinner, the behavioral psychologist who believed in using technology to shape human behavior. As you may know, Skinner (1904–1990) was one of the most influential figures in behaviorism. He conducted extensive experiments to understand and control human behavior.

 

Skinner argued that behavior could be molded through a system of rewards (reinforcement) and punishments. If a behavior is rewarded, it becomes more likely to be repeated; if punished, less so. He claimed that human behavior could be conditioned in the same way as animals, and that this could lead to a more orderly society. In his view, people weren’t truly free; they were shaped by their environments through reinforcement. He challenged the very idea of free will, claiming a society that consciously conditions its members might function better.
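Skinner’s reward-and-punishment dynamic can be illustrated with a minimal simulation: the probability of repeating a behavior moves toward certainty when the behavior is reinforced and toward zero when it is punished. The update rule and learning rate below are my own simplification for illustration, not taken from Skinner’s work.

```python
# Minimal sketch of operant conditioning: reinforcement nudges the
# probability of a behavior toward 1, punishment toward 0.
# The learning-rate model is a simplification invented for this example.

def reinforce(p: float, reward: bool, lr: float = 0.2) -> float:
    """Move the behavior probability toward 1 on reward, toward 0 on punishment."""
    target = 1.0 if reward else 0.0
    return p + lr * (target - p)


p = 0.5                      # initially indifferent
for _ in range(10):          # ten consecutive rewards
    p = reinforce(p, reward=True)
print(round(p, 3))           # probability climbs well above the starting 0.5
```

Swap `reward=True` for `reward=False` and the same loop drives the probability toward zero, which is the whole of Skinner’s claim in miniature: behavior follows its consequences.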

 

Skinner’s work, demonstrating how human behavior can be controlled, has had a major impact far beyond psychology, extending into education, business management, and technology. However, critics have judged his model as overly mechanistic and dismissive of individuals’ subjective experiences. They argue that Skinner did not adequately account for the fact that people are motivated not only by external rewards but also by internal drives (***).

 

So, the question is: has Silicon Valley quietly applied Skinner’s model through data mining, social media, and gaming technologies to guide young people toward certain behaviors?

 

The authors suggest that it’s no longer just about tracking external actions; it’s also about monitoring internal states and emotions. With tech like biometric wristbands, educators can now monitor students’ physical responses while learning. Public health and safety may be the justification, but who else might be accessing this data?

 

Bottom line: while educational data collection may aim to improve outcomes, it also risks conditioning students, limiting their potential, breaching their privacy, and pushing us toward a culture of excessive internal surveillance. These are not small risks, and they demand serious reflection.

 

Finance

 

In this chapter, the authors raise a deeply unsettling concern: the ability of governments and large tech companies to cut individuals off from financial systems if they engage in activities deemed unacceptable by the state. Take Julian Assange, for example. After he leaked documents exposing the truth about the war in Afghanistan, organizations blocked donations to WikiLeaks. Similar tactics were used against Canadian truckers protesting COVID-19 vaccine mandates, and media outlets challenging the narrative on the war in Ukraine faced the same restrictions.

 

We have to acknowledge that restricting access to capital is a serious threat to personal freedom.

 

Autonomous Vehicles

 

This chapter explores the push for self-driving cars and the hidden cost of surrendering personal autonomy in the name of safety and efficiency. The authors argue that driving is a symbol of freedom and control over one’s life. Automation might reduce accidents in theory, but it also hands over excessive power and surveillance capabilities to the authorities, which can curtail human autonomy.

 

I agree. I’ve been driving for 49 years, thankfully with no major incident. But I don’t want my car talking back to me. It can assist, but only if I ask for it. I still remember my first encounter with ABS brakes. The system told me, “This is how you brake,” while I nearly rear-ended the car in front.

 

The authors warn that the push toward autonomous vehicles could end with our right to drive ourselves being revoked, and they suggest that the only remedy for such overreach is mass civil disobedience and human will. If authorities try to force this on us, collective resistance may be the only way to push back.

 

I don’t think it has to be that dramatic. Just give people a choice and let them decide.

 

The Music Industry

 

Now let’s talk about music. The authors recall how, in the past, artists had to sign lopsided contracts with managers and record labels that offered them little creative freedom or financial compensation. It was a common theme, even in our own Yeşilçam films. Similarly, in Hollywood, there were infamous cases like Colonel Parker’s exploitation of Elvis Presley and Ike Turner’s abuse of Tina Turner. The power dynamics were one-sided.

 

According to the authors, back then, intermediaries like managers and record labels were necessary to help artists gain visibility. But today, technology has changed the game, allowing artists to connect directly with their audiences. The rise of non-fungible tokens (NFTs), for instance, now allows creators to issue certificates of authenticity and release their work straight to the market without a middleman.
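The “certificate of authenticity” idea behind NFTs can be illustrated with a cryptographic fingerprint: hash the work so that anyone can later verify a copy matches what the artist originally released. This is only a sketch of the verification half; real NFTs additionally record ownership on a blockchain ledger, which is omitted here, and the artist name and data are invented for the example.

```python
# Toy illustration of a content-authenticity certificate: a SHA-256
# fingerprint binds an artist's name to an exact sequence of bytes.
# Real NFTs add an on-chain ownership record, which is not modeled here.

import hashlib


def issue_certificate(work: bytes, artist: str) -> dict:
    """Bind the artist's name to a SHA-256 fingerprint of the work."""
    return {"artist": artist, "sha256": hashlib.sha256(work).hexdigest()}


def verify(work: bytes, certificate: dict) -> bool:
    """Check that a copy of the work matches the certified fingerprint."""
    return hashlib.sha256(work).hexdigest() == certificate["sha256"]


song = b"...master recording bytes..."        # stand-in for the actual file
cert = issue_certificate(song, "Example Artist")
print(verify(song, cert))        # True for the genuine file
print(verify(b"tampered", cert)) # False for anything altered
```

Because the fingerprint is published by the artist directly, no label or manager needs to sit between the creator and the audience, which is precisely the disintermediation the authors describe.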

 

So, if up-and-coming artists once found themselves exploited by managers and record labels, they now have tools like NFTs to challenge, and even dismantle, that old power structure.

 

Disinformation

 

In the ninth chapter, the authors examine government and institutional efforts to regulate and restrict so-called “disinformation” in the name of public safety. They raise several serious concerns about these initiatives:

  • Given their own history of spreading misinformation, governments and bureaucracies should not be the ultimate arbiters of what is true or false.
  • Critics from both the political left and right have spoken out, arguing that these efforts are driven less by public safety than by political motives.
  • To counter censorship by governments and big tech companies, there must be alternative, independently financed and organized channels for disseminating information.

 

As the authors point out, real journalism requires the courage to investigate and expose wrongdoing.

 

In short, while “disinformation governance” is being marketed as a tool to protect the public, the authors see it as a threat to free speech and an open internet. They advocate for decentralized technologies like Web3 as possible solutions, which can promote freedom of information and offer alternatives to centralized media control.
(See: https://acikkaynakfikirler.com/merkezi-olmayan-internet-nedir/)

 

We’re witnessing a real-time example. There’s a boycott movement in response to what’s happening in Gaza. It’s not particularly well organized or articulated, but I still respect the intent. That said, let’s not waste time on the absurd rumors targeting Ülker, spread through fake news and rival trolls. The real issue? No one seems to notice, or perhaps everyone feels powerless against, the social media platforms that actually fit the definition of boycott-worthy targets. And why? Because their algorithms are already suppressing the wide dissemination of these atrocities under the guise of humanism.

 

There’s a popular phrase: “You are what you eat.” But maybe it’s time we revise it: “You are what you speak.”

 

Surveillance Capitalism

 

In the final chapter, the authors discuss the dangers of excessive state access and surveillance capitalism in the context of recent global crises such as the 9/11 attacks, the 2008 financial crisis, and the COVID-19 pandemic. Their claim: governments and powerful corporations have exploited these moments to expand control over our lives and data.

 

In this context, a clear example is the World Economic Forum’s “Great Reset” vision, which I’ve discussed in previous articles (see: https://muratulker.com/y/covid-19-buyuk-reset-fabrika-ayarlarina-donmek). Under this model, stakeholder capitalism and Fourth Industrial Revolution technologies will be used to push societies toward shared goals defined by governments and elite institutions. However, the WEF also promotes ideas like “You will own nothing and be happy.” Understandably, this raises concerns around property rights.

 

At some point, you start to wonder: can this still be called capitalism? (After making this comment, I came across an article by İsmet Berkan on techno-feudalism, along with a related video, and realized I’m not alone in thinking this way:
https://10haber.net/yazarlar/ismet-berkan/ib-gundem/2025-03-15/
https://www.instagram.com/reel/DG0vhEiNRI4/?igsh=enFqY2x5bGhqZXJv)

 

The authors call for legal action to protect constitutional privacy rights, greater public engagement in policymaking, and the encouragement of civic courage to resist technocratic control. While tools for individual data sovereignty are generally seen as a step in the right direction, the authors argue that broader, systemic change is still necessary.

 

In essence, the core message is clear: the recent crises have paved the way for deeper state and corporate surveillance, but the fight isn’t over. There is still time to take a stand and defend individual rights and freedoms. Even the smallest step can create a meaningful impact and encourage the public to join the fight against technocratic control—and that’s precisely why they wrote this book.

 

My late father once told me: “Son, through science and discovery, mankind has managed to improve plants, animals, and just about everything else… everything, that is, except mankind itself.” Sadly, that still holds. Yet ethics and conscience are innate to every human being. We instinctively know what’s right and naturally gravitate toward it. After all, the Prophet Muhammad (PBUH) said, “I have been sent only to perfect good character.” And Jesus (peace be upon him), the prophet before him, was sent to reform moral conduct through the laws of the Torah.

 

(*) Grant, R.E. and Ashley, M. (2023). NeuroMined: Triumphing Over Technological Tyranny, Fast Company Press, 186 pp.

(**) https://tr.wikipedia.org/wiki/Panoptikon

(***) https://tr.wikipedia.org/wiki/Burrhus_Frederic_Skinner

 

Note: This article is open source; it carries no copyright restrictions and may be quoted with attribution to the author.

 
