DO YOU BELIEVE? WHAT DO YOU BELIEVE? WHY DO YOU BELIEVE? HOW DID YOU COME TO BELIEVE?

ARE WE FOOLS? WERE WE DECEIVED?

There is no single answer to these questions. The desire to reach the truth, the urge to suppress uncertainty, the need to make sense of things or to find an anchor… all of these are tied to belief. Yet not every belief rests on a rational, “logical” foundation. At times, what shapes our beliefs is not our critical capacity but our needs, or the environment we live in.

Psychology even has a term for this state of vulnerability: the Gullibility Effect—the tendency to be easily deceived. It describes our inclination to accept false or misleading information without questioning it.

Note: Originally published on July 4, 2025.

 

In psychology, there is a concept known as gullibility. Often mistaken for foolishness, it describes something subtler: a certain naiveté, an inclination to be easily deceived. At its core, it is the act of accepting information without questioning its accuracy.

How many of the things we believe do we truly understand? How many have we carefully thought through, examined critically? And how many have we absorbed simply because they were passed down in our families, schools, neighborhoods, or across the screens before us? Do we really need something to be true to believe it?

 

This kind of naiveté, sometimes bordering on folly, may arise from inattention, from weak critical faculties, or even from an excess of trust. Individual factors such as analytical thinking, level of education, and media literacy are certainly important. Yet questioning demands effort; thinking takes time; doubt requires courage. Believing what is already there, on the other hand, is easy. It is fast. It is comforting. But to view the issue only as a matter of individual shortcomings would be misleading, for gullibility is also shaped by the conditions of the society we live in.

 

So, the real question becomes:

 

Are you gullible?

 

Most of us would answer: I doubt it. Yet believing is easy; thinking takes effort. Why? Because thinking means embracing inner tension. Doubt requires surrendering certainty and confronting uncomfortable possibilities. Intuition, by contrast, is much quicker. The mind’s first response to an idea is to treat it as “true”; considering it false requires a second step. Our first reflex is to believe. Daniel Kahneman calls this System 1: a fast, intuitive, effortless, but superficial mode of thinking. It serves us well in daily life—but when distinguishing truth from falsehood, it can lead us astray. Intuition gravitates toward what is comfortable; it favors convenient thoughts over complex realities.

 

According to Daniel Gilbert’s research, when we first encounter a proposition, the mind processes it as if it were “true”. The evaluation of its potential falsehood belongs to a later stage of cognition. Our first reflex is not skepticism, but acceptance—an acceptance that brings not only mental ease but also emotional relief.

 

Belief, then, is not as rational as we might assume. At times, it is instinctive, an inclination to choose what is familiar, comfortable, or easy. Sometimes reality is set aside simply because it is too complex, and over time, that avoidance hardens into belief.

 

Believing Without Thinking and the Pull of Populism

 

Some people cling to their beliefs no matter how contradictory, unfounded, or absurd those beliefs may appear. This has nothing to do with a lack of intelligence. Even highly educated, cultured individuals sometimes defend such beliefs as if they were proven truths. The reason is simple: belief is not always anchored in knowledge but in a sense of truth.

Research by Van Prooijen and colleagues shows that individuals with a populist worldview place greater trust in intuition. For them, knowledge becomes meaningful not through reasoning but when it feels “intuitively true”. And that intuition is nourished less by logic than by emotions and belonging. In this sense, gullibility is not a way of thinking, but a form of “not thinking”. Questioning means facing one’s inner contradictions. However, intuitive thinking sweeps those contradictions away before they can surface.

This is where populist rhetoric draws its power. It offers simple solutions to complex problems; it sharpens the divide between “us” and “them”; it reinforces insecurity not with knowledge but with anger. And this intuitive confidence is not limited to conspiracy theories. It extends to pseudo-scientific claims, supernatural assertions, and even to statements that sound convincing but are ultimately meaningless. The research reveals that this tendency is not only about “believing falsehoods” but also about “abandoning the search for meaning itself”. At that point, gullibility ceases to be a personal weakness and becomes a social phenomenon.

 

Societies, Distrust, and the Compulsion to Believe

 

Beliefs may appear personal, yet they often mirror society. We do not believe only on our own—we believe what others believe. Often, we even make a point of not questioning what others believe, because questioning isolates. The more trust erodes within a society, the more belief becomes a form of defense. This relationship was revealed in a striking study conducted by Sinan Alper and his colleagues, who analyzed conspiracy beliefs across different countries.

 

Their findings are revealing: in societies where corruption is widespread, belief in conspiracy theories becomes far more common. In some contexts, such theories almost take the place of reality itself! Why? Because in societies where corruption has been normalized, and transparency and honesty have become rare exceptions, people find it more “realistic” to believe that the truth is being hidden than revealed. In such environments, conspiracy theories are not mere fictions; they also serve as tools for making sense of what is happening.

 

In these societies, being deceived is less an individual failing and more a collective form of defense. People live within an illusion; lacking alternative explanations, they embrace—or rather, are compelled to embrace—rumors collectively. Why? Because in a system where the truth is constantly obscured, being deceived into belief is rarely a matter of choice; it looks more like the only option left. In such settings, the line between believing and being deceived grows blurred, and gullibility ceases to be an individual naiveté and becomes a collective adaptation.

 

Another important finding of the research is that the role of individual differences in gullibility depends on context. In countries with low corruption, education and analytical thinking skills protect against conspiracy beliefs. But in societies where corruption is widespread, these safeguards lose their force. The social foundation required to resist belief crumbles, and beyond individual effort, gullibility—naiveté itself—becomes the default. To see the truth, eyes alone are not enough; there must also be light. And in some societies, that light is systematically dimmed.

 

Believing in Truth or Arriving at Belief Through Deception?

 

We define human beings as seekers of truth. Yet perhaps that is not always the case! Are we truly in pursuit of truth, or are we seeking meaning in our lives? Sometimes reality is accepted only when it is bearable; otherwise, meaning takes its place. People do not always search for what is right; often, they turn to whatever belief makes them feel safer and less alone.

 

In his April 15, 2024 article in The New Yorker, “Don’t Believe What They’re Telling You About Misinformation,” Manvir Singh makes a compelling point: some beliefs are not designed to generate knowledge but to forge connections. People cling to such beliefs not because they are true, but because they help define who they are in society. In Singh’s words, these are “symbolic” beliefs. To adopt them, he says, is to signal belonging—to show that you are part of a group, an identity, a community, and that you are willing to uphold it.

 

The function of sharing an idea is not always to express belief in it. At times, carrying that idea serves as a social code, a marker of identity, and a way to forge invisible bonds with those who resemble you. Singh describes this as “believing not in something, but in someone through that something.” When a person says, “Yes, the world is flat,” their intent is not to debate physical reality but to declare, “I am on your side.” At that point, truth loses its relevance; what matters is the relationship.

This is where the power of symbolic beliefs lies. Do we truly believe, or do we simply cling to the social capital that belief provides? Singh explains that someone who professes faith in God does so not only at a theoretical level but also as part of a community of believers, to form an emotional bond. Such beliefs may not dictate every action, but they give us a sense of belonging. He even suggests that in a church service, when someone sings a hymn, they are often connecting less with God and more with the congregation around them. Likewise, a person at Friday prayers may be bonding less with God and more with the community. Do you agree? I leave the question open.

 

Singh also points to politics and conspiracy theories. Believing in a conspiracy theory, he argues, does not mean treating it as verified knowledge. Rather, such a belief may embody anger, exclusion, or a sense of belonging. And though such beliefs can be easily disproven, they are difficult to abandon, because to abandon them is to face loneliness. This is why combating misinformation cannot be reduced to simply providing more accurate information. At times, people choose to believe simply because they need to. Without addressing that need, no amount of data can truly change minds.

 

Gullibility is not simply about being deceived. Sometimes it is a kind of fatigue that postpones thinking. Sometimes it is a quiet compromise to avoid loneliness. In a world where everyone believes in certain things, doubt is a virtue reserved for the few. To be skeptical is, more often than not, to be alone. So, people tend to choose not what they themselves know, but what those around them believe. This is not only about the ability to question; it is about the courage to do so.

 

AND WHAT ABOUT ME?

 

Even the search for truth itself depends on trust. Without trust, truth stands alone, powerless, stripped of meaning. I chose to believe in Muhammad al-Amin (PBUH). He was the most trustworthy of men. Once, he invited his close relatives to a meal and asked them: “If I were to tell you that there is an enemy army behind that hill, would you believe me?” They replied: “Indeed, we would believe you, for you are al-Amin, the most trustworthy.”

 

Yet when he invited them to the faith he was entrusted with—to believe in one God—they responded: “We will not abandon the ways of our forefathers.” Unable to renounce the comfort, privilege, and wealth that belonging had given them, they imposed years of embargo and hardship on the believers, leaving them hungry and destitute.

 

By the time he (PBUH) re-entered Mecca, everyone had professed faith. Yet in truth, they had merely become Muslims; in other words, they had submitted. True belief, however, is a matter of the heart, and only God knows and judges what lies within it.

 

Unfortunately, history shows that kings, rulers, and systems have long sustained their power by deceiving societies. From Roman emperors proclaimed as gods to Pharaohs declaring themselves divine, such has always been the case. And societies, in their gullibility, followed. People either aligned themselves with those in power and profited, or they followed the majority, becoming gullible masses clinging to false beliefs and conspiracies. Those who resisted? They were persecuted, becoming the oppressed.

 

Yet, there have always been just kings and righteous systems as well. So, where do you stand today? The choice is yours.

 

References

Alper, S. (2023). There are higher levels of conspiracy beliefs in more corrupt countries. European Journal of Social Psychology, 53, 503–517.

Forgas, J. P., & Baumeister, R. F. (Eds.). (2019). The Social Psychology of Gullibility: Fake News, Conspiracy Theories, and Irrational Beliefs. Routledge.

van Prooijen, J.-W., Cohen Rodrigues, T., Bunzel, C., Georgescu, O., Komáromy, D., & Krouwel, A.P.M. (2022). Populist Gullibility: Conspiracy Theories, News Credibility, Bullshit Receptivity, and Paranormal Belief. Political Psychology, 43: 1061–1079.

Singh, M. (2024, April 15). Don’t Believe What They’re Telling You About Misinformation. The New Yorker. https://www.newyorker.com/magazine/2024/04/22/dont-believe-what-theyre-telling-you-about-misinformation

 

*Cover image generated by ChatGPT.

Note: This open-source article does not require copyright and can be quoted by citing the author.

 
