Self-Gaslit Minds in the Age of AI
The most expensive sentence missing from Indian education and industry is also the shortest: “I don’t know.”
A society can survive poverty, bad roads, bureaucratic absurdity, erratic electricity, mediocre software, and the national habit of forming a queue only after first attempting a small civil war near the counter. What it cannot easily survive is a large class of people who have learned to perform knowledge instead of acquiring it. That is not a small defect. It is not a personality quirk. It is civilizational sand in the gearbox.
The trouble begins early. A child is not trained to say, “I do not understand this step.” He is trained to fear the social cost of saying it. The classroom becomes a theater where comprehension is less important than posture. A hand is raised not because a doubt exists, but because a doubt must not be seen to exist. Marks become reputation. Reputation becomes family oxygen. Family oxygen becomes the whole point of the exercise. The result is not education. It is a long, sweaty rehearsal in plausible competence.
This is why so many students can pass exams yet cannot explain the subject in ordinary language. They can recite. They can pattern-match. They can deploy the correct buzzword at the correct moment, like a waiter producing cutlery before the soup arrives. But ask them what the thing actually means, what breaks if one assumption changes, why the formula works, where the model fails, what the code is doing after the tutorial ends, and suddenly the magnificent palace of confidence turns out to be painted plywood and bamboo scaffolding.
Industry then inherits this damage and adds fluorescent lighting.
In many workplaces, the same drama continues with better chairs. People attend meetings where no one understands the system fully, but everyone speaks as if comprehension is a contractual obligation. A junior engineer nods because the senior engineer is nodding. The senior engineer nods because the architect is nodding. The architect nods because the client has paid money and therefore reality must now adjust itself. Somewhere in the basement, a database is coughing blood into a log file.
The problem is not that people are ignorant. Ignorance is normal. Ignorance is the natural starting condition of every human being not born as a smug PowerPoint slide. The problem is the refusal to locate ignorance honestly. Technical progress depends on accurate maps of not-knowing. Science is built on them. Engineering is built on them. Debugging is built on them. A good engineer does not merely know things; a good engineer knows the boundary of what he knows, the boundary of what the system knows, and the boundary where everyone is guessing while wearing formal shoes.
Self-deception destroys that boundary.
A mind that lies to itself cannot learn properly because it cannot find the wound. It is like a doctor treating fever by repainting the thermometer. The student says, “I know Python,” but cannot explain mutability. The developer says, “I know databases,” but treats indexing as a kind of decorative furniture. The analyst says, “I know statistics,” but thinks correlation is causation wearing a nicer shirt. The project manager says, “The integration is complete,” because messages are moving, though no one has checked whether the receiving system interprets them correctly. Transport has occurred. Meaning has not. The parcel has reached the house, but inside it is a goat, a cricket bat, and someone else’s medical report.
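The mutability gap mentioned above is not abstract. A minimal sketch of the classic trap a person who "knows Python" but cannot explain mutability will walk into (the function name is invented for illustration):

```python
# Aliasing: assignment copies the reference, not the list.
a = [1, 2, 3]
b = a          # b is another name for the same object, not a copy
b.append(4)
print(a)       # [1, 2, 3, 4] -- a changed too

# The same misunderstanding with default arguments: the list is
# created once, at function definition, and shared across calls.
def collect(item, bucket=[]):
    bucket.append(item)
    return bucket

print(collect("x"))  # ['x']
print(collect("y"))  # ['x', 'y'] -- not a fresh list, as many expect
```

The person who can recite "lists are mutable" but cannot predict either output has the certificate without the comprehension.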
This distinction matters everywhere. Moving data is not the same as understanding data. Passing an exam is not the same as grasping a concept. Writing code is not the same as reasoning about a program. Producing a dashboard is not the same as knowing what the numbers represent. The Indian system, with its old talent for social choreography, often rewards the visible artifact and ignores the invisible comprehension. A certificate glitters. A GitHub profile glitters. A résumé glitters. A LinkedIn post glitters. Inside the glitter, there may be genuine ability. There may also be sawdust.
And now comes Artificial Intelligence (AI), carrying a tray of fresh masks.
AI did not create pretension. It merely industrialized it. Earlier, a person who did not know could bluff clumsily. Now he can bluff fluently. He can generate an answer, polish it, rearrange it into executive diction, add a diagram, create pseudo-depth, and walk into the room with the confidence of a man who has borrowed not only someone else’s umbrella but also the rain. AI has put knowledge arbitrage behind a veil. The old trick was to copy from a senior, a coaching note, a Stack Overflow answer, or a cousin in New Jersey. The new trick is smoother. The borrowed mind speaks in perfect paragraphs.
This makes the problem more dangerous because incompetence has acquired good grammar.
In the past, not knowing had rough edges. The answer would wobble. The syntax would crack. The copied code would have strange variable names from some dead tutorial. Now the surface is clean. AI can produce the language of understanding without the inner machinery of understanding. It can explain monads, Kubernetes, quantum mechanics, transformer models, and fiscal policy with the same polished calm, like a hotel pianist playing through a small fire near the buffet. The reader, interviewer, manager, or client must work harder to distinguish comprehension from generated fluency.
A society already fond of performance will find this intoxicating.
Students will submit work they cannot defend. Developers will paste code they cannot repair. Consultants will sell architectures they cannot operate. Founders will describe products they cannot build. Influencers will publish explainers on topics they met fifteen minutes earlier in a prompt box. The marketplace will not immediately punish them, because markets are not moral gods with excellent eyesight. Many organizations reward speed, confidence, and vocabulary before depth. By the time the fraud is discovered, the system may already be in production, the money spent, the team scattered, and the root cause converted into a “lessons learned” document no one reads.
The deeper issue is cultural. We have mistaken shame for discipline and confidence for competence. Shame makes people hide weakness. Discipline makes people work on it. Confidence may help a person begin; competence decides whether anything survives contact with reality. But in a crowded, status-hungry society where jobs are scarce and humiliation is abundant, pretense becomes a survival tactic. If everyone is selling certainty, the honest beginner looks defective. If everyone claims “full stack,” the person who says, “I know backend basics but need help with deployment,” sounds weak. In such a world, honesty becomes professionally expensive.
This is why the disease reproduces itself.
The teacher pretends to teach. The student pretends to learn. The coaching center pretends to transform. The company pretends to assess. The employee pretends to know. The manager pretends to believe. The client pretends the delivery date is real. The whole machine runs on a kind of mutual diplomatic immunity from truth. Everyone suspects the building is hollow, but no one wants to be the first to tap the wall.
The scientific cost is severe. Real science begins with disciplined doubt. It advances because someone says, “This result is surprising,” “This assumption is weak,” “This measurement is noisy,” “This model is overfitted,” “This causal story is not established,” or “We do not yet know.” These are not timid sentences. They are the steel beams of progress. Without them, research becomes decoration. Engineering becomes ritual. Data becomes astrology with spreadsheets.
A country can produce many degree-holders and still remain technically shallow if it does not produce people who can interrogate their own understanding. That interrogation is painful. It requires private embarrassment. It requires the ability to sit with confusion without immediately wrapping it in jargon. It requires mentors who do not crush questions. It requires classrooms where doubt is not treated like a contagious skin disease. It requires interviews that test reasoning rather than memorized trivia. It requires workplaces where saying “I need to check” is not career suicide.
The difficulty is that these reforms are not merely curricular. You cannot fix this with one new syllabus, one national portal, one AI literacy module, one hackathon, or one ministerial speech written in the usual cement-mixer language of transformation, empowerment, and innovation. The habit of pretending is braided into family pressure, exam culture, caste and class anxiety, English-language hierarchy, job scarcity, managerial insecurity, and the small daily terror of being exposed as ordinary. That braid is old and tight.
It may take generations to loosen.
One generation has to learn that not knowing is not disgrace. Another has to build institutions where admitting ignorance does not invite cruelty. Another has to normalize slow competence over fast signaling. Another has to create assessment systems that reward explanation, repair, transfer, critique, and original application. Another has to develop managers who can tell the difference between fluency and understanding. This is not the work of one heroic reformer arriving with a laptop and a slogan. It is plumbing. It is sewage work. It is the unglamorous rebuilding of how a society relates to truth.
AI makes this harder, but it also makes the solution clearer.
If AI can answer ordinary questions, then human education must move beyond answer-production. The valuable person is not the one who can generate text. The valuable person is the one who can test it, challenge it, contextualize it, improve it, detect its omissions, and connect it to reality. In coding, that means reading generated code and knowing why it fails. In statistics, it means knowing whether the model is answering the question actually asked. In science, it means designing experiments that separate signal from theater. In industry, it means building systems whose behavior can be explained under stress, not merely demonstrated under ideal conditions while everyone smiles like hostages in a vendor webinar.
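A hypothetical illustration of what "reading generated code and knowing why it fails" means in practice. The first function below reads fluently and even runs, which is exactly why fluency is not comprehension; the repair requires actually understanding the definition:

```python
# Plausible-looking "generated" code: compute the median of a list.
# It runs without error, but is wrong for even-length input, where
# the median should average the two middle values.
def median(values):
    ordered = sorted(values)
    return ordered[len(ordered) // 2]

print(median([1, 3, 5]))     # 3 -- correct for odd length
print(median([1, 3, 5, 7]))  # 5 -- wrong: the median is 4.0

# The repair, once the failure is located rather than concealed:
def median_fixed(values):
    ordered = sorted(values)
    n = len(ordered)
    mid = n // 2
    if n % 2 == 1:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2

print(median_fixed([1, 3, 5, 7]))  # 4.0
```

The valuable skill is not producing the first version; a prompt box can do that. It is noticing the even-length case, naming the failure, and fixing it.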
The practical implication is blunt: every serious technical institution should make ignorance visible and safe, but not comfortable. Safe means a student or employee can admit confusion without ridicule. Not comfortable means confusion must lead to work. “I don’t know” is not a sofa. It is a doorway. The sentence must be followed by “Here is how I will find out,” “Here is what I tried,” “Here is where my reasoning broke,” or “Here is the smallest experiment that can test this.” That is the culture of engineering. That is the culture of science. Not the swagger. Not the certificate. Not the English accent. Not the LinkedIn fog machine.
A good technical interview should ask candidates to explain a concept at three levels: to a child, to a peer, and to a production team that must maintain it at 2 a.m. A good classroom should reward the student who asks the question everyone else is silently smuggling under the bench. A good manager should distrust perfect confidence from people who have not yet met the mess. A good engineer should keep a private inventory of ignorance and update it like a sacred ledger.
The non-obvious architectural insight is that societies, like software systems, encode their failure modes into their interfaces. If every human interface punishes uncertainty, then every internal process will learn to conceal uncertainty. The school interface conceals it through marks. The job interface conceals it through jargon. The corporate interface conceals it through status reports. The AI interface conceals it through fluent generation. By the time the truth reaches decision-makers, it has passed through so many beautifying filters that ignorance looks like progress.
That is why representation failures are so often mislabeled as quality failures. The data is not “bad” in isolation. The answer is not merely “wrong.” The résumé is not merely “inflated.” The real failure is representational: the symbol no longer corresponds honestly to the underlying capability. A grade does not represent understanding. A degree does not represent readiness. A dashboard does not represent reality. A generated answer does not represent cognition. A confident employee does not represent competence. The label and the thing have drifted apart, like two relatives after a property dispute.
Once that drift becomes normal, every system built on top of it becomes fragile.
There is no clean solution because the incentives are dirty. People pretend because pretense works often enough. Institutions tolerate it because exposure is expensive. AI will be used to deepen it because fluency is cheap and thought is costly. The correction will require slower hiring, better teaching, harsher evaluation of actual work, kinder treatment of honest confusion, and a cultural demotion of swagger. None of this will trend easily. It does not photograph well. It will not fit into a two-day workshop with samosas.
But it is necessary.
The first revolution is almost embarrassingly small. Say “I don’t know” earlier. Say “I don’t understand” without making it a confession of moral failure. Ask the second question. Ask the stupid question, especially when it is not stupid at all but merely socially inconvenient. Make people explain. Make them repair. Make them transfer knowledge from one context to another. Make them show the working, not only the answer. Reward the person who finds the crack before the bridge collapses.
Technical progress does not come from a society pretending to be brilliant. It comes from a society becoming less afraid of its own ignorance. Until that happens, we will keep producing confident fog: students who can pass, employees who can posture, consultants who can pronounce, and AI-polished minds walking around with borrowed certainty under the veil. The future will not be kind to that arrangement. Reality is not impressed by fluency. It waits patiently, like an old examiner with bad eyesight and excellent memory, and then asks the only question that matters: do you actually understand what you just said?