Half-truths and self-righteousness
07 Sep 2023
Homily by Fr Johnny Go SJ, Dean of the Gokongwei Brothers School of Education and Learning Design, during the Higher Education cluster's Mass of the Holy Spirit at the Church of the Gesu on Thursday, 7 September 2023.
In our short Gospel reading today, in the very same breath that our Lord promises to send us the “Spirit of Truth,” He issues a warning about this world: that the world will resist the Spirit of Truth because it neither sees nor recognizes this Spirit.
So how are we to respond to the Lord? If His warning were sent to us today as a text message, my guess is that we here at the Ateneo would most probably reply with “Noted,” maybe with a “thumbs up” or a “folded hands” emoji.
But sure, why not? After all, we are an academic institution, and truth is precisely our business. Isn’t its pursuit the very mission of our university or any university? To discover the truth, to learn it, and to impart it? We are already all about the truth.
And even more significantly, not only are we a university, but we are a Jesuit university: Our community is supposed to be schooled in Ignatian Discernment. We even have a unique three-unit core curriculum course for our seniors called “Discerning Life’s Questions.”
For what does discerning mean anyway if not precisely seeing the Spirit, recognizing the Spirit, and accepting the Spirit–the very things our Lord is saying that the world will not do?
So there’s a temptation for us Ateneans to regard this warning as addressed to someone else, directed somewhere else. Maybe La Salle!
But wait, not so fast. Before we discard this warning from the Lord as unnecessary or inapplicable to us, maybe it will help if we pause and reflect in true Ignatian fashion, and just for a moment, put aside this thing I call our Ateneo Blue Ego, and just honestly and humbly examine ourselves. Contrary to our initial reaction, this warning from our Lord may just turn out to be more relevant today than ever, even and especially for us here in a Jesuit university.
It’s no exaggeration to say that in today’s world, truth and discernment are both endangered species. All we need to do is to go online and check out what’s going on in our favorite social media platforms.
Everyone is talking about Generative AI and apps like ChatGPT these days. So we may as well talk about this much-discussed phenomenon called “AI hallucination.”
AI hallucination refers to the behavior of a large language model like GPT-4 when it makes up false information or facts that are not based on real data or events. In short, it hallucinates.
The reason it does this is that, contrary to most people’s understanding, Generative AI is not just a smarter search engine, which is what I actually thought. Generative AI does not scour the Internet for actual data the way a search engine does. Trained on trillions of patterns of language, all Generative AI knows is which words tend to go together, and it produces its output based on these associations. It does not understand what the words actually mean and therefore cannot distinguish fact from fiction. In short, what Generative AI mimics is human language, not human thought. Hence, the so-called hallucinations.
Our very own Dr. Didith Rodrigo of the Department of Information Systems and Computer Science was caught by surprise one day when a student she had never seen before approached her to request a copy of the full text of a paper that she had allegedly written. According to Dr. Rodrigo, the title of the paper sounded very much like a paper she might have written, but unless she herself was hallucinating, she was a hundred percent sure that she had never authored this paper. And what was the student’s source of information? ChatGPT, of course. It was a classic case of AI hallucination.
The problem with AI hallucination is that it produces half-truths. That’s why they’re so hard to detect. Half-truths resemble the truth because they contain legitimate elements of it, but in mingling truth with falsehood, they end up distorting the truth.
Now, at the risk of sounding anthropocentric, I think with AI hallucination, we have a case of technology imitating life. After all, we humans do something similar all the time. Philosophers have a name for what we can say is the human counterpart of AI hallucination; they call it the “epistemic fallacy.”
We commit the epistemic fallacy when we reduce all of reality to our knowledge of it, when we assume that what we’ve experienced, what we’ve been trained in, and what we know so far corresponds completely and perfectly with reality. In lay people’s terms, we put reality in a box. We think we know something or someone completely when in fact, there’s a lot more we don’t know and a lot more that we still need to learn. In short, we fall prey to half-truths and are convinced that they are the truth and nothing but the truth.
We are guilty of epistemic fallacy much more frequently than we suspect. Many relationships actually fail because of epistemic fallacy: We think we already know our friends, colleagues, or spouse, so we put them in a box and we insist that they fit in it when they are really much more than that. So here’s an unsolicited piece of relationship advice from me: Next time other people pick a fight with you and demand that you fit into their box, simply make the declaration that they are guilty of epistemic fallacy, and I bet they’ll walk away in defeat and bewilderment.
The epistemic fallacy is the human penchant to stretch and amputate reality so that it fits our knowledge of it–instead of the other way around. It’s our version of AI hallucination because it causes us to fall for half-truths or to come up with them ourselves.
Now half-truths can be quite insidious. They are particularly deceptive precisely because they sound so true, but are not. As someone quipped, a half-truth is actually a whole lie.
Social media is littered with half-truths. And what’s worse is that these half-truths tend to elicit strong emotional reactions. Unfortunately, studies show that negative content is more likely to spread than positive content: Online negativity is approximately 15% more prevalent than positivity, and negative tweets tend to engage significantly more users.
So just as fake news has been shown to travel faster than true stories, negative feelings go viral much more quickly than positive ones. And the emotions that are retweeted fastest and most often are anger, indignation, and especially righteous outrage.
In general, righteous anger should be welcome because we need it–to fuel our passion to fight injustice or to correct a wrong. But if we’re not careful and discerning, we may get carried away, and while we’re not looking, our righteous anger may easily morph into self-righteous anger. It’s a pretty slippery slope from one to the other.
Now, half-truths and self-righteous anger: They sound suspiciously like the very symptoms of what St. Ignatius of Loyola in his Rules for Discernment calls “spiritual desolation.” They are telltale signs of the evil spirit hard at work to lead us away from God, and in this case, to prevent us from recognizing and accepting the Spirit of Truth.
For St. Ignatius, spiritual desolation is about being led to somewhere other than God: It tends to diminish us, to divide us, and to make us less generous, less charitable, and less humble.
St. Ignatius also warns against the evil spirit’s tactic of showing up disguised as an angel of light, with the intention to deceive us with half-truths and to give us the impression that we are doing something good only to realize later that we have ended up doing something else. The evil spirit goads us so that we go overboard, and before we know it, our righteous anger has been disfigured into self-righteousness.
If we are not careful, we may constantly put ourselves at risk of being infected with this online desolation, exposing ourselves to all the viral half-truths and self-righteous rage. Moreover, if we are not discerning, not only can we fall victim to online desolation, but we can also participate in it and become agents of social media desolation, spreading it and infecting many others.
Now, the trouble with online desolation, with its half-truths and self-righteous anger, is that it can also happen in real life–or as people say these days, IRL: in person, in our families, in our social circles; even in our classrooms and our academic community. We know that IRL we can easily fall for half-truths. IRL they can lead to premature or even unwarranted righteous anger. IRL our righteous anger can quickly mutate into self-righteousness. And IRL we can become victims of desolation, individual or communal.
So come to think of it, maybe the Lord’s warning about the world’s inability to accept the Spirit of Truth is, after all, a legitimate warning issued to us as well. And so, at this Mass of the Holy Spirit, let us ask ourselves whether we are susceptible to half-truths and self-righteousness, as well as to spiritual desolation.
Let us pray that this academic year will be full of new learning and friendships. Let us also invoke the guidance of the Holy Spirit so that in our families, among our friends, and in our community, we will not be willing victims of desolation–or worse, its unwitting agents.