Contradictheory: Not all of us are Ferrari drivers


In January 2026, the Malaysian government blocked access to Grok, the AI tool on social media platform X. Grok, it turns out, can create pornographic versions of people from ordinary pictures. Some, however, offer a counterargument to the ban: 'AI doesn't create porn, users do'. — Filepic/AFP

Earlier this week, a man in Subang Jaya, Selangor, went viral on social media when he got into his Ferrari, chased another driver down the road, and in the words of the police, was “tailgating the complainant, stopping his vehicle abruptly and obstructing the path, which could potentially endanger road users’ safety”.

To me, that’s just a lot of words to say “he was a samseng on the road”. And while investigations are still ongoing, the hope is that if you drive irresponsibly, your driving licence will be revoked.

But what if we could prevent potentially dangerous drivers from driving in the first place? What if, instead of merely asking people to reverse-park properly and stop at intersections, we decided that cars are inherently dangerous and should only be used by those who can truly prove they’re competent?

This, it seems, is also how the government is starting to think about artificial intelligence (AI) tools. Earlier this month, the Malaysian Communications and Multimedia Commission (MCMC) blocked access to Grok, an AI tool on social media platform X that helps users answer questions, solve problems, and brainstorm ideas.

It turns out Grok can also help users create pornographic images by, say, manipulating real images to put women in bikinis. Specifically, the MCMC says it has identified misuse of Grok to “generate and disseminate harmful content, including obscene, sexually explicit, indecent, grossly offensive, and non-consensual manipulated images”.

Reading between the lines of various reports, it seems that X responded by saying that users can submit complaints if they find unsavoury content on the platform. But this was not enough for the MCMC, which has now decided that Malaysians should be denied access to Grok while the government takes legal action against X Corp over its failure to ensure user safety.

Now, if you think this is like telling people they can’t drive at all because cars are inherently dangerous, then you might argue that the government’s ban is wrong. After all, while AI tools can be used to create “bad” content, many more people use them for genuinely useful purposes, like fact-checking or generating a cartoon version of your cat.

Or, to paraphrase a familiar quote about guns, AI doesn’t create porn, people create porn. Which is a little ironic, because the opposing argument is precisely that using Grok is like using a gun, because AI is inherently dangerous and therefore should be tightly controlled.

Many leading AI experts do, in fact, argue that we should be extremely cautious about how AI is deployed, given how casually it ignores privacy and copyright standards, and how it can amplify biases that reflect and reinforce dangerous human ones. There is even an ongoing lawsuit alleging that AI chatbot ChatGPT contributed to a man taking his own life after an interaction that “normalised” suicide.

That said, given that you are currently reading an article that will almost certainly be touched by AI tools somewhere along the line, I believe AI should be treated as a tool, not a gun. While we would all benefit from understanding its strengths and shortcomings, I don’t think it should be inherently restricted yet.

(In fact, I suspect the reason many people will eventually stop using AI is that it will become prohibitively expensive, turning it into yet another powerful tool only the wealthy can afford. But that’s a discussion for another day.)

Interestingly, just this week, lawmakers in the United Kingdom moved to criminalise the creation of non-consensual intimate deepfake images, and perhaps even to make it illegal to supply the tools used to create them.

Handicapping technology to prevent illegal use is not without precedent. The most notable example I can think of is how modern scanners and printers won’t copy or print banknotes, because they can detect security features in the notes. However, preventing publication of explicit images is much more difficult, and I suspect the focus will eventually shift towards punishing the bad workman rather than banning his tools.

That, of course, creates a new problem: How do we identify who is responsible for creating and sharing what?

This is directly related to the Malaysian government’s push to regulate who can use social media, including proposals requiring platforms to verify users using government-issued IDs. Ostensibly, this is about age verification. But it’s only a short legislative hop from that to saying such systems can also be used to identify who did what on X, which will make prosecution easier. But only when necessary, of course...

Please note this is not quite the same as requiring drivers to hold licences before they’re allowed on the road. The true equivalent would be installing personal trackers in every vehicle, identifying at all times who is driving, when they are driving, and where they are going.

In fact, the real gap in Malaysia’s digital laws right now is that the Personal Data Protection Act (PDPA) does not apply to the government. And even where it does apply, enforcement appears weak. To date, it’s not even clear whether anyone has been successfully prosecuted under the PDPA for leaking personal data.

Yes, it’s right that we can punish the Ferrari driver and, if necessary, take away his ability to drive. But thankfully, that doesn’t mean we should all suddenly live under the constant watchful eye of the state, monitoring our every outing and action – all in the name of safety and moral wellbeing, of course.

In his fortnightly column Contradictheory, mathematician-turned-scriptwriter Dzof Azmi explores the theory that logic is the antithesis of emotion but people need both to make sense of life’s vagaries and contradictions. Write to Dzof at lifestyle@thestar.com.my. The views expressed here are entirely the writer's own.