Fifty years ago, an MIT professor created a chatbot that simulated a psychotherapist.
Named Eliza, it managed to trick some people into believing it was human. But it did not understand what it was told, nor could it learn on its own. The only test it had to pass was: Could it fool humans?