London: Geoffrey Hinton, the godfather of AI, said that people who are confident AI will do no harm are crazy.
He favours uncertainty, arguing there is still hope that AI's potential is overstated. The computer scientist thinks that technologies such as ChatGPT have already consumed all the documents available on the web, but that they won't be able to go much further until they gain access to private data.
The 75-year-old recently quit Google to speak freely about the dangers of AI.
"We are in a time of great uncertainty. It might well be that it would be best not to talk about the existential risks at all so as not to distract from these other things [such as issues of AI ethics and justice]. But then, what if because we didn’t talk about it, it happens? Simply focusing on the short-term use of AI, to solve the ethical and justice issues present in the technology today, won’t necessarily improve humanity’s chances of survival at large," he said.
Speaking to The Guardian, Hinton recalled his roots in psychology and said he had not set out to develop cutting-edge technology through his work on neural networks. His focus was on understanding how the human brain works; that research later evolved into an approach to building computer systems that can learn from data and experience.
According to Hinton, his work on neural nets remained a curiosity until recently because it demanded vast amounts of computing power. Over the last decade, processing power and datasets grew dramatically, which put his work at the centre of the field.
Comparing the human brain with artificial intelligence, he said that each has advantages and disadvantages. Biological intelligence runs on very low power - just 30 watts when someone is thinking - but it is inefficient at transferring information. Digital intelligences are the opposite: they make sharing information very easy but consume a great deal of energy. "When one of them learns something, all of them know it, and you can easily store more copies."
In Hinton's words, "We’ve discovered the secret of immortality. The bad news is, it’s not for us."