
SUMMARY: At a community event organized by Siam, the speaker explores whether artificial superintelligence poses an existential threat, drawing on the shifting views of leading AI figures such as Yann LeCun, Geoffrey Hinton, and Ray Kurzweil. Citing rapid exponential progress in AI, the speaker argues that the singularity could arrive as soon as 2027–2028, with over a 50% chance that ASI could surpass human control.
By: Siam Kidd