
SUMMARY: At a community event organized by Siam, the speaker explores whether artificial superintelligence poses an existential threat, drawing on shifting views from leading AI pioneers like Yann LeCun, Geoffrey Hinton, and Ray Kurzweil. Citing rapid exponential progress in AI, they argue the singularity could arrive as soon as 2027–2028, with over a 50% chance that ASI could surpass human control.
By: Siam Kidd
