💬 Vitalik Buterin warns about the risks of advanced AI, which he says could overtake humans as a "superior species."
"One of the ways AI could make the world worse is the worst possible way: it could literally cause the extinction of humanity."
Vitalik cites a 2022 survey of more than 4,270 researchers, which estimated the probability that AI causes human extinction at 5–10%.
"Even Mars may not be safe if superintelligent AI turns against humanity."
For Vitalik, AI is "fundamentally different" from other recent inventions because it can create a new kind of "mind" that may turn against human interests.
"AI is a new type of mind that is rapidly gaining intelligence, and it stands a serious chance of surpassing humans' mental faculties and becoming the new apex species on the planet."
Vitalik also called for "active human intention" to steer AI in a direction beneficial to humanity.