AGI rising: why we are in a new era of acute risk and increasing public awareness, and what to do now - EA Forum

  • Article
  • May 3, 2023
  • #ArtificialIntelligence
Greg_Colbourn
@ColburnGregg
(Author)
forum.effectivealtruism.org

We are in a new era of acute risk from AGI
Artificial General Intelligence (AGI) is now in its ascendancy. GPT-4 is already ~human-level at language and showing sparks of AGI. Large multimodal models – a single AI handling text, images, audio, video, VR/games, and robotic manipulation – will arrive very soon (from Google DeepMind[3]) and will be ~human-level at many things: physical as well as mental tasks, blue-collar jobs in addition to white-collar jobs. It’s looking highly likely that the current paradigm of AI architecture (foundation models) simply scales all the way to AGI. These things are “General Cognition Engines”.

All that is stopping them from being even more powerful is spending on compute. Google and Microsoft are each worth $1-2T, and $10B can buy ~100x the compute used for GPT-4. Think about what this means: we are already well into hardware overhang territory.
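For a rough sense of where that ~100x comes from, here is a minimal back-of-the-envelope sketch; the ~$100M cost of GPT-4’s training run is an assumed outside estimate, not a figure given in the post.

```python
# Back-of-the-envelope check on the ~100x compute claim.
# Assumption (not from the post): GPT-4's training run cost
# on the order of ~$100M, in line with public estimates.

gpt4_training_cost_usd = 100e6   # assumed ~$100M training cost
big_tech_budget_usd = 10e9       # the post's $10B figure

compute_multiplier = big_tech_budget_usd / gpt4_training_cost_usd
print(f"~{compute_multiplier:.0f}x the compute used for GPT-4")  # -> ~100x
```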

Here is a warning written two months ago by people working at the applied AI alignment lab Conjecture: “we are now in the end-game for AGI, and we (humans) are losing”. Things have only got worse since. It’s looking like GPT-4 will be used to meaningfully speed up AI research, finding more efficient architectures and thereby reducing the cost of training more sophisticated models.

Mentions
Liron Shapira @liron · May 2, 2023
  • Post
  • From Twitter
Great overview of our urgent situation. Worth a read.