Ethan Mollick @emollick · Mar 15, 2023
“Alignment” is often thought of as a future concern: preventing ultra-smart AIs from taking over the world. But everyone really should read the white paper from OpenAI (or check out this thread). I bet there is at least one thing unaligned AIs can do right now that will worry you.