Eliezer Yudkowsky is a founding researcher of the field of AI alignment and played a major role in shaping the public conversation about smarter-than-human AI. He appeared on Time magazine's 2023 list of the 100 Most Influential People In AI, was one of the twelve public figures featured in The New York Times's "Who's Who Behind the Dawn of the Modern Artificial Intelligence Movement," and has been discussed or interviewed in The New Yorker, Newsweek, Forbes, Wired, Bloomberg, The Atlantic, The Economist, The Washington Post, and many other venues.
Having now read EVERY post by Eliezer Yudkowsky on Less Wrong, I can say I have thoroughly enjoyed it and benefitted immensely. The Sequences are a smaller selection conveying many of the important ideas, but by reading all the posts you get a stronger sense of direction and see where all of the philosophy leads.
The book is very informative and I really liked some parts, but it was a very difficult read. Why? Because Yudkowsky simply LOVES beating a dead horse. There's just too much repetition and sloppy writing, even though the ideas are great.
It's just... It ought to be more concise and clear.
This is about how to overcome bias, which is really important for making good decisions, so I recommend it to most people. And it's well written, so it's fun to read! The first sequence was about Bayesian probability, which is something I already knew about since I study engineering physics, but I still learned new things. And in the other sequences, I'm definitely learning a lot.
This one was really hard to rate, as my reactions to each post and sequence varied from "jeez, this just nailed it" to "meh, why are you explaining the same thing over and over again". I admit to only skimming many of the metaethics sequence posts, and I might have given four stars if that one were left out. My rating doesn't reflect how much I appreciate the writer doing and writing what he does - it IS important to try to explain things in various ways, even for people with creationist views (at whom many of the posts seemed to be directed). Still, I'd rather read a purpose-written - and edited - book about these topics than a sequence of blog posts with a lot of unnecessary rambling. That book might well get 5 stars and a favourite.
The Sequences are blog posts by Eliezer Yudkowsky written from 2006 to 2009. Subjects include rationality, cognitive biases, some psychology, evolutionary psychology, quantum physics, morality, and more. The second and third books were my favorite.
It was gifted to me by my friend Neil, who thought that rationalism was a school of thought I would subscribe to. It contains a series of essays by Eliezer Yudkowsky and fellow rationalist thought leaders.
Some of the essays are brilliant and expanded my horizons. For example, in "What Do We Mean by 'Rationality'?", Eliezer lays out the analogy of false beliefs as a map of the world that doesn't correspond to the territory, following up with the expected prescription to use rationality to cure our false beliefs and build accurate maps.
Other chapters were less exciting and focused on more technical topics. To be honest, I skipped quite a few of those.
What I disliked about the book was that the essays were disjointed and did not follow a structured path. It would have been a great piece if the author had cared to clean it up and write a coherent narrative, making it easier for the layman to tag along.
Am I convinced by the rationalists? While I endorse their effort to identify flaws in our thinking, I miss a fundamental "metaphysical" justification for their brand of rationality, in particular when it comes to dealing with incomplete information, or probabilities.
Kinda helped me achieve what I value in life in more ways than I can count. Not because they're uncountable, mind you, but because I'm too lazy to count them.