A Bayesian superintelligence, hooked up to a webcam, would generate the world’s most beautiful camgirl, like a photoshop that’s been photoshopped, and take over OnlyFans to raise money to wedgie rationalists, just don’t look too closely at the fingers or teeth
I’m trying to find the twitter post where someone deepfakes eliezer’s voice into saying full speed ahead on AI development, we need embodied catgirls pronto.
Alas, if Yud took an actual physics class, he wouldn’t be able to use it as the poorly defined magic system for his OC doughnut-steal IRL bayesian superintelligence fanfic.
It’s because of stuff like this and people like you on the Fediverse that leaving Reddit for the final time on that night of June 11… two and a half months already… hasn’t hurt as much as I feared back then.
“Eliezer has sometimes made statements that are much stronger than necessary for his larger point, and those statements turn out to be false upon close examination” is something I already generically believe, e.g. see here.
I get the impression that this guy (whose job at an AGI thinkpiece institute founded by a cryptobillionaire depends on believing this) would say this about ALL of EY’s statements, leaving his larger point floating in the air, “supported” by whatever EY statements you aren’t currently looking at.
this is fantastic! if you’ve ever got another one of these in you, feel free to tag it NSFW and post it here or on MoreWrite depending on what feels right. I live to see yud get destroyed in slow motion by real expertise
I’ve more than once been tempted to write Everything the Sequences Get Wrong about Quantum Mechanics, but the challenge is doing so in a way that doesn’t just amount to teaching a whole course in quantum mechanics. The short-short version is that it’s lazy, superficial takes on top of cult shit — Yud trying to convince the reader that the physics profession is broken and his way is superior.
I’d be happy to contribute what CS material I can to a multidisciplinary effort to prove that Yud’s lazy, superficial takes and cult shit are universal
Yeah, I’ve been writing up critiques for a year or two now, collected over at my substack. I’ve been posting them to the EA forum and even Lesswrong itself and they’ve been generally well received.
Interesting read, thank you for sharing! You nicely put into words what I thought as well: the amount of information you can deduce from three frames of a falling apple is far too limited to do what they describe.
As a physicist, this quote got me so mad I wrote an excessively detailed debunking a while back. It’s staggeringly wrong.
Kudos for the effortpost. My 5-second simpleton objection went something like
now that’s a positive contribution to the space
that was shockingly polite
Love this!
I got as far as this blog post that I shared in the first days of new!SneerClub, but that was only a first stab.
Relevant Asimov (PDF)