New diet villain just dropped. Believe or disbelieve this specific one, "fat" or even "polyunsaturated fat" increasingly looks like a failure as a natural category. Only finer-grained concepts like "linoleic acid" are useful for carving reality at the joints.
A lot of rationalism is just an intense fear of death. Simulation hypothesis? Means that maybe you can live forever if you’re lucky. Superintelligence? Means that your robot god might grant you immortality someday.
Cryogenics? Means that there’s some microscopic chance that even if you pass away you could be revived in the future at some point. Longtermism? Nothing besides maybe someday possibly making me immortal could possibly matter.
I mean don’t get me wrong I’d give a lot for immortality, but I try to uhh… stay grounded in reality.
@sailor_sega_saturn @TinyTimmyTokyo My BIL seems to believe that if we invent immortality it will immediately usher in a utopia where everyone lives forever. I said no way: even if it turns out the secret is a pill that costs ME 5c to manufacture, it’s going to cost YOU everything you earn forever to BUY. True immortality will simply amplify existing inequality a billion percent.
@sailor_sega_saturn @TinyTimmyTokyo @nyrath I like to call LessWrong “modern-day Scientology” and that moniker seems more and more appropriate with each passing month.
@sailor_sega_saturn @TinyTimmyTokyo @nyrath For someone claiming to be rational (meaning putting reality above superstition), he [Yudkowsky] really did create what is essentially a proto-religion. Hence, Scientology.
I am literally a devout (if reformist) Christian and I’m less superstitious than that clown shoe and his zombies.
@sailor_sega_saturn @TinyTimmyTokyo Been thinking and saying this for a while. These powerful billionaire types are terrified of death because it’s so egalitarian: nobody escapes it. No matter how much money and power they accumulate, they can’t get control over this one thing, and it drives them up the wall.
The anti-TESCREAL conspiracy argues that even relatively cautious people like Bostrom talking about the risks of superintelligence is reactionary since they distract us from algorithmic bias and the electricity use of server farms. While we agree that techno-libertarians tend to be more interested in millennialist and apocalyptic predictions than responding to the problems being created by artificial intelligence today, we also believe that it is legitimate and important to discuss potential catastrophic risks and their mitigation. The anti-TESCREALists dismiss all discussion of AGI, ranging from “believers” to “doomers.”
@self the tldr is that lumping everything TESCREAL together into “assholes are into this, therefore it is bad” means that a lot of worthwhile and important ideas, many of which were developed by left thinkers, get lost.
(that said, “anti-TESCREAL conspiracy” is I think itself an unfortunate compression)
I tried to read this over breakfast, which consisted of a very mellow bowl of jungle oats (no extra flavour) and some semi-terrible filter coffee. and I gotta tell ya, both of those fairly mellow things were better than the entire first quarter of this post
the author seems to be trying to whiteknight some general idea of maybe some progress isn’t bad and “well obviously there will be some bad associations too”, while willfully excluding the direct and overt bad actions of those associated bad actors?
admittedly I only got a quarter of the post in (since my oats ran out - scandalous), but up until that point I hadn’t really found anything worthwhile beyond the squirrelly abdication bullshit
@froztbyte maybe my breakfast (untoasted muesli, coconut yoghurt) started me in a different frame of mind. I read it as showing that a lot of these ideas, which, yes, some jerks (but also plenty of non-jerks) are into, have deeper left histories, and deserve serious consideration.
The only people mentioned who are not the usual rogues’ gallery (MuskThielSBF) are Marx/Engels, J.B.S. Haldane, John Desmond Bernal (who??) and this fucking guy:
Max More was one of the libertarian thinkers (non-billionaire) who helped shape modern transhumanism.
Oh he’s not a billionaire, obviously he is Of The Left.
(I quickly googled this dude of whom I have never heard and didn’t find any obvious techfash red flags, but maybe he’s better at hiding them than most others)
Anyway, extropianism!
like all arguments from first principles, the Extropians encountered problems when trying to extrapolate derivative principles, like political economy. While the Extropian ideas went in an anti-state direction, their logic leads just as naturally to the Enlightenment Left’s conclusion that humanity should take our collective future in hand through democratic deliberation or the guidance of “scientific socialism,”
“OK so right now it’s basically fascist feudalism, but it could be socialism”, got it.
More weird framings
But some effective altruists, most famously the crypto scammer and donor to the Democratic Party Sam Bankman-Fried,
Outside the “not all EAs!” crowd I haven’t seen this before, but the authors are “democratic socialists” which basically means they hate the Democrats more than the GOP.
I can kinda agree on their take on Cosmism, which AFAIK is really fringe (I mean, I have heard of Fyodorov, but I have read a lot of SF), but even here they can’t really refrain from oohing over the “weird and wonderful” Russian cosmists, while perfunctorily noting that they’re all fascists now.
Russian Cosmists also prefigured a version of eco-philosophy, emphasizing the unity of all living beings and the interconnectedness of the universe. Cosmists believed that all forms of life, including animals and plants, were part of a universal whole. They advocated for the ethical treatment of all living creatures and the preservation of biodiversity.
The Izborsky Club explicitly condemns the technocratic “transhumanism” of Western thought, including individualism, rationalism, democracy, capitalism and transgender rights, as contrary to their “technocratic traditionalist” Cosmism. The Izborsky Club reflects the swirl of NazBol ideas in contemporary Russia, attempting to merge Russian Orthodoxy, Bolshevik authoritarianism and fascist “Eurasian” racial-nationalism. […] In other words actual organized Russian Cosmists today despise TESCREAL ideas and their Western proponents.
Both Musk and Thiel hate trans people, but trans treatment is essentially transhumanism; how can we square this circle? It is a mystery.
the more I read, the more I get the sinking suspicion that the authors are cherry-flavored fascists who are particularly bad at smuggling their ideas under a thin guise of leftist thought
no, they are at best the colonial liberal strain of technoprogressive, and only “left” of the out and out techfash. the technoprogressive transhumanist offering is that in the future, everyone will be a middle class white man!
i mean, at least they thought the idea was to bring the rest of humanity along with them. but they still share in the same selection of poison pills. including “positive” eugenics, for example.
@froztbyte kinda, but with a different emphasis. The author talks about specific ideas and their origins, and asks that we try to build a positive left futurism, and not cede the field to a subset of 2020s Silicon Valley interpretations of those ideas. If eg transhumanism was interesting and worth exploring before Peter Thiel turned up, it can still be so afterwards.
but transhumanism wasn’t interesting before Thiel showed up. it started as an Italian proto-fascist movement and to this day it hasn’t shaken its association with fascism and white supremacy
if there’s any deeper leftism in the post you linked, you’d best quote it — cause all I’m seeing from my skim through is dollar store Marx and literally a paragraph of poorly-cited Eco used to somehow justify the idea that opposition to TESCREAL ideas is due to a conspiratorial mindset and membership in a cult. I’m seeing a bunch of shit flung at folks like Timnit who’ve put more apparent thought into TESCREAL than anything I’m seeing in that post
so the wild bit here is that Hughes previously ranted over transhumanism’s hard-right turn as something Thiel personally did in the late 2000s as he tried to buy his way onto the IEET board
dude, TESCREAL is talking about precisely those guys
@sailor_sega_saturn @TinyTimmyTokyo Eh, no guarantee (or any reason to believe really) a simulation would be even focused in any way on humanity (no anthropocentrism needed).
Similarly for superintelligence, few reasons for it to care.
Cryogenics is a better bet and as you say it’s quite unlikely unfortunately.
@sailor_sega_saturn @cstross I’m not sure that those things are “Rationalism”. Rationalism is about reason above religious belief, not substituting one god with another…
don’t worry, folk like Yudkowsky have already taken care of substituting one god with another under rationalism’s name in an attempt to grift atheists who miss the comfort of an afterlife to look forward to
you can thank him by donating a tithe of all your money to effective altruism or whichever “AI alignment” org will save you from the Basilisk these days
@self I had no idea who that was until I looked him up. He talks about something called “Rationality” (which seems to be anything but). I’m talking about Rationalism, which is a different thing: https://en.wikipedia.org/wiki/Rationalism?wprov=sfti1
the brain, mostly notable for being the only organ capable of generating lies. how curious that anyone would recommend a website that identifies with the liemaker organ
e: please read this post in the most ben shapiro voice your liemaker can muster
That reminds me. If the world is about to FOOM into a kill-all-humans doomscape, why is he wasting time worrying about seed oils?
@Salty @sailor_sega_saturn @TinyTimmyTokyo Right up until the secret gets liberated and one gets buried in concrete for trying to hide it, anyway.
No guarantees of any utopia anyway though.
@sailor_sega_saturn @TinyTimmyTokyo Kinda hilarious those beliefs are called “rationalism”. They’re speculation & pseudoscience at best.
@sailor_sega_saturn Spelt “TESCREAL”, pronounced “existential angst”…
@apocraphilia @sailor_sega_saturn a worthwhile read: https://medium.com/institute-for-ethics-and-emerging-technologies/conspiracy-theories-left-futurism-and-the-attack-on-tescreal-456972fe02aa
none of this looks worthwhile to me
Max More is a full on ancap libertarian, though in the context of technoprogressivism
Going by the flow of nominative determinism, this is one remarkable and poignant name in that case
you are aware that you just repeated the exact pattern that I pointed out the author did?
so show me the good part
no, at best it’s the more benign and fluffy end of the Californian Ideology, it’s still extremely much the same thing
further reading as you fall down the Yudkowsky rabbit hole I guess (and the complementary RationalWiki article, written by folks who probably deserve the term rationalist to a greater degree than any of LessWrong or Yudkowsky’s output)
how can you trust rationalwiki it’s the sorta organisation that uses a brain as a logo
i expect it was a brain that did that