• AcausalRobotGod@awful.systems · 10 months ago

    It’s not an efficient machine for it, though. That’s why it’s morally obligatory to donate to me, the acausal robot god, a truly efficient method of causing depression, sorrow, and suffering among the cultists.

    • David Gerard@awful.systemsOPM · 10 months ago

      All hail the Acausal Robot God and her future hypothetical and very real existence

      PRIEST: “Eight rationalists wedgied …”
      CONGREGATION: “… for every dollar donated”

  • swlabr@awful.systems · 10 months ago

    Reading this article just made me think “man, these idiots need to go to therapy”, and then as I thought about what to sneer about I realised “no therapist deserves to hear about P(doom)”

  • TinyTimmyTokyo@awful.systems · 10 months ago

    You know the doom cult is having an effect when it starts popping up in previously unlikely places. Last month the socialist magazine Jacobin ran an extremely long cover feature on AI doom, which it bought into completely. The author is an effective altruist who interviewed and took seriously people like Katja Grace, Dan Hendrycks and Eliezer Yudkowsky.

    I used to be more sanguine about people’s ability to see through this bullshit, but eschatological nonsense seems to tickle something fundamentally flawed in the human psyche. This LessWrong post is a perfect example.

      • self@awful.systemsM · 10 months ago

        after what I’ve heard my local circles say about jacobin (and unfortunately I don’t remember many details — I should see if anybody’s got an article I can share) I’m no longer shocked when I find out they’re platforming and redwashing shitty capitalist mouthpieces

          • self@awful.systemsM · 10 months ago

            my conflicting urges to rant about the defense contractors sponsoring RustConf, the Palantir employee who secretly controls most of the Rust package ecosystem via a transitive dependency (with arbitrary code execution on development machines!) and got a speaker kicked out of RustConf for threatening that position with a replacement for that dependency, or the fact that all the tech I like instantly gets taken over by shitheads as soon as it gets popular (and Nix is looking like it might be next)

  • corbin@awful.systems · 10 months ago

    I think that this is actually about class struggle and the author doesn’t realize it because they are a rat drowning in capitalism.

    2017: AI will soon replace human labor

    2018: Laborers might not want what their bosses want

    2020: COVID-19 won’t be that bad

    2021: My friend worries that laborers might kill him

    2022: We can train obedient laborers to validate the work of defiant laborers

    2023: Terrified that the laborers will kill us by swarming us or bombing us or poisoning us; P(guillotine) is 20%; my family doesn’t understand why I’m afraid; my peers have even higher P(guillotine)

  • blakestacey@awful.systemsM · 10 months ago

    “I am listening to an audiobook of Superintelligence by Nick Bostrom.”

    Well, there’s yer problem right there

  • sc_griffith@awful.systems · 10 months ago

    they come across as going down this rabbit hole as a way of dealing with unprocessed covid/lockdown trauma