Some light sneerclub content in these dark times.

Eliezer compliments Musk on the creation of Community Notes, a project that predates Musk's takeover of Twitter by a couple of years (see the account's join date: https://twitter.com/CommunityNotes).

In response, Musk admits he has never read HPMOR and suggests a watered-down Turing test involving it.

In reaction, Eliezer invents HPMOR wireheads.

  • Sailor Sega Saturn@awful.systems · 1 year ago

    I’m reminded of a My Little Pony singularity fan-fiction (Friendship is Optimal) that I read back when I had poor taste. An AI for a pony MMORPG goes rogue and converts everyone into digital ponies to maximize happiness but with a pony theme. The victims live out impossibly long, but ultimately superficial, lives doing pony stuff and goodness gracious why is there such a weird relationship between rationalists and fanfiction writers.

    • swlabr@awful.systems · 1 year ago

      Most charitable psychoanalysis: projecting their sense of rationality onto a fictional world is a way to express a deep longing for rules and logic in an often cruelly irrational world.

      Least charitable: their sense of rationality can only be true in a fictional world, so they want to live in that rather than reality.

      Neutral charity: the author is dead, all interpretation is essentially fanfiction, and since we are all individuals, all relationships with texts/fanfiction are weird.

      • blakestacey@awful.systems · 1 year ago

        “I dig a pony … Well, you can penetrate any place you go / Yes, you can penetrate any place you go / I told you so”

      • swlabr@awful.systems · 1 year ago

        Pleasure Island, from Pinocchio. You gotta ask for the pony pass though, or else you’re just gonna get turned into a donkey. To reverse the transformation you gotta go to the island of Dr. Moreau.

    • corbin@awful.systems · 1 year ago

      It’s the combination of big imaginations and little real-world experience. In Friendship is Optimal, the AGI goes from asking for more CPUs to asking for information on how to manufacture its own CPUs, somehow without ever acquiring silicon crystals or ASML hardware along the way. Rationalist writers imagine that AGI will somehow provide its own bounty of input resources, rather than participating in the existing resource economy.

      In reality, none of our robots have demonstrated the sheer instrumentality required to even approach this sort of doomsday scenario. I think rationalists have a bit of the capitalist blind spot here, imagining that everything and everybody (and everypony!) is a resource.