• earthquake@lemm.ee · 7 points · 1 year ago (edited)

    > It’s not very good at most of what it does

    > don’t think these systems are the useless toys

    It’s excellent at what it does, which is create immense reams of spam, make the internet worse in profitable ways, and generate at scale barely sufficient excuses to lay off workers. Any other use case, as far as I’ve seen, remains firmly at the toy level.

    But @GorillasAreForEating was talking about OpenAI, not just him.

    Taking a step back… this is far removed from the point of origin: @Hanabie claims Sutskever specifically is “allowed to be weird” because he’s a genius. If we move the goalposts back to where they started, it becomes clear it’s not accurate to categorise the pushback as “OpenAI has no technical accomplishments”.

    I ask that you continue to mock rationalists who style themselves the High Poobah of Shape Rotators, chanting about turning the spam vortex into a machine God, and also mock anyone who says it’s OK for them to act this way because they have a gigantic IQ. Even if the spam vortex is impressive on a technical level!

    • GorillasAreForEating@awful.systems (OP) · 4 points · 1 year ago

      I suppose the goalpost shifting is my fault: the original comment was about Sutskever, but I shifted to talking about OpenAI in general, in part because I don’t really know to what extent Sutskever is individually responsible for OpenAI’s tech.

      > also mock anyone who says it’s OK for them to act this way because they have a gigantic IQ.

      I think people are missing the irony in that comment.

      • datarama@awful.systems · 3 points · 1 year ago

        Guilty as charged: I missed the irony in it.

        (I’m the sort of person, unfortunately, who often misses irony.)

      • earthquake@lemm.ee · 2 points · 1 year ago

        I’m still not convinced Hanabie was being ironic, but if so missing the satire is a core tradition of Sneer Club that I am keeping alive for future generations.

        • GorillasAreForEating@awful.systems (OP) · 2 points · 1 year ago

          I think there’s a non-ironic element too. Sutskever can be both genuinely smart and a weird cultist; just because someone is smart in one domain doesn’t mean they aren’t immensely foolish in others.

    • datarama@awful.systems · 4 points · 1 year ago

      > It’s excellent at what it does, which is create immense reams of spam, make the internet worse in profitable ways, and generate at scale barely sufficient excuses to lay off workers. Any other use case, as far as I’ve seen, remains firmly at the toy level.

      I agree! What I meant about it not being very good at what it does is that it writes poetry - but it’s bad poetry. It generates code - but it’s full of bugs. It answers questions about what to feed a pet bird - but its answer is as likely as not to kill your poor non-stochastic parrot. This, obviously, is exactly what you need for a limitless spam machine. Alan Blackwell - among many others - has pointed out that LLMs are best viewed as automated bullshit generators. But the implications of a large-scale bullshit generator are exactly what you describe: it can flood the remainder of the useful internet with crap, and it can be used as an excuse to displace labour (the latter because while not all jobs are “bullshit jobs”, a lot of jobs involve a number of bullshit tasks).

      > I ask that you continue to mock rationalists who style themselves the High Poobah of Shape Rotators, chanting about turning the spam vortex into a machine God, and also mock anyone who says it’s OK for them to act this way because they have a gigantic IQ.

      Obviously.

      I’ve said this before: I’m not at all worried about the robot cultists creating a machine god (or screwing up and accidentally creating a machine satan instead); I’m worried about the collateral damage from billions of corporate dollars propping up labs full of robot cultists who think they’re creating machine gods. And unfortunately, GPT and its ilk have upped the ante on that collateral damage compared to when the cultists were just sitting around making DOTA-playing bots.