• AlecSadler@sh.itjust.works · 9 months ago

    I worked for a company that required 95% code coverage but simultaneously complained about how slowly the tests ran.

    🤷‍♂️

      • AlecSadler@sh.itjust.works · 9 months ago

        When you have 1000+ Cypress tests, for example, they take time to run, plain and simple.

        Now, if they were simple unit tests, sure, one could run thousands in a second or two, but they weren’t. Even headless, these just took time.
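
        Something like this is what I mean by a simple unit test (Jest-style; the function and values are made up for illustration):

        ```typescript
        // Pure function under test: no browser, no network, no I/O.
        function applyDiscount(price: number, pct: number): number {
          return price * (1 - pct / 100);
        }

        // `test` and `expect` are Jest globals.
        test('applies a percentage discount', () => {
          expect(applyDiscount(200, 10)).toBe(180);
        });
        ```

        Each of these is a few microseconds of CPU, so a runner gets through thousands in a second or two. A Cypress test, even headless, has to drive a real browser page and wait on navigation and network I/O.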

        • lysdexic@programming.dev · 9 months ago (edited)

          > When you have 1000+ Cypress tests, for example, they take time to run, plain and simple.

          It’s one thing to claim that tests need time to run.

          It’s an entirely different thing to claim that the time it takes to run tests is proportional to test coverage.

          More often than not, you have expensive, naive test fixtures acting as performance boat anchors, and those are the real bottleneck. Thousands of tests finish in a few seconds if each one takes only a few milliseconds to run.

          For perspective, the round trip of a network request that crosses the world is around a couple hundred milliseconds, and a thousand such requests run sequentially take only a few minutes. If each of your tests takes that long to run, your tests are fundamentally broken.
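
          To make the “boat anchor” point concrete, here’s a rough sketch of the difference (Cypress; the login flow, selectors, and URLs are made up for illustration). The naive fixture pays a full UI login round trip on every single test; cy.session() runs the expensive setup once and restores the cached auth state afterwards:

          ```typescript
          // Naive fixture: every test repeats the full login round trip,
          // so fixture cost scales linearly with the number of tests.
          describe('dashboard (slow fixture)', () => {
            beforeEach(() => {
              cy.visit('/login');
              cy.get('#user').type('alice');
              cy.get('#pass').type('hunter2');
              cy.get('form').submit(); // full navigation + server round trip
            });

            it('shows the widgets', () => {
              cy.visit('/dashboard');
              cy.contains('Widgets');
            });
          });

          // Amortized fixture: cy.session() caches cookies/localStorage from
          // the first login, so later tests restore it near-instantly.
          describe('dashboard (cached fixture)', () => {
            beforeEach(() => {
              cy.session('alice', () => {
                cy.visit('/login');
                cy.get('#user').type('alice');
                cy.get('#pass').type('hunter2');
                cy.get('form').submit();
              });
            });

            it('shows the widgets', () => {
              cy.visit('/dashboard');
              cy.contains('Widgets');
            });
          });
          ```

          Multiply a few seconds of avoidable setup by a thousand tests and you get most of a “slow suite” right there.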