• prime_number_314159@lemmy.world
    5 hours ago

    In industrial software, I’m sure performance draws a pretty stark line between “good enough” and “costing us money”.

    The pattern I’ve seen in customer-facing software is that the backend depends on some external service (e.g. Postgres), then blames any slowness (and even stability issues…) on that service. Every time I’ve been able to dig into a case like this, the real cause has been a developer not understanding how the external service works or how to use it efficiently.

    For example, a coworker told me our Postgres system was overloaded because his select queries were taking too long, even though he had already created indexes. When I examined his query, it couldn’t use any of the indexes he had created, and it ran without appropriate statistics, so it always did a full table scan. All but 2 of his indexes were unused, so I deleted those, then added a suitable extended statistics object and an index his query could actually use. That made the query run thousands of times faster, sped up writes, and saved disk space.
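
    A sketch of what that kind of fix looks like in Postgres (table and column names here are made up for illustration; the original query isn’t shown in the comment):

    ```sql
    -- Diagnose first: the plan shows a Seq Scan because the planner's
    -- row estimates are off and no index matches the filter columns.
    EXPLAIN (ANALYZE, BUFFERS)
    SELECT * FROM orders
    WHERE region = 'EU' AND status = 'open';

    -- Extended statistics: tell the planner that region and status are
    -- correlated, so it stops estimating their selectivity independently.
    CREATE STATISTICS orders_region_status (dependencies)
      ON region, status FROM orders;
    ANALYZE orders;

    -- A composite index the query can actually use:
    CREATE INDEX idx_orders_region_status ON orders (region, status);

    -- Unused indexes only slow down writes and eat disk; find candidates
    -- to drop via the statistics views:
    SELECT indexrelid::regclass, idx_scan
    FROM pg_stat_user_indexes
    WHERE idx_scan = 0;
    ```

    The `EXPLAIN (ANALYZE)` step is the important habit: it shows what the planner actually did, rather than what the developer assumed it would do.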

    Most of the optimization I see is in algorithms, and most of the slowness I see comes from fundamentally misunderstanding what a program does and/or how a computer works.
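
    A generic example of that kind of algorithmic slowness, sticking with SQL (hypothetical table, not from the comment above): a correlated subquery that re-runs an aggregate for every row, versus computing it once per group.

    ```sql
    -- Slow shape: the inner aggregate runs once per outer row,
    -- so work grows roughly with rows × rows-per-customer.
    SELECT o.id, o.total
    FROM orders o
    WHERE o.total > (SELECT avg(total)
                     FROM orders o2
                     WHERE o2.customer_id = o.customer_id);

    -- Faster shape: one pass, computing each customer's average
    -- once with a window function.
    SELECT id, total
    FROM (
      SELECT id, total,
             avg(total) OVER (PARTITION BY customer_id) AS cust_avg
      FROM orders
    ) t
    WHERE total > cust_avg;
    ```

    Same result, very different amount of work; the difference is the algorithm, not the database.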

    Slowness makes customers unhappy too, but there’s no solid line between “I have what I want” and “this product is inadequate”.