  • JPEG also supports lossless compression.

    Technically, the spec does define it, but given that we’re in a thread about ecosystem support for a file format that’s approaching its 15th birthday, it’s worth considering how many image viewers can actually decode an image without the DCT step that is the essence of what typical JPEG does.

    I don’t have a Windows machine handy to test, but it’s entirely possible that a lossless JPEG won’t display in its default viewer. Elsewhere in the ecosystem it’s easy to check, as in the sketch below.
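
    For anyone who wants to poke at this themselves, here’s a minimal sketch using Pillow (which wraps libjpeg and, like most mainstream decoders, only implements the DCT-based processes, so a lossless JPEG should fail). The file name is hypothetical:

    ```python
    # Probe whether a decoder can actually decode a given JPEG.
    # Pillow's JPEG support comes from libjpeg, which handles the
    # DCT-based processes but not the lossless process.
    from PIL import Image

    def can_decode(path: str) -> bool:
        try:
            with Image.open(path) as img:
                img.load()  # force a full decode, not just a header parse
            return True
        except Exception as exc:
            print(f"decode failed: {exc}")
            return False

    print(can_decode("lossless.jpg"))  # hypothetical lossless-JPEG test file
    ```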

  • These are fun rabbit holes to go down. Everything here is true, of course: Big-O complexity isn’t everything, context always matters, and measurements trump guesses.

    But also, how many times have you encountered a performance problem with a slow O(n) solution that you solved by turning it into a fast O(n²) solution, compared to the other way around? The difference between 721ns and 72.1ns is almost always irrelevant (and certainly irrelevant off a hot path), and in all likelihood the same can be said at n=500: even 500x these numbers doesn’t reach 0.5ms, as the quick check below shows.
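
    A back-of-the-envelope version of that arithmetic (the nanosecond figures are the ones quoted above):

    ```python
    # Scale the slower per-call figure by n = 500.
    slow_ns = 721                 # ns per call at small n (quoted above)
    total_ns = slow_ns * 500      # 360,500 ns
    print(total_ns / 1_000_000)   # 0.3605 ms -- still under 0.5 ms
    ```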

    So unless context tells me I have a good reason to think otherwise, I’m writing the one that uses a hash-based collection (see the sketch below). As the codebase evolves and the same bits of code get used in novel situations, I’m much less likely to regret leaving microseconds on the table at small input sizes than to regret leaving milliseconds or seconds on the table at large input sizes.
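
    To make that concrete, a hypothetical duplicate check in Python (the names are mine, not from the thread). Both are correct; only one will bite you when n grows:

    ```python
    # O(n^2): linear scan of a list for each element. Can win at tiny n
    # thanks to cache-friendly contiguous memory, but scales badly.
    def has_duplicates_scan(items):
        seen = []
        for item in items:
            if item in seen:        # O(n) lookup inside an O(n) loop
                return True
            seen.append(item)
        return False

    # O(n): the hash-based version I'd write by default.
    def has_duplicates_hash(items):
        seen = set()
        for item in items:
            if item in seen:        # O(1) average lookup
                return True
            seen.add(item)
        return False
    ```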

    As a trained practitioner of “the deeper magics” myself, I feel the need to point out that there’s a reason we call these things “the deeper magics”: heuristics like “better Big-O means better performance” generally point you in the right direction when it matters, and in the wrong direction when it doesn’t.