It also looked messed up on mobile - the nav bar took up 20% of the screen. Definitely a point in favour of the motherfuckingwebsite.com school of thought.
Motherfuckingreat style guide! Thanks for the laugh, which is partly down to the wording and partly to the hopelessness of it: a "so true, but it's unlikely ever to change for the better" feeling...
> pngcrush by Glenn Randers-Pehrson, available at http://pmt.sourceforge.net/pngcrush, is an open-source program that iterates over PNG filters and zlib (Deflate) parameters, compresses the image repeatedly using each parameter configuration, and chooses the configuration that yields the smallest compressed (IDAT) output. At the user's option, the program can explore few (below 10) or many (a brute-force traversal over more than 100) configurations. The method of selecting the parameters for "few" trials is particularly effective, and the use of a brute-force traversal is generally not recommended.
In addition, pngcrush offers a multitude of extra features, such as recovery of erroneous PNG files (e.g. files containing bad CRCs), and chunk-level editing of PNG meta-data.
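The parameter sweep described above can be illustrated in miniature with Python's standard-library zlib: compress the same data under every level/strategy pair and keep whichever configuration yields the smallest output. This is a sketch of the idea only, not pngcrush's actual code, and it omits the PNG filter dimension of the search.

```python
import zlib

def smallest_deflate(data: bytes):
    """Try each zlib level/strategy pair; return (level, strategy, output).

    Mirrors, in miniature, pngcrush's approach of compressing the image
    data under each configuration and keeping the smallest result.
    """
    best = None
    strategies = [zlib.Z_DEFAULT_STRATEGY, zlib.Z_FILTERED, zlib.Z_RLE]
    for level in range(1, 10):
        for strategy in strategies:
            comp = zlib.compressobj(level, zlib.DEFLATED, 15, 8, strategy)
            out = comp.compress(data) + comp.flush()
            if best is None or len(out) < len(best[2]):
                best = (level, strategy, out)
    return best

if __name__ == "__main__":
    sample = b"scanline data " * 1000  # stand-in for filtered PNG scanlines
    level, strategy, packed = smallest_deflate(sample)
    print(f"best: level={level} strategy={strategy} -> {len(packed)} bytes")
```

The real tool iterates over the five PNG filter types as well, which is why its "brute-force" mode ends up trying over a hundred configurations.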
In the end it does not matter, since they did not use any of them. If I can reduce the size of that page in two minutes, with no webdev background, using the only tool I could remember in a few seconds, then there is something seriously wrong with the way they treat visitors.
In nearly every subfield of computing, along nearly every metric, most people would kill for a 20% improvement.
A processor 20% faster would dominate the market for years. A new compression algorithm that was 20% smaller would be either copied or used by every archiving system. And yes, a website reduced by 20% is significant.
>In nearly every subfield of computing, along nearly every metric, most people would kill for a 20% improvement
I doubt it. A 20% gain is often not even worth turning a straightforward algorithm into a more complex and convoluted (but more performant) one, or switching backend technology, or db store, etc.
As for a "20% faster processor"? Big deal, I was raised in an era when we got 2x faster processors every 2 years.
(Besides, we do have processors that are 20% faster than others, and they don't "dominate" the market; even in the same price range, some buyers go for perceived quality and stability -- e.g. Intel vs AMD -- over a small speed increase.)
>And yes, a website reduced by 20% is significant.
To whom? It's as if people never heard of opportunity cost.
Just go measure how many websites, even leading ones, go to any great lengths to reduce such bloat, and you'll find that it's not that significant in the real world. Up to a point, of which 20% is not even close, you can be bloated without punishment on the modern web.
If you use the correct tools, such optimizations introduce little complexity, if any. Just add a gulp plugin or select "optimize assets" in your Netlify panel or something similar.
If you don't use the correct tools, you simply won't survive for long in the market. You can probably still manage to sell domains or the like, but you can't compete in innovation-driven sectors.
du -ch editorial-1.png editorial-3.png product-1.png product-2.png product-3.png product-4.png
2,5M    editorial-1.png
832K    editorial-3.png
368K    product-1.png
288K    product-2.png
284K    product-3.png
1,2M    product-4.png
5,4M    total
vs pngcrushed:
du -ch editorial-1.smaller.png editorial-3.smaller.png product-1.smaller.png product-2.smaller.png product-3.smaller.png product-4.smaller.png
2,0M    editorial-1.smaller.png
612K    editorial-3.smaller.png
188K    product-1.smaller.png
192K    product-2.smaller.png
184K    product-3.smaller.png
876K    product-4.smaller.png
4,0M    total
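For what it's worth, a quick check of the du totals above (taking the rounded 5,4M and 4,0M figures at face value):

```python
# Totals reported by du: 5.4M before pngcrush, 4.0M after.
before = 5.4
after = 4.0
saving = (before - after) / before * 100
print(f"about {saving:.0f}% smaller")  # about 26% smaller
```

So the image payload alone shrank by roughly a quarter, with no quality loss, from one run of a free tool.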