Hacker News

Great page; it only took 13s to load. No wonder, when it weighs 13.8 MB. And the biggest PNG files are not even pngcrushed...

  du -ch editorial-1.png editorial-3.png product-1.png product-2.png product-3.png product-4.png
  2,5M editorial-1.png
  832K editorial-3.png
  368K product-1.png
  288K product-2.png
  284K product-3.png
  1,2M product-4.png
  5,4M total

vs pngcrushed:

  du -ch editorial-1.smaller.png editorial-3.smaller.png product-1.smaller.png product-2.smaller.png product-3.smaller.png product-4.smaller.png
  2,0M editorial-1.smaller.png
  612K editorial-3.smaller.png
  188K product-1.smaller.png
  192K product-2.smaller.png
  184K product-3.smaller.png
  876K product-4.smaller.png
  4,0M total



It also looked messed up on mobile - the nav bar took up 20% of the screen. Definitely a point in favour of the motherfuckingwebsite.com school of thought.


I remember websites. I miss them dearly.


Pepperidge Farm remembers


Motherfuckingreat style guide! Thanks for the laugh - the wording is partially responsible for it, and partially it's the hopelessness of a "so true, but it will likely never change for the better"...


I just heard of pngcrush for the first time. I usually use optipng. Any insight which is better?


There's a good comparison between multiple png optimizers here: http://optipng.sourceforge.net/pngtech/optipng.html

> pngcrush by Glenn Randers-Pehrson, available at http://pmt.sourceforge.net/pngcrush, is an open-source program that iterates over PNG filters and zlib (Deflate) parameters, compresses the image repeatedly using each parameter configuration, and chooses the configuration that yields the smallest compressed (IDAT) output. At the user's option, the program can explore few (below 10) or many (a brute-force traversal over more than 100) configurations. The method of selecting the parameters for "few" trials is particularly effective, and the use of a brute-force traversal is generally not recommended.
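The search described there - try each parameter configuration, keep whichever compresses smallest - can be sketched with Python's zlib module. This is an illustrative toy, not pngcrush's actual code; the input bytes and the set of levels/strategies tried are made up for the example:

```python
import zlib

# Hypothetical stand-in for a PNG's filtered scanline data.
data = (b"\x00" + b"\x10\x20\x30" * 80) * 50

# Try several zlib parameter configurations, compress with each,
# and keep the one that yields the smallest output.
best_size, best_cfg = None, None
for level in (6, 9):
    for strategy in (zlib.Z_DEFAULT_STRATEGY, zlib.Z_FILTERED, zlib.Z_RLE):
        c = zlib.compressobj(level=level, strategy=strategy)
        size = len(c.compress(data) + c.flush())
        if best_size is None or size < best_size:
            best_size, best_cfg = size, (level, strategy)

print(best_size, best_cfg)
```

pngcrush does the same over real IDAT data, and additionally varies the PNG row filters before compressing.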

In addition, pngcrush offers a multitude of extra features, such as recovery of erroneous PNG files (e.g. files containing bad CRCs), and chunk-level editing of PNG meta-data.


In the end it does not matter, since they did not use any of them. If I can reduce the size of that page in 2 minutes, without any webdev background, using the only tool I could remember in a few seconds, then there is something seriously wrong with the way they treat visitors.


Maybe they aren't familiar with this tool. Perhaps you could email them and suggest it, and then they could improve their site.


https://pngquant.org is worth looking at as well. It's lossy, so not for every image, but the size reduction is often dramatic.


I realise this doesn't answer your question, but there's a nice macOS GUI with multiple crushers built-in:

https://imageoptim.com/


Also take a look at ImageAlpha - https://pngmini.com/ - great for single images (happy fan of both!).


About the same. But, I found this little benchmark: [0]

[0] http://pointlessramblings.com/posts/pngquant_vs_pngcrush_vs_...


I used to hear good things about Ken Silverman's PNGOUT but not recently.


They actually have a message in the footer that says “Leave us an issue.” Ha.

https://github.com/ibm/type/issues/new


>And the biggest png files are not even pngcrushed...

Makes sense (that they're not) if the difference is only ~20%, as here.
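For what it's worth, going by the du totals quoted upthread (5,4M before, 4,0M after), the saving is a bit over 20%:

```python
# du totals from the comment above (approximate, as du rounds)
before_mb, after_mb = 5.4, 4.0
reduction = 1 - after_mb / before_mb
print(f"{reduction:.0%}")  # → 26%
```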


In nearly every subfield of computing, along nearly every metric, most people would kill for a 20% improvement.

A processor 20% faster would dominate the market for years. A new compression algorithm that was 20% smaller would be either copied or used by every archiving system. And yes, a website reduced by 20% is significant.


>In nearly every subfield of computing, along nearly every metric, most people would kill for a 20% improvement

I doubt it. 20% is not even worth turning a straightforward algorithm into a more complex and convoluted (but more performant) one, or switching backend technology, or DB store, etc.

As for a "20% faster processor"? Big deal, I was raised in an era when we got 2x faster processors every 2 years.

(Besides, we do have processors that are 20% faster than others, and they don't "dominate" the market; even at the same price point, some buyers go for perceived quality or stability over the small speed increase -- e.g. Intel vs AMD.)

>And yes, a website reduced by 20% is significant.

To whom? It's as if people have never heard of opportunity cost.

Just go measure how many websites, even leading ones, go to any great lengths to reduce such bloat, and you'll find that it's not that significant in the real world. Up to a point, which 20% is nowhere near, you can be bloated without punishment on the modern web.


If you use the correct tools, such optimizations introduce little complexity, if any. Just add a gulp plugin or select "optimize assets" in your Netlify panel or something similar.

If you don't use the right tools, you simply won't survive long in the market. You can probably still manage to sell domains or the like, but you can't compete in innovation-driven sectors.



