
There was an article a while back that looked at how various websites use CDNs to deliver common libraries, and if I recall correctly it concluded that the chance of getting a cross-site cache hit from a CDN is relatively low because of the many different versions in use. The Google CDN, for example, offers dozens and dozens of versions of jQuery: your visitor might have visited a site using jQuery before, but did that site use the same version? Unfortunately I cannot find the article; maybe someone else will be able to provide a link.
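To make the fragmentation concrete, here's a small sketch (the URLs are hypothetical sample data, not real measurements) that counts distinct jQuery versions in a set of observed script URLs. A visitor only gets a warm cache when a previously visited site used the exact same URL, so the more versions in circulation, the lower the hit rate:

```javascript
// Hypothetical sample of Google CDN jQuery URLs observed across sites.
const observed = [
  'https://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js',
  'https://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js',
  'https://ajax.googleapis.com/ajax/libs/jquery/1.7.1/jquery.min.js',
  'https://ajax.googleapis.com/ajax/libs/jquery/1.11.3/jquery.min.js',
];

// Extract the version segment from each URL and tally occurrences.
const versionRe = /\/jquery\/([\d.]+)\//;
const counts = {};
for (const url of observed) {
  const m = url.match(versionRe);
  if (m) counts[m[1]] = (counts[m[1]] || 0) + 1;
}

// Many versions, each with only a handful of sites, means a random
// visitor is unlikely to arrive with the right URL already cached.
console.log(counts); // e.g. { '1.4.2': 1, '1.7.1': 2, '1.11.3': 1 }
```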

There are other concerns too: if your website depends on a JavaScript library and the CDN goes offline, it can break your site (which has happened with the Google CDN before, and with many other popular JavaScript CDNs). And what if the CDN is compromised and delivering malicious content? Not impossible.



I'm only aware of this slightly older (11/2011) article [1] on the topic. I've poked at the numbers recently in BigQuery and see a wide spread of jQuery versions on popular sites. Maybe one of these days I'll get around to documenting it properly. I'd rather measure from a sample of client-side pageloads, but that would be a bigger project to tackle.

[1] http://statichtml.com/2011/google-ajax-libraries-caching.htm...

[2] http://stackoverflow.com/questions/29930805/measuring-visito...



