
In a very generic answer to your question, suppose you're moving a 1GB file from A to B with an 8GB flash drive. It doesn't cost anyone anything to move an extra 7GB of data that somebody might need later. You can 8x your throughput for free.
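Back-of-the-envelope, just restating the example in code (my variable names, not anyone's real system): every trip moves the 1GB someone actually asked for plus up to 7GB of speculative payload, so bytes-per-trip goes up 8x.

  # Hedged sketch of the arithmetic above; the numbers come straight
  # from the example, nothing here is a real API.
  requested_gb = 1                    # the file actually being moved
  drive_capacity_gb = 8               # the flash drive's size
  speculative_gb = drive_capacity_gb - requested_gb  # "free" extra data
  multiplier = drive_capacity_gb / requested_gb
  print(f"{speculative_gb} GB riding along -> {multiplier:.0f}x throughput per trip")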

But, at the same time, you don't want the flash drive sitting near-full all the time, leaving no room for the new StarCraft beta.

So there's some logic that identifies files that are already spreading around organically and replicates them ahead of when they're requested. This increases both redundancy and network throughput. It's not exactly machine learning, but in practice it produces decent results.
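A minimal sketch of what that logic might look like (entirely my guess: the threshold, the names, and the greedy fill are assumptions, not the actual implementation): count recent requests per file, and when a file looks like it's trending, use it to fill whatever spare space a drive has on its next trip.

  from collections import Counter

  # Hypothetical popularity-based prefetcher -- the threshold and
  # structure are guesses, not the real system.
  REQUEST_THRESHOLD = 3   # a file asked for this often is "spreading"

  request_counts = Counter()

  def record_request(file_id):
      """Call whenever a peer asks for a file."""
      request_counts[file_id] += 1

  def pick_speculative_payload(spare_bytes, file_sizes):
      """Greedily fill spare drive space with trending files."""
      payload = []
      for file_id, hits in request_counts.most_common():
          if hits < REQUEST_THRESHOLD:
              break  # most_common() is sorted, so the rest are colder
          size = file_sizes.get(file_id, 0)
          if 0 < size <= spare_bytes:
              payload.append(file_id)
              spare_bytes -= size
      return payload

With the 8GB drive above, pick_speculative_payload(7 * 2**30, sizes) would decide which trending files ride along in the spare 7GB.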


