
I don’t think it’s fair to blame C for the flat random-access memory model. Arguably it goes back to Von Neumann. There was a big push to extend the model in the 1960s, through hardware like Atlas and Titan (10 years before C) and operating systems like Multics. And there’s all the algorithm analysis in computer science that assumes the same model.


By the time C rose to prominence there was already an understanding that memory access wasn't going to be uniform, and would become less and less so as hardware evolved and grew more complex. Ada incorporated this idea from the get-go.

Von Neumann created a model of computation. It's a convenient mathematical device for dealing with certain problems. He never promised it would handle all problems, nor that it would be the most useful or the most universal model, etc.


You’re echoing my point back at me, though to be fair I should have been more explicit that my examples from the 1960s were about caches and virtual memory and other causes of nonuniform access hidden under a random access veneer.

But we can go 15 years earlier: Von Neumann wrote in 1946: “We are therefore forced to recognize the possibility of constructing a hierarchy of memories, each of which has greater capacity than the preceding but which is less quickly accessible.” https://www.ias.edu/sites/default/files/library/Prelim_Disc_...



