
> The rate at which technology is changing is absolutely insane.

By the time you graduate, most of the things you learned are obsolete. My advice is to grok the fundamentals, the concepts that don't change. Then absolutely learn to use Google.



The obsolete stuff I studied at university almost 20 years ago:

Network and hierarchical (now called "NoSQL") DBs, the problems they caused and why they were replaced with Relational DBs.

Functional programming with a derivative of Haskell called Gofer

Parallel and distributed computing techniques (including stuff like SIMD, Message queues, Event driven programming). Wrote some Erlang.
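Those message-queue and event-driven ideas are exactly the kind of fundamentals that outlive any one language. As a minimal sketch (mine, not from the thread, and in Python rather than Erlang for brevity): a producer posts messages to a queue, and a consumer thread reacts to each one as an event.

```python
import queue
import threading

def consumer(q, results):
    # React to each message as an event until a shutdown sentinel arrives.
    while True:
        msg = q.get()            # blocks until a message is available
        if msg is None:          # sentinel: no more work
            break
        results.append(msg * 2)  # "handle" the event

q = queue.Queue()
results = []
t = threading.Thread(target=consumer, args=(q, results))
t.start()

for n in (1, 2, 3):
    q.put(n)                     # producer side
q.put(None)                      # signal completion
t.join()
```

The same shape shows up whether the mailbox is an Erlang process inbox, a thread-safe queue, or a networked broker; only the surface syntax changes.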


I hear this a lot, but it seems like the most commonly used languages are all around 20 years old (with C, C++ being even older).

I suppose if you went to college in the mid-90s this may be true since you would have seen the invention of Java, JavaScript, and Ruby.


I can't see why witnessing the invention of a language is significant here. Schools are slow to update their curricula with new technology (at least where I live), so if a language were invented while you were in college, I doubt it would be taught before you graduated.

I went to college in the late 90s through early 2000 and we were taught (introduced would be more accurate) Pascal, C++ and Visual Basic. When I graduated, the language in demand was PHP.


I think in the late 90s / early 2000s the big language to teach in university was Java, which is still basically true today.

As to the invention of a language being relevant, it's only relevant depending on how you define a previous language as being "obsolete." I was trying to play devil's advocate with my own point, and concede there may have been times in history when a lot was made obsolete during a 4 year period. The industry isn't really any swifter than academia though, which is why we still use 20 year old languages.


It's 2013, and at my university the first programming courses are still taught in Pascal and Modula-2.

I think that later on they teach C++ and Java, at least...

I understand that you should use the right tool for the job (in this case, teaching), but the basic concepts (which are in general the hardest to pick up, because you don't know anything yet, not because the concepts themselves are that difficult) are almost the same across all languages.


"My advice is to grok the fundamentals, the concepts that don't change."

That is what a Computer Science degree should be teaching - not the latest programming language and/or revision control system.

A CS degree makes a pretty poor introduction to software development in the same way that a physics degree makes a pretty poor introduction to bricklaying.



