I'm far from an expert coder compared to most on here...but...what's the big deal? Once you've grokked server-side code and frameworks, is it really that hard to move from Rails to Sinatra or even to Django? Once you know databases, how hard is it to switch from SQLite to Postgres or even to Mongo? Same goes with JavaScript frameworks.
Now, I'm referring to the scope of what a web-developer needs to know to interface with these technologies...obviously, a database engineer is expected to dive deep and know all the quirks/limitations of NoSQL vs SQL.
Is the complaint that "Oh shit, I don't know if I can learn new syntax?" Or is it more, "The hiring market is segmented by too many technologies for me to claim to be an expert at?"
I'm a .NET developer by day, and I've met numerous Java developers that have crawled into a .NET role, either as a contractor/freelancer or into an entry/mid-level role at a company, claiming that because they are fantastic Java developers they can pick up C# in no time at all.
Yes, if you have been programming for a number of years then the syntax will come quickly, and you'll find yourself able to use the language. Top that off with an existing code base and it's fairly easy to get started and to write some usable code. However, in my experience, the kind of egos we see on sites like Reddit and HN tend to churn out either:
1) Decent code, but only after comparing their own code to what is already out there, out of fear of writing something that people will laugh at, regardless of whether it works or not.
2) Code that looks like a Java developer wrote it: rewriting methods that are already a fundamental part of the .NET framework, ignoring LINQ, lambdas, and everything else added in C# 2 through 4, and failing to use any of the built-in tools within Visual Studio to check their code.
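Point (2) is easy to make concrete. C# itself isn't shown here, but the same idiom gap exists inside Java: hand-rolled loops versus the Stream API (Java 8+), which is the rough analogue of LINQ. A minimal sketch, with a hypothetical `User` record invented purely for illustration:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;

public class IdiomGap {
    // Hypothetical type, purely for the example.
    record User(String name, boolean active) {}

    // The loop-heavy style a transplanted developer keeps writing:
    static List<String> activeNamesLoop(List<User> users) {
        List<String> out = new ArrayList<>();
        for (User u : users) {
            if (u.active()) {
                out.add(u.name().toUpperCase());
            }
        }
        return out;
    }

    // The declarative equivalent the platform's idioms expect:
    static List<String> activeNamesStream(List<User> users) {
        return users.stream()
                .filter(User::active)
                .map(u -> u.name().toUpperCase())
                .collect(Collectors.toList());
    }
}
```

In C# the second shape would be a `Where`/`Select` LINQ chain. Both versions work, which is exactly the problem: a developer who only ever writes the first shape produces working code that still stands out in review.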
The better programmers fully immerse themselves in the differences between the languages and how they are used. They have chosen to use this new language/framework for a reason, and they enter that world knowing why they did so and what they need to know to become productive for that given task.
As I'm sure everyone on here understands, there is a huge difference between being able to write some code in a given language and working on a given project in that language with others.
This is generally solved with a one-week onboarding process into company culture and dev tools, normally done by assigning an experienced engineer a week of priority time with the newcomer, focused on showing them the company's tools and techniques. This really shouldn't be an issue if it's managed even slightly.
LINQ or VS.net compiler features are not difficult to grasp.
So if your 'huge difference' translates into a week of onboarding time, then sure, you can call it a 'huge difference'.
Regardless of how good a developer is, there is simply no way that a single week will get someone proficient enough in a different language to write production-quality code. Yes, any sane dev team will dedicate senior developer time towards getting a new member up to speed with the code base, but a transitioning developer working on a non-trivial project will not just pick up everything within a week. It might work like that in an early-stage startup with a small code base, but it doesn't work like that in your typical business. A transitioning developer needs time to learn the new idioms, to truly understand how they work within .NET, and to conform to the current standard of how code is written within the company.
That doesn't even account for how new the transitioning developer is to the company itself. They are very likely brand new to the team, and even new lead developers or team managers take longer than a week to become effective at a new company.
Yes, LINQ, lambda expressions and many of the other constructs within C# are easy to understand. What takes time is learning where to use these things.
A transitioning developer needs a week of time with a senior dev and the code base at minimum. They need at least a month to write code at the same speed and standard as another team member, and that is if they really try to adapt to their new tools and environment.
> "The hiring market is segmented by too many technologies for me to claim to be an expert at?"
I think that's a concern for many new programmers (myself included), given the pace at which these tools and frameworks come out and then mature.
Some companies hire only for particular skills and experience with the technology they already use. These companies are looking for quick-and-dirty hires who can do work on day 1.
Other companies hire for the person, understanding that a good hire will be able to pick up new technology at a rapid pace; what matters to them is learning ability, perspective, and drive.
As a computer science major this doesn't worry me at all. Learning the fundamentals, the theory, and the overall science leads me to believe we will be just fine on the market.
As an intern I use PHP on a day-to-day basis. At school, classes have been fairly language-agnostic, moving among C++, C#, Java, PHP, and others. Some classes force you to do all your work in the terminal, while others allow you to use powerful IDEs. I even have classes in which you rarely touch a computer (Algorithms, which should more aptly be named Algorithms II, since Algorithms & Data Structures is a prerequisite).
In my experience, switching tools/languages/frameworks isn't as hindering as lacking core engineering skills.