
    How entry-level developers are being squeezed out of the job field, and what they can do about it

    Entry-level developers can overcome the increasingly steep barriers to entry in the field, but only if they are willing to invest the necessary time to get some hands-on experience using the tools that most shops use.
    By all measures I have seen, the job market and demand for developers are doing well. And every prediction out there, both formal and "best guess," is that demand for developers will only get stronger as the future depends more and more upon software. You would think this would be a great time to come onto the job market, but unfortunately it isn't so. It is increasingly difficult to get work as an entry-level developer, and without a major shift in how most software is developed, it will only get harder.
    The culprits here are primarily .NET and Java, though the increasingly complex nature of Web development in general isn't guiltless either. The languages themselves are getting more complex over time. And yes, it is possible to say, "well, no one is forcing entry-level developers to know all of the dark corners of C# to be a success." But what happens when the senior architect sets up the whole thing so that LINQ is the primary driver of the architecture?
    Look, I think LINQ is awesome, but the fact is, I had to read a fairly long, in-depth book to learn it. The same applies to XAML/Silverlight/WPF/WinRT/WinForms, Entity Framework (or NHibernate), SQL, ASP.NET MVC, and a whole host of other items that are the bedrock of a modern .NET application. There is a pile of knowledge you need to acquire, and you need to know how to apply it in a project, just to be considered a "competent" .NET developer. Meanwhile, people are graduating from college with 36 credit hours of total "computer science" (and increasingly, "information systems" or similar degrees). That is barely enough to teach them the fundamental principles of programming, let alone the tools and information they need to survive in the .NET environment.
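    To be concrete about the kind of thing I mean, here is a small, self-contained sketch (the Order class and sample data are invented purely for illustration) of everyday LINQ code. Even this little grouping query quietly leans on lambdas, anonymous types, extension methods, and deferred execution, which is why a book-length treatment is not overkill:

        using System;
        using System.Collections.Generic;
        using System.Linq;

        class Order
        {
            public string Customer { get; set; }
            public decimal Total { get; set; }
        }

        class Program
        {
            static void Main()
            {
                var orders = new List<Order>
                {
                    new Order { Customer = "Acme",   Total = 250m },
                    new Order { Customer = "Acme",   Total = 75m  },
                    new Order { Customer = "Globex", Total = 430m }
                };

                // Query syntax: group the orders by customer and total them.
                var totals = from o in orders
                             group o by o.Customer into g
                             select new { Customer = g.Key, Total = g.Sum(o => o.Total) };

                // Method syntax: the same query as chained extension methods.
                var totalsAgain = orders
                    .GroupBy(o => o.Customer)
                    .Select(g => new { Customer = g.Key, Total = g.Sum(o => o.Total) });

                foreach (var row in totals)
                    Console.WriteLine("{0}: {1:C}", row.Customer, row.Total);
            }
        }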
    I look at the baseline knowledge needed to make it in a modern .NET or Java shop, and I realize that if I had been faced with that reality upon graduation from college, there is a solid chance that I would have washed out of software development.
    And that is exactly what I am seeing with recent college grads. Those who were taught in a classical computer science environment know the principles, but they seem to be sorely lacking in real-world languages and systems. Those who went to a more "modern" school that taught the "latest and greatest" simply have not been armed with the proper fundamentals. Or to put it another way, C# and Java (and the .NET and Java ecosystems) are really poor environments in which to learn programming, but the systems that are good teachers are not C# or Java. And what use do most shops have for people who not only lack experience in general, but also need to be taught the language and frameworks?
    This is not a situation that makes it a good time to be an entry-level developer.
    So that's the bad news. Is there any good news? I struggle to see a silver lining in this cloud, but I do feel that entry-level developers can do things to make it easier to get hired.
    The most important thing (and I hate to sound like a broken record here) is that they must get real-world experience. End of story. Unless schools suddenly start devoting the full 120 credit hours required for graduation to programming and ignore general education and minors, or students combine undergraduate work in a solid, theory-oriented program with an internship or a master's program rooted in real-world development, it is a guarantee that a recent graduate will have gaps in their basic toolset, gaps that will take years to fill before they can be effective in a typical programming environment.
    Now, there are places where you can become effective with far less learning. A Ruby on Rails project, for example, requires significantly less overhead before you become productive. The Agile Platform tool that I have been working with is the same way. As a result, I feel much more comfortable hiring someone with less real-world experience (or without knowledge of the full stack) for these systems, simply because the "full stack" is short enough for someone to learn in a reasonable amount of time.
    But for the entry-level developer, openings for these kinds of positions are rare. The average person will be looking at breaking into Java or .NET development, and really, those jobs are going to need a working understanding of a large portion of their stacks. I can't speak to the specifics of the Java stack, but I know that for the .NET stack, I would want to see that an entry-level developer had put together a project (as an intern, on their own time, for an open-source project, or as a volunteer at a charity or non-profit) using the following technologies:
    • C#
    • ASP.NET MVC or a XAML-based system (Silverlight, WP7, WinRT, WPF)
    • HTML5, JavaScript, CSS (for someone with ASP.NET MVC experience)
    • Entity Framework or NHibernate (or similar)
    • For bonus points, SQL and LINQ would be great too.
    In truth, simply seeing one or two simple projects leveraging these technologies would be great, and I think a senior in college should be able to either do a small project on their own or find a volunteer or internship situation that gives them the opportunity to do it.
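    To give a sense of the scale I have in mind, here is a rough sketch of the sort of ASP.NET MVC controller, backed by Entity Framework and LINQ, that even a modest sample project would show off. The Product entity, StoreContext, and controller here are hypothetical stand-ins, and a real project would also include Razor views, a connection string, and routing configuration:

        using System.Data.Entity;
        using System.Linq;
        using System.Web.Mvc;

        // A hypothetical entity and EF code-first context for the sample.
        public class Product
        {
            public int Id { get; set; }
            public string Name { get; set; }
            public decimal Price { get; set; }
        }

        public class StoreContext : DbContext
        {
            public DbSet<Product> Products { get; set; }
        }

        public class ProductsController : Controller
        {
            private readonly StoreContext db = new StoreContext();

            // GET: /Products/ -- query the database through EF with LINQ
            // and hand the results to a view (Views/Products/Index.cshtml).
            public ActionResult Index()
            {
                var products = db.Products
                                 .Where(p => p.Price > 0m)
                                 .OrderBy(p => p.Name)
                                 .ToList();
                return View(products);
            }

            protected override void Dispose(bool disposing)
            {
                if (disposing) db.Dispose();
                base.Dispose(disposing);
            }
        }

    A project like this, however small, demonstrates C#, ASP.NET MVC, Entity Framework, and LINQ in one sitting, which covers most of the list above.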
    So, entry-level developers can overcome the increasingly steep barriers to entry in the field, but only if they are willing to invest the necessary time to get some hands-on experience using the tools that most shops use.
