Are Software Developers Becoming a Commodity?
Back in the 1990s, the World Wide Web experienced an explosive period of growth; it was a renaissance age that helped shape the Internet as we know it today. Looking back now, however, that period was also akin to medieval times in terms of the tools and technologies people had to work with. In those formative days, it took a full-fledged, hardcore techie to set up a web site with basic content (and not just because of all the annoying animated images). If you wanted to build a fully functional e-commerce site to sell seven varieties of pet rock, you were looking at a major investment. That kind of project required a team of people with experience in a wide range of skills, including alien disciplines such as Perl, CGI, and UNIX.
Of course, back then the term e-commerce site had not yet come into usage. People were still talking about the information superhighway, and you could only access it through primitive devices that prevented you from talking on the phone. No one had ever heard of blogs or social media, even though early forms of both already existed. There was certainly no Google to help you google, no Twitter to tweet with, and no Facebook to offer you endless diversion.
During that time, anyone who dreamed of becoming an Internet entrepreneur (or even just a hobbyist) needed access to serious technology firepower. Things had to be programmed, for goodness sake. Code needed to be written… and then compiled and linked and installed and tested. Servers had to be configured and networked, and of course nothing was compatible with anything. Creating a useful web application was tough work; the Internet really was serious business.
As the Web became more mainstream in both popular culture and business, there was an explosion in demand for the human resources to make it go. Programmers, server administrators, network engineers and database administrators quickly became hot careers with high employment. From 1994 to 2004, jobs in the Information Technology sector posted an impressive 8 percent annual growth rate, according to the U.S. Department of Labor.
Flash forward a decade or two, and a lot of things have changed. To start with, it is now almost completely trivial to set up any number of robust, fully functional websites. Imagine a grandmother somewhere in Nebraska; for the princely sum of $19.99 per month, she can create and administer an online store, selling soon-to-be-famous cupcake recipes around the world. To support this new global enterprise (no doubt called GrannyCakes.com), what does she need? Nothing more than a love of baking and a PayPal account; no software developers need apply. In this current age of enlightenment, well-entrenched technophobes can easily create dynamic, content-rich web sites complete with social media integration, online advertising revenue, and comprehensive site analytics. They have no need for programmers or designers, much less anyone to run the network and server infrastructure. With a cheap laptop and a good cable provider, the Internet is a vast canvas upon which you can easily make your mark.
These user-friendly hosting services are just the tip of the iceberg. There are now hundreds of platforms, frameworks, libraries, and tools that make it many times easier to design, implement, and operate a range of features and functionality, much of which nobody had conceived of 20 years ago. In addition, you can hire a trained programmer living in some exotic (but very cheap) location for less than the cost of an experienced babysitter in the United States. Seriously.
Consider all this together, and some people start to conclude that software development resources are a commodity, or soon will be. A case can be made, it seems, that the natural evolution of technology is leading toward a software utopia where anyone with an idea can assemble complex applications via point and click. For those few projects that might need a real live programmer, the assumption is that you can recruit from the armies of cheap offshore developers for a quick and easy solution.
So then, are software developers becoming a commodity? Not by a long shot.
What we see is a natural evolution, but it does not end with software development resources slowly becoming extinct. These self-service hosting platforms allow non-technical users to build sites and applications using patterns that were established and proven long ago, such as marketing sites, blogging platforms, online stores, and social communities. Almost anyone can use them to create unique and possibly groundbreaking content, but the platforms conform to rules (written by developers) that provide only a particular set of features. Extending or improving those features requires… you guessed it, more programmers.
In other words, it is now extremely easy to repeat what has been done many times before. But to truly innovate, to create something new in the world of software (and ultimately create real value), requires skilled development resources. And all the tools and frameworks being created and constantly improved don't replace programmers; they serve them. Without the knowledge and skill to leverage such tools, those tools are useless; a nail gun makes the job easier, but it can't build a house without the carpenter.
It is true that you can hire offshore developers for less money than U.S.-based resources, but this fact by no means transforms the role into a commodity. There are many wonderful, highly skilled offshore programmers, but they are not the ones with bargain-basement price tags. Those who do come cheap are usually relatively inexperienced and require a high degree of management and technical direction. And regardless of the skill level of an offshore team, there is generally extra management overhead associated with differences in time zone, language, and culture. So while there are situations where it may make sense to send development offshore, it is still a crucial decision that can make or break the success of any project.
All that said, certain advances have definitely shifted the landscape of technical resource requirements, such as the rise of managed server and cloud hosting. This has reduced the need for some organizations to employ dedicated system and network administrators, though those roles remain crucial for the hosting companies themselves (and for larger organizations with complicated infrastructures). Other technologies like Google App Engine go a step further in offering Platform as a Service (PaaS), removing the need for any type of infrastructure administration, including the database. This is exciting and very promising for certain types of applications, but it does come with a number of limitations. And of course, the primary ingredient in building a quality Google App Engine project is a skilled software developer.
Looking forward, will the situation be different in ten or twenty years? The pace of change and innovation will continue to accelerate, aided by the explosion of open source tools, libraries, and frameworks (as well as the trend toward open commercial APIs, such as those provided by Facebook and Twitter). The specific technical skills that are most relevant, and thus in highest demand, will continue to evolve. Non-technical users will be able to create ever more powerful applications, but always within boundaries that only programmers can exceed. As software of all varieties becomes ever more central to our lives, the development resources needed to create and maintain that software will remain critical – and will never become a commodity.