My reason for titling this post as such and posing it as a question is the uncertainty that the future of AI poses. This post focuses specifically on the implications of AI for the job of software developers. We’ve already seen impressive code generation by large language models (LLMs), leaving many developers to question what the future of our role as engineers may look like. This is a topic I’ve pondered quite a bit lately, and it’s been far too long since I wrote a blog post, so I decided to share my perspective. I am by no means an expert in artificial intelligence, but I am an experienced engineer, and this is a field of research that I have studied and am compelled by.
NOTE: Since I wrote this article, GPT-4 has demonstrated the ability to self-repair incorrectly generated code. Naturally this changes my view on the matter, and I am confident that what I describe in this article will be a phase prior to us reaching a point where LLMs generate a complex code base with relatively minimal human interaction. Only time will tell.
The Transformer model released by Google in 2017 was certainly a game changer in the realm of deep learning. Due to the additional context this architecture can utilize compared to a more traditional approach such as recurrent neural networks (RNNs), we’ve seen huge improvements in newer LLMs built on the Transformer. This is rapidly changing the field of natural language processing (NLP) in general. Many developers already use tools like GitHub Copilot, which is backed by OpenAI’s Codex system, a model created with the focus of translating natural language to code.
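To make the natural-language-to-code idea concrete, here is a minimal sketch of the style of prompt such tools complete: the developer writes a descriptive comment, and the model fills in the body. The function name and task here are hypothetical, purely for illustration.

```python
from collections import Counter

# Prompt written by the developer (a natural-language comment):
# "Return the n most common words in a string, ignoring case."
# The body below is the kind of completion a Copilot-style tool produces.
def most_common_words(text: str, n: int) -> list[str]:
    words = text.lower().split()
    return [word for word, _ in Counter(words).most_common(n)]
```

The completion above happens to be correct, but as discussed later, nothing guarantees that in general; the developer still has to read and verify it.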
My view is that within the coming years, more tools like this will be created, and the role of a software engineer will shift. I find this both exciting and a bit dreadful, because I must confess I don’t mind writing my code from scratch. If a tool such as Copilot is able to include the context of the code base it’s operating in, alongside the source code of dependencies, the industry will change rapidly. Although it’s difficult to put a timeline on these things and I’m hesitant to do so, I believe this will be achieved in five years at most, taking into account Microsoft’s ownership of GitHub and its large investments in OpenAI. I don’t currently use Copilot, as I enjoy writing the code myself and fully comprehending it, alongside finding the ideal solution to complex problems. Based on my anecdotal experience and research, neither Copilot nor ChatGPT (which I do use often) is currently capable of solving problems I perceive to be complex. I still acknowledge that, given some development time, anybody who isn’t using Copilot will be at a huge disadvantage compared to developers who choose to use it. I don’t think the job of a programmer will ever disappear, so now that we have a good enough picture of the current state of code generation, let’s get into how the job will change.
What will we be doing?
I strongly believe that in the coming years these code generation tools will see widespread adoption amongst developers, perhaps even mandated by employers as part of your development environment. In the near future we’ll be using the backing of things like OpenAI Codex to simply describe the code we want written, and then perhaps manually writing a unit test for that code (or generating one). It is vital that we programmers stay vigilant when adopting these tools, as it’s impossible to guarantee that any generated code is completely correct. Our job will be to validate that the code does what is intended, look for edge cases the LLM may have missed, correct any mistakes, and refactor where necessary. I’m very curious to see how the industry evolves, and this is my view of how AI will change the role of a software engineer in the near future.
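The review workflow described above can be sketched as follows. This is a hypothetical illustration, not output from any real model: `average` stands in for plausible-looking generated code that handles the happy path but misses an edge case, and the developer's job is to catch that with a test and harden the code.

```python
def average(numbers):
    """Stand-in for LLM-generated code: correct on typical input."""
    return sum(numbers) / len(numbers)  # but crashes on an empty list

def checked_average(numbers):
    """The developer's hardened version after probing edge cases."""
    if not numbers:
        raise ValueError("average of an empty sequence is undefined")
    return sum(numbers) / len(numbers)
```

A unit test exercising the empty-list case is exactly the kind of check that turns generated code from a draft into something trustworthy.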