Will ChatGPT replace software developers? Or will it make their jobs more enjoyable by doing the heavy lifting when it comes to writing boilerplate code, debugging and testing? David Adams speaks to developers already using ChatGPT
Amid all the coverage of the uncanny and sometimes alarming capabilities of ChatGPT, less attention has been paid to its potential uses in software development. Yet this may turn out to be one of its most socially and commercially significant applications – if developers understand and mitigate the risks it could create.
Many developers are already using ChatGPT as a kind of research assistant, a source of useful code snippets, or a debugging and testing tool. Many also use GitHub’s Copilot coding tool for similar purposes, but ChatGPT’s ability to respond to conversational requests increases its value to developers.
‘Our experience using ChatGPT so far is that it can be compared to a very confident, but sometimes wrong, person at a pub quiz’
Dave Colwell, Tricentis
Peter Zentai, CTO at software development company OneBeyond, says he is “blown away by OpenAI and GPT”. “It can suggest a better method and that helps you to learn,” he explains. “It is also good for documenting functionality [of code].” And wherever it is used in development, “it gives you a productivity boost”.
But developers also recognise the need to retain human oversight. John Smith, CTO, EMEA at application security company Veracode, describes using ChatGPT to help write a tool to support a client demo.
“It came back with some code which I could take and modify,” he recalls. “Without it I would almost certainly have found someone on Stack Overflow with an answer, but I think ChatGPT came up with more useful content: not only did it give me the code, but it described how the code worked.”
A second enquiry also generated a useful answer, but a third question, about automating activity within a DevOps platform, revealed the model’s limitations. “It had a good answer but didn’t tell me that once you have asked the DevOps platform to create an object you then have to regularly ask if it has created the object. I knew from experience that if you don’t, and then go ahead and try to use it, you would sometimes get an unpredictable bug.
“To me that highlights the good and the bad of ChatGPT,” he continues. “If you’re able to see that the answer doesn’t quite work then you can [correct it]. The challenge comes when those mistakes are more subtle.” For this reason, he says, developers must never “blindly trust” the model.
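The pitfall Smith describes is common to APIs that create resources asynchronously: the creation call returns straight away, and the client must poll until the object actually exists before using it. A minimal sketch in Python of the polling step (the `get_status` callable is a hypothetical stand-in, not any real DevOps platform’s API):

```python
import time

def wait_until_ready(get_status, object_id, timeout=60.0, interval=2.0):
    """Poll get_status(object_id) until it reports 'ready', or give up
    once `timeout` seconds have elapsed. Returns True if the object
    became ready in time, False otherwise."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if get_status(object_id) == "ready":
            return True
        time.sleep(interval)  # pause between checks rather than hammering the API
    return False

# Example with a fake platform that only reports ready on the third poll:
calls = {"n": 0}
def fake_status(_object_id):
    calls["n"] += 1
    return "ready" if calls["n"] >= 3 else "pending"

print(wait_until_ready(fake_status, "obj-1", timeout=10, interval=0.01))
```

Skipping this wait, and using the object as soon as the creation call returns, is exactly the kind of omission that produces the intermittent bugs Smith mentions.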
This is arguably even more important if ChatGPT is being used to debug or test software, says Dave Colwell, vice-president for artificial intelligence and machine learning at automated software testing company Tricentis.
“Generative AI can be used to write basic software test cases, or identify gaps in testing requirements,” he says. “But our experience using ChatGPT so far is that it can be compared to a very confident, but sometimes wrong, person at a pub quiz. It always needs to be double-checked against existing knowledge. Recently, we tasked it with rewriting a simple algorithm in C#, using Vector numerics – a method for speeding up the algorithm using data parallelisation on the CPU. The algorithm looked fine and was very fast. But it assumed vectors had a fixed length, whereas that was only true in some circumstances. This caused a bunch of hard-to-find, insidious bugs.”
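The class of bug Colwell describes can be reproduced in a few lines: a “vectorised” routine that processes data in fixed-width chunks and silently drops the remainder whenever the input length is not a multiple of the chunk width. A Python sketch (the chunk width of 4 is an illustrative stand-in for the hardware-dependent SIMD vector length; Tricentis’s actual C# code is not public):

```python
def chunked_sum_buggy(data, width=4):
    """Sum `data` in fixed-width chunks, as a naive SIMD-style loop might.
    Bug: any trailing elements beyond the last full chunk are ignored."""
    cutoff = len(data) - len(data) % width
    total = 0
    for i in range(0, cutoff, width):
        total += sum(data[i:i + width])  # stand-in for one vector operation
    return total

def chunked_sum_fixed(data, width=4):
    """Process full chunks, then handle the leftover tail scalar-wise."""
    cutoff = len(data) - len(data) % width
    total = 0
    for i in range(0, cutoff, width):
        total += sum(data[i:i + width])
    total += sum(data[cutoff:])  # the tail the buggy version drops
    return total

data = list(range(10))          # length 10 is not a multiple of 4
print(chunked_sum_buggy(data))  # silently drops elements 8 and 9
print(chunked_sum_fixed(data))  # matches sum(data)
```

Both versions agree whenever the length happens to divide evenly, which is why such bugs pass casual testing and only surface later – the “insidious” quality Colwell refers to.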
ChatGPT security risks
There is also some debate about the extent to which use of ChatGPT can create security risks. In March 2023, OpenAI took ChatGPT offline temporarily after detecting a data breach caused by a bug in an open-source library. In a blog post published a few days later, OpenAI admitted the incident may have caused “unintentional visibility of payment-related information” for a small number of users, and that it was “possible that the first message of a newly created conversation was visible in someone else’s chat history if both users were active around the same time”.
The potential for the latter sort of vulnerability worries Smith. “That could have been a leak of intellectual property,” he says. The problem faced when trying to secure ChatGPT, he suggests, is that an attacker does not “need to attack the AI; [they] need to attack how it is accessed”.
But ChatGPT could also help developers improve software security, argues Jason Kent, hacker in residence at Cequence Security. He has written a blogpost, ChatGPT and API Security, highlighting the way ChatGPT could be used to debug or to find security flaws in APIs, for example.
ChatGPT legal risks
Developers should also monitor legal risks. GitHub Copilot is currently the subject of a class-action suit in the US, linked to licensing and original authorship of the open-source code upon which it is based. “Depending on the outcome … software developers in the UK and elsewhere could find themselves on the receiving end of similar legal actions brought by the developers of the original open-source code used to train Copilot or ChatGPT,” warns Karl Barnfather, partner and patent attorney at law firm Withers & Rogers.
But the perennial fear about new AI does not yet seem to apply: ChatGPT will not replace software developers. Instead, it could make their jobs more enjoyable, by absorbing or accelerating boring parts of the role, such as writing boilerplate code, debugging or testing.
Vadym Novakovskyi, a senior Java software engineer at Sony Europe and a tutor at online education platform CodeGYM, does fear ChatGPT could remove parts of some junior developers’ jobs, while adding to senior developers’ responsibilities. “We already have a problem of there being not enough developers in the world – I think this problem will become worse,” he says.
Nick Durkin, CTO at software delivery platform company Harness, is more optimistic. “When used properly, generative AI has the power to make good developers great,” he claims. “For developers with less experience, collaboration with AI increases what they’re able to accomplish.”
But whether ChatGPT becomes a mainstay of the technology landscape, or a footnote in tech history, what is certain is that generative AI will continue to improve, says Colwell. “In the near future, we will see generative AI able to express its level of confidence, or ask follow up questions, increasing the likelihood of improving its outcome,” he predicts. As these capabilities evolve, he suggests “its ability to be ‘human’ – unsupervised by an expert – is inevitable.”
Until then, these technologies can be a boon to software development. And by the time we reach that point, who knows what else generative AI may be able to do? Its impact on software development processes may then be the least of our concerns…
More on what ChatGPT means for developers
What is generative AI and its use cases? – Generative AI is a technological marvel destined to change the way we work, but what does it do and what are its use cases for CTOs?
ChatGPT vs alternatives for programmers – ChatGPT and its alternatives are set to be especially useful for programmers writing code. But just how reliable is the technology for developers? Antony Savvas considers what’s available and what the alternatives are
The challenge of using ChatGPT for search engines – Large language models (LLMs) such as ChatGPT may be emerging as complements for search engines, but there are still pitfalls to consider
Will ChatGPT make low-code obsolete? – Romy Hughes thinks that ChatGPT could do what low-code has been trying to achieve for years – putting software development into the hands of users
How to embrace generative AI in your enterprise – What are the use cases for embedding generative AI in your enterprise? How can it help ease the burden of repetitive admin? What are its limitations?
Original post: https://www.information-age.com/what-chatgpt-means-for-developers-123502845/