Posted on 22nd March 2024
Building good technology -
Much more than writing code
I believe coding itself is entirely joyless... unless you are producing something meaningful to you or others around you. There are certain related factors I've come to understand that determine how fulfilling I find software engineering work. In this post I hope to explain some of these factors and how they can either motivate, or contribute to a malaise in, the software development workforce, particularly amongst engineers.
Tools and 'developer experience'
Working with modern technology stacks can be productive, but it can also be painful. We have a juxtaposition of amazing capability that allows us to build and roll out software for a variety of platforms, and yet product development often ends up an annoyingly convoluted process. Usually we're not building anything that requires the complexity of the tooling afforded to us. I personally find that the fewer tools you need to build something, the more fun the process is, and the easier it is to explore and understand the solution space.
Your value as a developer is not whether you know .NET, React or AWS; it is in what you've actually worked on and how you went about it.
The more components we pull in from various sources (purporting to be generic solutions for what we are trying to achieve), the more difficult our job seems to become, and the further the results drift from an elegant solution.
Building things in plain HTML, CSS and JS is the closest many new developers come to the enjoyable process of hacking around in the interpreted coding environments I started learning in. But before long, junior developers are encouraged to 'upskill' and learn the latest web framework fast. What is largely forgotten is developing a deeper understanding of the craft of working with technology to help people do things.
I'm not advocating for reinventing the wheel, merely that we be meticulous about deciding on the appropriateness of a particular type of wheel, and the need for it in the first place, to achieve our goals. While choosing a widely used framework has advantages in that developer recruitment and onboarding may be easier, there should not be an assumption that it is absolutely necessary from the start. The latest hot framework changes every few years anyway. Adding third-party tech later is much easier than trying to extricate yourself from it once you've embedded it into the foundations of your product.
In startup parlance I would argue that to truly create the minimum viable product, your starting point should be the minimum viable technology required to build it.
In short: the developer and user experience would be far better if we didn't bring in any moving parts until absolutely necessary. If it's possible to download a project's files and run it in the browser without the need for any additional tooling or platforms, you absolutely should allow other developers to do this.
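To illustrate, here is a minimal sketch of what that can look like. It's a hypothetical project (the file names are just examples): a single index.html you can open straight from disk, pulling in a plain stylesheet and a plain script, with no package manager, bundler or build step in sight.

    <!-- index.html: open this file directly in a browser; no build step, no installation -->
    <!DOCTYPE html>
    <html lang="en">
    <head>
      <meta charset="utf-8">
      <title>Example project</title>
      <link rel="stylesheet" href="style.css"> <!-- plain CSS, no preprocessor -->
    </head>
    <body>
      <main id="app"></main>
      <script src="app.js"></script> <!-- plain JS, no bundler or transpiler -->
    </body>
    </html>

That is the whole 'toolchain': a text editor and a browser.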
A note to the engineers among us:
Much as we all bemoan JavaScript, it is a fundamental building block of the web. By all means avoid it completely if you can, but recognise that using Node.js, TypeScript and the like is still using JavaScript, with added complexity on top. Use plain JavaScript, sparingly, and it can be your friend.
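To show what I mean, here's a rough sketch in plain browser JavaScript of the sort of job we often reach for a framework to do: fetching some data and rendering it. The endpoint and field names are made up for illustration.

    // app.js: a hypothetical example using only plain browser JavaScript.
    async function renderTasks() {
      const response = await fetch('/api/tasks');   // assumed endpoint
      const tasks = await response.json();
      const list = document.createElement('ul');
      for (const task of tasks) {
        const item = document.createElement('li');
        item.textContent = task.title;              // assumed field name
        list.appendChild(item);
      }
      // swap the rendered list into the page
      document.getElementById('app').replaceChildren(list);
    }
    renderTasks();

No transpiler, no virtual DOM, and nothing to install before another developer can read or run it.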
Unrecognised talent of users and support staff
The longer I've worked in the IT industry, the more I have come to understand that it is people who actually do the heavy lifting of getting work done, not systems. There are certain roles that may, on the face of it, seem entirely redundant and 'automatable' with today's technology, but which are actually still most cost-effective when filled by a human being.
It can be forgotten how often people plug the gaps in technology that is not entirely fit for purpose; transforming, summarising and generating data that the technology can't readily provide nor easily be adapted to accommodate. Some people are great at finding workarounds for the technology's deficiencies. It can definitely be the case that the cost, benefit and risk of modifying the technology does not stack up against the skilled employee who already diligently gets the job done.
Additionally, those employees often have insight into how systems, both technical and operational, can be improved. Tapping into their knowledge is key to making systems more efficient, and yet when it comes to 'cost-cutting' drives, many executives never seek these employees' crucial input. In fact, executives are usually thinking about ways to eliminate those staff from the payroll altogether.
It is unfortunate that after decades of hailing concepts like the 'Toyota Production System', or Scrum and agile methodologies, where process improvement is baked into the workforce from the ground up, we still often neglect to implement their principles systemically throughout our organisations.
Losing touch with our customers
The culture and economics of modern professional services seem to increasingly drive a wedge between the 'producer' and the 'consumer'. As soon as you start dealing with any customer larger than a micro-business (the kind of company that usually has very little budget for professional services anyway), you end up working within systems that tend to make it harder to deal with real people directly.
I am convinced that the further away from a customer (metaphorically speaking) a business providing professional services is, the harder it is to successfully deliver the solution they require. That is an observation drawn from years of experience. What I have also learned in that time is that this element of modern work is not getting any better.
Technology is not separate from a business today; it is a core part of it. But this also means that the business, and most importantly the people operating within it, are not separate from the technology. They live and breathe it every day, and every digital pathway affects real people's lives.
You cannot build technology in isolation from its users and an organisation's business practices. Neither can you implement change in technology or operations without one thing impacting the other. Designing and building technology should not feel like churning out widgets in a factory.
Why do large IT companies and projects regularly fail their customers, often resulting in an unsatisfactory 'resolution' after implementation and usually causing a lot of pain for users in the process? It is my strong belief that it is because we do not pay attention to communicating effectively. That may relate to a missed requirement, a requirement a developer didn't properly understand, or a feature that wasn't adequately tested. But sometimes it is more fundamental, such as the voices of people affected by these projects not being heard or included in the process of designing them.
At the heart of this is a lack of user focus, an aspect of development usually supported by designers and analysts. Trimming away the budget for these roles has definitely contributed to the problem. But in fact the success of every role in the technology industry hinges on effective communication, and I believe the lack of it to be a wider cultural problem across modern professional services, not just a cost-cutting issue.
Inadequate testing
No amount of testing can account for all possible scenarios. Much like cyber-security, the efforts in testing software do have to be proportional to the risk. But testing is an area of software development that is rarely adequate. There are many different forms of testing, but even with a well thought out testing strategy it is possible we are not verifying the correct things.
This is an area where businesses tend to look for a 'silver bullet' to cut costs, going all-in on one particular form of testing. Unit testing and 'test driven development' can be important, but they are not a complete solution. They can place an additional burden on existing developers if the business opts not to hire additional QA staff to help coordinate automated testing. At the other end of the spectrum, manual testing is incredibly important too and can be cheaper to fund, but it rarely helps to quickly identify the causes of problems and doesn't help to prevent them at source.
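As a small aside, automated checks don't have to mean heavyweight tooling either. Here's a rough, hypothetical sketch of a framework-free unit test using nothing but console.assert; it illustrates the one thing automated tests do well that manual testing rarely does: pointing at the exact function that broke.

    // tests.js: a hypothetical, framework-free unit test.
    function formatPrice(pence) {            // assumed function under test
      return '£' + (pence / 100).toFixed(2);
    }
    // each failed assertion names the exact expectation that was not met
    console.assert(formatPrice(0) === '£0.00', 'zero pence should format as £0.00');
    console.assert(formatPrice(1999) === '£19.99', '1999 pence should format as £19.99');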
Then there is the fact that customers are usually expected to carry out some level of testing of solutions developed for them. This is important too, but the boundaries of responsibility can become very blurred by this process, regularly leading to disputes about what were or were not valid acceptance criteria - because often none were ever defined in the first place.
A note to the executives among us:
In the worst case, the lack of clarity caused by poor involvement of and communication with users, combined with inadequate testing, can result in significant oversights and serious problems, sometimes negatively affecting real people's lives and costing eye-watering sums of money to resolve legally. Senior executives need to understand the risk of not fully understanding the impact of technological change. It is a moral responsibility which, if neglected, can land an organisation, and ultimately its leadership, in hot water.
How do we fix this?
The naive solution is to increase the number of staff on projects and make sure some of them are given testing and analytical / design roles, which would definitely go some way to help. But often discussions around this kind of investment boil down to 'we don't have the budget for that'. So businesses prioritise employing what they see as the most valued employees in the process: the software developers, justified by the fact that they are the people you actually need to 'write code'.
What is forgotten in this argument is that you might as well flush the seven- or eight-figure sum you are spending on developers down the toilet if there are not enough people on the team who can communicate or develop an understanding of the customer, the users and their needs, or if you are not hiring people, or giving them the time, to test the output thoroughly. While some developers can contribute to these aspects, on anything other than a trivial project they cannot be solely responsible for design and testing as well as development. Addressing this issue obviously goes hand-in-hand with more honesty and transparency about budgets, costs and what customers are getting for their money.
Creating a culture where we gain a better understanding of what we do and how to improve on it is also key. Allowing people the space to have discussions with stakeholders and properly summarise and document the findings and learnings is important. It is also actually a part of the job of developers, testers and designers. It's just a part of the job they are not often given the time or agency to carry out effectively.
People 'on the ground' can advocate for these things as much as they like, but it is up to managers and leaders to truly foster this cultural change. To some extent it doesn't matter what methodology you follow, as long as you create a space for reflection and discussion on process improvement. Encouraging useful artefacts and documentation to be generated as part of the whole development lifecycle will help.
I've already written several times about the importance of design and user experience. But these issues are wider and have a significant impact. Addressing them will improve the satisfaction of people who work in the industry, no matter their role.
There are only so many projects you can walk away from thinking 'well it was painful but we got there in the end, sort of', before you start to experience burn-out. At that point people either start looking for a new job where things might be different or eventually retire from the industry altogether.
I think a great many talented people are lost from the industry as a result of these problems.
Most developers don't want to be the person who is just thought of as the expert in some acronym that will not exist in a few years. Nor do they want the unenviable position of being the 'last person standing' when a product is in its maintenance phase, solely responsible for it when something goes wrong, usually at 3am on a bank holiday. They want to enjoy building and delivering stuff that is tangibly out there in the real world, affecting real people in a positive way and helping them to achieve things rather than frustrating them.
Some organisations and departments do recognise these issues and some do better at addressing them, but I have to be honest when I say many don't seem to be making much progress on them.
My personal position on being a 'coder'
Over a decade and a half on, I still get that buzz from being a part of delivering real change for real people. Even if it is just fixing a bug that a single individual has reported, it genuinely gives me a little shot of accomplishment every time. That's why I'm still working in the industry, and why in some capacity I will continue to do so, hopefully for some time to come.
But I have to admit I find firing up that IDE and staring at lines of code increasingly wearing. The factors outlined here seem to be getting worse, rather than better. So I can't help but feel that with every keystroke the code I'm writing is putting another brick into some impenetrable fortress of obscure hieroglyphs, separating the 'product' I'm working on from the people that might be using it.
This is not an attack on any single organisation's failings. The issues of poor developer experience, bad design and inadequate testing are all systemic within the industry. There seems to be a dearth of effective leadership at the top of the biggest organisations, which wield the power to reset some of these shortcomings across the sector.
What am I doing about this?
It's easy to be an armchair critic, but there are a few things I'm doing myself to try and contribute to a change. First and foremost I'm learning some user-centric skills and improving my understanding of UX and design in general. I'm actively designing things, both as part of the UX design course I'm studying and through tasks and exercises I set myself to develop those mental muscles that help me think more about users in my work.
Beyond that, the more I write about and learn in this space, the more I am building an understanding of where there are gaps in the typical process of working with and developing technology. In particular I have some ideas about how we could improve the link between our work and the end-user. The key, I think, is connecting the worlds of the 'technologist' and 'user'. If this does turn into an actual project I think it will be a long, slow burn, but I hope one day I can play a part in improving the process of making technology.
Distilling all this down into some simple recommendations
Analysing my own writing here, which I admit is a fairly lengthy stream of observations, there are a few recommendations I can boil it down to. Addressing them would certainly help anyone in the industry, whether on their own, with their colleagues, or across the organisations they work for.
So, in no particular order, here are some principles that can be applied which might just get more satisfactory results and make a job writing code more appealing:
- Start with as few tools, platforms and third-party components as possible, and add them sparingly, only when necessary
- Communicate early and often with the most diverse range of stakeholders possible
- Build feedback loops into all processes to allow for continual improvement
- Generate and curate documents and artefacts to capture valuable information not embedded in designs or code
- Test your developments, and regularly 'test your testing' to determine if your test strategy is still adequate
- Above all, do everything you can to improve the communication and flow of information between the people using the product and the people creating the product.