If it hasn't been evident from all the articles published here on Dev Tester over the past year and a half, I love talking about testing. Whenever a thought about testing pops into my head, I write it down and see if I can turn it into a full-blown article. While I enjoy the writing process, it's a lot more fun when I get the opportunity to talk with others about their practices, either online or in person.

Most of the test-related discussions I have with my fellow testers and developers revolve around the work we're doing at the time. We discuss our current projects, things we've learned recently, and sometimes - okay, most of the time - we vent about things we wish our teams could do better. Usually, our discussions land on which best practices we follow to improve our daily work.

While most of my days are spent developing software, I sometimes spend longer stretches of time automating tests than building new functionality or fixing bugs. As a software engineer who emphasizes test automation in all the development work I do, I often receive the following question: Do you do test-driven development (TDD)?

My answer is straightforward - 99% of the time, I don't do any test-driven development.

For some reason, this candid response takes others by surprise. When I ask why - especially when I know the other person doesn't do much test automation themselves - it turns out they simply assumed I was a hardcore TDD practitioner. With all the talk I do about testing and automation, they figure I'm all in when it comes to test-driven development.
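For anyone who hasn't practiced it, test-driven development means writing a failing test before the code that makes it pass, then refactoring with the tests as a safety net. Here's a minimal sketch of that red-green-refactor loop in Python - the slugify helper is made up for illustration, not code from any real project:

```python
import unittest

# Step 1 (red): write the tests first, before slugify exists.
# Running them at this point fails - that's the point. The tests
# define the behavior we're about to build.
class TestSlugify(unittest.TestCase):
    def test_lowercases_and_hyphenates(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_strips_surrounding_whitespace(self):
        self.assertEqual(slugify("  Dev Tester  "), "dev-tester")

# Step 2 (green): write just enough code to make the tests pass.
def slugify(title: str) -> str:
    return "-".join(title.lower().split())

# Step 3 (refactor): clean up the implementation, rerunning the
# tests after each change to confirm nothing broke.

if __name__ == "__main__":
    unittest.main()
```

Repeating that loop for every piece of functionality is what people imagine when they picture a hardcore TDD practitioner - and it's exactly the discipline I rarely follow.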

It's not just TDD, though. When it comes to best practices surrounding software development and test automation, there's a lot that I often don't do. For most projects, I've rarely dealt with formal test plan documents. I rarely set up reporting for most of my testing tools beyond the basics. I sometimes even skip writing any tests until later in the project's timeline.

Despite my desire to always have well-tested applications and to encourage others to do their part when it comes to testing, why don't I always follow the practices that would help achieve those ambitions? After years of working on different projects and teams, it boils down to two main reasons.

Reason #1: Time is finite

To me, one of the worst excuses for skimping on automated tests is "We don't have enough time." Often, this reasoning is nothing more than a cop-out, a way for project members to justify their lack of testing. Still, it isn't always an empty excuse. If you've worked on projects with zero breathing room built into the schedule, you know how impossible it feels to carve out time for testing in those moments.

In an ideal world, developers and testers would have sufficient time to do their regular work and have it fully tested, whether through automation or good ol' manual testing. But the reality is that organizations set overly ambitious schedules for most projects and leave the team to deal with the tight timelines. We barely have enough time to finish our tasks, so the team must make sacrifices somewhere.

Sadly, it's rare for our tasks to include testing as part of their definition of done. Testing is usually treated as an afterthought, performed after most of the team has moved on to other work. At best, you have developers creating unit tests or a group of well-organized testers who know what needs testing. At worst, it's a free-for-all where every individual decides what to test, and more often than not, that means not thinking about testing at all.

If you're a developer working on a project with a super-tight deadline, you'll barely have enough time to code up the new functionality or patch the bug. During these times, the extent of testing is a cursory run-through to make sure your code doesn't blow up. The last thing you'll want to do is pressure yourself to write more code that no one outside the immediate team will ever see. Of course, that code is essential for the codebase's long-term health, but the stress of a time crunch won't let you dwell on that thought for too long.

The main argument from those who are fully into best practices like test-driven development is that these practices make your work as a developer easier and more efficient. That might be true, but it depends on your situation. A project needs a clearly defined scope throughout its life for you to take full advantage of TDD and similar practices. In my experience, especially in smaller startups, you'll often face an unclear path on top of your deadlines. This fuzziness leads to the other reason why you don't always need to follow best practices.

Reason #2: No plan can survive the battlefield

On paper, the beginning of any software development project feels like a clean slate. With no constraints from existing work and neatly organized tasks, everything forms a crystal-clear picture of how we expect the project to turn out when it's completed. But if you've been a part of any project from the start, you know that crystal-clear image will soon become a murky, hazy mess as everyone digs into the work.

This issue isn't necessarily a bad thing. As we progress through a project, we're constantly acquiring new information. We learn how to do things better. We discover scenarios that others missed during the original planning sessions. We realize that the module someone's working on won't fit neatly into the grand scheme of things. Uncovering new details as we go along causes constant change throughout the project's life cycle, and it usually leads to a better result.

Inevitably, parts of the puzzle shift around, new pieces get added into the mix, and some get scrapped altogether. Focusing on best practices like test-driven development or test plans from the start can lead to a lot of wasted work. In the fast-paced world of software development, sometimes it's a good idea to get something done quickly to see how it fits into the project. Once there's a greater degree of certainty about what a successful project requires, you can always improve it later.

Practices like TDD and formalized test plan documents can still be a part of the development process, though. For example, some applications have deep business logic that's essential for the project. These areas will often have a clear definition from the start and will likely not change. If you're working on this logic (and you have the time), then these practices make sense to integrate early in the process. Otherwise, it's best to work with what you have and experiment early and often.
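As an example of the kind of stable, high-value logic worth pinning down early, here's a sketch of an early unit test for a tiered-discount rule - the rule and the order_discount function are hypothetical, invented just for illustration:

```python
import unittest

def order_discount(subtotal: float) -> float:
    """Discount rate per the (hypothetical) pricing rules:
    10% off orders of $100 or more, 20% off orders of $500 or more."""
    if subtotal >= 500:
        return 0.20
    if subtotal >= 100:
        return 0.10
    return 0.0

# Because this rule is central to the business and unlikely to change,
# tests written early here won't become wasted work when the rest of
# the project's scope shifts.
class TestOrderDiscount(unittest.TestCase):
    def test_no_discount_below_threshold(self):
        self.assertEqual(order_discount(99.99), 0.0)

    def test_thresholds_are_inclusive(self):
        self.assertEqual(order_discount(100), 0.10)
        self.assertEqual(order_discount(500), 0.20)

if __name__ == "__main__":
    unittest.main()
```

Contrast that with, say, a UI screen that gets redesigned three times before launch - tests written against the first design would have been thrown away twice.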

Still, the fact that project scopes change doesn't give you a free pass to put in sloppy work, thinking you'll always have time to fix things later. Letting small issues slide can lead you down a path of no return. You should always strive to do the best job you can within the time and information you have at hand, so that you're prepared if and when things change.

Software development is all about trade-offs with one goal

Some of the main disagreements my peers have had with past Dev Tester articles come from me saying that something isn't necessary for a successful project. For instance, a few weeks ago, I wrote an article saying that most teams don't need to spend time formalizing test plans. That article prompted a response about the necessity of test plans in any project. While I don't completely disagree with the author's thoughts on the subject, it reminded me that there's never a one-size-fits-all solution for any project.

One of the hardest lessons I had to learn as a developer was to let go of my idealistic tendencies. Early in my career, I prided myself on following best practices in everything I did. In isolation, there's no problem with striving to do the best work possible. However, that pride manifested itself in ways that harmed the rest of the team. I sometimes caused schedule slippage because I cared more about my own work than about what everyone else was doing. I would take exception to what, in my mind, was a blatant disregard for doing things "right", whatever that meant.

My low point came when I unintentionally berated a junior developer for not writing automated tests for some new functionality they created. I learned later that they were under incredible stress to finish their work because someone else was waiting on it to complete their own tasks. My words almost drove this developer to quit altogether. It was not my proudest moment, but the shame from that incident taught me a valuable lesson I carry with me every day.

Eventually, I learned that the most important thing in our work, whether as a developer or a tester, is to deliver the product to your customers. Good practices like automated testing and formal test plans help you get there. But none of it matters if those practices hold you back from shipping your project on time, when your customers need it the most.

One of the most popular sites out there, Stack Overflow, recently published an article titled Best practices can slow your application down. The article explains, among other things, how their team purposely sacrificed testability for speed while building the site. Given that the site handles tens of millions of visits a day, it shows that almost everything in software development and testing is a trade-off. As the article succinctly puts it, they're called best practices, not required practices.

Looking back at the different projects I've touched throughout my career, the most successful ones from a monetary standpoint are the ones with the messiest codebases. At first glance, it may look like the previous developers were sloppy and careless. But look deeper, and you'll find a history of compromises made under different circumstances. As your career progresses, you'll get better at managing these compromises yourself. No amount of best practices will navigate them for you.

Summary

We might often feel like we need to establish certain practices early in our projects to help us deliver the best possible product. For instance, you may want to focus on test-driven development and always cover your code with automated tests, or have formalized test plans from the get-go. However, these best practices aren't always necessary and can actually hurt your chances of delivering a successful project.

Most projects are under tight deadlines, requiring lots of work in a short timeframe. On top of that, you're likely to encounter new details as you dig deeper into your tasks, leading to changes in the project's planning and scheduling. If you spend too much time early in the process attempting to follow these practices strictly, you can find yourself running out of time to do the work itself.

Software development projects should have one major overarching goal - deliver a useful product into your customers' hands. While commonly used best practices can help, they're not always required. Almost every facet of software development involves trade-offs. You must be aware of them and willing to choose the best option for your current situation - even if it goes against your idealistic vision for your work.

Which best practices have you followed without realizing they might be slowing you down? Share your experiences with others in the comments section below!