I've worked on projects with no testing, with development-driven testing (DDT, tests written after the code), and with actual red-green-refactor TDD, and what you describe is something I might have written before trying actual TDD.

> I’ve rapidly developed a prototype in 2 months that we will be testing with a company they consult with. The software will be used with 4-8 employees who will be testing it.

Is the source code for the prototype going to be thrown away before the beta? If so, you may have been right to avoid TDD, at least for the simple parts of the code base - the point of such a prototype is to learn, or to prove that something is doable. If not, then you have written legacy code, and you will find that the difficulty of making changes without breaking anything rises very fast with the complexity of the software. This has been the case for every piece of non-TDD code I have ever seen, and a lot of DDT code.

> I had a meeting last night that solidified 4 features/concerns that, if I had done TDD for, would have wasted approximately 8 hours, as the tests and the code would have been thrown out.

This is a common straw man argument. First, the effort spent on non-TDD work isn't equivalent to what the same features would have cost with TDD, because you would have ended up with different code. You'll have to trust me on this: I've never seen non-TDD code for which it was easy to write automated tests afterwards. Second, my conclusion from actual TDD is that it is faster to write tests and features together than to just write features.
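To make that first point concrete, here is a minimal Python sketch of the kind of difference I mean; the domain, names, and overtime rule are all hypothetical, not taken from the question:

```python
import sqlite3


# Code written without tests in mind tends to reach straight for its
# collaborators, so it cannot be tested without a real database.
def total_overtime(employee_id):
    connection = sqlite3.connect("production.db")  # hard-wired dependency
    rows = connection.execute(
        "SELECT hours FROM shifts WHERE employee = ?", (employee_id,)
    ).fetchall()
    return sum(max(hours - 8, 0) for (hours,) in rows)


# Writing the test first tends to push the dependency outward, leaving
# a pure function that is trivial to exercise in a test.
def total_overtime_from_hours(hours_per_shift):
    return sum(max(hours - 8, 0) for hours in hours_per_shift)


assert total_overtime_from_hours([10, 7, 9]) == 3  # 2 + 0 + 1 hours
```

The two functions compute the same rule, but only the second one can be covered by a fast, deterministic test - and you only arrive at the second shape naturally when the test exists before the code.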

> I’ve done about 200 hours of development, and would probably have about 150 hours of testing (complex industry rules to test!), but I have barely been keeping up with the development work as it is.

Again, in my experience your velocity would be higher if you were doing TDD, even under changing requirements.

> It seems to me that testing before beta would have been a huge time sink, especially because requirements are changing.

And again, this is the typical view if you simply add up the time to write the feature and the time to write the tests. But in my experience DDT looks something like this:

  1. Write code in 1 unit of time
  2. Write tests in 0.5-2 units of time

Your application is now 1 unit harder to modify when you implement the next feature. TDD, on the other hand, looks something like this:

  1. Write tests in 1 unit of time
  2. Write code in 0.1-0.5 units of time, because by now you should know exactly what needs to be changed and how
  3. Refactor in 0.1-0.2 units of time

Your application is now 0.2 units harder to modify. These numbers will be very different if you start testing a legacy application, but you should see a marked improvement in time to change within tested parts of your application.
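To show what one pass through that cycle can look like, here is a minimal Python sketch using the standard unittest module; the overtime rule and every name in it are made up for the example:

```python
import unittest


# Red: first write a failing test that pins down the next requirement.
class TestOvertimePay(unittest.TestCase):
    def test_hours_over_eight_pay_time_and_a_half(self):
        self.assertEqual(overtime_pay(hours=10, rate=20), 60)

    def test_no_overtime_at_or_below_eight_hours(self):
        self.assertEqual(overtime_pay(hours=8, rate=20), 0)


# Green: write just enough code to make the tests pass - the tests
# have already told you exactly what needs to change and how.
# Refactor: with the tests green, clean up code and tests safely,
# re-running the suite after each small step.
def overtime_pay(hours, rate):
    return max(hours - 8, 0) * rate * 1.5


if __name__ == "__main__":
    unittest.main()
```

The ordering is the point: the test pins down the interface and the behaviour before any implementation exists, which is where the 0.1-0.5 figure above comes from.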

On the flip side, actual TDD is hard. It takes a skillful mentor to demonstrate it, and it takes a long time to learn all the techniques you need in order not to end up with an unmaintainable set of tests. I think I can honestly say it's the hardest thing I've learned as a programmer, and I'm not done learning - in particular, keeping large amounts of setup context out of individual tests is hard.
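As an illustration of that last point, one common remedy is a test data builder: defaults hide the irrelevant setup so each test shows only the detail it actually checks. A hypothetical sketch (all names invented):

```python
import unittest


def make_employee(**overrides):
    """Test data builder: sensible defaults, override only what matters."""
    employee = {"name": "Anne", "role": "clerk", "hours": 8, "rate": 20}
    employee.update(overrides)
    return employee


class TestOvertimeRule(unittest.TestCase):
    def test_only_hours_worked_affect_overtime(self):
        # Only the field under test appears in the test body; the
        # builder hides the rest of the setup context.
        employee = make_employee(hours=10)
        self.assertEqual(max(employee["hours"] - 8, 0), 2)


if __name__ == "__main__":
    unittest.main()
```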
