The dangers of unit testing [modified]
-
OK, so for the last few days I have been writing a load of unit tests for some code that has already been mostly written (not wanting to be flamed for the obvious problem with this, just go with me here). The danger I am finding with these unit tests is that I am fast approaching a point where I would rather "refactor" so that I only have 1 method with a load of cut-and-paste code, so that I have fewer methods to write tests for... [Edit: To the people telling me why it's a bad idea to flatten the code like this, I did write *all* the original code myself, and do understand the point of methods and classes. I'm just fed up with this particular project, and no, actually it will not be me that has to maintain it, for reasons too complicated to summarise here :) ]
modified on Thursday, September 2, 2010 10:53 AM
-
c2423 wrote:
no, actually it will not be me that has to maintain it,
To me, this is exactly why you should leave behind the best code you can before you leave, not the opposite. Think about it: maybe one day it could be _you_ who has to take over the code from a frustrated guy who searched and replaced variable names with a1, a2, a3, ... Leave "fire and forget" to the military industry.
-
Writing unit tests is at most a one-time job, with only periodic maintenance over time. But if you do a bad job designing your actual code, it'll surely bite back later.
SG Aham Brahmasmi!
Agile Development - esp. Test Driven Design - seems to ride on a slew of assumptions that are completely ridiculous to me, e.g.
- Interfaces almost never change
- It's worth mocking databases, file systems, sound cards, external USB devices, analog test units (OK, I can mock Vista, that would already be 80% of an OS)
- I can build an application from BankAccount, Stack and List (seriously, have you seen any other example than one of these?)
Maybe interfaces don't change if you crank out forty-two customer-specific warehouse applications a year. Some of my interfaces change. A lot. Because we need it faster, because we need it with more data, because changing behavior while keeping backward compatibility is expensive. Also, there's a good amount of code that's just there to figure out that something doesn't work the way we hoped it would. There are a few things that make this inevitable: research, our own hardware on a multitude of computers, 3rd party hardware. It's amazing how many things can break after years of working perfectly, sometimes without any cause that could be tracked down in reasonable time. Agile is all about "the target isn't known yet". It's all about "we don't know yet what we need in two months"; it's about change (the non-obamic type).
Agh! Reality! My Archnemesis![^]
| FoldWithUs! | sighist | WhoIncludes - Analyzing C++ include file hierarchy -
peterchen wrote:
Some of my interfaces change. A lot. because we need it faster, because we need it with more data, because changing behavior while keeping backward compatibility is expensive
I have worked on an application that maintained backward compatibility for 9 different versions in about 10 years. Seriously?? And all of them are internal clients. Go figure.
peterchen wrote:
It's all about "we don't know yet what we need in two months"
And at the end of 6 months, it ends up in "This is not what we had in mind". I'm not the biggest fan of Agile. I'm convinced that it's a methodology that can work, but not for every type of project.
SG Aham Brahmasmi!
-
It's easy to go overboard on simple, basic unit tests that don't provide much value and just end up slowing down changes when you have to make them. Things like string parsing, complicated input validation, and complicated state machines are good candidates for unit testing. Thinking about how to make sure your code is simple enough to be tested effectively can also be a good thing. However, writing an interface for every single object in your app and tying everything together with an IoC container and a huge XML file is definitely not. Also, don't overlook the benefits of good integration and fuzz testing. I've probably uncovered more bugs in my code by pumping random numbers and strings into it in random orders than by anything else. And spending time making sure your exception/error handling code logs a small amount of useful information can pay off far more than making sure you have test coverage of every getter and setter in your app.
I can imagine the sinking feeling one would have after ordering my book, only to find a laughably ridiculous theory with demented logic once the book arrives - Mark McCutcheon
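The fuzz-testing idea above can be sketched in a few lines. This is a minimal illustration, not anyone's actual code: parse_int_list is a hypothetical stand-in for whatever routine you want to harden, and the only invariants asserted are the broad ones ("returns a list of ints, or rejects the input cleanly"), which is the essence of pumping random data into code.

```python
import random
import string

def parse_int_list(text):
    """Hypothetical parser: turn '1, 2, 3' into [1, 2, 3], ignoring blanks."""
    items = []
    for part in text.split(","):
        part = part.strip()
        if part:
            items.append(int(part))  # raises ValueError on junk, by design
    return items

def fuzz_parse_int_list(iterations=1000, seed=42):
    """Feed random strings in; assert only broad invariants, never exact values."""
    rng = random.Random(seed)  # seeded, so failures are reproducible
    alphabet = string.digits + string.ascii_letters + ", -"
    for _ in range(iterations):
        text = "".join(rng.choice(alphabet) for _ in range(rng.randint(0, 30)))
        try:
            result = parse_int_list(text)
        except ValueError:
            continue  # rejecting junk input is acceptable behaviour
        # Any other exception, or a malformed result, is a real bug.
        assert isinstance(result, list)
        assert all(isinstance(x, int) for x in result)

fuzz_parse_int_list()
```

Seeding the generator is the one design choice worth keeping even in a throwaway fuzz run: when a random input does break the code, you can replay the exact sequence.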
-
All good advice! I'm having to do quite a few tests because I have to run some data through a load of algorithms to get a result, and don't have any real data sources for another week (long story involving some broken servers) - so lots of unit tests and mock objects.
-
When you are almost done, you will discover a really basic flaw in one of your core algorithms that will require that you change all your code and that change will be such that all your unit tests will now be invalid and have to be fundamentally rewritten.
-
SG wrote:
I have worked on an application that maintained backward compatibility for 9 different versions in about 10 years.
I do not doubt that you can do that. However, I object to the notion that "changing interfaces are caused by stupid design."
Agh! Reality! My Archnemesis![^]
| FoldWithUs! | sighist | WhoIncludes - Analyzing C++ include file hierarchy -
I've been arguing for a number of days, trying to have shown to me any true value added by (so-called) agile design methods, test-driven design, and an acronym-based hell. I thought of myself as the coding version of a Luddite. No more! By and large, all the real work - the stuff that makes a program something useful - has to be done anyway. All that's really happened is that the effort has moved to a different location. And it's monstrous: reading a simple database and creating the classes (&etc. of whatever it's doing) created 698 files. All of these are based on conversion of the basic tables into classes. The real coding has yet to begin! Another peeve with this donkey-dropping methodology is that the tests are supposed to replace the documentation. Another )&)$#&(^#$ excuse not to make the code readable to anyone else. Whatever happened to just "knowing what you're doing and doing it right"? I think I'll end my rant and go out to watch the approach of hurricane Earl.
"The difference between genius and stupidity is that genius has its limits." - Albert Einstein
"As far as we know, our computer has never had an undetected error." - Weisert
"If you are searching for perfection in others, then you seek disappointment. If you are searching for perfection in yourself, then you seek failure." - Balboos HaGadol Mar 2010
-
As far as I have seen, agile and TDD are all too often used as excuses to just do the same poor quality work while justifying the lack of documentation. If followed properly and intelligently they *can* be applied to make better software.
What you are doing is not TDD. It is Test After Development. Where TDD is about testing that the requirements have been implemented, TAD is about testing the implementation, whether or not it does what is required. That being said, I unfortunately did a little TAD last week on some simple code I wrote. The first thing I discovered was that I couldn't write independent tests, as I had some hard dependencies, one of them indirectly on the database. A quick refactoring, moving responsibilities to the correct levels of abstraction, resulted in testable code. Additionally, the code was simpler and more robust. Oh yeah, I also found 3 or 4 bugs in the process. And this was in code that was working well. Keep going. You will end up with better code, and a suite of tests that will help you keep it that way. Just remember, you need to evolve your tests as you evolve your code. Matthew
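The kind of refactoring described above - moving a hard database dependency behind a seam so the logic can be tested independently - can be sketched like this. The invoice example and all names are invented for illustration; the point is the shape of the change, not the domain.

```python
# Before (untestable without a live database -- shown as a comment only):
#   def overdue_invoices():
#       rows = Database.connect("prod").query("SELECT * FROM invoices")
#       return [r for r in rows if r["days_late"] > 30]

# After: the data access is a parameter, and the policy logic is pure.
def overdue_invoices(fetch_invoices, threshold_days=30):
    """'fetch_invoices' is any callable returning invoice dicts --
    a real database gateway in production, a plain stub in tests."""
    return [inv for inv in fetch_invoices() if inv["days_late"] > threshold_days]

# The test now needs no database at all:
def fake_fetch():
    return [{"id": 1, "days_late": 45}, {"id": 2, "days_late": 5}]

result = overdue_invoices(fake_fetch)
assert [inv["id"] for inv in result] == [1]
```

Moving the responsibility ("how do I get invoices?") out of the policy ("which ones are overdue?") is exactly the abstraction-level shuffle that tends to make the code simpler as a side effect.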
-
Yeah, that's what I'm finding as I'm doing it. For the record though, I wasn't aiming at it being TDD. I'm trying to make sure that tests cover everything so that I can hand it all off to the next loser developer as cleanly as possible. As it happens I'm still on the fence a bit about TDD - I can get behind DbC as a methodology, but to write tests first and then code to conform to the tests still makes me a little uneasy. Something about it just doesn't smell right.
-
peterchen wrote:
Agile Development - esp. Test Driven Design - seems to ride on a slew of assumptions that are completely ridiculous to me, e.g. Interfaces almost never change It's worth mocking databases, file systems, sound cards, external USB devices, analog test units (ok, I can mock Vista, that would already 80% of an OS) I can build an application from BankAccount, Stack and List. (seriously, have you seen any other example than one of these?)
I'm not sure that it does ride on those assumptions. Mocking can indeed be painful, especially if you're trying to mock something that doesn't have an interface already. And on the subject of interfaces, it is annoying when you suddenly have to extract a bunch of interfaces just to aid testability. It's kind of like going back to C++ where you have separate header files and source files - too many files and too much boilerplate "stuff". The last time I used mock objects in a Java test, it was very ugly thanks to the excessive syntax, but there's probably a much better way that I'm ignorant about.
OTOH, to suggest that TDD assumes interfaces almost never change doesn't feel right. The whole point (to me) about TDD is to help in exploratory coding and to provide some level of confidence in the code you've written so far. And since I picked up the habit in 2006, I can tell you it's been a lot of fun (mostly) and that I still try to use TDD where possible (in Java and Ruby these days).
Also, there's absolutely nothing wrong with testing after the code has been written - if you inherit a codebase on a project, there's a strong likelihood that it will be poorly tested (or totally untested). Writing unit tests and refactoring as necessary (as the OP seems to be doing now) is a superb way of systematically working your way through the code and understanding how and why it works. Sooner or later your unit tests start to look like little use-cases which describe and validate the API you're testing. Bugs are uncovered, redundancy is made clear, benchmarking becomes reasonably straightforward (if necessary). Win-win! :thumbsup:
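How painful mocking is really does vary by language, as the post above suggests. As one data point, in Python the standard-library unittest.mock needs no interface extraction at all. The WeatherService-style dependency below is invented for illustration:

```python
from unittest.mock import Mock

class ReportBuilder:
    """Depends on an external service; the test verifies our logic, not theirs."""
    def __init__(self, weather_service):
        self.weather = weather_service

    def headline(self, city):
        temp = self.weather.current_temp(city)
        return f"{city}: {'hot' if temp >= 30 else 'mild'} ({temp} C)"

# No interface to extract, no boilerplate: Mock stands in for the real service.
service = Mock()
service.current_temp.return_value = 34

builder = ReportBuilder(service)
assert builder.headline("Cairo") == "Cairo: hot (34 C)"
service.current_temp.assert_called_once_with("Cairo")
```

Duck typing carries the cost elsewhere (nothing checks that the mock matches the real service's shape), so this is a trade-off rather than a free lunch.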
-
Don't think of the tests as tests of the code. In TDD, the unit tests are executable expressions of the client and system requirements. Passing the test doesn't mean the code works; it means that feature or requirement has been successfully implemented. I find when I properly use TDD, I end up writing less code and it works. The final lines of code may be greater if you include the tests, but the actual number of lines written, re-written, deleted and modified is much smaller. Plus, I can prove I've implemented the desired functionality and have a set of tests that help me ensure the functionality doesn't get broken when I make other changes. Matthew
-
The only danger of unit testing is the tendency to rely on it as a silver bullet. The main problem I see with novices is not understanding how to test only the thing they're testing. That, plus the frustration that comes from looking only at the short-term loss in velocity without considering the long-term gain once you get to be good at it and leverage it. Mocking frameworks (NMock, Rhino Mocks) will help to make your unit tests really self-contained units. Only time will solve the velocity problem. While it's interesting that you wrote all the original code, flattening the code will pretty much guarantee that it becomes harder to test, because you've reduced cohesion and introduced multiple code paths for something that probably used to be a collection of simple objects. This sounds like throwing away basically everything we've gained with OOP and going back to the purely procedural world circa 1960.
-
Be aware that I might be doing it wrong - just like almost everybody else. I agree with the notion that TDD is good for some projects, but not for all. However, there's no guideline for *which* projects it would be good. So what's the point until we can tell at least that?
Agile/TDD has some insights to offer: pair programming is more than lip service to the often-repeated fact that people matter more than technologies. Sprints - or whatever you call short cycles - can just mean coordinating changes so that you can deliver frequently. Automated unit tests do improve confidence in the code base.
However, I see limits. There are many stakes in code - simplicity, maintainability, performance, extensibility. Good design means that these go together - improving one also improves the others. Still, at some point, they start to work against each other. Testability is just another stake - for a long time, improving testability also improves other aspects, but it's nothing special; beyond that point you have to make tradeoffs. I have real problems with the "extreme" ends of TDD: I just can't write code that I know is wrong just to make a test pass. Frau Passig, commenting on capitalist vs. socialist economies, said some very wise words: she just can't understand how any kind of planning - no matter how bad - can be worse than no planning at all.
Agh! Reality! My Archnemesis![^]
| FoldWithUs! | sighist | WhoIncludes - Analyzing C++ include file hierarchy -
destynova wrote:
to suggest that TDD assumes interfaces almost never change doesn't feel right.
My point exactly. I often go beyond these "no-go-areas" just to provoke by asking for rationales for obvious and accepted facts. My policy: Write unit tests for what can be unit-tested easily, forget the stuff that is hard. I don't trust anything that is preached as "if X is a problem for you, you should use more X".
Agh! Reality! My Archnemesis![^]
| FoldWithUs! | sighist | WhoIncludes - Analyzing C++ include file hierarchy -
I find using helper classes a good idea. The less cut-and-paste copied code out there the better. But don't put this common cut-and-paste code into inheritance hierarchies; then it becomes too complicated to morph. The less duplicate code the better. Use ReSharper. Tim
-
Talking about best practices for unit testing online is not easy, because so many people are so quick to get on your case about what you shoulda done, etc. So good job for having the balls to bring this forth. The issue that you're talking about is that of coverage, if you think of coverage as executing lines of code with a particular program state. Whether a particular piece of production code is in a single method or multiple methods doesn't take away from having to write the unit tests that ensure the code is doing what it's supposed to do. Assuming you have 5 production methods, each of which can be easily tested with a unit test, then collapsing the 5 methods into 1 would not reduce the amount of test code. You'll still need those 5 unit tests, calling the one method now with different parameters. But now it may be more difficult to test.
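The coverage point above can be made concrete with an invented example: five area formulas collapsed into one flag-driven method. Each original code path still needs its own test; the flattening only forces every test to thread a mode flag through.

```python
import math

# Five small methods collapsed into one dispatcher: the test count stays
# the same, one test per code path, but the call sites get murkier.
def area(shape, **dims):
    if shape == "square":
        return dims["side"] ** 2
    if shape == "rectangle":
        return dims["w"] * dims["h"]
    if shape == "circle":
        return math.pi * dims["r"] ** 2
    if shape == "triangle":
        return 0.5 * dims["base"] * dims["height"]
    if shape == "trapezoid":
        return 0.5 * (dims["a"] + dims["b"]) * dims["h"]
    raise ValueError(f"unknown shape: {shape}")

# Still five tests -- one per path -- plus a sixth for the new failure mode
# that five separate methods never had:
cases = [
    ("square", {"side": 3}, 9),
    ("rectangle", {"w": 2, "h": 5}, 10),
    ("circle", {"r": 1}, math.pi),
    ("triangle", {"base": 4, "height": 3}, 6.0),
    ("trapezoid", {"a": 2, "b": 4, "h": 3}, 9.0),
]
for shape, kwargs, expected in cases:
    assert math.isclose(area(shape, **kwargs), expected)
```

Notice the dispatcher also introduces a brand-new untested behaviour - the "unknown shape" branch - so flattening actually grew the test surface rather than shrinking it.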
-