It's OK Not to Write Unit Tests
-
You're still thinking solely along engineering lines, and that is irrelevant to the situation I described. What MAY happen in the future is a probability; it is a calculated risk. You're also weighing time spent developing now as equal to time spent developing in the future. That is wrong: time spent now is more expensive than time spent in the future. Time spent now solving or mitigating theoretical future upgrades is much, much more expensive than time spent in the future fixing them.
This is perhaps because you're seeing the software project as the be-all and end-all. It's not. The product helps others perform their tasks more effectively, thereby improving their productivity and reducing costs. Those cost savings accumulate over time and can be redirected into more productive tasks, netting an even higher ROI. Delaying a product with the aim of producing perfect testing reduces the potential ROI by constraining the time the productivity boost would otherwise have to work, and that ultimately costs more than fixing issues in the future.
In the end it's up to the project sponsors to weigh these competing factors and decide the most reasonable path. Creating unit tests for a reporting engine, though feasible, was simply not cost effective. Unit testing seems to have become the new Gospel of testing over the last 5 or 6 years, but like all IT holy crusades it was led solely by an Engineering mindset whilst ignoring that ever-annoying bogeyman we like to call the real world.
10110011001111101010101000001000001101001010001010100000100000101000001000111100010110001011001011
Time spent now is a lot cheaper than time spent in the future. This is not about designing for features that don't yet exist; it is about developing a complete solution for the software you're delivering now. Maintainability is built in, not added on, as is pretty much every other non-functional requirement.
-
I think you're missing the point. A unit test ultimately tries to validate the functionality of a single piece of code. Imagine a method being a math function: you simply want to validate that *that* code is correct, not the rest of the system. So you do that by mocking everything around it. It's a bad argument to say that because a unit test doesn't test how the code works integrated with the rest of the system, a unit test is not useful. No one is saying that the unit test is the only test for the code. In fact, some argue that's not even a "test", but a way to verify the design. You still need integration, functional, exploratory tests, etc.
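For illustration only, here is a minimal sketch of that idea, assuming JUnit 5 and Mockito are on the classpath. The names InvoiceCalculator and TaxRateService and the figures are made up for this example, not taken from the thread; the point is that the collaborator is mocked so the test exercises only the one method's own logic.

import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import org.junit.jupiter.api.Test;

interface TaxRateService {
    double rateFor(String region);   // in production this might hit a database or web service
}

class InvoiceCalculator {
    private final TaxRateService rates;
    InvoiceCalculator(TaxRateService rates) { this.rates = rates; }

    double total(double net, String region) {
        return net * (1 + rates.rateFor(region));
    }
}

class InvoiceCalculatorTest {
    @Test
    void addsTaxToNetAmount() {
        // Mock the surrounding dependency so only the calculator's own arithmetic is under test.
        TaxRateService rates = mock(TaxRateService.class);
        when(rates.rateFor("UK")).thenReturn(0.20);

        assertEquals(120.0, new InvoiceCalculator(rates).total(100.0, "UK"), 1e-9);
    }
}

Whether the real TaxRateService works, or whether the two pieces integrate correctly, is left to the integration and functional tests mentioned above.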
Kenneth Kasajian wrote:
So you do that by mocking everything around it.
That's very arguable.
Kenneth Kasajian wrote:
In fact, some argue that's not even a "test", but a way to verify the design. You still need integration, functional, exploratory tests, etc.
I consider it a test.
Kenneth Kasajian wrote:
You still need integration, functional, exploratory tests, etc.
I know that. My point was against 'academic' approach to unit testing in real systems. :)
If the Lord God Almighty had consulted me before embarking upon the Creation, I would have recommended something simpler. -- Alfonso the Wise, 13th Century King of Castile.
This is going on my arrogant assumptions. You may have a superb reason why I'm completely wrong. -- Iain Clarke
[My articles] -
Kenneth Kasajian wrote:
So are you saying you won't write unit tests so that you're not as easily replaceable?
Nope. Personally I don't write unit tests because I work in a small shop with very experienced developers, we have a very good and thorough testing team, and we can't afford the overhead because the benefit does not outweigh the cost. We tested that method of development and it turned out to be redundant for us: too slow, too time consuming, and it caught none of the typical bugs we normally see anyway, which are highly complex interactions between things caused by users doing the unexpected and unanticipated.
My primary problem with many modern development methodologies is not that they make developers easily replaceable; that's just a side effect of my primary concern, which is that they commoditize developers. In other words, they reduce what used to be the job of a skilled and experienced craftsman to a job that can be done by any idiot with minimal experience. This drives down the value of all developers everywhere, i.e. salaries and job security, and it often results in software that is less than exceptional.
It's a method of risk aversion, and risk aversion is pretty much everything that is wrong with every creativity-based industry these days. It's why you predominantly get bland remakes of movies with the same small set of actors as every other movie instead of new compelling movies with new talent. In the end I suspect commoditizing development is really not saving software houses any money in the long run. One highly skilled, talented and experienced developer is worth a whole team of cheap developers. Ask any experienced restaurateur whether they can put out more dishes of high quality with a huge brigade of average cooks or a very small highly skilled team and there's no comparison. Something the hospitality industry has known for centuries, but the software development industry seems determined to ignore.
The problem for software houses is they are afraid to have so much riding on so few people, i.e. they are risk averse. And no one is ever stuck at the same job, we do have free will after all. :) A highly skilled developer can get work they truly enjoy in any circumstances.
"Creating your own blog is about as easy as creating your own urine, and you're about as likely to find someone else interested in it." -- Lore Sjöberg
It's not risk aversion, it's risk mitigation: you accept whatever introduces the risk, but you mitigate against it. The cost of developers finding and fixing a bug is always less than the cost of software testers doing the same. Let them find the problems you cannot find. If there's a bug you can find by writing an isolated unit test, it is more expensive to let it pass through to software test and have them find it, reproduce it, write down the steps to repeat it, create an incident in a bug tracking system, wait for you to fix it, and then verify that you fixed it. When you lengthen development by adding time for developers to write unit tests, it's a much more predictable percentage increase than if you were to deliver buggier code to software test. It's hard for testers to predict how long their test cycle is going to be at the start of the project, when no code has been written. They have to predict how many additional cycles are required, which is based on how well the previous cycles went, and cycles 2 and onward are extremely difficult to predict. In the end the cost of the project may be the same, but spending it in the more predictable stage of the project gives you more predictable release dates, which, in the end, saves more money for the company.
-
Time spent now is a lot cheaper than time spent in the future. This is not about designing for features that don't yet exist; it is about developing a complete solution for the software you're delivering now. Maintainability is built in, not added on, as is pretty much every other non-functional requirement.
Not if it delays deployment, and not if deployment is delayed to mitigate what *might* happen. Sheesh, please read the replies. Time spent building a decent architecture may reduce future costs if you have a good idea of what will be required. If not, it won't, since you're designing against what might be there, and as far as I know there isn't a recognized profession of Clairvoyant Software Engineer. Similarly, spending significant extra time developing unit tests of no immediate value to the client costs more than future fixes if it forces delays in deployment. Salespeople had no reliable way of dynamically generating reports in the field. That means lost sales. That means lost revenue. That means lost profit. It's nice that you want to reduce the workload on future developers, but ultimately it's *probable* future work, not real work, nor therefore a real cost. It's a fractional cost, because it's a risk and not a certainty, which again makes work now more expensive. I feel supremely confident that I'm a good source to verify that the CBA decision was a good one, since I was actually there and helped make it. Apparently you were there too and disagreed? I'm sorry, I don't remember you. All code dealing directly with the automation objects was abstracted into well-documented classes, and this was deemed sufficient to mitigate future risk. If you can recall what your viewpoint was at the meeting, please remind me ...
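For what it's worth, here is a hedged sketch of the kind of abstraction boundary described above, not the project's actual code: the names ReportDocument, PlainTextReportDocument and ReportEngine are hypothetical, and the implementing class just writes plain text where the real system would drive the automation objects. The point is that only one class touches the automation layer, so a future upgrade is confined to it.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

// The documented boundary the rest of the reporting engine is written against.
interface ReportDocument {
    void addHeading(String text);
    void addRow(List<String> cells);
    void saveTo(Path target) throws IOException;
}

// The only class that would talk to the automation objects directly.
class PlainTextReportDocument implements ReportDocument {
    private final StringBuilder buffer = new StringBuilder();

    @Override public void addHeading(String text) {
        // In the real system this method would call into the automation API.
        buffer.append("== ").append(text).append(" ==\n");
    }

    @Override public void addRow(List<String> cells) {
        buffer.append(String.join(" | ", cells)).append('\n');
    }

    @Override public void saveTo(Path target) throws IOException {
        Files.writeString(target, buffer.toString());
    }
}

// The engine depends only on the interface, never on automation types.
class ReportEngine {
    void buildSalesReport(ReportDocument doc, List<List<String>> rows) {
        doc.addHeading("Field Sales Report");
        rows.forEach(doc::addRow);
    }
}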
10110011001111101010101000001000001101001010001010100000100000101000001000111100010110001011001011
-
It's not risk aversion, it's risk mitigation: you accept whatever introduces the risk, but you mitigate against it. The cost of developers finding and fixing a bug is always less than the cost of software testers doing the same. Let them find the problems you cannot find. If there's a bug you can find by writing an isolated unit test, it is more expensive to let it pass through to software test and have them find it, reproduce it, write down the steps to repeat it, create an incident in a bug tracking system, wait for you to fix it, and then verify that you fixed it. When you lengthen development by adding time for developers to write unit tests, it's a much more predictable percentage increase than if you were to deliver buggier code to software test. It's hard for testers to predict how long their test cycle is going to be at the start of the project, when no code has been written. They have to predict how many additional cycles are required, which is based on how well the previous cycles went, and cycles 2 and onward are extremely difficult to predict. In the end the cost of the project may be the same, but spending it in the more predictable stage of the project gives you more predictable release dates, which, in the end, saves more money for the company.
I agree, with the stipulation that you're describing a large company with cheap developers. In a small company with expensive developers the entire scenario is turned upside down.
"Creating your own blog is about as easy as creating your own urine, and you're about as likely to find someone else interested in it." -- Lore Sjöberg
-
Deyan Georgiev wrote:
I think this entire “unit testing” madness originates from Java
IMHO, it is from the Smalltalk folks who switched to Java. The same crowd that invented "Extreme Programming", "Design Patterns", etc. Anyway, I am happy to see that even Java programmers are coming to their senses
Nemanja Trifunovic wrote:
The same crowd that invented "Extreme Programming", "Design Patterns", etc.
I thought these were two opposite crowds. XP was from PHP etc. developers; Patterns were from the C++ & Java world.