Eppur Si Muove

For the past month I have been in San Jose acting as a communication facilitator between teams, and since the real world motivates a lot of my posts, I'm going to review my thoughts on the overseas experience.
-- We are creating a formal process for QA testing in the company - mostly UI testing or whole-system integration testing. The expectation of the formalization process is that it will improve productivity and time to market. Unfortunately, I don't really think we will achieve those goals, for the following reasons:
1. Return on investment: I have nothing against full-stack, environment, UI types of tests. They catch bugs and they bring confidence. But at the same time there is a high maintenance cost to testing software through the UI, simply because UIs tend to have high rates of change. UI tests can also be difficult to write so that they run reliably. Often there is a much higher return on investment in focusing on unit tests with mocks to create the internal states of interest in the software (a minimal sketch follows this list). So I think the company is going to find that the more of these tests they write, the more time they will spend trying to keep those tests stable, eating more and more resources with no long-term gain in big-picture software quality.
2. It's already too late: UI-level tests hit the software at the end of the production cycle. That's not where we want to be figuring out whether software is OK or not. If the goal is quality software, then the emphasis has to be on teaching developers to write software correctly at the point of construction. That can be done, and it doesn't take any fancy resources. If software is not correct when it is written, quality can't be tested into it.
3. This formalization process is starting to get very custom and rigid. Everyone has to use the same tools. Everyone has to use our special home-grown repository. Everyone has to package up their software in a special in-house package format. Sorry, the reality is that none of us wants, or has the time, to learn a special set of technologies just to test our software. JUnit -- fine -- it's simple, and so it's successful. If testing is going to be more complicated than using JUnit, forget it; JUnit is already too complicated for some developers. Mandating certain tools? Well, there is a trade-off between uniformity (and its maintenance benefits) and the rigidity of demanding that everyone fit into the same mold. Productivity is improved by finding better methods and adopting those methods. My heart is with keeping the door to new, better methods open.
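
To make the unit-tests-with-mocks point concrete, here is a minimal JUnit sketch. The mock library (Mockito) is my choice here, and the OrderService/PaymentGateway pair is invented purely for illustration. The point is that the mock puts a collaborator into exactly the internal state of interest - a declined payment - without spinning up a browser or a full-stack environment.

import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.*;

import org.junit.Test;

// Hypothetical collaborator, named only for illustration.
interface PaymentGateway {
    boolean charge(String accountId, long amountCents);
}

// Hypothetical class under test.
class OrderService {
    private final PaymentGateway gateway;

    OrderService(PaymentGateway gateway) {
        this.gateway = gateway;
    }

    String placeOrder(String accountId, long amountCents) {
        return gateway.charge(accountId, amountCents) ? "CONFIRMED" : "DECLINED";
    }
}

public class OrderServiceTest {
    @Test
    public void declinedChargeIsReportedWithoutTouchingAnyUi() {
        // The mock creates the internal state of interest: the gateway declines the charge.
        PaymentGateway gateway = mock(PaymentGateway.class);
        when(gateway.charge("acct-1", 500L)).thenReturn(false);

        OrderService service = new OrderService(gateway);

        assertEquals("DECLINED", service.placeOrder("acct-1", 500L));
        verify(gateway).charge("acct-1", 500L);
    }
}

A test like this runs in milliseconds and doesn't break when someone moves a button on a screen.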

-- The testing system we are building is an engineering support system, but I don't believe people have thought clearly about the process.
1. Continuous integration is not just a popular phrase: We are putting together a system to support continuous integration testing, and it's strange to me how little this system resembles the continuous integration systems I have worked on in the past. In this company the whole focus of the testing is that there will be some deployable module; it will be tested in environment A-1, and once it passes it will go on to be tested in environment A-2, and so on until it is tested in environment A-N, whereupon the module can be deployed to production (after manual testing ;)). Basically I feel our continuous integration process is "just what we do now, only more automated, so it's faster." Continuous integration actually means that, unless there is good reason not to, software is tested in environments A-1...A-N in parallel. It's tested immediately after a commit, not a couple of hours later after it has passed test suites 1...N. We want to notify people of possible problems as close as possible to the introduction of those problems, not later, when additional confounding changes or just the mists of time obscure the problem. (A sketch of what the parallel approach looks like follows this list.)
2. Prefer parallel to stepwise as a general process design principle: Efficiency is gained by creating processes that are continuous and parallel, not batch and stepwise. Unfortunately the human mind seems to have batchwise, stepwise conceptual tendencies, so people start thinking in batches and steps and never get past that.
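
As a rough illustration of the parallel idea, here is a Java sketch that kicks off every environment's suite at the same time instead of one after another. The environment names and the runSuiteAgainst stand-in are hypothetical; the real thing would deploy the committed module and run its suite in each environment. The shape of the process is the point: everything starts at commit time, and each result is reported the moment it arrives.

import java.util.Arrays;
import java.util.List;
import java.util.concurrent.CompletionService;
import java.util.concurrent.ExecutorCompletionService;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class ParallelEnvironmentRun {

    // Hypothetical environment names, stand-ins for A-1 ... A-N.
    static final List<String> ENVIRONMENTS = Arrays.asList("A-1", "A-2", "A-3");

    // Stand-in for "deploy the committed module to this environment and run its suite".
    static boolean runSuiteAgainst(String environment) throws InterruptedException {
        Thread.sleep(1000);                  // pretend the suite takes a while
        return !environment.equals("A-2");   // pretend A-2 finds a bug
    }

    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(ENVIRONMENTS.size());
        CompletionService<String> results = new ExecutorCompletionService<>(pool);

        // Every environment starts at commit time, not after the previous one passes.
        for (String env : ENVIRONMENTS) {
            results.submit(() -> runSuiteAgainst(env) ? env + ": PASS" : env + ": FAIL");
        }

        // Report each result the moment it arrives, so a failure in A-2 is visible
        // while A-1 and A-3 are still running.
        for (int i = 0; i < ENVIRONMENTS.size(); i++) {
            System.out.println(results.take().get());
        }
        pool.shutdown();
    }
}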

-- This month the development team spent a lot of time developing software that was later discarded as the wrong design.
1. Agile development is not an excuse to be stupid: There are problems that have well-known solutions, problems whose solution is discoverable, and problems whose solution is best discovered through an evolutionary attack. People who are enamoured of being agile want to evolve a solution to all their problems. This past month we were dealing with a well-known problem with a few discoverable elements, a distributed team, and a distributed system. The architects in charge are trying to evolve this system. There are large, knowable implementation grey areas where people are saying we'll think out the details later. There are no specification documents - especially for the distributed communication part, which really annoys me. One day of thinking could have saved us one month of coding.

-- And lastly, we are not testing our systems as we build them, and we are not using the system we are building to do our own testing. Sorry, but people who are building a process or a system should be using that process or system as they build it. If the builders are not using what they are building, then the builders are not serious about creating a product. They are just playing with company money while they congratulate themselves on their good ideas.

Do your best, Marco