Saturday, June 18, 2011

Test to Learn, Test to Debug, Test to Evaluate

A lot of discussion is going on about the differences between Systems Integration, Systems Testing, Development Testing, Operational Testing, etc. A lot of confusion... totally unnecessary, in my opinion.

The problem with the above definitions is that they tie the types of testing to methods, responsibilities, etc., but not to purpose.

I'd like to propose a more natural (for me) distinction - Test to Learn (or to Discover), Test to Debug (or to Solve Problems), and Test to Evaluate (or to Judge).

Learning is the process of acquiring knowledge that previously was not in the possession of the learner. Testing that is focused on learning is always exploratory, with a path that may wind, twist, and loop. More than that, Test to Learn will never end - there will always be missing knowledge.

Debugging is the process of finding anomalies, investigating their causes, and fixing the system. Integration is essentially System Debugging.

Evaluation is judging the system against some criteria in order to accept or reject some proposition. Testing is an activity that may be a part (but not the whole) of either Debugging or Evaluation. (System Testing is essentially System Evaluation.)

When managers say "Start Testing. When you find severe problems, investigate them and change the system. When the problems are not severe - proceed," they are asking to start in Evaluation Mode and switch to Debugging Mode midstream - to start as ST and change to SI if needed. No wonder it's difficult for me to explain the difference between SI and ST: managers frequently perceive the pair-wise encounters of sub-systems done as part of sub-system testing, combined with interface checking, as sufficient SI activity, and try to drive the project into ST mode without concluding all SI activities such as E2E thread testing. They then wonder why the ST phase takes as long as the SI+ST phase that was proposed in the initial project plan and rejected as "too long ST".

I'd be delighted to forfeit talking about "System Testing" or "System Integration" and switch to "System Debugging" (Subsystem Debugging, Interface Debugging, Whole System Debugging) and "System Evaluation".

Learning, Debugging, and Evaluation are not just Testing - there is a lot more going on: discussions, arguments, reading, writing, thinking. But where Testing is concerned, distinguishing the type of Testing by purpose has its merits, and for me it is more useful.

Thursday, June 16, 2011

Fourth Secret of Integration - Intelligent Evolution

Some circles still dispute the relative merits of the Theory of Evolution and the Belief in Intelligent Design. Without entering the fray myself, I want to ask: Intelligent? Design? The proponents of ID claim that the complex systems we observe in nature are so beautiful that this suggests somebody designed them. The Book doesn't support this claim - it specifically states that God created the world, not that He designed it. So I suppose the ID people just grant us engineers too much credit - we tend to give up design and turn to evolution when faced with complexity, not the other way around.

Paul Graham in his essays reveals the secret of creative programming (and of writing, and of building start-ups): build something that works and then iterate until it's good. The refactoring patterns of Agile Programming suggest the same. Fast Prototyping - once again. We know that evolution is better than design. That's exactly what Integration is all about - evolving the system until it's fit for the Judgement Day - sorry - the System Testing.

So the Fourth Secret of Integration may be formulated as "Integration is the Intelligent Evolution of (hopefully) Intelligently Designed Systems until they're fit to be Judged".

Tuesday, June 7, 2011

Third Secret of Systems Integration - There is Always Lack of Knowledge

One more illusion of Systems Engineering is the assumption that the system will behave as required and designed. This illusion is not hubris - in the software world it is possible (at least theoretically) to design the system to do exactly what is required, as long as logic is the only concern. But when the real world starts to assert its rules of physics, chemistry, and - more than that - biology, psychology, etc., the lack of knowledge of how the system will really behave becomes profound.

Sure, it is possible to know enough for any purpose, as long as we're aware that the lack of complete knowledge harbors the eternal promise of surprises. When a surprise hits, it becomes a really wicked problem. The only solution to this eternal lack of knowledge is eternal learning - and Systems Integrators have to excel in learning just for their professional survival.

Systems Integrators always start with a lack of knowledge - all things being equal, they usually are not the ones who designed this vile creature to be integrated. More than that, there are always parts at the bottom of the hierarchy (the Second Secret notwithstanding) that are not created by us but are bought from somebody who may not reveal all of their properties. Or even natural things, for which the lack of knowledge is inherent.

The lack of knowledge is not an excuse for Systems Integrators. They have to tell the system's story (the First Secret) and deal with the networked nature of the system (the Second Secret). So the Third Secret of Systems Integration may be formulated as "There is always missing knowledge that the SI has to discover in order to succeed".

Second Secret of Systems Integration - It's All about Networks

The (infamous) Vee-model of the Systems Development Process tells us that Systems Integration is all about hierarchies - you divide your system into smaller and smaller pieces until you're able to make or buy the components, and then you integrate them into larger and larger pieces until you've got the entire system. It's an illusion.

What you really see in Systems Integration is a network of things interacting with each other in various ways, depending on external (or internal) excitations. You see matter, energy, and data flow back and forth. And frequently you may discern parts of the network that have many more interactions inside than outside those parts - so you can (for the time being) abstract such a part as an entity on a higher level of hierarchy than the nodes of the network.
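This "more interactions inside than outside" test can be sketched in a few lines of code. Everything here - the sub-system names, the candidate grouping, and what counts as "cohesive enough" - is a made-up illustration, not part of the original post:

```python
# Sketch: deciding whether a group of components in the system network
# is cohesive enough to abstract as a single higher-level entity.
# The interaction graph and the candidate grouping are hypothetical.

def cohesion(edges, candidate):
    """Return (internal, external) interaction counts for a candidate sub-network."""
    internal = sum(1 for a, b in edges if a in candidate and b in candidate)
    external = sum(1 for a, b in edges if (a in candidate) != (b in candidate))
    return internal, external

# Hypothetical network: pairs of sub-systems exchanging matter, energy, or data.
edges = [
    ("sensor", "filter"), ("filter", "tracker"), ("sensor", "tracker"),  # dense cluster
    ("tracker", "display"),                                              # lone link out
    ("display", "logger"),
]

internal, external = cohesion(edges, {"sensor", "filter", "tracker"})
print(internal, external)  # 3 internal vs 1 external: safe to abstract, for now
```

When the external count starts to rival the internal one, the abstraction leaks - which is exactly when the "neat hierarchy" shows its network-ness.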

When you design the system, you're free to abstract parts of it, and to define requirements for these parts, their interfaces, and their input-output relations. That's convenient. In Integration you are frequently deprived of this luxury - everything looks like a network that doesn't organize itself naturally into the neat hierarchy of the design. Most surprises occur in those situations when the neat hierarchy shows its (unexpected) network-ness.

So the Second Secret of Integration may be formulated as "Everything is a Network - Hierarchy is an Illusion".

Monday, June 6, 2011

Integrate to Use vs Integrate to Sell

During conversations with the Systems Integration and Test Discussion Group on LinkedIn, on the INCOSE discussion forums, and during the meetings of the SI Workgroup of INCOSE_IL, there was a fair share of hair-splitting about what is Integration Test and what is System Test.

The usual gist is that IT is done to "groom the system" until it works, while ST is designed to prove to the customer that the system works as required.

Something bothered me until I realized I had felt there was a little bit of hypocrisy in the distinction between IT for internal use and ST for external show - me having been on the customer's side for the whole length of my career. No longer. There is no hypocrisy - just plain business. I do my IT and then do my ST to sell you my product. You do your IT to integrate my product into your product and then do your ST to sell your product to the next guy, and so on.

Just like VAT - the last one pays the bill, and that is the ultimate customer: the user, who has only IT to do and then uses the system for the length of its lifecycle. There is no ST here because there are no requirements to falsify.

So it's crucial to understand that there is a kind of "Final Integrator" - the User, or someone working on his/her behalf - and that the type of Systems Integration done by the Final Integrator is very distinct from the one done by any kind of Systems Developer. I propose to name these two types of integration System Integration to Sell (or Ship) and System Integration to Use, and to discuss them separately.