Friday, 30 April 2021

I test, therefore I learn. I learn, therefore I test.

[This blog was originally published as a Dutch article in TestNet Nieuws.]
In one of my previous blogs I wrote about the centipede, and while I wrote it with sarcasm in mind, it has become evident, especially when working in consultancy, with all the hiring desks and brokers that have nested themselves between client and supplier, that the illustrated jack-of-all-trades is actually what is being asked for.
Hiring agencies and selection desks use (a form of) semantic tests (checks) to see whether the requested data, and where applicable the accompanying certificate, appears in the resumé. Every tester knows, of course, that a semantic test is there to verify the validity of inputs and is not, although the name might suggest otherwise, an elementary comparison technique needed to thoroughly test functionality based on pseudocode or similar specifications.
While I was hoping that my years of experience would get me somewhere these days, nothing is less true. No certificate, no job. So I had to go back to school and 'hit the books' to be able to test again. Many of the requested certificates cover knowledge I have already applied in practice, but I never obtained the official certificate. So I had to study again, especially since practice now turned out to be a pitfall for the exam. The exam consists of multiple-choice questions derived directly from the theory. While doing practice exams, I quickly noticed that I started doubting the answers, because in practice the solutions are never applied as black-and-white as the answers describe, or a combination of answers is applied. The exam demands the 'most fitting answer', but in practice that depends on the situation...

Letting go of the thought 'that is the way things work' in favour of 'this is how it is described in theory' was an eye-opener for me. How many times did I myself explain to somebody starting out as a tester: "yes, the theory might say that, but in practice it works like this..."? Now it is: "yes, in practice you do it like this, but the theory states that..." During the exam I was also bothered by the thought: "I think it is like this in practice, but the theory probably states this..." — and afterwards, when checking the answers, it turned out that what I would have done in practice was the right answer after all. It is a big hassle to relearn, after all these years of practice, everything I learned in my early days as a tester. The saying goes 'as the twig is bent, so the tree is inclined', but apparently in testing that is easier said than done.

Another eye-opener was that I thought 'SINO' (Scrum In Name Only), and thus the misuse of Scrum, was very common. Well, then you should dive into Prince2... 'PINO' (Prince2 In Name Only) and the misuse of that name is far more serious and widespread than SINO is or ever will be. Did you know, for example, that testing and quality (yes, really!) are an embedded part of Prince2? Have we as testers been fooled by project managers who very skillfully hid testing under the carpet? I have seldom seen an exception report, or a single entry in a risk register, which Prince2 describes as deliverables to a steering committee when an activity can't be executed according to plan or when a risk increases. Read: when not enough testing can be done, or testing cannot start (or finish) on time... Well, it's never too late to learn...
In the coming year(s), 'learning' also seems to be the trend at various (test) conferences, as it has been in past years, with themes like "Learning to test, testing to learn" (EuroSTAR) and "Broaden your base: new skills for testers!" (TestNet). These conferences point out the importance of learning in practice and learning new skills. I'm willing to bet that one skill isn't taught, and that is learning itself. One needs to learn to learn! Where theory ends and practice begins, when to apply theory, and when to use theory as the basis for your argumentation in practice. But HOW you should do that isn't taught.
In my experience, (learning) the theory is increasingly seen as something 'dirty', let alone verifying what was learned by taking an exam ánd getting a certificate for it. "The certificate doesn't say anything about the skill of the tester." I find that a very black-and-white line of thought. We testers, of all people, should know the importance of verification, validation and even falsification, and we finalize this by writing a report. If I were to state that the report doesn't say anything about how the system performs, that would be utter nonsense. We know exactly how the tests were performed and how the system performs (or doesn't), and the report reflects what we have tested.
Learning and doing are entwined, and for me, getting a certificate is part of that. I may be convinced of my own skills and knowledge, but my future employer likes to have that in writing and wants the evidence in the form of a certificate. And that is why this verifying and falsifying centipede is still actively studying for upcoming exams. I test, so I learn, and I learn, so I test!
