Once a regular time to start the day... now an unholy moment to get up. I got on the bus at 05:42; the driver hadn't even bothered to turn on the lights yet. Easy on the eyes, though. Travelling by train was quite fine today, unlike yesterday, when I had to arrange a car at the last minute because of 'actions by NS personnel'. At approximately 08:30 I stepped into 'Mezz' for Let's Test BeNeLux. A great venue when your tagline is 'For Those About to Rock', since it's a smaller music stage / rock venue. At registration there were already some familiar faces and also loads of unfamiliar ones for me. It's always easy to have the longest name on the registration list; quick and easy to find :-)
After some coffee I ran off to the main stage, where James M. Bach was scheduled for the opening keynote about 'checking versus testing'. In style, the keynote starts with some rock music by AC/DC and James plays the part with a striking pose :-). Interactivity is encouraged and the 2D code is shown to download the deck on-site (saves note-taking), so I have an easy job: I only have to write down the keywords and scribble down my doodles.
My interpretation of this keynote is that checking seems to be the fetish of people like managers, who don't understand that testing is more than automatically running stuff and that checking is part of testing. Testing being 'evaluation by learning through experimentation and exploration, including questioning, modelling, observation, inference, etc.' It's like morphine: something for professionals to use for a specific purpose, but not to be given to children.
When we look into testing there are four quadrants: spontaneous testing and checking, and deliberative testing and checking. All activities, no matter which quadrant they are in, are useful, but it takes people who understand the matter to really make them valuable. The key is 'making sense', which is the part that can't be automated (probably also the reason why 'sensemaking' has 'sense' or 'sentient' in it ;-))
As I see it, checking is something that can be defined, and when you have difficulty defining it as a specific criterion, you probably have something before you that belongs in the category of sentient, non-checkable testing. Checking is something that is derived from algorithms.
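To make that distinction concrete for myself I jotted down the tiniest possible example afterwards. This is purely my own sketch, not something from James' slides; cart_total and its expected value are hypothetical.

```python
# A "check" in the sense of the keynote (as I understood it): a predefined,
# algorithmic comparison of an observed result against an explicit expectation.
# cart_total() and the expected value 3145 are hypothetical, for illustration only.

def cart_total(prices_in_cents):
    """Toy function under test: sums the item prices of a shopping cart."""
    return sum(prices_in_cents)

def check_cart_total():
    # A machine can run this without any judgement: the outcome is pass or fail.
    assert cart_total([1050, 2095]) == 3145

if __name__ == "__main__":
    check_cart_total()
    print("check passed")
```

The check passes or fails without anyone having to think; deciding whether 3145 is the right answer for the business, and which other questions are still worth asking, is the sense-making part that stays with the tester.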
In the Q&A I asked a question that referred to something James called epistemic testability, which was explained as the things we already know. Together with the mention of the 'history oracle' (the things we see or find that we already know), I wondered how to cope with the things we think we know.
As I interpreted James' answer, this is the core of testing, and he referred to the story of the 'Silver Bridge', which had a problem in it from the beginning, but only after 40 years did the problem emerge. He also mentioned having dinner: what are the acceptance criteria there, and how are you going to define up front when you are done? It's all about discussion and conversation, but also about having an attitude of acceptance; acceptance that problems can and will be in the things we test. With this knowledge and mind-bender, I went for the coffee break.
After the coffee break James Lyndsay gave a very energetic session about 'A nest of test'. First time I had to take out my laptop in a non-testlab room and test during the track!! How cool is that. Check out the IP 52.16.45.184 for some interesting test stuff. I really had a good time puzzling around and figuring out what would cause the things I encountered. It was cool to test with a room full of people and to have people hypothesizing about the things seen on the screen when changing the parameters. I felt like this is what 'Let's Test' is all about: learning and, especially, doing together. Sorry for being so short in this part, but being very busy with the tools reduced the amount of time I had for blogging...
.... The day continued...
What a fabulous lunch! Good food and a very sunny terrace outside with testing colleagues. It was almost too difficult to drag my ass into the venue again.
But I got myself up to listen to Jean-Paul van Varwijk about the challenges of implementing context-driven testing (at Rabobank International).
Jean-Paul told us about some Dutch context (the Dutch apparently have loads of publications about testing compared to other countries) and the steps that led to the implementation of context-driven testing. Rabobank, partly because of the crisis and the wish to become more agile, changed to an organisation with 'domain-based delivery teams'.
It's surprising to hear about 'thought leadership' in this particular case, since I have often heard the term dismissed as nonsense, because you can't give leadership to thoughts. My own take was that a thought leader is someone who knows his (or her!!!) stuff and guides people to investigate new things and to learn, educate and stimulate development; but that view was mostly brushed aside. So understand my surprise that the thought leader is described in this presentation in exactly that way!
Jean-Paul tells about the uncertainty of not having guidance and direction; he tells about being a bit down about not knowing where the organisation is heading, but he has recently become more enthusiastic because the direction is more outspoken, and he's even motivated to organise workshops again. I found this last part of the track the most valuable, since it (again) points out, to me, that having the organisation or management point in a direction, or having leadership, especially in turbulent times or during change programmes and organisational changes (and implementations), is essential to keep people motivated and stimulated and to keep reminding them that they are invaluable to the organisation, even during these times of turmoil.
After Jean-Paul, Joep Schuurkes took the stage for a track called 'Helping the new tester to get a running start'. He made an analogy with learning to navigate a city to make the point that the 'usual suspects', such as plain documentation, a map, route descriptions, etc., won't make a newbie in the company a happy starter. He had lots of images of his home town of Rotterdam to explain the different aspects of introducing the employee to the company. For instance, when showing a picture of Rotterdam right after WWII (flat), he explained that a historic view might not be that interesting for your new team member, since they have to work on the now and on future development; but then again, we (IT in general) are too unaware of history, and an overview is important to know how you got where you are. Slide by slide he adds and adds to the package, only to tell us that we need to become more abstract and take a more guideline-like approach with the following key areas: provide structure, model the application (the SANFRANCISCODEPOT heuristic), model your approach to testing (mind the overhead hazard), guide interactions with the application and with the team, empower the new tester (mastery, autonomy, purpose) and, last but not least: have fun!
I hoped to warm up in the sun during the afternoon break, the conference room being a fridge. But I ended up having a great conversation about conferences and about German literature being an inspiration for a workshop on reporting (looking forward to seeing it at one of the future conferences!).
Back to the stage in the fridge again. Andreas Faes starts his track, titled "Testing test automation model", by telling a story of a whale experiencing different things in the "emptiness" of space and defining those things to create its model for understanding them. I loved the story about counting: 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, €... euro being a number in the model of his son, who has not grasped the concept of currency yet. By assimilation this model is correct in his son's mind, but anyone who understands currency knows € isn't a number, of course. A nice point about understanding models and verifying them... :-). Making a bridge to models in test automation, Andreas explains his path to the present, explaining some historic concepts along the way and addressing what an implicit and an explicit model is, but specifically how to get from an implicit (test) to an explicit (automated) model. The idea mentioned here, a domain-specific language, sounds familiar to me and I can't help but think about 'Kenniskunde' (sorry for the international guys; it's a concept by Sjir Nijssen about the use of proper Dutch language, mathematics and logic in daily practice) or 'Kennis Representatie Zinnen' (Google translates this to 'knowledge representation sentences', but I wonder if that carries the same meaning); like the article, it seems a Dutch principle, but I'm sure there's a non-Dutch version as well. It triggers me to look into this matter more, and it disappoints me a bit that the track is suddenly over. It feels like it ended very abruptly, and I would have loved to hear more about this, but I guess the fact that I am triggered is also valuable, so I have to be satisfied for now.
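To get the implicit-versus-explicit idea straight in my own head I scribbled a little sketch afterwards. It is only my interpretation, not Andreas' material: the LoginPage stand-in and the 'log in as' / 'should see' vocabulary are invented for illustration.

```python
# Implicit model: the idea a tester carries around in their head, e.g.
# "logging in with a wrong password should be refused".
# Explicit model: the same idea written down in a small domain-specific
# vocabulary that a machine can execute. Everything below is hypothetical.

class LoginPage:
    """Stand-in for a real page/driver object; returns canned results."""

    def submit(self, username, password):
        # A real implementation would drive the application under test.
        if password == "correct-horse":
            return "dashboard"
        return "error: invalid credentials"

# The explicit, domain-specific vocabulary: business terms instead of clicks.
def log_in_as(page, username, password):
    return page.submit(username, password)

def should_see(actual, expected_fragment):
    assert expected_fragment in actual, f"expected '{expected_fragment}' in '{actual}'"

if __name__ == "__main__":
    page = LoginPage()
    # The implicit test "a wrong password is refused", made explicit and automatable:
    should_see(log_in_as(page, "someone", "wrong-password"), "invalid credentials")
    print("explicit model executed")
```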
Instead of Jacky Franken, Pascal Dufour now takes the stage, which I find a bit of a shame, since I skipped Jacky's track at an earlier conference knowing I would see it here. Pascal's topic is very relevant for me, though, so it makes up for the loss. It is called 'Automation in DevOps and Continuous Delivery': from continuous integration, to continuous delivery, to continuous deployment. 'Continuous' seems to me to be about ensuring a constant, fast feedback loop to development, the team or the customer, depending on which type of 'continuous...' is used. DevOps is then explained, because, as I understand it, to be truly agile in development, whether that is XP or Scrum, development and operations should be 'on each other's lap', so to speak; hence DevOps. I got confused during the track about DevOps, as it seemed like a line of tools to push through a development lifecycle, but checking the wiki set me back on track. Getting back into the track, an example is shown of a check in Cucumber, along with a summary of what is possible and what is still to be done. And then suddenly the presentation is over and slides into a discussion. It keeps me wondering whether continuous integration, continuous delivery and continuous deployment also need, or imply, continuous testing... or is only checking possible then?...
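While chewing on that question I tried to capture for myself how I understood the three flavours of 'continuous'. This is purely my own sketch, not something from Pascal's slides; all function names and the version number are made up.

```python
# My reading of the three "continuous" flavours, as a toy pipeline:
# - continuous integration: run the automated checks on every commit
# - continuous delivery: always have a releasable artefact ready
# - continuous deployment: ship that artefact automatically once checks pass
# Everything here is a hypothetical stand-in.

def run_checks():
    """Continuous integration: the automated check suite on every commit."""
    assert 2 + 2 == 4  # stand-in for the real checks
    return True

def build_artifact():
    """Continuous delivery: produce a deployable, versioned artefact."""
    return {"version": "1.2.3", "checks_passed": True}

def deploy(artifact):
    """Continuous deployment: push the artefact out without a manual gate."""
    print(f"deploying version {artifact['version']}")

if __name__ == "__main__":
    if run_checks():
        deploy(build_artifact())
```

The pipeline automates the checking; whether the testing, the sense-making part, also becomes continuous is still up to the humans around the pipeline.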
After the test lab rats James Lyndsay and Bart Knaack had finished the test lab report and Huib Schoots closed the official part of the day, the crowd went to the bar or to the hotdog stand by 'dokter Worst' outside, enjoying a hotdog, some fries and beer (or wine, or a soft drink, etc.) and some after-conference conversations. I called it a day when I had just finished my hotdog and (after all, it IS almost a summer day) a glass of rosé.
I had an excellent day with good tracks and talks, and I learned a lot. I think this Tasting Let's Test, or 'Let's Test BeNeLux' as it's called this year, is a nice opportunity for those who can't afford the 17,000 Swedish kronor (ex. 25% VAT!!) to attend the full edition. Hope to attend again next year.
2 comments:
Thanks for the summary, Nathalie! At the last minute I went abroad for work after all, so I wasn't there this time. But thanks to you I've still been able to get a little taste ;-) Will I see you again at Agile Testing Days NL? (And if you don't have a ticket yet, check my tweet!)
I am so sorry to have missed this :'(
Will I see you at TestNet later this month to catch up? :-)