Reflection from Let’s Testlab 2013
Martin Jansson

Planning a test lab takes a lot more time than you think. You prepare a lot of things and try to make the event great. Here are a few of the things I considered …

What applications/systems to test?

  • how many would be enough?
  • what type of system?
  • how big or complex?
  • are they fun to test?
  • do they have enough faults in them?
  • are they open source?
  • is the project/company interested in feedback?
  • has the system been used elsewhere in test labs recently?
  • can we gather enough test data or other material for it to be testable?
  • would these trigger interesting discussions?

How do you share information about the test lab?

  • wiki or web page or something else?
  • does anyone read it anyway?
  • a guide on how to report a bug is probably a good idea
  • do testers with different skill levels require different information?

What is your schedule and how are events set up in the test lab?

  • have you gotten hold of any speakers who want to extend their talk into the test lab?
  • will the speakers actually show up? (you do get tired after a talk)
  • depending on the conference, the participants will be receptive to different things
  • depending on the conference, the main schedule will be different
  • if the test lab runs in the evening, it should probably end before people drop from exhaustion
  • what other sessions or events run during the test lab that you might want to sync with?

What venue assistance do you get for the test lab?

  • can you get printouts during the test lab in any format, size and color?
  • do you gain access to white boards, flipcharts, pens, scissors, tape and papers?
  • do you have power and cords everywhere?
  • can you present in two places in the test lab?
  • how many people can be in the room without violating security or safety restrictions?

How do you handle sponsors that help with the test lab?

  • do you have a sponsor that handles all the client machines?
  • do you have a sponsor that handles the servers and wifi?
  • do you have sponsors that want to install tools?
  • do the main conference sponsors have specific requirements for the test lab?
  • have all sponsors installed their tools on the client machines?
  • will the sponsors participate in the test lab and help out other testers during the events?
  • are the sponsors aware of the expectations in the test lab?
  • is it possible to have sponsors in the test lab?
  • what shallow agreements do you have with the sponsors?

How do you intend to handle bug reports for the various systems in the test lab?

  • will you set up bug categories so that you can steer where testers report bugs?
  • will you leave the categories open so that the testers have more freedom?
  • will you use a separate bug system such as Bugzilla or Mantis?
  • will you use a solution such as Redmine or Trac to handle wiki and bug system in one?
  • will you have a few bugs reported beforehand to help guide participants? (a small sketch follows this list)
  • will you review bug reports and help participants when reporting?
  • will you, when finished, report all bugs to the project owners of the systems?
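
If you go the Redmine route, seeding those example bugs can even be scripted. Below is a minimal sketch against Redmine’s REST API; the lab URL, API key, project identifier and bug contents are all made-up placeholders, not anything from the actual Let’s Test setup.

    import requests

    # Assumed values: the URL, key and project id below are placeholders
    # for whatever your lab's Redmine instance actually uses.
    REDMINE_URL = "http://testlab.example.com"
    API_KEY = "replace-with-a-real-api-key"

    # Example bugs seeded before the lab opens, so participants have
    # a template to model their own reports on.
    seed_bugs = [
        {
            "project_id": "xbmc",  # hypothetical project identifier
            "subject": "Player crashes when skipping tracks rapidly",
            "description": (
                "Steps: queue 10+ tracks, skip rapidly.\n"
                "Expected: playback continues.\n"
                "Actual: application crashes."
            ),
        },
    ]

    # Redmine creates issues via POST /issues.json, authenticated
    # with the X-Redmine-API-Key header.
    for bug in seed_bugs:
        resp = requests.post(
            f"{REDMINE_URL}/issues.json",
            json={"issue": bug},
            headers={"X-Redmine-API-Key": API_KEY},
        )
        resp.raise_for_status()
        print("Seeded issue", resp.json()["issue"]["id"])

A handful of well-written seed reports doubles as the reporting guide mentioned earlier: participants can copy the structure instead of guessing what a good report looks like.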

How do you handle information from the projects and owners of systems under test?

  • do you ask them what they would find valuable?
  • do you ask for a mission for the testers?
  • do you ask for their fears, risks or rumors that they wish investigated?
  • are you at all interested in what they think?

How do you handle builds and versions?

  • do you set up oracles such as earlier versions?
  • do you have nightly builds?
  • do you have a recommended version on a USB?

How do you handle existing information about the testing of the systems under test?

  • do you gather session notes, mind maps, test matrices or some other kind of artifact?
  • do you gather models of the system and coverage models?

How do you handle test data for the systems under test?

  • have you set up test data based on a domain analysis? (a small sketch follows this list)
  • have you structured the test data for ease of use?
  • have you documented the test data so that testers of different expertise can understand and use it?
  • do you have test data to perform load tests or performance tests?
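
To make the first item concrete: one lightweight way to structure test data after a domain analysis is to write the partitions out explicitly, so that any tester can pick a value and know what it exercises. A minimal sketch in Python, with invented partitions for a hypothetical media-player system:

    # Test data structured by domain analysis. The partitions and values
    # are invented examples; adapt them to whatever is under test.
    TEST_DATA = {
        "file_name_length": {
            "valid": ["a.mp3", "normal_song_title.mp3"],
            "boundary": ["x" * 251 + ".mp3"],  # 255 chars, a common filesystem limit
            "invalid": ["", "x" * 300 + ".mp3"],
        },
        "file_name_encoding": {
            "valid": ["ascii_only.mp3", "utf8_åäö.mp3"],
            "suspect": ["bidi_\u202e_override.mp3"],  # right-to-left override char
        },
    }

    def describe(partition: str) -> None:
        """Print a partition so testers see what each value exercises."""
        for kind, values in TEST_DATA[partition].items():
            print(f"{partition} / {kind}: {values}")

    if __name__ == "__main__":
        describe("file_name_length")

Documented this way, the same data serves novices (pick from "valid") and more experienced testers (go straight for "boundary" and "suspect").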

How do you organize testers in the test lab?

  • do you let them just sit down and test randomly?
  • do you try to gather them based on specific missions or skills?
  • do you let teams form on the spot?
  • do you have predefined teams that have booked spots?
  • do some speakers have booked spots for others to join up around?

How do you handle pins, prizes, awards or rewards?

  • do you have your regular set of test lab pins to hand out?
  • do you have specific prizes for specific parts of events?
  • what do you promote in the test lab that you would give an award to?
  • are you inspired by the Spirit of the Game award from Ultimate Frisbee?
  • do you have any fake certificates to hand out to promote a special ability or skill?
  • do you have T-shirts or any free giveaways that can be handed out?

How do you handle bells and whistles?

  • do you promote participants making sounds when they find bugs?
  • do you promote focus and silence instead of interrupting sounds?

How do you work with your partner[s] in the test lab?

  • do you split things between you?
  • do you cooperate on everything?
  • do you have a day each?
  • do you have a partner?

How do you handle artifacts generated from the test lab?

  • do you follow Open Source Testing principles by storing the artifacts in a public repository?
  • do you save them for the next conference?

OK, now you see a few of the things we consider before the conference. So, what happened at the conference?

James Bach talked, among many things, about something called “Shallow Agreement”. This is the first thing I experienced with the lab setup. One of the sponsors had sent me 15 laptops that we could use in the test lab. Perfect! What was shallow about our agreement was what a tester does with a laptop. I will, hopefully, not make the same mistake again. What I should have clarified was that I expected the laptops to have full administrative access. Testers want to install, uninstall, reinstall and monitor applications, add our favorite tools and share them among fellow testers. We probably want to use our favorite editors and some probing tools. We probably want to be able to start and stop services, kill any process and change the behavior of the system. Basically, we want to be admins on the environment we are testing. Without that we are blocked. So, the next morning I sent all the laptops back to the sponsor, since I was not able to gain admin access because of sysadmin policies. It was a stupid mistake of mine not to make clear to the sponsor what I expected and how we would operate the laptops. Since the sponsor was a test tool vendor, I just presumed that they knew. But no matter what company you work with, there will be shallow agreements that you need to identify and eventually avoid.

Now the first day of the conference started. James and I planned out the details of the schedule for the first night and imagined that, with lots of tweets, people would bring their own laptops. We were just going to finish up the setup real quick in the test lab so that we could join the sessions and talks. Seven hours later we ate dinner. Then we were almost ready. Compare Testlab, the sponsor who set up the server and ran the wifi, helped us in an excellent way. Torbjörn Wiger from Compare managed to keep his calm and helped us throughout the conference to the last day. We were so happy to have that sponsor aboard. This way, we could focus on the events and activities in the test lab and minimize the work around the equipment.

4-5 teams had registered for the test lab, and we were ready for them! The clock struck 8 and the test lab was open. The test lab was empty; apparently many got stuck in the bar. So, our nemesis for the evening was the bar. As people started to arrive they saw that there were no laptops, and some arguments arose about the lack of them. Yes, I know… it would have been excellent with laptops, but only if they had admin access. Teams started to arrive and tried to set up their laptops, install test data and understand what they were doing there. Apparently many had missed our tweets about bringing laptops. Oh well, so much for information overload.

I didn’t realize how angry I was about the laptops until I started to introduce everyone to the test lab. It did not help that I argued about the laptops, or the lack of them, with one of the testers. I started blabbing and blundering something, then gave the word to James Lyndsay, who took over without breaking a sweat. James facilitated the test lab the first evening, while I went around trying to help as best I could. Note to self: shallow agreements on sponsored items are bad.

Our initial schedule was broken and meaningless, just like reality in any regular project. We changed the plan and managed to add some kind of debriefing. A few teams had gotten very far while others had just started. We realized that the time needed to get started was long; something to consider for coming test labs. The teams debriefed and expressed what they had found. It was not a perfect first day of the test lab.

The next day, directly after lunch, some of the conference participants joined us in the test lab to run a few sessions exploring XBMC. It was nice to test together, sharing techniques and looking at issues found. At 20.00 we started the test lab for the second night. This time people were a bit more prepared, more were on time and the focus was great. James Lyndsay had the great idea of creating certificates, somewhat fake ones, that we posted on a wall to be handed out on both nights. Our idea was that these should hint to the participants what was valued in the test lab, with emphasis on diversity, creativity, persistence and so on. Participants were to hand the certificates out to others who they thought were worthy of them. James elegantly created most of them; I believe I only created the one called Best dressed tester.

The participants in a test lab are used to being a bit stressed, with tight deadlines and short scenarios/events. But this time we went around to the teams a bit after 8.00 and told them we were holding a debrief at 9.40, letting them dig deep and focus. The room was full of energy and focus. Michael Bolton worked on a mind map and exploration of Mumble. Pradeep did the same, but investigated the black boxes created by Altom, based on Flash programs from James. I was surprised that so many wanted to focus on the Mumble application, but I guess I was a bit biased with my focus on XBMC.

Close to 10.00 we started to debrief, and each team presented what they had found. Many of the participants were not used to working collaboratively with planning, testing and reporting. Some said they had learned a lot by watching other testers present, as well as by trying out new tools in their collaboration. When everyone was done, it seemed like a great success. One of the reasons for the success was the participants sharing techniques, tools and ideas, and showing how they tested. Those who participated could at the end of the conference say, “Yes, I was at Let’s Test and I tested!”, which I think is great.

Summing it up, I would like to thank Compare Testlab again for helping out in the lab, James Lyndsay for being such a great partner, and all who joined in to participate in the test lab during the conference. I am also thankful for getting lots of experience with shallow agreements, even if it brought me lots of trouble.

One Comment
Richard Robinson May 31st, 2013

Thanks for this post Martin. I will be using a lot of your questions to guide the setup and operation of the test lab at the Tasting Let’s Test conference in Sydney, 5 August 2013. http://www.lets-test.com

I particularly like the detail around sponsorship management, and the thought that has gone into the different scenarios that can arise.

Cheers
Richard Robinson @richrichnz