Developers exploratory testing – Expanding its value

This post was also published on my company blog.

It is common practice in our company to perform Developers Exploratory Testing (DET) sessions, explained by my colleague Davor here. The cool thing is that this way of performing higher-level testing has actually been embraced by our developers, and they really enjoy it.

In my current work on developing our organization-wide quality practices, I have taken a deep dive into how DET is carried out on a regular basis. What I have seen is that DET is accepted and acknowledged as a valuable practice, but it is not really carried out to its full potential. There are many details and aspects of it to work on, especially regarding reporting and follow-up.

The other day I was asked to help one of our teams with a DET session. As they were already familiar with the approach, I wanted to expand their view of its potential through lightweight note taking, and showed them an example of a very basic session template I like to use (sketched below). I also explained the intention behind keeping bugs and issues separated, as well as how I handle test notes and the summary. We decided to use a whiteboard instead and put stickies with issues and bugs on it for visualization. We spoke about the mission and decided on two different areas to focus on. The application under test is an iOS app where a new component is used in the app’s core functionality for iOS 5.
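For reference, a very basic session sheet along those lines could look something like this. The headings are just my own wording of the parts mentioned above, not a fixed format:

    Mission:    what we set out to explore in this session
    Test notes: areas covered, data used, ideas followed
    Bugs:       things we agree need fixing
    Issues:     questions, observations, possible improvements
    Summary:    overall impression, coverage, follow-up for the debrief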

The four of us paired up into two teams with a few different iOS devices and tested for about 45 minutes. We found quite a few issues and bugs that were put on the whiteboard, but not many notes about which areas we covered. Of course, by looking at the issues you might get a hint of which areas have been covered, but how much? We also quite quickly noticed that we had to tag each issue with the test environment (iOS version and device), since behaviour differed quite a lot between them. Most of the things found were considered bugs to fix, which is not always the case in every project or setting. It usually depends on the customer and our relationship with them, as well as on having a decision maker present during testing. In this project, the scrum master/tech lead who was present knows the customer well enough to make those judgement calls.

But what about the learnings?

After the session debrief we got into a meta-debrief, discussing the outcome compared to my introduction about reporting. At first there was not much understanding of why test notes are valuable, but this shifted a little during our discussion. “With our smaller projects changing this rapidly, even storing test notes might be waste” is an argument I will take with me. I explained the common scenario of being asked what was tested and in what configuration. It could also be valuable for future test sessions to know which parts of a functionality were covered and when they were last tested. I also like to emphasize that test notes can explain current functionality which might not be explicitly stated in the requirements.

The team was happy with the experience, and I got some more input on how to improve the value of our DET sessions. I am not going to abandon the reporting, but I need to find a way of combining it with the fun, collaborative interaction between the pairs that the whiteboard enabled. That collaboration actually helped draw attention to the issues during the debrief.

And then the developers actually got their own ideas about what a session note-taking tool should look like to suit their needs; this is how they sketched it out after the session. I would explore some other options before building our own, for example RapidReporter or SessionWeb, but it is really cool that the meta-discussion could trigger further thinking about the problem. I wrote about other aspects of the problem of collective knowledge transfer here.
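To make the idea of lightweight, timestamped session notes concrete, here is a minimal sketch of what such a helper could look like. It is only an illustration based on my own assumptions; the note types, CSV layout and file name are hypothetical, and it is neither the team’s sketch nor how RapidReporter actually stores its notes:

    # Minimal sketch of a timestamped session-note logger.
    # All names and the CSV layout are assumptions for illustration only.
    import csv
    from datetime import datetime
    from pathlib import Path

    NOTE_TYPES = {"note", "bug", "issue", "question"}  # assumed categories

    def log_note(sheet: Path, note_type: str, text: str, environment: str = "") -> None:
        """Append one timestamped, typed note to a CSV session sheet."""
        if note_type not in NOTE_TYPES:
            raise ValueError(f"unknown note type: {note_type}")
        is_new = not sheet.exists()
        with sheet.open("a", newline="") as f:
            writer = csv.writer(f)
            if is_new:
                writer.writerow(["timestamp", "type", "environment", "text"])
            writer.writerow([datetime.now().isoformat(timespec="seconds"),
                             note_type, environment, text])

    if __name__ == "__main__":
        sheet = Path("det-session.csv")
        log_note(sheet, "note", "Started on the core flow with the new component", "iOS 5 / iPhone 4")
        log_note(sheet, "bug", "Crash when rotating the device mid-upload", "iOS 5 / iPad 2")
        log_note(sheet, "issue", "Confirmation dialog wording is unclear", "iOS 4.3 / iPhone 3GS")

Something this small would keep the notes machine-readable, so the usual questions about what was tested, when, and in which configuration can still be answered after the stickies come off the whiteboard.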

  1. Lisa
     December 29, 2011 at 21:14

    I’m fascinated with the idea of Developer Exploratory Testing. It takes the whole-team approach to testing to a new level.

    What I can’t tell from this post or your colleague’s article is what you do in terms of automating regression tests. Also, do you use tests to drive development (specification by example)? I see DET as an addition to these other practices.

    Our developers do exploratory testing when needed – if the testers are all occupied and some exploratory testing needs to be done. They may also do it as research while working on new features. But we haven’t tried anything this structured. Would be fun to try.

    • Sigge
      December 29, 2011 at 23:24

      Thank you for your comment, Lisa. I have to admit that a lot of my previous thinking on this has fallen into place over the last two days of discussing it more, and I just need to get that into writing in a proper way. So stay tuned for those thoughts during the next couple of days, which is why I don’t really want to give you a full answer to your question just yet. It takes some time for me to structure my thoughts and get them into writing. =)

      About our teams, there is a very big difference in the maturity of automation between them, both regarding regression tests and TDD. The same goes for driving development with specification by example.

      As you are suggesting, DET is an addition to other practices in development. However, I am seeing a bigger-picture view of it now; elaborating on that is my next step.

      About your developers, how do they perform exploratory testing?

  2. Lisa
     December 29, 2011 at 23:40

    We don’t have any kind of structure around our exploratory testing. We put suggested test scenarios on the wiki page for each user story, and for more complex stories or themes we do testing mind maps showing relationships and what areas need testing. When the developers pick up a manual exploratory testing card, they follow what’s on the wiki plus use their own ideas and domain knowledge. Everyone on the team has deep domain knowledge, and the developers have had to do enough testing that they are good at it. We do TDD and specification by example as well, and we do a lot of exploration as we are creating API-level functional tests to guide development.

  3. Björn
     January 3, 2012 at 10:39

    Very interesting! This is actually very similar to what we do in my team. During the sprint we test mostly through a mixture of test cases and session-based exploratory testing, with feature “playbooks” as guidance for the testing.

    At the end of each sprint the testers identify one or a couple of focus areas where we see the most risk for the particular release. We then gather the whole team, including testers, developers and the PM, in front of a whiteboard where we explain which areas we want to exercise and give some hints about what to look at within those areas. We also explain which environment to work in and whether there is any test data prepared. We divide everyone into pairs and each pair gets a focus area. We then go and explore the area(s) for an hour. Each pair takes notes on post-its about everything they find; it can be bugs, issues, or just questionable functionality or GUI improvements. When the time is up we meet at the whiteboard again and go through everything we found. We decide and prioritize which issues to act on now, which should be logged for later fixing, and so on.

    We call this session a “Dress rehearsal” and we find it a very effective way to catch those extra issues that we might have missed during sprint testing and regression testing. We have just started to build up our automated regression testing using specification by example and SpecFlow, but we are not quite there yet. Even later on, when we have expanded our way of working with SBTM and specification by example, we all see the “Dress rehearsal” as a really nice addition where the whole team gets together for a test bash for an hour.

    • Sigge
      January 5, 2012 at 02:06

      Hi Björn,
      Thank you very much for your comment. It sounds like a good way of working you have there, but from my perspective I would call that DET. A dress rehearsal is, in my experience, something very different from this.
