UX testing and the PARM principle: STAREAST boasted QA fun

By Elizabeth Simons | 5/9/16

If you’re looking for fun testing, STAREAST is the place to find it. With an overarching theme of user-experience testing, the undertone of this show was definitely a fun one. Not to mention that it’s set in Orlando, FL, home to Disney World, SeaWorld, Epcot, and more.

Hot topic: User Experience (UX)

User experience is obviously a big focus of image-based testing, so we talk about it a lot here at TestPlant. I was happy to find plenty of chatter about how to address UX in functional testing at STAREAST this year as well. Whether focusing on end-to-end testing, mobile and web testing, exploratory testing, or something else entirely, everyone seemed to be discussing testing techniques that take the user experience (UX or UE) into account. The question, as always, is exactly how to test that experience so you can improve it and make it the best one possible.

It’s likely you’ve seen this software testing pyramid before:

[Image: the software testing pyramid]

The question is, where is your testing focus now? Danny McKeown of Paychex discussed how to “Stay ahead of the Mobile and Web Testing Maturity Curve”, and he noted that while much of your testing will be unit testing, and you’ll also do a lot of service/API testing, you cannot forget to test the UI. UI testing is notoriously the most difficult, fragile, and complex testing needed to polish off a software release, but that doesn’t make it any less important.

Danny McKeown also made some specific points about automation. Here’s my favorite: automation doesn’t find defects, good scripts do. Automation is great, and I’m going to shout it from the rooftops, but I definitely agree with his point. The same applies to manual test scripts; a test is only as good as its script.

Another entertaining speaker was, of course, Michael Bolton, presenting on his usual topic of exploratory testing (or as he calls it, “Testing”), and he stressed something very similar to McKeown. A test may be considered a pass, but if it’s written to follow a path through the application that is already known to work, it’s not a very good test. Bolton emphasized exploratory testing as a way to give testers the agency they deserve and to let them play around and find those pathways through the application that are less travelled and potentially more buggy.
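To make that point a little more concrete, here is a minimal, purely hypothetical pytest sketch; the login() function and both tests are invented for illustration and aren’t from either talk. It contrasts a test that retraces a known-good path with one that probes the edges where bugs tend to hide.

```python
import pytest


def login(username: str, password: str) -> str:
    """Hypothetical application entry point, invented purely for illustration."""
    if not username or not password:
        raise ValueError("credentials required")
    return "welcome" if password == "s3cret" else "denied"


def test_happy_path():
    # Retraces the one path we already know works: it will pass,
    # but it tells us almost nothing new about the application.
    assert login("alice", "s3cret") == "welcome"


def test_less_travelled_paths():
    # Exploratory-style checks along edges the happy path never touches.
    assert login("alice", "wrong-password") == "denied"
    with pytest.raises(ValueError):
        login("", "")
```

Both tests can go green, but only the second one is likely to teach you anything when the application changes.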

This is not to say that automation doesn’t have its place alongside manual testing, because it does. In fact, I would argue that exploratory testing is critical, and that automation encourages and enables more of it by freeing up testers’ time for the work they are good at. For some companies, this means cutting costs by using a smaller dedicated team of testers instead of a huge team of manual testers, but Danny McKeown argued that the savings do not necessarily come from cutting testing staff. At Paychex, testing jobs weren’t eliminated but reassigned, and with the same size of testing staff they were accomplishing a lot more testing.

Not to mention that the benefits are long-lasting for a company that implements an automated test plan. In his discussion of how Paychex adopted test automation, Danny McKeown pointed out that if a tester leaves the company, Paychex still has their automated tests.


Fun facts: STAREAST wins with testing fun

One of the cheekiest talk themes had to be Matt Barbour’s “End to End Automated Testing: Lessons from Zombieland”. The Senior Engineering Director at Comcast outlined a number of lessons for surviving what one could call the testpocalypse, with advice such as…

  • Know your way out: Accept that you have a problem, do not tolerate mediocrity, and look forward.
  • Cardio: Also known as A.B.T. or “Always be testing”. Trim the fat, and create tests that yield useful metrics.
  • Limber up: Make sure your test tools scale in both directions, the number of tests and the number of devices.

Barbour may have also stolen the best line of the show when he outlined his PARM testing principle. In order for tests to be effective, they must be:

  • P: Performant
  • A: Accurate
  • R: Repeatable
  • M: Meaningful

…And around here, we just can’t pass up an eggPlant parmesan pun:

[Image: TestPlant’s PARM tweet from STAREAST]
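Puns aside, here is one way to picture PARM in practice: a minimal, hypothetical pytest sketch (the shuffle_deck() function and the time budget are my own invented example, not Barbour’s) with one assertion per letter of the acronym.

```python
import random
import time


def shuffle_deck(seed: int) -> list:
    """Hypothetical function under test: returns a shuffled 52-card deck."""
    rng = random.Random(seed)
    deck = list(range(52))
    rng.shuffle(deck)
    return deck


def test_shuffle_deck_is_parm():
    start = time.perf_counter()
    deck = shuffle_deck(seed=42)
    elapsed = time.perf_counter() - start

    # Performant: stays within a (generous, made-up) time budget.
    assert elapsed < 0.1

    # Accurate: the result is exactly a permutation of 0..51.
    assert sorted(deck) == list(range(52))

    # Repeatable: the same seed always yields the same deck.
    assert shuffle_deck(seed=42) == deck

    # Meaningful: a failure explains itself instead of just going red.
    assert deck != list(range(52)), "deck came back in its original order"
```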

If you like haiku, Isabel Evans had you covered with what was my favorite slide of the conference:

[Image: Isabel Evans’s haiku slide at STAREAST]


Congratulations to the STAREAST Robot Winner!

We of course can’t wrap up without proudly congratulating the winner of our passport contest prize… Gennady Poznyakov, Manager of Software Quality Engineering at Medidata Solutions, was ecstatic to win one of our little robot friends. Congrats, Gennady, and I hope you have fun with your prize!

[Image: TestPlant STAREAST robot winner Gennady Poznyakov]

It’s been a busy few weeks, with Mobile Dev+Test/Mobile IoT Con immediately followed by STAREAST, but it’s also been enjoyable! If there’s anything in particular I took away from STAREAST, it’s that people are creative and determined, and that testing is fun. I’m definitely looking forward to the next STAR event!

Topics: GUI testing, STAREAST, Test automation, TestPlant Events, UE, UE testing, UI testing, User Experience, UX, exploratory testing, Functional testing, software testing, User experience testing, UX testing


Written by Elizabeth Simons

Elizabeth is a Product Marketing Executive at TestPlant. Formerly part of the Technical Support and Documentation teams here at the company, she’s been working with Eggplant test automation tools for the past four years. Elizabeth also loves to paint with watercolors, and she designs repeat patterns for fabric and wallpaper.
