Is Your Arts Programming Usable?
(Cross-posted at the National Arts Marketing Project's blog salon on ARTSBlog, taking place all this week.)
At Fractured Atlas, we're in the process of rolling out a few new technology products that have been in the pipeline for the past year or so. One of these is Artful.ly, which is the hosted version of the ATHENA open-source ticketing and CRM platform that was released earlier this year. Another is a calendar and rental engine add-on to our performing arts space databases in New York City and the San Francisco Bay Area that will allow visitors to the site to reserve and pay for space directly online.
For both of these resources, we felt it was important to get feedback from actual users before proceeding with a full launch. So we engaged in a round of what's called usability testing. Usability testing differs from focus groups in that it involves observing participants as they actually use the product. So, rather than have people sit around a room and talk about (for example) how they might react to a new feature or what challenges they face in their daily work, you have people sitting in front of a computer and trying to navigate a website's capabilities while staff members look over their shoulders and take notes. Users are given concrete, specific assignments like "exchange Bob's Friday night tickets for Wednesday night's show," and the degree to which they succeed, the time it takes, and how they feel about it are all recorded as part of the testing process.
I was thinking recently about how the idea of usability testing could be applied to a marketing or programming context. After all, the purpose certainly translates: don't you want to know how people really interact with what you offer? Especially if they're not already extremely familiar with what you do?
The key feature of usability testing that makes it different from most other feedback-gathering methods is that it is based on direct observation rather than self-reporting. As applied to artistic programming, for example, it would be more like Drew McManus's Take a Friend to the Orchestra series than WolfBrown and Theatre Bay Area's intrinsic impact initiative. Because of its labor-intensive design and the need for cooperation from the study subject, it doesn't generally allow for a large sample size, but the richly detailed information that comes out of it can yield tremendous insights.
Here are a few other scenarios that come to mind where usability testing (or something similar) could be relevant:
- Can your best friend's grandmother find tickets to your latest show, without having been provided with your website or any other information about it?
- How long does it take a visitor to your museum to find the gift shop and purchase a mug for her friend's birthday?
- Is a newcomer to your organization able to follow the plot of your opera company's performance of Rigoletto? How much, and what, does he remember about it afterwards?