Feature

Online Qualitative Platforms: Expanding Capabilities

By Rebecca Bryant, President, Bryant Research LLC, Knoxville, TN, rebecca@bryant-research.com

One of the many advantages of attending QRCA conferences is touring the Marketplace. Here, the vendors that so many of us partner with gather with the express goal of sharing new ways their products and services might support the qualitative work we do.

The intent of this article is to highlight some of the newly released online platform features. We reached out to a number of those in the Marketplace to find out more. A few have updates pending and were not quite ready to talk about their latest developments, so stay tuned for those. While not a comprehensive catalog, this article is intended to serve three purposes:

  1. to introduce new online capabilities to some readers,
  2. to encourage trial of capabilities that appear to be a good fit for our readers, and
  3. to foster conversation between qualitative researchers and our vendor partners so that product development continues to focus on value-added capabilities.

This article is not an exhaustive review; it is the beginning of a conversation. To that end, I invite vendor partners to be in touch so that we might continue this dialogue. And, I encourage conversation on the QRCA Forum regarding additional options that quallies would like to see. After all, a rising tide lifts all ships.

Artificial Intelligence Producing Machine Transcripts

Several of the platforms currently offer, or soon plan to incorporate, artificial intelligence (AI) in ways that can greatly reduce the time needed to pull quotes and to build a compelling report. Machine transcription works in the background during an online interaction, turning the spoken word into transcribed text.

Is machine transcription perfect? No. In fact, several of our online partners offer quick turnaround on human transcription, should that level of precision be needed. What machine transcription is, however, is fast. Whether from a focus group, triad, dyad, or one-on-one depth interview, researchers can very quickly access machine-generated transcripts from the cloud. Imagine getting a transcript within fifteen minutes, quickly conducting keyword searches based on notes jotted down during the interview, and pulling video clips for use in the report right after groups. QualSights, 20|20 Research, and Civicom are just a few of those offering this capability. All stand ready to demonstrate how this works on their specific platform.
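
For those curious about what the quote-pulling step can look like once a machine transcript is in hand, here is a minimal Python sketch. It assumes a transcript exported as JSON with timestamped segments; the field names and file name are illustrative only, not any vendor's actual export format.

    import json

    # Illustrative only: assumes an exported transcript of timestamped segments,
    # e.g. [{"start": 912.4, "speaker": "P3", "text": "I always buy the unscented wipes"}, ...]
    # The field names are hypothetical, not a specific vendor's schema.

    def find_quotes(transcript_path, keywords):
        """Return transcript segments mentioning any keyword, with timestamps for clip pulling."""
        with open(transcript_path) as f:
            segments = json.load(f)
        hits = []
        for seg in segments:
            text = seg["text"].lower()
            if any(kw.lower() in text for kw in keywords):
                hits.append((seg["start"], seg["speaker"], seg["text"]))
        return hits

    if __name__ == "__main__":
        for start, speaker, text in find_quotes("group1_transcript.json", ["unscented", "price"]):
            minutes, seconds = divmod(int(start), 60)
            print(f"[{minutes:02d}:{seconds:02d}] {speaker}: {text}")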

Email Alerts

Some, like 20|20 Research, offer time-saving options like email alerts that pop into the moderator’s inbox as soon as a participant responds to a follow-up question. This gives the researcher the option of reading this information and quickly responding while the participant is online, saving time for both the moderator and the participant.

Tagging Sentiment with Artificial Intelligence

In addition to machine transcription, QualSights, which operates on any visual media or camera device, features sentiment tagging. Based on responses to specific questions, AI groups individual responses into nominal categories. For example, the researcher might ask participants to describe their reaction to a product or service. The platform then buckets all of the responses into positive, negative, or neutral categories. Additionally, it counts these response types by category and provides a graphic display, making it very easy to see how opinions are grouped. Is it quantitative data? Not at all. Rather, it is a quick visual record of sentiment that can both direct and speed up analysis.

20|20 Research also uses automation to provide what it terms "automatic tagging." Its Analytics Engine surfaces sentiment and keywords and presents them in a graphic display. Responses are characterized as positive, negative, or neutral using IBM Watson technology.
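
To make the bucketing concrete, here is a simplified Python sketch of the idea. The keyword-based classifier below is a stand-in only; in practice the platforms rely on services such as IBM Watson rather than anything this crude, and the sample responses are invented.

    from collections import Counter

    # Hypothetical stand-in for the platform's AI sentiment service.
    POSITIVE = {"love", "great", "easy", "works"}
    NEGATIVE = {"hate", "confusing", "broke", "disappointed"}

    def classify_sentiment(response):
        words = set(response.lower().split())
        if words & POSITIVE and not words & NEGATIVE:
            return "positive"
        if words & NEGATIVE and not words & POSITIVE:
            return "negative"
        return "neutral"

    responses = [
        "I love how easy the cap is to open",
        "The pump broke after two days",
        "It was fine, nothing special",
    ]

    # Bucket every response, then count per category for a quick visual read.
    buckets = Counter(classify_sentiment(r) for r in responses)
    for label in ("positive", "negative", "neutral"):
        print(f"{label:>8}: {'#' * buckets[label]} ({buckets[label]})")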

What could be better than this? What about coding facial emotions? Stay tuned because QualSights plans to offer this soon as an add-on feature.

AI Image and Brand Tagging

20|20 Research has incorporated computer vision AI that, with 80% accuracy, tags images and identifies specific items or brands in the pictures uploaded by participants. Imagine doing a study on baby wipes and, through this technology, having the online platform automatically register the number of images in which a certain brand of diaper bag appears. This type of contextual data is now available automatically.
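
Conceptually, the counting step is straightforward once each image has been tagged. The Python sketch below is illustrative only; the tag data, file names, brand name, and confidence threshold are invented for the example.

    # Illustrative only: assumes each uploaded image has been run through a computer
    # vision tagger that returns (label, confidence) pairs. The data below is made up.
    image_tags = {
        "p01_photo1.jpg": [("diaper bag", 0.91), ("BrandX", 0.87), ("stroller", 0.66)],
        "p02_photo1.jpg": [("kitchen counter", 0.94), ("baby wipes", 0.82)],
        "p03_photo2.jpg": [("BrandX", 0.79), ("diaper bag", 0.84)],
    }

    CONFIDENCE_FLOOR = 0.80  # illustrative confidence threshold for counting a detection

    def images_showing(label, tags_by_image, floor=CONFIDENCE_FLOOR):
        """Count images in which a given label was detected at or above the confidence floor."""
        return sum(
            any(tag == label and conf >= floor for tag, conf in tags)
            for tags in tags_by_image.values()
        )

    print("Images with BrandX:", images_showing("BrandX", image_tags))          # -> 1
    print("Images with a diaper bag:", images_showing("diaper bag", image_tags))  # -> 2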

Real-Time Chatting

With QualSights, it is possible both to remotely observe participants and to shift into a real-time interaction. For example, the participant is assigned to shop for oral care products. The researcher can watch the participant shop through the participant’s mobile phone. Then, should the researcher want to chat directly with the participant during the shop, the platform supports this interaction. And, the researcher has the option of bringing the client in to observe the shop-along and/or the real-time follow-up session. No more sending questions after the fact. Now, we can opt to capture the moment in real time.

Adding Skip Patterns and Combining Projects More Easily

Sometimes small changes save time, and that is a great value.

Skip patterns, a long-time staple feature in quantitative surveys, can now be programmed into the scripts loaded into 20|20 Research’s QualBoard. Skip patterns allow researchers to pre-set follow-up queries based on individual participants’ responses to a prior question. For example, those who indicate they have tried a certain product might be asked to relay their product-use experience, while those who report no experience with a product might skip to a new line of questioning.
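
For the technically curious, a skip pattern is simply conditional routing. The Python sketch below illustrates the idea; the question IDs and routing rule are hypothetical and do not reflect QualBoard's actual scripting syntax.

    # A minimal sketch of skip-pattern logic with hypothetical question IDs.
    QUESTIONS = {
        "Q1": "Have you tried Product A? (yes/no)",
        "Q2": "Tell us about your experience using Product A.",
        "Q3": "What do you currently use instead, and why?",
    }

    def next_question(current_id, answer):
        """Route each participant based on their prior answer."""
        if current_id == "Q1":
            return "Q2" if answer.strip().lower() == "yes" else "Q3"
        return None  # end of this line of questioning

    answer_to_q1 = "no"
    follow_up = next_question("Q1", answer_to_q1)
    print(QUESTIONS[follow_up])  # participants who have not tried the product skip to Q3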

It is now possible to combine projects on QualBoard 4.0. In the past, information from an in-home use test (IHUT) with 35 participants had to be manually combined with the information collected during a follow-up IHUT in which participants were asked to try another version of the product. Simple, yes. Helpful, immensely. I, for one, happily bid a fond farewell to the days of exporting open-ended comments into Excel in order to piece together related studies.
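
For comparison, here is roughly what that manual piecing-together looks like when done outside the platform, sketched with Python's pandas library. The participant IDs and comments are invented for the example.

    import pandas as pd

    # Hypothetical data: open-ended comments from two IHUT waves, keyed by participant ID.
    wave1 = pd.DataFrame({
        "participant_id": ["P01", "P02", "P03"],
        "wave1_comment": ["Too sticky", "Loved the scent", "Cap was hard to open"],
    })
    wave2 = pd.DataFrame({
        "participant_id": ["P01", "P02", "P03"],
        "wave2_comment": ["Version B felt thinner", "Scent was weaker", "New cap is easier"],
    })

    # One row per participant with both waves side by side: the piecing-together
    # that previously happened by hand in Excel.
    combined = wave1.merge(wave2, on="participant_id", how="outer")
    print(combined)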

In Summary

A number of online platforms designed for qualitative research now employ artificial intelligence in ways that save researchers time, augmenting the analytic rigor quallies bring as they work to achieve their clients’ objectives. Transcription has been automated. Sentiment is being tagged. Brands and items in participant-submitted images can be identified by machine technology, providing quick contextual insights. In addition to remotely observing mobile shop-alongs, we can ping participants during the shop and switch to real-time interactions, drilling down right then rather than waiting for a response to a text follow-up sent when the participant may no longer recall the moment.
