Case Study: Online Communities For Qual Research

It’s December! The days are shorter, the weather is colder and we’re beginning to look forward to the holidays! December is also a good time to reflect on what has happened in the past year and to plan ahead for the new year.

In 2012, the marketing research industry buzz about online communities continued. As noted in the Spring 2012 GreenBook Industry Trends (GRIT) report, “. . . Online Communities are already mainstream”. While that may be true, the technology to power an online community for insights might not yet be in your toolkit. In this post, I’m going to share what one of our clients learned about online communities in their first foray earlier this year.

In the summer, Ramius partnered with a market research agency and their client, a major grocery retailer. The client was interested in experimenting with online communities to understand what the method could bring to their research needs. Since the agency had many years of experience moderating focus groups and online discussions, we collaborated to combine Ramius’ Recollective offering with their expertise to meet the client’s objectives.

The project chosen by the grocer was to understand the big-box store phenomenon: why and how consumers shop there. It involved a national consumer survey followed by focus groups, into which a three-week online research community was incorporated to discern new insights. This meant the client and agency could compare and contrast their online experience with the traditional focus group method. Some interesting findings came from the project.

Communities afford opportunities for wider exploration of issues

In contrast to traditional qualitative research techniques, an online community is less time constrained and respondents aren’t as limited in how they can contribute. For example, a participant could start her own online discussion thread and get right to the heart of what matters to her, rather than wait for the focus group moderator to direct a question her way. In this project, the researchers had a full three weeks to delve deeply into responses and discussions to meet the client’s brief, yielding rich data and allowing for more complex analysis.

Variety leads to better engagement

The Recollective platform has a bulletin board function for moderator- and/or participant-led discussion. Recollective also provides an engine to deliver structured research activities including diaries, mystery shopping, brainstorming, rankings, card sorts, and open-ended text and photos. We found that presenting a variety of activities increases participant engagement in the community; participants will at least complete those activities well suited to their abilities, interests and time.

For example, in the mystery shopping exercise, some participants favored video responses – one participant even filmed an interaction with the manager of the mystery-shopped store. Others preferred to contribute feedback in written form. The lesson here was to respect and facilitate the variety of end-user preferences to maximize the insights gathered from the community.

Open-ended queries elicit greater depth and creativity of response

The researchers designed an exercise that asked respondents to use images to express their perceptions of a grocery store. Using stock images, they received somewhat expected answers. By allowing participants to find and share their own images, the researchers anecdotally found responses of greater depth and creativity. The answers led down novel and unexpected paths that could become future research topics.

Socializing responses

Recollective has a clean, intuitive design similar to modern social networks like Facebook. This includes “social features” that encourage study participants to interact with each other. For example, it has a newsfeed that shares participant responses with the wider community, which the project found stimulated insightful discussions because there were more opinions and content to build upon.

Recognition and reward

Recognition is a commonly identified community effect that also worked well in a short-term community. A “Star of the Week” respondent was singled out based on response quality, frequency and quantity of contributions. This encouraged and motivated other participants to strive for similar recognition and reward.

Incentives were used in the study and were tied to the completion of activities. To encourage participant-led discussions, rewards were based not on starting a topic, but on whether the topic resulted in other participants contributing to the new thread.

Be real with the participants

The researchers added video instructions, which made the study a more personable experience: participants could actually see and hear the team behind the project. Where appropriate, key findings were fed back to participants, who were keen to learn the results.

Apply community-based techniques to appropriate solutions

At the end of the study, both the researchers and the client were impressed with the possibilities of online community-based research. It yielded a wealth of information that can lead to future research topics. Certainly there are some situations it’s not suited to, but we found it very applicable to:

  • exploratory research projects
  • longitudinal research (multi-phased product development, in-home testing over time, purchase processes, behaviors)
  • testing (concepts, communications, ads, etc.)
  • reality checks
  • group ideation, co-creation and crowdsourcing initiatives

For more information on this case study, contact Ramius.

Recollective Release – December 2012

Recollective will very soon include a new, visually rich task type called “Sort and Rank”. The task type is a variation on a single-choice grid question whereby a series of “cards” can be placed into one or more “groups” by participants.

Cards and groups are defined with text and an optional image. Each card can be placed into one group, and a group can hold many cards. What’s more, within a group, cards can be ranked, and to aid participants, a scale can be defined for each group (e.g. “Most Sour” to “Least Sour”).

As with other task types, the Sort and Rank task can be completed on any mobile device without degradation of the activity’s visual richness. Each task submission is neatly summarized in the response stream, and a rich overview area with customizable charts and exportable data is provided for analysts.

In a simple example, a set of 10 cards could be defined with only a single group. Participants could be asked to rank only 5 of the 10 cards by placing them into the group in their desired order. A more complex configuration uses multiple groups, allowing items to be categorized and then ranked within those categories.
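To make the card-and-group rules above concrete, here is a small sketch of how that data model could be represented in code. This is purely illustrative and is not Recollective’s actual implementation; all class and method names are invented.

```python
# Hypothetical sketch of the Sort and Rank data model described above.
# Rules modeled: a card belongs to at most one group, a group holds many
# cards, and cards are ranked (ordered) within their group.

class SortAndRankTask:
    def __init__(self, cards, groups):
        self.cards = set(cards)                 # card labels defined for the task
        self.groups = {g: [] for g in groups}   # each group is an ordered list of cards

    def place(self, card, group, position=None):
        """Place a card into a group at an optional rank position."""
        if card not in self.cards:
            raise ValueError(f"unknown card: {card}")
        # Enforce the one-group rule: remove the card from any group it's in.
        for members in self.groups.values():
            if card in members:
                members.remove(card)
        members = self.groups[group]
        if position is None:
            members.append(card)                # default: ranked last in the group
        else:
            members.insert(position, card)      # rank within the group

# Simple configuration: 10 cards, one group, participant ranks only 5 of them.
task = SortAndRankTask(cards=[f"card{i}" for i in range(10)], groups=["Ranked"])
for i, card in enumerate(["card3", "card0", "card7", "card1", "card9"]):
    task.place(card, "Ranked", position=i)
print(task.groups["Ranked"])  # → ['card3', 'card0', 'card7', 'card1', 'card9']
```

The multi-group configuration works the same way: define several groups and participants first categorize each card into a group, then rank it within that group.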

We expect to see a lot of innovative use of this new task type and will expand its capabilities based on your feedback.

Using Backroom Collaboration in Recollective

A question posed a handful of times in the past few weeks is how to create a backroom comment in Recollective. We’re expanding the backroom commenting capabilities in a future release but for now, this post describes some good practices for backroom comments applied to task responses and excerpts.

Task Response Backroom Comment
The first and most common backroom comment is tied to a task response. To add one, you must first navigate to a task response. To do that, click on an activity card and either select a task card or a participant’s name in the Activity Response Summary table at the bottom of the page:

Clicking on a task card will bring you to the first response for that task. Alternatively, clicking on a participant’s name opens their response for the first task in the activity. In either scenario, there are two comment textboxes beneath the participant’s task response:

The top box is where you can add either Open Comments (visible to anyone who has permission to view that task response) or Backroom Comments. The bottom box is where you can send a private message (probe) to the participant.

To add a backroom comment, select the Backroom Comment option at the top of the first comment textbox (you will notice the box turn red), enter your comment and press the Add Comment button.

Excerpt Backroom Comment
To apply a backroom comment to an excerpt, you must first create the excerpt. This is done by selecting a portion of user submitted text with your mouse and clicking the Save Excerpt option in the menu that appears. The Coded Excerpts section of the side panel will automatically open, enabling you to apply codes to your excerpt and add a backroom comment:

Backroom Comment vs. Backroom Task
In both scenarios described above, you will notice a checkbox beneath the text input fields labelled “Make this a backroom task”, which is enabled by default. Keeping it enabled turns your backroom comment into a “to do” item. This means the person creating the task is either asking someone else to do something (e.g. “Please probe more on this response”) or setting a reminder for themselves to do something.

If you don’t want to create a backroom task and are merely commenting (e.g. “This is a trend we’re beginning to see more often”), uncheck that box to avoid creating a task.

Backroom Comment Notifications
All backroom comments and tasks are easily accessible in the Backroom section of the side panel:

This list shows, in reverse chronological order, all backroom comments made within the study. You can use the options at the top to filter that full list to show outstanding or previously completed tasks.

You can click on an item in the list to be brought to the original task response or excerpt that the backroom comment was made against. For tasks, once they are complete, click the “Mark as Completed” link to move the task from your To Do list to your Completed list. This helps you stay on top of any outstanding tasks within the study.

Do you have any suggestions for improvements to backroom comments? Please add your suggestions by commenting on this blog or submitting a feature request directly from your Recollective study (click the Feature Request option in the side panel).