My New Favourite Question

Not long ago I found myself in the midst of what was disguised as a quick online survey but proved to be a significant investment of my time. I opted in because I wanted to contribute; I opted out because I didn’t want to contribute that much. Striking the right balance between workload and incentive is difficult. When I’m working with a team that’s new to research communities, I can, understandably, expect to hear at least one question dealing with this dilemma:

“How many questions should/can I ask in a day?”

“How much time can I expect participants to spend contributing to the community on a given day?”

“Is X enough incentive for Y time spent?”

I can provide some broad guidelines based on my exposure to other communities, but I always conclude with a confident “it depends.” The answers to these questions depend on a number of variables: demographics, subject matter, the incentive offered, and so on. In weighing these variables, the best I can offer is an estimate of what might be fair in terms of workload and associated payout. The trouble is that if participants start contributing but then decide that either the incentive or the workload is unfair (as I did), it will have a negative impact on the results: participants will rush to finish the study or abandon it altogether. More recently, though, I was struck by a simple solution implemented by a new client. In retrospect, the solution seems almost intuitive -- just ask the participants whether everything has been reasonable and fair so far. The client’s question went something like:

“And just so I know, if I am asking too many or too few questions would you please let me know how much time you've spent working on these questions? This will be very helpful going forward.”

Why didn’t I think of this? Why, of the hundreds of communities I’ve supported, has no researcher (to my knowledge) asked this before? This simple question draws attention to a few of the advantages of an asynchronous engagement tool such as Recollective. It’s not unusual to find a thank-you message addressed to participants at the end of a study, accompanied by a field for general feedback. However, positioning this query near the beginning of the community -- at the end of the first day in this particular instance -- makes the feedback far more useful. Moreover, rather than soliciting general feedback, pointedly asking participants to reflect on the effort they have just expended -- quantified in units of time, for example -- yields concrete information to act on. Placing an explicit inquiry near the beginning of the study presents a number of opportunities:

  1. This information can be used to plan the progression of the community. With a better sense of the participants’ effort threshold, we can plan the distribution of topics to avoid exhaustion and keep participants enthusiastic. In other words, as the community develops, the researcher has a benchmark for how much effort they can ask for and can plan future activities accordingly.
  2. A more equitable distribution of the workload gives us the opportunity to maximize the value received from the incentive payment. If we’ve struck a balance that participants feel is fair, we can more confidently expect thorough, complete and genuine responses rather than participants simply opting out. To achieve this balance, it helps to ask participants explicitly to evaluate the amount of time spent; doing so quantifies their investment of effort in relation to the incentive offered. Complicating this point is the prospect of socializing the responses. I would expect this to influence outliers -- possibly encouraging under-performers but also discouraging top performers -- though the same could be said of any socialized activity. In this case, responses were socialized and consistently indicated that the workload seemed about right, so I can’t say I observed any considerable change in behaviour. Still, the implications of socializing this sort of question should be weighed.
  3. The final, and possibly most compelling, opportunity is that the question establishes a positive tone or, put differently, initiates a positive relationship between participants and moderators. It illustrates that the relationship is reciprocal and demonstrates to participants that their work is important, valued and compensated. Also, though outside the primary aim of the research, it demonstrates that their contributions can and will enact some kind of change – something I’ve been interested in for a while now.

The most effective communities I’ve supported react and evolve in real time based on the participants’ contributions. This simple question provides an additional layer of depth that positively directs the progression of the community: we can optimize both the thematic subject matter and the participants’ eagerness to contribute. Ultimately, what I like most about this approach is the opportunity to right the wrongs of other online methods that may unintentionally deceive participants or dissuade them from continuing.

Dana Cassady
Vice President Customer Services

Let's research happy together