Measuring Your Crowdsourcing Efforts by Aliza Sherman | Beth’s Blog


Note from Beth: My colleague, Aliza Sherman, has written a book, The Complete Idiot’s Guide to Crowdsourcing, and you should drop everything and buy it now. We’ve been chatting about how to measure the impact of the crowd, and she offered to write this guest post on the topic.

Measuring Your Crowdsourcing Efforts by Aliza Sherman

If you are considering incorporating crowdsourcing principles or processes into your workflow, you should also be thinking about how you’ll measure the results of your crowdsourcing efforts. To know how to measure crowdsourcing results, you first need to understand what kind of crowdsourcing you’re implementing.

The Types of Crowdsourcing

Crowdsourcing consists of principles, processes and platforms to get things done and usually involves an open call to a group of people online – the “crowd.” There are three main types of crowdsourcing: Work, Input and Action. Each type dictates a different process and therefore requires different measurement and analysis methods.

For clarification, “Work” means any process that requires actual effort and is usually compensated in some way. Work can consist of microtasks: small, repetitive tasks that might constitute “busy work” and that can be performed with minimal skills. Microtasks could include tagging hundreds of images or cleaning up hundreds of database entries. Other work might be creative, such as coming up with a name for a new product or company, designing a logo, website, or brochure, or producing a 30-second video spot.

Some creative work is crowdsourced in the form of creative or design contests, while other work is produced by interactive agencies that gather and vet online communities of designers, artists, and producers to be part of a creative development process. For nonprofits, crowdsourced work can be a cost-effective way of taking care of repetitive tasks through sites like Crowdflower, Castingwords, and Clickworker. You might also tap into online communities of people willing to provide services through a site such as Sparked.com, where professionals volunteer time and talent to help nonprofits and causes with marketing and design needs.

“Input” usually means getting others to provide information of some kind, such as feedback on an initiative, ideas for new campaigns, or answers to questions. Nonprofits can use answer sites such as Quora.com or Vark.com, which use variations of crowdsourcing principles to tap into many people for input. In some cases, money also changes hands, such as in innovation challenges where you might be seeking a solution to a complex R&D problem and provide an award to the solver.

In terms of “Action,” nonprofits can deploy crowdsourcing initiatives that encourage many people to contribute to a cause such as through crowdfunding on platforms like Causes or Crowdrise or any of the other numerous social online fundraising platforms.

Action can also include online voting initiatives to get a wider audience to participate in decision-making. And there are a growing number of examples of nonprofits getting the crowd to provide information toward the greater good, such as Ushahidi’s platform, which makes it easier for rescue organizations to aggregate information from citizens during a crisis, or Global Voices, which assembles on-the-ground reports from citizen journalists about protests happening around the world.

 

Measuring Work

When measuring work, you need to start with a clear outline of the task at hand. Know what work you need done and the quality of work you’d like to receive, and set benchmarks to measure outcomes. Work completion is clearly a goal, as is accuracy. If speed is a goal, crowdsourcing may not be your best option; however, saving money and freeing up staff time can be reasonable outcomes.

To help ensure that there are some quality standards in place for work rendered, you are better off using a work-specific crowdsourcing site than trying to assemble an online crowd and manage it ad hoc. Crowdflower, in particular, is pioneering mechanized QA processes within its work platform to provide a high degree of accuracy for work rendered through its site. Many crowdsourcing work sites provide some kind of rating system, meaning the better, more accurate workers can rise to the top. Sites like uTest and Topcoder help you manage work like website or application testing and provide ratings and controls to help you handle more technical processes with vetted programmers and developers.

Measuring Input

Measuring Input can be a little more nebulous. What is a good outcome when you are looking for an answer? Getting the right answer? But if you don’t know the answer to something, how will you ultimately know you are getting the right answer? Crowdsourcing can be a compelling way to get a wide variety of answers and to consider the “wisdom of the crowd” if they are able to vote good answers up. But you are also relying on the knowledge, skills and even honesty of the crowd to get to that “right” answer.

In terms of feedback, measurement is much more subjective, and there is a fuzzy line between surveying, polling, or questioning a lot of people or your constituents and actual crowdsourcing for input. Just getting input isn’t really crowdsourcing; there needs to be some other “crowd” participation mechanism, such as a voting aspect, to make it closer to “true” crowdsourcing.

In terms of measuring success when crowdsourcing input, just getting a large number of responses may constitute success, so you might set a goal of receiving dozens or hundreds of responses. You can take measurement of input further by looking at how you use the responses you receive. Does the input help you solve a problem? Does the aggregated input allow you to move forward on a project, or move in a new direction? Was the outcome of the initiative affected by your input better in some way because you tapped into a crowd? Again, crowdsourcing for input is harder to measure in concrete terms, but if you know what you are trying to achieve, and you achieve that or surpass expectations, then you can consider that a success.

Measuring Action

Of the three types of crowdsourcing, measuring action can be the most straightforward. Say you are looking to raise $10,000. You raise $9,000? You’ve missed your goal, but you’ve still raised some money. You’ve raised $11,000? You’ve surpassed your goal. Did you do it in less time than any other fundraising effort? Crowdfunding may then be wildly successful for you.

Even for other information-based crowdsourced actions, you can still measure outcomes. Looking for people in your community to submit reports of littering so you can deploy volunteers for cleanup? Look at how many reports you received prior to crowdsourcing, then compare that to the number of reports coming in now and the time you’ve saved looking for litter sites by other means. How many more clean-up teams have you been able to deploy? Crowdsourcing may not save you clean-up time, but it may get you closer to reaching higher goals of cleaning up more locations in your community.
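The fundraising and litter-report comparisons above come down to simple arithmetic. Here is a minimal sketch using the made-up numbers from the examples in this section; the function names are invented for illustration, not taken from any crowdsourcing tool:

```python
# Hypothetical figures for illustration only.
def goal_attainment(raised, goal):
    """Fraction of a fundraising goal achieved (1.0 = goal met exactly)."""
    return raised / goal

def report_lift(reports_before, reports_after):
    """Percentage increase in reports received after crowdsourcing."""
    return (reports_after - reports_before) / reports_before * 100

# Raised $11,000 against a $10,000 goal: surpassed the goal.
print(f"Goal attainment: {goal_attainment(11000, 10000):.0%}")  # 110%

# Litter reports went from 40 per month to 100 per month.
print(f"Report lift: {report_lift(40, 100):.0f}%")  # 150%
```

Even a back-of-the-envelope calculation like this gives you a concrete number to report, which is exactly what makes action the easiest type of crowdsourcing to measure.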

Success in Measuring Crowdsourcing

In order to measure the outcomes of your crowdsourcing initiatives, you first need to know what you are trying to achieve. You should also review how you’ve traditionally done the work or solicited the input or encouraged the action in the past and note what worked and what didn’t work previously to use as a comparison to crowdsourced efforts.

When you are clear on what you need done, explore crowdsourcing platforms built specifically to do what you require. A Google search for “crowdsourcing translation” or “crowdsourcing transcription,” for example, should lead you to links for Lingotek, Acclaro, and Smartling (translation) or Castingwords and Speechpad (transcription). How do you know which platform will work for you? Look for reviews, ask for references, and put a question out to your community (pseudo-crowdsourcing), just as you might when researching any other service provider or online tool.

If the crowdsourcing you need also requires customization, check with companies such as Chaordix or Whinot.com that offer crowdsourcing consulting services in addition to platforms.

To best measure your crowdsourcing efforts, you need a thoughtful process that includes:

  1. Clarity about what you are trying to achieve, and benchmarks from past efforts to compare outcomes against.
  2. Assessment of what you want to crowdsource, and whether crowdsourcing is your best option.
  3. Evaluation of crowdsourcing platforms and tools to find the one best suited to your needs, or consultation with your “crowd” or a crowdsourcing consultant for guidance.
  4. Recording results and analyzing outcomes.
  5. Reviewing outcomes and comparing them to past efforts to determine success.
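The steps above could be tracked in a record as simple as the following sketch. The class and field names here are hypothetical, invented for illustration rather than taken from any crowdsourcing platform:

```python
from dataclasses import dataclass

@dataclass
class CrowdsourcingInitiative:
    """Minimal record for comparing a crowdsourced effort to past results.

    All names are illustrative; adapt the fields to whatever you are
    actually measuring (dollars raised, reports received, tasks done).
    """
    goal: str         # what you are trying to achieve (step 1)
    benchmark: float  # past, non-crowdsourced result to compare against
    outcome: float    # measured result of the crowdsourced effort (step 4)

    def improved(self) -> bool:
        # Step 5: did the crowdsourced effort beat the old approach?
        return self.outcome > self.benchmark

# Hypothetical example: a crowdfunding drive vs. last year's gala.
drive = CrowdsourcingInitiative(
    goal="Raise funds via an online crowdfunding campaign",
    benchmark=9000.0,   # last year's effort raised $9,000
    outcome=11000.0,    # the crowdfunded drive raised $11,000
)
print(drive.improved())  # True
```

Keeping even this much structure forces you to write down the benchmark before the campaign starts, which is what makes the step-5 comparison meaningful afterward.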

Approach crowdsourcing not as a silver bullet but as a new way to get things done. Sometimes crowdsourcing will save you money; in some cases it might save you time; but regardless, outcomes will most likely be different from your usual ways of doing things. Is different better? Only you can determine that.

Aliza Sherman is a veteran web entrepreneur, author, and commentator on digital, social, and mobile. She consults with companies and organizations and speaks around the world on the strategic use of online technologies in marketing and communications. Her latest book is The Complete Idiot’s Guide to Crowdsourcing (Penguin, 2011). Her next book is Mom, Incorporated.


8 Responses

  1. Erin McMahon says:

    What a GREAT post! Thank you!

    Would you consider Harwood-style Community Conversations to be a form of crowdsourcing? Would the same general principles apply?

  2. Erin McMahon says:

    (Was just thinking – not sure how familiar Harwood’s Community Conversations are. Here’s a link in case: http://www.theharwoodinstitute.org/index.php?ht=a/GetDocumentAction/i/23583 )

  3. Thanks much, Beth. Always looking for ways to design evaluation into projects right from the start. Look forward to e-reading this book when I fly out to Communications Network conference tomorrow.


  5. Hi Beth, Nice article with some solid, practical advice – that’s refreshing. For your interest I have been using a tool to gather expert input from colleagues around the world. It allows you to gather thoughts on a subject, group and converge them and then allow voting for prioritisation – similar to what you might do in a workshop exercise where you were (for instance) looking to identify the key strategic issues of a situation – although this way you do it live. I have found it to be a useful variation on crowdsourcing. Regards, Simon

  6. Beth says:

    Simon: What’s the name of the tool?


  8. [...] on the power of collective intelligence along with the related ideas of the Wisdom of Crowds, crowdsourcing and Smart Mobs. We also have a ton of research in multiple areas, and especially in education, [...]