Does Rigorous Data Analysis Thwart Effective Storytelling by Non-Profits? | Beth’s Blog



Earlier this month I co-facilitated the “Impact Leadership Track” at the NTEN Leading Change Summit with John Kenyon, Elissa Perry, and Londell Jackson. Our track was one of three where participants could take a deep dive into a topic and learn from peers through dialogue. The event also featured plenary speakers, including Alexandra Samuel, who gave a provocative talk about storytelling with data.

Her most controversial point was:

“Rigorous data gathering and analysis can get in the way of effective storytelling by non-profits.”


She was speaking to a room filled with nonprofit leaders who have the power to use data as part of communications campaigns to raise awareness or inspire people to take action on important causes. Yet they are held back because they lack the resources or skills. In her talk, Alex encouraged the audience to discern between scenarios where rigorous, objective research using scientific methods is needed and those where they can simply tell an interesting or compelling story with data. She created a 2×2 framework to help nonprofits think through what level of rigor in research methods is the best fit for their quantitative data storytelling.

As Alex notes, in an ideal world, nonprofits would have access to high-quality data collected through rigorous methods and to professional data scientists to design, implement, and analyze it. However, this isn’t always the reality. As Alex points out, “Sometimes you can get the results you need (like inbound traffic generated by an eye-catching infographic) by sharing data that is useful and interesting, if imperfect. That’s why I think it’s time for organizations to get comfortable doing at least some of their data work in the ‘relaxed’ zone, because that is better than missing out on doing any data storytelling at all.”

This idea upset a few colleagues, especially colleagues who have been trained in rigorous scientific methods like Deborah Finn, who strongly disagreed with Alex’s points about relaxed data.

“I don’t think we should encourage nonprofit professionals to be more relaxed about how they use data to tell their stories; I think that #nptech professionals should be concentrating on how we can deliver training (and other forms of capacity building) that will assist them in telling their stories with valid, reliable information that will withstand scrutiny. These nonprofit professionals don’t need graduate degrees in statistics, social research methods, and database administration – they need something that is pared down to the skills they will use in their daily work lives.”

I interpreted “relaxed” as still generating quality data, but not at the social science academic standard – and that is what nonprofits who are not professional measurement/data geeks could learn how to do. Perhaps Alex’s 2×2 framework could be a useful instructional device to help nonprofits decide a) when they can do it themselves (e.g., with training) or b) when they need a more rigorous approach. If the latter, they would have enough understanding of the methods to find the right data scientist/geek and be able to work effectively with them. I don’t think Alex is advocating for “use crappy data to make your point.”

Deborah is concerned that the permission to “relax” will encourage sloppiness in data research methods, although we all did agree that nonprofits do not need to do the kind of quantitative analysis that has to pass peer review in order to be accepted by an academic journal.

Alex shared that her reason for advocating that nonprofits relax a little is that data projects can be very costly and time-consuming. She hopes that nonprofits can think about ways to do data work “that is more focused, economical and yes, relaxed, as long as they are still putting forth information they believe is accurate. What I am trying to do is to encourage people to rethink their risk/reward calculus, though how that lands will depend on how high they had set the bar to begin with. From what I observe, it seems like folks may be setting the bar too high, because I can’t think of any other explanation for why more nonprofits aren’t seizing the opportunity to tell their stories with data. When so many of the data-driven stories online are produced by large media companies and other companies with deep pockets, it can seem very daunting for smaller organizations to try their hand at the same thing. Not everyone is Jawbone or the New York Times or Nielsen. That doesn’t mean we can’t use data, and use it effectively, if more modestly.”

What do you think? Should any and all data that a nonprofit uses be collected, analyzed, and held to rigorous standards, or are organizations missing an opportunity by not using “relaxed” methods, as long as they are putting forth accurate data?


28 Responses

  1. Perfect timing on this post, Beth! I’m about to get on a call in twenty minutes that has to achieve both these goals – data that will hold up to peer review scrutiny, but will also tell a powerful story. I have some ideas about how to do this in the area I’m targeting right now, but it’s all very (very) early stage.

    So glad I saw this pass through on my tweet stream…

  2. Beth says:

    Susan, good luck with that! Let me know where it lands

  3. Eric Mulford says:

    Beth, the time has come for people to hear the stories! Facts tell, but stories sell. People are moved by stories. I meet with a board on Monday to convince them that they have a powerful story beyond the raw data. I love the term “relaxed data”. Stories put the other person in a relaxed state of mind so they are receptive to the message. Thank you!!!

  4. I was sitting next to Deb and was somewhat aghast at this recommendation as well. I agree with Alex that the data scientist skill set is not a requisite for publishing infographics and the like. But we have to tread carefully when we publish data online. For any mission that has detractors (and tell me which one doesn’t, because I can’t think of any), we have to be able to back up what we say. We are stewards of charity, and under a high level of scrutiny. So if the infographic is making a fairly established data point – “97% of all scientists say global warming is real” – and we have a citation, sure, no data scientist required. But any data that we are presenting for the first time should be well-vetted.

    Alex also said something about going with survey data even if your sampling was small. That raised my eyebrows an inch or two, as well. Do that only if you make the limitation clear (directly above or below the graphic). Explicitly discuss your methodology.

    The other fun controversy at 14LCS was Deena’s keynote comment about being 100% authentic. Here’s my take: authenticity is a scale, where 100 is completely authentic and zero is completely unauthentic. If you’re on either end of that scale, you are a sociopath. We all put on personas in social situations, at a minimum in order to contain offensive bodily functions. We “assimilate” a bit in order to establish a shared platform for communication. Assimilation is a loaded term, because forcing people to discard their culture is offensive. But expecting someone to “act professionally” is not. I censor my political views on social media because I am an exec at a bipartisan organization, and I don’t want anyone to draw the impression that we are on a particular side of the fence. That’s unauthentic of me. But it’s also good mission stewardship, right?

  5. Beth says:

    Peter,

    Thanks for your thoughtful comments on the topic! I think we are saying the same thing about data methods – that there is a scale of rigor, from “car salesman” stats to rigorous scientific methods. I think we all agree that car salesman stats (suspect data and methods) do a disservice to nonprofits. But, on the other hand – and as you point out – we don’t need scientific-grade data for an infographic. So, what are the standards for quality data for the space in between?

  6. Norman Reiss says:

    I think strong data collection practices & storytelling can live together – no reason why nonprofits have to pick one or the other. Having reliable data is critical, and implementing evidence-based practices is becoming more common in some of the best-run nonprofits. (I’m currently on a committee to select orgs that will receive Nonprofit Excellence Awards.)

  7. To my mind, the standards vary based on the risk/benefit. You want to present data-based cases for your cause. You don’t want to provide fodder for your detractors to attack you with, and you don’t want to be thought of as an organization that throws out questionable numbers in support of your cause. If the impact of putting broad datapoints out there that aren’t terribly verifiable is that it makes you look (to tie my prior comment together) unauthentic, there’s a danger. My org puts out a stat that 80% of the people who qualify for legal aid and need legal aid aren’t being served. That’s based on some pretty reliable research and isn’t considered controversial. It makes a key point in support of our work (funding legal aid attorneys). But we have detractors who, if the datapoint could be easily shot down, would. So we have to be careful. The infographic has to consist of uncontroversial facts and/or well-backed data. And the datapoint presented should present a compelling argument in support of your work (not just data for data’s sake).

  8. Ari Davidow says:

    For some reason this subject reminds me of a panel about Linked Data that I attended last year. There are some five stages from “inchoate mess” to “linked data.” One presenter emphasized that getting clean data online even at stage 2 (spreadsheets) is incredibly valuable. I have to imagine that something similar applies to backing up stories with data in non-profit scenarios. It is important not to let “perfect data” become the enemy of “good enough data from which to reasonably draw conclusions” – we need good enough data, not rigorous ritual.

    I think it is very true that for some non-profits trying to find their way in the world of data, ritual can become an early substitute for understanding what data are needed. But how do you gauge when data are “good enough”? That’s going to take more education and outreach.

  9. Miriam Young says:

    Science has a proud tradition of telling stories about super complicated/rigorous math. Einstein described relativity with trains and tennis balls after all. Some algorithms may make it hard to interpret results, but that’s a fun and different discussion 🙂 No matter how complicated our tools and methods are, they’re nothing if we can’t explain the results simply.

  10. Susannah Fox says:

    Thank you, Beth, for paging me to this conversation.

    I recently left the Pew Research Center after 14 years of combining traditional quantitative survey methods with qualitative, “listen more than ask” fieldwork. I personally think the best research comes when you combine the two. And, as other commenters have pointed out, you need to be able to tell a compelling story based on hard OR soft data.

    In order to stay ahead of the curve, I was always talking with and listening to people who were living with life-changing diagnoses or rare conditions. I consider them the alpha geeks of health care. I noticed that they were using kitchen-table tracking solutions to try to solve the mysteries in their own lives or in the life of someone they cared for, such as a child with a seizure disorder or food allergies. However, I couldn’t just announce that I thought the self-tracking revolution was reaching new heights. I needed to test the idea with a national survey. Long story short, we conducted the first national survey on the topic in 2012 and it continues to be cited today as the gold standard measure of the “tracking for health” phenomenon.

    Coda: The first time I presented the data at a conference, I joked that the only tracking I do is when I try to fit into my skinny jeans. Guess what everyone remembers? Not the data points. The skinny jeans. But hopefully they cite the data.

  11. Susannah Fox says:

    One more thing:

    If academic publication is your primary goal, then you must adhere to different standards than if culture change is your primary goal.

    Also, I can’t stop laughing about Peter’s comment: “If you’re on either end of that scale, you are a sociopath.” Someone said to me last year: “You can’t possibly be as nice in real life as you are online.” Well, DUH, I wanted to reply. But I didn’t. The conversation was happening online 🙂

  12. Beth, thanks so much for facilitating this conversation, since it’s something I’ve been stewing about (and debating) ever since my LCS talk.

    What I find incredibly encouraging about this thread, as well as the related conversation on Deborah’s Facebook page, is that so many people in the not-for-profit sector care passionately about data and are thinking deeply about how to work with it. I agree with much (most!) of what’s been said here, including Norman’s point that good data practices and storytelling can live together (absolutely!), Eric’s point about focusing on stories, Miriam’s note about explaining results simply even when methods/data are complex, Susannah’s argument for combining qual/quant methods so that your data connects to a story, and Peter’s note about the risk/benefit context around any decision regarding data methods.

    The more this conversation continues, the more I want to unpack the underlying questions: How rigorous do you need to be when telling stories with data? How do the specific circumstances of your nonprofit (for example, whether you’re in a controversial issue space, and whether you’re a service or advocacy organization) affect how rigorous you need to be on a specific project, or in general? And what *exactly* do we mean by rigour, anyhow?

    Until we get into the specifics of these questions I’m not convinced we’re disagreeing as much as we might think. For example — to pick a specific question Peter raised — what do we mean by “relaxing” about sample size? In my case, I have been able to work on several projects with datasets of more than 80,000 respondents, so I think about a “small” sample as 2 or 3k. But the truth is that even a much smaller dataset could be perfectly appropriate — for example, if you’re trying to talk about classroom conditions, a survey of 200 teachers in 10 different communities might do a good job of telling your story.

    It would be *great* to convene a group of non-profit communicators, data analysts, researchers, etc. to talk this through and see if we could move towards some general rules of thumb and principles for research design in the context of non-profit data projects. My hunch is it would be more useful to have separate conversations about the requirements for storytelling with data, because as I outlined in my talk, I think part of the reason we get limited in our communications is that we are holding individual infographics to the same standards as multi-year projects meant to map and guide decision-making on issues and operations.

    Hovering in the background, of course, is another question: what standards of data collection and analysis do *funders* expect to see in grant applications, grant reporting or even the day-to-day communications of the organizations and projects they fund? The answer will vary from funder to funder, but it’s crucial to bring funders into the conversation about how rigorous our data work needs to be…and perhaps even to help funders think about the relative benefit and burden of the data and research standards they expect from grantees.
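    To make the sample-size question above concrete: under the textbook simple-random-sampling assumption (which few real nonprofit surveys meet exactly), the worst-case margin of error shrinks only with the square root of the sample size. This illustrative sketch (my addition, not from the original discussion) shows why a survey of a few hundred respondents can still support a story told in broad strokes:

```python
import math

def margin_of_error(n, z=1.96, p=0.5):
    """Worst-case (p = 0.5) margin of error for a simple random sample
    of size n, at roughly 95% confidence (z = 1.96)."""
    return z * math.sqrt(p * (1 - p) / n)

# Ten times the sample buys only about 3x the precision:
for n in (200, 2000, 80000):
    print(f"n = {n:>6}: +/- {margin_of_error(n) * 100:.1f} points")
# n =    200: +/- 6.9 points
# n =   2000: +/- 2.2 points
# n =  80000: +/- 0.3 points
```

    In other words, a 200-person survey carries roughly a ±7-point uncertainty – fine for broad-strokes storytelling, but not for claims that hinge on a few percentage points.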

  13. Andrew Means says:

    In my opinion rigor has never gotten in the way of anything. But we also shouldn’t let an inability to be rigorous stop us from using data at all.

    Not everything needs to be statistically proven. There’s a reason we have the phrase “the burden of proof”: it can be very burdensome to prove something. We should never let that stop us from using evidence in our work. Evidence can scale to fit the size and capability of every organization, project, endeavor, and decision.

    That said, we should also know when we do need rigor and need to examine things carefully and with great care. We should strive to do the best work we can in everything we do. Rigor is deeply important for many things.

  14. kayza kleinman says:

    I think that Ari Davidow makes an excellent point. In fact, that’s one thing I wanted to say. I do think that we should make sure that our data is as accurate as we can get it, and we should be clear and honest about any limitations. But if we limit ourselves to academic rigor, we limit ourselves in many ways, not just in storytelling.

  15. A few comments, ported over and slightly modified from a response that I posted to Facebook:

    1) I think it’s overstating it to say that I’ve been trained in social science. I’d only go so far as to call it “social research.” While social research does borrow some of the quantitative methods of science, it should (in my opinion) stop trying to deck itself in the prestige of science.

    1a) I should also point out that my training in social research focused mostly on qualitative methods such as ethnography. I did not pass the quantitative coursework with flying colors! When it comes to statistical methods, I know enough to understand that validity and reliability of results have specific meanings, and that I’m not the right person to do the analysis.

    2) Like Alexandra Samuel, I’ve learned to do four-celled conceptual charts; however, there are times when it helps to view things as a continuum. Let’s say that there’s a 15-point scale. At Point One, a person holds that it’s ok to be completely loosey-goosey about using data to tell your nonprofit’s story. At Point Fifteen, a person holds that nothing less than the rigor required by peer-reviewed academic journals is acceptable. My best guess is that you and Alex are at Point Six or Seven, and Peter Campbell and I are at Point Ten or Eleven. None of us are at either extreme. (By the way, I’m much obliged to Peter for bringing up the continuum as a handy device in the context of his comments about the kind of authenticity that Deena Pierott talked about in her keynote.)

    3) I love engaging in this discussion with folks that I respect so highly. We can disagree in a way that enables all of us to learn from each other.

  16. Two very relevant links:

    Saturday Morning Breakfast Cereal
    http://www.smbc-comics.com/?id=3485#comic

    Let’s call it the Wisdom Cycle: Further adventures in clarifying the role of data in a mission-based organization
    http://deborahelizabethfinn.wordpress.com/2014/09/18/lets-call-it-the-wisdom-cycle-further-adventures-in-clarifying-the-role-of-data-in-a-mission-based-organization/

  17. This is a great conversation – and perhaps could benefit from a few stories or examples rather than all of us debating the theoretical. (That would provide color to the “is it a continuum or a four-celled chart” question.)

    But while we’re talking philosophically:

    To me the big question isn’t about the medium in which you’re sharing data (blog post vs. academic article) but the “why” of sharing data. What actions are you trying to influence with your storytelling? If you’re trying to prove that something did or didn’t work, I feel like the quantitative method needs to be valid.

    Far too often, sloppy research methods end up with not just “relaxed” data but actually inaccurate conclusions… and that can cause harm to the people nonprofits are working to serve! We’re all responsible for studying the impact of our work to prove that our good intentions are actually leading to good outcomes. Especially when you consider we’re all working together to share ‘best practice’ because no organization has the resources to test everything themselves.

    If we’re drawing conclusions from quantitative data, we’d better have pretty rigorous methods. If we’re sharing anecdotal stories through qualitative data, that’s when your reporting can be more “relaxed.”

  18. […] Beth Kanter asks Does Rigorous Data Analysis Thwart Effective Storytelling by Nonprofits? […]

  19. […] comments others have added to the conversations. It’s a blogger’s dream come true. Does Rigorous Data Analysis Thwart Effective Storytelling by Non-Profits? (Beth’s […]

  20. […] Beth’s blog post, Does Rigorous Data Analysis Thwart Effective Storytelling by Non-Profits? she discussed whether the standards of rigorous data analysis would make non-profit leaders miss […]

  21. We actually believe that NGOs NEED to collect better data to tell their story – that it’s table stakes. This is the way they make their case for their causes – with statistics and real data. It’s also critical for encouraging donors (TechSoup puts a ton of stats into their reports). I think NGOs need to learn how to think about data in the right way because data is, in many ways, the institutional memory of an org, and properly managing data and data collection means you can easily communicate, simplify reporting, and create more accountability. Lastly, NGOs may find that rigorous treatment of data helps them actually understand whether they are meeting their mission – a painful journey to transparency but one that will be useful for constituents and institutions both.

  22. […] change summit for non-profits, one of the sessions generated some debate around the question: Does Rigorous Data Analysis Thwart Effective Storytelling by Non-Profits? This resulted in a useful 2×2 matrix comparing the […]

  23. […] preceding is a cross-post by Beth Kanter, the author of Beth’s Blog: How Nonprofits Can Use Social Media, one of the longest running and most popular blogs for nonprofits.  To read the original article […]


  25. […] preceding is a guest post by Beth Kanter, the author of Beth’s Blog: How Nonprofits Can Use Social Media, one of the longest running and most popular blogs for nonprofits, and Andrea Kihlstedt, […]

  26. […] year’s Leading Change Summit in San Francisco. Noted nonprofit blogger Beth Kanter was there and captures the mix of outrage and admiration in the room as Alexandra gave her thesis: rigorous data gathering and analysis can get in the way […]

