In our new episode of the Collective Impact Forum podcast, we’re talking with Justin Piff from Equal Measure about his recent article Data in Collective Impact: Focusing on What Matters. This piece was featured online in the Stanford Social Innovation Review and is part of the online series Collective Impact, 10 Years Later.
In this chat, we talk about four key strategies for understanding and using data effectively to support collective impact work as well as ways collective impact funders can serve as a valuable data partner for initiatives.
A transcript of this podcast is available further down this page.
Resources and Footnotes
- Article: Data in Collective Impact: Focusing on What Matters
- Series: Collective Impact, 10 Years Later
- Resource: OYF Evaluation and Assessment
- Report: Equity Counts: Using Data to Increase Equity and Improve Metric Outcomes for Opportunity Youth
- Webinar: Evaluating Systems Change Efforts: Where to Start
The intro music, entitled “Running,” was composed by Rafael Krux and is licensed under CC BY 4.0. The outro music, entitled “Deliberate Thought,” was composed by Kevin MacLeod and is licensed under CC BY.
More on the collective impact approach to collaborating for social change:
Listen to Past Episodes: Listen to past episodes in the Forum resource library. You can also listen and subscribe via iTunes, Spotify, Simplecast, Stitcher, iHeartRadio, Amazon, and other podcast apps.
(Intro) Welcome to the Collective Impact Forum podcast, here to share resources to support social change makers working on cross-sector collaboration.
The Collective Impact Forum is a nonprofit field-building initiative and online community that is co-hosted in partnership by the nonprofit consulting firm FSG and the Aspen Institute Forum for Community Solutions.
In this episode, we’re discussing some key strategies for understanding and using data effectively to support collective impact work. These strategies were first discussed in the article titled “Data in Collective Impact: Focusing on What Matters,” which was recently featured online in the Stanford Social Innovation Review and is part of the online series Collective Impact, 10 Years Later. We’re excited to dive deeper into these data strategies with the author of the article, Justin Piff, who is vice president of learning and impact at Equal Measure, a nonprofit evaluation and consulting firm. Interviewing Justin is Collective Impact Forum executive director Jennifer Splansky Juster. Let’s listen in.
Jennifer Splansky Juster: Hi everyone. Welcome to today’s podcast. I’m Jennifer Juster, executive director of the Collective Impact Forum and I’m happy to be with you today.
Over the past few months, the Collective Impact Forum has partnered with the Stanford Social Innovation Review to carry a sponsored online series, Collective Impact, 10 Years Later. This series has been lifting up the perspectives of practitioners, funders, thought leaders, and intermediary support organizations, highlighting lessons from collective impact work over the last 10 years. The series includes a range of perspectives on a really broad variety of topics. I encourage you to check it out on SSIR.org.
Today, I’m hosting a conversation that will dive deeper into one of these pieces, titled Data in Collective Impact: Focusing on What Matters. I really appreciated how practical and applied this piece is, and that it takes on a topic we get lots of questions about here at the Collective Impact Forum: data in collective impact. This piece dives into the importance of collaboratively using data in such efforts and offers concrete advice for people engaged in collective impact work. The author of the piece, Justin Piff, from Equal Measure, is with me today. Justin is vice president of learning and impact at Equal Measure, which is a nonprofit evaluation and consulting firm that partners with foundations, nonprofit organizations, and public entities to help them advance social change. Welcome, Justin. Great to have you here today.
Before we dive into hearing from you about the article, I’d love to have you tell me a little bit about what brought you to the work that you do at Equal Measure and to spending a lot of your evaluation and learning work with place-based collaboratives.
Justin Piff: Thanks, Jen. It’s great to be here. I’ve been at Equal Measure for 13 and a half years. Since graduate school, I’ve been attracted to the notion of using research to inform decisions, whether for practitioners or policymakers, and so I was drawn to Equal Measure what feels like eons ago because of its commitment to using data to drive change. Like I said, I’ve been here for a while, and over that period of time we’ve certainly seen an increasing number of philanthropic investments in place-based systems change. We’ve been involved in that work really since the beginning, or since the resurgence of those investment approaches back in 2010 and even before that. So our work has really evolved with the field, and hopefully at points we’re leading the field in some spaces as well.
Jennifer Splansky Juster: That’s great. We both have been in our roles and at our organizations a long time, which it’s fun to talk about how this practice has evolved. I know, Justin, you and your colleagues at Equal Measure have been part of being a learning and evaluation partner to so many different collaboratives based in place, working with both the foundations and the sites for many years so that expertise is just terrific to contribute to this conversation.
For folks who have not read the article, can you just give us a general summary of the piece? Then we’ll certainly dive into many of the details.
Justin Piff: We know that data is important in collective impact. Obviously, it’s one of the five conditions. Our own work really confirms that it’s a critical driver and a competency of effective partnerships. Yet, many struggle with using data effectively. This piece really offers guidance on how to overcome some of the most common challenges. It focuses on using data with intention, prioritizing what’s most important, listening to voices of those in the community and those on the ground doing the work. It’s so easy to get distracted or caught up in the weeds when it comes to data and so this piece was really an attempt to pull back a bit and provide a more macro view of data strategies.
Jennifer Splansky Juster: You mentioned a couple of them but there are really four primary pieces to how you are providing advice to folks about using data in their collective impact work. The first one you call out is prioritizing the learning, not the data system. Super important, so tell us more about that.
Justin Piff: I think this principle is really key. So many communities have put this notion of a perfect data system on a pedestal, and I think it ends up getting in the way of the work. Form really should follow function, yet so many communities spend countless hours and days focusing on building this, quote, perfect system because they’re convinced that system will lead to change in their communities. While it can certainly be an effective tool, it’s just so important for folks to take a step back and think about what they need to know about their communities: understanding the problem, understanding the challenge, understanding who may be benefiting from the state of the system as it is currently situated and whose lives might need to be improved. Starting with the end in mind, starting with the why, not the how, and really viewing data systems as a means to an end. Because I think once communities can get out of their own way in thinking about that perfect system, they’re much more likely to end up with a system that actually meets their needs.
Jennifer Splansky Juster: Justin, one of the things that I often hear from folks is that they get caught up and maybe this is what you mean by system, in the technology, what technology solution should I pick. They feel very complicated, they feel very expensive, and for folks like me who are not technology people, it’s really intimidating. Is that a little bit of what you were talking about, thinking about what you need to learn, not the technology solution itself?
Justin Piff: That’s a great way to put it, yes, absolutely. Even really effective data use can be done with Excel spreadsheets. But yes, folks tend to lead with the technology and forget why they were interested in understanding data to begin with.
Jennifer Splansky Juster: Can you think about any communities that have worked through this and share a little bit about what that looked like for them to kind of take a step back and not focus on the technology but really prioritizing learning that then they could create the system that made the most sense?
Justin Piff: So, there are a couple that come to mind. Certainly, many of the communities associated with the StriveTogether network, I think, really do an excellent job of leading with data. So I think about the greater Cincinnati area and the Strive Partnership there being a really good example. The city of Chicago has also done a really nice job. Strive Chicago has been working for quite a while on a very integrated data system and again, is particularly effective at really thinking about the strategy behind its data use and really leading with the why and not building out that system without an idea of where they wanted to be headed ultimately. There are several examples really from all over the country, large communities, small communities, rural communities as well.
Jennifer Splansky Juster: The second strategy that you talk about is being clear about whose lives you hope to improve. Tell us more about that and how that ties to the data work.
Justin Piff: Let’s assume that most collective impact initiatives have been launched because they’re looking for systems change, and they’re looking for systems change because inequity exists. But many of these communities, for one reason or another, aren’t necessarily very explicit about those inequities, so they kind of dodge or work around the way the system is currently structured to benefit, say, certain types of residents or residents of one demographic over another. The encouragement here is really for folks to be very clear and articulate about who they expect and hope to be the ultimate beneficiaries of collective impact work. Certainly, if we don’t name inequity, we can’t address it.
Jennifer Splansky Juster: In the article you give some interesting examples from the Aspen Institute’s Opportunity Youth Forum work. Can you tell us a little bit more about this?
Justin Piff: Absolutely. The example there really has to do with the way that American Community Survey data is collected, analyzed, and reported. In a nutshell, we had identified that data source as a really important way to understand the state of opportunity youth across the country. Those are youth between the ages of 16 and 24 who are neither working nor in school. That data source was very important for us as part of an initiative with Aspen known as Equity Counts. It essentially was the only data source we could find that would help communities nationally compare opportunity youth outcomes across a really diverse array of communities around the country. Yet as strong a source as it is, it still certainly has limitations.
In that example specifically, one of the limitations is that the data are aggregated and shared generally in units of about 100,000 residents. Many of the communities that the Aspen Opportunity Youth Forum works in are much smaller, with far fewer residents, particularly rural and tribal communities. The Hopi community in particular really helped us understand just how limiting that data was for them, because the large population required to look at trends masked a lot of the unique experiences of the opportunity youth living in those tribal communities, simply because the absolute number of residents and youth there was so small.
Jennifer Splansky Juster: Can you say a little bit more about how folks, just pulling on that Hopi example, have approached being data informed in their work when perhaps the population isn’t big enough to show up in those public data sets as a unique subset of the population?
Justin Piff: The Hopi community in particular has been working for a while now in building the data capacity of its partners. In the absence of a large macro dataset, I think the data collected by partners is that much more critical. Because it’s a small close-knit community I would never say it was easy to do but maybe there’s a real asset there and a real advantage for the folks in the community to leverage and also really thinking creatively. The Hopi community relies on stories of the youth in their communities and again, the close-knit community and the oral tradition allow folks to talk in a really different way about what’s happened with youth in their communities. They tend to know the youth in the communities or they know someone who knows someone in the communities. That’s very common in many of the rural communities that we’ve worked with as well. The relative proximity, if you will, of those collecting the data to the youth living in communities and being in schools has really served as an asset in the absence of high-quality data systems.
Jennifer Splansky Juster: That’s actually a great unintentional segue to the next strategy that you mention in the article around using qualitative data. Tell us a little bit more about the lesson and the advice around using qualitative data.
Justin Piff: The notion of using qualitative data in many respects feels trite. There’s really nothing brilliant about suggesting that folks use qualitative data, but it’s so often overlooked. We talk about creating a shared vision in collective impact, and that vision is often quantitative in nature, often with a data-driven metric associated with the end goal. Philanthropies, which of course are large funders of collective impact initiatives, often associate success with quantitative metrics as well. This really is just an attempt to remind folks that qualitative data can help us learn so much more about a community, in rich and nuanced ways, well beyond what quantitative data could ever reveal.
Jennifer Splansky Juster: One of the things that I see in some collective impact efforts is that when you bring together folks from really different backgrounds and different parts of the community to the same table to look at data together, you get so many nuanced interpretations and rich insights, because you don’t only have system leaders, or only community members, or only funders looking at the data. I love thinking about not only looking at qualitative data, but discussing the stories and other forms of qualitative data as a collaborative as well.
Justin Piff: That’s a good point. We’ve had—I should say we’ve seen a lot of communities have success with data walks or data galleries, where they post quantitative data and metrics on large posters and give folks an opportunity to walk around and understand the data on their own, and then regroup, perhaps in small groups, to discuss what the data mean to them and what’s behind the numbers, bringing communities together to, just like you said, offer multiple interpretations of what’s going on within a particular community. I also think for any evaluator who’s interested in this work, it’s a good reminder that when we collect data and produce reports, it’s really important to share those findings back with grantees or community members and have them help us make sense of that data as well.
Jennifer Splansky Juster: That’s great. The fourth strategy is keeping the short and long games in view. You often use some really helpful sports metaphors, Justin, when I’ve heard you talk about data in the past. Tell us about the short game and the long game.
Justin Piff: In a lot of collective impact work, it feels like folks are forcing themselves to choose between the two. Those who are focused on the long-term systemic changes of collective impact might lose sight, or are certainly at risk of losing sight, of short-term progress and whether they’re on track toward those longer-term outcomes. Then we have the other type of community that really wants to see results, or maybe they’re driven by a set of partners or funders who really want to see results. But we know systems change takes a long time, and so they end up losing sight of those longer-term, often structural and systemic changes in the interest of quick wins. I want to encourage folks to toggle between the two, the short and the long view, and pay attention to both.
In terms of the sports analogy, I think it’s a little bit like paying attention to the score throughout the course of the game and not waiting until the end to determine whether you’ve lost. Paying attention to the score, paying attention to free throw percentages, whatever sport you’re interested in, there are always immediate, short-term metrics to track and adjustments to make along the way before the game is over.
Jennifer Splansky Juster: One of the things I’m really hearing, Justin, is that this is about data, but it’s also very much about the strategies and how people are spending their time, and how the two interplay. Sometimes we see the need to collect data driving the activity, perhaps through compliance or funder-dictated metrics, and the idea here is to use data as a tool without necessarily letting it drive the priorities, while of course keeping both the short and long game in view. Is that fair?
Justin Piff: Absolutely. That’s a great way to put it, Jen.
Jennifer Splansky Juster: Do you have any examples of folks that you think have done an interesting and good job of balancing the short and the long game in their work?
Justin Piff: I do think the Aspen Institute does that well at a portfolio level. As I mention in the article, they have set out a set of systems change metrics that they have Equal Measure evaluate and assess annually. Those metrics, while long term, also serve a short-term purpose, allowing communities to monitor progress toward those long-term systems change measures without losing sight of them. Simultaneously, those same communities are also asked by the Aspen Institute to collect data on opportunity youth outcomes, which helps communities toggle between the immediate needs of young people and maintaining that longer-term view of the ultimate goal of systems change. I think simply asking communities to track and pay attention to both, without, I’ll say, strong consequences attached, goes a long way, and I think it’s a really effective strategy for nudging communities in both directions.
Jennifer Splansky Juster: That’s helpful. This isn’t something you talk about in the article, but you’ve piqued my interest. I know that there’s a framework for evaluating and measuring systems change that is used in that portfolio of work. A lot of folks listening to this podcast, I think, are very curious about how to measure systems change in their work. We can include in the show notes a link to one of the recent Aspen Institute reports that discusses their systems change measures, but for listeners now, can you just tell us a couple of those measures and what kinds of things are being looked at as indicators of systems change?
Justin Piff: Sure, absolutely. It’s a self-reported assessment that communities respond to annually. When we look at systems change, we’re really trying to pick up on the observable manifestations of systems change. We’re asking folks to report on real changes in their communities that they expect to see as a result of the, quote, system changing. I will answer your question, but before I do, I also just want to mention that those systems change measures were developed by Equal Measure but certainly co-constructed with the Aspen Institute, some of its funders, and the communities themselves. I just wanted to reiterate the importance of having other folks contribute to the vision of what change should look like and what should be measured when we’re talking about community work. They certainly provided quite a bit of input into those definitions of systems change.
Back to your question, Jen, the types of things we look at include policy changes, whether they’re efforts towards or success towards state policy changes, local policy changes, school organizational policy changes. Those sorts of changes certainly are manifestations of a broader systems change. We’re looking at new funding. We’re looking at new funding that promotes the work of opportunity youth-focused efforts within communities. We’re looking at funding and tracking funding to support backbone organizations in those communities, and we’re looking at just general funding from partners as well.
I think it’s fair to say that a system is changing when partners demonstrate that they really have some skin in the game, and one way to do that is to provide the effort or backbone organization with funding as well as in-kind support. We’re looking at programmatic changes. We’re looking at the development of what we often call a new pathway, new partnerships and collaborations across organizations, and we’re looking at culture changes within organizations as well. That notion of change within and across organizations is something we pay attention to, again, to give us some confidence that these aren’t what we often refer to as one-off changes, but changes within a community that are really going to take root, that are broad across a number of partners and deep within those partners as well.
Jennifer Splansky Juster: That’s really helpful, and thanks for humoring the little detour into the world of evaluating and measuring systems change but I think folks listening to this podcast will be really interested in that approach as well. So, Justin, toward the end of the piece you provide some specific guidance for funders who are investing in collective impact or place-based collaborative work. You’ve referred to a couple of tips or points for funders already around asking for data on the near- and long-term work, recognizing and appreciating the value of not only quantitative measures but qualitative indicators as well. Tell us a little bit more about how funders can best support effective data use in collective impact work.
Justin Piff: I think the opportunities are almost unlimited, Jen. If there are any funders listening to this, the primary wish and hope I have for data use in collective impact is really that funders will continue to support it more. Nearly every funder asks for data from grantees, but very few provide resources or capacity to collect that data and use it effectively. So I’d love to see that ratio of asks versus gets, if you will, change, and I think that’s a really important start. And it’s not just the technical solution; it’s really about helping to build that evaluation and data learning muscle, which is such an important skill to build across communities. So it’s the strategic aspect of using data in addition to the technical component.
Jennifer Splansky Juster: That’s really helpful. You work with a broad range of folks that are doing this kind of collective impact work. From the practitioner perspective, the backbone, and the data partners on the ground, what do you see as some of the major challenges or stumbling blocks that people face? Do you have any tips for addressing them?
Justin Piff: I think data use takes time, right? And everybody’s busy and everyone is under a lot of pressure to produce results, so certainly that pressure can get in the way. When we talk about the issues we’re dealing with in communities, they require urgency too, right? And so it’s easy to want to advance the work very quickly, because I think that is how we should be thinking about the work, but at the same time, good data work takes time. It takes careful planning. It takes conversations. It takes hashing out disagreements with partners about what’s really important. I think all those things are in service of the work, but it’s easy to feel like a community can get by without really addressing them in the interest of advancing the work. So one recommendation for practitioners and folks in community would be to carve out the time and give yourself permission to slow down, knowing that sometimes you need to go slow to go fast, or go slow to go far.
Jennifer Splansky Juster: Justin, another challenge that I often hear folks talk about is that they’re actually overwhelmed by the amount of data that they have access to. What guidance might you have for helping them consider what data is most important in different situations?
Justin Piff: I almost feel like the answer is in your question. It’s really important for folks to think through why they might need data for different purposes, and to align the types of data they need with those purposes. When we worked a couple of years ago with the Opportunity Youth Forum, we developed a data framework that may come in handy here. We identified six uses of data and then had folks think through what data and data sources might help them apply data in those ways. So I’ll rattle these off and maybe they’ll be helpful for some folks listening.
The first is communicating the vision and really being clear about the type of data you need to bring folks around the table, and to articulate that vision with the broader community.
The second is case making. Many folks will require you to, quote, make the case, right? Whether it’s funders, policymakers, and so on, and the type of data needed to do that is different from other sources of data.
Continuous improvement sometimes can get overlooked because that data tends to be a bit more granular and requires more regular investigation. But continuous improvement is really the only way collective impact initiatives are going to get better.
Understanding the population and its needs I think is really, really important so who is benefiting from the structure of the current system, who isn’t, and among those who aren’t, why aren’t they and what needs do they have. That’s where qualitative data in particular can be really helpful.
Partner accountability can sometimes be overlooked as well but collective impact really relies on partners working effectively together, and so there are specific datapoints that can help initiatives or partnerships really think through whether partners are fulfilling their role, and whether they’re really contributing in the way that they need to be contributing to move the work forward.
And then lastly, assessing partnership health gives partnerships or collaboratives an opportunity to take a step back and understand the functioning nature of the work, clarification of goals, communication styles, inclusivity, and so on.
So together I think those six data uses provide a nice framework for helping folks think very intentionally about the types of data they might need in the context of what it is they’re trying to accomplish. So, Jen, that might be my answer to your question there.
Jennifer Splansky Juster: Yeah, that’s helpful. Thank you, Justin. I think that’s a really helpful six-part framework. Well, Justin, this has been a treat and a great conversation. Thank you so much for taking the time not only to chat today but to capture these reflections in the article, Data in Collective Impact, for the Stanford Social Innovation Review. So thank you again. Thanks to all those who are listening, and we’ll catch you next time.
(Outro) And this closes out this episode of the Collective Impact Forum podcast. If you are interested in learning more about what was discussed, you can find links to resources in the footnotes of this podcast, including a link to the SSIR series Collective Impact, 10 Years Later, where the article discussed today is featured.
We would like to acknowledge that this episode was produced and edited on the unceded, traditional lands of the Coast Salish people, including the Duwamish, Suquamish, Stillaguamish, and Muckleshoot tribes. We honor with gratitude the land itself and the past, present, and futures of these tribes.
The intro music for this episode was composed by Rafael Krux, and our outro music was composed by Kevin MacLeod.
And our recent news is that registration is now open for our virtual Collective Impact Action Summit that will be held on April 26-28, 2022. The Action Summit is our biggest learning event of the year, with over 25 virtual sessions focusing on topics like culture and narrative change, shifting power, data, and sustainability.
And one big plus for being virtual is that we’re recording many of the sessions and sharing those recordings with attendees after, so you’ll be able to plan a schedule that fits best with you, and watch other sessions later.
We hope you can join us this April. Please visit the Events section of CollectiveImpactForum.org to learn more about this year’s Collective Impact Action Summit.
This is Tracy Timmons-Gray, Associate Director here at the Collective Impact Forum, and your podcast host. I want to say thank you so much for listening, and we look forward to connecting with you more in our next episode. Until next time, we hope you are safe and well.