0:00 My role is really just to introduce our wonderful coach and outline how this room works. In just a minute, I'm going to hand things over to Meena. But this is how things work today: Meena is an expert in data equity, and she'll be ready to answer any questions you might ask that are within her expertise. There's no formal presentation; instead, this is an open forum that will focus on anything related to Meena's expertise. The platform, as you may know, does allow you to share video and audio during the session, so feel free to hit that button to share. However, you don't need to join on video in order to participate; you can watch and participate in the chat. So you can share by speaking or use the chat. This room is being recorded, so we ask that you be respectful of the presenter and other participants by muting your microphone if you're not talking. With that said, thank you, and I am excited to introduce Meena today.

0:54 Thank you so much, Kristina, for that introduction. And welcome, everybody. I'm so happy to talk to folks here about data equity. I first want to thank Kristina for being here on camera. I asked her two minutes ago if she would be willing, and she was so kind to join me on camera, so thank you for that. So I'm Meena, my pronouns are she, her, and hers. I am joining from Vancouver, BC. I am an immigrant; I moved from India about six years ago. What are my feelings about data? I'm going to let you ask questions too, but I want to start with my feelings about data. I have worked with data for a really long time, in different roles, through different numbers, different dashboards, different KPIs. I'm at a point right now where I want to understand the stories behind it, not from the point of view of what is shown in front of us, but really the silences that go into that data, because every dataset you have doesn't just contain what it shows you; there are also lots of pieces that are not visible to the naked eye. I'm trying to understand those silences, to see how we can advance equity in data. My work focuses on consulting, research, writing, and workshops; the focus of my research is algorithmic bias, especially around immigration. And I am really excited to be here.

I have questions for you, but I want to open this space by welcoming you all and asking if you have any questions to start with. If not, I really want to ask two questions of you. So let's start here first: anybody who's here, any questions you want to kick off this session with? Okay, I see no responses in the chat box. That's totally okay. So I have a question for you. This morning, wherever you are joining from, do you think you have generated data so far? Anywhere? I'm going to give an example: I have already generated and consumed data in two places. I have a Fitbit watch and I took a walk, so I have collected some data there. I have a very specific temperature at which I made my tea in the microwave, so I have consumed some data there, checking the temperature. Have you done anything with data so far? In the short few hours from waking up until now, have you already consumed and worked with data? You can type in yes, you can type a message about how you have done anything with data. Ross says absolutely. Okay.
Anybody else here who did something with data? I mean, I would say we all have done something with data, so I am going to assume, for the rest of us here in the room, that we have. While you're thinking about it, let me ask you: what are your first feelings about the word data? When you hear the word data, what do you think about? It can be a feeling. I'm not going to feed you the answers; I want to hear from you. Kristina and Meredith say there's data everywhere. True.

5:01 I see a couple of "overwhelming," that's true. "Information," yes. "I get excited, but want to understand purpose and context," that's true. "Not sure how to use it to our advantage," that's true. While you're writing, let me share my screen here. Kristina, you just showed me how I can do that.

5:34 The computer icon, is that showing up for you? It probably has a red slash through it right now.

5:40 Oh, yes, it does. Yep. Okay, perfect. All right, share my screen. Okay, is my screen visible to everybody? Yeah. Okay. I want to share something with you, this little slide. I'm reading a book, and I make diagrams and visuals as I read. One of the books I'm reading gave me this amazing idea, and I turned it into a visual: data has many actors. So assume you are filling out a form to sign up for volunteer work in your community. Look at this first bubble. I made this visual earlier this morning, so it's not very pretty, but bear with me. This top bubble: someone designs a form, right? Someone has designed a volunteering form and put it in an ad on the internet, on Google, on a website somewhere. You now go access that form and fill it out, and now you're in the second bubble: someone fills it in. You fill it in because you're interested in volunteering. Then it goes back to that organization, and someone in that organization, not necessarily the person who designed the form, analyzes it, creates some charts, finds out who wants to volunteer, and does something about it. That's step number three. From there, that data gets shared with the leadership within the organization or with a team, depending on what it is. In the example we are talking about, volunteering, let's assume it goes to the volunteering team, and everybody who has access gets that data. They see, okay, this person is trying to become a volunteer; they look at some other data around the impact of volunteering; okay, we need more volunteers, this is our impact, let's call this person about volunteering. The data doesn't end there after creating a chart. From there, in this last little bubble, the data leads to distribution, to summaries, to contextualization, to insights. And that becomes the basis for, once again, tweaking the design of the form. So you see there are many actors throughout that cycle, from collecting the data all the way to distribution, summarizing, and contextualizing, and back to the top. Because there are many actors in there... oh, sorry, we lost you for a second there. Yeah, I have a million tabs open on my computer. Because there are so many actors
with the data, it's really tough to understand, and it can be overwhelming: whose voice are we hearing? Whom are we centering in our research? Are we thinking about inclusion on purpose, or are we unintentionally harming? So I think there are different layers where we can start thinking about advancing equity in data, and this is what my work is, this is what I want to speak about. When I hear some of these responses, that it feels overwhelming, it also feels exciting, because we are not seeing the complete picture. Now that we know there are different actors, whose voices are we ready to center in our work? We can have a more unified, more clear response to data, and my job in this hour is to answer your questions on all those pieces, so I can help you talk to data better. So let me take a pause there and see if anybody has any questions. Have you struggled with the words "advancing equity in data"? Have you thought about it before?

10:11 I have a question for you. I'm thinking of the many organizations that I work with here at Neon, and I feel like so many organizations struggle with the basics of data collection and analysis. Maybe it's spread across 18 different systems and spreadsheets and all these different platforms. So thinking about equity and data, and what you're talking about, feels like that big next step when we're struggling just to get our spreadsheets in order. How do you help folks connect those dots?

10:39 You know, it's such a good question, because I do get that. Usually, whether it's my consulting or my volunteering in the sector, the questions start with: what data points should I be collecting about impact? What other things should I be collecting? I try to unpack that sentence. While I recognize that the nonprofit is saying they're ready to collect more data, I ask them to take about five steps back from where they are asking this question. Because, one, most often they come with this idea: I want to be more inclusive, so I want to collect more data; I want to be more inclusive of the communities I am not serving right now, and so I want to collect more data. There is a gap. The gap is that we haven't really clarified and agreed on the same definition of what inclusion means to us, what equity means to us. We are jumping, taking two quick steps to collect data on something, and feeling like, okay, we can quickly get our hands dirty and get rolling with this work. But we really need to hone down that definition: what does inclusion mean to us? Before even answering the question of where you can start with data collection, I often say that the relationship between the word data and the word equity or inclusion is a partnership. I read in a book: your data is as good as your democracy, and your democracy is as good as your data. So your data informs the work you are thinking about doing with inclusion and equity, and the work you do with inclusion and equity informs your data in turn. It's a two-way street, a partnership. Once they have that clarity, then we start with some frameworks for getting started with data collection: Where are you starting? Why do you need that data? What existing data do you have? And what can you do right now to make sure that you are not just adding more to your database, but really collecting with purpose? So we get started on those frameworks.
But to your question, which is a great one: I start with five steps back. Okay, why? Let's ask why five times; it's like peeling an onion five times. I could keep doing it, and you get to the good part. Barbara says they've thought about it and they're making it a focus in this program. I'm really glad, Barbara. How are you thinking about it? If you want to share a little, over the chat or by voice, is there anything specific you are doing?

While she's typing, let me share something I have been learning through my workshops on advancing equity. I do these workshops on data collection and advancing equity; I have done them with nonprofits and with individuals, and one common thread comes up through every workshop. It's this question: okay, I am here to learn about advancing equity through data, but where do I start? That's the basic question you were kind of asking: where do I start? And it's interesting when people ask this question, because the answer sort of depends on who's asking. I have been asked this question by a data coordinator; I have been asked this question by a board member. Oftentimes, what gets missed is a unified outlook. We need to answer this question collectively, as a team, as a group, instead of me, Meena, one individual employee in the organization, trying to do something with data on my own. That could be isolated, that could sit in silos, and silos don't help. So advancing equity in data is not just about data; it's about the whole culture pushing towards equity.

But let's read what Barbara says. Okay: first, we are thinking about what data we are collecting and who is being excluded. Which is great. As an example, if we are asking people to fill out a survey on their smartphones, we are missing people who do not have that capability. That's a brilliant example, Barbara. I would give an example from my end. Last year I was working with an organization based out of Seattle. Their mission is to support immigrant women going through domestic violence, and they offer legal services. My job was to support them in collecting the data and creating the impact report and the impact stories. The way data collection was happening before I joined was sending out emails and sending out postcard-style letters: send your stories back to us. When I joined, the first thing we realized was why they were getting so few stories. It's because the how of their data collection was not inclusive. They are reaching out to women who took legal services because of domestic violence. These women do not necessarily, one, have a stable home address to receive physical cards and write down their stories, and two, it's not safe; they don't feel comfortable sharing their stories out loud on a piece of paper and sending it back. So it's an excellent point, when you are starting out with your data collection, to think about how you are doing it. It's not just about the questions; it's also about how you are sending out your data collection. Sometimes it requires a phone call, sometimes it requires paper. If you're doing it on paper, what you could do is use a QR code for your data collection tool, say on a survey platform.
And you can distribute that survey in two or three forms, depending on where your audience is. The primary objective of data collection is to meet your audience where they are. If they are more comfortable with a paper-based survey, you can send them a QR code and then collect that data in your systems. So I'm really glad that you're doing that.

On the wording of the questions, there is that too; that absolutely makes sense. One of the examples I would give: I was recently doing a workshop for an organization that supports BIPOC entrepreneurs here in Canada. As part of their process, they go out in the community and interview BIPOC entrepreneurs. One question they ask is: what are some of the advantages that BIPOC entrepreneurs have? And question two: what are some challenges that BIPOC entrepreneurs have in running their business and innovating in their business? Those were the two questions, and both have wording that should be changed. One: what are the advantages to BIPOC entrepreneurs? This seems like a simple question, but it has an assumption. It assumes that BIPOC entrepreneurs have some advantage, so it starts with a bias right there. The question needs to be broken down: do you think there is an advantage, and then explain what that advantage really means. Because to one person it could be about connections, to another it could be about resources. That's one example. The other question, about what you think about innovating and running a business and what the challenges are: if you ask it like that, you're actually packing two questions into one. So the wording, again, is really important.

18:57 Barbara writes: and once we get the first round of responses, we look to see who's missing and figure out how we can reach them. Do we do community outreach, meet people? I'm not sure. That's a perfect comment right there, Barbara: who have you missed in your responses? I would offer one more suggestion, from things I've seen out in the community. If you are asking questions that are sensitive to someone's identity, so identity-based questions, offer an option of "prefer not to respond"; explicitly state it as a checkbox. You will be surprised when you analyze that data. You can actually track how many people selected "prefer not to respond," and that could become something like your trust metric. Let's say this year you sent the survey to 70 people and 50 selected "prefer not to respond." Great. Next year, if you do the same survey, maybe you find 40 people selected it, so 10 fewer. That's them starting to trust your work, your work towards demographic data. So I would say: don't just track who is missing from that data, but also, for whoever is responding, to what extent are they comfortable responding to you, and what is missing in their responses? This will also help you next time you do a survey. You mentioned: do we do community outreach, meet people in laundromats, churches? You may have to change your way of data collection to understand who was not there in the first round of analysis.
Who are the people who got it? You also get to understand how you can tweak the language when inviting people next time for the survey. How can you be more conscious, more intentional? How can you give examples: these are the kinds of things we are hoping to get, this is what we're looking for. That can be another approach for your second or third round of data collection. But those are some really great points you're thinking about.

21:05 That was such an interesting point about the trust metric. That's so interesting to me, because I think it's something we all fall into, right? You try something once and you're like, oh, that doesn't work, we didn't get a lot of responses. But having that data to dig into, and keeping on improving instead of just throwing an idea out the window, I think that's so smart.

21:27 And one thing that I have noticed, especially when it comes to collecting sensitive information like demographic data and social identity data: most surveys or data collection tools only have one email, "if you have technical issues reach out to so-and-so." That's the only humanizing connection in that data collection. The more you can humanize your data collection tool, which is going out to so many people, the better the chances that you will collect good data. So one of the things you can do, apart from the basics of making sure it offers trust, humanity, and privacy, is this very concrete practice that I often suggest: offer at least three email addresses in your cover letter or at the top of the survey. One is for anyone who wants to know the outcome of the data collection; you need to offer them a human email address, not a generic one, and say, for the outcome of the study, reach this person. The second email address: we are including ten demographic questions; if you have any questions or concerns about why we are asking for race, gender, or ethnicity, please feel free to reach out to us at this address, this is our survey designer. Once you give that email address, it instantly creates some level of trust in the audience's mind: okay, they are offering me a chance to react to these questions, not just forcing me to respond in these checkboxes or text boxes. And the third is, of course, technical assistance; I'm not saying replace that one. But humanize your data collection tool so it can come back with good data.

23:18 That's an interesting point, too, because a lot of times I can think, oh, maybe it's an anonymous survey, or you wanted to seem more formal; I could see why people veer in that direction, especially if it is sensitive, like you said. So it's balancing it: maybe it's anonymous, or it's sensitive information, but you humanize it too. I've never thought of it like that.

23:39 And one of the things you achieve when you have these two or three email addresses is that you add more accountability for your nonprofit's leadership. Once you offer these email addresses, your leadership feels, okay, I can no longer ask gender in just this binary male/female format, because people have an email address to reach out to and react. So you automatically add a level of consciousness, just to make sure that the data collection is ethical.
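To make the "trust metric" idea from this exchange concrete, here is a minimal sketch in Python. It simply computes the share of respondents who chose "prefer not to respond" on a sensitive question, wave by wave; the data, field names, and years are illustrative, not taken from any specific survey tool.

```python
# Illustrative sketch of the "trust metric": track what share of respondents
# chose "Prefer not to respond" on a sensitive question, year over year.
# All data below is invented for the example.

PREFER_NOT = "Prefer not to respond"

# One list of answers per survey wave, for a single demographic question.
responses_by_year = {
    2022: ["Woman", PREFER_NOT, PREFER_NOT, "Man", PREFER_NOT, "Non-binary"],
    2023: ["Woman", "Man", PREFER_NOT, "Woman", "Non-binary", "Man"],
}

def prefer_not_rate(answers):
    """Share of respondents who declined to answer this question."""
    if not answers:
        return 0.0
    return sum(a == PREFER_NOT for a in answers) / len(answers)

for year, answers in sorted(responses_by_year.items()):
    rate = prefer_not_rate(answers)
    print(f"{year}: {rate:.0%} chose '{PREFER_NOT}' ({len(answers)} responses)")

# A falling rate over time is one signal (not proof) that respondents are
# growing more comfortable sharing demographic data with the organization.
```

The point of the sketch is only that the metric is cheap to compute once the "prefer not to respond" option exists as its own answer value in the data.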
But I want to take a pause here and see if anybody has any questions or reactions to what we are talking about so far. Does this resonate with you, or do you feel like this doesn't belong to your realm right now?

24:37 Okay, sounds like I can keep talking and keep sharing more from my work. Well, one of the things that often comes up: I talked about the partnership between equity and data. Your equity work gets informed by the data, and then, once you do something with equity and inclusion, that informs the data. So it's a partnership. But what do you do to actually translate that into action? Here's something I can actually share; give me one second so I can share it with you.

25:31 I want to share with you what the connection between data and equity is, and I'm going to do that right now. Okay, so everybody can see my screen? Perfect. Here's my screen. When you think about equity, advancing equity or inclusion, you don't necessarily start with data. You start with clarifying your understanding of the words equity and inclusion. It could be for your program, your policy, your grantmaking, your funding. Let's say you want to make sure that your program is inclusive. Now, what's the connection between that and data? Like I said, data will inform that work. You will look at your data and find out which communities are not represented in your program, which do not get served by your program. So you take some actions, you do something about it in your program, and that informs the data you are collecting. To start seeing some impact, you collect some data and bring it back into your system. So it's a partnership. But when I say the word data, what does it really mean? It can be about collection, it can be about analysis, and it can be about reporting strategies and dissemination. Dissemination means: how do you share all the research you are doing back with your community? The collection piece is in green right now because we're talking about collection here, but there are three aspects that inform, or get informed by, your work.

Now, what does it mean in action? In terms of collection, what can you do? You can create a metadata template. This is instantly the simplest, easiest thing you can do. For any data you are collecting, ask a few questions: Who collected the data? How was the data collected? Why was it collected? Where was it collected from, and when? What was the sample size? How was the decision about sample selection made? Who was included and excluded? These are some basic metadata questions that instantly give you the power to bring some equity into the conversations when you take it back and think about your program.

28:31 Let me go back to where you are in my million tabs; here I am. What do you think, looking at that metadata? Is that something that adds value? I'm curious, because I often say this out loud in my workshops: include this metadata. But I want to ask: first of all, do you do anything with metadata? Do you think it's possible, or could this add more to your workplace?
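As one way to picture the metadata template Meena describes here, below is a minimal sketch in Python using a dataclass. The fields simply mirror the questions she lists; the class name and the example values are hypothetical, invented only for illustration.

```python
# A minimal metadata template mirroring the questions above: who collected the
# data, how, why, where, when, the sample size, how the sample was selected,
# and who was included or excluded.
from dataclasses import dataclass, asdict

@dataclass
class DatasetMetadata:
    who_collected: str
    how_collected: str
    why_collected: str
    where_collected: str
    when_collected: str
    sample_size: int
    how_sample_selected: str
    who_included: str
    who_excluded: str

# Hypothetical example for a volunteer sign-up form.
volunteer_form_meta = DatasetMetadata(
    who_collected="Volunteer coordination team",
    how_collected="Online form linked from the website and a QR code on flyers",
    why_collected="To match prospective volunteers with open roles",
    where_collected="Community events and organization website",
    when_collected="January to March 2023",
    sample_size=142,
    how_sample_selected="Self-selected: anyone who chose to fill out the form",
    who_included="People with internet access or who attended events",
    who_excluded="People without internet access who did not attend events",
)

print(asdict(volunteer_form_meta))
```

The same nine questions could just as easily live in a spreadsheet tab or a survey cover sheet; the structure, not the tool, is the point.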
Because this easily adds to your equity lens; it can easily help with your question. I want to understand the challenges and barriers to doing this metadata: what are your challenges?

29:25 One of the common challenges I have heard so far: sometimes when people start collecting metadata, they realize, oh, this data point was collected 20 years ago in this organization. They go back to the leadership and ask, well, what happened here, and the answer is, oh, you know, that was even before I joined, so I don't know where it's coming from. To avoid that, we can create some metadata along with our data collection, with our surveys. So I want to know: have you done something with metadata, or are you thinking about it? How are you thinking about your data collection? We can even talk in a more general sense, not just the metadata part.

30:21 One challenge that I hear come up... oh, Barbara, you go first.

30:26 I didn't know that was going to work, I'm so sorry. So, I'm with Mikva Challenge; we do action civics. We have been tied to critical race theory, although that's not something that we do, and in America that's a bit of a hot-button issue at this point. So I think this metadata is super important to give context and integrity to our data, to perhaps counter some things that might not necessarily be as truthful as they're coming at us. I do understand that it could be seen as additional data to collect, which could be a little bit burdensome, but I think in the long run it would pay off, so that we are really able to stand behind it and know that we have data integrity. And also, looking at your questions, Meena, I think those really do help us ask in different ways to ensure that we do have equity: what's being collected, how is it being collected? I also know, to your point about domestic violence survivors, that oftentimes the desire to collect data can be burdensome: let's collect everything, and then you're asking somebody to fill out a 20-page survey or whatever. It's very burdensome, and there's an equity issue there, too, where you're losing people. So I think there's a lot in there. I don't know how to do it all myself; I'll figure it out. I'm sorry.

31:50 You don't have to, that's okay. I'm going to show you a book, Barbara, and everybody, and Kristina. This is a book called Democracy's Data, the hidden stories, by Dan Bouk; I'll write the name down. Democracy's Data. It's a great book about U.S. democracy and its connection to data. It speaks specifically about the story of how the 1940 census was designed: who were the people in that room, and what exactly came out of it? The author is a historian, too, and he brings examples of who was counted in the 1940 census. Some of the people recorded door to door were Mexican, written as "Mex" on the form. He shows a couple of forms, and by the time those forms reached DC, the people who were analyzing the data had turned what was written as "Mex" into "white." So between Texas and DC, the race changed. It's a very powerful story about how data responds to very nuanced things, and it's full of stories.
So if anybody in the audience is ever interested in understanding all of these little pieces, how every dataset has a story, and how it's on us to start reading and understanding it, this is a good book to start thinking about it. But thanks for sharing that, Barbara, and for coming on camera, too. It's always nice to see more people and know, okay, they're here. What's happening with your survey, by the way? When are you doing it?

33:52 We do surveys pretty regularly after programming; we have youth programming as well as teacher programming. The one I'm gearing up toward, where I'm really thinking a lot about equity, is an alumni survey we do every other year. It's very wide-reaching; we try to reach several thousand people, and of course we don't get that many responses. I'm in the planning process, so hopefully next fall at this time that survey will be going out, but I'm really taking a look at these equity portions to make sure we're getting a good picture of our long-term impact. I think it's super important.

34:27 Absolutely. And I would say, as you're thinking about your surveys, think about how you can clean up your data collection to have one single source. You can use logic-based branching in surveys, right? So several questions go to alumni, several go to your donors, some go to your board members. The more you can run through the same form, the easier it becomes to analyze the data and see where there is overlap: some of your board members could be alums, some of your volunteers were alums, or alums became volunteers, all those kinds of things. So I think that's really interesting. If you have one big survey, you can send it out to everyone and find a lot more patterns, and that can also add to your equity piece, because you'll be following someone's life journey with your organization and seeing how you can engage them.

35:28 Barbara, I'm curious, what's your role in your organization?

35:33 Director of data and evaluations. There we go, great, that's exactly why. And I'm starting to take on a compliance role as well. The marriage between evaluations and compliance is becoming a lot clearer to me, and through the compliance piece I'm better able to understand how to make the data less overwhelming for those who otherwise don't get to see data quite as much. So my eyes have been opened in an interesting way recently.

36:11 Definitely. One of the ways I have seen for making data more accessible and understandable is to take one sentence, any sentence, say, "25% of the mangoes are going bad." It's a very simple sentence, right? Ask the folks who are not doing analysis on a day-to-day basis what they understand from that sentence, what kinds of questions it raises for them. Most people will say it's about mangoes, or about things in the fridge. You can understand where those folks are coming from when they read that one sentence. You want to hear from them: 25% of what? Is it 25% of 4 mangoes or of 100 mangoes? When were the mangoes collected? Is this important? Why are we even asking?
Those are the kinds of questions you want people to get used to. The more day-to-day, basic examples you can pick for someone to start looking at data differently, the better that will translate into your work as you get into the compliance piece.

37:38 That was interesting to me. When you were talking about the metadata, you acknowledged it's more data, right? So if you've got a team of folks whose responsibility it is to dig into that, it probably comes naturally to someone like Barbara. But when we're talking about more data, that's where you can get into that loop of it feeling overwhelming, because then you've got more data to work through. So I like the suggestion to get feedback from folks on how this is resonating with them, how they are hearing and understanding it, because not everyone is very into data analysis; you can't throw so much at folks. It's a tough balance.

38:25 Any reactions? I mean, Barbara, you and I are definitely connecting after this, because I want to. And of course, Kristina, you and I are going to be hanging out in the networking room after, so yeah. But anybody else who has a reaction to all these good things we are talking about? Does this resonate with you? Does even the discussion feel overwhelming right now? I'm curious how it's landing with folks.

39:15 I am curious to know, for other folks in the room, what role do you have in your organization? We were putting a spotlight on Barbara, of course, but do we have other folks who have a data-specific role, or is data one of the 12 hats that you're wearing?

39:46 Looks like we are still waiting on responses. One thing I would say, while people are typing what their roles are, is that regardless of what our role is, data feels to me, and I take a lot of examples from the kitchen, like being in the kitchen. Everybody in the family has a relationship to that space, right? Someone packs the fridge, someone buys groceries, someone cooks, someone only raids the fridge to eat, like me. There can be different roles in the kitchen, but everybody has a relationship to it. Data is something like that. Some of us are responsible for creating charts, but that's just a small portion of it. The majority of us are reading about data in the media, in Google, in census alerts; we see it in reports, we see it in impact stories. When we are talking to our donors, we take some data, we take some help in forming our narratives and context. We are dealing with data whether or not we like it, whether or not we acknowledge it, whether or not we feel comfortable. And I think this is a good time to be really conscious and ask a few questions to explore that relationship with data. While people are writing and responding, I'm going to leave my email address here, too, and my website address, just in case it helps.

Paula asks: what can you share about storytelling with data? That's a great question, Paula, thanks for asking. Storytelling with data. Let's see. I would say it's really important. Like I said, everybody has a relationship with data, and I also showed you the picture with the different actors in the data, so storytelling is truly important.
That metadata helps you in your storytelling. Oftentimes, if you pick up any book about storytelling with data, or you Google it, what you will get is a tip sheet, or ten things you can do to make your charts better. But that's only part of the story. To be able to create a good story, you need the full picture, right? You need the full context: what gets included, whose voice gets included, what got missed, who got deleted from that data, and why. Those are not the things that are often captured in a chart, because the chart is showing you five neat little data points, with bars, labels, and text, but it's not giving you a lot of the things that sit behind that data. So when you are thinking about storytelling with data, there can be different things, and we can react and respond to them in different ways. The primary thing I would say is to make sure you are offering enough to give a holistic, complete picture of the context of the story you're telling. It could be through the metadata. It could be by including the stakeholders in the room, if you're not collecting metadata. It could also be by including your community in the conversation: invite them, include your audience directly and ask them, hey, this story, this analysis, this report we are producing, does it resonate with you? How do you feel about it? Asking simple questions can really help inform the stories you produce out of your data. That would be my primary reaction to the question about storytelling with data. Other than that, I will say most of the resources only cover the good things you can do with charts. If you're also thinking about equity in storytelling, go back a few steps to the fundamentals of the data and where it's coming from. Paula, does that answer your question to some extent?

We have Tyler in the chat: I don't work with data so much in my current role; however, I am a huge data person and would like to pursue a career in the field someday. This has been super insightful. Thank you, Tyler. Paula says perfect; thank you, Paula. Tyler, I love that you said you are thinking about pursuing a career in the field someday. That's beautiful, because I love data, so I always like seeing someone move into the data field. Just as a reminder: whatever your role is today, you already have a relationship with data, so you're starting with a strong foothold. You don't have to learn something completely new; you are really close to the subject already. The only thing I would say, as you are thinking about pursuing a career in this field: keep asking why, and you will get better. Curiosity helps in data; it leads to imagination, and it leads to taking new steps. So if you are curious, stay curious, and you will do great with data. Thanks for sharing, Tyler.

Any other questions? Is anyone else thinking about moving into the field of data someday? One of the things I talked about at the beginning is that I want to listen to the silences in the data. It's a tough sentence and a tough word, because you don't know how to think of the silences, right? So now I personally keep jotting down these metadata points in anything that I'm reading.

John asks a question: when you are referring to data, are you referring to Google Data Studio? That's a great question, John.
So Google Data Studio is a tool to visualize your data. When I'm referring to data, I'm really referring to the basics. It could be your Excel spreadsheet, it could be the numbers in it, it could be a survey coming through SurveyMonkey. I use the generic word data, which can come from different places, while Google Data Studio refers to a specific tool that can help you visualize your data and create some analysis out of it. The differentiation, again going back to the kitchen example, is that data is the raw material. It's like the tofu, which you can use to cook different things. Google Data Studio is like the nonstick pan; it's a tool that helps you cook that raw material, the tofu. I take a lot of examples from the kitchen; I'm a foodie. Any other questions that folks have?

47:31 Along those lines, with technology and tools like Google Data Studio, are there tools you recommend if folks are embarking on the early stages of this? Or is it really that curiosity that's key to getting started?

47:47 I would say it's the curiosity, to be honest. I mean, I have worked with a lot of tools in my past jobs, in different roles. I refrain from making recommendations unless it's super specific: if someone wants to visualize something, then yes, go for Data Studio or Tableau or Power BI, whatever works best. But if you really want to learn about data itself, it's about curiosity and sitting with your team. It could be a team meeting where you take 15 minutes to talk about, hey, let's talk about these five data points that we frequently use. Where are they coming from? Who are they serving? Who are we excluding in this conversation? That kind of conversation is what I mean: a tool-agnostic, product-agnostic conversation. John, hi, thanks for being here.

48:41 I just thought I'd ask the question in person rather than type it out. Data really relates back to the key performance indicators of the organization, correct? So when you're coming up with the key performance indicators of your organization, that's pretty easy to do online. We have the Google Analytics side of things; we know website visits, we know website user sessions. And when you get into the AdWords side of things, you can track click-through rates and things like that. So I'm a little confused: when we refer to data here, what is it we are actually measuring in a typical nonprofit organization?

49:43 That's a great question, because the response will change. You brought up the examples of, let's say, digital fundraising, SEO, or KPIs around what you're measuring on your website; that example of what you are measuring with data will differ. If we ask Barbara, who is more responsible for compliance and data from alumni, the answer would change. If we asked Kristina, it would be more around, let's say, Neon's product sales, product awareness, and relevance. So when I speak of the word data, it can mean different things depending on where we are coming from. To you, it may mean websites; to one person it could mean alumni; to another it could mean sales.
But what I really mean to say is that we can have better fundamentals, better whys, around our data, and more intentionality about what we are seeing, even when we are creating those KPIs. For example, for the website project you're talking about: what are the important KPIs we want to look at right now? We have a ton of KPIs spread out over spreadsheets sitting in different places. Do we look at all of those KPIs? Which ones are the most important? Where are we getting the data from? Those are some of the questions I've been talking about when it comes to data. But you're absolutely right: your answers and my answers won't match up, because I work with a different kind of data than you do. Where we would match up is in ensuring that our intentionality matches our action. That's where we would reconcile with each other. John, did we lose you? Are you there?

51:43 Yeah, Meena, you cut out a little bit, and I hope I'm not talking over you right now. But if you can hear me: there's data as it relates to program outcomes and things.

52:01 Oh no, I think we lost him.

52:07 Yeah. But I think he raised a good point. Oftentimes I get this question: I took your workshop, now what? When I go back, what do I do first? Well, the first thing you can do is talk to your team about these fundamental questions. Your team may be responsible, let's say, Kristina, for brand awareness of Neon, or sales, or engagement of the audience. Then you can talk within your team, starting with: okay, these are the data points around audience engagement and brand awareness. Are we looking at the right things or not? Where are they coming from? And if you have been tracking them all through the years, maybe now we really need to think about who was getting excluded from these conversations, who has been missing from these conversations.

53:06 Yeah, that's the thought I just had. So often KPIs are based on benchmarks, or loose benchmarks, or past performance indicators that you have. And sometimes with KPIs it's, well, we've just always done it this way. You're moving a million miles an hour, right? Nobody's stopping to think a lot of the time: okay, but is the benchmark data we have equitable? It's a very interesting thought exercise.

53:32 Benchmarks are the most interesting, I would say. It's been 14 months since I started my business, my practice of Namaste Data, and I have refrained, even as an entrepreneur, as a businesswoman, from looking at benchmarks of how consulting practices should run, because it's not really comparing apples to apples. Those KPIs, if you don't have intentionality, can put more pressure on the culture. It can feel like, oh, am I matching up to some random number where I should be? We need to move away from that.

I want to give some space to folks; we have five minutes left. Any questions so far? I'm going to type in my email address here as well. I use my Gmail. I realize my work requires building some trust, because I'm talking about data equity, so I took my Gmail and cleaned the personal things out of it. This is my Gmail, and just as a reminder, this is my website. I have a practice called Namaste Data.
I work with nonprofits and social impact agencies on the data side of things: research, AI, algorithms, all those sorts of things. So if you don't have questions right now, that's totally okay, but I hope you have some at some point, and then we can talk about it.

55:19 Meena, I'm thinking of Tyler's question about wanting to get into a more data-oriented career. How did you get into your line of work?

55:29 Oh, that's very interesting. So I had a formal education; I went through my school years in computer science. I started in tech about 15 years ago. I come from India, and I started my own school for sexual assault survivors, so I was exposed to the idea of doing something good with data. At some point I realized my tech job didn't support me enough to sustain my school, so I moved to the States, and from the States to Canada, and I have moved through different roles. From my experience, I can say there is no one way to get into the data field, or data science, or AI for that matter. It really comes down to one word: curiosity. Whatever data lies in front of you, try to make some charts from it, then become more curious about making those charts better, then become more curious about what other data exists that can augment yours. The more questions you have for your data, the more you will automatically be exposed to the kinds of things you can do in your jobs and careers with data. You could become a data visualization expert; you could become a data scientist doing predictive work. My journey came a lot from just asking these questions at different points and figuring it out. What I'm doing right now grew out of the awakening of the pandemic and the racial justice movement: seeing that there is a lot of data sitting in front of us that does not include people whose voices are different, whose accent is different, who look different, and asking how we can make that intersectionality visible to us in the data and in the analysis as well. So that's interesting. Oh, we have John back.

57:26 Yeah, sorry, I don't know if it's bandwidth on my side or whatever. But just as a follow-up to my previous question: I was trying to drill down on the question of data. There is data that has to be tracked for program outcomes and things like that, in logic modeling and so on. In your view, how do you see those two coming together: how a nonprofit's effectiveness is presented to the general public at large, versus measuring programmatic data outcomes?

58:23 Like I say, yes, absolutely. I never say let's not measure data; by its nature, it obviously needs to be measured. So using it on the programming side of things, using it for a logic model, that makes sense. The question really is: what are we including in that data, and what kinds of metrics are we creating out of that process? The process is a good one; it's making us intentional, it's making us think about whether our programs are working or not. It's really about what we are feeding into that process, and what is coming out of it that can help us become more purpose-driven with our programs.

59:08 Okay, yeah.
Yeah, because I know, when you write grants, they're very specific; they want you to be very specific as to the objectives of the program, the outcomes, and the processes that go in and generate those outcomes. We're in a process right now, looking for someone who could help us really articulate that data plan in the context of a grant we're writing. We feel that we have pretty much laid out the logic modeling, and we know what the inputs and outputs will be and how we define success, but we have a little trouble articulating that in a grant narrative.

1:00:09 Completely understandable. Yes, it's a very common issue. I'm going to connect with you, John, after this session. I want to leave the space, Kristina, for the next coach, right? Do we have them here?

1:00:26 Yep. I think we're one minute past our time, so we might be eating into our next session's time here, but thank you, everybody, for joining today. This was a great conversation. Thank you, Barbara and John, for chiming in as well. Have a good rest of the day, everyone. Thank you so much, I really appreciate it. Thanks, all. Bye. Thank you.

Transcribed by https://otter.ai