Tim Sarrantonio 0:05
All right. Hello, hello! Welcome, everyone, to today's presentation, Data Equity in Practice for Nonprofits, presented by Neon One and LA Tech for Good. Really excited for this one, and judging by the registrations, you are too. We're going to get started in just a moment. I'll do a tiny bit of housekeeping for a minute or two as people mosey into the room, and then I'll hand it over to Rachel Whaley, and we're going to have a really fun time today on a very important topic. As we get settled in: my name is Tim Sarrantonio, Director of Corporate Brand at Neon One, and I'll be here to help manage the experience during the presentation, running polls and answering questions. We are recording this, and we have live captions enabled through Zoom as well. The recording will be emailed out afterwards, so if you have to leave for another obligation, don't feel bad about that; Rachel won't be offended either. We'll send the recording out by tomorrow, along with the transcript, a link to the slides, and some other great resources that the LA Tech for Good folks have put together on this topic. With that housekeeping out of the way, we're going to go ahead and get started. Really excited for today's presentation, Rachel. The floor is yours; I'll be here to launch a poll when you tell me to and to relay any pressing questions, but otherwise, take it away.

Rachel Whaley 2:02
Thank you so much, Tim, and Neon One for having us. We're so excited to be here with you all today. We are talking about data equity in practice, so not just in theory but in practice, for nonprofits, and I'm very excited to be part of this series. Let me tell you a little bit about what we're going to cover. What is data equity, and what does it have to do with your mission? How does that build into data ethics? How does that build into starting to think about algorithms, machine learning, and artificial intelligence? Whether or not you're interacting with those technologies yet, how can you start to think about them from this foundation? And then, what does it mean to implement some of these things, to get from the theory into the practice? How do we actually make some of this real in our organizations?

At LA Tech for Good, we like to think of data equity as a pyramid. At the bottom, the foundation is where we think about concepts of design justice, which we'll get into a little bit: why are you doing what you're doing? It's about getting really clear on the why of each individual practice, of the broader projects you're working towards, and of the broader social context in which you're working. On top of that clear foundation, we can get into the meat of our conversation today, which is how you approach equitable, ethical practices in your data work: how do you make sure that what you're doing with data is really responsible and really aligned with your nonprofit mission, your social good mission more broadly?
And then finally, sitting on top of all of that, you can start to think about getting involved with artificial intelligence or machine learning. A lot of the technologies you see in the market are increasingly incorporating these, so even if you're not building an algorithm from scratch at your organization, even if you're just buying off-the-shelf software, you are likely to start encountering this soon if you haven't already. We like to think of these topics as a way to give you a really solid foundation to evaluate those things and consider them strategically and responsibly for your organization. So that's a brief overview of what we're going to talk about today.

I would love to start with a poll. Tim, if you could open it up: I just want to get a sense of whether these are totally new ideas to you. On a scale of one to five, one means data equity and data ethics are totally new terms or ideas for you, and five means you feel like an expert and you're just here for a bit of a refresher. I'd love to get some responses and get a feel for where this group is at.

Tim Sarrantonio 5:01
We're getting a lot of responses, so I'm actually going to let it run a tiny bit longer; we're in the 77% responded range, which is, wow, really good. And we have somebody using the chat too, all good, amazing. So I'm going to go ahead and end the poll now, and I'll share the results for folks to see. Do you see that, Rachel?

Rachel Whaley 5:27
I do. Thank you so much, Tim. Okay, so it looks like for almost a third of folks this is totally new. That's awesome. And for more than half of folks it's either totally new or somewhat new, which is great. I'm so glad that you're here. I really only started learning about these topics a couple of years ago myself, and I felt like they impacted my work a ton. I've worked in data in a few different industries, and I kept running into the same questions: what information is getting collected, and who's deciding that? How is it getting stored, and who's deciding that? What are we doing with it, how do we make those decisions, and who gets to be in the room when we make them? All these types of questions kept bubbling up in my data work, and there was never a clear answer that I was able to find. The questions would get batted around from department to department or team to team; there often wasn't a clear decision maker on whether we should collect a piece of data or, if so, how we should collect it. There's a gap between the technical pieces needed to actually collect and store the data and the strategic decision making that needs to happen about what that data actually means for your organization, and then how you think about responsibility and equity and ethics across all of that. I was feeling like I needed more resources on this; it was bringing up a lot of big questions, and there weren't a lot of great answers. That is what put me on this journey of learning about data equity for myself, and with LA Tech for Good, which is a nonprofit that I'm involved with.
We started bringing cohorts of data folks together to talk about these things, because we feel like they're really important ideas that are just starting to gain traction. There are a lot of folks doing great work on them in the research and academic realms, and we were reading that work, really excited about it, and wanted to pull it down to a more practical level. We see all this great theory being developed at a high level, but how do you walk into your team meeting and say, hey, I think we should start implementing this stuff? What do we do at our team meeting next week? What are the really, really practical things we can actually do to make this a reality? That's where we got interested in this space, that's where I'm really interested in this space, and that's where LA Tech for Good is coming from. So it's great to have you all, especially if you're new to this; I'm really happy to have you on this journey with us.

Great. So let's start by talking a little bit about equity. This is one of my favorite quotes from Data Feminism, which is an awesome book to read if you're interested in learning about data equity. The authors tell us that equity is both a process and an outcome. What that means to me is this: often you're doing some sort of data project. You're producing a report, or building some kind of predictive model, or designing a survey to collect data from your constituents, your stakeholders, or your community. And when you get to the end of that project, you want to be able to say that the result is equitable: that the outcome is fair, that it's free from bias to the extent possible, that it's representative of the folks you feel you're representing. All of these things can feed into a definition of equity that you want to reach at the end of your project. The argument that Data Feminism makes, and that I strongly agree with, is that you cannot get an equitable outcome unless you're thinking about equity throughout the entire process. Whether it's how you're designing the visualizations for your report, or how you're choosing which questions get included in your survey form, or which variables get fed into your machine learning model, if you're not thinking about equity, and about how you can be a responsible data practitioner throughout the whole lifecycle of your project, it's not really possible to get an equitable outcome.

You've probably seen variations of the graphic on the left-hand side; there are a few different ones out there, some fun takes on the question of what equity actually means, and the answer to that is pretty complicated. In this graphic we've got some kids trying to watch a sports game over a fence, and it poses that the idea of equality is to give each kid one block to stand on.
When you look at equity, depicted on the right-hand side of the graphic, you see that the outcome is really different. On the equality side, yes, each kid gets a block, but one kid didn't really need it; another kid actually needs two blocks, because she's a bit shorter and needs them to see over; and another kid needs a ramp to get up and see over the fence. There are lots of variations on this, but I think the idea in this and similar graphics is the same: defining equity in your specific context involves getting super clear on exactly what you're trying to accomplish and what you need to do to get there. If your goal is to have every kid see over the fence, then the equality approach in this graphic isn't achieving that; only one and a half of the kids are seeing over the fence. If your goal is to give every kid a block, then yes, on the left side of the graphic, you've achieved that. That kind of clarity can get fuzzy in practice, when you've been working on something for a while and there are a lot of documents floating around, a lot of people involved, and a lot of complicated decisions that need to get made. It can be very easy to lose sight of that nuance, and from an equity perspective it's really important to get super clear on it in the first place and then to stay super clear on it throughout your project. That's how we start to think about equity.

As I mentioned with the pyramid when we were kicking off this session, design justice is a framework that we really love to use. It's from Sasha Costanza-Chock. I think it's a great way to think about not just visual design projects, which might be what comes to mind when you hear the word design, but your data project as a whole: how are you making decisions about this data endeavor you're on? A lot of folks who work in data, and I've been in this role and talk to a lot of folks in this role, are only involved with one piece of the project. Maybe you're a data analyst, some survey was collected a while ago, and the results are just being fed to you; you might not have been involved in the design of the survey, or in the decision to even do a survey, or in the whole process. That said, I think these design justice questions and this framing are still really important to ask, even if you're a little bit downstream, even if you weren't there from the inception of the project. Questions like: what story is told? How are you framing the problem? There are more examples on this slide. These are the types of questions you should know the answer to if you're working on a data project. And even if you're just a downstream analyst making some visualizations, the last person to work on the project, you should know these things. If you don't know them, you should ask, and if the people you're asking don't know, then that needs to become a bigger question.
If you're not sure who decided the scope of the data you're working with, then in making choices about what to visualize or what to leave off the visualizations, you don't really have a resource to guide you; these micro decisions feed into a big macro vision. As we were saying about equity being a process and an outcome, you cannot have the process be equitable if you're not clear that the goal is, for example, to get all the kids to see over the fence. And who decided that that was the goal? What is the value behind it? As a nonprofit in particular, thinking about the group of us here today, how does that feed into the values and the mission that your nonprofit is serving? Presumably you have some sort of mission for social impact that you're working towards as an organization, and a lot of organizations have a set of core values or norms that describe more about the how, how you go about reaching your mission. So how are those values and that mission informing what you're doing with your data, and how you're even approaching your data project to start? Design Justice is an entire book in and of itself, which I also highly recommend if you have the time; if you don't, I at least recommend reading a little more about the framework and thinking a bit more deeply about where all this stuff comes from, and how you write it down and communicate it so that everybody is really clear on the answers.

Okay, so that was a little bit theoretical: asking big questions and difficult questions, and thinking about your approach to your work. Let's get a little more tactical. You might have heard the phrase "what gets counted counts." The things you're measured on in your performance review inform the things you choose to do in your job, and there are countless examples of this: anything that's being measured, or set as a metric, or reported on regularly, especially to leadership, or to your constituents and clients, or to your funders, carries weight. What you decide to measure is really, really important. I really like this example from Data Feminism, which I've already mentioned as an excellent resource for thinking about these data equity issues. It's actually from the appendix of the book, but I think it's a really cool example. The authors designed metrics for themselves to measure their own success in addressing some of the structural problems they wanted to help address with their book. They list many structural problems; I pulled one example here, which is racism. Obviously there is no single book-writing project that is going to fully address racism. That's a deeply entrenched, ongoing issue that many of us are trying to address.
But they wanted to see: if we think about this structural problem through the lens of our project, which is writing a book, not even really a data project, how could we design a metric to actually start to address it? What they did is look at the citations in the book they were writing. It's a rather long book; they cite quite a few different researchers and practitioners working in this space. So they said: across the total number of citations in this whole research process, and it's quite a long list, we want 75% of those citations to come from people of color, and 75% of the examples of data projects that we're sharing to be led by people of color. It's a creative spin on how you design a metric to measure what matters, and how you decide what counts based on what you're counting. They used the citations, and the examples cited in their research, as their metric.

The other piece of this that I think is really cool is the transparency. Their book is openly available online, if you'd like to take a look, and they published these metrics. They published a draft of the book for peer review, a very common writing process, and at the draft stage they said: our metric was 75%, and here we're at 36% for the first one and 49% for the second. We're not at the aspirational metrics we set, but here's where we are. And then here's where we got to in the final metrics; you can see their numbers fall a little further as the book goes through copy editing and all of that. So they're opening up transparency at two different points in the process, toward this metric, which I think is really powerful. These types of metrics are a little different from organizational KPIs, the kind of thing you really need to hit in order for your program to work. They're no less important, but the process to get there might take longer. If you're thinking about systemic issues, it's going to take some time; it might not be the first draft, or in their case maybe not even the first book they write, where they hit these targets. But they're putting it out there and saying: this is important to us, we're measuring it, and here's our progress. That opens them up to feedback and suggestions on how they could improve, and it lets the people they work with know that this is really important to them and a priority. It starts to build all of the pieces that I think are crucial to getting to a reality where it's actually possible to have 75% of their citations come from people of color, for example. So this is one example, and at LA Tech for Good, one of the things we talk about in our cohorts is how you would start to design a metric like this for the project you're working on.
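To make this concrete, here is a minimal sketch of what tracking a citation metric like theirs could look like in code. The schema, field names, and numbers are illustrative assumptions, not the Data Feminism authors' actual tooling.

```python
from dataclasses import dataclass

@dataclass
class Citation:
    author: str
    # Self-identification, gathered with consent; an illustrative field,
    # not the Data Feminism authors' actual schema.
    author_is_person_of_color: bool

def citation_share(citations: list[Citation]) -> float:
    """Fraction of citations whose authors identify as people of color."""
    if not citations:
        return 0.0
    return sum(c.author_is_person_of_color for c in citations) / len(citations)

# Hypothetical bibliography; real data would come from your own reading list.
draft_citations = [
    Citation("Author A", True),
    Citation("Author B", False),
    Citation("Author C", False),
    Citation("Author D", True),
]

GOAL = 0.75  # the aspirational target, published alongside actual progress
print(f"Goal: {GOAL:.0%}  Draft so far: {citation_share(draft_citations):.0%}")
```

The value is less in the arithmetic than in the practice around it: publish both the goal and the measured value at each stage, draft and final, as the authors did.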
You have to think a little outside the box, and you probably have to get a little creative, like they did with the citations in their book. But how would you start to brainstorm ways you could look at racism, or heteronormativity, or any other problem you're interested in addressing? There are a bunch of resources on this, including the list of structural problems in Data Feminism, if you're curious for a starting point. So this is a really cool example, and something you could bring to your organization to start thinking about some of these ideas in practice.

All right, so we've talked a little about equity. Let's go up one level in the pyramid and talk a little more about ethics. I really like this quote from the Center for Applied Ethics: "We often focus too narrowly on whether we have complied with ethical guidelines, and we forget that ethical issues concerning technology don't just go away once we have diligently performed our own particular tasks. It is essential to think about what happens to data after it leaves our hands." If you're really interested in data ethics, I definitely recommend that you check out the film Coded Bias, which has much more in-depth examples of how we should and should not be thinking about technology from an ethical perspective. But I wanted to bring this quote here because, in the data communities I'm a part of, we talk a lot about what the true black-and-white guidelines for an ethical approach to data are. You hear about laws such as GDPR, or CCPA in California: very baseline laws that start to protect some of these pieces regarding technology and your personal data, and that start to sketch what a more ethical approach looks like. The way we like to think about this at LA Tech for Good is that data responsibility includes those things, but it's also much bigger than them. A really big piece of an ethical approach to data is making sure we meet those baseline, minimum guidelines for good data practice; beyond that, though, it's thinking about what happens after we've moved on from a specific task. For example, if you're collecting data, what happens to that dataset when you get a promotion, or when you move on to a different job? What happens to the analysis you produced, or the dashboard you built, or the predictive system you helped create? What are the things that come after you, and how do you think about them from a data responsibility perspective? There's a lot to consider there. These pieces of data, these data artifacts you might create, are created in a certain context, and that context is often not well documented, or the documentation is a bit of an afterthought.
So one of the practical approaches to data ethics that we talk about a lot with folks is documentation: documentation as a tool for a more responsible, more ethical, more equitable data practice. Why was this data collected? When was it collected? What is it appropriate to be used for, and what is it not appropriate to be used for, based on how it was collected? There's a great multidisciplinary work called the Feminist Data Manifest-No, which is also available online and will be linked in the resources; I highly recommend it. A line I really like from it is that data is a process and a relationship that we make and put to use, and we can make and use it differently. Data doesn't exist out in nature; data is something that we construct and collect as humans, so we're in a position of power to think about how we're actually using it and about different ways to use it.

So, thinking about documentation, and about how we could reframe or use data differently, I wanted to share an example from a really cool organization called Data for Black Lives. They published some really interesting reporting and data analysis of COVID-19 in the Black community in the United States. What I'm sharing here as a screenshot is not the top of their dashboard, with the charts and the parts you normally look at first; I've skipped all that and gone down to the appendix again. What they've included at the end of their dashboard is a really great section called "Data Set Usage and Best Practices," where they detail what the data they collected for this report should and should not be used for. They have a great section on the intended purpose of the data: it's intended to help with public health, with contact tracing, with all of the things that enable the community from which the data was collected to heal, to be in a better place with respect to COVID-19. That's as opposed to using the data in a more weaponized way: using it to surveil anyone, or to deny benefits to someone who, based on this data, might be shown to be at greater risk for some of the after-effects of COVID. You can go to their website, d4bl.org, which will also be linked in the resources, to see the whole report and read this full section. But this is really rare. If you've ever looked at a dashboard, or one of these public open data projects, or even a research report or survey inside your own organization where someone has put the data out there for everyone to look at, a section like this, a piece of documentation that specifically has equity in mind, where someone has thought through the scope of the data project and what it should and should not be used for, is very uncommon. I think part of that is awareness: for us as data practitioners, even being aware that this is something we could or should be doing is not a norm yet. I hope that someday it will be a norm, but it's not right now.
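One lightweight way to make a "usage and best practices" section travel with the data itself is to ship a small structured README alongside every export. A minimal sketch, with invented field names and values loosely inspired by the Data for Black Lives section rather than copied from it:

```python
import json
from datetime import date

# Illustrative documentation record; the fields are assumptions in the spirit
# of "datasheets for datasets"-style practice, not a formal standard.
dataset_doc = {
    "title": "Community program survey, spring wave",
    "collected_on": date(2021, 4, 1).isoformat(),
    "collected_by": "Example Nonprofit program team",
    "collection_context": "Voluntary survey of program participants; "
                          "all demographic questions were optional.",
    "intended_uses": [
        "Aggregate reporting on program reach",
        "Tracking progress on stated inclusion goals",
    ],
    "prohibited_uses": [
        "Surveillance or identification of individuals",
        "Denying services or benefits based on responses",
    ],
    "contact": "data-steward@example.org",  # hypothetical address
}

# Write the documentation next to the export so the two leave your hands together.
with open("survey_export.README.json", "w") as f:
    json.dump(dataset_doc, f, indent=2)
```

The format matters far less than the habit: whoever opens the export next, long after you have moved on, inherits the context along with the rows.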
So I think one piece is just making folks aware that this is an option, that it's out there, that it's a really cool thing to do, and a really tactical step you can take. Thinking back to that quote from the ethics center, documentation helps make sure that data, after it leaves your hands, doesn't spin or snowball into something it shouldn't. Documentation can't prevent that on its own, but it certainly helps guard against it. This is just one example; I highly recommend you check it out and think about how something like it might apply to the work you do with data in your own organization, because I think it can be really, really powerful.

Okay. So we've talked about what equity means, how to start thinking about responsible data practices, and some ways to start putting this into practice. Which brings us to the top of the pyramid diagram. Once you've thought about what it means to have a really solid foundation for your data work, made sure it's aligned with your values and with what equity means in your specific context, and worked to align your practices around data, visualization, collection, all the steps of the lifecycle, with those values and with equity in mind, you've reached the middle part of the pyramid. And sitting on top of that: you have all of this data, which you might currently be using to inform some kind of predictive algorithm, machine learning, or artificial intelligence system. Or you might be thinking about doing that, or exploring options to start. Or you might not have thought about it at all yet; maybe it doesn't seem to apply to your organization, for whatever reason. Whichever of those situations you're in, I think it's really important to start thinking about what algorithms would mean for the data that you have. What I mean by that: there's a quote from Cathy O'Neil, the author of a great book, Weapons of Math Destruction, that predictive models are opinions embedded in mathematics. Any predictive system is being fed from data. Humans have set up that system to look at specific data points or not look at them, or to weight certain data points with more or less importance. All of those little decisions that get made in the design of the system (and you can get much deeper into the technical details of this, which I'm not going to do in this conversation) are an opinion. You could have a predictive system that spits out some sort of predicted score or predicted status, and it might seem very mathematical, but it's all coming from this data. All the data that we've been talking about this whole time is what feeds these systems.
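To make the "opinions embedded in mathematics" point concrete, here is a deliberately tiny scoring sketch: two invented weight vectors applied to the same constituent record. Nothing here is any real product's model; the features and weights are assumptions, and that is exactly the point: someone chooses them.

```python
# One hypothetical constituent record.
record = {"past_gifts": 3, "event_attendance": 5, "zip_median_income": 42_000}

def score(record: dict, weights: dict) -> float:
    """Weighted sum; features absent from `weights` simply don't count."""
    return sum(weights.get(key, 0.0) * value for key, value in record.items())

# Opinion A: engagement matters most.
weights_a = {"past_gifts": 2.0, "event_attendance": 3.0}

# Opinion B: a wealth proxy dominates (zip code stands in for income, which
# can in turn stand in for race; see the zip code discussion later in the Q&A).
weights_b = {"past_gifts": 1.0, "zip_median_income": 0.001}

print("Score under opinion A:", score(record, weights_a))  # 21.0
print("Score under opinion B:", score(record, weights_b))  # 45.0
```

Same record, two very different "predictions"; neither number is more objective than the choice of weights behind it.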
And so, as we were saying with Data Feminism, equity is both a process and an outcome: if you don't have good data practices throughout your data lifecycle, then any prediction you make based on your data is not likely to have equitable results. I think it's really unlikely that you'll get a result you can feel good about from any kind of machine learning solution if you don't have your ducks in a row in that first part of the data pyramid. The reason I think this is important to talk about, even if you're not using this sort of thing yet, is that AI is being built into more and more software tools and products. A vendor might approach you with this built in as a feature of a product you're using or thinking about buying, or it may just be a matter of time, maybe a few years, maybe months, before it comes along for you. So start thinking about some key questions. Where is algorithmic processing used in your field, your industry, and the area of services you offer? When is it an algorithm that's predicting or deciding something versus a human, and for which populations of people is that decision making happening? How would you like that to work in your organization? How would you think about it through an equity lens: what does it mean to have an algorithm predict or decide something, in which cases is that helpful, and in which cases is it applied only to a certain subset of your population while human decision makers are used elsewhere? There are a lot of great resources on this that we'll share later, because I realize this is a little high level. The other thing that's really important to think about, especially if you're at the stage before your organization has implemented anything like this, is accountability. You hear this in the news sometimes: when an algorithm makes a decision, how does accountability work? How could it be upheld, how is it being upheld, and how should it be upheld, for a decision or prediction that then informs actions or outcomes for individuals? I know that's a lot, and this is the high-level view, but it's just to get the gears turning about what this could mean for your organization.

Everything we've talked about today, these concepts, are all really interesting, but to me they're only really valuable if they actually get put into practice. That's why we titled this talk the way we did: what does it actually look like to implement responsible, equitable, ethical data practices?
And as I've alluded to at many points in this presentation, a lot of this has to do with asking tough questions, regardless of your role. Maybe you're the database administrator trying to decide what options go in the dropdown menu, or maybe you're the executive director trying to decide what strategic data you want to collect or even look at to begin with; you might sit anywhere on that spectrum. The design justice questions, and the question of defining equity in your context, are difficult ones; they're messy things to untangle, and they're not straightforward. So a willingness to ask tough questions and embark on this type of project is super, super important. Lucky for everyone here, and I talk to folks in a lot of different industries about data equity, if you're at a nonprofit you're in a great place, because you already have a mission towards social good and values aligned with it. When you start thinking about the internal data practices of your organization, and how to align those practices with living your values and working towards your mission, you've already got that vision broadly in your organization; it's just a matter of connecting the dots to the data practices going on at your organization and making sure they're aligned.

A lot of this is also about the fact that some of these ideas are pretty new; as the poll at the beginning indicated, more than half of you are pretty new to them. So a lot of this work has to do with making the case, getting people on board at your organization with the idea that we might want to take a closer look at some of our data practices. We might want to reevaluate some of what we're doing: what we're collecting, how we're presenting it, how we're analyzing it, how we're sharing it, how we're documenting it. To address any one of those areas, you're probably going to need other people on board to figure out what it actually looks like in practice. Are you adding a new section to something you already produce? What is the actual tactical thing you're doing? How are you measuring your own success toward it, remembering that what gets counted counts? If you've decided this is important, gotten people on board, and started doing it, how do you make sure it keeps happening, that you're moving the needle in the right direction? And then how do you embed it in your organization, so it's sustainable over time, managed, and evolving? New considerations will continue to arise in this arena, so be ready for that and make it not just a one-time initiative but a norm in your data practices. So yes, implementing this stuff is really hard.
There's a lot here, and we offer cohorts at LA Tech for Good for people working on exactly this sort of thing. If you're interested in working on this at your organization but don't really know where to get started, or you want some support, community, and camaraderie around you as you work towards it, we'd love to have you at one of our upcoming workshops. We're based in Los Angeles, but more than half our participants are from around the country and a few from beyond, and it's all on Zoom. We've got one workshop coming up in April, as well as one focusing specifically on public health and healthcare data, so if you work with data in that space, please join us for that one. If anyone would like to chat with me more directly about this, about your project or what you're working on, I'm always happy to chat. LA Tech for Good is also publishing a lot of content on this topic and putting together events, with lots more to come, so please make sure you're following us or on our email list for more details and ways to learn more. And speaking of learning more, I think we have time now for Q&A. Tim, let me know how we're doing on questions, and we can start to address some of them with the group.

Tim Sarrantonio 37:37
Yeah, absolutely. We have a few questions that were submitted that I had flagged as good ones to answer live. Some of them are Neon-specific, and Rachel, however much of a fan of our work she might be, isn't the expert on those compared to what I can answer, so I'd rather make sure we get to some of the really interesting questions submitted in the lead-up, and then the things that were asked live too; we have time for all of it. You might have some ideas on the Neon-specific ones, Rachel, but we'll put those off for a moment. I want to start with one of the big ones, because it came up a few different times, which is about the collection of data in the first place. An interesting thing just came up; I even posted about it on LinkedIn earlier today. There's a situation with St. Jude and their legacy giving program. I don't know if anybody's heard about this, but ProPublica published a piece arguing that St. Jude went a little too far when it came to going after families for their bequests. It's a pretty long piece, so I'm not going to get into it; that's not the focus of today. But inside it, the organization appears to have been collecting sensitive medical information and putting it in donor records. So there are always questions of legality versus equity in what we're collecting. It should be noted, no matter what: don't put sensitive medical information into your nonprofit CRM. But beyond that, we get into questions that are more about equity than legality. Thank you, Karen, for posting that piece in the chat; I'll make sure everybody else can see it. What was interesting, though, is that somebody asked about other data points.
So let's get into those, from a data equity standpoint. Medical information is pretty cut and dried in terms of HIPAA compliance, but things like gender, age, gender identity, race: the question was, which of these are important to collect so that we have the data to inform equity in the first place? I'll make that a two-part question. To help inform data equity, what should we collect, and how should we collect it?

Rachel Whaley 40:39
Yeah, that's a great question, and it would be so easy if there were a one-size-fits-all answer to it. There's not, unsurprisingly. I think what the question asker is getting at is that there are all kinds of identity categories, all kinds of ways we could think about equity and inclusion at your organization. From a strategic perspective you're probably thinking about a lot of those categories, be it inclusion of different religious practices or groups in what your organization does, or race, or gender, or age, or nationality, or immigration status. There are many, many different types of identities we could think about inclusion in relation to.

Tim Sarrantonio 41:28
And this organization specifically works with mental health issues, so that was the lens they were taking.

Rachel Whaley 41:35
Amazing, so health status, or mental health status in particular, maybe. So there are lots of different dimensions to this, and there's no list of, say, the nine most important dimensions to collect in your survey. That doesn't exist, and the reason it doesn't exist is that it really matters why you're collecting what you're collecting; that's what drives this entire thing. You might be asking yourself, should we put a gender dropdown on the survey, or just an open text box for gender? You get sub-questions there too, like: should we be collecting gender on this survey at all? And I can't answer that for you. The only way you can answer it is to think about the broader goal of the data you're collecting and how it serves your mission and what you're doing. For example, have you as an organization noticed, or maybe received feedback, that your practices are not super inclusive of a specific group? Maybe you've implemented some new practices or programs to try to improve that, and you're trying to figure out whether you've made progress on that specific front. That might be a great case to include a question on your survey about how folks identify in that dimension, so you can see if you've made progress: you have a really clear why, a starting point, something you're looking at, something you're working on. I think it can also be really powerful to include that context in whatever you're sending out. At LA Tech for Good, whenever we send out surveys to program participants, we ask them a bunch of questions about their experience in the program.
We also ask them some demographic questions, which is pretty common, and the context we provide is that we have specific goals around being a really inclusive and representative organization, and we want to hold ourselves accountable to meeting them. So to the extent folks are comfortable sharing some of that information with us, and it's all optional, we tell them we're using it only ever in aggregate, and only ever to help us understand whether we're moving in the right direction as an organization. I think those are the questions you've got to ask yourself: how does it fit with your broader strategy, and what's your why?

Tim Sarrantonio 43:51
And I think that does help answer what was a pretty meaty question, folks. This is also why we started adding a questions-and-comments field to the registration form for these webinars, because we've been getting great stuff like this. They also asked about sexual orientation as a data point, and I think that goes back to the point you just made: it depends, in many ways, on what you're going to do with that information in particular. We see this a lot with things like donor surveys as well. One of the things I've long been proud of within Neon CRM, as an example, is that our gender field is fully configurable. A lot of CRMs, especially ones created a long time ago, just give you two options. I remember an LGBTQ organization early on asking, could you open that up, and that's why we did it. Going back to some of the items you mentioned, Rachel, and maybe we can stop the screen share so people can see our faces a little more: one of the things that was interesting, relating to the AI questions, and you flagged this, is that you might not be doing AI yet, folks, but your vendors are starting to think about it. Spoiler: if you don't think Neon One is thinking about AI, we are thinking about AI. And there are things that actually affect this that we think about very deeply here. For instance, think about zip codes, folks. Wealth inequality can be heavily tied to zip code analysis. So when we're doing things like prospect and wealth screening, the data informing that can be as simple as what the census tells us, and there are a lot of bigger questions there. So if you don't think this is affecting you, step back and ask your vendor about it too. Now, we have a bunch of interesting questions coming in. Some folks are using the chat and sending things just to you and me, Rachel, and I'm handling those; other folks are using the Q&A. If you want to make sure we absolutely see something, use the Q&A feature; the chat is great for talking individually. What I want to do is take Abby's question first, then get to some of the quick Neon ones, and then somebody brought up a question that only you and I can see, Rachel, a data scientist's question, so I want to make sure we get to that. We have a good amount of time here.
I'll do Abby's question first: any good readings on best practices for things like data security, who can access what, and things of that nature? That relates to one of the other questions too, about thinking through access from an equity standpoint. Note that the resources and deck Rachel shared will be part of the recording and the follow-up, so if we reference something, there's a good chance you'll be able to click it later on. So, Rachel, where should people start?

Rachel Whaley 47:25
That is a great question, and I don't know that I have a specific reading on data security as it relates to equity specifically. I don't think I have a specific link, but I'll offer some thoughts on the topic. I think security presents a really interesting tension, and I don't mean tension in a negative way, just in the sense of things to balance. What does it mean to respect privacy and individual records and keep confidentiality? For a lot of the nonprofits here, the services you're providing involve a lot of confidentiality, and we fully want to respect that as we think about how to approach data responsibly, how to think about accountability, and what it means to have equitable data practices. At the same time, I think transparency is a really important layer. We talked about the metrics the Data Feminism authors shared: they set out a metric, they did not achieve it, and they still published the results, which is a pretty brave thing to do, and transparency requires some of that bravery. So if you're thinking about security and who sees what, there's a tendency to approach it strictly from the privacy and confidentiality perspective, which is important, but there's a lot of room to balance that with providing some transparency, especially to the folks you're collecting data from or with. Maybe your nonprofit works in a specific community; what could you share back of what you've collected? And maybe not just with the people who traditionally see it, the people with a login to your CRM system. If you've gone out and surveyed hundreds of people in the community, are you sharing that information back with the community in a way that is meaningful to them, that maybe helps bring resources, or at least brings conversation and information back to them? "Who sees what" can easily get boiled down to which permissions in the database are switched on and off for which profiles, and I actually think it's a much bigger question than that. I realize that's probably not super helpful for the person who asked, and I'd be happy to think more about a specific resource.
But the two pieces, I think, are that confidentiality and transparency both have a lot of value.

Tim Sarrantonio 50:08
Rachel, I actually plan on doing a dedicated session just on data security, for this very, very thing. There are things like donor privacy: please, goodness, don't email around a bunch of Excel files with people's names and sensitive information in them. We're going to cover that type of stuff, so we'll keep our focus here on the core question of equity and go to the live questions. Somebody asked: in what ways can data be used to turn stories into data, rather than data into stories? We're going to put a bookmark on that one and pin it for our session next week with Julie Cooper on storytelling, so come to that one, or register and get the recording, because I think we'll be able to dive into that type of storytelling there. We also got a question about how we have thought about incorporating equity into making the platform itself accessible. Accessibility, like equity, can be defined in a lot of different ways. Inherently, we feel that our platform is accessible compared to some other systems out there because it's cloud-based; there are still a lot of people who have to rely on Excel spreadsheets getting emailed around, or an installed version of QuickBooks, or an older database. We were actually one of the first systems created for nonprofits to be in the cloud. But with that, we need to keep evolving; things like multi-language support are something we want to invest more dedicated effort into. I can say comfortably that the way we created our donation forms took accessibility heavily into account, for things like screen readers. So the design experience for your donors and volunteers, who might be signing up for things through our forms, took accessibility into account when we designed those elements of the CRM. That's a product thing you might not even know about if you're not using Neon, totally cool, but it's something we care very deeply about, and Rachel knows about some cool stuff we're working on; let's just say more to come on all of this. So in the last few minutes, let's see what else is coming in.

Rachel Whaley 52:48
Yeah, I wanted to grab one question from the chat. Someone mentioned that, if you're interested in learning more about machine learning, a lot of the courses you find online tell you what machine learning is and what the algorithms are, but they don't talk about the if or how of whether you should be using it. I'm so happy someone raised this, because it's my main talking point when I talk about algorithms, and it's the biggest question people often skip over, especially when it's something your vendor is just offering to you: "Oh, you already subscribe, let's add in this one more feature."
And so that's why we talk so much about this foundation: if you've got a clear foundation in place, if you've got strong data equity practices and principles, then when someone comes to you and says, "Oh, I've got this cool predictive thing," you know all the questions to ask. That's interesting; what data does it take in? What does it consider? What is it serving?

Tim Sarrantonio 53:47
What informed the model? What is the model?

Rachel Whaley 53:50
Exactly, what was it trained on? You can ask all of these more critical questions, or at least be in a place where you can ask them. And I think in many use cases the answer is: this isn't appropriate for your use case, or the data you have isn't going to give you the answer you want, because it's not quite the right data. Maybe that's feedback that sends you back to collect different data that could put you in a place to use something like that. So all of these things come together. But I love the question; I think it's a super important one. The question of if is just as important as any other question you would ask.

Tim Sarrantonio 54:28
We in the nonprofit and social good space in particular need to think about this very deeply, because tech bros think they can save us, and that is not inherently going to be correct a lot of the time, as we're seeing. So, two resources; Rachel knows about them, I've been loving them, and I'm going to pitch them a little. This just came out: The Tech That Comes Next, a great book that is specific to the nonprofit sector. And another one that just came out, The Smart Nonprofit, by Beth Kanter and Allison Fine, talks about different things. So, Rachel, are you also able to see the chat items?

Rachel Whaley
I am able to see the chat, yes.

Tim Sarrantonio
Good, because we've had a few things come in. Somebody asked if you're related to Jim Whaley

Rachel Whaley 55:24
at HP? I don't think so, but it's a somewhat common last name, actually.

Tim Sarrantonio 55:28
There you go. Let's see. Okay, Rachel posted the data workshop link to everyone. And somebody was asking, and I think this was in the same question, so maybe that's a good one, because I want to make sure we answer her; she's been really engaged.

Rachel Whaley 55:48
Yeah, thanks for the question. One of the reasons we started doing these workshops at LA Tech for Good is that I had recently gotten my Master's in Computer Science, and I was pretty frustrated with how little the curriculum addressed issues like this. I remember asking a professor, why isn't this more of a topic on the syllabus? And he was like, whoa, we just don't have time. That seemed like kind of an odd answer, considering that he designed the syllabus; it's a matter of how important you think it is, and how it should be prioritized. My experience is that in education focused on the technical pieces of data, on data architecture, on becoming a data developer, on software engineering, any mention of this stuff is an afterthought, or it's tacked on at the end. I don't think that's how it should be, but that's how it currently is.
And so I think it's going to take some bigger waves of change to make this a more normal part of, for example, a hands-on machine learning curriculum. Until that happens, organizations like us are doing this on our own, in the hope that it gets incorporated more broadly later. For anyone who's in an education program that's not addressing this stuff: I think finding like-minded people to work on it with you, to keep you motivated and keep you actually working on the implementation, which is the hard part, is crucial. We'd love to have you in our cohort, and if our cohort doesn't seem right for you, feel free to reach out anyway and I can help direct you to something that might be a better fit. If you're having that specific frustration, where you're learning about this stuff and feel like these topics aren't coming up enough, we would love to have you, because that's where our program came from.

Tim Sarrantonio 57:37
Sounds like you're going to find another member for your cohort, Rachel, because Sergio is here right now, and we also talk about these things. Kim, actually, is part of the connected fundraising community that we have; I'm popping that into the chat if you hadn't heard about it. We have a Slack group where we talk about these things, I do a morning prompt question for people, and we're going to continue to roll out more resources. We'd love to have you, Rachel, be part of that and have those types of conversations too, as we grow this together. Rachel, in our final minute or so, what is one thing somebody could do today about this?

Rachel Whaley 58:20
Yeah, I would say documentation is a really easy one to start with. Figure out what documentation you have about your data. What does it tell you that you can and can't use it for? What does it say about context? If you find it, that's great; I suspect you'll find a lot of places where documentation is missing, and you can start writing some. That is one thing you could start doing.

Tim Sarrantonio 58:40
Tags! Remember, we were talking about tags in Slack. It comes down to how you tag people in your software; depending on how you're thinking that through, you could be doing something off on equity, or not. Awesome. Well, this was great, Rachel, such a pleasure. From an equity standpoint, we'll also make sure the transcript is provided, along with the slides and the resources Rachel talked about. Somebody mentioned wanting to watch this again: we're going to get that recording up, and you'll see it by tomorrow at the latest. Typically we're quicker than that, but I'm in charge of it while the person who usually handles it is out, so I'm going to hedge my bets a little and say tomorrow. Rachel, any final words before we break for today?

Rachel Whaley 59:30
Thank you so much for having us, Tim. It's been really fun, and thanks to everyone for being so engaged and for all the great questions.

Tim Sarrantonio 59:36
Awesome, awesome. Thank you, folks. We're putting some more things in the chat for you from LA Tech for Good. Please check them out; they're a fantastic organization.
We're happy to support them through this endeavor and the other things I know we're going to be doing together. And check out our next webinar, coming up on storytelling; after that we have some awesome stuff on volunteerism, on emotions and fundraising, and a fantastic panel on creating a community-centered fundraising initiative. The recording can be found at neonone.com, but if you're in here, you're going to get it anyway, because we're going to email it to you directly. So there you go. Okay, folks. Rachel, have a great day. Thanks, everyone. Thank you.

Transcribed by https://otter.ai