By Lauren King, MA
At ImpactTulsa, I serve as the Continuous Improvement and Learning Specialist, where I design and facilitate training to teach continuous improvement and coach improvement teams. In this role, I have the privilege of working with a wide variety of community partners, as well as working more closely with several teams. ImpactTulsa prioritizes continuous improvement within our own work, and one reflection has come to light in recent months that I believe is a vital realization for all those doing continuous improvement work: “data-driven decision-making” can be an emotionally loaded term.
A foundational component of continuous learning and improvement (CLI) is the notion that change is built upon data-informed decision making: the CLI cycle and its supporting tools push teams to collect data to help guide decisions. It is a method for overriding assumptions and emotions that may interfere with effective decision making, and it is one of my favorite components of continuous learning and improvement. In fact, it’s often the first thing I mention when explaining why CLI is a valuable process. However, the phrase “data-informed decision making” does not resonate with everyone. It can be off-putting for people who have seen negative results come from data collection.
As someone whose background is in assessment and program evaluation, I am guilty of foolishly assuming that teams automatically view data-driven decision making as a positive tool. The reality is that whether collecting data from staff members, community members, students, clients, or partners, everyone has had a different experience with data collection. Some have shared feedback, only to have it ignored. Some have shared feedback, only to have it used against them. This article from the Atlantic Council provides an overview of how data has been weaponized against marginalized populations, particularly people of color. With this in mind, it is a fair assumption that not everyone has a positive view of initiatives that are “data-driven.”
If data collection carries negative connotations, people may be less inclined to contribute (either as a participant or via regular data entry), less inclined to prioritize the data, and less inclined to use it effectively. Improvement then becomes difficult, if not impossible: without relevant data to guide the process, we risk making decisions that preserve the status quo rather than initiating meaningful improvement.
At ImpactTulsa, we have come across many barriers to a positive data culture, but three in particular seem to arise most often within education organizations:
- Harmful Results: People are often wary of why data is being collected and how it will be used. Historically, data has been used to harm marginalized groups, both intentionally and unintentionally. Further, within organizations, data is sometimes used solely for performance reviews and is thus viewed as a punitive tool.
- Lacking Resources: When organizations lack accessible, easy-to-use data systems, the time and work required for data collection can be overwhelming. In addition, systems without checks and balances can produce inaccurate data, which in turn makes the outcomes harder for people to use in a meaningful way.
- Accountability Over Learning: Collecting data is sometimes viewed as necessary for bureaucratic purposes only (e.g., grant reports, accreditation) rather than as a tool to support staff learning in a way that directly improves the quality of services received by students and families.
Understanding how your audience currently perceives data collection helps ensure you properly explain how data-driven decision making will be used at your organization in a meaningful and just manner.
Here are a few concrete tips to help overcome these barriers:
Clarity in Purpose of Data Collection
- When sending out a survey or conducting interviews, be very intentional in articulating why the data is being collected and who will have access to it. People want to know whether there are any limits to confidentiality, e.g.: “Before the results of the survey are shared with the team, all identifying information, including your name and role, will be removed.” Aside from providing clarity, this is ethical research practice.
- When possible, co-develop the questions with someone who is part of the group you want to survey. This can help ensure that language is clear and appropriate for the target audience. It also provides external eyes to help identify potentially offensive or confusing language. For example, when collecting information about gender, the way you set up answer options can either make the survey more or less inclusive.
- After soliciting feedback, be intentional about sharing timely updates on what was learned and how it will guide decision making going forward. It can be easy to focus on sharing results with decision-makers and forget to loop in the participants who made the project possible. It can also be easy to share outcomes without explaining how that information will shape future decisions. Timely and transparent sharing of outcomes with all stakeholders can help increase buy-in for the value of data-informed decision making.
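As a small illustration of the de-identification promise described above, here is a sketch that assumes survey responses sit in a pandas DataFrame; the column names and data are hypothetical, and the identifying fields you need to strip will vary by survey:

```python
import pandas as pd

# Hypothetical survey responses; "name" and "role" stand in for whatever
# identifying fields your survey actually collects.
responses = pd.DataFrame({
    "name": ["A. Rivera", "B. Chen", "C. Osei"],
    "role": ["Teacher", "Counselor", "Teacher"],
    "q1_rating": [4, 5, 3],
    "q2_comment": ["Helpful", "Great session", "Needs more time"],
})

# Remove identifying information before sharing results with the team,
# matching the confidentiality promise made to participants.
IDENTIFYING_COLUMNS = ["name", "role"]
shareable = responses.drop(columns=IDENTIFYING_COLUMNS)

print(shareable.columns.tolist())
```

Dropping identifying columns in one explicit, named step (rather than editing files by hand) makes the confidentiality promise auditable and repeatable for every survey cycle.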
Technical Support that Cultivates Meaning in the Task
- When teaching teams how to collect and enter data, it’s important to help them connect the mechanics of the task with its ultimate purpose. For example, instead of just asking teachers to note the race and gender of their students, help them understand that the information may later be tied to academic outcomes to better understand whether any racial or gender disparities exist among the students. Providing that explanation serves two purposes: it motivates those collecting the data, who see how it will be used rather than feeling they are recording a meaningless data point, and it equips them to troubleshoot in the moment so that they collect the right information.
- Provide clear instruction and follow-up support so that team members feel well equipped to use required data systems. It’s helpful to have multiple modes of support (e.g., peer champions, written instructions) available to ensure team members have timely access to technical help.
Inclusivity in Data Interpretation
- Be inclusive when interpreting the data. Having multiple people with a variety of perspectives (including at least one person who is trained in working with data) review the data can help reduce bias in interpretation. It also creates an opportunity for more people to see directly how data collection leads to learning. For example, in looking at attendance data for high school students, school administrators may look at the rate of chronic absenteeism and notice trends related to the system (i.e., variance by district, by school, or by teacher), whereas parents reviewing the data may be more inclined to notice patterns related to individual students.
- When discussing the results from data analysis, create inclusivity by allowing for questions and not making assumptions about what is already known or not known. It can be helpful to intentionally create a culture of curiosity so that people are encouraged to ask questions about the meaning of the data, and how it potentially connects to their work moving forward.
- One regular activity that ImpactTulsa uses with major projects is to create time to do an After Action Review. During the After Action Review, team members walk through the process of a project to reflect on things that went well and things that did not go well. We also review participant feedback together and make decisions about what could be adjusted in the future to improve outcomes.
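The attendance example above can be sketched in code to show how the same dataset supports both a systems-level and a student-level view. This is a minimal illustration with invented data; it assumes the commonly used (but not universal) definition of chronic absenteeism as missing at least 10% of enrolled days:

```python
import pandas as pd

# Hypothetical attendance records; schools, students, and counts are invented.
attendance = pd.DataFrame({
    "student": ["S1", "S2", "S3", "S4"],
    "school": ["North HS", "North HS", "South HS", "South HS"],
    "days_enrolled": [170, 170, 168, 168],
    "days_absent": [22, 5, 8, 30],
})

# Flag students missing at least 10% of enrolled days (assumed threshold).
attendance["chronically_absent"] = (
    attendance["days_absent"] / attendance["days_enrolled"] >= 0.10
)

# Systems-level view (an administrator's lens): absenteeism rate by school.
by_school = attendance.groupby("school")["chronically_absent"].mean()

# Student-level view (a parent's lens): which individual students are flagged.
flagged = attendance.loc[attendance["chronically_absent"], "student"].tolist()
```

The same flagged column feeds both summaries, which is the point of the bullet above: different reviewers surface different patterns from identical data.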
Overall, creating a positive data culture takes intentionality and an investment of time and resources. However, the outcome is a team that is aligned on the value and purpose of data collection, which ultimately enables teams to make decisions that lead to better results.