News: Data, Research & Evaluation

Tips and Tricks for Taking the Next Step to Data Maturity

Monday, May 4, 2020  

By Bill DeBaun, Director of Data and Evaluation, NCAN, and Nathan Porteshawver, Product Manager, CoPilot

Over the past five years, NCAN members as a whole have made significant strides in putting data to use for program improvement and expansion. Through the implementation of customer relationship management (CRM) platforms (e.g., Salesforce), a broader swath of NCAN members has become much more mature around data collection, analysis, and reporting. NCAN accelerates this process through the Common Measures and the Data and Evaluation Toolkit, as well as through efforts like the Benchmarking and Impact Survey projects.

As far as many programs have come, members still vary widely in their data usage. Some organizations still find themselves early in the process, while others with deep expertise are always on the lookout for ways to further refine their systems.

College Forward is an NCAN member that demonstrates substantial capacity and maturity in the world of data systems. That capacity and maturity have spilled over from a robust internal system to an external service known as CoPilot that dozens of other programs have adopted for their own work.

Based in Austin, Texas, and launched in 2004 as a college access program called Admission Control, College Forward started its college success programming in 2005 and adopted its current name in 2006. The program currently partners with 11 central Texas high schools, three charter school networks, and both Austin Community College and Texas State University. More than 1,700 College Forward-served students have completed a postsecondary degree or credential, and 90% of College Forward students enroll in college after graduating from high school; 82% of those students persist into their second year. A program evaluation revealed that College Forward-served students were 2.5 times more likely than their unsupported peers to graduate.

Conversations between NCAN and College Forward that started at last year’s national conference in Indianapolis bloomed into a broader discussion around helping other NCAN member programs identify their place in the analytical maturity life cycle (that’s fancy for “how sophisticated are we with data?”). Below are some insights for members around data usage and how to move into “data maturity.” No matter where they are, organizations can move to the next level.

Data and Practice, Not Data versus Practice. Expand the Pie Because This Isn’t Zero-Sum.

Too often we see programs failing to fold valuable insights from data into their practice. Conversely, sometimes they forget to ask valuable questions of their data based on their practice. Usually the reason is that “there isn’t enough time.” Thinking about data and practice as an either/or is a trap. A thoughtful partnership between data and practice can both improve student outcomes and scale the number of students a program can serve. When programs don’t make the time to feed data into practice, or vice versa, they do not reap the benefits (in the form of additional time or efficiency) that come from that partnership.

For subject matter experts (SMEs) like college access programs’ coaches, counselors, and mentors, the relationship between data and practice is a two-way street. SMEs derive value from data analytics, and program models improve when they are guided by high-quality data. This mutual relationship is essential to securing a return on a program’s investment. When data scientists use, for example, predictive applications to scale an operation (i.e., to serve more students), they are relying on both SMEs’ common sense and expertise and on industry best practices gained from years of experience to inform those predictive systems. The results from a data system never replace individuals, but these predictions can be used as suggestions or additional resources coaches can use to gain a deeper understanding about students.

One example of the partnership between data and practice: the data identify a lack of engagement from students around communications on certain days of the week or times of day; having those data means SMEs can tailor their outreach to the windows when students are more responsive, improving the effectiveness of the intervention. Conversely, SMEs who anecdotally note significant challenges among students who work while going to school might suggest systematically collecting data on which students are working and for how many hours, so that the data system can examine trends and suggest which students might be at risk based on their work schedules.
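
To make the first example concrete, here is a minimal sketch in Python with hypothetical column names, data, and thresholds (not College Forward’s or CoPilot’s actual system) that summarizes response rates by day of the week and flags students whose reported work hours exceed a threshold agreed upon with coaches:

    import pandas as pd

    # Hypothetical message log: one row per outreach attempt to a student.
    messages = pd.DataFrame({
        "student_id": [1, 1, 2, 2, 3, 3],
        "sent_at": pd.to_datetime([
            "2020-02-03 10:00", "2020-02-05 18:30", "2020-02-04 09:15",
            "2020-02-06 19:00", "2020-02-03 11:45", "2020-02-07 17:20",
        ]),
        "responded": [0, 1, 0, 1, 0, 1],
    })

    # Which days of the week get the best response rates?
    messages["weekday"] = messages["sent_at"].dt.day_name()
    response_by_day = messages.groupby("weekday")["responded"].mean().sort_values(ascending=False)
    print(response_by_day)  # coaches can shift outreach toward the most responsive windows

    # Hypothetical survey of work hours; flag students above an agreed-upon threshold.
    work = pd.DataFrame({"student_id": [1, 2, 3], "weekly_work_hours": [12, 28, 35]})
    WORK_HOUR_THRESHOLD = 25  # an assumption; set this with SMEs, not by the data team alone
    work["flag_for_outreach"] = work["weekly_work_hours"] > WORK_HOUR_THRESHOLD
    print(work)

The point is not the code itself but the division of labor: the system surfaces the pattern, and the SMEs decide what to do about it.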

Having a Theory of Change and Logic Model Is Critical

Resist the urge to zone out upon reading the phrases “theory of change” and “logic model.” It’s most people’s tendency to do so, but it’s a short-sighted move. Does your program know what you do, why you do it, what you do it with, and what you expect to see happen as a result? “Of course” is the usual knee-jerk reaction, but stop and think for a second about whether it’s actually true. Inertia is a powerful organizational force, and sometimes the way things are done is a direct result of the way they have always been done.

Logic modeling is a way to make sure that the road map a program is using to get to its destination is accurate, efficient, and sensible.

Logic modeling and data collection interact because every item listed as an input, action, output, or outcome should be quantifiable in some way. Recall the old overhead projector “transparencies” from grade school. A thoughtfully designed logic model should have an overlay that shows the data points the organization is using to measure what it says it is doing.
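
As an illustration only, a minimal sketch of such an overlay (with invented program elements and metrics) might simply pair each logic model item with the data point the organization collects to measure it:

    # Hypothetical logic model overlay: each element is paired with the data point
    # that measures it; any element without a metric is a gap in data collection.
    logic_model_overlay = {
        "inputs": {
            "coaching staff": "number of FTE coaches per semester",
            "CRM platform": "license count and active user logins",
        },
        "activities": {
            "one-on-one coaching sessions": "sessions logged per student per month",
            "FAFSA completion workshops": "workshops held and students attending",
        },
        "outputs": {
            "students submitting the FAFSA": "FAFSA completion rate by cohort",
        },
        "outcomes": {
            "postsecondary enrollment": "fall enrollment rate by cohort",
            "first-to-second-year persistence": "persistence rate by cohort",
        },
    }

    for category, items in logic_model_overlay.items():
        for element, metric in items.items():
            print(f"{category}: {element} -> measured by: {metric}")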

If this sounds like a lot of tedium, get back into a positive headspace about it. Logic modeling can be a tremendous team-building exercise that makes sure everyone is on the same page about what they do, how they do it, and why.

For more on logic modeling, check out the W.K. Kellogg Foundation’s exhaustive guide or NCAN’s thoughts on the topic in “Driving Toward Program Improvement: Principles and Practices for Getting Started with Data” (in collaboration with Exponent Partners) or “Logic Models to Support Program Design, Implementation, and Evaluation,” a workbook from the National Center for Education Statistics’ Regional Educational Laboratory program.

What Are the Steps in the Analytical Maturity Life Cycle?

We can use the flagship model Davenport and Harris developed in their book "Competing on Analytics" as our framework for discussing an organization’s analytical maturation. Many nonprofits don’t know how to take the next step in their analytical maturity, and taking that step first depends on a program honestly identifying its current stage. This section discusses the stages from least to most mature and offers some insights on how to move up to each stage from the one below it.

Stage 5: Analytically Impaired

This stage describes organizations struggling to use data in any meaningful way. This may be because an organization lacks key skills, which can be addressed with training and hiring, or because the organization has inherited a bad system that can’t keep up with current needs. Organizations at this stage struggle with missing or incomplete datasets and are therefore unable to derive value from data.

Stage 4: Localized Analytics

This stage refers to organizations that may be collecting and using data, albeit in silos; the development, marketing, or accounting department, for example. However, data still might be incomplete. Reports are often created for particular stakeholders but lack broader implications for the rest of the organization. It may still take significant time to create reports, and skills may still be lacking to produce more sophisticated reports or dashboards. Departments or teams using data at this stage may not have support from other departments attempting similar work.

How to get to Stage 4: Identify one team (or one person) and provide them with the resources to collect and use data for one purpose that has a high return on investment. Organizations often begin with “low-hanging fruit,” or problems that data can solve without a lot of effort. Document this project so you can replicate it with appropriate future projects. Create and maintain your data dictionary with field descriptions and help text.
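
A data dictionary does not need to be elaborate to be useful. A minimal sketch, using hypothetical field names, might look like this:

    import csv

    # Hypothetical data dictionary: one row per field in the data system.
    data_dictionary = [
        {"field": "student_id", "type": "integer",
         "description": "Unique identifier assigned at intake",
         "help_text": "Generated automatically by the CRM; never re-use IDs."},
        {"field": "hs_grad_year", "type": "integer",
         "description": "Expected or actual high school graduation year",
         "help_text": "Four digits, e.g., 2021."},
        {"field": "fafsa_submitted", "type": "boolean",
         "description": "Whether the student has submitted the FAFSA this cycle",
         "help_text": "Mark true only after confirmed submission, not intent to file."},
    ]

    # Save it somewhere every end user can find it.
    with open("data_dictionary.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=data_dictionary[0].keys())
        writer.writeheader()
        writer.writerows(data_dictionary)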

Stage 3: Analytical Aspiration

Organizations at this stage have invested in business intelligence solutions. However, access to these tools remains siloed. Oftentimes, the rest of the organization lags behind in skills and competencies. This means the use of data continues to be isolated to a few individuals.

How to get to Stage 3: Invest in an enterprise-level data system, even if your use case for it is still siloed in a department or two. Ensure your data dictionary can account for how each field is collected for each end-user and non-end-user type, in preparation for new end users in the system. Concentrate on acquiring the technology stack to meet the requirements of future users. Identify how dynamic reports and dashboards can be used to influence actions internally. If you already have a project or two under your belt, consider whether you have the resources to work with other people or departments and help them meet their goals with analytics.

Stage 2: Analytical Companies

Analytical capabilities and competencies have been cultivated throughout the organization, and silos have been broken down. The companywide culture drives high-quality data collection and processing. Organizations at this stage can deliver relevant, actionable business intelligence to every department as well as to C-level administrators. At this stage, data help individuals do their jobs better.

How to get to Stage 2: This is a leadership task. The goal is to create a companywide culture that embraces data adoption. This requires formalizing incentives and employee pathways as well as professional development and training opportunities. It may also include creating new positions, titles, or committees, such as ambassador programs or data analytics centers of excellence. These processes formalize how your organization approaches data research, integrity, transformation, extraction, reporting, user experience, and forecasting and simulation, among other areas of interest.

Stage 1: Analytical Competitor

Once the infrastructure has been formalized and iterated to support the people, processes, and technology used to create enterprise-wide access to analytics, organizations can begin to use data to predict future outcomes in order to influence business decisions. This requires organizations to use internal and external data sources. These data can then be used in statistical models to inform all decisions throughout the organization.

How to get to Stage 1: Once you have high-quality data and a companywide data culture, you can begin to prioritize model building. Similar to how this process began, this can start with a consultant, an individual, or a team that supports predictive applications identified by departments throughout the organization. Models can be built to predict future outcomes that will influence overall strategy. This also requires an understanding of the real-world implications and a healthy conversation about acceptable error and bias, so that underserved groups are not left out of receiving services.
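
To make that concrete, here is a minimal sketch of a first predictive application, using scikit-learn and entirely hypothetical fields and data, that predicts first-to-second-year persistence from a handful of data points; it illustrates the idea, not a recommended model:

    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    # Hypothetical training data combining internal sources (coaching logs, work-hour
    # surveys) with external sources (credits attempted, from the partner institution).
    students = pd.DataFrame({
        "coaching_sessions":  [10, 2, 8, 1, 12, 3, 9, 0, 7, 4],
        "weekly_work_hours":  [10, 35, 15, 40, 5, 30, 12, 38, 20, 25],
        "credits_attempted":  [15, 9, 12, 6, 15, 12, 14, 6, 12, 9],
        "persisted_year_two": [1, 0, 1, 0, 1, 0, 1, 0, 1, 0],
    })

    X = students.drop(columns="persisted_year_two")
    y = students["persisted_year_two"]
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

    model = LogisticRegression().fit(X_train, y_train)
    print("holdout accuracy:", accuracy_score(y_test, model.predict(X_test)))

    # Predictions are suggestions that inform coaches, not decisions made for them.
    students["predicted_persistence"] = model.predict_proba(X)[:, 1]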

What Are the Major Considerations for Securing Buy-In for Data Systems?

Nonprofit analytics leaders must clearly link their organization’s analytics ambitions to the overall mission. This means selecting use cases for analytics that are strategically prioritized to address stakeholders’ most pressing issues. Buy-in occurs when organizations create a clear picture of how analytics creates value and supports the organization’s mission. To create that clear picture, concentrate on these three pillars:

People

Identify all major stakeholders. Do you understand their interests in analytics? Have you spoken to them personally or issued a survey? What do stakeholders need to have, want to have, and wish to have? How will you continue to address their reservations? Do you understand each individual’s analytical capabilities or the organization’s overall competencies? Are you asking too much of your stakeholders? Do they feel supported or incentivized when they go above and beyond their job responsibilities?

Process

Often, analytics processes are siloed, and many times it’s unclear how analytics projects get pitched, selected, approached, and completed. Have you considered creating your own product road map or project management plan? When do you plan to use a consultant? How can you formalize every process to ensure greater transparency and accessibility?

Technology

Technology supports the people and the processes you’ve identified above. Analytics requires investment and therefore must have a return that makes the initial spend worthwhile. Did you find the best vendor? Does each product have the data analytics structure required to support key capabilities? Do you own the rights to the appropriate data? Do your products integrate with one another? How are data backed up and restored? What is the end-user experience like?

As an analytics leader, you want to empower your people with transparent processes supported by technology. Without clearly stated common goals to work toward, you will not produce the strongest possible return on investment. If analytics projects aren’t appropriately strategized, outcomes may quickly become obsolete or may not help in the way you’d expect. If you create incentives for your people and carve out time in their schedules to engage in this work, people will drive this process.

What Are Best Practices Around Securing Buy-In for End Users?

Analytics requires good, clean data. This often means relying on end users to collect information in the best way possible. Human error is inevitable, especially if the people entering the data don’t see the value in this task. Programs can ensure end users are using the system to the best of their ability, and can encourage staff to use technology for data analysis, by doing the following:

Data Entry

Are there any data entry tasks that you ask your end users to do that could be automated or uploaded in bulk with help from technology? Where is the source of your data and how often do you collect new data? Do you use forms or surveys? Do you use validation rules or collect dependent information so that users are prompted when mistakes happen? Do your field labels, descriptions, and help text help your end users understand how to enter data into your system? Do you have a training manual? How about a formal training process that is accessible to multiple learning types? How is future training encouraged?
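
As one illustration of a validation rule, here is a minimal sketch in Python with hypothetical field names (real CRMs have their own rule builders and syntax) that catches mistakes at entry time and tells the end user how to fix them:

    def validate_student_record(record):
        """Return a list of human-readable error messages; an empty list means the record is valid."""
        errors = []

        # Required fields must be present and non-empty.
        for field in ("student_id", "hs_grad_year", "email"):
            if not record.get(field):
                errors.append(f"'{field}' is required.")

        # Dependent information: work hours only make sense if the student is marked as employed.
        if record.get("weekly_work_hours") and not record.get("is_employed"):
            errors.append("Work hours were entered, but 'is_employed' is not checked.")

        # Range check on graduation year.
        year = record.get("hs_grad_year")
        if isinstance(year, int) and not 2000 <= year <= 2035:
            errors.append("'hs_grad_year' should be a four-digit year, e.g., 2021.")

        return errors

    # Example: the entry form surfaces these messages before the record is saved.
    print(validate_student_record({"student_id": 123, "email": "", "hs_grad_year": 21}))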

Incentives

Maybe your organization doesn’t have much room in the budget to offer financial incentives, but there are other ways to encourage end user compliance. You could offer time off, pizza parties, professional development opportunities, and other awards or honors. You could also formalize the pathway for how end users can get promoted internally.

Feedback

To ensure that training materials and the end user experience remain accessible to everyone, be sure there are multiple ways users can provide feedback. Whether this is in person, anonymously through a form, or even in all-team meetings, be clear that feedback is encouraged and hold the organization accountable for addressing the issues that are brought up.

When end users understand their role, which has been clearly defined and supported internally, their use of technology becomes dependent on their job responsibilities (and vice versa). Each organization’s goal can be to create a companywide culture that embraces the data analytics process.

How Can We Combat Bias in Analytics?

Much of this post has been about how organizations can work their way toward systems that improve programs’ efficiency and increase their scale. By increasing buy-in, thinking about what a program does and how, and acting decisively to improve analytical maturity, the idea is that systems will develop that make people’s jobs easier and serve students better.

As leaders in the college access and success field, it’s important to understand the assumptions each algorithm makes and select models that err on the side of benefiting our students. For example, we know certain demographics of students undermatch to top-tier universities. Model building is an opportunity for us to understand when that might be the case and when and how to build recommendations that can counteract this phenomenon. In a future blog post, NCAN and College Forward will share more thorough recommendations for combating bias in analytics, but for now programs with algorithms would do well to think about the assumptions made by their systems.
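
One practical starting point, sketched below with hypothetical fields and data, is to compare a model’s error rates across student groups; a large gap in how often truly at-risk students go unflagged is a signal to revisit the model’s assumptions:

    import pandas as pd

    # Hypothetical predictions from an existing model, joined to a demographic field.
    results = pd.DataFrame({
        "first_gen":        [1, 1, 1, 1, 0, 0, 0, 0],
        "actually_at_risk": [1, 1, 0, 1, 1, 0, 0, 0],
        "flagged_at_risk":  [0, 1, 0, 0, 1, 0, 0, 0],
    })

    # False negative rate by group: the share of truly at-risk students the model failed to flag.
    fnr_by_group = (
        results[results["actually_at_risk"] == 1]
        .groupby("first_gen")["flagged_at_risk"]
        .apply(lambda flagged: (flagged == 0).mean())
    )
    print(fnr_by_group)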

Rome Wasn’t Built in a Day; Your Analytical Maturity Won’t Be Either

The heading here borrows a phrase from “Driving Toward Program Improvement: Principles and Practices for Getting Started with Data” (in collaboration with Exponent Partners), and readers whose heads are swimming from all of the technical details and advice here should take a deep breath. There are a few good places to start.

First, understand what your program does, why, and how, and make sure data collection is in place to answer those questions. Take stock of your program’s current capacities and activities around data and practice. Look for discrete, actionable “quick wins” that can build momentum for having conversations around these topics and engage the entire team. Then work purposefully to take advantage of the areas of opportunity those conversations present. Following these steps, any NCAN member program can take its data maturity to the next level for the benefit of students.