Episode #55
Filling Seats Podcast | September 18, 2025

How Colleges Can Turn Data Into Meaningful Success

In this episode:

On this episode of Filling Seats, StudentBridge CEO Jonathan Clues sits down with Dr. Curt Merlau to unpack how colleges can transform raw data into meaningful student success. From shifting away from “data overload” to building actionable insights, Dr. Merlau highlights strategies that empower institutions to make smarter decisions, improve student outcomes, and drive measurable enrollment growth.

Key topics covered:

  • Guest: Dr. Curt Merlau.
  • How to move from data overload to actionable insights.
  • Turning raw information into student success strategies.
  • Smarter decision-making through data-driven enrollment practices.
  • Practical steps colleges can take to boost outcomes and retention.

Dr. Curt Merlau

Vice President @ Resultant



Episode Transcript

Host  0:00  
Hi, I'm Jonathan Clues, CEO and founder here at StudentBridge. Thank you for joining us for today's episode of the Filling Seats Podcast, episode 55. I'm with Dr. Curt Merlau. I just asked you how to pronounce your name, and I said it wrong straight away; you say it like the wine, and that's totally okay. Why don't you say it?

Dr Curt Merlau  0:19  
Merleau

Host  0:20  
There we go, Dr. Merlau. So look, we're going to talk about how education leaders can move beyond basic dashboards to harness predictive data analytics for meaningful student success. Curt, you bring a lot of deep and meaningful expertise here, both personally and through your work. I know you're Vice President of the Education Practice at Resultant, and I enjoyed meeting you the other day, but tell the audience a little bit about yourself.

Dr Curt Merlau  0:47  
Well, I appreciate it, and thanks for having me on. I look forward to the conversation today. As you mentioned, I serve as the Vice President of Resultant's Education Practice, which spans early care through postsecondary education, and at Resultant we're focused on helping organizations put the right data in the right place, to the right people, at the right time. So I really enjoy my work. I'm a former teacher and administrator, I spent some time in higher education as well, and now I have the opportunity to work with institutions and public sector agencies across the country. I'm based out of Indianapolis, Indiana. It's a great time to be in Indianapolis: it's May, and race season is here.

Host  1:32  
They don't really need Carb Day anymore because the cars don't have carburetors, but they still have Carb Day. It doesn't mean anything now. As a former racer myself, my first ever race in America was in Indianapolis; not the 500, but at IRP, the night before the 500. But anyway, there you go. So look, thank you for joining us from Indiana today. I'm going to start with the first question, probably a big question you pose to education leaders: are we using the right data in the right way? You spoke about the right data at the right time. So, are we using the right data in the right way?

Dr Curt Merlau  2:03  
Yeah, fantastic question. Yes and no. We sit on huge amounts of data, and when we talk with institutions, I typically refer to a scale of organizational culture and data access, in terms of how they are utilizing data. On that scale, level one is being completely unaware of the data they have, and level five is data being pervasive across the institution. I would suggest institutions are largely between a level two and a level three. There's a commitment to use data, there's an appetite, there are dashboards, and perhaps they're starting to build a central data warehouse. But what I notice is that fewer have truly embraced predictive and prescriptive analytics as part of their data strategy. The ultimate vision is that data is coming in near real time. We still hear a lot about data being underutilized, or siloed, or stuck in legacy systems and processes across an institution. And our belief, just to sum up my answer here, is that those who harness their data will not just survive these uncertain times, they will thrive. So today they need clean data in a central location to understand the efficacy of their programs, so that tomorrow they can take full advantage of platforms, predictive analytics, and other products to help deliver their initiatives in a personalized way.

Host  3:48  
I get it, we use data a lot. I say "data," you might say "dayta"; let's not split hairs. I can't pronounce your last name half the time anyway. Let's talk about the power of data, but also the downside of data. It's something I say to our team all the time: there are data scientists in this world, and AI is helping with this, but analysis paralysis, like Zoom fatigue, is a real thing. Talk to me about what you've seen out in the field, because we love our education partners and customers; that's who we work with. StudentBridge only works within education, we've dedicated our own careers and livelihoods to it, and we understand that sometimes there can be that kind of freeze: there's too much data. How do I actually turn data into good business intelligence, a recommendation that leads to good action, or good data-informed decisions? What do you see in the marketplace? I'm sure it's improving, but what do you see?

Dr Curt Merlau  4:49  
Fantastic insight. We do have analysis paralysis. I remember as a teacher we would have these data war rooms, where we had printed dashboards on the walls and we would walk around, highlighters and clipboards in hand. It was just overwhelming, and there's a lot of cognitive energy that goes into interpreting dashboards: a lot of sense making and sense giving and discernment, looking at multiple things and trying to put them all together. By the time you do that, you're fatigued. So the opportunity we have with advanced analytics is to do that work for us and serve it up on a platter, where we can still exercise our professional discernment but we're not having to go through the rigmarole of looking at dashboards or printed reports and trying to put it all together like A Beautiful Mind. But data has been a part of education from the beginning. It's not something we can avoid, but it's important that it not be used solely for accountability or judgment. It's got to be used for purposes of continuous improvement, with an inquiry-based lens, to help problem-solve the challenges we face as institutions today.

Host  6:04  
I like talking about bad data in, bad data out. It's a saying I agree with, and I see it all day long running an enterprise: bad data in, bad data out, and then you go and try to make a decision based on that data, or AI recommends a path forward based on the data it's analyzed. There's a big difference, and I might not be using the proper terms here, between implied and explicit data. I'll give you a good example; I just had this with a potential partner school. With data, say you asked me, "Hey Jonathan, what's your favorite meal?" I'm going to tell you steak and champagne, because I love it. And they go, "Well, it's Friday, we're going to treat you to steak and champagne." And I say, "I don't want that today." Why not? Because every Friday I eat fish tacos and have a margarita; it's my end-of-week celebration. We didn't say that. I said steak and champagne, so in the data it says steak and champagne. You're now using the data I explicitly told you. But if you had observed me over the last 20 Fridays, you'd know I never once had steak and champagne. It is my favorite meal, for anniversaries, birthdays, a special treat, but you asked me what my favorite meal was, not what my favorite Friday meal is. So the questioning can lead to skewed data sometimes, depending on what data you're watching or looking at. So let me put a question there: help me understand, when you see data going in, how analysis-worthy is it?

Dr Curt Merlau  7:52  
Yeah, that's a fantastic point. What we often see is that data collection is largely a centralized function, and the attitude is almost, "I need to collect data for the sake of compliance," or "it's a mandate," or "I'm just going to gather as much data as I can." Data collection for the sake of collection, right?

Host  8:13  
Give us a real-world example. At a university, a real-world example of compliance data collection would be what?

Dr Curt Merlau  8:21  
IPEDS reporting, or financial aid reporting, or accreditation: things they have to collect and report out on for any number of regulatory requirements. But then there's also, "Hey, we want to collect data on our students who enrolled, and where they are in the process," some of those things that are required for the functioning of the university. And that's not an inherently bad strategy, but where the disconnect happens is that this is largely a central IT function that is separate from the business function. So often we see that institutions have this kind of invisible wall between the decision makers, those within the organization who are making decisions based on data, and those who are collecting it. There's no shared understanding of "these are the questions I want to ask of the data, which should shape what I collect and how I collect it," or "hey, there are some nuances about how this data was collected that you ought to know about, because they affect how you interpret what the data is saying." We often have to work with universities to bring those two parties together to have a human-centered design around data strategy. The IT division is not just the people with the wires and the hardware, an afterthought; every project, every initiative becomes an IT and data initiative at some point in time, and the two have to work closely together to really understand those nuances and make sure the right data is collected at the right time and in the right way to make the right discernments.

Host  9:58  
Are there any ways today to tell which data is somewhat implied, and could be inaccurate, versus observed and watched? Is there a way to tag data, like, "Hey, we watched 100 students in our quad, and this is where they go," versus, "We asked 100 students in our quad, and they told us this is where they go"?

Dr Curt Merlau  9:58  
This is where they go, yeah. Certainly that's where you get into data science and regression analysis, and really being able to discern what's cloudy and what's more reliable to make judgments on, and what confidence I can have in a judgment based on this data. But it really goes back to mapping out what we want to know, what we want to ask, and where and how we ask it, and making sure that's done in a way that is methodologically sound. If you're a researcher trying to answer a research question, it's no different than if you're a leader within the higher ed space trying to answer a question yourself: the implications of how you ask the questions really matter.
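
As a concrete illustration of that stated-versus-observed distinction and the confidence point, here is a minimal Python sketch; every number, field name, and the simple normal-approximation interval in it is an assumption made up for illustration, not anything from the episode or from Resultant's tooling. It compares a share students reported in a survey against a share measured from observed behavior, and attaches a rough confidence interval to each so you can see how much weight the sample size lets you put on the number.

import math

def share_with_ci(successes, n, z=1.96):
    # Proportion plus a normal-approximation 95% confidence interval.
    # Illustrative only; a fuller analysis might use a Wilson interval
    # or the regression-based approaches mentioned in the conversation.
    p = successes / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p, max(0.0, p - half_width), min(1.0, p + half_width)

# Hypothetical numbers: what students said versus what was observed.
stated = share_with_ci(successes=62, n=100)    # survey: "I usually study in the library"
observed = share_with_ci(successes=31, n=100)  # card-swipe data over 20 weeks

for label, (p, lo, hi) in [("stated", stated), ("observed", observed)]:
    print(f"{label:8s} share={p:.2f}  95% CI=({lo:.2f}, {hi:.2f})")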

Host  10:05  
Exactly. That's exactly what I was about to interrupt you to say, and then I let you finish, and you said it: I believe a lot of the answers given are driven entirely by the way the question is asked.

Dr Curt Merlau  10:05  
Yes, and that falls in the realm of what we call data literacy. I often say the power of data dashboards or data products is not in the product or the dashboard itself; it's in the conversations that happen around the dashboard, or as a result of it. So we have to train our people in data literacy. How do we ask the questions? What is the right way to look at this? What are some of the nuances we need to think about? So we're not just taking it at face value; we are practitioners with expertise, and we don't check that at the door when we look at a dashboard.

Host  11:47  
That's right. We have that within our own four walls here at StudentBridge. We've got some great team members, and they might present data, and I'm like, yeah, but that's not the final report; let's now discuss around it, let's question it. There are some oddities in this data. Has anyone wondered why that happened? Has anyone wondered why that data set came in that way? So I think that's absolutely true. We look at data and say, hey, there it is, and then what? That's the thing: great, you got some data, wonderful. If you want to tick a box, like, say, compliance, you might just need that data. But if you actually want to make sound business decisions to improve enrollment, recruitment, retention, donor giving, whatever it may be in advancement, you need to be able to question the data. I see a lot of misleading data out there, and I'm like, wait a minute, that really was just the way it was asked, because it doesn't sit with other data sets I've seen. And then you have competing data.

Dr Curt Merlau  12:46  
Yeah: be curious, have an inquiry mindset, but also have a growth mindset. And I think it's important to recognize where people are at with their perception of data. Some people have been hurt by it or burned by it, and they totally ignore it or reject it. That's not okay. We should, in a healthy way, question it, be curious about it, drill into it, seek to understand it, and then embrace it as a tool for inquiry, not see it as a judgment sentence and be scared or avoidant of it.

Host  13:23  
Surely one of the best things about the digital world and data is that if you create a new business concept, idea, or improvement based, as you say, on the way you've interpreted the data, you've then got baseline data to go and test against. So you go and do something different, and if all of a sudden the results are worse, you can adjust. You don't have to go all in; you can do a bit of an A/B-test type thing. Do you see a lot of schools doing that?

Dr Curt Merlau  13:53  
Yeah. Actually, when we get into more prescriptive analytics, we have developed solutions centered around optimization, really if-then scenarios, so more advanced institutions can use data to play around with it: well, if we change this policy in our financial aid giving, or if we introduce this intervention, what then happens to the likely outcome of a student's matriculation or attrition? We certainly see that. When I was in higher ed, we did, to your point, A/B testing, and we tested different segments of the prospective pool based on their likelihood to matriculate, because what works for a certain student population may not have the same effect for those who are less likely to matriculate, right? So a lot of that advanced analytics allowed us to exercise discernment and test again, continuous improvement, in ways that we wouldn't be able to do by just looking at a descriptive dashboard alone.
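
To make that segmented A/B testing idea a little more concrete, here is a minimal sketch in Python; the segment cutoffs, the intervention effect, and all of the numbers are invented for illustration and are not from the episode or any real institution. Prospects are bucketed by a previously predicted likelihood to matriculate, each bucket is split at random into a control arm and an intervention arm, and yield is compared within each bucket rather than across the whole pool, so an intervention that only helps low-likelihood students is still visible.

import random
from collections import defaultdict

random.seed(0)

# Hypothetical prospect records: (id, predicted probability of matriculating).
prospects = [(i, random.random()) for i in range(1000)]

def segment(prob):
    # Assumed cutoffs; a real program would choose these from historical data.
    return "high" if prob >= 0.6 else "medium" if prob >= 0.3 else "low"

# Randomly assign each prospect to control (A) or intervention (B) within its segment.
groups = defaultdict(list)
for pid, prob in prospects:
    arm = random.choice(["A", "B"])
    groups[(segment(prob), arm)].append(prob)

def simulated_enrollment(prob, arm):
    # Placeholder outcome model: the intervention helps low-likelihood students most.
    lift = 0.10 if (arm == "B" and prob < 0.3) else 0.03 if arm == "B" else 0.0
    return random.random() < min(prob + lift, 1.0)

for (seg, arm), probs in sorted(groups.items()):
    enrolled = sum(simulated_enrollment(p, arm) for p in probs)
    print(f"{seg:6s} arm {arm}: {enrolled}/{len(probs)} enrolled")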

Host  15:01  
So we may have touched on this already, but one of the questions we had was: how does the combination of IT expertise and real-world education practice help schools and institutions make more meaningful use of their data? That's what we talk about; that's real world. Can you speak to that a bit?

Dr Curt Merlau  15:18  
We believe it's everything. At Resultant we're led by subject matter experts, like my team, who have been practitioners, who have worked in higher education, who know the lingo and the context, and we come alongside our data engineers and our data scientists and translate. We're translators, right? It's crucial to bridge that gap, because there's a lot of nuance in understanding the real-world application of education, and with it you're able to unlock much deeper, more meaningful, and more innovative ways to apply data. And again, it's back to that human-centered design approach. We believe that before you ever start collecting data, or putting code or a product in, you've got to think about the people, the process, and the technology concurrently.

Host  16:07  
Now, I used the word interpreter and you used translator. You're smarter than I am, so do those two words mean the same thing?

Dr Curt Merlau  16:14  
It's much the same thing: sense making and sense giving, right? Someone who is a data scientist may not really understand those nuances or their implications, so they need a translator, or that interpreter, to then devise a way to meet the need that ultimately drives the outcome.

Host  16:35  
So we discussed this already: the technology. We talk about schools being a little slower than enterprise in embracing these things, and maybe slower than healthcare and other industries that move a bit quicker. Automotive obviously jumps on new technology rather quickly; it's a real volume thing. But many schools now have dashboards and reporting tools. What's the difference between that and true predictive analytics, and why does that shift matter?

Dr Curt Merlau  17:00  
Yeah, that's a great question. We went through a dashboard phase where everyone wanted a dashboard. The president needs a dashboard; okay, let's give him a dashboard. For what? So again, the dashboard alone is the bright, shiny object to avoid. But here's how I would break it down. There are different types of data analytics to improve decision making, ranging from descriptive, what happened in the past, to diagnostic, why did it happen. Then we edge into the more advanced applications: with predictive analytics we're looking at what's likely to happen in the future based on historic trends in the data, like the example you gave. We've observed you the last 20 weeks, we understand a little bit more about your behavior, and we can predict with high certainty what you're going to eat on Friday. The next iteration beyond that is prescriptive: how can we make the ideal outcome happen? What interventions, policies, or levers can we pull?
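
Sticking with the Friday-meal example, here is a minimal Python sketch of the step from descriptive to predictive analytics as laid out above; the 20 weeks of observations are made up for illustration. The descriptive part simply counts what happened, and the predictive part turns those counts into a rough probability estimate for next Friday.

from collections import Counter

# Hypothetical log of the last 20 observed Friday meals.
fridays = ["fish tacos"] * 17 + ["pizza"] * 2 + ["salad"]

# Descriptive: what happened in the past.
counts = Counter(fridays)
print("Observed Friday meals:", dict(counts))

# Predictive: estimate what is likely to happen next Friday,
# using observed frequency as a simple probability estimate.
total = sum(counts.values())
prediction, n = counts.most_common(1)[0]
print(f"Predicted next Friday: {prediction} (p = {n / total:.2f})")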

Host  18:06  
Exactly. Wait, let's test you: what do I eat on Fridays?

Dr Curt Merlau  18:10  
Fish and chips. Fish... fish tacos.

Host  18:12  
You just totally stereotyped me.

Dr Curt Merlau  18:15  
Fish tacos, yeah.

Host  18:19  
So you're saying prescriptive means: we want Jonathan to order a steak taco or fish and chips instead, so we're going to do things along the way to drive the outcome that we want.

Dr Curt Merlau  18:29  
What influences his likelihood to choose one over the other? Exactly. And I can give an example. We've built early warning indicators to identify students at risk of attrition, but then we've created modules that allow student support teams and others to say, well, if we introduce this intervention, what does that do to the student's likelihood of attrition? Another example: we've done it to help with institutional-based aid. If we gave this much more money in institutional-based aid, what does that do to the likelihood of matriculation for this student or for this population? So it's like having a sandbox you can play around in, asking "if this, then what happens," to ultimately help justify the policy or intervention decision you need to make. The difference really is in complexity and value added. Dashboards are an example of descriptive analytics. They have a place, but we can't let ourselves just say, well, we do data, we have dashboards. They require a lot of data literacy and a lot of brain power to make sense of, to interpret, and to put together multiple data points into a decision, whereas advanced analytics allows administrators to focus their time on the more complex and nuanced cases. It's almost like preparing a meal. With descriptive analytics you have to do the work of putting the ingredients together, making sense of them, making them into a meal, cooking it, and then plating it. Advanced analytics puts it on a plate for you, so you don't have to spend all that energy figuring out what to make from the cupboard, how to put it all together, how to make sense of it, and how to plate it; it's served up for you. You still exercise discernment about what you want to pick from, but it really helps conserve a lot of energy so that we can focus on those important human judgment aspects that require our expertise.
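
Here is a minimal sketch of the kind of if-then aid sandbox described above; the logistic scoring function and every coefficient in it are assumptions made up purely for illustration, not Resultant's actual model or any institution's data. The point is only that once a predictive model of matriculation exists, you can sweep a lever such as institutional aid and read off the predicted change in likelihood before committing to a policy.

import math

def predicted_matriculation(gpa, aid_dollars):
    # Toy logistic model of matriculation probability. The coefficients are
    # invented for illustration; a real model would be fit on an
    # institution's own historical admissions and enrollment data.
    score = -3.0 + 0.8 * gpa + 0.0004 * aid_dollars
    return 1 / (1 + math.exp(-score))

# What-if sweep: vary institutional aid for one hypothetical admitted student.
gpa = 3.2
for aid in (0, 2500, 5000, 7500, 10000):
    p = predicted_matriculation(gpa, aid)
    print(f"aid ${aid:>5}: predicted matriculation probability = {p:.2f}")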

Host  20:36  
Great, that's really important. And I love the food analogies, because analogies go a long way, and I love food. All right, so you emphasize the importance of local data ownership; that's something you speak about when we talk offline. What does that look like in practice, and how does it empower schools to act more effectively?

Dr Curt Merlau  20:53  
Yeah, so I want to reference something we often talk about with institutions, which is, again on a spectrum, what it means to be data informed, or data driven, or what we call data transformed. Being data transformed goes beyond being data driven; it means every part of the institution is rowing in the same direction. They're focused on data, they have a culture of sharing it, they have a culture of talking about it, and they have a means to access data in real time to make the decisions they need to make. It's not locked away in some central repository that requires a ten-step process to get out the insights you need. So in a decentralized, local data ownership way, different departments or schools within universities have direct access to generate the insights they need, a kind of self-service set of offerings, where a nursing department can examine the data that's relevant for it to make decisions. It allows for that direct application of data to daily decisions. It's a shift from data being locked away, inaccessible, in a central repository, to democratizing data access to empower people. But that takes a lot of change management, a lot of modeling, and a lot of processes that need to be in place to make that technology accessible.

Host  22:35  
All right, great, thank you. And from your perspective as both a practitioner and a parent, what practical steps can education leaders take to ensure their data strategies lead to proactive interventions rather than just reactive reporting?

Dr Curt Merlau  22:53  
Yeah, that's a great question, because by the time we're reacting to something, we're already behind the eight ball, right?

Host  22:58  
Data's already happened, right? 

Dr Curt Merlau  22:58  
It already happened, yeah. And to your earlier point, that's the fault of descriptive analytics: it already happened, and now we're scrambling to make sense of it, and we're left with so many questions. But to answer the question you just posed: for us, it starts with why before it starts with what. Why do we need to know X, Y, Z? How does that translate into our goals? Do our goals have measurable indicators? Then you can prioritize what data to collect and manage that is most relevant to those indicators, so that it's not just data collected for the sake of data, with all that effort, but a very purposeful collection that translates back to the why.

Host  23:53  
Well, let's double down on that, because I think, again for our viewers and listeners, that could be quite interesting: collecting data for data's sake, which is everywhere, right? Data is everywhere. Do you have a best practice, and maybe you do, maybe you don't, of saying, look, just start here; we can always grow it, we can always evolve and iterate? Is there a sequence you're seeing, like tick this off first, then move here, then move here? Or is it, no, it depends on the school's unique needs and we just have to jump in? Maybe they've got a retention issue; maybe it's knowing what programs to create. Because, before I go on, we really believe in what's called the root cause, right? I just had a potential customer who came to me and said, I only want to hear about your one solution. I'm like, well, we have this process where we want to consult and hear what's going on, and he kind of got annoyed: I just want to hear about this one solution. And it turned out, after him being annoyed and going through my process, the one solution he wanted wasn't the right solution for what he was trying to achieve. He'd self-diagnosed; he'd created this notion in his head that he knew exactly what he needed: I'm the buyer, how dare you not just sell me the thing I want. So how often do you see that, where they misread the root cause of an issue and try to fix the wrong thing?

Dr Curt Merlau  25:10  
All the time. It's like that Henry Ford quote: if I had asked people what they wanted, they would have told me a faster horse, right? And it's not that we ignore or dismiss what they're saying, but we use it as a starting point to continue to ask why, and to be curious to understand the root cause. I need a data dashboard. Why? Well, actually, you don't need a data dashboard, because what you told me is that you have a data infrastructure problem, and if I sold you a data dashboard it would be junk, or it wouldn't work, because your data is in 500 places, it can't talk to one another, and it's not standardized. So we also go through that root cause, and we want to appreciate the specifics of the university, the challenges they're seeing, and the goals they've set for themselves. We often look at things in terms of where the most value will be added and what is most feasible: what's the low-hanging fruit we can do to generate quick wins, versus the things that are going to take longer to develop because the feasibility just isn't there, or maybe there isn't enough interest to garner support across the university. So, for example, I was just talking with a university about how AI can be used at the university. Wow, that's a broad question; let's narrow it down. Then we got down to the student experience. Okay, tell me more about your student experience. I want to understand where the bottlenecks and the choke points are today in the student experience, and what would be most valuable to solve, but then I have to look at the feasibility of solving those problems.

Host  26:56  
Could you not even go one further and say, help me understand how you've even concluded you've got a student experience issue? What are the flags? Why is that self-diagnosis potentially dangerous? It's like, look, don't self-diagnose; we're here to work it out together. Maybe you're 100% correct and that is the issue, but what's making you think you've got student experience issues?

Dr Curt Merlau  26:56  
Yeah. And so often, and I understand this approach, there's this proliferation of "I just want to buy this product; click, drag, drop, I'm done." And like your approach, it's: well, we're not just here to sell you a product; we want to understand why, so we can give you the right solution for what you need. Because there's been this band-aid approach of grabbing random products, and now we have tech debt, or we have disparate products that don't work together, or people aren't utilizing the products.

Host  27:48  
And they might work in a silo, not to mention the vendor management.

Dr Curt Merlau  27:52  
And then we're scratching our heads: why are we still struggling to get basic insights? Well, let's take a time out and really understand the root cause so we can prescribe the best path forward to generate the value.

Host  28:03  
I agree. I think that's a real key part as well, because that's where someone can come to the table with their own data that maybe even supports their self-diagnosis, because they've interpreted, as we call it, the data in their own way, and maybe they've asked the wrong questions. So you really have to get back down to the basics.

Dr Curt Merlau  28:30  
You do, you do. And you have to be able to see the system for how it is today. Every system is perfectly designed to get the results you're getting. So we map out the current state of the system today, and by system I mean all of the processes across the university. We have to map it out: okay, we're seeing this result; why are we seeing it? Well, the system is designed as such, intentionally or unintentionally. So we have to lean into that to really reimagine the student experience and how technology can address those issues. But until you understand the root cause, you're just going to be putting band-aids on band-aids.

Host  29:08  
Off the topic of data, but tapping into your knowledge of the space overall: when people use this term, student experience, are you seeing it used more for the matriculated student on campus, or are you talking about the student experience during the recruitment process? Where are you seeing more people apply the term student experience?

Dr Curt Merlau  29:25  
I typically see people use it for matriculation onward. However, our belief is that the student experience starts well before that, in recruiting and enrollment, and I know you share this belief too: you have to be recruiting and enrolling with retention in mind, right? It's not enough just to increase the enrollment of the incoming class if they're just going to transition out; we've not solved anything. It's got to be the right student and the right fit, and all of that has to happen.

Host  29:59  
Yeah, I totally agree. I think the student experience isn't considered enough during recruitment. At the end of the day, that enrollment journey goes from marketing to recruitment to enrollment; those three things happen across the whole process. You've got to market to them, you've got to recruit them or start messaging them, but then you've got to enroll them, and that's not just a choice, it's a process: hey, are you the right choice for me? And now that I think you're the right choice, how on earth do I get into you? So I think the student experience needs to be more holistic. And arguably advancement too: if you can carry a good experience through to your graduates, common sense dictates that you will see a higher amount of donations.

Dr Curt Merlau  30:43  
That's right, that's right. I'll give you an example of what I often see from a student experience perspective, the way we've defined it, through that marketing and enrollment process. Oftentimes universities will have their own enrollment management data, system, and process that is separate from the individual school or college within the university, which has its own. So if a student is visiting campus, they're in the enrollment system, but they're also in this separate system at the college or department level; the two don't talk to one another and don't know anything about one another, yet the student is interacting seamlessly on campus with those different departments. Or take athletics; they have their own data. Our data systems have to be as seamless as the way our students are consuming the programs and services the university provides, and by and large they're not today.

Host  31:38  
Absolutely, absolutely, that's great. As we start wrapping up here, I'd love some final thoughts from you, Curt. Where is this all going? AI, like you say; some schools tell you, we'd like an AI strategy, and that's just two letters. Where is everything going? Are we going to become more and more dependent on data? Have we hit saturation, or are we nowhere near saturation? Be a futurist: where do we go?

Dr Curt Merlau  32:05  
Nowhere near saturation, especially when we think about that scale from descriptive to prescriptive, right? We've got a long way to go to get to that fully actualized state. And given the position higher ed is in today, there are going to be resilient institutions, and one of the ways to be a resilient institution is to have data as the bedrock of your decision making. The only way we get there is if we have clean, clear data coming in that we own, that we can give access to, and that everyone can utilize. So we talk a lot about building resilient institutions and data's role in that. I foresee that with whatever changes may happen with, say, student financial aid, we are going to see more pressure on institutions from an enrollment perspective, but also on how they're distributing their institutional-based aid. AI can help predict the most efficient way to distribute it to create the highest yield of students who are a great fit, who stay, and who persist. So we've got to get smarter, and the way to achieve efficiency, efficacy, and transparency is going to be with the surgical tools that only advanced analytics can give. Those who embrace evolving from descriptive dashboards to prescriptive, advanced, AI-driven capabilities are going to thrive. There's a real crossroads here, and I personally believe data is one of the largest assets a university has; it would only make sense to leverage it to its fullest extent.

Host  33:43  
Absolutely. Data is the most valuable thing for most businesses, universities, anything; even if you own a shop, having your data is key. Well, that's all we have time for today. Curt, thank you very much for taking the time to join us. I know I'm in my dark studio here; I don't know why, we obviously haven't paid the power bill, but we're very excited to continue these conversations with you. To our viewers and listeners, thank you for taking the time to join us today. We have lots of other great resources on www.studentbridge.com, under Resources: podcasts, webinars, and all kinds of great stuff. We'll also be linking to Dr. Curt Merlau, and I won't even try to get his last name wrong again, if you want to get in touch for more insights from him as well. Until then, again, thank you for taking the time to join us today, all of you. Drive safe, listen safe, and see you soon. Thank you. Take care. Bye-bye.