Using Health Scores to Drive Better Outcomes w/ CSM Office Hours
Jeremy: I hope that everybody had a great discussion around using health scores and using that data to drive better outcomes. What I'd like in the larger group is for whoever thinks they have the most unique way to use data like health scores to drive better customer conversations to share it. Claudette also brought up a question about whether you have programs for those customers that are strong, and what that looks like. We've only got four groups, so hopefully we can get a little bit of feedback from each of them. I'm actually going to start in reverse order with group four: Alana, Christie, David, Josh, Kevin, and Vignesh. If one of the six of you wants to come off mute, maybe share a little bit about your discussion and something unique that came out of your breakout group.
Vignesh: I can take that.
Jeremy: Okay, Vignesh. How are you, sir?
Vignesh: I am well, Jeremy. Good to meet everyone here; it's my first time jumping on to GGR. It was a really interesting discussion. To start with, we were talking about how everyone's using data in their day-to-day jobs, and where that gravitated toward was how we change health scores, what health scores are getting at at the end of the day, and when it's probably time to hit the reset button on them. One interesting point that came from Josh, and from a couple of others as well, was about time to value: the time it takes for a customer to realize, "What your product does, what you folks have done for me, hits it out of the park," and that their problem is solved really fast. The sooner you can bring that time down, the sooner the customer is going to start realizing value, and that can act as a very effective measure of a customer's health.
Alana: To piggyback off of Vignesh, I just wanted to say, if any of you have Amazon Fresh or Prime and you get an order before the expected time, I think that's the goal for time to value. Like, "Oh wow, I got this in such little time." That's the reaction we want to see from customers as well.
Jeremy: Vignesh, great recap; thank you for sharing what came out of your group. I have a question for you, Alana. It sounds like that quickest-time-to-first-value idea may have come from you, reading between the lines a little bit. Is there anything you're using from a metric perspective to help gauge that? We'll call it a health score, though it's not really a health score: gauging whether you're achieving or exceeding that first value for those customers?
Alana: Yeah. I mentioned this in the group, but specific to my company and my industry, everyone's goal and customer journey is pretty similar. Because I work in compliance, a really easy way to check the box on whether you've added value is getting customers to their compliance goals. In that sense, it's pretty linear and easy to have visibility into. Another smaller way to do it: if they specifically asked for a feature or a product in tandem with what we're currently providing in our platform, then once we've released it, I would check that off as well for immediate value.
Josh: I'd actually like to jump in here on this one, because getting that determination of time to first value, not full value but first value, was something that we used a great deal at [inaudible]. Rather than saying, "Hey, when you come on and we get started, we have to put together this project plan and work through it with you," it was more like, "Hey, you know what? Within 90 seconds you can have created your account and added your first AWS cloud account, so we can begin scanning." Now, there are dozens of other things we can do that will add more value, but I gave you that first little Pavlovian treat, and that built the momentum.
Jeremy: Yeah, that's a good point, Josh. And Alana, thank you for adding some color on compliance. I work in cybersecurity, and we have a lot of customers that need to meet compliance requirements. There's a hard-and-fast timeline: you either meet it by this date or you don't, and it's very black and white. But for the other groups around the room, group one, which I think was Ian, John, Matt, and Walter; group two, Dan, Kevin, and Tom; or group three, David, Erica, and Stephanie: are you tracking any kind of metric similar to what Alana shared around that first value and whether you're hitting it or not?
John: I'll jump in with that one.
Jeremy: Okay, John.
John: We have much more of an enterprise play; the company I'm working for currently provides apps for commercial real estate builders. For time to value, the app is available and built in the app store, usually white-labeled and branded for the building or the landlord. We track our onboarding time very stringently: our time to value is how long it takes us to build the app so you can download it. It's a pretty tangible moment as a customer when you can download that first app. Because it's an enterprise play, it's a bit of a longer, warmer process, but for every customer we have a goal of turning that around in roughly two calendar months from kickoff to app available in the app store. As it relates to health, to bring back the original topic: if we do it in two months, we consider a customer to be green. If we're between two and four months, they're yellow. And if we go beyond four months, they're red. It's pretty simple, and it speaks to time to value; we know our customers start losing patience with us. There are always exceptions, but it's a hard-and-fast rule we've put into place, and it's pretty accurate. When we get the NPS scores after onboarding, if we do it in two months, we see promoters. If we're between two and four, we see passives. And if we go past that four-month window, we start to see our detractors.
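John's thresholds are simple enough to sketch as a rule. This is a hypothetical illustration of the mapping he describes (the function name and structure are invented, not his team's actual system):

```python
def onboarding_health(months_to_launch: float) -> str:
    """Map kickoff-to-app-store time to a health color, per John's
    thresholds: under two months is green, two to four is yellow,
    beyond four is red."""
    if months_to_launch <= 2:
        return "green"   # historically correlates with NPS promoters
    if months_to_launch <= 4:
        return "yellow"  # passives
    return "red"         # detractors
```

The appeal of a rule this blunt is that it needs no tooling beyond a spreadsheet: one date-difference column and a lookup.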
Jeremy: Yeah, I like that play. It seems to revolve heavily around, we'll call it, the onboarding timeframe. Most customers using your service want to get an app launched, and getting that app launched is what's going to cause that first value. Throwing out a follow-up to John: is anybody using, we'll call it, more incremental or smaller milestones in that first time to value? Maybe the app launch is the goal, but maybe there's something smaller that hits a critical point and would actually push the needle from making somebody a passive to a promoter. Daniel Chapick, I saw you come off mute. I don't know if there's something there.
Daniel: Yeah, so we're in a bit of a unique boat. Our typical client relationship is measured in decades; we're working with a client for a very long time in the design and engineering space, and the software they're using is a business requirement. If they want to design that bridge or design that building, they have to use this design software to do it. So our time to value is much more from an engagement standpoint, making sure that they're supported and that they're on the latest and greatest. Those increments you're asking about, Jeremy, are very much smaller increments: "Are you on the latest version or are you three versions back? What would upgrading do for your IT staff? What new features would you like to see developed?" That type of thing. They're much smaller, incremental times to value in the long-term relationship with the client.
Jeremy: Yeah. So Daniel, you bring up a good point: you've got a long, we'll call it, onboarding period and a longer time to that initial value. How do you track it? Is versions behind the big metric for you in deciding whether somebody is green versus red? Or are there other things you look at that play into a customer's quote-unquote health score?
Daniel: Sure. To be fair, we're in our infancy with this, so we're not nearly as developed as I want to be. But traditionally, what we're looking at is that engagement level. As long as the client is working with us on a consistent basis, they're making progress. We're looking at the capabilities they have, not necessarily features and functions in one piece of software, but a combination of a number of software products, to get to the outcomes they want to achieve for their project type and the industry they're working in. So to answer the question, it's much more engagement-wise: are we consistently moving them forward toward their outcomes?
Jeremy: Yeah, great. John and Daniel, I appreciate you both: very short to very long. I like being able to tie one to the other, engagement versus onboarding, and how you look at those things. So again, I appreciate you both sharing your feedback. I want to turn it over to group two: Dan, Kevin, and Tom. I think I still see the three of you out there. What came out of your group discussion?
Kevin: One of the things that came up, and granted, your mileage may vary on this topic, is that we felt the health score was more of a unicorn, in that we were seeing, from startups all the way up to mature SaaS companies, that health scores can be few and far between. And if there is one, it may be generic and not specific to the company. But we did have a great discussion with Dan, who with Monday.com does have a health score; he was the only one of the six of us with a health score they were actually using. One of the great things that came out of it was our conversation about proofing your customer health score: actually showing that it is accurate at determining how healthy a customer is. You do that by not necessarily sharing the health score itself with the customer, but sharing the details or the metrics that go into it, and determining whether it's accurate: whether the usage levels really dictate the amount of positive sentiment they have for the service or product, or whether the usage levels, even though they're low, are still consistent with what they perceived they were buying. So we thought that proofing the customer health score was going to be a lot of value down the road.
Jeremy: Dan, I see you came off mute; I was going to call on you to expand a little bit on how you use it. Kevin, thank you for the summary. I heard that come up a couple of times as people were jumping in late: "What if my company doesn't use a health score?" I think that's also a really valid question to ask. So then, what are the metrics or the things you are looking at to determine whether an account is quote-unquote green and healthy versus red? Dan, I don't know if you want to share a little bit of your journey, how you're mapping out a health score, and maybe some recommendations for companies that may not have one, things they can look at.
Dan: Absolutely. Just to close the loop first on the part Kevin mentioned: one of the big pieces for us that's really vital is getting the customer's sentiment before showing the data that makes up the health score, because we don't want to lead the conversation; we don't want to color it that way. Yes, we'll use data to validate, back up, and make the case for whatever story we're telling, but we want to make sure that when we're asking their sentiment, we get it before they see the data and before we say, "Wow, look how great the utilization is in this area." Otherwise it makes it harder for them to be honest if they're thinking, "Look, we're just not feeling how things are going right now." So a big part of it is the ordering of the presentation. As far as how we approach the health score, we try to account for both depth and breadth of usage as well as general engagement. It's everything from weekly active users, to number of events per user, to how many days a week users are going in, to capacity: how many of their seats are they using versus what they purchased, things like that. We're trying to capture a holistic picture of the health of the account using all of those different metrics. And for companies that don't have that yet, one of the things we say a lot internally, and I know others say it too, is that perfect is the enemy of good. So start with what you do have. If you don't have access to a fully built-out business intelligence tool that can map all of that, take what metrics you do have. What are the ones you can look at?
And then, using something Kevin touched on in our breakout room that was super important: validate the model you're going to use to assess how healthy customers are. In the example he gave, which I thought was really helpful, if you come up with a model for ranking the health of accounts and, according to your model, certain accounts are really healthy but a lot of those accounts are leaving, you probably need to rework it, because it's not actually measuring the real health of your accounts. So the idea is to start with what you have, not let the ideal get in the way of starting somewhere, and use whatever tools you have to validate your assumptions. Chase some of those down, and that can be a good starting point. I know we don't ever want to just run on assumptions, but start with the assumptions you're already making and validate what you can with the data you do have. If your thought is, "Look, this feature is our north-star one; we know this is the one we feel is the most sticky," then you can start by ranking customers just on how they're using what you think is your sticky feature. You'll either learn you were right, or, "Oh my goodness, no, we were totally wrong. This is a neat feature, but it is not the sticky churn-buster we thought it was." Starting with your assumptions so you can validate them is helpful. Don't let the assumptions be the be-all and end-all, but start with them, use whatever you have, and iterate from there.
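The metrics Dan lists (weekly active users, events per user, days active per week, seat utilization) could be blended into a single score along these lines. The weights, targets, and names below are invented for illustration; this is not Monday.com's formula:

```python
from dataclasses import dataclass

@dataclass
class AccountUsage:
    weekly_active_users: int
    seats_purchased: int
    events_per_active_user: float
    active_days_per_week: float  # 0 to 7

def health_score(u: AccountUsage,
                 target_events: float = 50.0,
                 target_days: float = 5.0) -> float:
    """Blend breadth (seat utilization) with depth (event volume) and
    engagement (login frequency) into a 0-100 score. Each component is
    capped at 1.0 so one strong metric can't mask weak ones."""
    breadth = min(u.weekly_active_users / max(u.seats_purchased, 1), 1.0)
    depth = min(u.events_per_active_user / target_events, 1.0)
    frequency = min(u.active_days_per_week / target_days, 1.0)
    # Illustrative weights; per Dan's advice, you'd validate and re-tune
    # these against real retention outcomes rather than trust them.
    return round(100 * (0.4 * breadth + 0.3 * depth + 0.3 * frequency), 1)
```

This is the "start with what you have" spirit: even two or three of these inputs, weighted by guesswork, give you a ranking you can then test against which accounts actually renewed.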
Jeremy: Yeah, Dan, great summary, by the way. You hit on a couple of things we touched on earlier. John talked about onboarding time; that's a very easy metric to follow, whether it's your CSM, an implementation team, or a project manager, and you can track it as simply as in Excel: "Customer started on date X and finished on date Y," and then get customer feedback, whether it's NPS or something else. Daniel talked about, and you touched on it too, Dan, what engagement looks like. Are they interacting with our tools and software? Are they responding to things or not? Something I've noticed over my CSM career is that just because somebody doesn't respond to an email doesn't mean they're not engaged. So find other ways to look at how that person prefers to communicate. Maybe they're a power user who only wants to talk to you about technical details or product feedback, and it may be a completely different cadence for an executive, who may not respond to an email because they'd rather take 15 minutes to talk to you in person. So I think you've provided a really good overview. And for those of you who brought it up, I don't remember everyone on the call who said it, but think about it: can you look at utilization? Can you look at onboarding time? I don't typically default to support cases, but you could even look at support tickets as a starting place. If you don't know where to go, look at the volume of tickets in your stack rank and go, "Hey, this customer is way up here, and we know their use case is very small, so they're ballooning over it." Exactly, Dan, everything is data.
So just a couple of ideas. Dan, I appreciate you sharing on that. One question, since I see Claudette is still on, and I think it ties into what you're saying, Dan: Claudette asked before everybody came back about a proactive approach to health scores. When you look at a customer with a really good program, a hundred out of a hundred, or 90 out of a hundred, really at that top echelon, do you use that data to drive proactive programs and engagement with that customer, whether it's getting them invited into special projects, special releases, and that sort of thing? Is anybody looking at the health score from the lens of: how can we take our good customers, the ones at the top of the line, and make sure they don't ever slide down? I don't know if anybody is doing anything like that within their program, not even necessarily with a health score, but I think it would help Claudette as she thinks through identifying that.
Daniel: Yeah, it ties in with the health scores, since we're only looking at engagement so far. We've formed a customer council from our top customers. They drive our activities: as we mature in customer success, they're driving how we do it, what they want to see, and what they want from us to make sure we're engaging on the right stuff. And if they're already net promoters, if they're already doing success stories with us and things like that, we'll have them guest-speak to other clients, not about us, but about the industry in general, and really put them on a pedestal and showcase their expertise as users of the product. They get marketing bumps out of it, and we're coordinating industry thought leadership. Overall, it just continues to grow the relationship we have with them over time.
Jeremy: Awesome. Well, thank you, Daniel, I appreciate it. With the last couple of minutes here: David, Erica, Stephanie, Depali, was there anything in addition to what we've discussed that came out of your group before we move to wrap?
Speaker 9: Yeah, thanks for asking, and I'll start by thanking Vignesh, actually. We may be discussing this today, but I discussed this topic with Vignesh around two weeks back, when we were talking about metrics and health scores and how to measure them. We're not doing it currently in our organization, but after listening to all the thoughts, and specifically to the measures, the data we're actually using to make the health scores, I have a question: does this actually help us understand or predict churn? Just three or four weeks back, I lost a customer; I faced my first churn. And this client was very active on the platform. They had, I think, 90% or even more usage in the data, if we measure it, with nothing recognizing they were going to leave us. So is a prediction based on the health score actually correct, or does it just depend? That's what I wanted to talk about.
Jeremy: Yeah, John.
John: I'll jump in on this one. I think early stage, when you're building it, it's more about churn prevention than forecasting your churn rate, and I want to separate those two. The health score can serve as an early warning system: if it's right and it's tuned, you're going to find out earlier about the things that might cause a customer to churn and be able to tackle them, and over time that will show in your churn rates. But when you're first going, it's about having that early warning system. This might even be a controversial statement here, but the fact of the matter is, until you've had the health score running for a period of time and have data to back it up against, you can't go in on day one and say it's a predictor of churn. It's iterative. You're going to fine-tune your health score, make it better over time, and figure out what's an accurate representation. In the early stages, it's about being that early warning system, about not having surprises. If that leads to less churn in the short term, you're on the right path. If you're still seeing churn, you might have the wrong warning indicators, and you iterate from there. So my only piece of guidance is: don't go into this on day one setting the expectation with your executives that this is going to predict churn. It's not. It's going to help you develop an early warning system, and hopefully it does that well for you. And if it's not doing that well, iterate. Throw it out, start over, and work on the next one.
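The validation John describes, and Kevin's "proofing" idea from earlier, amounts to comparing health buckets against actual outcomes. A minimal sketch, assuming you have a list of (health bucket, churned) pairs for closed-out accounts:

```python
from collections import defaultdict

def churn_by_bucket(accounts):
    """accounts: iterable of (health_bucket, churned) pairs, e.g.
    ("green", False). Returns the churn rate per bucket. If 'green'
    accounts churn about as often as 'red' ones, the score is not
    measuring real health and needs to be reworked."""
    totals = defaultdict(int)
    churned = defaultdict(int)
    for bucket, did_churn in accounts:
        totals[bucket] += 1
        churned[bucket] += did_churn  # bool adds as 0 or 1
    return {bucket: churned[bucket] / totals[bucket] for bucket in totals}
```

Run quarterly, a table like this is the difference between a health score that is an early warning system and one that is just a dashboard decoration.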
Jeremy: I agree, John. The only thing I'd add, and then Kevin, I'll go to you for closing comments, is that for us, we actually put a heavy weight on the relationship piece. We give the CSM the largest portion of that health score bucket. Even if all the usage and relations look green and everything looks great, if the CSM's gut is saying, "I don't like what's going on here," that overrides the whole score. It doesn't matter; you can have perfect utilization, but we give the CSM who's closest to the account the authority to override it. So another thing to keep in mind, even if you don't have a health score: as a CSM, be empowered, or if you're a leader on the call, empower your CSMs to have that level of authority and to communicate it up. So Kevin, real quick here, and Vignesh, I'm really sorry, we're going to run out of time, but Kevin, if you don't mind sharing a quick thought, then we'll have to wrap up.
Kevin: I'd echo what John said about it being an early warning system, because it's very important not to use a customer health score as a churn analysis; there are a lot of external factors that we're still not considering within a health score. For example, back at my previous company, we had a lot of customers that left, not because of any usage issue. They had positive sentiment about the company and the product, but they had personal relationships at the executive level with a prime competitor of ours, and we were losing customers based on that relationship. That was not something we could have measured unless we could actually get inside the minds of our own customers and understand who's in their address book and who's talking to them all the time. And that became a major factor in how we needed to retain customers: we needed to start building those relationships with executive leadership. Not just CSM to executive leadership, but our own executive leadership to theirs. Those are things we can't monitor or measure within a health score. So that's why it's still important to do a churn analysis, to really see where all of your churn is coming from, and not rely on your CHS to make that determination for you.
Jeremy: That's awesome. I appreciate that, Kevin; I think that's spot on, and I couldn't have said it better myself. Team, we're right at time, so I want to be respectful and make sure we get you on to your next activities for today. Thank you for joining. You're going to get a survey; please fill that out. If you've got any topic suggestions as we get toward the end of the summer, or things you'd like to see, let us know. Starting next week, for the month of July, we're going to be doing an office hours series on various customer engagement topics, so be on the lookout. I'll be posting some things on LinkedIn later this week. With that, you all have a great rest of your Tuesday. I look forward to seeing you all next week.
Speaker 10: Hey guys, thanks so much for taking the time to listen to the Gain, Grow, Retain podcast. If you liked what you heard, please take a moment to share the podcast with your friends and colleagues, and subscribe. We really appreciate it. Talk to you soon.
This week we are discussing health scores and ways to use that data to drive ideal outcomes.
A weekly segment:
CSM Office Hours
Every Tuesday. 11:30am ET.
If you want to join the discussion with thousands of other customer success leaders, join Gain Grow Retain: http://gaingrowretain.com/
This podcast is brought to you by Jay Nathan and Jeff Breunsbach...
Jay Nathan: https://www.linkedin.com/in/jaynathan/
Jeff Breunsbach: https://www.linkedin.com/in/jeffreybreunsbach