gtmPRO
Visit gtmPRO to subscribe and get free access to our frameworks and guides!
Practical Go-to-Market guidance specifically for B2B software and service companies between $5MM-$50MM in revenue.
Putting The 'Success' Back In: Customer Success
This episode highlights the evolution of customer success, advocating for a shift from process-driven strategies to genuinely solving customer problems. By building a deep understanding of ideal customer profiles and leveraging AI and analytical tools, companies can profoundly improve their customer success efforts.
• Redefining customer success to focus on client outcomes rather than metrics
• The importance of an Ideal Customer Profile for targeted support
• Empathetic communication and knowledge of customer challenges are essential
• Utilizing conversational intelligence to inform proactive support
• Product analytics must contextualize usage within customer circumstances
• Rethinking organization structures to include analytical resources for customer success
• The potential of AI in enhancing, rather than replacing, customer engagement
Welcome to the GTM Pro Podcast, your essential audio resource for mastering go-to-market discussions in the boardroom. Here we share insights for revenue leaders at B2B software and services companies, especially those with less than $50 million in revenue. Why? Because the challenges faced by companies of this size are unique. They are too big to be small and too small to be big. This dynamic pushes revenue leaders into executive leadership without a lot of help or support. We are here to provide that support.
Speaker 2:Your journey to boardroom excellence starts now. All right, here we are. We took a week off, so it's been kind of a crazy January for everybody, with the holidays and then getting back in. So we're back at it here at GTM Pro. We are going to stay on the same theme of customer success, however, because there's a lot of change around it and, frankly, we're doing a lot of work on it ourselves, rethinking it in this new AI world and what we can do about it. So, Andy, you were kind enough to frame up a discussion for us. You want to kick that off, and then we'll just dive right in.
Speaker 1:Sure. So, putting the success back in customer success. It really is a good jumping-off point from everything we've ever talked about with ideal customer profile and really framing the problem for the customer. That, in my opinion, is where customer success really starts: understanding the problem, and everything flows from there. So there's really a framework to it, which includes the problem, understanding the path to the solution, how the product gets adopted, and then a methodology for how we might attack that. And we can finish off with a little tidbit on AI. I know we could do multiple podcasts on AI, but since it's such a hot topic right now, it obviously plays into it. But let's start with the framework: how would we lay this out? I think it really does start with the problem.
Speaker 2:Oh, I couldn't agree more. I think, when you talk about bringing the success back to customer success, a lot of it is around the process, and you've heard a lot of people talk about how we've weaponized customer success. Right? Its only mission became net revenue retention, which turned into an obsession with strategies to mitigate churn, where we would only focus on people who were at risk from a health score perspective and we would try to work on expansions and cross-sells, and it became all about us. It wasn't really about success. We just kept throwing bodies at it, and we would have executive business reviews and quarterly business reviews, and the sole purpose of those things was to set the hooks so that you would retain, and to prove to you that we showed value, versus really thinking about it from a success perspective.
Speaker 2:Yes, the spirit of those things is sound, right, there are some cases where a QBR makes sense.
Speaker 2:There are probably more cases where it does not make sense, but yet we do it anyway.
Speaker 2:Same thing with an EBR. And so I think, as companies grew and fell in love with this process, to me it's very similar to what we see with marketing. We have a whole generation of marketers over the last 10 years who grew up in an era where it was more about the process of marketing than the outcome of marketing, because you could get better and better at the process of driving leads and getting new people in the funnel, and it became about the tactics of ads and downloads and gating content and all that kind of stuff. I think customer success was very similar. It became about the process and getting better at it and using tools and systems and data, and all of that made it quote-unquote better. But at some point you reach diminishing returns, the law of shitty click-throughs kicks in, everybody's doing the same thing, it becomes less and less effective, and you have to go back to basics, because you're not really helping them solve their core problem.
Speaker 1:So we have a whole generation of people for whom, not to overgeneralize, marketing and customer success is the process of it versus the purpose of it. We talk about checkbox marketing, and I think that's exactly right, Gary. I think people have gotten into, for lack of a better word, a rut around checkbox customer success, right? There's great technology, especially to your point about being able to detect people that aren't using the product, that they're at risk, and there's an over-reliance on that as something of a leading indicator for churn. But, as we know, oftentimes that's too late. You've got a window for getting people to that moment of value, and beyond that it almost doesn't matter what you do, right?
Speaker 1:And I think another compounding factor in all of that is the notion of features. To your point about marketing and sales, we often want to talk about, and this is especially true with founder-led and product-led companies, the features. This is what it does, this is a cool thing it does, right? But that doesn't necessarily talk about the problem, right?
Speaker 2:Yeah, I was just going to reference that, so I'm glad you brought it up: the feature is a tool to help you solve the problem. But as we work to understand the problem, the amount of information we need about it is broader than it's ever been, because we can't just say here's a feature, this is how you implement it, this is how it works, and you should do this because it's going to drive this outcome. What we fail to understand is: what are the dependent things that need to happen outside of the tool that make that outcome possible? Does this affect my stakeholders? Does it affect other parts of the process? If so, how do I properly message those changes to other parts of the organization? What objections am I going to run into? If there's something with which it needs to be integrated, what does that department need to know? How is that going to affect what they do? There's a broader perspective that's needed there, and that's what you get when you talk about the problem.
Speaker 2:I think one of the biggest challenges we have is that we equip our customer success and even support professionals with, theoretically, a lot of product knowledge, but we do not equip them with a lot of problem knowledge.
Speaker 2:Right, a lot of context for what life is like on the other side. As I think about the adoption of this feature, what are the other things I should be thinking about? Walking a mile in their shoes and having that empathy, not just for the generic customer but for that specific customer, is what moves the needle. And then it's further complicated by the fact that many products aren't true enterprise, where you have a dedicated CSM who actually understands the organization and is having frequent communication. There are many, many products out there where that relationship just doesn't exist, for good reason, because it's a more transactional workflow tool or what have you. Now, how do you create empathy with your team when they don't have that relationship, they don't have the conversations, they don't know the people, the departments, the org structure?
Speaker 1:There are two compounding variables there, right? One is it's positioned too broadly in general, so you're selling to people and the specific use case isn't really well known at all. The other compounding variable is that when CS gets involved, they don't have the context for any of it. Even if it was well documented and captured, knowing which use case to bring forward in the first place, to really get to that moment of value, that one thing, which we'll talk about in a little bit, it's that one thing.
Speaker 2:They don't have any context for it. Right, right. Okay, so you talked...
Speaker 3:It revolves around giving it a little structure. People understand how much structure relieves the pain of going through the process, because when you've run into a problem, you sometimes don't know where it starts and where it ends. And if, like Andy said, you give some examples of use cases where this actually happens and just lay out certain paths that the customer goes through when they're going through your product and what they're using it for, alternatives A, B, C, if this happens and then this happens and then this other thing happens, a straight path forward, and if you can walk them through that, I think that's something that relieves the pain of it.
Speaker 2:Yep. So, Andy, you were talking a little bit about methodology here. We've established that all things in go-to-market start with ideal customer profile. Period. Check. Forever. We've said that, we'll continue to say that, and chances are, because you're listening to this, you're farther along than others. So pat yourself on the back. But in our experience, which spans decades, but especially the last five years as advisors, it all comes back to ideal customer profile and the lack of a clear, deep, and methodical definition around it. So get that right first, because then you can understand the problems. But once we understand that, now what?
Speaker 1:What's next? Yeah, I think it's useful to try and give some context to what to do. We've really been grappling with this around customer success: it can feel almost analog when we talk about QBRs and the like, but better data is at our fingertips, whether that's the ability to integrate product analytics or the ability to do conversational intelligence. I think our philosophy now, and moving forward, is that CS needs to be way more analytical. So what can you do around that?
Speaker 1:So one simple example is conversational intelligence. That used to be really hard, Gary. We have friends who used to run businesses around trying to get to conversational intelligence, and when we think back, and it wasn't that long ago, how crude, for lack of a better word, that might have been. There were tools, it was tech-enabled service, and we were trying to pull information out of daily conversations, and a lot of those were calls, let's just say actual phone calls. That wasn't that long ago. But now, assuming people are recording these conversations, CS conversations, sales conversations, which might include discovery, demos and so on, there is a wealth of information at one's fingertips to do what we just talked about, which is map the customer's journey.
Speaker 1:How do they get to that moment of value? What is the problem? Can we glean more about this use case and about the people that have that use case, the ideal customers? There are a lot of sophisticated tools for that, but you can do it in much simpler ways too. That's one example of being more analytically driven: just dig into that sort of stuff. The other side, as I mentioned earlier, is product analytics. So not just taking what the tool (and I won't name names) says about somebody being at risk of churn, but actually doing a lot more to understand why and get to the bottom of it, to say: you know what, we lost them here, and we've seen that now three times, and it was this situation, it was these types of customers, and we didn't do this.
Speaker 1:And I bet if we did this it would make a difference, and then testing it and doing it. But it's really all about being analytical, and probably having someone, whether that's full-time will depend on resource availability, but having someone dedicated to that. We all know about that from marketing and sales, but really, truly, on the CS side.
Speaker 2:Yeah, so there's a lot in there, Andy, and there are three things. The first, and I want to touch on this because it's very hot off the press from a conversation I had this morning, is that conversational intelligence piece. The second is around product analytics as an indicator to help prioritize where we should be spending time and effort. And thirdly is around resource allocation. So let's tackle those in that order. The first is this: as some of you may know, I have stepped into a role as chief revenue officer for SparkHire, which is a portfolio company of Boathouse Capital, and we are actively working on how we think about extracting valuable customer information from unstructured data, mostly calls and emails. To your point, there have been quote-unquote conversational intelligence tools as long as people have been recording calls and creating that category. But how it informs what you do beyond just the selling process, and how you bring it into customer success, is really important. I knew this going in, but what I've discovered, or reaffirmed, I should say, is that the AI tool is only as good as what you provide it. And that starts with the questions and the conversation and exploration that your team pursues, whether it's on a Zoom call or via email.
Speaker 2:And the problem most organizations have when they start to implement those tools is that they want the output, but they haven't done the hard work to get the inputs right, 100%. And that goes back to ideal customer profile and understanding what their challenges are. In our case, we use the SPICED framework: situation, pain, impact, critical event, decision-making process. In almost every stage of GTM, all of those apply. There is some critical event, there is some decision that needs to be made, and there is certainly situation and impact. Getting the organization to think like an advisor or a consultant who starts all conversations with great discovery, that is what feeds the true insights, so you can begin to see patterns and what you want to take action on. But most organizations jump straight to the tool and don't put in the hard work there.
Speaker 2:The other thing that I've learned is that you can't just throw this over the wall and implement it, because what you get out of the tool is largely a function of what you're asking it and how you're asking it. We've heard a lot about this prompt engineering thing, right? In order to get the fidelity that you'd like out of these tools, senior leaders really need to invest time in getting that right, so that we get to the insights we need, which is, again, dependent on the questions that you ask.
Speaker 2:So it really comes back to fundamentals, right? Yes, for the first time we actually have tools that can help you make sense of, and begin to structure, unstructured data. But it's not rows in a spreadsheet. It's not something deterministic that's created in one system and passed through to another so we can analyze it. It's humans, it's conversation, so we need to mold it and shape it, and we do that by the way that we ask questions.
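To make that concrete, here is a minimal sketch of what structuring one unstructured call into a SPICED-shaped record might look like. It assumes the OpenAI Python SDK; the model name, prompt wording, and field names are illustrative placeholders rather than a prescribed implementation.

```python
# Minimal sketch: turning one unstructured call transcript into a SPICED-shaped
# record. Assumes the OpenAI Python SDK; the model name, prompt wording, and
# field names are illustrative placeholders, not a prescribed implementation.
import json
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

SPICED_FIELDS = ["situation", "pain", "impact", "critical_event", "decision_process"]

def extract_spiced(transcript: str) -> dict:
    """Ask the model to summarize a call into the five SPICED fields."""
    prompt = (
        "You are analyzing a B2B customer call. Summarize it into JSON with "
        f"exactly these keys: {', '.join(SPICED_FIELDS)}. "
        "If a field was never discussed, set it to null.\n\n"
        f"Transcript:\n{transcript}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",                       # any capable model works here
        messages=[{"role": "user", "content": prompt}],
        response_format={"type": "json_object"},   # ask for machine-readable output
    )
    return json.loads(response.choices[0].message.content)

# Usage: record = extract_spiced(open("calls/acme_discovery.txt").read())
```

The point is less the tooling than the shape of the output: the questions your team asks on the call determine whether these fields have anything useful in them.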
Speaker 1:Yeah, I do think there's a starting point there, and it's not implementing a sophisticated tool, right? If you're being really honest about it and you're saying, I'm not entirely sure what my ICP is; if we really think about ICP and dig in and ask, do I know not only the firmographics associated with who I typically sell into, but the conditions and the situations where somebody actually ends up getting value from it? If we're really thoughtful about that, we know we're not quite there yet. If we're really thoughtful about knowing that we don't quite have our onboarding where it needs to be, and a lot of things around that. But starting there, you can start to create a virtuous cycle.
Speaker 1:Right, using the simple methods that we talked about a little bit: what are the conversations I'm having today?
Speaker 1:Can I use thoughtful prompts (and we won't get into any of those right now) to start to tease out some of those patterns of discovery?
Speaker 1:If I can combine that with SPICED and really try to map it and say, okay, I'm at least at a point where I have a place to begin with discovery, then I can start to have more thoughtful discovery, which will then feed better pattern matching, and so on. So it becomes a cycle of saying: I have to start somewhere, and I can start with at least pattern matching what I've done to date, let's say several dozen conversations at least, where I can take them and pattern match against those using the crude tools, if you will, ChatGPT for one, which we've done, we've proven that can be done, and start there. I totally agree, Gary, that if you just go to one of these more sophisticated tools... actually, I'll just say this: they know, based on conversations we've had, that there are customers that shouldn't be using their tools.
Speaker 2:Yeah, you are not ready for us, you're not ready. They won't admit it, but yes, they don't want to admit it.
Speaker 1:They'll try and make it work, for sure.
Speaker 2:But yeah, yeah.
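Picking up Andy's point about pattern matching across a few dozen conversations: once calls have been reduced to SPICED-style records, whether by ChatGPT, a purpose-built tool, or a sketch like the one above, the pattern-matching step can start as very plain counting. A minimal illustration, assuming each record carries hypothetical "segment" and "pain" fields:

```python
# Minimal sketch: surfacing recurring pains across a few dozen extracted call
# records. Assumes each record is a dict with hypothetical "segment" and "pain"
# fields, e.g. produced by the extraction sketch earlier.
from collections import Counter, defaultdict

def top_pains_by_segment(records: list[dict], top_n: int = 3) -> dict[str, list[tuple[str, int]]]:
    """Count how often each pain shows up within each ICP segment."""
    by_segment: dict[str, Counter] = defaultdict(Counter)
    for record in records:
        segment = record.get("segment", "unknown")
        pain = record.get("pain")
        if pain:
            by_segment[segment][pain.strip().lower()] += 1
    return {segment: counts.most_common(top_n) for segment, counts in by_segment.items()}

# Usage with a handful of toy records:
records = [
    {"segment": "mid-market SaaS", "pain": "slow time-to-hire"},
    {"segment": "mid-market SaaS", "pain": "slow time-to-hire"},
    {"segment": "agency", "pain": "inconsistent candidate screening"},
]
print(top_pains_by_segment(records))
```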
Speaker 3:How can you determine if a customer, well, of course, knowing your ICP well, but going beyond that, how do you determine if a customer is right based on your ICP? How do you strictly define who they are in order to understand if they are a right fit for you? And then how can you find the signals that indicate that the customer will be successful with your solution?
Speaker 2:Let me twist that question just a little bit and bring it back to Andy's point. The point you made is fantastic, which is that before you even think about the AI tool, walking through, and in this case we're talking about discovery from a sales perspective, but it's just as important from a CS perspective, if not more so, where you have the opportunity to do it, there is immense benefit in structuring your conversations that way, for multiple reasons. One is because so many times we give people the questions they should ask, and they'll go ask those questions, but they don't know why they're asking them. So what we need to help them with is: this is what we need to understand, here are examples of questions that you could ask, but after that you need to give them the freedom to go two and three and four layers deep to really unpack what's going on. The benefit is that you start to create more structure, so that everybody in the organization is following a similar framework, not necessarily a script, but a framework, so that when you start doing deal reviews and film reviews and situation reviews and risk-level reviews for customer success, you're all getting the same information that you know is important and relevant. Then, when you put AI on top of that, that's how you can start to really bring it home. But without putting that first, you'll just be spinning your wheels.
Speaker 2:So, Tiana, you were asking about the signals. In terms of fit, let me say this. Now we're thinking about customer success, and it goes back to what we talked about with discovery. At this point you're a customer, so we have to recognize that, based on where we are today, you may very well be a customer, but that doesn't mean you're an ideal customer, because the needs of the business have shifted, and that happens all the time, so we need to recognize it. And then the other piece, as we start thinking about the application of features or giving you advice or a recommendation, is around your readiness to adopt those features. Is your version of the pain sufficient that this makes sense? Is your organization structured in such a way that you're going to get value from this? You're looking for those things, I think, from a quote-unquote fit standpoint.
Speaker 1:Yeah, I mean, I wish there was one answer to ideal customer profile, but that's kind of the point: it really is the business conditions for not only your prospect but their customers. In a lot of cases it does boil down to a unit of value, which we won't get into here.
Speaker 1:But what is the benefit you convey to your customer? What is it based on? There's usually a thing, right, it could be impressions or seats or something like that. And then there's a bunch of things below that: what are the pains associated with administering something, getting visibility on something, getting value from something? And it really requires, Tiana, just hearing that directly and indirectly. Sometimes sales teams know it inherently; they just don't know how to get it on paper. So, yeah.
Speaker 3:Yeah, I think what you're saying is about the focus on the one thing: what's the single most important success factor for your customers, and how do you ensure that you're not overcomplicating the journey to get there, and just focus on those core, main aspects? Of course there are way too many things involved around that, but I think if you ask yourself those questions constantly, every day, and you're paying attention to finding those key signals, I feel like that will make everything much simpler for you to decode how to get them there.
Speaker 2:Yep, yep. And I would say that related to that is the second point you made, Andy, around product analytics. We've been using that for a long time, but it's been overly focused on a predefined set of signals that we have either validated or assume mean that you're getting value out of the product, based on usage of the product. But we've seen over and over again that those can often be false positives, depending on your industry and what you're doing. Yeah, I'm using the product, and then suddenly you show up and you're like, I'm going to cancel, we're moving to this other tool. But wait, our health score says you're green, you're using it all the ways that you should.
Speaker 2:And I think one of the things that we are doing and exploring is: how can we begin to use those signals as indicators of where we should step in and offer assistance? That's on the basis of how big of an organization you are, what we know about your organizational structure, taking everything we learned from a SPICED perspective and having that be the foundation, and then, knowing that, looking at your specific usage of the product. When you begin to do something, it may signal that we need to step in and say, hey, I see you're doing this for the first time, let me offer you some help. And that doesn't necessarily need to be a human being stepping in, although I think there is some context there that you can provide, because it's not just here's how to use this feature; it's, hey, there are some other things you might think about as you begin to implement this feature, here's a good blueprint that's worked for other organizations that look just like you, here's what they have learned in this process. So now you're truly putting the success back in success.
Speaker 2:I'm not waiting for you to run into a challenge and ask me, or worse, run into a challenge and not ask me, or not even know about a particular feature or what have you. Instead, I'm stepping in proactively: hey, I see that you're doing this thing, but I also noticed that you've not used these couple of features. We built these things specifically for companies that look like you, who are in this situation. Can I help you with that? Would you like to see it? And here are the benefits if you were to do that. Sometimes the answer is, no, this is good enough for us, and that's fine. But the fact that you stepped in and were available to them in the moment matters. And I think that's another place where AI can begin to help us. Again, it's not deterministic, but it's: hey, here's a pattern that you may want to step into, take a look at, and lean into.
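As a rough sketch of what stepping in at that moment could look like mechanically, here is a toy usage-signal rule: the first time an account touches a feature it matches the profile for, queue a contextual offer of help. The event shape, segment labels, and playbook mapping are all hypothetical.

```python
# Minimal sketch of a proactive-assist trigger driven by product analytics.
# The event shape, segment labels, and the assist queue are hypothetical;
# a real system would pull these from your analytics and CS tooling.
from dataclasses import dataclass, field

@dataclass
class Account:
    name: str
    segment: str                                  # e.g. from the ICP / SPICED work
    features_used: set[str] = field(default_factory=set)

# Which segments each feature was designed for, and what help to offer.
PLAYBOOKS = {
    "custom_scorecard": {
        "segments": {"high-volume hiring team"},
        "offer": "Here's a scorecard blueprint teams like yours start from.",
    },
}

def on_feature_event(account: Account, feature: str, assists: list[dict]) -> None:
    """Fire a contextual offer the first time a matching account uses a feature."""
    first_use = feature not in account.features_used
    account.features_used.add(feature)
    playbook = PLAYBOOKS.get(feature)
    if first_use and playbook and account.segment in playbook["segments"]:
        assists.append({"account": account.name, "feature": feature,
                        "message": playbook["offer"]})

# Usage: simulate one event and inspect the queued assist.
queue: list[dict] = []
acme = Account("Acme Staffing", "high-volume hiring team")
on_feature_event(acme, "custom_scorecard", queue)
print(queue)
```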
Speaker 1:And I think that's the one thing, Tiana. It doesn't mean one feature, to Gary's point, right? It means the situation: the team, the organization involved, getting to that moment of value. They're doing the thing as it's intended, in a prototypical fashion that we've seen from multiple businesses that are getting good value from this, and we've heard it, right? The company has said, we are getting good value and we're using it in this way. We're now getting them to use it in that way, and it could be any mishmash of features.
Speaker 1:What it isn't, though, when we get outside of that one thing, is trying to get too many things going at once, because this is an exercise in change management. A lot of times you're getting an organization to change the way they do things, so ideally you're not overcomplicating that with, oh, and this thing, oh, and this thing. It's: here's a process. And here's another way AI is going to really benefit: giving a number of different ways to train, to onboard, to give them a process to get from point A to point B in getting value out of the tool. I'm really excited about that. That's maybe a playbook type of situation, multiple formats, video, the written word, a tutorial of some type, getting them from point A to point B, but not giving them too much to chew on at once. Getting them to value, but not overcomplicating it. Yep.
Speaker 3:Measuring success in terms of outcomes and not focusing too much on feature usage.
Speaker 2:Absolutely. Yep, connecting the dots. Exactly. Well, it leads to the third point, then, which you mentioned: the need for some form of analytical support in the organization. I frame that from a resource allocation perspective, because I think that's another thing we need to think about. Part of the discussion I had this morning regarding AI is that we need to rethink how we solve problems, and one piece of that is how we should structure our organizations and what skill sets we need. When you look at all the nuance that we now have access to through AI that we didn't have before, the only way we used to get that nuance was to get somebody involved, have a conversation directly, and then extract and interpret it. Now we have the ability to take some of that and start to see themes and patterns. It requires a much more data-driven, analytical approach, so that we can prioritize where we spend our time and be efficient. And that doesn't mean we're only helping the quote-unquote at-risk customers and not the ones that are doing okay. Truly from a success perspective, the point at which help is the most powerful is the moment you need it.
Speaker 2:Right, you get a flat tire and all of a sudden a tow truck shows up right beside you. That was awesome. Versus waiting, having a call, figuring it out, waiting three hours, getting it scheduled. And I think that's where we have the opportunity to be either much faster from a responsiveness perspective or even anticipate what the challenges are, knowing the questions that you should be asking before you're asking them, and giving you a smoother path. Okay, yes, theoretically I could throw this at you:
Speaker 2:Hey, here are these features, and here's a bunch of help articles on how to implement them. But what if I actually walked alongside you to see how it goes and what that would look like, and now you're off and running? So that requires us to rethink: does that skill set sit inside each individual, or should we think about creating it somewhere? Is it in RevOps? We hear a lot about a GTM engineer. Is it in CS? There's frankly no right answer, but that capability is something that is increasingly important, and we're going to need to invest in it.
Speaker 1:That's changing. I mean, honestly, that's changing every day too. Just observing the whole notion around a GTM engineer: right now that's mostly geared, I think, and I've seen it, around the Clays of the world.
Speaker 2:Yeah, mostly acquisition-oriented, yeah.
Speaker 1:You're 100% right, that is absolutely changing rapidly for all facets of GTM, and I would say this: CS might have the nearest-term and biggest gains from doing that thoughtfully anyway, right? Like, to your point, that's a different skill set, it's a different mindset. We've been talking about thinking of ways of doing things differently. You talked about the story about the printer, somebody printing out the, like, and I don't know if you want to tell that.
Speaker 2:Yeah, really, it's funny, Andy. It's so funny you brought that up, because I literally referenced it this morning, because it jumped out in terms of how we have to rethink.
Speaker 1:And I wasn't listening, by the way. This is a total story.
Speaker 2:But I don't even know if it's real. You know, we as experienced people are encumbered by the curse of knowledge and the ways we've solved problems before. So an intern is asked by someone to put together 50 pieces of paper, collated, in the conference room, so they can be used to put up ideas on the whiteboard. They go back to their desk, open up a blank document in their word processor, hit print, set the quantity to 50, and hit go, and the printer spits out 50 collated pieces of paper, and they take the output over to the conference room and slap it on the table. And you're like, well, honestly, is that lazy? Is it a waste of printer capacity, or whatever? But nonetheless, what struck me was that I never in a billion years would have thought that's how I would go produce 50 pieces of collated paper in one go. And I think a lot of that is true for those of us who have been around this a long time: we will have a tendency to take this new tool and put it into the framework that we already understand, and get it to conform to the way we would solve the problem, versus, how could I unlearn all of that and, this is, I guess, your first-principles thinking, take what this thing is able to do and rethink how I solve the problem?
Speaker 2:And I think it's a good segue to wrap us up, which is: I'm increasingly seeing, where does customer success begin and product end? Those lines are really blurring because, as we think about analytics and being data-driven and AI, the success team, done really well, is an extension of product, and vice versa. The CS aspect is understanding the nuance in the situations to be able to help you mold the product to meet your needs. The problem we've had in the past with product is that it's very deterministic: it works like this, this is how you use it. But creatively, we think about ways we can mold it and shape it for our particular business purpose, and so now, do we have the ability to start to do that? Where does the product itself start to recommend, based on that knowledge of the customer's situation and conditions, how you might implement this or use that? So anyway, it's the wild west in many ways.
Speaker 3:I think it's even extremely interesting to think about a product person in a customer success position, for them to first go through the product and then be on the customer success side of it, because nobody understands the product better than they do.
Speaker 3:Because they built it, they're a part of how it was built and what it's for, and at the same time they get to operate from a point where they already have all the information. I think a huge part of this, like you were mentioning, is just being very proactive. I don't remember where exactly I saw this, but there was this small "if you're confused" page or something like that. And if you went in, the questions were fairly simple, like, are you stuck? And I was like, yeah, I feel stuck.
Speaker 3:So it started telling you how things connect: if A happens, then B is the solution, and if you do B, then C will be your outcome. Instead of just thinking about teaching people how to use the features, they go, hey, do you want to accomplish this? Then this is how you do it. After you tell them what they're getting out of it, then you start teaching them how to do it and how to start using the feature. But if you're just telling people to use features, for what purpose?
Speaker 2:Yeah, yeah, that's a great point too, and I think it's a challenge, because the other aspect is that we live in an attention-deficit era and people just want the easy button. In some cases it's, okay, here's this feature, and the utilization of this feature will drive this outcome for you, but before you get started, you really need to have answered these questions. I'll give you an example from our case. In our hiring platform, we have something called a custom scorecard, which allows you to customize how we're going to assess a certain candidate.
Speaker 2:Well, before you go diving in to use that, you'd better have a pretty good idea of how you want to structure it. What are the competencies that I want to look for? How am I going to grade those? Is that consistent? It doesn't matter what the tool does if I haven't previously answered those questions. And if I don't answer those questions and just dive right in, then the results I'm going to get aren't what's advertised, and sometimes the reaction is, ah, it's too much work. But from that perspective, is that the customer's fault or ours? Okay, well, how do I baby-step you there so that you can do it in increments and pieces? Can I give you a template? Can I give you a framework? Can I give you something that makes it easy for you to get started, so that you're off to the races?
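As an illustration of the kind of starter template that could get someone off to the races, here is a hypothetical scorecard expressed as plain data; the competencies, weights, and rating scale are placeholders to edit per role, not SparkHire's actual scorecard format.

```python
# Hypothetical starter template for a candidate scorecard. The competencies,
# weights, and rating scale are illustrative placeholders to be edited per role;
# this is not SparkHire's actual scorecard format.
STARTER_SCORECARD = {
    "role": "Account Executive",          # swap in the role being hired for
    "rating_scale": {"min": 1, "max": 5}, # keep one consistent scale across raters
    "competencies": [
        {"name": "Discovery skills",        "weight": 0.30,
         "prompt": "Did they ask layered questions about the problem?"},
        {"name": "Communication",           "weight": 0.25,
         "prompt": "Clear, structured answers under time pressure?"},
        {"name": "Coachability",            "weight": 0.25,
         "prompt": "How did they respond to feedback during the interview?"},
        {"name": "Role-specific knowledge", "weight": 0.20,
         "prompt": "Familiarity with the tools and motions this role requires."},
    ],
}

def weighted_score(ratings: dict[str, int], scorecard: dict = STARTER_SCORECARD) -> float:
    """Combine per-competency ratings into one weighted score."""
    return sum(c["weight"] * ratings[c["name"]] for c in scorecard["competencies"])

# Usage: a single interviewer's ratings on the 1-5 scale.
print(weighted_score({"Discovery skills": 4, "Communication": 3,
                      "Coachability": 5, "Role-specific knowledge": 4}))
```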
Speaker 1:That's getting to the point of: give me something usable now. Get me to something usable so I can see it, so that I can then augment it and really fit it to, say, the role I'm hiring for. But get me started, show me the light. And then, yeah, they'll have a little bit of work to do, of course, but that's a recurring theme that we keep hearing, right? You've got to have a process.
Speaker 2:Yeah, indeed. All right. Well, that's the beginning of how you can get the success back in customer success. We're going to follow up on this theme and really dive in, because I think it is the most interesting place where you can see AI have a very real impact in the near term, and so we're going to be spending a little bit more time there. But in the meantime, go be a pro. Bye. Thank you for tuning in to GTM Pro, where you become the pro. We're here to foster your growth as a revenue leader, offering the insights you need to thrive. For further guidance, visit gtmproco and continue your path to becoming board-ready with us. Share this journey, subscribe, engage, and elevate your go-to-market skills. Until next time, go be a pro.