In this article, I'll look at how onboarding and data-driven onboarding, in particular, can help organizations drive customer retention and revenue.

I'll discuss what customer onboarding is, what it should be, the interlocking components of onboarding, and how a data-driven approach can lead to better outcomes. I'll also dispute the idea that onboarding is a single, discrete event, and finish by drawing the line to retention and revenue.

My name’s Dominic Constand, I'm the Vice President of Customer Experience at ZoomInfo, and in this article, I’ll discuss onboarding for retention and revenue.

About ZoomInfo

We believe data forms the foundation of all go-to-market efforts: the insights, the engagement, and the intelligence. Never more acutely so than in this period of accelerated digital transformation.

With our platform, we help our customers discover their ideal buyer with real-time intelligence, insight-based targeting, and prioritized outreach. We then fuel that outreach with automated data enrichment, activity reporting, and integration with CRM systems and emails.

Finally, we help orchestrate their go-to-market motion with AI intelligence, targeted online advertising, and custom API solutions.

To provide a bit of context about our CX or our broad customer experience org, it's made up of four sub-business units:

  • Customer success,
  • Customer support,
  • Customer delivery, and
  • Customer onboarding and learning.

That represents around 250 employees under the customer experience umbrella, whose functional purview falls into one of those four sub-business units.

The agenda

With all that said, in this article, I'll look at how onboarding and data-driven onboarding, in particular, can help organizations drive customer retention and revenue.

I'll talk about:

  • What is customer onboarding and what is it not?
  • What customer onboarding should represent for both the customer and your business,
  • The interlocking relationship between each of the components within a holistic onboarding motion,
  • How taking a data-driven approach to each phase of onboarding can help you measure performance gaps and set the foundation for a better sense of outcome attribution.
  • Dispelling the notion that onboarding is a singular, discrete event. And finally,
  • Trying to tie these elements together to draw a more solid line from the onboarding experience to retention and revenue.

What is customer onboarding?

I believe it's easier to get at what customer onboarding should be by identifying frankly, what it shouldn't be. The sophistication and comprehensiveness of each organization's onboarding motion exists somewhere on a spectrum.

Where your organization falls on that spectrum is largely predicated on a number of factors, ranging from the size of your team and the product you're working on to the customer base you serve, your tech stack, and any other resources you have at your disposal or can bring to bear.

I don't think it's correct to say that there is a definitive right place to be on this spectrum. That said, I do feel confident in the assertion that high-quality, high-impact onboarding is probably neither of the two extremes that you can see in the image.

Balance of two extremes

On one side, you have the notion that the customer is sold on a Monday, we groom them on a Tuesday, they're logging in and clicking around on a Wednesday, and are thus onboarded.

While these are certainly milestones in the broader onboarding motion, they are not tantamount to holistic onboarding.

Conversely, onboarding isn't the opposite side of that spectrum, wherein there is this unbridled data dump of all the tools and features within the product and service suite.

Ultimately, you want to use this valuable time to help the customer zero in and get really fine focus on the value use cases and how you're going to bring them to bear, as opposed to losing these precious nuggets of information amidst the deluge of content you're going to throw at them.

Those are the things it isn't.

What customer onboarding should be

Now that I've talked about what it isn't, I want to talk about what it should be, not in terms of tactical activity, but in terms of what we should hope to garner from this vital stage in the customer journey.

The customer's first experience of value

It's really their first interaction with your brand, your people, your product, your processes, and you hope all of these elements bring something positive and benevolent to their experience. But nonetheless, it is that first experience and it's going to set the tone. I'm going to talk about that a little bit later.

A time to understand your customer

There is a wealth of knowledge that exists in the minds and systems of your sales engineers and your new business AE folk, and you're going to learn a lot from your customer as well. It's a time to understand the customer, starting from a firmographic perspective:

  • Who are they?
  • Where do they play?
  • How big are they?
  • What industry are they in?

It's a time to understand them from a technographic perspective:

  • What does their tech stack look like?
  • What are the integration points?
  • How does this fit into the ecosystem?

And also understand:

  • What are their goals?
  • Why are they talking to you in the first place?

This is the time for all of these typical discovery-phase questions.

A time to set clear expectations

So often I see the notion of customer-centricity being conflated with just saying yes to the customer all the time. It's not the same thing. What I really advocate for, in this period of setting clear expectations, is a consultative approach.

At the end of the day, you want to define the benchmarks that mean this process is going well for them: what can they expect, whether it be timelines or the deliverables themselves, as they move through this process?

And what can they hope to expect once they're fully through the onboarding motion? I always define disappointment as the delta between what you expect to get and what you did get. If what you did get is less than what you expected that's disappointment.

By setting clear expectations, being consultative, and being an advisor you mitigate the gap that may exist there and ultimately you're going to have a much more positive outcome.

A time to align with the customer's goals and metrics to ensure they see success

This is a super obvious one, but it's very easy to make assumptions, and I'm sure we're all guilty of having done so. You want to make sure you're on the same page, so don't assume it. Make sure you understand what their goals are, and understand there might be different goals depending on whether you're talking to the functional practitioner, the executive practitioner, or the financial buyer. Understand which parts of the organization care about which things, and how they're going to measure those things.

Sometimes, again, be consultative. We have clients who'll ask, "What do your best most successful clients do? How do they use your product? How do they know it's working? What do they measure?"

Again, don't be afraid to take a consultative approach but regardless, be aligned.

A time to empower the customer so they can take ownership right away

In previous careers and lives, I've seen or been involved in implementations where the customer buys a tool or a platform, expects a team to show up on their doorstep, and then, through technological wizardry, to have a system that is perfectly configured and well-administered, with beautifully clean data and empowered users and no user error.

This is an absolute fantasy; it never works out that way. In my experience, the most powerful, compelling onboardings and the swiftest times to value are in situations where the customer has been really empowered.

Empowerment stems from accountability. It's a function of the right expectations, and of the customer being able to take ownership and feel like this is their tool, and that they can go forward and be successful with it, rather than waiting passively for the tool to do something to their business.

Any way you can empower the customer to take ownership is important. There are lots of ways to think about it, but it comes down to empowering the customer to take early ownership and feel that sense of accountability for their success with the product.

To recap: I've covered what onboarding isn't, and I've highlighted, in terms of value and purpose, what onboarding could and should be. Now I'm going to take a quick look at a fairly archetypal onboarding motion and its associated components.

Onboarding components

Obviously, there can be and will be different permutations of this but the themes and the flow should be fairly ubiquitous.

Before I dive in, I'm going to shift over to a quick analogy I like to use in this context: I often think about the customer journey at large as being analogous to a traveler's stay at a hotel.

Onboarding analogy: hotel stay

This is probably a metaphor or an analogy that you've heard but it's this notion that when you go to a hotel, the hotel spends a lot of money on the lobby because it's that first visual experience, it's that first sensory experience.

They want it to wow you, or to be any number of other positive adjectives they're going for. There's money spent there, and then there's money and time spent on making sure the check-in staff and front desk staff are really high quality and represent the brand well.

They want staff to be really customer-centric, attentive, and informed, and able to provide a high-quality, standardized experience that represents the brand, whatever that is - the way they greet you, the way they address you, the way they dress, the welcome gifts. All these things fold in, and hotels spend a lot of time and money on really getting this right.

Good versus bad

Those of us that have traveled for work or pleasure know what a good hotel check-in feels like; you get there, the line's quick, they have your reservation, everything's ready to go and before you know it, you're in the elevator with your keys, you've got into your room, and it felt seamless, your bag's off your shoulders and you feel great.

Conversely, you know what the crummy ones feel like: the line was really long, it felt like they were understaffed, and when you got to the front of the line, the person wasn't very friendly and it took them a while to find your reservation. Maybe there wasn't a room available when you got there. Some of your preferences, if you're a rewards member, weren't properly saved, so for some reason they've put you in a room that's directly next to the fire alarm, right in front of the elevator, or at the back of the hotel by a dumpster.

This is enervating and when you get to your room you're miffed. It doesn't matter how nice the bed is, how big the television is, or whether the shower has the fancy soaps because you're bummed.

The hotel from that moment is playing catch up, especially because there aren't that many human interactions, at least in my experience, when you stay at a hotel. This is one of the few, that check-in moment, and it's the first one so it's really important.

I think that moment of hotel check-in is very similar to onboarding. It's that first moment of truth. You don't have to be perfect the whole time you just have to know when to be perfect and this is one of those moments.

Now I'll talk high level about some of these important things to think about during this stage.


Maintain momentum from purchase

Handoff is about maintaining momentum from the purchase: strike while the iron is hot, time kills all deals, and all those other fun catchphrases apply.

Maintain the momentum - the customer's excited, they've got buy-in. Maintain momentum on your side too, internally - all of that important information that exists in the sales engineer's head or the AE's head, you want to get it out and onto a piece of paper or into your system.

Wherever or however you facilitate your hand-off, you want that knowledge transfer to happen and happen quickly because it's all fresh.

Ensure internal alignment of goals

The other piece is ensuring internal alignment of goals. The sales team has all this knowledge about the customer - the nuances, the personalities, the advocates, the detractors, the EMPOC - and they might understand all sorts of other important things that will help you drive a positive kick-off and onboarding.

Also internal stuff too like resourcing:

  • How does this fit?
  • Is the training queue backed up?
  • Is this a really complex situation where we need to get more scope analysis?

All these sorts of things you do in the handoff because that great handoff is going to drive a really awesome kickoff.


If you have all that information, you show up to kickoff informed, and in a way that makes the customer feel this is not merely a repetition of everything they've already said.

Instead, they're moving forward. This is forward momentum: our communication and procedural rigor are airtight, we've lost nothing you said in translation, and we're here to reiterate the things we've heard you say, reconfirm them, and move this thing forward.

A strong handoff results in a great kickoff. A great handoff and kickoff drive really effective training.


Training is a really polarizing thing. Sometimes customers really hate it; sometimes they really love it.

Be targeted

The net is that powerful training is targeted, incisive, and appropriate for who the attendee is. Whether you format your training as a webinar, do group sessions or private trainings, or even make customers pay for it, it's how you structure it that matters - and good training is good training.

It helps the user take exactly what they're learning in your classroom, and apply it to what they need to do at their desk, and also helps them build that foundational knowledge that's going to help them go further with the platform than maybe they originally even thought.

You're sowing the seeds for more questions around deeper usage and broader usage, you're priming the situation for a really positive growth conversation. At the very least, you're making sure they're not just going to log in once and then get bored and leave.

Ultimately, you're going to set yourself up better for a stronger renewal situation.


With all of those things, that's going to drive your implementation.

  • Are you going to go soft launch?
  • Big bang?
  • Are you going to deploy to certain cohorts or groups?
  • Are there phase one items that are going to get delivered?
  • Is there phase two that comes after more training or any custom work?

All of these things help you draw up the exact deployment strategy. None of them are mutually exclusive, independent events; they're all cumulative.

To use a sporting analogy, you can't win the US Open on day one, but you can certainly lose the US Open on day one. It's similar here: you can't complete the implementation just by getting a good handoff. But you can certainly blow up the whole thing by having a bad handoff and a bad kick-off.

It's cumulative and each thing feeds into the success and effectiveness of the next stage.

Onboarding should be data-driven

Now I'm going to talk about where the rubber meets the road: where you evaluate the effectiveness of your onboarding and, through that quantitative discipline, start to harden the line of attribution between onboarding activity and retention and revenue outcomes.

You can see I've got this pyramid, you can also think about it in terms of a funnel - inverting it and having it the other way. Any cone-shaped object is going to do the trick.

Core metrics

At the base of your pyramid, you've got core metrics - I'm using the example of training here for the sake of simplicity. You're collecting data around how many accounts, and how many users at those accounts, are trained. It's fairly simple, it's binary - trained or not trained.

That's your base level of data: of all the people who signed on, how many did I train?

Satisfaction metrics

The next level up is the satisfaction metrics: CSAT, NPS, survey data - smile sheets, basically. The net here is you're asking: of all the people that did get trained, what did they think of the training? Did they think it was effective?

Engagement metrics

  • How many people engaged?
  • How often or at what intervals do they engage with your content?

Not your platform, your tool, but your content.

  • How many of them took the next level training, the role-specific training? If you offer something like that.
  • How many of them watched the loom video?
  • How many of them went to your online community?
  • How many of them checked out your knowledge center?
  • How many of them did your certification program if you have it?

All of these things talk about that engagement. You're culling down your funnel and you're measuring the fallout or the persistence of success through each of these phases.
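To make that fall-off concrete, here's a minimal sketch of a stage-by-stage funnel report. The stage names and counts are entirely hypothetical - they're illustrative placeholders, not ZoomInfo figures:

```python
# Hypothetical stage counts for one onboarding cohort -- illustrative only.
stages = [
    ("licensed users", 1000),        # core: everyone who signed on
    ("trained", 720),                # core: completed training
    ("satisfied (CSAT >= 4)", 610),  # satisfaction: rated training positively
    ("engaged with content", 340),   # engagement: community, videos, certs
]

def funnel_report(stages):
    """Return (stage, count, conversion-from-previous-stage) rows."""
    rows = []
    prev = None
    for name, count in stages:
        conv = count / prev if prev else 1.0
        rows.append((name, count, round(conv, 2)))
        prev = count
    return rows

for name, count, conv in funnel_report(stages):
    print(f"{name:24s} {count:5d}  ({conv:.0%} of previous stage)")
```

The point of reporting conversion from the previous stage, rather than from the top, is that it shows you exactly which phase is leaking: a big drop between "satisfied" and "engaged," for instance, points at a content or follow-up problem rather than a training problem.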

Usage metrics

This is the one that everyone knows about and there are a bazillion acronyms and metrics on this one. Everyone thinks about this and that's important, usage is important.

But the best one is impact metrics.

Impact metrics

This is the one that says:

  • Did they use the product how you taught them?
  • Did they spend less time trying to figure out something that improved the overall way they do work?

That's hard. This is hard to think about.

Impact metrics analogy: cell phone

The analogy I like here is a cell phone example. Let's say I'm looking for a tool that helps me get up on time in the morning, an alarm. I can use my phone, go to the clock tab, go to the alarm thing and enable the alarm. That's a manual process and it works.

It will wake me up at 7am. That's usage - I'm definitely using it, it's great, and I do it every day. But my Motorola Razr phone from 2001 also did that, so is that really my impact metric? It's probably not.

My impact metric is probably "is it changing the way Dominic's waking up in the morning?" No, because I'm setting the same alarm as I did on my flip phone. So Apple offers things: they say you could use Siri, because then you can have her remind you to set your alarm, or you could just tell her to set an alarm.

There's the shortcut feature that will set my alarm for me depending on the day of the week. There's the nighttime feature that tells me what time I'm supposed to go to bed based on what time I'm getting up, and all sorts of other fancy things that automate this process and make it more seamless, less burdensome, less manual, and less prone to failure.

Those are impact metrics because it's changing the way I'm thinking about my bedtime routine and how I'm waking up in the morning. It's not just the alarm.

Example usage metrics

Impact metrics are tough so what might some of those be?

For us, we have this concept of key usage beats. You might think, okay, using the platform is one thing, but can you actually identify what sort of usage is tied to, and most prevalent amongst, churn cohorts or down-sell cohorts?

Versus what type of usage is most prevalent and associated with growth cohorts or renewal cohorts?

If you can say I noticed the users that do the most of this tend to be the ones with the positive outcomes, then you've got a key beat, that can be an impact metric.

  • Did they just use the basic search?
  • Or did they use the fancy advanced search?
  • Did they just get the regular updates that they've got to go and get themselves?
  • Or did they get the updates that push automatically to their email in real-time?

Thinking about impact metrics beyond just basic usage is really the key. They're hard to get to, but you've got to measure every step of this so you can work your way up to isolating and distilling the impact metrics.
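One way to hunt for key usage beats is to compare how prevalent each behavior is among churned accounts versus renewed ones. This toy sketch uses invented account records and feature names (the `basic_search` / `advanced_search` / `auto_alerts` labels are hypothetical, not a real product schema):

```python
from collections import Counter

# Toy account records: the features each account used, and its outcome.
accounts = [
    {"features": {"basic_search"}, "outcome": "churned"},
    {"features": {"basic_search", "advanced_search"}, "outcome": "renewed"},
    {"features": {"basic_search", "advanced_search", "auto_alerts"}, "outcome": "renewed"},
    {"features": {"basic_search"}, "outcome": "churned"},
    {"features": {"advanced_search", "auto_alerts"}, "outcome": "renewed"},
]

def prevalence_by_outcome(accounts, feature):
    """Share of accounts in each outcome group that used `feature`."""
    totals, hits = Counter(), Counter()
    for acct in accounts:
        totals[acct["outcome"]] += 1
        if feature in acct["features"]:
            hits[acct["outcome"]] += 1
    return {outcome: hits[outcome] / totals[outcome] for outcome in totals}

# A feature far more prevalent among renewals than churns is a
# candidate key usage beat -- and therefore a candidate impact metric.
print(prevalence_by_outcome(accounts, "advanced_search"))
```

In this made-up data, every renewed account used advanced search and no churned account did, which is exactly the kind of gap that separates a key usage beat from generic login counts.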

Onboarding is a process and not an event

You've got the nice forming, storming, norming, and performing wheel here which I like.

The net from this is change is the only constant.

Business goals change, users change, executive sponsors change

Business goals will change, users will change, other things will change. You may have formed, stormed, normed, and performed really well the first time, but then maybe their business changes and the way they need to use your system might change with it.

You need to go through that re-forming, re-storming, re-norming, and re-performing. You're iterating. Your user base might change, you might get a new division come in, or you might get a bunch of extra seats in an existing division.

Regardless, you're constantly in onboarding mode.

Usage & adoption rates may fall

Your usage or adoption might rise or fall in different areas, which might prompt you to retrain or reoptimize or re-educate. Maybe your software or your product is evolving and you need to retrain on that.

The spirit here is to always be in onboarding mode, always looking to iterate. Obviously, at different points in the journey, some of this is CSM-owned as well.

But the net is that at ZoomInfo, our onboarding team, as part of the broader customer experience function, does a lot of stuff mid-customer journey, as well.

Drawing the line to retention and revenue

A lot of people talk about time to value. Time to value is not simply the date the contract was signed minus the date of first login.

There is a difference between first experience of value and total expected value.
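In code terms, that means tracking time to value per milestone rather than as a single number. A minimal sketch, with invented dates and milestone names (nothing here reflects real ZoomInfo milestones):

```python
from datetime import date

contract_signed = date(2024, 1, 2)

# Hypothetical value milestones for one account.
# First login is only the first of many -- total expected value comes later.
milestones = {
    "first_login": date(2024, 1, 4),
    "first_search_run": date(2024, 1, 9),
    "crm_integration_live": date(2024, 2, 1),
    "first_automated_alert": date(2024, 2, 15),
}

def days_to_value(signed, milestones):
    """Days from contract signature to each value milestone."""
    return {name: (when - signed).days for name, when in milestones.items()}

for name, days in days_to_value(contract_signed, milestones).items():
    print(f"{name:24s} day {days}")
```

Measured this way, "first experience of value" is just the earliest milestone; the later milestones are what trace the account's trajectory toward total expected value.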

Difference between first experience of value and total expected value

To go back to my cell phone example, I buy my phone, I get home, I open the box, I rip off the wrapping, I grab the instruction manual (and I throw that in the bin, because who's got time to look at that?), I'm immediately turning on my phone and doing things.

I'm probably texting within five minutes, I'm going to make a phone call in a few minutes, I'll probably send a work email pretty quickly. My first experience of value is probably sub 10 minutes for this phone.

But is that my total expected value from this phone? Did I pay all that money for that? The answer is no.

My total expected value is I expect this thing to not only do all the telephonic stuff, but I want it to do my calendar, I want it to do my social media, I would like it to manage my bedtime and wake time.

A level beyond that is all of the integration, I want it to connect with and integrate with my iCloud, iPhotos, iTunes, and all the other things beginning with ‘i’, sync with my watch and my computer, etc. All of these things are part of my total expected value.

Apple doesn't stop recommending things to me through this phone or through stuff online once they've seen I made a phone call. They realize there's so much more to this that's going to make me buy another iPhone.

Value should be measured at each stage of the engagement/adoption funnel

We don't want to do that in our onboarding process. We don't want to just say, "Great, you logged in, you clicked on a thing in week one, you're great." That's not the case; that's not the total expected value.

What you want to do is get them properly pushed off. You're not just trying to push the boat into the water; you're trying to really shove the boat off with enough positive inertia, and enough empowerment in its user base, that it's on a better trajectory toward that total expected value.

And you measure that each step.

You want to be able to say you measured every step: we did the training, they liked the training, they went and looked at some other stuff, and then they logged in. And they didn't just log in - they did some actually important stuff that really adds value.

That's how to think about it because that's how you're going to drive that retention, mitigate the churn, and ultimately end up with a better revenue outcome.

Thank you.