Active vs. Passive: Wording Matters


For many, passive attitudes are a heck of a lot easier. Avoid stirring the pot. Refrain from creating conflict. Allow the other person to make up their own mind on their own time. 

Confusion is created. Goals become unclear. In fact, these sentences are written in the passive voice!

We sometimes don’t even realize when we write in the passive voice — it can feel so natural. The passive voice is often misleading. One way we see that play out is when we’re indirectly asking for something — a call to action. Passivity at the root lets poor effects flourish.

If you build it, they will come. 

We all dream of easy solutions; we don’t aspire to hard work. What if what we think of as a straightforward request is actually a lax, unconfident ask? 

  • “Buy now”
  • “Learn more”

There isn’t anything wrong with learning about a company. However, does our email drip campaign, website, and other collateral unintentionally fail to convey confidence that you’re the right solution? Alternatively, does passive selling result in passive purchasing? Put another way: to an audience reading your content, what’s the difference between shying away from what you offer (the “buy now” option) and not being sure how to phrase what you offer? 

As a seller, if I’m confused about what I offer, my wording will be vague — not by choice, but because my ideas lack refinement — and customers won’t know how to proceed. Effectively, they move on because of my vagueness.

So, two things to consider when you think you might want to be more direct. Wait! That’s weak and passive language there. “Consider.” “Might want.” 

Employ these messaging core principles:

  1. Know thyself. What do you offer? Why would anyone want to talk with you?
  2. Be direct. Do not gamble away opportunities. Employ active language.

We’ve used Marketing and Sales examples as backdrops, but this can easily apply to any arena: Customer Service teams, Operations, Leadership, Customer Success, Customer Onboarding, pizza delivery, grocery store clerks, and on. 

The point is this: know what you want and communicate it. Many people want to help. But how can they help if you don’t know what you want and don’t tell them?

Distinguishing Support, Customer Success, and Customer Onboarding

This article is a work in progress. I was sharing this with several people so I figured I might as well make it public.

Customer Support, Customer Success, Customer Onboarding, and Customer Experience are often conflated terms. For one company, it’s the same thing. In another company, it’s different. I may use “Customer Success,” and you may use “Customer Experience.” Our industry will continue to refine the language we use to describe each role, and to help, here are a few ways of thinking about it. 

Note: I’m excluding Customer Experience for two reasons: 1) brevity and 2) because it can be quite complex. Is it Product-focused? Does it oversee Support and Customer Success? Other?

No matter what, here are guiding principles that all customer-facing teams should be able to agree on. Like any principle, one held too tightly and not held in tension by something else can go too far. A maniacal focus on the customer experience sounds good, but pursued to its logical end it is not a good plan.

Guiding Principles

  1. Best for the customer, not for us.
  2. Leverage your expertise, not weakness.
  3. Everyone is busy. Don’t adopt a victim mentality.
  4. Teamwork.

Examples of Guiding Principles

  1. Best for the customer: if you’re on the phone/email with the customer and know the answer, help them.
  2. Leverage your expertise: each of us develops specializations, and we should play to our strengths. If my weakness is sales, I should support others in selling. If my strength is technical, I should use it to help others.
  3. Everyone is busy: helping customers is our job, and yet many things can get in the way. It’s common in fast-paced environments to adopt a victim mentality; resist it. At the same time, voice your concerns—don’t bottle them up!
  4. Teamwork: let’s honor our teammates, help them out, and make sure we ask for help.

Departmental keywords

These are examples of descriptive (not prescriptive) terms for each team. There will — and should be! — overlap.

  Support                               | Onboarding               | Success
  Reactive                              | Proactive/Reactive       | Proactive
  Transactional                         | Timeline: first X days   | No endpoint
  Troubleshooting                       | Adoption                 | Adoption/Renewal
  Product expertise                     | Implementation expertise | Renewal expertise
  Break/fix                             | Engage users             | Best practices
  Cases (problem & incident management) | Develop habits           | Projects
  How-to                                | How-to                   | How-to

Team Objectives

Support

  • Reactive. Respond to customers (live chat, email, phone).
  • Product experts.
  • Problem/Incident management (manage bugs and coordinate with engineering).

Onboarding

  • Proactive/reactive. Reach out and respond to customer needs (email and phone).
  • Present high-level business case of the product to new customers. 
  • Generate revenue. Through hitting customer success milestones, generate referrals and upsells.
  • Share resources, best practices, tips with customers.
  • Not product experts. 

Customer Success

  • Long term care team for customers exceeding $X ARR.
  • Proactive. Reach out before the customer is thinking of it.
  • Renewal manager. They are the primary owners of renewals (this is sometimes its own team).
  • Upsells. After the customer has surpassed onboarding, the CS team will take over.

Note: sometimes it makes more sense and is more efficient to combine Onboarding and Customer Success. As they say, mileage may vary.

Product Knowledge Goals

While each industry, company, and manager may see it differently, here’s a framework for thinking about the level of knowledge for a CSM (note: I fully expect plenty of disagreement on the percentages applied to each category, on whether there should be more or fewer categories, and on the definitions).

Let’s assume this is about Acme, Inc., which sells an AI tool to help inbound Support reps understand if a customer is likely to be agreeable, scream/cuss at them, cancel their account, etc.

Customer Success Product Knowledge Goals

  • Know 100% of the internal processes
    • Examples: how to sell, upsell, cancel, downgrade, pass customers to appropriate sales reps
  • Know 100% of product functionality
    • Examples: How the product functions, navigation, best practices, selling points
    • Metric: should know almost everything in the Help Center, but NOT necessarily [internal wiki] or institutional knowledge
  • Know 20% of product methodology
    • Examples: fundamental methodology, assumptions, calculations
    • Does NOT include: in-depth knowledge on the algorithms at play
  • Know 40% of industry knowledge
    • Examples: understand broad terminology
      • The typical tooling of a Support rep, their average day, and the structure and handoffs between teams
    • Does NOT include: nitty-gritty detail about the Support industry (helpful, but not required — can be learned on the job)
      • Specific tooling in, say, Zendesk, or how the phone system connects to XYZ

The lines between these teams are blurred and will continue to be so. For instance, when a customer has a need, who helps them? Below are general guidelines. If a customer needs help, we should help them.

Straightforwardness Delivers Results

Simple, straightforward language. 

That’s the most powerful, effective way to communicate value propositions to prospects. Anything more can get confused or lost in translation.

Taking a cue from the FinTech industry, let’s examine the life of a financial advisor for a moment. Clarity helps thousands of advisors communicate their proposal. Two essential means to differentiate your firm are: 

  1. Simplify your message
  2. Explain your (often niche) value proposition without gimmicks

As an advisor, having the knowledge, skills, and capabilities to tailor a financial plan, establish and follow a rigorous investment strategy, and maintain a solid set of books and records is vital — no doubt about that. However, on a first date would we ever talk about having kids, where we want to be buried, and where we’d like to spend our retirement together? Probably not — unless you want to scare off your date! In other words, you kept your dating message simple and didn’t go into detail — yet. The same is true for your business: find a way to simplify your message. Life (and working relationships) is a journey; it will take some time to get to a level of comfort. 

Many websites and proposals lay out an unfathomable amount of information. How often have investors been scared off by 100 pages of “light reading,” or not clicked “contact us!” on a website because they were overwhelmed with options? Boil it down to one or two takeaways for the investor. It’s not about convincing them of all the ways you can serve them; it’s about giving them something short, something memorable.  

As a thought exercise, pick a field you’re not very familiar with—dentists, contractors, plumbers, or a family practice doctor. Look up local websites and act as if you were vetting them because you were interested in switching. Is their message clear? Do you know what they do or their specialization? Do they give you a convincing reason for why they are the answer? If you are convinced to get in touch, do they have a smooth, quick way to get hold of them without losing you as a prospect? Are you looking to solve 5-10 things all at once, or just get started with the first thing? 

In the same way, demonstrate a simple, easy-to-grasp message to explain your firm in light of your niche. Many investors want your services, but they want to vet their choices. A significant component is ensuring you’re the clearest — not the loudest or flashiest. They need you, not a fancy website. They may want to know the technical components later in your relationship. Right now, they want clarity instead of enough information to make their eyes glaze over, just like our dating analogy up above. A few concrete examples of how you can improve your presentations:

  • Common ground. Start by focusing on what matters to your client. Ever try to win an argument by starting from the point of disagreement? Instead, find your common ground. Discuss why you agree there. Then realize you agree on 90%, and then with that as a reference point, discuss the remaining 10%.
  • Brevity is king. Shorten your client reports. Do they need to be that long? What can you cut out that isn’t absolutely necessary? Alternatively, what could you cut out and merely mention: “I reviewed XYZ too, but I just wanted to focus on this for now. If you’re interested in XYZ, just let me know a good time to discuss it.”
  • Client-centricity. Similar to the reports, prepare your meetings around what matters to your client. It may make sense to prepare ten different analyses, but if your client isn’t ready for them, slow down. When I was a practicing CPA, I once gave a client a “management recommendations” list encompassing well over 15 items they needed to “fix.” While I naively thought this was perfectly helpful, it was overwhelming and defeating to them. All they saw was how terrible I thought their processes were. I could have proposed a solution in a more straightforward, step-by-step manner and reached the same goal of helping them become a better version of themselves.

 As you’re thinking about who you are and how to serve your clients and future clients, remember to keep it simple. They should walk away with that single, memorable line. In a word: clarity.

Need an elevator pitch? Try one of these: 

  • A lot of people know how much they have, but they don’t know if that is enough. I help investors learn their chances for success, discover any potholes in their future, and find ways over, under, around, and through.
  • Many good people want to do the right thing and provide for their future self, but they don’t know how. They want to avoid the costly mistakes they read about in the paper. I help guide people to make the right decisions so they get that future they want.

Thinking Evergreen

Earlier this year I read a book that was written in 1989, 30 years ago.

It’s made me realize how much of my thinking, writing, and philosophy is developed for the here and now. It’ll only be relevant for a couple of years. How much knowledge am I amassing that’s irrelevant in three weeks?

On one hand, we’re on the cutting edge of a lot of cool things in technology such as AI, machine learning, autonomous cars, and 5G, but it’s also a reminder that a lot of what I do and how I think will be obsolete pretty soon. Remember when taking photos with your phone was radically new? Remember when you first used a computer NOT connected by multiple cords — this thing called wireless?

Compare that to the book I’m starting — written 30 years ago — which is still extremely relevant today. In fact, you could call it evergreen. It was valuable 30 years ago, it is valuable today, and it will be valuable 30 years from now. 

What are you doing today that will be valuable 30 years from now?

What are the contributions that I want to make? Sure, I want to make contributions that affect the here and now. However, I also want to make long-lasting contributions. Leave a legacy, as they call it. People want something that will last far beyond their own short lives. They realize life is short and want to be remembered. And I guess, in a way, I want the same thing. I want to create something that has a long half-life. I don’t need to become Albert Einstein, Mohammed, Buddha, Alexander the Great…but I want my work to be meaningful. 

I believe part of this is that when I think about the “here and now,” I should not only find the specific momentary relevance, but also define the underlying principles. An old Jewish proverb says there’s nothing new under the sun. On one hand, that’s honestly a little depressing, because it means there’s really nothing new in life (sure, we can make discoveries, like whether there was water on Mars or a different type of frog in the Amazon rain forest, but it’s not like I can reinvent the second law of thermodynamics; that just is). But there are certain underlying principles that can be rediscovered, explored in new ways, or taught to the next generation. Too often we put aside what is true and established for what is new and fleeting.

So as the memories of this book fade but the truths remain, I want to reflect on the underlying principles I’m learning in my day-to-day life and how they relate to other fields, industries, and the future. Some things are evergreen: they were with your grandparents, your parents, and you, and they will be around for a long time. Don’t you want to know those truths too?

Operationalizing Customer Feedback (NPS/CES/CSAT)

Tons of articles argue whether to use a customer feedback tool or not. There are valid (and not so valid) points on all sides. This article is written assuming you have or are planning to institute some sort of customer feedback. If you’re already an expert but want to see some ways of incorporating surveys, skip down.

Let’s look at three types of customer feedback: when and where to implement each one, some recommended vendors, and the biggest question: “Okay, so I’ve got the data. Now do I follow up with every customer? How?” Let’s explore different considerations.

Your company size

As a small company, it’s easy to maintain a strong pulse on customers. As a company scales, that distance grows. Companies must institutionalize processes and procedures to minimize the effects of elongating feedback loops. E.g., compare the feedback loop to Product at a small, 10-person company versus a large, publicly traded company. Early on you can get away without solid systems, but later on you’ll struggle. 

When to set it up

It’s easiest to start sooner. The work required is less, and you can usually get on an affordable plan with a vendor. Secondly, starting today means you’ll have tons of data one, two, and five years from now. You’ll want that data and will kick yourself later if you don’t have it — true whether you’re bootstrapped or are (or will be) VC-backed. Your company’s value goes up.

Customer size

If your customers are extremely high touch, surveys may not be necessary because you can simply talk to them. 

C’mon, if I, a CSM, only manage 3-10 accounts and I’m talking with each customer every week, they’re going to tell me their issues and joys. Okay, it may not be totally necessary for extremely high touch customers.

That may be true, but also consider the following:

  1. Some users want to reflect on the questions, rather than give a knee-jerk reaction over the phone.
  2. Many customers would rather write it in a survey than say it face-to-face.
  3. Survey results come directly from the customer and are not the CSM or account manager’s interpretation.
  4. Even extremely high touch customers may still have tens, hundreds, or thousands of users. You can’t talk to all of them, only their top execs. This does not apply to all high touch customers, though.
  5. The survey results can be a catalyst for your next customer business/executive review (“your users told us they were ________ satisfied with Acme, Inc.’s product and…”).
  6. Leadership hears customer feedback when it comes directly from the customer, rather than through an intermediary.

Should I use in-app, email, phone, SMS, or..?

Listen to your customers. How do your customers want to respond? For most, email will be best, but that depends on a few factors.

  1. If you’re running QBRs with your customers, you’re already asking questions for insight (“How are you using the product? Are you experiencing any issues? …”). Better to focus on that over the phone than diverge to NPS questions.
  2. If you use email, SMS, or in-app, then those can serve as talking points in your QBR. That way you don’t need to react to their feedback while on the phone or schedule a separate follow-up. Chronologically, they fill out the survey in one month and you address the feedback in the next QBR.
  3. Phone interviews tend to skew higher scores because of conflict avoidance (some people don’t mind conflict, but a good percentage of people avoid it at all costs). Email, in-app, and SMS abstract away much of that conflict.
  4. If you want a realistic NPS, email is probably best. In my experience, in-app gets a higher completion rate but is more annoying to the user. Email has a lower completion rate BUT results in better qualitative answers, because respondents have time to answer, and it’s a better customer experience. And it’s less annoying.
  5. Asynchronous methods (email, for instance) allow a more thoughtful response. It helps mitigate knee-jerk responses that you’d see in-app or over the phone.
  6. Do your customers use your product with clients? If so, in-app is terrible as it likely pops up at the worst times. 
  7. Similarly, if you expose the question in-app, is it likely to derail your customer from using your product? Or does it enhance their experience?

Intended outcomes

Here are some of the outcomes I’ve come to expect and have found to be true. I’ve notated which surveys are most likely to deliver the intended outcomes.

  • (NPS) Receive direct, product-specific feedback for the entire company to view—celebrate our victories
  • (NPS) Hear from customers who are thinking about cancelling—allowing us to reach out to them first
  • (CES) Hear from onboarding customers and intercept them if onboarding went off the rails—preventative medicine
  • (All) Understand the product’s ease of use or confusion
  • (NPS) Track the product’s performance from the customer’s point of view
  • (All) Great feedback will be delivered to Marketing for creating success stories and developing brand ambassadors
  • (CSAT + CES) Help the Support team gain qualitative feedback

Which survey(s) to use?

Each survey has its own strengths, weaknesses, and most suitable environments. It’s not one versus the others; each has its areas of expertise. For instance, I’ve employed all three at one company before. Done right, it works well and isn’t a nuisance; it’s enlightening.

CSAT

Customer Satisfaction, as in C(ustomer) Sat(isfaction). The express goal is understanding the customer’s satisfaction around a transaction. Usually that is a Support call, email, or chat. For instance, after you contact Amazon for a refund or help with a purchase, you may be asked “How would you rate your overall satisfaction with _________?” It’s best used for a specific event or activity rather than the overall product or company.

Ownership

Support should own CSAT if the surveys are specific to Support tickets. Management should track feedback scores, at least in aggregate.

Reporting

  • Feedback should be piped into Slack, DM’ed to the specific rep, or to a specific channel for managers to be aware of and then share with their teammates
  • Within the ticketing system, set up CSAT reports so each employee can measure and monitor their own scores
  • Support team trends (over weeks and months)
  • Percentage of respondents (2% is low, 10% is good, higher is better)

One of the most important — yet often overlooked — metrics is the percentage of respondents. If it’s low, that’s a terrible sign that your customers are not willing to engage. You should see flashing red sirens in that case!

NPS

Net Promoter Score, even with all its criticisms, is a helpful tool to measure a customer’s likelihood of promoting your company or product to others. Gauge the level of satisfaction a customer has with their overall experience. This is specifically not transactional, and is best abstracted away from any specific instances like the customer support example above. Here are several example questions to get an idea:

  • How likely are you to recommend [company name] to a friend or colleague?
  • How likely are you to recommend [product name] to a friend or colleague?
  • How likely are you to recommend […] to a friend or colleague who does not compete with you for business? (This is useful if you get responses like “I would never share your product with anyone! This is my competitive edge!!”)
  • How confident are you that [product] will help you reach your goals over the next [time period]? (all credit and rights go to Lincoln Murphy)

From there, it’s best used with existing customers and NOT new customers, unless you want responses like “I gave a 0 because I haven’t spent enough time in your product.” The frequency can vary, but anything from every 3-12 months per customer is fine. Keep in mind most vendors are also nagging your customers with NPS. Survey fatigue is real. 

Ownership

This can vary as I’ve seen it owned by Product, Customer Success, Marketing, and other teams. What are you hoping to achieve? 

It’s likely a joint effort, and it’s okay for one team (e.g., CS or CX) to own the number and process. Whoever owns it must be empowered to share feedback with other teams and work on cross-functional initiatives to improve those ratings, regardless of the ownership structure (e.g., Marketing owns NPS, but they still need empowerment to tell Product what customers want). The person who handles the feedback (e.g., the CSM) may not be the one who has to address it. For instance, Product feedback comes in through many channels, but rarely does it go directly to the Product team. 

Reporting

  • Compare level of engagement with scoring (is there a relationship between usage and scores?)
  • Analyze NPS by customer segment (ARR, location, company size, etc.)
  • NPS monthly trend analysis
  • NPS as a churn early warning system (again, do you find a relationship between NPS scores and churn?)
  • Percentage of respondents (how many users take time to fill out the survey? Is it low? What can you do to improve that?)

CES

Customer Effort Score is more transactionally focused, like CSAT above. It measures how easy it was for the customer to accomplish their intended goal, whether that’s a support conversation, adopting a new feature, or onboarding. For instance:

  • Support transaction: “How easy was it for you to complete your task?”
  • Adopting a new feature: “How easy was it for you to [use/adopt/incorporate] [new feature]?”
  • Onboarding: “How easy was it for you to get started with [company or product name]?”

For instance, I’ve historically used it toward the tail end of customer onboarding to measure customers’ success and delight early on. If we started off on the wrong foot, we need to resolve that ASAP, not wait months to learn about it. The relationship is new. If you’ve screwed up, apologize.

Ownership

This depends on the question. In the CES Onboarding example below, the feedback and process should be owned by the Onboarding team. If the question were product related (e.g., “How easy was it for you to adopt [product feature]?”), then it should be owned by Product. 

Reporting

  • Compare level of customer engagement with scoring (is it true that customers who use the product more tend to rate it higher?)
  • Understand CES by customer segment (ARR, company size, Onboarding Manager, etc.)
  • Track onboarding trends over time (are we trending up or down? We just shipped XYZ to help onboarding, did it make any difference?)
  • Correlate CES with the sales rep to track effective onboarding expectations
  • Percentage of respondents (how many users take time to fill out the survey? Is it low? What can you do to improve that?)

CES Onboarding Process Example

For onboarding, I’ve seen the following question be effective: “On a scale from 1-7 with 1 being terrible and 7 being exceptional, how easy was your onboarding experience?” 
If customer onboarding is expected to last, say, 60 days, send it between days 40 and 50, so the customer has a pretty good handle on their experience and your company has 10-20 days to respond. And to state the obvious: only one onboarding survey per customer. 
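
Sticking with the 60-day example, the send window can be computed mechanically. A minimal sketch, assuming the 10-20 day response buffer described above (the function name is my own):

```python
from datetime import date, timedelta

def ces_survey_window(onboarding_start, duration_days=60):
    """Return (earliest, latest) dates to send the CES onboarding survey.

    Sends 10-20 days before onboarding ends, so the customer has most
    of their experience behind them and you have time to respond.
    """
    earliest = onboarding_start + timedelta(days=duration_days - 20)
    latest = onboarding_start + timedelta(days=duration_days - 10)
    return earliest, latest

# A customer who started onboarding on Jan 1 with a 60-day plan
# should be surveyed between Feb 10 and Feb 20.
window = ces_survey_window(date(2024, 1, 1))
```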

Reporting and analysis

Reporting can be a black hole, so be careful. Sticking with the onboarding example above, here are the core relationships I focus on:

  1. What are the scores as they relate to each sales rep?
  2. What are the scores as they relate to the Customer Onboarding team? (This could be a CSM, account manager, Onboarding Manager, or entirely self-service content)
  3. What are the M-o-M trends? 
  4. What are the scores for some VIP or white glove customers or enterprises?

Now what? What do I do next?

I believe it’s best for anyone and everyone in the company to see customer feedback. It may create small issues (“what if one of my coworkers looks down on me from that interaction?”), but in the long term it builds trust. Engineers, Product, Sales, and Admin may not interact with customers; this is the lens through which they can empathize with you. It also helps illuminate issues for other teams to help with or respond to: fixing bugs, changing the sales demo, or prioritizing a feature launch. In this case, the Onboarding team cannot fix everything themselves. They rely on others. That’s the power of a “team.”

Because of that, I like the following process:

  1. Integrate the survey tool with Slack (or whatever tool you may use) for everyone in the company to see the feedback
  2. Pipe data into your CRM and/or CS platform
  3. Customer follow-through mechanism
    1. Most streamlined approach: use your CRM or CS platform to create follow-up tasks for the CSM or Onboarding Manager to follow up with the customer. This can be done with automation, such as creating a task based on the score. 
    2. Small scale: use Slack as your task list and apply a reactji whenever you respond to a customer. Note: this is not ideal, as Slack is A) not intended as a task manager and B) not intended as a task manager
  4. Pass any relevant feedback to the proper team (e.g., product feedback to the product team)
  5. Create any additional tasks in your CRM or CS platform to check in with the customer

Note: you can create an automatic email that goes out for high or low scores, but that seems silly since they’re already responding to an automatic email. Why not douse it with some humanity at this point? Plus, they may have added some qualitative feedback to which it’s best to reply with a custom email.
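
The follow-through steps above can be sketched as a simple router. Everything here (the channel name, task wording, and score thresholds) is a hypothetical illustration; a real version would sit behind your survey tool’s webhook and call your CRM’s API:

```python
def route_nps_response(score, comment=""):
    """Decide follow-up actions for a single NPS response (0-10 scale)."""
    actions = ["post to #customer-feedback"]  # step 1: whole company sees it

    if score >= 9:    # promoter: thank them, ask for a referral
        actions.append("create CRM task: send thank-you / referral ask")
    elif score >= 7:  # passive: learn what is holding them back
        actions.append("create CRM task: ask what we could do better")
    else:             # detractor: reach out before they churn
        actions.append("create CRM task: CSM follow-up call")

    if comment:       # step 4: route qualitative feedback to the right team
        actions.append("forward comment to relevant team")
    return actions
```

The standard NPS bands (9-10 promoter, 7-8 passive, 0-6 detractor) drive which task gets created; the score and comment would come from the survey tool’s payload.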

Benchmarking

Establish a loose benchmark as a place to start. Realize you may only have a month or quarter’s worth of data and there could be seasonality effects or other concerns. So hold loosely to these benchmarks. Companies and industries are different, so it may not be realistic to shoot for X score just because someone else’s company achieved that. That said, here’s an example process and presentation to share with your leadership team:


Example CES benchmark presentation (fictitious figures)

We believe healthy CES scores can help us measure how the onboarding experience is going — the effectiveness of Onboarding Managers, content, webinars, and ease of product.

For Q3 201X, we’re ready to set the CES goal that we discussed last spring.
Average CES scores:

  • Jan 1 – Mar 31: 4.5
  • Apr 1 – June 15: 5.1

Suggested Q3 goal: 5.4.

Rationale: 5.4 will still be a stretch, as getting closer and closer to 7 becomes asymptotic. Meaning, moving from 4 to 5 is much easier than 5 to 6, which is easier than 6 to 7 (averaging exactly 7 is impossible if anyone gives less than a 7, for instance). It’ll take some good work and ensuring our team is focused on delivering high value to customers.

[add in additional detail on why the goal makes sense, and how you’re going to get there. For example, you have several changes planned. This helps validate raising the bar as reasonable rather than a pipe dream]

I welcome your feedback.
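
The goal-setting logic in that presentation can be made explicit. This sketch uses the fictitious figures above; the 15%-of-remaining-headroom factor is my assumption for modeling the asymptotic difficulty of approaching 7:

```python
def average_ces(scores):
    """Mean CES on the 1-7 scale."""
    return sum(scores) / len(scores)

def suggest_goal(current_avg, ceiling=7.0, headroom_fraction=0.15):
    """Stretch goal: close an assumed 15% of the gap to the ceiling.

    The closer the average gets to 7, the smaller the suggested
    increase, mirroring the asymptotic effect described above.
    """
    return round(current_avg + headroom_fraction * (ceiling - current_avg), 1)

# With a trailing average of 5.1, the suggested goal lands near 5.4.
goal = suggest_goal(5.1)
```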


CSAT process considerations

In addition to the CES Onboarding process example above, here are several CSAT specific considerations.

  • You may or may not want to pipe all Support CSAT feedback to a company-wide channel. Negative feedback can be targeted at the Support rep. They may be new and become afraid. You may work for a large company where coworkers form their view of Support reps based only on that feedback.
  • Should the Support rep respond to the customer? Should the manager? Should they first discuss it? This is more dependent on your leadership, culture, and needs of the people. Most important is that this should reinforce trust and confidence between teammates, not tear it down.
  • It’s really important to track the percentage of respondents for CSAT. A very low number could indicate customers don’t even want to respond. 90% of the iceberg is beneath the surface; how many of your customers are withholding their issues with your company?
  • Do not annoy your customers. Experience CSAT from your customer’s standpoint. Are they getting a follow-up email after every Support interaction? That’s obnoxious. It’s better to embed the feedback request in the Support ticket, if it’s an email. Zendesk does this really well. Help Scout has it embedded well, too. Think about the experience.
  • Track CSAT as it relates to your enterprise customers: are there any warnings coming up that your CSM(s) should be aware of?

NPS process considerations

Here are NPS specific considerations:

  • Piping this data into Slack is beneficial, but you may see a string of promoters or detractors. Be sure to focus on the aggregate number and not succumb to today’s trends.
  • Keep in mind the NPS methodology is different. NPS ranges from -100 to +100. You cannot have an NPS rating of 8.2.
  • NPS fatigue is a thing. Don’t annoy your customers.
  • Diversify when NPS gets sent out. A few each day is better than surveying en masse. Getting overwhelmed with thousands of NPS responses likely negates timely follow-ups.
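
To see why an NPS of “8.2” is impossible: the score is the percentage of promoters (9-10) minus the percentage of detractors (0-6), not an average of the 0-10 ratings. A minimal sketch of the standard calculation:

```python
def nps(scores):
    """Net Promoter Score from 0-10 ratings: %promoters - %detractors."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Eight responses: four promoters (9-10), two passives (7-8),
# two detractors (0-6) -> 100 * (4 - 2) / 8 = +25.
score = nps([10, 9, 8, 7, 6, 0, 10, 9])
```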

Which customers do I choose to follow-up with?

These are only considerations; feel free to deviate. The most important customers to follow up with are your happy customers: these are the people who will speak on your behalf. Don’t you want that? If you’re short on time, read through the feedback first to decide which strategy to apply.

Promoters / high scores

Please follow up with your happy customers! It’s easy to slip into a firefighter role and try to win back every upset customer. That’s “problem-solving,” which is how most of us are wired. Instead, start with the customers who give you a great score (a high NPS, for example). Why pour all your energy into those who are likely to churn? Priority number one is your advocates, your ambassadors.

First, consider creating a task to send them a thank you note — whether that’s an email, a hand-written card, swag, or something else. 

Second, if you’re looking for customer testimonials, this is a great source. Reach out to them. Marketing can pick out customers who gave great feedback; if it’s posted to a Slack channel, it’s easy to scan.

Third, these are your customers who will most likely give referrals. Treat these customers with delight and care.

Passives / lukewarm

Don’t ignore these customers. These are the customers who had a so-so Support experience, or are a “passive” according to NPS. Some of these had an okay experience, or may be undecided. Using a political election as an analogy, these are the swing voters that you want to win over to your side.

Consider following up with them. If they didn’t already give it, ask them for feedback: “what could we do better next time?”

Learn what’s keeping them from being more engaged. Is it something with the product? Response time? Poor onboarding or training?

Detractors / low scores

These customers are a goldmine of feedback. You’ll likely encounter quite a bit of negativity (and risk getting yelled at), but many of these customers are willing to share critical feedback about your company. Sometimes it’s an edge case. Other times they’re trying to solve a problem your product was never intended to solve. Still other times they’re the canary in the coal mine, giving feedback that others aren’t likely to give.

They can also be the customers who had a bad experience, helping you uncover where your customer journey map is insufficient or has holes.

Customer segments

While not related to their “score,” also analyze feedback by customer segment: understand the themes and whether there are actions you can take. Perhaps many of your enterprise customers see your product/service as deficient in an area your SMBs don’t.

Vendor selection

Full disclosure: I’m not paid for this nor are there any rewards or gifts that I am getting from this. These are simply my experiences.

Before evaluating vendors, write down your requirements! I prefer MoSCoW Prioritization (the term is scarier than it actually is):

  • Must Have (what are the make-or-break features?)
  • Should Have (what should it have but I may be able to do without?)
  • Could Have (what would I like it to have, but would only be a nice-to-have?)
  • Won’t Have (what am I explicitly deprioritizing this time around?)

For example, here’s an excerpt of a MoSCoW Prioritization that I’ve used in the past. Add a column for each additional potential vendor.

| Category | Description | Detailed need | Priority | Vendor 1 |
| --- | --- | --- | --- | --- |
| Pricing | | | | |
| Integrations | Slack | Push responses to relevant channels | Must Have | Yes |
| Integrations | CRM | Push responses to Contact page | Must Have | Yes |
| Customer Experience | Whitelabel | Emails must be white labeled as XYZ brand | Must Have | No |
| Customer Experience | Whitelabel | Whitelabel emails so they don’t say the vendor name | Should Have | Yes |
| Delivery | DKIM/SPF records | Have emails be sent from XYZ.com domain | Must Have | Yes |
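Once requirements are written down, comparing vendors can be as mechanical as you like. Here’s a minimal sketch of scoring vendors against MoSCoW priorities; the weights, requirement names, and disqualification rule are my own illustrative assumptions, not part of the MoSCoW framework itself.

```python
# Illustrative weights: a Must Have outweighs any number of nice-to-haves.
WEIGHTS = {"Must Have": 100, "Should Have": 10, "Could Have": 1}

requirements = [
    ("Slack integration", "Must Have"),
    ("CRM integration", "Must Have"),
    ("Whitelabel emails", "Should Have"),
]

def score_vendor(supported, requirements):
    """Sum the weights of requirements the vendor supports.
    A missing Must Have disqualifies the vendor outright (returns None)."""
    total = 0
    for name, priority in requirements:
        if name in supported:
            total += WEIGHTS[priority]
        elif priority == "Must Have":
            return None  # disqualified
    return total

print(score_vendor({"Slack integration", "CRM integration", "Whitelabel emails"}, requirements))
print(score_vendor({"Slack integration"}, requirements))
```

The disqualification rule mirrors the spirit of the table above: a vendor that can’t whitelabel a Must Have requirement is out, no matter how many Could Haves it ticks.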

Another example: I needed to run both NPS and CES, and I could go with two separate vendors or a single one. Also keep in mind that surveys like NPS are being absorbed into other products: CRMs, CS platforms, reporting tools, etc.

Now that you have that information, here is a list of vendors to start with. That said, there are always new vendors, so don’t consider this a complete list. They’re also listed in random order; the first is not necessarily “the best” nor the last “the worst.” You need to figure out what fits your needs and budget. Oh, and be sure to check out www.g2crowd.com, www.capterra.com, and other review sites.

  • AskNicely
  • Delighted
  • Satismeter
  • Wootric
  • NiceReply
  • HubSpot
  • Promoter.io
  • Surveymonkey
  • Getfeedback

Conclusion

There’s a lot involved in customer feedback. The three biggest points are the following:

  1. What do you want out of this?
  2. What’s the customer going to experience?
  3. How will you measure and improve your operations?