Activity-based targeting

Out of the box, when you first begin using CommandBar, you can create Audiences using various characteristics — things that CommandBar provides out-of-the-box (like device or when the user was last seen) as well as characteristics imported from other tools.

As users begin to engage with CommandBar, they build histories: unique stories that tell the tale of their interests (and hopes, dreams, and desires). These histories can then be used for targeting users in an even more personalized way.

What activities are available for targeting?


Intent

Intent is, in our personal opinion, the coolest activity available in activity-based targeting. We love all our children equally, but this one most of all: it unites the two halves of our product (Nudge and Assist) in a way that showcases our philosophy ("assist users by getting closer to their intent").

Intent represents topics a user might be interested in based on their own words, which CommandBar captures in the form of searches and chats. Searches can come from Spotlight and HelpHub, and chats come from Copilot.

A user can have intent for a topic if their chat and search history contains anything semantically equivalent to that topic.
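To make the idea concrete, here is a toy sketch of intent matching. Real semantic matching would use embeddings; a simple token-overlap score stands in here purely for illustration, and none of these names are CommandBar's implementation:

```python
# Toy stand-in for semantic matching: token-overlap (Jaccard) similarity.
def similarity(a: str, b: str) -> float:
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def has_intent(history: list[str], topic: str, threshold: float = 0.3) -> bool:
    """A user 'has intent' for a topic if any past search or chat
    message is close enough to the topic."""
    return any(similarity(msg, topic) >= threshold for msg in history)

has_intent(["how do I export a csv", "billing settings"], "csv export")  # True
```

The threshold is an arbitrary illustrative knob; the point is only that intent is inferred from the user's own words, not from explicit tags.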

Intent targeting

Some examples of using intent for targeting users:

  1. Feature updates. A customer searches or asks about anything “CSV” related → show them an Announcement when the CSV export feature launches. You can even tailor the copy to acknowledge their interest: a “You asked, we delivered” type thing.
  2. Targeted offers. A customer searches for a premium feature → show them a nudge encouraging them to upgrade to a plan that includes the feature.

Nudge interactions

Nudge interactions let you target based on how users have engaged with nudges they’ve seen in the past. This is especially useful when you’re building a nudge related to one you’ve shipped before.

This activity lets you target based on three possible interaction states:

  1. Viewed - the user had at least one impression of this specific nudge.
  2. Completed - the user made it to the final step of the nudge (if there were multiple steps). If there was just one step, this means the user clicked the nudge’s CTA.
  3. Dismissed - the user closed out of the nudge, or didn’t make it to the last step of a multi-step nudge.

A common targeting rule using nudge interactions: avoid showing a nudge to users who have already seen a related one. Or perhaps you’re OK showing your new nudge to users who saw a related nudge, but not to those who completed it.
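That last rule can be sketched as a small eligibility check. Everything here (the state names aside, which mirror the three interaction states above) is hypothetical, not CommandBar's data model or API:

```python
from enum import Enum

class NudgeState(Enum):
    VIEWED = "viewed"
    COMPLETED = "completed"
    DISMISSED = "dismissed"

def should_show(user_history: dict[str, NudgeState],
                related_nudge_id: str,
                excluded_states: set[NudgeState]) -> bool:
    """Show the new nudge unless the user's interaction with the
    related nudge landed in one of the excluded states."""
    state = user_history.get(related_nudge_id)
    return state is None or state not in excluded_states

# Show to users who merely viewed the old nudge, but not to completers.
history = {"nudge_42": NudgeState.VIEWED}
should_show(history, "nudge_42", {NudgeState.COMPLETED})  # True
```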

Checklist interactions

Checklist interactions work just like nudge interactions, except they’re scoped specifically to checklists.

Survey responses

Users dump loads of useful information into survey responses, and you can use that information to construct really intricate targeting rules. There are two types of survey input blocks:

  • Text-based response blocks (e.g., short answer, long answer)
  • Number-based response blocks (e.g., rating, list)

For now, only targeting on number-based blocks is supported.

Survey response targeting

The most common use of survey-response targeting is taking the results of an NPS (or equivalent) survey and reaching particularly unhappy or happy users. For example, you could show users who scored < 3 a nudge asking them to book time with a friendly human account representative. For users who gave a 10, you could create another nudge encouraging them to leave a positive review on your review site of choice.

Help doc views

Sometimes it helps to know when a user already has some context on a topic. Help doc views capture that knowledge: you can target based on whether a user has opened a particular doc in HelpHub in the past.

A common use case for this condition: you update a feature → nudge users who have read that feature’s doc about the change. For maximum coverage of interested users, this can be combined with intent ("users who have expressed interest in the feature").
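Combining the two conditions is a simple OR. A minimal sketch, assuming hypothetical doc IDs and intent topics (again, illustrative names only):

```python
def is_interested(viewed_docs: set[str], intents: set[str],
                  feature_doc_id: str, feature_topic: str) -> bool:
    """True if the user has either opened the feature's help doc
    or expressed intent for the feature's topic."""
    return feature_doc_id in viewed_docs or feature_topic in intents

# A user who read the doc OR searched/chatted about the topic qualifies.
is_interested({"doc-csv-export"}, set(), "doc-csv-export", "csv export")  # True
```

OR is the right combinator here because either signal alone is evidence of interest; AND would shrink the audience to users who did both.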