Measure User Usage

We would like to have insight into the usage of our now 53 licensees to better understand who our “technical business champions” are on the desktop side. The project author field was a nice addition, but it would be very useful to understand our users at a greater level of detail. Some examples:

  • Projects being run locally
  • Number of server connects
  • Number of projects published
  • Daily usage time
  • Error counts

Some of this may be exposed in the new journaling feature in 5.0, but it would be nice to have the additional detail to help us better understand how to best support and enable our clients.


That’s an interesting question. How would you identify a “champion” if you had these metrics? Who is more likely to be identified as a champion:

  • a person who runs more projects locally
  • a person who keeps EasyMorph Desktop open for a long time
  • a person who added/deleted more actions
  • a person who processed more rows of data

Some metrics mentioned in your post may not be as indicative as expected. For instance, if “Daily usage time” means the time when the Desktop was open, then it may not mean actual usage as the application can be open but sit idle in the background, or left open overnight. What if two Desktop instances were open - should we count their run time separately, or overlapping? How about the Launcher that is always running in the system tray? Etc.

Also, “Number of projects published” can be hard to identify as projects can be published in a number of ways. Besides pressing “Publish project” in the Desktop app, a project can be uploaded to the Server via browser, or via an EasyMorph action. Not counting them would produce an incomplete number. Counting them may quickly become non-trivial.

I like the idea of identifying champions because the champions become the sources of expertise for people nearby. The question is what would be simple and reliable metrics to understand that. I would probably use just two:

  • the number of added/edited/deleted actions
  • the number of project runs (count separately runs initiated manually in Desktop or Launcher, and scheduled Launcher runs)

They are rather unambiguous and strongly correlate with actual engagement.


Hi Dimitry,

Thanks for the response. You make some good points, and I agree with simplifying the metrics as you suggested. Instead of Projects Published, it might be useful to see the number of Projects Created, which could point to a user who is trying many different things. These metrics are really more qualitative than quantitative: they would identify users we would want to speak with directly to understand how they are using the solution.

And, as always, a shout-out to your team for developing a really good solution. The adoption rate here is quite amazing.

Thanks.
