My previous blog post described the general approach of data-driven design in the refinement of digital services. It also introduced the fundamental division of passive tracking vs. active elicitation as data collection approaches.
Today’s post goes deeper into examining the variety of tools that leading designers of contemporary web services utilize. These tools are geared towards UX designers, service designers and visual designers, all people in the business of crafting solutions for digital services. This scoping leaves out apps, physical touchpoints, and some marketing tools, e.g. email tracking. All of these tools also rely on passive tracking (from the user perspective).
In brief, I will cover Descriptive behavioral analytics, Heat maps, Experiment management, Screen recordings, and Design variation generation & testing as distinct categories of tools.
This post focuses on established web design tools. It is clear that cutting edge work may call for novel tools, perhaps even custom data collection and analysis tools. Considering the wealth of data, this likely calls for support from a data scientist.
Aims of data collection and analysis
Data-driven design is well compatible with evidence-based design approaches. For instance, Lean UX is one philosophy of digital design preaching that all designs should have a clear underlying hypothesis that the design tests out. This is perfectly fine, but as I see it, the use of data-driven design tools can be either confirmatory (hypothesis-driven) or exploratory. In fact, I believe both approaches will likely be needed even though the former is much preferred.
I have discovered that exploration of user behavior becomes more important the more complicated the service is (or the shorter its history of systematic design is). The more choices and opportunities the interface affords, the more likely it is that users will appropriate it to their own liking. They will come up with use cases and patterns that reveal new opportunities for service and interface design - something designers cannot anticipate or predict!
Data-driven design tool types in web services
Available tools do not fall into easily distinguishable categories. To clarify things, I have chosen to focus on publicly available tools for web services and hybrid applications that utilize passive tracking.
Many of the tools I discuss are also known as web analytics solutions, but many recent services have opted for growth- or optimization-related taglines instead of “analytics”. Despite the label, they are all fit for use in data-driven design.
I will highlight five distinct categories of tools available for passive data collection in the order of decreasing usefulness:
- Screen recordings (why)
- Heat maps (why)
- Experiment management (what)
- Design variation generation and testing (what)
- Descriptive behavioral analytics (what)
One important difference between the types of tools concerns whether they (primarily) provide answers about user thinking and behavior in terms of why or what. You could think of this as a difference akin to qualitative and quantitative, or thick and big data. The important thing is that the perspectives are complementary, and oftentimes you as a designer must descend from what to why in order to envision how to advance your designs further.
In the following depictions, I have included some illustrations and videos, but I suggest you also visit the linked sites to get the producers’ pitch on the potency of their solutions. Most of the tools are available for testing, often even for free.
Besides the effort related to getting started, I recommend that you talk with your development team in advance to estimate the technical performance impact of each solution. Usually having one or two scripts in production will not create much overhead, but it is not advisable to stack all possible tools onto your live website at once just to try them out - eventually they will slow down your site and work against your efforts.
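If you want to put rough numbers on that overhead, the browser’s standard Resource Timing API is enough for a quick check. Below is a minimal sketch; the analytics host names are placeholders, not real vendors.

```typescript
// Sketch: use the standard Resource Timing API to see how much load time each
// third-party analytics script actually costs. The host names are placeholders.
const analyticsHosts = ["example-analytics.com", "example-heatmaps.com"];

window.addEventListener("load", () => {
  const entries = performance.getEntriesByType("resource") as PerformanceResourceTiming[];
  for (const entry of entries) {
    if (analyticsHosts.some((host) => entry.name.includes(host))) {
      // entry.duration covers the whole fetch of the script (DNS, connect, download)
      console.log(`${entry.name}: ${Math.round(entry.duration)} ms`);
    }
  }
});
```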
1. User screen casts and recordings
Remote usability tests have pioneered the capture of full user interactions with digital services. This first happened in real-time (i.e. synchronously) when the technology simply connected one screen with another. But then someone (that is, ClickTale, userfly, and Mouseflow.com) miraculously realized that the user sessions, with views, clicks and all, could be recorded and stored as videos for viewing later on (asynchronously)!
Session replays offered through screen recordings provide a very detailed view into user activities and even intentions. This is important, as traditional usability tests were often the most informative when they showed what users failed to achieve. Recordings can be much more valuable for service and UI designers than all other types of tools, but they work best in combination with other methods.
Video: How screen recording playback works in Hotjar
I have used screen recordings in combination with UI-related AB tests. This has greatly built my understanding of why the numbers describing the experimental outcomes turn out the way they do. For instance, you may have witnessed conflicting performance indicators, such as an improved conversion rate for purchases despite increased transaction time. Screen recordings might reveal that instead of streamlining, your new design makes it easier for users to try out different shipping and billing option combinations, therefore taking more time to purchase.
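To illustrate the kind of per-variant summary that raises this question in the first place, here is a toy sketch; the session data shape is my own assumption, not taken from any particular tool.

```typescript
// Toy sketch: conversion rate can go up even while time-to-purchase grows,
// which is exactly the puzzle screen recordings then help to explain.
interface Session {
  variant: "control" | "newDesign";
  purchased: boolean;
  secondsToPurchase?: number; // only present when purchased
}

function summarize(sessions: Session[], variant: Session["variant"]) {
  const subset = sessions.filter((s) => s.variant === variant);
  const purchases = subset.filter((s) => s.purchased);
  const avgSeconds =
    purchases.reduce((sum, s) => sum + (s.secondsToPurchase ?? 0), 0) /
    Math.max(1, purchases.length);
  return {
    conversionRate: purchases.length / Math.max(1, subset.length),
    avgSecondsToPurchase: Math.round(avgSeconds),
  };
}
```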
Currently at least Crazy Egg, ClickTale, Mouseflow, Inspectlet, Lucky Orange, and HotJar offer this type of service. Read more about their differences and pricing from the UI Patterns review.
Getting started:
The screen recording setup requires some initial development effort. From there on, designers have it pretty nice, being able to initiate recordings themselves. The biggest concern with screen recordings is privacy. They present a new level of intrusion that can make even the designer feel uncomfortable, as disclosure of personal details can sometimes be unavoidable.
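As a rough illustration of how such masking works in principle - this is not any vendor’s actual API - sensitive fields can be blanked out in a cloned snapshot of the page before anything is sent onwards. The "data-mask" attribute and helper name below are hypothetical.

```typescript
// Rough illustration only: before a recorder serializes the page, fields
// flagged as sensitive are blanked out in a cloned snapshot, so personal
// details never leave the browser. Attribute and function names are made up.
function snapshotWithMasking(): string {
  // Work on a clone so the live page stays untouched.
  const clone = document.documentElement.cloneNode(true) as HTMLElement;

  // Mask anything the team has flagged, plus password fields by default.
  clone
    .querySelectorAll<HTMLElement>("input[type='password'], [data-mask]")
    .forEach((el) => {
      if (el instanceof HTMLInputElement) {
        el.setAttribute("value", "•••");
      } else {
        el.textContent = "•••";
      }
    });

  return clone.outerHTML; // what would be shipped to the recording backend
}
```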
2. Heat maps
Heat maps are one of the most popular techniques to quickly and simply reveal how users interact with a single view (typically a page). They display the number of clicks users make on a website, as well as how far users scroll down from the initial view (scroll depth). CrazyEgg, Clicktale, Hotjar, and SumoMe are the best known examples of this approach.
Any scientist would be dumbstruck if I told you that eyeballing is not only the most popular, but nearly the ONLY available analysis method for heat maps. At best, you can compare two maps with one another. Practice and experience help in interpreting the maps, but this shortcoming is evident.
The insights that you deduce from heat maps are twofold:
- Absence of clicks on elements expected to be clicked
- Presence of clicks on elements not expected to be clicked
Becoming aware of the scroll depth helps to explain the distribution of the clicks. It is evident that nobody will click on targets they do not perceive… For design insight, the concentration of click attempts may help to discover potential areas for UI elements.
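To make the mechanics concrete, here is a sketch of how scroll depth could be measured on the client side; the reporting endpoint is a placeholder and real tools naturally do much more.

```typescript
// Sketch of client-side scroll depth tracking: record the deepest point (as a
// percentage of page height) the user reached during the page view.
let maxDepthPercent = 0;

function updateScrollDepth(): void {
  const scrolled = window.scrollY + window.innerHeight; // bottom edge of viewport
  const total = document.documentElement.scrollHeight;  // full page height
  const percent = Math.min(100, Math.round((scrolled / total) * 100));
  maxDepthPercent = Math.max(maxDepthPercent, percent);
}

window.addEventListener("scroll", updateScrollDepth, { passive: true });

// Report once when the user leaves the page (endpoint path is a placeholder).
window.addEventListener("pagehide", () => {
  navigator.sendBeacon("/analytics/scroll-depth", String(maxDepthPercent));
});
```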
Video: The video about Hotjar shows both heat maps and user screen recordings made with the tool.
Getting started:
Heat maps are quite designer-friendly tools. Designers can configure and use heat map tools on their own once the initial web server configuration is in place.
3. Experiment management
Experiment management tools enable running different types of experiments on digital services. The most famous of them all are AB or ABX tests that pit multiple design variations against one another. AB refers to a simple comparison of two designs (one of them usually being the old, control version), whereas an ABX (aka bucket or multivariate) test includes several variations, sometimes in multiple dimensions. For instance, if you only change colors on an ecommerce website, you could call it an AB test; if you have two color variations and a layout change, that would be a multivariate test.
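Under the hood, the core of any such experiment is assigning each visitor consistently to a variant. The sketch below shows the basic idea with a simple hash; it is not a description of how Optimizely or any other product does this internally, and the variant names and weights are illustrative.

```typescript
// Sketch of deterministic A/B bucketing: hash a stable visitor id into a
// variant so the same visitor always sees the same design.
type Variant = "control" | "newDesign";

function hashToUnitInterval(id: string): number {
  // Simple FNV-1a style hash mapped to [0, 1). Fine for illustration only.
  let hash = 2166136261;
  for (let i = 0; i < id.length; i++) {
    hash ^= id.charCodeAt(i);
    hash = Math.imul(hash, 16777619);
  }
  return (hash >>> 0) / 4294967296;
}

function assignVariant(visitorId: string): Variant {
  return hashToUnitInterval(visitorId) < 0.5 ? "control" : "newDesign";
}

// Usage: the visitor id would typically come from a first-party cookie.
console.log(assignVariant("visitor-1234")); // "control" or "newDesign"
```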
Experiment management solutions make experiments possible. The content of the experiment itself, as well as figuring out the right question for study, is up to the designer and usually requires a different set of tools.
Getting started:
In web use, tools such as Optimizely and Visual Website Optimizer are quite comprehensive. They require some configuration on the server side for flexible experiments, so a designer is unlikely to manage them alone. Often they can’t even be implemented using a tag management solution such as Google Tag Manager. Then again, once running, many of the tools offer WYSIWYG-type editing opportunities that allow designers to configure experiments comparing different UI designs. But if major changes in services are needed, this is not going to be enough.
4. Genetic manipulation and fitness tests
My previous post on the fundamentals of data-driven design tools noted that there are hardly any generative tools that would allow designers to create new designs based on data. I admit I was fooling you a bit. There are certain tools that allow you to generate new design proposals based on both designer and user input data. They just tend to be very specific.
For example, Optimal Workshop has released products called Optimal Sort and Treejack, which attempt to facilitate the generation of hierarchical menus. Regardless of what you may think about the progressiveness of this menu concept, these tools are made to assist in finding an optimal structure for a number of menu items, or even to assess the need for certain items. In combination they allow both creating new designs and testing them based on user performance on menu traversal tasks.
Optimal Sort is a variation of an old usability method, card sorting, which has been used to generate menu structures that are intuitive for users. The beauty of this tool is the automated analysis of user patterns and the removal of the human usability tester from the equation. This is a great example of the additional power the new breed of tools can hand to designers.
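To give a feel for what that automated analysis boils down to, here is a toy sketch of the similarity matrix at the heart of card sort analysis: counting how often participants grouped two cards together. The cards and responses below are made up, and real tools layer clustering and visualization on top of a matrix like this.

```typescript
// Toy sketch of automated card sort analysis: build a similarity matrix where
// matrix[i][j] counts how many participants grouped card i with card j.
type SortResult = string[][]; // one participant's groups of card labels

function similarityMatrix(results: SortResult[], cards: string[]): number[][] {
  const index = new Map<string, number>();
  cards.forEach((card, i) => index.set(card, i));
  const matrix = cards.map(() => cards.map(() => 0));

  for (const participant of results) {
    for (const group of participant) {
      for (const a of group) {
        for (const b of group) {
          if (a !== b) {
            matrix[index.get(a)!][index.get(b)!] += 1;
          }
        }
      }
    }
  }
  return matrix;
}

// Two fictional participants sorting four fictional cards:
const cards = ["Shipping", "Returns", "Invoices", "Contact"];
const results: SortResult[] = [
  [["Shipping", "Returns"], ["Invoices", "Contact"]],
  [["Shipping", "Returns", "Invoices"], ["Contact"]],
];
console.log(similarityMatrix(results, cards));
```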
Many designers have been taking advantage of rapid UI prototyping tools such as InVision or Sketch to get feedback about their designs. These approaches would also fall under this category of tools, although I do not associate them with passive-tracking data-driven design tools as they lack automated collection, aggregation and analytics features.
Getting started:
The tools from Optimal Workshop, for instance, are standalone SaaS products. They require some skilled setup work, for which reason independent designers may need external help to run the tests. However, in the longer run, acquiring the skills is not that difficult and designers can well conduct the tests on their own.
5. Descriptive Behavioral Analytics
Descriptive Behavioral Analytics are the foundation of all current data-driven design solutions. They have been around pretty much as long as the first web server has been running. These services collect data about user interactions with a web service, along with all contextual data related to the visit, and aggregate it for reporting. They commonly produce metrics such as the number of users, visits/sessions, pageviews, traffic sources and so forth. The outcome can usually be summarized simply as a two-dimensional table or a time-series diagram.
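As a toy illustration of what “aggregate it for reporting” means in practice, the sketch below rolls raw pageview events into daily pageviews and unique visitors; the event shape is my own assumption, not any vendor’s data model.

```typescript
// Toy version of what descriptive analytics services do at scale: roll raw
// pageview events up into daily pageviews and unique visitors.
interface PageviewEvent {
  visitorId: string;
  url: string;
  timestamp: string; // ISO 8601
}

function dailySummary(events: PageviewEvent[]) {
  const byDay = new Map<string, { pageviews: number; visitors: Set<string> }>();

  for (const e of events) {
    const day = e.timestamp.slice(0, 10); // "YYYY-MM-DD"
    const bucket = byDay.get(day) ?? { pageviews: 0, visitors: new Set<string>() };
    bucket.pageviews += 1;
    bucket.visitors.add(e.visitorId);
    byDay.set(day, bucket);
  }

  // Flatten into the kind of two-dimensional table the reports show.
  return [...byDay.entries()].map(([day, b]) => ({
    day,
    pageviews: b.pageviews,
    uniqueVisitors: b.visitors.size,
  }));
}
```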
While these types of tools have been generally adequate for marketing and business purposes, they tend to be less actionable from the design perspective. They can be more useful if you combine them with any of the other tools I’ve listed so far.
Behavioral analytics can be collected through two channels: server-based or client-based. In the former case, the web server which handles the content also collects so-called clickstream data. In the more popular case of client-based collection, the user’s web browser sends the data (with the previously mentioned potential handicap of being blocked by an ad blocker or similar). These two approaches can also be combined. Either way, some server configuration by a coding-competent person is needed before any data is available for reporting.
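Stripped to its essentials, the client-based channel is just the browser posting a small payload to a collection endpoint, as in the sketch below; the endpoint path is a placeholder. This is also exactly the kind of request an ad blocker stops.

```typescript
// Minimal sketch of client-side collection: the browser sends a small payload
// describing the pageview to a collection endpoint ("/collect" is a placeholder).
function trackPageview(): void {
  const payload = JSON.stringify({
    url: location.href,
    referrer: document.referrer,
    screen: `${screen.width}x${screen.height}`,
    timestamp: new Date().toISOString(),
  });
  // sendBeacon queues the request so it survives page unloads.
  navigator.sendBeacon("/collect", payload);
}

trackPageview();
```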
In my work, I’ve found that the most useful descriptive statistics are those that describe the current audience of a digital service. After all, the audience is something that every new service is dying to discover! For UI design, knowing the basic situation about user device categories (mobile vs. desktop), screen resolutions and operating system/browser versions can make all the difference for attempts to optimize UX.
Because descriptive analytics solutions, like Adobe Analytics, offer much more data than any designer can digest at a time, it has become common to pile it onto a dashboard. I personally believe that dashboards have limited value in everyday design work, with the possible exception of fast-running experiments. Design insights, unlike business KPIs, should not change overnight whatever happens in the competitive environment.
I do think some metrics are so important that it is good to keep them in sight - and why not on a dashboard! The most recent design and development dashboard I’ve created included many of the aforementioned audience characteristics, as well as vital signs of the ecommerce business (revenue, conversion rate), landing pages and service errors. However, much of this information was geared towards the developers involved in everyday operations.
Getting started:
Installing GA, Adobe, Kissmetrics or a similar client-based tracking solution requires adding a tracking code to the website. The aforementioned tag management solutions (tag = tracking code) may help to pass the power to try new tools on to designers as well. Configuration of server-based solutions is going to be trickier. The good part is that there are lots of free services, so experimenting costs only time.
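As a simplified illustration (not Google Tag Manager’s actual mechanics or API), what a tag management container essentially does when you toggle a tool on is inject that vendor’s script into the page at runtime; the tag URLs below are placeholders.

```typescript
// Simplified illustration of what a tag management container does when a new
// tool is toggled on: inject that vendor's tracking script at runtime.
// The tag list and URLs are placeholders, not any tag manager's real API.
const enabledTags = [
  "https://cdn.example-analytics.com/tracker.js", // hypothetical analytics tag
  "https://cdn.example-heatmaps.com/heatmap.js",  // hypothetical heat map tag
];

for (const src of enabledTags) {
  const el = document.createElement("script");
  el.src = src;
  el.async = true; // load without blocking page rendering
  document.head.appendChild(el);
}
```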
Conclusion: digital designer’s messy toolbox
Things are not so clear cut with these tools. Real-world tools available for designers combine several features or categories. After you’ve picked the tool, you will likely need to decide on the appropriate subscription, as few services allow real work for free. Most tools are affordable, costing at most a couple of thousand a year, although in some cases the premium comes with an extremely hefty price tag (over $120K/annum for Google Analytics 360, formerly Premium).
The development of data tools for web services has taken a path where most analytics solutions have become packed with overlapping features. For instance, probably the most widely used web analytics tool, Google Analytics, combines numerous features. It includes not only aggregate descriptive statistics of user behavior and technology, but also views into individual users and individual page heat maps. And since Google incorporated an Experiments feature into it, it also enables running AB tests!
This is quite radical. To have both massive aggregate (and sampled) data and a detailed individual-level view is like a dream come true. However, most of the time it is still a dream, and in reality you will need separate tools to capture both ends of the spectrum - the big and the small. GA is not really an integrated solution yet, even though it does offer multiple kinds of views into customer behavior, such as the User Explorer tool.
User Explorer in Google Analytics provides a crude view into user flow. This is a skeleton of a user session without the skin, bones and unsuccessful twitching that comes with real session recordings.
As a digital designer you need to feel comfortable experimenting with a number of tools. For better or worse, for every use case there are several tools that might do the job; each one just does it a little bit differently. It is up to you to find which tools best suit you and your team in achieving your goals. That is the only thing that matters.
Great insights can sometimes be found with the lousiest tools - but remember, designers also deserve a good UX.
- Lassi A. Liikkanen, Digital Analyst Lead