Validated design: No more guesswork

I gave a presentation last week at Aalto University's Service Design Breakfast titled "Work begins after the launch". With the slightly provocative title, I wanted to emphasize life after the initial launch of a service and the importance of data-driven post-launch development. The presentation slides and video can be found at the end of this post.

Validated design = design + development + analytics

In too many cases, the majority of the effort and budget is put into the design and implementation of the service, and few resources and little energy are left for post-launch development. The service then easily starts to decay during the first couple of years of its existence, when it should instead be constantly improved so that it brings more and more value and euros to the organization.

In my talk, I looked into ways of keeping the service alive with data-driven design validation. Obviously, there's a lot we can already do in the design phase to base the design on data (for example, using keyword research as a service design tool for learning about people's expectations of and needs for the service), but in the end we won't know whether the service really lives up to those expectations before it's out and within people's reach.

Traditional user-centric methods, such as usability testing, are still relevant and help us formulate hypotheses about the usability of the service, but they won't really answer the ultimate question: do people need the service and are they willing to use it?

It's also difficult to predict with simulations and testing how people will fit the service into their lives and social contexts. Some relevant aspects of a service, such as marketing, can be close to impossible to test beforehand. In many cases, it simply makes more sense to build the minimum viable product (MVP) and start iterating from there.

Analytics is a powerful tool for finding answers to the big questions. The digital world makes it easy to collect huge amounts of service usage data, but the data itself doesn't help much if we don't understand what's important. Analytics needs to be designed into the service.
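
Designing analytics in can be as simple as deciding, up front, which KPI each tracked event feeds. Below is a minimal sketch of that idea in Python; the event names, KPI names and the EventLog class are hypothetical illustrations, not part of any particular analytics product.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AnalyticsEvent:
    name: str                    # e.g. "signup_completed"
    kpi: str                     # the KPI this event feeds, e.g. "weekly_signups"
    properties: dict = field(default_factory=dict)
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class EventLog:
    """Collects events; a real service would forward them to an analytics backend."""

    def __init__(self):
        self.events: list[AnalyticsEvent] = []

    def track(self, name: str, kpi: str, **properties):
        # Every tracked event must name the KPI it informs,
        # so collected data always maps back to a service goal.
        self.events.append(AnalyticsEvent(name, kpi, properties))

    def count(self, kpi: str) -> int:
        return sum(1 for e in self.events if e.kpi == kpi)

log = EventLog()
log.track("signup_completed", kpi="weekly_signups", channel="organic")
log.track("order_placed", kpi="orders_per_visitor", value_eur=49.0)
print(log.count("weekly_signups"))  # -> 1
```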

Goals and KPIs help keep us on the right track

It's important to define the service goals in the early phases of the project and make sure they're DUMB (Doable, Understandable, Measurable and Beneficial). Each goal should have clear key performance indicators (KPIs) and target values defined, so we know whether the goal is being reached.
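
To make the "measurable" part concrete, here's a small Python sketch of a goal that carries its KPI, target value and current measured value, so checking whether it's being reached becomes a yes/no question. The goal descriptions and numbers are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class Goal:
    description: str
    kpi: str
    target: float
    current: float = 0.0

    @property
    def reached(self) -> bool:
        # A goal with a KPI and a target value is trivially checkable.
        return self.current >= self.target

goals = [
    Goal("Grow the newsletter audience", kpi="newsletter_signups_per_month",
         target=500, current=430),
    Goal("Make the checkout effortless", kpi="checkout_completion_rate",
         target=0.65, current=0.71),
]

for g in goals:
    status = "on track" if g.reached else "needs work"
    print(f"{g.kpi}: {g.current} / {g.target} -> {status}")
```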

Good, concrete service goals direct the design and development towards the things that bring the most value. Bad goals are the ones typically found in project plans - abstract, impossible to achieve, nearly impossible to measure and of questionable usefulness. Bad goals aren't actionable and can even steer the development in a harmful direction.

In post-launch development, goals act as a sanity check: are the euros we're putting into development actually paying off in the bigger picture? Goals should be derived from your strategy and business plan and act as reminders of why we're developing the service in the first place. To put it in agile terminology, goals can form an important part of the product vision.

It's usually a good idea to think about the goals on a larger scale than a single service: "If this is our strategy, what services should we have, and what purpose does each of them serve in putting the strategy into practice? What role does each service play in the bigger customer journey, and how does it support the customer relationship we're trying to build?"

It's surprising how many services are still being built without a clear understanding of the goals, even though this really should be the first and most important phase of service design.

But what's really going on - and what should we do about it?

Goals and KPIs give us the big picture of the service, but we also need to analyze, on a more detailed level, how people are using it. How do they navigate through the service, where do they come from, and what do they do within a single page or feature? Goals give us direction on which aspects of the service to look into (what needs improving), and we then need to analyze - or make educated guesses about - how to enhance the design. There are tools and methods to help with this, such as A/B testing.
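
As an illustration of what sits behind an A/B test, here's a minimal Python sketch that compares the conversion rates of two design variants with a two-proportion z-test and asks whether the difference is bigger than random noise would explain. The visitor and conversion counts are invented.

```python
from math import sqrt, erfc

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)           # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = erfc(abs(z) / sqrt(2))                   # two-sided p-value from the normal distribution
    return z, p_value

# Variant A: 120 conversions out of 2400 visitors; variant B: 165 out of 2500
z, p = two_proportion_z_test(120, 2400, 165, 2500)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 suggests the difference is unlikely to be noise
```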

We're designing people's behavior inside the service in the early phases of the project. After the launch, we need to validate whether the design decisions were the right ones and adjust where needed. Analysis plays an important role in this.

The whole point of analytics is to change things, not just report what's going on. That's where analytics really starts paying off.

At Futurice, we don't want to just maintain the services we build, but to actually develop them further in close co-operation with the client. That's one of the reasons we have a team called "Lifecycle Management" instead of plain "Maintenance". The best time to start thinking about the lifecycle of a service is at the beginning of the project, not at the end.

Edit: If these ideas interest you, we still have some seats open in our upcoming Lean Startup courses, where they are discussed in more detail and concrete tools are introduced with hands-on practice.

Slides: "Work begins after the launch" (Service Design Breakfast)