When it comes to implementing dashboards, machine learning solutions, and other analytical applications, organizations often find disappointingly low adoption among their teams and clients. In some cases, the tool is developed and never actually used.
Adoption is never guaranteed, though there are many things an organization (or even a developer) can do to increase the odds of adoption. The first thing an organization must realize is that a dashboard, just like a startup, must achieve product-market fit.
Product-market fit occurs when an organization identifies a need in the marketplace, builds a solution for it, and finds that customers want to buy it.
While product-market fit is something most people associate with large-scale operations, for a dashboard the scale is much, much smaller, yet the same principles apply. On this smaller scale: you identify a need for an analytical application, build it, and because it fits a specific need, users want to adopt it.
So how do you drive product-market fit in the world of dashboards, machine learning solutions, and other analytical applications?
This post will discuss the different steps that can be taken to maximize and achieve product-market fit.
Understanding the Use Cases
Unlike the startup scenario, organizations have an unfair advantage in achieving success: they have direct access to their clients, who can tell them precisely what they need and how they need it.
Spending time with clients, the people consuming the analytical products, will help organizations uncover what their clients need, how they are using data today, and where existing data products are falling short.
When starting the discovery process, it’s important to create working groups and share existing analytical products. This way clients can share what’s working, what’s not working, and what’s missing.
For the products that are working, attempt to group them and understand common themes around overall adoption.
For products not being used, create similar themes and compare them to the working products. Ask about the intent of these dashboards and why they haven’t been adopted.
By asking why something has not been adopted, you’ll likely gain more insight into the gap that exists and help identify an analytical solution that doesn’t yet exist but perhaps should.
As you gather each use case, document exactly what you learn. There are many different attributes about a use case that could be documented, but here are a few specific ones to consider:
- The department
- The audience of the dashboard
- The business process to streamline
- Type of product (Executive, Investigational, Detailed)
- Description of the benefit
- Description of who benefits
- Quantification of the return on investment
The first two attributes may seem obvious, as you need a way to aggregate all the potential use cases for an organization, line of business, or team. With department and audience, you are narrowing in on who your use cases have been supporting.
When you document the business process, you are defining what needs to be streamlined. Does the sales team have an alert for where their actions should be for the day? Does your marketing team have a way to track marketing spend back to individual customers? Here you are specifying the exact processes you can support for a function.
The next attribute is open-ended and requires the customer to brainstorm what a potential solution could look like. It’s important to also document the type of solution. Is it a data warehouse? A dashboard? A scenario planning tool? Defining the potential solution also helps estimate how long it will take to design, develop, test, and release.
The last three attributes are about the benefits. State which audience and business process benefit from the solution. The proposed solution might not always benefit the consumer of the product; it could instead be a time saver for the developer.
Last but not least, spend considerable time quantifying the time or dollars saved, the revenue generated, or the risk reduced by the use case. Return-on-investment data is gold for continuing to develop long-lasting analytical applications.
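The attributes above can be captured in a simple, consistent record before they ever reach a backlog. Here is a minimal Python sketch; the field names and the example values are hypothetical illustrations, not a prescribed schema:

```python
from dataclasses import dataclass

@dataclass
class UseCase:
    department: str        # which department the use case supports
    audience: str          # who consumes the analytical product
    business_process: str  # the process being streamlined
    product_type: str      # "Executive", "Investigational", or "Detailed"
    benefit: str           # description of the benefit
    beneficiary: str       # who benefits (end user, developer, ...)
    estimated_roi: float   # quantified return on investment, in dollars

# Hypothetical example record
example = UseCase(
    department="Marketing",
    audience="Campaign managers",
    business_process="Tracking marketing spend back to individual customers",
    product_type="Detailed",
    benefit="Cuts weekly manual reconciliation from hours to minutes",
    beneficiary="Campaign managers",
    estimated_roi=50_000.0,
)
```

Even if you document use cases in a spreadsheet rather than code, agreeing on a fixed set of fields like this makes the use cases comparable later.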
How Do You Build a Backlog of Analytical Use Cases?
Your use cases should not live in isolation. You should consolidate all use cases into a backlog. By keeping track of the use cases and potential solutions that need to be completed, you can ensure that nothing gets forgotten and you have a repository of business problems with a quantified potential return on investment.
This backlog with noted ROI can help you to prioritize work and make sure that the most important analytics applications are completed first.
There are a few different ways to create a backlog. One popular method is to use a spreadsheet, which can be easily shared with others on the team. Another option is to use a project management tool like Asana or Jira. Whichever method you choose, the important thing is to keep the backlog updated as the project progresses.
How Do You Choose What to Build?
You should never select analytical products for development from your backlog in isolation. The decision should be made with input from a number of stakeholders.
There are many determining factors in finding the right solution but a robust process should include four levels of stakeholders:
- Leader/Program Director
- Steering Committee
- Business Champion
- End-User Representatives
The Leader or Program Director is the person who is accountable for any analytical application created, and who owns the value of the team or program.
The goal of the steering committee is to help prioritize the backlog of use cases and provide visibility to the potential return on investment listed within the backlog. This group consists of team leaders that are directly benefiting from the analytical applications being developed.
The business champion is likely the largest beneficiary of the solution as they are typically the power users with the most detailed feedback.
End-user representatives can also be consulted. These are consumers of the analytical application, hand-selected by the Business Champion. They are likely influencers of others or early adopters of the application.
Let’s consider an example of a mid-size company attempting to roll out a series of products across business units. This company has a centralized analytics unit within IT. The Program Director is likely a Vice President or Director of Analytics. The Steering Committee will consist of Directors or VPs of the respective business units supported across all analytical applications.
The Business Champion will be the specific Director or VP directly impacted by the development of a specific analytical application being developed. For this example, it will be a VP of Sales Operations. End-User Representatives are the super users and early adopters of the analytical application. In this case, they are direct sellers who are supported by the sales operations team.
With feedback from the steering committee, the Business Champion, and End-User Representatives, the Program Director can make an informed decision. The ultimate decision is on the leader/program director as they have the vision for what the analytical application roadmap will look like. This vision will dictate the next product to be developed.
When working with steering committees and Program Directors, we coach them to focus on the highest-value projects. High-value projects, from an analytical perspective, are projects with the highest ROI that can be delivered over a short period of time.
It can be difficult to manage and prioritize the projects in the backlog. Often projects are evaluated only by their level of priority and effort. Evaluated this way, it’s very difficult to compare initiatives that fall into the same combination of classifications.
Since many of the solutions in the backlog will not be an apples-to-apples comparison, our suggestion is to instead estimate the “value density,” which is simply the estimated ROI divided by the total investment (in terms of time and dollars).
With a value density calculation, you can make closer comparisons and more easily decide what scale of projects should be taken on during a given period of time.
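The value-density calculation can be sketched in a few lines of Python. The project names, ROI figures, and investment figures below are hypothetical, purely to illustrate how ranking by value density can reorder a backlog relative to ranking by raw ROI:

```python
def value_density(estimated_roi: float, investment: float) -> float:
    """Estimated ROI per unit of investment (time and dollars combined)."""
    if investment <= 0:
        raise ValueError("investment must be positive")
    return estimated_roi / investment

# Hypothetical backlog entries
backlog = [
    {"name": "Sales alerting dashboard", "roi": 120_000, "investment": 30_000},
    {"name": "Marketing spend tracker",  "roi": 80_000,  "investment": 10_000},
    {"name": "Scenario planning tool",   "roi": 400_000, "investment": 200_000},
]

# Rank by value density rather than raw ROI.
ranked = sorted(
    backlog,
    key=lambda p: value_density(p["roi"], p["investment"]),
    reverse=True,
)
for p in ranked:
    print(p["name"], value_density(p["roi"], p["investment"]))
```

Note that the scenario planning tool has the largest raw ROI, yet it ranks last by value density; the much cheaper marketing spend tracker ranks first. That is exactly the kind of comparison a priority/effort grid tends to obscure.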
What Do You Do After Selecting a Use Case?
After selecting a use case, most of the development should follow a structured analytical application lifecycle.
Every team will have a slightly different lifecycle. One suggested workflow would be:
- Gather detailed requirements
- Build a prototype
- Develop the solution
- Gather feedback and iterate
- Conduct usability testing
- Run quality assurance testing
- Manage the release
In a future blog post, we’ll go into greater detail on the steps listed above. For the sake of this blog post, let’s stay focused on how organizations can use the development cycle to drive adoption.
How Do You Get Buy-in From Your Stakeholders?
As mentioned earlier in this post, one key element in getting buy-in is to include a variety of critical stakeholders from day one, rather than just the developer, a handful of the developer’s peers, and a product leader.
In the requirements-gathering phase, the developer should gather detailed information from the Business Champion and End-User Representatives. The goal of these requirements sessions is to get a deep understanding of the problem that faces the end users and their existing ways of working.
When gathering requirements, think about how you can create a solution that fits directly into an end user’s ways of working while still addressing the challenge the analytical application is meant to solve.
As you brainstorm the details of the solution, work with end users to understand the “what,” the “so what,” and the “now what.”
- The “what” describes what happened in the data.
- The “so what” lets a user see why it happened.
- The “now what” takes that information and helps articulate the user’s next course of action.
Work with stakeholders to define these areas.
Engaging with Business Champions and End User Representatives doesn’t stop at requirements gathering. Feedback from these stakeholders is critical for gaining trust and ensuring the adoption of the application.
We suggest creating at least two working groups of End-User Representatives. Meet with these working groups for feedback on the prototype and during the iteration phase. Create a structured review process: you present the background, they provide input on it, you give a live demo, and the working groups give feedback on both the current state of the product and the ideal state for release.
It’s important to note that sometimes the ideal state and the final state include compromises. Sometimes the data doesn’t exist at the needed level, or the solution is far too complex for individuals outside the working group. These challenges can affect product-market fit and decrease adoption, but you can overcome this gap with strong communication.
Never assume an analytics application can be built and instantly adopted without appropriate communications.
If you are going to build a product, you need a communication plan for how you will roll it out. A successful rollout includes communication that announces the upcoming release, supports the natural early adopters, incentivizes the core, and appropriately addresses objections from detractors. This plan dictates how each persona will be upskilled in the use of the product.
Communication plans should include key product talk tracks for leaders, business champions, and end users. Provide your steering committee with talking points so that there is alignment from all involved.
The best thing to do is to build communication templates. This way, as you release your products, it will be easy to share the updates and follow a structure that will be familiar to your end users.
We suggest the following types of templates:
- All stakeholders: What’s coming in the future
- All stakeholders: What’s new for everyone
- Persona-specific: What has been updated
- Persona-specific: What is new and how to be upskilled
- Use case specific: What is being created and when will it be released
- Use case specific: How to make the most of a new release
- Use case specific: How to upskill in the new product
As part of your communication plans, it is important to vary the style of communication. An e-mail works for some people, but for others, the message needs to be shared on a call or in a meeting. Use varied styles of communication so that awareness of the product grows consistently.
We highly recommend utilizing badges to reward those who have been trained and those who are regularly using the platform. While some may see this as cheesy, it is a way to gamify the solutions and increase engagement overall.
Additionally, build a supplementary leaderboard, primarily designed for key leaders but potentially for end users, that highlights the top consumers of the dashboard and helps gamify adoption of the analytical application.
Finally, one of the most critical components of an analytical application release is training. As mentioned earlier: you cannot release a product in isolation. You have to over-communicate and over-train.
Like communication, you will need to vary your training styles. Think about a three-fold approach: training for those who will immediately adopt and have no questions, a webinar-style training where individuals can interact with a moderator while seeing and understanding how the application works, and a high-touch solution for the late adopters.
Above all, make sure that the leaders of the end users know how the application works and the value it brings and that these individuals are communicating downward to their teams.
What Do You Do After You’ve Released a Dashboard?
As part of your analytical application lifecycle, you have to monitor adoption, which is critical for several reasons.
First, if the product is not adopted early, it will be very clear there was a miss on product-market fit. If this happens, go back to your working groups for feedback immediately.
If your adoption starts to wane over time, this can be for several reasons:
- The application is no longer addressing the business problem.
- The ways of working of the end users have evolved and it no longer fits into their workflow.
- Leaders are no longer holding individuals accountable for the business process that the application supported.
In any of these scenarios, perform a root cause analysis to determine what is causing adoption to wane. Based on the analysis, either evolve the application, improve training and communication, or sunset the product.
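Waning adoption is easiest to catch with a simple rule over usage metrics. As a minimal sketch, assuming you can export something like weekly-active-user counts from your BI platform (the numbers and the threshold below are hypothetical):

```python
def adoption_waning(weekly_users, window=3, threshold=0.85):
    """Flag waning adoption when the average of the last `window` weeks
    drops below `threshold` times the average of the earlier weeks."""
    if len(weekly_users) < 2 * window:
        return False  # not enough history to judge
    recent = sum(weekly_users[-window:]) / window
    baseline = sum(weekly_users[:-window]) / len(weekly_users[:-window])
    return recent < threshold * baseline

# Hypothetical weekly-active-user counts for a released dashboard
usage = [120, 118, 110, 95, 80, 72]
print(adoption_waning(usage))  # recent avg ~82 vs. baseline 116 -> flagged
```

A flag like this doesn’t tell you *which* of the three causes above is at play; it only tells you when it’s time to run the root cause analysis.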
When creating a dashboard, or any analytical application, build a process that takes the entire product lifecycle into consideration.
Provide a clear line-of-sight to the return on investment and the timeline to deliver the return on investment, building the trust of leaders.
Create a steering committee of individuals responsible for delivering streamlined business processes. Work together to prioritize use cases and share the plan for delivery to business champions and end users, building the trust of peers.
Create a plan for gathering feedback from stakeholders who care about their work. These individuals are already looking to make their lives better–and their peers’ lives easier. Their feedback is gold, as they will help transform a cohort of end users based on the products you deliver.
Be sure to communicate with end users throughout the development lifecycle. When the product is released, create a diverse way of upskilling individuals on how to utilize the analytical applications in the ecosystem.
Above all, remember that adoption does not happen overnight or in isolation. The more people you can bring along in the development process, the greater the adoption rate will be.
Finally, it is important to note that these steps don’t guarantee adoption. Sometimes the fit just isn’t there, but following these tips and steps will improve your odds of driving adoption of your dashboard.
If you’re still not seeing the adoption you’d hoped for, another option is to consider an outside analytics training platform like Data Coach to upskill your team. Data Coach works directly with your teams to help drive analytics-centric change. With a proven track record of driving successful analytics platform adoption, it’s worth checking out!