Creating DivvyCloud's Cloud Compliance Dashboard

Background

DivvyCloud is a Cloud Security Posture Management (CSPM) tool that helps large companies keep their cloud environments compliant, continuously and automatically. It does this by analyzing users' cloud environments in real time against a set of compliance standards and taking action whenever a violation of those standards appears, all in one place.

The Problem

Using DivvyCloud's features, users can investigate DivvyCloud's automatic remediation of their cloud environments and understand the potential ramifications of violations had DivvyCloud not stepped in. From there, they can draw conclusions about how DivvyCloud is helping their company's overall cloud compliance.

However, decision-makers at the user's company are the ones who make big-picture decisions about their company's overall cloud security strategy. To inform those company-wide strategy decisions, they need summaries and trends showing how DivvyCloud improves their cloud compliance, and DivvyCloud does not provide them.

To bridge this gap, users have been collecting different data points found in other parts of the product and creating their own trend reports to give decision-makers the information they need when they request it.

At DivvyCloud, we must design for both our users and their decision-makers, who are the buyers and financial managers of DivvyCloud.

Prospects Helped Us Discover the Problem

We discovered this problem when the sales team reported a trend: prospects with these needs were choosing a competitor over DivvyCloud.

While creating a solution for prospective buyers, we needed to determine whether this was also a problem for our current users.

Research

We had many data points and trends we could show users based on the data we collect in the backend, but we didn't know which information they cared about, what they would find useful, or how they wanted to receive it. So it was time to talk to users and decision-makers.

Using Figma, I quickly put together a simple static wireframe displaying some of the data we thought might be useful. Because we didn't yet know which information was helpful, I didn't want to spend too much time organizing it. Instead, I wanted something fast that would spark conversation with our users, decision-makers, and internal stakeholders.

My Product Manager and I collaborated with the Customer Success and Sales teams to set up calls with both users and decision-makers. Some of the questions we asked them were:

  • Why are you using DivvyCloud today?
  • What value are you getting out of DivvyCloud?
  • Are you sharing any metrics or reporting about DivvyCloud with your team and stakeholders today? If yes, what are those metrics?
  • Are there any metrics about DivvyCloud that you'd like to share with your team and stakeholders but don't have today? If yes, what are those metrics?
  • If you share metrics about DivvyCloud with your team and stakeholders today, how are you doing so? How would you want to do so?

We interviewed five users and prospects, then held stakeholder interviews with internal subject matter experts: sales, customer success, customer support, and executives. Each round of conversations resulted in a new iteration of mockups, and we continued interviewing until our confidence in the user needs and requirements was high.

Synthesizing the Research

Our research confirmed our assumption that decision-makers don't care about the granularity of DivvyCloud's actions. Users already have another DivvyCloud feature, the Compliance Scorecard, that allows them to investigate DivvyCloud's remediation in real time. What decision-makers need are summaries of the current state of their cloud environment and trends over time. Are they getting more compliant or less compliant? Where? How long has this been going on? Those are some of the questions they need answered to make informed cloud security strategy decisions.

Based on the research findings, decision-makers want the following information:

  • The current percentage of compliant vs. non-compliant resources
  • The percentage of compliant vs. non-compliant resources over time
  • The current total number of non-compliant resources by severity level
  • The total number of non-compliant resources by severity level over time
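
To make these data points concrete, here is a minimal sketch of how a single Dashboard snapshot of these metrics could be derived. It assumes a hypothetical, simplified resource record invented for illustration; DivvyCloud's actual data model and backend are not shown here.

```python
from collections import Counter
from dataclasses import dataclass
from typing import Optional

# Hypothetical, simplified resource record for illustration only;
# DivvyCloud's real data model is more detailed.
@dataclass
class Resource:
    resource_id: str
    compliant: bool
    severity: Optional[str] = None  # e.g. "low", "medium", "high" when non-compliant

def compliance_snapshot(resources: list[Resource]) -> dict:
    """Summarize the current compliance state at one point in time."""
    total = len(resources)
    compliant = sum(1 for r in resources if r.compliant)
    non_compliant = total - compliant
    by_severity = Counter(r.severity for r in resources if not r.compliant)
    return {
        "percent_compliant": round(100 * compliant / total, 1) if total else 0.0,
        "percent_non_compliant": round(100 * non_compliant / total, 1) if total else 0.0,
        "non_compliant_by_severity": dict(by_severity),
    }

# The "over time" views would come from storing one snapshot per day or week
# and plotting the resulting series in the Dashboard.
```

The first two bullets map to the percentage fields, and the last two map to the severity counts accumulated across snapshots over time.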

How Research Changed the Design

After conducting research and iterating on the initial mockups, we decided to create a dashboard to display these data points and trends. A dashboard could act as a central location for decision-makers to get scannable and digestible summaries about their company's cloud compliance.

Testing an On-Premise Solution

DivvyCloud is on-premises software, not a SaaS solution; our customers run in-house installations of the tool rather than accessing it over the internet. When customers buy DivvyCloud, our internal IT team works with their internal IT team to get DivvyCloud set up and running. We ship the code for every new release to customers, who install it and then start using it.

Because DivvyCloud is on-premises, shipping a beta version of any project to customers requires engineering resources. So we worked with engineering to create a quick-and-dirty working version of the Dashboard. We randomly selected a group of customers as our test group and shipped them the Dashboard so they could start including it in their workflows. We also gave the sales team access to the Dashboard so we could hear feedback from prospects. Before committing more resources to the project, we collected feedback through Zendesk tickets, comments, and interviews set up with the customer success and sales teams.

Additional Requirements

The feedback we received during this beta period helped us build a Jobs To Be Done (JTBD) framework, which allowed us to further define the requirements for the Dashboard:

  • As a security engineer user, I need to be able to investigate the information in DivvyCloud's summary of my overall cloud compliance to answer questions my boss may have about the summary.
  • As a security engineer user, I need to be able to customize the data displayed in DivvyCloud's summary of my cloud compliance so that I see only the data that matters at the time.
  • As a security engineer user, I need to be able to share a copy of DivvyCloud's summary of my cloud compliance with my boss, so they can make high-level decisions about our overall cloud security strategy without having to log into DivvyCloud.
  • As a security engineer user, I need to be able to customize the date range for the data displayed in DivvyCloud's summary of my cloud compliance so that I see only the data that matters to me.

Defining Scope

At this point, because of the stakeholder feedback we had received, we had high confidence that the Dashboard would be valuable enough to dedicate full resources to. So my Product Manager formally kicked off the project with the dedicated scrum team. I ran a few meetings with the team to get collective feedback on my proposed solutions for the additional requirements. After reaching a team-wide consensus, it was time for higher-fidelity designs.

During the design phase, my Product Manager and I worked closely with the Engineering team to determine what was feasible and scope the work it would take to ship the Dashboard. Because executive leadership wanted the Dashboard shipped within two releases (four weeks), we had to decide which features to prioritize for the initial release and which to defer. Within our team, I presented the research findings, proposed solutions, and suggestions for the project, and the Engineering team shared their estimate of the scope of the work. These presentations helped my Product Manager decide what to prioritize. She defined the MVP requirements for the Dashboard by prioritizing the features that would deliver the most value to users, based on their needs and feedback, while requiring the least engineering effort. With those decisions made, the Dashboard was ready for final design, development, and QA testing.

Design, Development, and QA Testing

I worked on the visual design for the Dashboard and made sure it was ready for development, collaborating closely with Engineering to communicate the overall experience we wanted to deliver. Once built, the Dashboard went through rigorous testing, and after sign-off from QA, Product, and Design, it was ready to release.

Next Steps

Iterating. The next step is to get the product into users' hands and collect more feedback. Build, test, iterate, and repeat. 

What Success Looks Like

The Dashboard was created to address decision-makers' needs. Our decision-maker users can't determine how DivvyCloud helps their company's overall cloud compliance because we don't provide them with summaries of the current state of their cloud environment and trends over time. As a result of this gap, prospective decision-makers are going with a DivvyCloud competitor, and current users are hunting through the app to piece together these summaries and trend reports themselves.

The Dashboard will be a success when:

  • The number of prospective buyers with the above needs who choose a competitor over DivvyCloud starts trending downward instead of upward.
  • Existing customers stop creating summaries and trends on their own and instead have their questions answered by the Dashboard's data displays.

We will collect feedback through Zendesk tickets, comments, interviews with the customer success and sales teams, and sales reports generated by leadership to determine the Dashboard's success.

Sound Interesting?

Send me an email for more details.