DivvyCloud is a Cloud Security Posture Management (CSPM) tool that allows large companies to keep their cloud environments compliant, continuously and automatically. DivvyCloud does this by analyzing users' cloud environments in real time against a set of compliance standards and then taking action whenever a violation of those standards appears, all in one place.
Using DivvyCloud's features, users can investigate its automatic remediation of their cloud, understand the potential ramifications of violations if DivvyCloud hadn't stepped in, and thereby draw conclusions about how DivvyCloud is helping their company's overall cloud compliance.
However, decision-makers at the user's company are the ones who make big-picture decisions about their company's overall cloud security strategy. They need summaries and trends showing how DivvyCloud improves their cloud compliance to inform those company-wide strategy decisions, which DivvyCloud does not provide.
To remedy this gap, users have been collecting different data points found in other parts of the app and creating independent trend reports to give decision-makers the information they need when they request it.
We discovered this problem when the sales team reported a trend of prospects with these needs choosing a competitor over DivvyCloud.
We needed to determine whether this was also a problem for our current users while creating a solution for prospective buyers.
We have many data points and trends that we could show users based on the data we collect in the backend. But we didn't know which information they cared about, what they would find useful, or how they wanted to receive it. So it was time to talk to users and decision-makers.
Using Figma, I quickly put together a simple static wireframe displaying some of the data we thought might be useful. Because we didn't yet know which information was helpful, I didn't want to spend too much time polishing the layout. Instead, I wanted something fast that would spark conversation with our users, decision-makers, and internal stakeholders.
My Product Manager and I collaborated with the Customer Success and Sales teams to set up calls with both users and decision-makers. Some of the questions we asked them are:
We interviewed five users and prospects, and then we did stakeholder interviews with internal subject matter experts: sales, customer success, customer support, and executives. These rounds of talks resulted in new iterations of mockups. We continued these interviews until our confidence level about the user needs and requirements grew significantly.
Our research confirmed our assumption that decision-makers don't care about the granularity of DivvyCloud's actions. Users have another DivvyCloud feature called the Compliance Scorecard that allows them to investigate DivvyCloud's remediation in real time. What decision-makers need are summaries of the current state of their cloud environment and trends over time. Are they getting more compliant or less compliant? Where? How long has this been going on? Those are some of the questions they need answered to make informed cloud security strategy decisions.
Based on the research findings, decision-makers want the following information:
After conducting research and iterating on the initial mockups, we decided to create a dashboard to display these data points and trends. A dashboard could act as a central location for decision-makers to get scannable and digestible summaries about their company's cloud compliance.
DivvyCloud is on-premise software, not a SaaS solution; this means our customers have in-house installations of our tool, versus SaaS solutions, which are hosted over the internet. When customers buy DivvyCloud, our internal IT team works with their internal IT team to get DivvyCloud set up and running. We ship out the code for every new release for customers to install and then start using.
Because DivvyCloud is on-premise, we needed engineering resources to ship a beta version of any project to customers. So we worked with engineering to create a quick-and-dirty working version of the Dashboard. We randomly selected a group of customers to be our test group and shipped the Dashboard so they could start incorporating it into their workflows. We also gave the sales team access to the Dashboard to hear feedback from prospects. Then we collected more feedback before committing additional resources to the project. We gathered feedback through Zendesk tickets, comments, and interviews set up with the customer success and sales teams.
The feedback we received from this beta period helped us create a Jobs To Be Done (JTBD) framework, which allowed us to further define the requirements for the Dashboard:
At this point, because of the stakeholder feedback we had received, we had high confidence that the Dashboard would be valuable enough to dedicate full resources to. So my Product Manager formally kicked off the project with the dedicated scrum team. I conducted a few meetings with the team to get collective feedback on my proposed solutions for the additional requirements. After coming to a teamwide consensus, it was time for higher fidelity designs.
During the design phase, my Product Manager and I worked closely with Engineering teams to determine what was feasible and define the amount of work it would take to ship the Dashboard. Because of executive leadership's goal to ship the Dashboard in 2 releases (4 weeks), we had to decide which features to prioritize for the initial release and which to defer. Within our team, I presented the research findings, solutions, and suggestions for the project. The Engineering team also shared what they believed the scope of all of this work would be. These presentations helped my Product Manager determine what to prioritize. She defined the MVP requirements for the Dashboard by prioritizing the features that would deliver the most value to users, based on their needs and feedback, while requiring the least engineering effort. With those decisions made, the Dashboard was ready for final design, development, and QA testing.
I worked on the visual design for the Dashboard and made sure it was ready for development. I worked very closely with Engineering to communicate the overall experience we wanted to deliver. Once we built the Dashboard, it went through rigorous testing. Once it got the okay from QA, Product, and Design, it was ready to release.
Iterating. The next step is to get the product into users' hands and collect more feedback. Build, test, iterate, and repeat.
The Dashboard is a solution created to address decision-maker user needs. Our decision-maker users aren't able to determine how DivvyCloud helps their company's overall cloud compliance because we don't provide them with summaries of the current state of their cloud environment and trends over time. As a result of this gap, prospective decision-makers are going with a DivvyCloud competitor, and current users are hunting through the app to create these summaries and trend reports themselves.
The Dashboard will be a success when:
We will determine the Dashboard's success by collecting feedback through Zendesk tickets, comments, interviews with the customer success and sales teams, and sales reports generated by leadership.