Anomaly Detection for Data Imports


Methods: User Interview, Usability Testing
Timeline: May 2025

Introduction
Anaplan does not currently detect anomalies in data being imported into it. Several customers have told their Customer Success Partners that they would like to see this functionality in Anaplan: today they have to rely on external tools, and they would prefer an integrated solution.

Through this research, internal stakeholders wanted to understand how Anaplan users conduct anomaly detection today, and what functionality would improve their anomaly detection experience if implemented in Anaplan.


To answer these questions, we talked to five customers who already conduct anomaly detection for data in their Anaplan environment. We asked them to walk us through their process and to test a design prototype of what anomaly detection could look like in Anaplan.

Findings

After interviewing five customers, we found that Anaplan customers who conduct anomaly detection for data coming into Anaplan do so in different ways:

  • On the simpler end, detecting anomalies is left to end users: when an anomaly occurs, they inform an admin, who then runs ad hoc processes to identify and resolve it.
  • On the more complex end, anomaly detection involves external tools and automated processes to prevent anomalous data from entering Anaplan at all.
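As an illustration of the more automated end of that spectrum, a pre-import check might flag outlying values before they ever reach Anaplan. The sketch below is hypothetical, not any participant's actual pipeline: it assumes a simple z-score rule with an arbitrary threshold, whereas real setups would use per-metric baselines and more robust statistics.

```python
from statistics import mean, stdev

def flag_anomalies(values, threshold=2.5):
    """Return values more than `threshold` standard deviations from the mean.

    A minimal z-score check. The threshold of 2.5 is an illustrative
    assumption; production pipelines would tune it per metric.
    """
    if len(values) < 2:
        return []  # not enough data to estimate spread
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []  # all values identical, nothing to flag
    return [v for v in values if abs(v - mu) / sigma > threshold]

# Rows that fail the check could be quarantined instead of imported.
incoming = [100, 102, 98, 101, 99, 103, 97, 100, 101, 99, 5000]
print(flag_anomalies(incoming))  # the 5000 stands out from the baseline
```

Flagged rows could then be held in a staging area for admin review, which is essentially what the external-tool workflows described above automate.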

Anaplan customers are interested in native Anaplan functionality for anomaly detection in data. After viewing an initial prototype, participants were asked to answer the following rating questions:

On a scale from 1 - difficult to 7 - easy, how easy or difficult did you find completing the example tasks through the prototype?
  • Participants rated this an average score of 6/7.
   
On a scale from 1 - not useful to 7 - very useful, how useful or not useful did you find this prototype to your role at your company?
  • Participants gave an average score of 5.4/7. Participants with a more robust anomaly detection process already in place gave lower scores, while participants with a more ad hoc process in place gave scores of 7/7.

Impact
Through this research, designers refined the prototype, as participants had difficulty with certain tasks, such as deciding which metrics to track. In addition, Sales and Senior Leadership gained a better understanding of which types of customers would use a feature like this: participants with a more robust process already in place were more hesitant to adopt native functionality.

Links
If you would like to learn more about the process for this research, click here.