Lean Analytics: Evolving Beyond Agile

July 1, 2021

Data Science, Machine Learning, AI, and analytics have changed how businesses work. Marc Andreessen said, “Software is eating the world.” Analytics and AI are doing the same. Many analytics teams have inherited Agile or Kanban approaches from software development, but they face unique challenges that slow the adoption of analytics. Forced into existing processes, these teams struggle to keep up with the increasing demand from the business. Ikonos Analytics developed the Integrated Analytics Framework (I.A.F.) to address these challenges. The I.A.F. integrates into and extends beyond Agile and Scaled Agile processes to meet the distinct needs of analytics and Data Science.

The advent of pivotal technology has always required a corresponding shift in operations at scale. Henry Ford leveraged the assembly line to mass-produce automobiles. Teams adopted Agile to develop software. Now analytics and AI/ML require a new paradigm shift to deliver impactful Analytic Solutions at scale. Organizations now have the data and talent to build critical models. Teams face increasing pressure to get the models they’ve built into production, and they need to align those models to the needs of the business to ensure the return on investment required for the continued adoption of data analytics.

We developed the Integrated Analytics Framework to apply the concepts and paradigm shifts of lean to the novel challenges of analytics. Lean is a collection of methodologies for reducing waste in order to deliver more value in the eyes of the customer. We want our analytics teams to be lean, mean, value-producing machines.

Lean has become a bloated term these days. Let’s start by unpacking what lean really means in the context of analytics.

Lean for Analytics

The lean approach focuses on removing waste. Waste means all the meetings, waiting around, unused models, and everything else we do that does not solve the customer’s problem. Leaders care about this because it is not just a cost-saving measure: lean analytics provides your competitive edge and shortens your time to market for innovation.

The Lean Startup by Eric Ries also uses lean to streamline validating, testing, and experimenting with new ideas. Innovation is inherently messy. It requires experimentation and failure, which creates waste. The goal of lean is to remove as much waste as possible from the innovation process through faster learning iterations. Agile and Kanban processes gave software teams the flexibility to iterate quickly, replacing the sluggish waterfall approach. Agile is definitely a step in the right direction for analytics teams, but analytics teams face more uncertainty than software development.

Our team has spent years helping teams design, develop, and ship Analytic Solutions. We identified common patterns and challenges across teams ranging from startups to Fortune 10 enterprises. That’s why we’ve developed the Integrated Analytics Framework to balance the freedom and structure teams need.

Challenges with Agile for Analytics

Analytic Solutions manifest themselves as code, but they are different from software products. Analytics solutions have a lot more going on, especially with Machine Learning and AI. Let’s look at some of the common challenges teams face using Agile for analytics.

Data Science shouldn’t be Time-boxed

Data Science is research work to identify relationships relevant to the business problem. Data scientists need the freedom to explore the data, along with clear alignment to the business problem and structure as they pursue it. Time-boxing exploratory data analysis and solution selection into Agile sprints yields misleading or unusable results. The I.A.F. recommends teams use stage gates and alignment points, as in a waterfall project, to provide structure to research and solution formation. When the model is ready for operationalization, it can be integrated into Agile development and release processes.

Data Science needs Flexibility

Traditional software development focuses on incremental improvements and adding new functionality to an existing codebase. Data scientists, on the other hand, need flexibility in the language and approach that best solve the problem. They may then need to change the whole approach each iteration to improve the solution (e.g., switching from linear regression to XGBoost). Teams need the methods and processes to facilitate changes within the bounds of governance. Software teams need DevOps to rapidly deploy and iterate; analytics requires ModelOps to provide the equivalent flexibility and speed while maintaining governance of the models. This also requires deeper collaboration and structured handoffs between IT and Analytics.
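That kind of swap is painless when every model honors the same fit/predict contract, the core idea behind scikit-learn-style estimator interfaces. Here is a minimal stdlib-only sketch of the idea; the `Model` protocol, `MeanBaseline`, and `SimpleLinear` names are illustrative, not part of the I.A.F.:

```python
from typing import Protocol, Sequence

class Model(Protocol):
    """Shared contract: any object with fit/predict can drop into the pipeline."""
    def fit(self, X: Sequence[float], y: Sequence[float]) -> None: ...
    def predict(self, X: Sequence[float]) -> list[float]: ...

class MeanBaseline:
    """Predicts the training mean for every input."""
    def fit(self, X, y):
        self.mean = sum(y) / len(y)
    def predict(self, X):
        return [self.mean] * len(X)

class SimpleLinear:
    """One-feature least squares: y = a*x + b."""
    def fit(self, X, y):
        n = len(X)
        mx, my = sum(X) / n, sum(y) / n
        sxx = sum((x - mx) ** 2 for x in X)
        sxy = sum((x - mx) * (v - my) for x, v in zip(X, y))
        self.a = sxy / sxx
        self.b = my - self.a * mx
    def predict(self, X):
        return [self.a * x + self.b for x in X]

def mean_abs_error(model: Model, X, y) -> float:
    """Pipeline code stays identical no matter which model is plugged in."""
    model.fit(X, y)
    return sum(abs(p - t) for p, t in zip(model.predict(X), y)) / len(y)
```

Changing the approach then costs one line at the call site (`mean_abs_error(SimpleLinear(), X, y)` versus `mean_abs_error(MeanBaseline(), X, y)`), while governance hooks such as logging and validation live in the shared pipeline code.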

Flexibility in Architecture

Analytics requires establishing new architectural patterns to support the diverse use cases and requirements of the business, whereas software development depends on incremental improvements to a codebase relying on established architectural patterns. The I.A.F. provides the flexibility and processes to accommodate architectural freedom and change. This means systematically defining requirements for evolving architecture and data pipelines.

Data Changes Over Time

Analytics solutions and models face real-world data, and that data can change drastically over time, both statistically and structurally. We saw this with the COVID-19 pandemic: every model had to be reassessed against the new world we live in today. Analytic solutions are built on assumptions that can be quantified and therefore measured over time. Agile processes do not include this form of reactivity. Monitoring, and the controls to handle changes in the data, are crucial parts of the I.A.F.
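One common way to quantify such assumptions is a drift statistic like the Population Stability Index, computed between a feature's training (baseline) distribution and its live distribution. A minimal stdlib-only sketch, where the binning scheme and thresholds are illustrative rather than prescribed by the I.A.F.:

```python
import math
from collections import Counter

def psi(baseline, current, bins=10):
    """Population Stability Index between two numeric samples.
    Common rule of thumb: < 0.1 stable, 0.1-0.25 moderate shift, > 0.25 major shift."""
    lo, hi = min(baseline), max(baseline)
    width = (hi - lo) / bins or 1.0  # guard against a constant baseline

    def fractions(sample):
        # Bin each value against the baseline's range, clamping outliers.
        counts = Counter(
            min(bins - 1, max(0, int((x - lo) / width))) for x in sample
        )
        n = len(sample)
        # A small floor keeps log() defined when a bin is empty.
        return [max(counts.get(b, 0) / n, 1e-4) for b in range(bins)]

    base, cur = fractions(baseline), fractions(current)
    return sum((c - b) * math.log(c / b) for b, c in zip(base, cur))
```

A scheduled job can compute this per feature and alert or trigger retraining when the index crosses a threshold, which is exactly the kind of reactivity sprint-based processes lack.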

Integrated Analytics: Beyond Agile

Here at Ikonos Analytics, we’ve learned to solve these problems across organizations and distilled best practices into the Integrated Analytics Framework. The I.A.F. focuses on improving how analytics teams function by defining handoffs and methods to increase impact, quality, and velocity. It’s a blueprint for an analytics team that runs smoothly from idea to production and iteration. The I.A.F. integrates with existing Agile processes while addressing the challenges of Analytics.

The I.A.F. focuses on the core work streams of analytics:

  • Product - managing analytics as a product including prioritizing requests, prototyping, and managing the roadmap.
  • Impact - ensuring impact on the business and return on investment with rigorous benefit definition and tracking.
  • User Design - leveraging the best practices of user-centric design to drive user adoption and benefit.
  • Analysis - managing and structuring the solution development from a Data Science perspective including Exploratory Data Analysis, Validation, and Monitoring.
  • Data - designing, developing, and managing the data pipeline for the model including DataOps.
  • Architecture - defining and evolving the infrastructure and architecture to meet business needs and standards including ModelOps.  

Get Started Today

Here at Ikonos Analytics, we focus on practical results, so let’s look at how you can get started today. Write out these exercises in one centralized document to help your team get on the same page. Even if your project is in-flight, these can be productive conversations.


Product

Write out the requirements and business case for the projects the team is working on. Write out the coming features or functionality planned for the solution. Have your business stakeholder review them.


Impact

Write out the metrics your analytic solution aims to increase. What is the key performance indicator it changes (customer retention, quality defect rate, time savings, etc.)? Write out how these metrics are calculated and tracked over time.
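As a worked example, a KPI like customer retention can be pinned down as a concrete calculation and tracked period over period. A stdlib-only sketch, where the data and the `retention_rate` function are illustrative:

```python
def retention_rate(active_prev: set, active_curr: set) -> float:
    """Fraction of the prior period's active customers still active now."""
    return len(active_prev & active_curr) / len(active_prev)

# Hypothetical activity logs: customer IDs active in each month.
monthly_active = {
    "2021-05": {"a", "b", "c", "d"},
    "2021-06": {"b", "c", "d", "e"},
    "2021-07": {"c", "d", "e", "f"},
}

months = sorted(monthly_active)
trend = {
    curr: retention_rate(monthly_active[prev], monthly_active[curr])
    for prev, curr in zip(months, months[1:])
}
```

Writing the formula down like this removes ambiguity (is a customer "active" after one login? is retention measured over 30 days or calendar months?) before the metric is tracked over time.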

User Design

Write out the user personas for your solution. Who are they and why do they use your product? How do you collect feedback? What do you do with feedback from users?


Analysis

Collect your research and documentation for how the solution is developed. Link to your notebooks and the data sets and views you used. Organizing these assets will really help your fellow data scientists and analysts see how awesome you are.


Data

Draw out the data pipeline and write out the various data sources. Link to the git repo where the ETL code for the data pipeline is stored. Note the latency requirements.

