KPIs for ASPICE

Performance measurement of ASPICE process areas is not only key to continuous process improvement but also a prerequisite for reaching capability Level 3 and above. Although this seems to be a critical aspect, my research did not turn up any best practices or examples of KPIs (Key Performance Indicators) for ASPICE process areas. Based on this gap and my experience in implementing and assessing automotive processes, I decided to write a blog post and share my thoughts on which measurements I consider suitable and useful.

I believe that development processes should follow agile/lean principles, so the proposed KPIs are based on iterative product increments: all relevant work products identified in the KPIs are finished and reviewed before the start of the next development cycle, at least to the extent that they can be used for development. Furthermore, I believe there should be only a few but very meaningful KPIs; once the data becomes overwhelming, it gets harder to focus on the significant indicators and to make the right decisions.

Although these measurements assume an agile and iterative approach, with a bit of tweaking they can be applied in a waterfall/V-model setting as well. The post covers only the process areas of the HIS (Hersteller Initiative Software) scope, since those are the ones most commonly used in automotive.

MAN.3 – Project Management

KPIs

  • Deadlines and upcoming tasks known by the whole team (transparency)
  • Interface definition and system integration plan available

Goals / Incentives

  • Enables a big-picture view
  • Encourages team building and flow of information
  • Establishes communication and coordination between disciplines

 

SUP.1 – Quality Assurance

KPIs

  • Number of errors found in the validation cycle
  • Number of major bugs in customer releases

Goals / Incentives

  • Focus on fault prevention rather than fault detection

 

SUP.8 – Configuration Management

KPIs

  • Nightly builds have a success rate > 99% (see the sketch below)

Goals / Incentives

  • Enforce continuous integration
  • Create a stable build environment
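
As a rough illustration of the nightly build KPI, the success rate could be derived from results exported from the CI server. This is only a minimal sketch; the file name and JSON structure are assumptions, not a specific tool's format:

    import json

    # Nightly build results exported from the CI server
    # (file name and structure are assumed for this example).
    with open("nightly_builds.json") as f:
        builds = json.load(f)  # e.g. [{"date": "2015-03-01", "status": "success"}, ...]

    successful = sum(1 for b in builds if b["status"] == "success")
    success_rate = 100.0 * successful / len(builds)

    print("Nightly build success rate: {:.1f}% ({} of {} builds)".format(
        success_rate, successful, len(builds)))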

 

SUP.9/SUP.10 – Problem Resolution Management / Change Request Management

KPIs

  • Lead Time and Cycle Time (see the sketch below)

Goals / Incentives

  • Reduce work in progress, multitasking and task switching
  • Get work faster through the value chain
  • “Stop starting, start finishing”

(Figure: lead time vs. cycle time)
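
To make the distinction concrete: lead time is typically measured from the moment a ticket is created until it is delivered, while cycle time only starts once work actually begins. The sketch below computes both from an assumed ticket export; the field names and date format are placeholders, not a specific tool's schema:

    from datetime import datetime
    from statistics import mean

    # Assumed ticket export with creation, work-start and delivery dates.
    tickets = [
        {"created": "2015-03-02", "started": "2015-03-10", "delivered": "2015-03-14"},
        {"created": "2015-03-05", "started": "2015-03-06", "delivered": "2015-03-16"},
    ]

    def days(start, end):
        fmt = "%Y-%m-%d"
        return (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).days

    lead_times = [days(t["created"], t["delivered"]) for t in tickets]   # customer view
    cycle_times = [days(t["started"], t["delivered"]) for t in tickets]  # team view

    print("Average lead time:  {:.1f} days".format(mean(lead_times)))
    print("Average cycle time: {:.1f} days".format(mean(cycle_times)))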

ENG.2/ENG.4 – System Requirements Analysis / Software Requirements Analysis

KPIs

  • Number of customer Change Requests per iteration

Goals / Incentives

  • Use front-loaded development
  • Discussion and refinement of requirements has high priority

 

ENG.3/ENG.5 – System Architecture / Software Design

KPIs

  • Number of module/interface changes after design freeze (see the sketch below)
  • Number of reused modules (measured only in the successor project)

Goals / Incentives

  • Create modular and stable designs right away
  • Support front-loaded and concurrent development
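
For the first KPI, one way to count module or interface changes after the design freeze is to look at version control: count the commits that touch public interface files after a freeze tag. The tag name "design-freeze" and the path pattern are assumptions made for this sketch:

    import subprocess

    # Count commits that touched public interface headers after the design freeze.
    # The tag name and the path pattern are assumptions for this example.
    commits = subprocess.check_output(
        ["git", "rev-list", "design-freeze..HEAD", "--", "include/*.h"],
        universal_newlines=True)

    interface_changes = len(commits.splitlines())
    print("Interface changes after design freeze:", interface_changes)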

 

ENG.6 – Software Construction

KPIs

  • Static code analysis errors = 0
  • Module test coverage = 100%
  • Code review coverage = 100% (see the quality-gate sketch below)

Goals / Incentives

  • Create clean and maintainable code
  • Build quality in rather than inspecting it in
  • Create a modular design with low complexity
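
These three thresholds lend themselves to an automated quality gate in the build: the build fails as soon as static analysis reports a finding or one of the coverage figures drops below 100%. The values below stand in for whatever the analysis, unit test and review tools actually report, and how they are obtained depends on the tool chain; a minimal sketch:

    import sys

    # Figures collected from the static analysis, unit test and review tools
    # (the collection mechanism is assumed and tool-specific).
    static_analysis_errors = 0    # e.g. findings reported by a MISRA checker
    module_test_coverage = 100.0  # statement/branch coverage in percent
    code_review_coverage = 100.0  # share of changed code that was reviewed, in percent

    failures = []
    if static_analysis_errors > 0:
        failures.append("static analysis errors: {}".format(static_analysis_errors))
    if module_test_coverage < 100.0:
        failures.append("module test coverage: {:.1f}%".format(module_test_coverage))
    if code_review_coverage < 100.0:
        failures.append("code review coverage: {:.1f}%".format(code_review_coverage))

    if failures:
        print("Quality gate failed: " + "; ".join(failures))
        sys.exit(1)
    print("Quality gate passed")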

 

ENG.7 – Software Integration Test

KPIs

  • Degree of automated integration tests > 90%

Goals / Incentives

  • Perform integration tests for each iteration
  • Faster analysis after module update

 

ENG.9 – System Integration Test

KPIs

  • Full test coverage early on
  • Extensive set of test cases

Goals / Incentives

  • Create a stable specification and achieve high quality quickly
  • Ensure coordination and cooperation of different disciplines

 

ENG.8/ENG.10 – Software Testing / System Testing

KPIs

  • Degree of automated tests > 80% (see the sketch below)
  • Test cases developed in parallel with features

Goals / Incentives

  • Allow test cycles for each iteration
  • Support finding faults early on
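
The degree of test automation can be read directly from the test management tool as the share of test cases with an automated execution type. The sketch below assumes a CSV export; the file name and column names are placeholders, not a specific tool's format:

    import csv

    # Assumed export from the test management tool, one row per test case,
    # with an "execution" column set to "automated" or "manual".
    with open("test_cases.csv") as f:
        cases = list(csv.DictReader(f))

    automated = sum(1 for c in cases if c["execution"] == "automated")
    degree = 100.0 * automated / len(cases)

    print("Degree of automated tests: {:.1f}% (target > 80%)".format(degree))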

 

ACQ.4 – Supplier Monitoring

KPIs

  • Number of Change Requests per iteration

Goals / Incentives

  • Joint interfaces and processes are defined early on
  • All information, including progress, is shared

 

Although product development can differ greatly depending on the product, tools and processes used, I hope this post gives you some valuable insight. I look forward to your opinions on these KPIs and to your experience with process performance measurement in general.
