When it comes to measuring design system success, both what you measure and how you measure it are important.
Nathan Curtis shares a useful framework for measuring success built on the following six pillars. This article reviews each pillar in detail and shares tools that can help you measure design system success.
Measuring design system success by Nathan Curtis.
1. Productivity
This pillar measures the output generated by the design system and its contributors. It reflects the system’s ability to produce components that help teams develop features faster.
Questions
How much output did the system produce?
How much output did contributors produce?
Key metrics
Number of new components: how many new reusable elements were added to the design system in the current release.
Number of fixes turned around: how many product design issues were resolved using the design system, and how quickly.
Number of releases or updates made to the system: the system's pace of improvement and iteration.
Measuring method
Component code tagging. This involves tagging components within the source code with metadata like their creation date, origin, and usage. It allows teams to track the lifecycle of each component, monitor adoption, and assess how frequently they are being reused.
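As an illustration, component code tagging can be as simple as a script that scans source files for a metadata comment. This is a minimal sketch assuming a hypothetical tag format (`@ds-component name=... since=...`) embedded in component files; real setups often use build-time instrumentation instead.

```python
import re
from pathlib import Path

# Hypothetical tag format, e.g. in a JSX file:
#   /* @ds-component name=Button since=2024-01-15 */
TAG_RE = re.compile(r"@ds-component\s+name=(\S+)\s+since=(\S+)")

def scan_components(src_dir):
    """Collect tagged design-system components and the files they appear in."""
    found = {}
    for path in Path(src_dir).rglob("*.jsx"):
        for match in TAG_RE.finditer(path.read_text()):
            name, since = match.groups()
            entry = found.setdefault(name, {"since": since, "files": []})
            entry["files"].append(str(path))
    return found
```

Running this periodically over a repository gives the raw counts behind the productivity metrics above: new components per release and how widely each one is reused.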
2. Coverage
This pillar evaluates how widely the design system is used across different products and whether these products are aligned with the design system’s latest version. It assesses both reach and influence.
UI component coverage over time for a design system. Image by Cristiano Rastelli from his guide to measuring DS.
Questions
Who is using the system (products, teams, etc.)?
How up to date are products relative to the system?
Key metrics
Number of products using the system: measures the breadth of the system's adoption across different projects and platforms.
% of pages applying the system: helps determine how deeply the system is embedded in product design.
Level of adherence to the DS across different touchpoints: assesses the extent to which the design standards defined in the DS are consistently applied across various touchpoints (e.g., pages, apps).
Onfido's design system measures component adoption across projects. Image by Steve Dennis from his case study on measuring Onfido's DS.
Measuring methods
Automated component tracking. By tagging components, teams can use automation to track where and how frequently they are used in production.
The Onfido team shows component usage on a web page. Image by Steve Dennis from his case study on measuring Onfido's DS.
Manual page/product audit. Periodically (e.g., once a quarter), teams can manually review products and pages for conformance to the design system, especially when automated tools are not feasible or available.
3. Efficiency
Efficiency measures how much time or effort the design system saves in the design and development process, especially in speeding up the delivery of features or bug fixes.
Questions
How much time is saved using the system?
Is speed-to-market improved?
Key metrics
Time-to-market (days): tracks how much the system accelerates the process of bringing features or products to market.
Average time for bug fixes or feature updates: measures how the system's components contribute to faster resolution of issues or enhancements.
Measuring method
Velocity task type tracking. By categorizing tasks based on whether they used the design system or not, teams can compare how much time is saved on system-related tasks versus custom work. This comparison helps reveal how much efficiency the system adds to development.
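To make the comparison concrete, here is a minimal sketch (with a made-up task log) of how velocity tracking by task type might be summarized; the data shape is an assumption, not a prescribed format.

```python
from statistics import mean

def efficiency_gain(tasks):
    """tasks: list of (used_design_system: bool, days_to_complete: float).
    Returns (avg days for DS tasks, avg days for custom tasks, % time saved)."""
    ds_days = [days for used, days in tasks if used]
    custom_days = [days for used, days in tasks if not used]
    ds_avg, custom_avg = mean(ds_days), mean(custom_days)
    saved_pct = (custom_avg - ds_avg) / custom_avg * 100
    return ds_avg, custom_avg, saved_pct

# Example: two DS-backed tasks vs. two custom-built tasks.
ds_avg, custom_avg, saved = efficiency_gain(
    [(True, 2), (True, 4), (False, 6), (False, 6)]
)
```

Even a rough split like this gives a defensible "time saved" number to report alongside time-to-market.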
4. Satisfaction
This pillar measures how satisfied the design system's users (primarily designers and developers) are with the tools, processes, and resources it provides. This is essential to ensure the system's ongoing use and adoption.
Questions
Are designers satisfied with the system?
Are developers satisfied with the system?
Key metrics
System Usability Scale (SUS) score: a widely used method to assess how intuitive and easy the system is to use.
Complaint frequency: how often users raise concerns or issues about the design system, its ease of use, or its effectiveness.
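The SUS score itself follows a fixed formula: each of the ten responses is on a 1 to 5 scale; odd-numbered (positively worded) items contribute the rating minus 1, even-numbered (negatively worded) items contribute 5 minus the rating, and the total is multiplied by 2.5 to land on a 0 to 100 scale. A small sketch:

```python
def sus_score(responses):
    """Compute the System Usability Scale score from ten 1-5 ratings.

    Odd-numbered items (index 0, 2, ...) are positively worded,
    even-numbered items (index 1, 3, ...) are negatively worded.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs exactly ten ratings between 1 and 5")
    total = sum(r - 1 for r in responses[0::2]) + sum(5 - r for r in responses[1::2])
    return total * 2.5
```

A respondent who strongly agrees with every positive item and strongly disagrees with every negative one scores 100; uniformly neutral answers score 50.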
Measuring methods
Surveys. Regularly gather structured feedback from design system users to gauge their satisfaction and understand where improvements are needed.
Sentiment analysis. Analyze Slack channels, forums, or other internal communication tools to assess the general tone of conversations about the system; this can help surface recurring issues or frustrations.
Employee reviews. Open-text feedback provides qualitative insights into the pain points and successes experienced by users.
5. Consistency
Consistency ensures that the design system delivers a uniform look, feel, and user experience across all products. This covers visual design, UI, and UX alignment with the system’s standards.
Questions
Is the UI/visual design consistent?
Is the UX consistent?
Key metrics
Visual design consistency: tracks whether elements like color schemes, typography, and layout remain consistent across products.
UX consistency: measures the degree to which interaction patterns, navigational structures, and user flows align with the design system.
Frequency of using the latest components: how often the most up-to-date components from the system are used across products.
Measuring methods
Design and code linting. Automated design and code analysis tools can verify that design components are implemented consistently across the product. Design Lint is a useful Figma plugin that finds missing styles within your designs; it is handy for fixing errors in your design system or designs before handoff. For code linting, there are good tutorials on linting ReactJS and CSS.
Design Lint Figma plugin by Daniel Destefanis.
Manual product audits. Teams can manually inspect products and pages to ensure they adhere to the design system's visual and UX guidelines.
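To give a taste of what such linting automates, here is a hypothetical sketch that flags hex colors in a stylesheet that are not part of an approved token palette. The palette values are placeholders; a real checker would load them from the design system's token source.

```python
import re

# Hypothetical design-system color tokens (placeholders).
APPROVED_COLORS = {"#0055ff", "#111111", "#ffffff"}
HEX_RE = re.compile(r"#[0-9a-fA-F]{6}\b")

def lint_colors(css_text):
    """Return hex colors used in the stylesheet that are not approved tokens."""
    used = {color.lower() for color in HEX_RE.findall(css_text)}
    return sorted(used - APPROVED_COLORS)
```

Run against a stylesheet, anything it returns is a hard-coded color that bypassed the design system, which is exactly the kind of drift the consistency pillar tracks.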
6. Quality
This pillar focuses on the end-user experience, evaluating how well the products designed using the system meet usability and accessibility standards.
Questions
Are products usable?
Are products accessible?
Are products satisfying?
Key metrics
Usability testing: determines how easy or difficult it is for users to interact with products built on the design system.
Accessibility scoring: ensures products adhere to accessibility guidelines such as the Web Content Accessibility Guidelines (WCAG), which focus on making web content usable for all people, including those with disabilities.
Automated accessibility review results using the WAVE tool. Image by WebAIM.
Measuring methods
Design audits/manual assessment. Product teams can conduct manual reviews to assess the usability and accessibility of products.
Automated scoring. Tools can evaluate code against accessibility standards automatically and provide a score or feedback on compliance.
Automated scoring for UI components in a design system. Image by Lucid.
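One well-defined piece of automated accessibility scoring is the WCAG contrast-ratio formula, which can be computed directly from a foreground and background color. A sketch:

```python
def _luminance(rgb):
    """WCAG relative luminance for an (r, g, b) tuple with 0-255 channels."""
    def channel(c):
        c /= 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio between two colors; WCAG AA requires >= 4.5:1 for body text."""
    lighter, darker = sorted((_luminance(fg), _luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)
```

Black text on a white background yields the maximum ratio of 21:1; checking every text/background pair in a component library against the 4.5:1 threshold is the kind of rule automated accessibility tools apply at scale.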
Measuring Design System Success was originally published in UX Planet on Medium, where people are continuing the conversation by highlighting and responding to this story.