Nathan Curtis
Founder of UX firm @eightshapes. Speaker. Writer. Fan of Arsenal, Hokies. Cyclist & runner. Father & husband. VT & @uchicago grad.
Originally published 6/8/2017 • 5 min read
Measuring Design System Success
Use OKRs to Set Goals & Track Progress
As an advocate of your system practice, can you paint the picture of what success looks like? And, is it measurable?
I’ve had some success helping system teams express goals in the form of objectives and quantifiable key results:
- Objective: Where do we want to go?
- Key Result: How will we pace ourselves to see if we are getting there?
Google established the OKR process to help individuals and teams set ambitious goals and track their progress in a measurable way. To me, OKRs are a fantastic blend of specific, direct, and flexible: they help you reach further than you think you can go while still being satisfied with incremental, reasonable achievement.
What follows is a glimpse of the kinds of examples I’ve heard and helped others form using that model. May they inspire your team to express its aims: publishing tools and documentation, fostering adoption across a product portfolio, operating a systems team, and creating a community of contributors.
Product Adoption
Let’s make one thing clear: you don’t succeed when you launch a living style guide or release a code kit.
A system’s value is realized when products ship features that use a system’s parts.
Therefore, your main motivation is to help your customers — teams making experiences — improve efficiency, cohesiveness, and quality. And that’s accomplished by them adopting what you make.
Sample Objective
- Products across the portfolio adopt the system’s parts to ship features more efficiently, more cohesively, and with higher quality.
Sample Key Results
Within a defined period, such as 12 months:
- Initiated Adoption: 100% of flagship and secondary products installed the system as an npm dependency and implemented ≥1 core concern — color, icons, buttons, forms, or typography — in ≥1 high-priority product feature.
- Graded Adoption: 100% of flagship products and 50% of secondary products achieve level 3 system adoption (where progression across levels uses criteria like which system parts are implemented, coverage across product features and sections, and/or quality/precision of implementation).
- Commitment to Adopt: Published roadmaps commit the remaining secondary products and ≥5 other products to achieve level 2 adoption.
- Adoption as Migration/Upgrade: All flagship & secondary products have upgraded to {new system} and no products depend on {old system}.
- Adoption Age: System version dependencies are no more than 6 months old for flagship products and 12 months old for secondary products (a measurement sketch follows this list).
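A key result like Adoption Age is only useful if someone can actually check it. One way is to compare each product’s installed system version against that version’s publish date in the npm registry. The sketch below is illustrative, not a prescribed tool: it assumes Node 18+ (for global fetch), a hypothetical package name (@acme/design-system), hypothetical local product repo paths, and the 6-month flagship threshold from the key result above.

```typescript
// Sketch: estimate "Adoption Age" by comparing each product's declared system
// version to that version's publish date in the npm registry.
// Assumptions: hypothetical package name and repo paths; a real audit would
// read the exact installed version from each product's lockfile.
import { readFileSync } from "fs";
import { join } from "path";

const SYSTEM_PACKAGE = "@acme/design-system"; // hypothetical package name
const PRODUCT_REPOS = ["../flagship-app", "../secondary-app"]; // local clones
const MAX_AGE_MONTHS = 6; // flagship threshold from the key result

async function fetchPublishDates(): Promise<Record<string, string>> {
  // The public npm registry exposes publish timestamps per version under "time".
  const res = await fetch(`https://registry.npmjs.org/${SYSTEM_PACKAGE}`);
  const meta = (await res.json()) as { time: Record<string, string> };
  return meta.time;
}

function declaredVersion(repoPath: string): string | undefined {
  // Read the declared version from the product's package.json.
  const pkg = JSON.parse(readFileSync(join(repoPath, "package.json"), "utf8"));
  return pkg.dependencies?.[SYSTEM_PACKAGE] ?? pkg.devDependencies?.[SYSTEM_PACKAGE];
}

async function main() {
  const publishDates = await fetchPublishDates();
  for (const repo of PRODUCT_REPOS) {
    // Strip a leading ^ or ~ range prefix; ranges beyond that need the lockfile.
    const version = declaredVersion(repo)?.replace(/^[\^~]/, "");
    if (!version || !publishDates[version]) {
      console.log(`${repo}: system not installed or version unknown`);
      continue;
    }
    const ageMonths =
      (Date.now() - Date.parse(publishDates[version])) / (1000 * 60 * 60 * 24 * 30);
    const ok = ageMonths <= MAX_AGE_MONTHS;
    console.log(`${repo}: v${version}, ~${ageMonths.toFixed(1)} months old ${ok ? "✓" : "✗"}`);
  }
}

main().catch(console.error);
```

Run on a schedule, a report like this gives the team a recurring, objective read on whether adoption age is drifting past the target.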
Operating a Systems Team
Setting up a smoothly operating systems team can be important to getting things done. You need people dedicated to the cause, and some teams struggle to sustain that commitment and keep their leaders engaged.
Sample Objective
- Effectively operate a stable systems team as a funded concern with regular sponsor participation.
Sample Key Results
Within a defined period, such as 6 months:
- Capacity: Leadership funds 5 head count (some capacity part-time) as a system team for four consecutive quarterly release cycles.
- Participation: Team members miss ≤5% of scrums, critiques, planning, and review sessions due to conflicting priorities.
- Productivity: The systems team improves average velocity from 45% to ≥75% with no sprint velocity below 50%.
- Predictability: The system publishes every minor release on a quarterly basis (or other regular interval) within 48 hours of the announced date & time.
- Sponsor Engagement: Primary sponsors — VP, Design and Director, Platform Front End Engineering — attend 100% of release planning sessions, ≥66% of sprint reviews, and ≥20% of sprint planning sessions.
Cultivating a Community
Some design and engineering leaders see the system as a lever to change culture and collaborative practices. Whether literally open sourcing a codebase or nurturing connectedness and collaboration, it’s time to build a community.
Sample Objective
- Form a community to share decisions on, contribute to, and release shared design assets and code.
Sample Key Results
Within a defined period, such as 6 months:
- Critiques/Shares: Conduct regular design (10+) and engineering (10+) demos and critiques of system topics: ≥160 non-system team attendees, in aggregate.
- Process Development: Document and publish 5 processes for soliciting outside contributions, such as coding style, contribution reviews, & accessibility requirements.
- Authored Doc, Utilized Workflow: Publish documentation composed by 7+ contributors using the system’s editorial workflow.
- Code: Merge PRs from ≥6 non-system team contributors (see the sketch after this list).
- Voluntary Attendance: Community contributors attend regular system team sprint planning and/or critiques: 30 non-system team attendees, in aggregate.
- Feedback: System investment mentioned positively in ≥10% of employee satisfaction surveys.
- Collaboration: Active participation in the #system-design and #system-engineering Slack channels by ≥30 non-system team members.
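The contribution key result is also straightforward to check from the repo history. The sketch below is one possible approach, assuming squash- or merge-committed PRs land on a main branch, a local clone with git on the PATH, and a hypothetical list of system team email addresses; it counts unique outside commit authors over the OKR period.

```typescript
// Sketch: count unique non-system-team commit authors on the default branch
// during the period, as a proxy for merged outside PRs.
// Assumptions: branch named "main", hypothetical team emails, local clone.
import { execSync } from "child_process";

const SYSTEM_TEAM = new Set(["sys-lead@example.com", "sys-dev@example.com"]); // hypothetical
const SINCE = "6 months ago"; // align with the OKR period

// List author emails of every commit reachable from main since the period start.
const authors = execSync(`git log main --since="${SINCE}" --format=%ae`, {
  encoding: "utf8",
})
  .split("\n")
  .filter(Boolean);

// Keep only contributors outside the system team, deduplicated by email.
const outside = new Set(authors.filter((email) => !SYSTEM_TEAM.has(email)));

console.log(`Non-system-team contributors: ${outside.size} (target: ≥6)`);
```

If your PRs merge with merge commits rather than squashes, filter on the PR author instead (for example via your host’s API); the point is simply that the key result can be pulled from data you already have.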
Monitoring Product Improvement
For most teams, capacity to make a library is an ongoing challenge. Measuring efficiency or product quality seems a luxury unless the enterprise has an existing program the system can build upon. Beyond that, I direct teams to consider the triggers of a system investment.
Accessibility could be one. In a recent system pitch to executives, the conversation turned when a VP responded to our promise of built-in accessibility with “Oh yeah, our product teams can’t do this themselves. And we’ve needed to mitigate that risk in a few markets for a while. Let’s do it!” That’s an executive endorsing an objective.
If a corporate goal exists — accessibility, responsive web design, or another measurable criterion — and the system can help, redirect system objectives and track results commensurate with that goal.