


Shipped
SaaS
UX Writing
Streamlining a Disruptive Survey User Flow
Duo Security · 2022
ROLE
Product Design Intern
SKILLS
Product Design
User Research
User Journey
TIMELINE
May ‘22 - Aug ‘22
TEAM
1 designer, 2 researchers,
2 engineers, 4 data scientists
Introduction
Cisco Duo provides two-factor authentication (2FA) for organizations to keep user logins secure. In 2023, they introduced the Universal Prompt, a more secure authentication flow.
The update improved security but also introduced changes that risked frustrating users. To support a smooth rollout, Duo needed a clear, measurable way to monitor user sentiment.
My Role
As the product designer, I partnered closely with UX researchers and data scientists to identify the best way to capture meaningful user sentiment during the authentication flow. I led the research and design of a lightweight, binary feedback prompt, carefully placed to minimize disruption while maximizing response quality and actionability. The redesigned survey drove a 122% year-over-year increase in total responses (from 9,000 to 20,000) and a 28.6% response rate.



Understanding the Shift
To design an effective survey, I first needed to deeply understand what had changed in the new Universal Prompt and where users might struggle.
By mapping out the user journey, I was able to identify when issues were most likely to occur and where users would be mentally available to reflect on the experience.



Redefining Feedback Collection
But knowing when to insert the prompt was only part of the challenge. I also needed to define what data to collect and how to gather it effectively.
At the time, the beta survey was just a persistent Google Form link on the login page, asking users to recall their old experience and compare it to the new one.



Where the Old Survey Fell Short
The survey revealed that 75% of users preferred the new Universal Prompt over the old one, but it was hard to understand why.
The mix of free-response and Likert-scale questions made the data difficult for data scientists to analyze. We needed feedback that was clearer, more specific, and easier to act on.
I analyzed and sorted over 9,000 beta survey responses and identified three key issues:









Reframing the Problem
How might we build a professional, low-friction survey that collects accurate and actionable user feedback?
This was the core question driving the redesign. To solve it, I first consulted with engineers, data scientists, and accessibility experts to understand our constraints:
• No pop-ups (for accessibility and security)
• Authentication time is limited, so interaction had to be quick
• Survey must match Duo’s brand, visually and in tone
I set three clear goals:
1. Avoid disrupting authentication
2. Feel quick and easy
3. Collect accurate and actionable feedback
My Design Approach
I explored multiple design directions and refined them based on feedback from fellow designers and insights from user testing.









Testing & Iterating
Over the course of three design iterations, I:
• facilitated 50+ unmoderated usability tests on UserTesting.com
• wrote 4 task-based testing scripts
• helped researchers analyze and aggregate findings from session recordings
Key Takeaways:
• Users welcomed short surveys if they didn’t feel frequent
• Clear, familiar UI helped users complete the survey quickly
• Confusion in wording caused hesitation or inconsistent answers
• Large buttons improved speed and perceived importance
These insights directly shaped the final interaction flow and copy.
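One way to translate the "don't feel frequent" takeaway into product logic is a simple per-user cooldown check before showing the prompt. The sketch below is purely illustrative — the names, the 30-day window, and the stop-after-response rule are my assumptions, not Duo's actual implementation:

```typescript
// Hypothetical frequency cap for an inline feedback prompt.
// COOLDOWN_MS and the stop-after-response behavior are illustrative assumptions.
const COOLDOWN_MS = 30 * 24 * 60 * 60 * 1000; // show at most once per 30 days

interface SurveyState {
  lastShownAt: number | null; // epoch ms of the last prompt, or null if never shown
  responded: boolean;         // stop asking once the user has answered
}

function shouldShowSurvey(state: SurveyState, now: number): boolean {
  if (state.responded) return false;           // never re-ask after a response
  if (state.lastShownAt === null) return true; // first eligible authentication
  return now - state.lastShownAt >= COOLDOWN_MS;
}
```

Because the check is a pure function of stored state and the current time, it never blocks the authentication flow — the prompt simply renders or doesn't on each login.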









Impact
Quantitative Impact:
🕒 Survey time reduced by over 99% (from ~5 minutes to ~2 seconds)
✅ 100% task completion rate in final usability test
📈 28.6% response rate from 70k+ users
📊 20,000+ responses gathered to support the Universal Prompt rollout
Qualitative Impact:
• Users described the experience as “standard,” “clear,” and “expected”
• Engineers and leadership had actionable, real-time insights
• Reduced frustration and improved user trust

Sierre Wolfkostin, Design Manager
"Lily shows up everyday with a desire to learn. She proactively seeks knowledge both inside and outside of Duo to refine her designs. She solicits feedback early and often, and goes out of her way to seek multiple perspectives on her work."
Reflection & Lessons Learned
As an intern, I faced pushback from other teams and senior designers who questioned my involvement in the project. Although it was "just a survey," many were concerned that adding one inside the authentication flow would disrupt the user experience and go against Duo Security's design principles. I conducted extensive research, ran multiple rounds of testing, and iterated on feedback to address these concerns, ultimately shipping the project with some compromises.