Stand & Deliver

Redesigning a widely used training tool for IBM sales and consulting teams.
Company
IBM
My Role
Design lead
Type
UX/UI, Content, and Visual design
Tools
Figma
Impact
Drove a 40% improvement in sales training for 40,000 global employees, delivering a scalable learning solution that boosted productivity and cut support tickets by 15% for the product team.

What is it

Stand & Deliver is an HR training tool that helps users deliver successful sales pitches and presentations.

There are two main user groups: Submitters and Reviewers. Submitters record a video presentation and submit it for review. Reviewers evaluate the video and send back a rating with written feedback. The tool is widely used by IBM sales and consulting teams, with around 40,000 global users.

The Challenge

Unintuitive design eroded user trust and, in turn, created an unsustainable support ticket volume for the product team.

The previous UI was created by developers without input from designers or users. User flows were fragmented and didn't correctly use the familiar components of the IBM Carbon Design System. Submitters were confused by the steps of the video recording process and unsure whether their videos had been successfully submitted. Reviewers repeatedly received error messages due to missing fields in the rating form.

This caused issues not only for users but also for the product and business teams. The team received a large number of daily support tickets and messages from concerned users, which hindered their productivity in updating the site, as much of their time was spent resolving individual user issues.

My Role

As design lead, I led the efforts to streamline the user flow and improve user sentiment.

I worked closely with a UX researcher, product owner, and developers. I led weekly design review calls with the team and monthly calls to walk our stakeholders through design updates. I also collaborated with design leaders to gain their input and ensure that the experience was consistent with others in our org.

I originally joined the project to design an admin experience. Once the team and stakeholders saw the value in my design work, I expanded my role to revamp the entire submission and review processes.
Screenshots of the previous screens created by the team (not designed by me).

The Goals

Create simplified and intuitive video submission and review processes to increase user trust and productivity.

Through a simplified video submission process, an intuitive review process, and clearer messaging, users would have less need to create support tickets, reach out with time-based questions, or re-submit videos.

The team also wanted to introduce an AI component to further guide users. This feature would provide users with AI-generated feedback before they submit their videos for review, minimizing the number of times they would need to re-submit.
As-is journey map for the previous Reviewer experience.

Process

Reviewer experience

The Reviewer experience was the highest priority because these users had the most trouble completing their tasks. This slowed down the review process and prevented Submitters from completing their training assignments.

I started by documenting the Reviewer user journey using Design Thinking exercises (as-is journey and empathy maps). Through user interviews and surveys, I highlighted key areas of the current experience to identify gaps and opportunities for improvement. I then focused on the main pain points and provided a corresponding recommendation for each to build an improved user flow:
Reviewers lost track of open items to complete in the Reviews dashboard.

Within the review form, Reviewers often missed fields. Many could not locate the submit button or accidentally submitted incorrect scores.

Using Figma and the Carbon Design System, I designed a new version of the Review dashboard and form, taking the user pain points into account and applying my above recommendations. I revamped the layout and UI of the dashboard and form, making the design more intuitive and consistent with IBM Design standards.
Reviews dashboard, desktop and mobile view.
With each design iteration, I shared my work during weekly design reviews with the product team, ensuring everything was technically feasible. I shared my progress during showcases with our stakeholders, making adjustments to align with their specific needs. I also met with users to continually get their feedback using prototypes of the designs.
Rate & review form

Submitter experience

I approached the re-design of the Submitter experience in a similar way, focusing on opportunities to improve user sentiment in areas of disruption and confusion. The main pain points within the Submitter experience included:

Submitters were unsure whether their videos had been successfully submitted to Reviewers.

Submitters were lost in the video recording & submission process due to a fragmented experience.

Submissions dashboard showing pending reviews and the create new submission form.
For the Submitter dashboard, I borrowed several UI patterns from the Reviewer experience, both for consistency and because of positive user feedback. The two dashboards share a similar layout, with the same tabs, filters, and display options in the same locations. For the Create New Submission form, I also reused the same progress indicator in the left side panel for an easier step-by-step experience.
Record video screen and record slide presentation screen.
I completely redesigned the Record Video flow. I started by researching similar video recording platforms, noting how they handled both the UI and the different stages of the recording process. I used existing Carbon components to create new, intuitive patterns for the video control bar and options. These were tested with users and improved based on their feedback.

Throughout the design process, I updated the UI of each screen using Carbon components for better consistency and adherence to IBM Design standards, and used these components to create new patterns throughout the site. I ensured that the design was built on the proper grid system, with improved alignment and padding between elements. This gave users more room to breathe and focus on the task at hand, without the previous visual clutter.

AI Generated Feedback

To further increase user productivity and take advantage of IBM WatsonX offerings, the product team and stakeholders wanted to introduce an AI component to the experience. This component would analyze the user's voice in their submission video and provide helpful feedback before the video is submitted to the reviewer.

To start this process, I researched similar AI features in other video recording apps, identifying what vocal feedback they provided and how they presented it to users. We polled key users on what feedback would be most valuable for them to receive. I then worked with the engineering team to determine what feedback was technically feasible to provide from the AI models in use.

From this research, the user poll results, and partnership with the technical team, I compiled four categories of feedback: Energy, Clarity, Sentiment, and Common words. Within these areas, users receive feedback on their vocal tone, pace, volume, commonly used words, filler words, and other tips for improvement.

I created the content for each of these categories, writing an explanation of each metric along with the text that accompanies the different scoring ranges and grades. I mapped out the thresholds for the scoring criteria and the content that appears to the user for each grade range.
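As a rough illustration only, the mapping from grade ranges to feedback content can be thought of as a simple lookup structure, sketched below in TypeScript. The four category names come from the project; the score thresholds, grade labels, copy, and the feedbackFor helper are hypothetical placeholders, not the production implementation.

// Hypothetical sketch of the grade-range-to-feedback mapping.
// Category names are from the project; thresholds and copy are illustrative only.
type Category = "Energy" | "Clarity" | "Sentiment" | "Common words";

interface GradeRange {
  min: number;     // inclusive lower bound of the score range (0-100 scale assumed)
  grade: string;   // grade label shown to the user
  message: string; // feedback copy shown for this grade range
}

const feedbackContent: Record<Category, GradeRange[]> = {
  Energy: [
    { min: 80, grade: "Strong", message: "Your vocal tone and volume keep listeners engaged." },
    { min: 50, grade: "Moderate", message: "Try varying your tone and pace to hold attention." },
    { min: 0, grade: "Needs work", message: "Project your voice and add emphasis to key points." },
  ],
  Clarity: [
    { min: 70, grade: "Clear", message: "Your pacing and articulation are easy to follow." },
    { min: 0, grade: "Needs work", message: "Slow down and pause between main ideas." },
  ],
  Sentiment: [
    { min: 60, grade: "Positive", message: "Your delivery comes across as confident and upbeat." },
    { min: 0, grade: "Neutral", message: "Add enthusiasm when describing the value to your audience." },
  ],
  "Common words": [
    { min: 60, grade: "On message", message: "Key terms come through without many filler words." },
    { min: 0, grade: "Watch fillers", message: "Reduce filler words such as \"um\" and \"basically\"." },
  ],
};

// Return the feedback entry whose range contains the score (ranges listed high to low).
function feedbackFor(category: Category, score: number): GradeRange {
  const ranges = feedbackContent[category];
  return ranges.find((r) => score >= r.min) ?? ranges[ranges.length - 1];
}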

With this new feature, users can improve their vocal presentation skills and fix any sound issues before submitting to their reviewer. This improves their chances of receiving a higher rating and minimizes the need to correct and re-submit.

Outcome & Business Value

Productivity increase and elimination of traditional training costs

Through implementing a simplified video recording, submission, and review process, along with UI improvements, users reported an increase in their training efficiency. Sales and consulting teams reported faster turnaround time for ratings and closing out training assignments.
"Stand and Deliver" videos offer a flexible and engaging alternative to traditional training methods.

Because videos can be done on the student’s own schedule, independently, it eliminates costs and coordination needs associated with in-person education. It also reduces impact to students’ client commitments, i.e., recordings can be made outside of normal billable hours.” – IBM Consulting

Effective learning tool with 40% training improvement

The updates made to the Submitter experience contributed to a significant improvement in sales performance, increasing revenue for the company.
“Leveraging Stand & Deliver methodology as a training and development technique for our global sales team has proved to be an effective adult learning approach.

The metrics reveal that using videos significantly improves performance, resulting in a 40% improvement over traditional training methods.” – IBM Sales Enablement

Scalable training platform

The improvements enabled an automated training program that can easily scale to meet the needs of different teams, curriculums, and students.
“Our Stand and Deliver workflows automate the entire process, providing a scalable solution that can quickly adjust to meet the business needs. It is a foundational block of sellers' progression training.” – IBM Sales Enablement

Decline in support tickets

The updated, more intuitive user flow, along with clearer messaging and status updates, led to a decline in support tickets and messages received by the team. These have dropped by 15% since the updates went live.
Although some design features are still being implemented and rolled out, those already in production have been met with positive feedback from users and stakeholders.
Note: Images and text in mockups are placeholders so as not to disclose confidential internal information.