IBM Watsonx Analytics
UX Researcher
Creating a central hub for IBMers to submit generative AI use cases.
Short Term Project
SUMMARY
1. Product Background
2. Problem & Value
3. User Interviews
4. Outcomes
The product
IBMers submit new generative AI product ideas to our platform, where the ideas are reviewed and potentially approved.
Key responsibilities
Conducted UX research to improve the platform's overall submission and review process; ran qualitative studies and shared findings with stakeholders.
What made this unique
Working on the cutting edge of new generative AI use cases and learning how to calculate business value.
THE PEOPLE (the best part)
CIO Watsonx Analytics Team (alphabetical)
Aline Mian Soares – Business Transformation
Ana Gerin – Automation Senior Leader
Annette Tassone – UXR Creative Director
Bruno Bortulotto – Product Manager
Chloe Wang – Data Architect
Gabby Hoefer – UX Researcher
Jay LaPlante – Product Owner Lead
Jessica Bolding – Program Manager, CAO
Laura Rodriguez – Product Strategist
Mika Jugovich – Senior Data Scientist
Randal Ries – Analytics Manager
Stefa Etchegaray Garcia – Head of Data Science, CAO
Steven Fiscaletti – Chief Business Architect
Tyler Waite – Advisory Data Scientist
THE PRODUCT
Internal hub for genAI use cases
Like leadership at many companies, IBM's leadership is encouraging IBMers to bring forward ideas for new generative-AI-infused products.

Our product is a review system to decide which ideas IBM organizations will invest in.

We have several reviewers, but lately they’ve had an overwhelming number of new submissions and are falling behind.
PROBLEM EVALUATION
Main design challenge
How might we help IBMers submit high quality proposals for new generative AI use cases?
Applicants

IBMers have great ideas but often submit incomplete or inaccurate proposals.

Since there are so many different types of project ideas, the submission process is not one-size-fits-all and is confusing to applicants.

Applicants need clarified submission expectations so that they can submit high quality proposals.
Reviewers

Reviewers must go back and forth with teams, increasing overall approval time.

Incomplete proposals slow the review process, leaving reviewers far behind on approvals.

Reviewers need to spend less time going back and forth with teams so that they can approve projects faster.
USER INTERVIEWS
Interviewing users to discover their process for calculating projected business value.
Applicants and reviewers alike identify business value projections as the largest obstacle in the submission process.
Purpose
Improve business value calculations in order to improve quality of overall submissions.

Approach
Conducted user interviews in November 2023 with applicants whose submissions were in progress.
  • 30-minute virtual sessions
  • Walked through each applicant's submission calculation
  • Probed for expectations and challenges
FINDINGS & OUTCOMES
Implement two project tracks based on likelihood of support needed.
  • Users submit one form but are grouped in the background into two different tracks.

  • Tracks are based on: team maturity, proposed product maturity, and submitter experience.
Create a reusable, standard template for calculating business value metrics.
  • Teams need reusable, standard templates for calculating business value.

  • I proposed some design recommendations for the template, which are confidential.
Help submitters know which context to provide without oversaturating the reviewer.
  • The review team needs some context surrounding metrics so that they understand where the numbers came from.

  • Provide clear indications of which details are needed and which the submitter can leave out.