I have been on the team for almost 3 years as the sole/lead UX/UI designer. I worked with the product manager and engineering partners to create a human-in-the-loop design system for machine learning in robotics, automating repetitive tasks and increasing work efficiency.
My Role
Lead designer for a "human-in-the-loop" product that lets users perform QA inspections and create customized inspection routines.
Research, prototyping, user testing, wireframing, information architecture, UI design.
Results
Shipped the web app with 2 main functionalities:
1. QA inspection
2. Customizable routine creation with robot control
Increased customers' work efficiency.
Reduced labor costs.

What Is It
Live Inspection is built for quality-control environments in manufacturing.
It allows QA inspectors to semi-automate their work: robots with high-res cameras scan items following preset routines, and pre-trained machine learning models give pass/fail judgments. Inspectors supervise the robot's judgments and provide feedback based on their experience.
User Types
Inspectors are the main users who interact with the live inspection UI.
Managers will be trained to create customized inspection routines with the robot.


Inspector
The inspector experience is geared around using the live inspection UI to work with the robot toward their inspection goals, giving live feedback on the robot's judgments.


Manager
The manager experience is geared around using the routine creation UI to create customized inspection routines with the robot.
Managers also supervise multiple inspection stations from a distance to make sure inspections are going smoothly.
An inspector's standard workflow in the industry, before our product:

1. Open a box of parts that need to be inspected and unwrap each part from its plastic bag.

2. Use an inspection magnifier to find all defects on each part.

3. Mark defects with a red sticker using a pair of tweezers.

4. Pack each defective part back into its plastic bag and write the defect names on the bag.

5. Place all defective parts into a box and mark the box.

6. Place all good parts back into the original box, and pass it down.



1. Live Inspection
The first big feature we launched lets our customers use our solution to inspect their products. Internally, we take on a good amount of work on the ML engineering side to set up routines, collect data, and train the models for our customers.
Lift users from repetitive work and tiring postures.
Inspectors unwrap the product and place it on the robot's plate, then use the UI mounted on top of the robot to run the inspection. While the robot inspects the product, users are lifted out of repetitive work and tiring postures; they only need to make sure the inspection runs smoothly and that items are placed properly in the fixture.

Create trust between humans and robots.
Problem
Users wanted to understand what the robot was doing during the inspection; it was easy to lose track of the inspection's progress.
Solution
By offering a routine overview panel, we give users a live update on the robot's progress through the end of each inspection. As each position is scanned, bounding boxes with pass/fail results appear on the corresponding areas.
Most importantly, this creates a sense of trust between humans and machines.
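To make this concrete, here is a minimal sketch of the state that could drive such an overview panel, written in TypeScript. Every name here (AoiResult, PositionStatus, InspectionProgress) is an illustrative assumption, not the product's actual code.

```typescript
// Illustrative sketch: the state behind a routine overview panel.
// All type and field names are assumptions, not the shipped API.

type Judgment = "pass" | "fail" | "unknown";

interface AoiResult {
  aoiId: string;
  // Bounding box in image coordinates, drawn when its position is scanned.
  box: { x: number; y: number; width: number; height: number };
  judgment: Judgment;
}

interface PositionStatus {
  positionId: string;
  state: "pending" | "scanning" | "done";
  results: AoiResult[]; // filled in as the ML model returns judgments
}

interface InspectionProgress {
  routineId: string;
  positions: PositionStatus[];
}

// The progress indicator is derived from the same state that drives the
// bounding-box overlay, so the two can never disagree.
function percentComplete(progress: InspectionProgress): number {
  const done = progress.positions.filter((p) => p.state === "done").length;
  return progress.positions.length === 0
    ? 0
    : Math.round((100 * done) / progress.positions.length);
}
```

Deriving both the progress indicator and the overlay from one source of truth is part of what builds that trust: the panel never shows a finished run while results are still missing.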

Increase efficiency
One pain point we discovered during onsite customer interviews was that the new system wasn't efficient enough to match their current workload. We designed the UI to let users review images while the robot is still inspecting: inspectors simply tap the "Review" page to start reviewing inspection results, and can toggle back to the "Inspection" page anytime to keep track of the inspection.
After this design update, we saw a 40% reduction in time spent on each inspection compared to the previous prototype.

Two elements need navigation on this review page:
1st areas of interest (AOIs), 2nd positions.
One position might contain multiple areas of interest that need to be reviewed.
Since moving to the next defect is the repetitive action in this step, we placed the AOI navigation in the bottom-right corner of the screen. It also automatically advances to the next position once the previous position is fully reviewed (see the sketch below). This makes for faster hand movement on each inspection and is more comfortable ergonomically for inspectors.
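Assuming the review queue is a list of positions, each holding an ordered list of AOIs, the auto-advance behavior can be sketched like this (types and names are illustrative, not the shipped code):

```typescript
// Illustrative sketch: one "Next" control that advances through AOIs and
// rolls over to the next position when the current one is fully reviewed.

interface ReviewCursor {
  positionIndex: number;
  aoiIndex: number;
}

interface ReviewQueue {
  positions: { aoiCount: number }[];
}

function next(queue: ReviewQueue, cursor: ReviewCursor): ReviewCursor | null {
  const position = queue.positions[cursor.positionIndex];
  if (cursor.aoiIndex + 1 < position.aoiCount) {
    // More AOIs to review in the current position.
    return { ...cursor, aoiIndex: cursor.aoiIndex + 1 };
  }
  if (cursor.positionIndex + 1 < queue.positions.length) {
    // Current position fully reviewed: auto-advance to the next position.
    return { positionIndex: cursor.positionIndex + 1, aoiIndex: 0 };
  }
  return null; // end of the review queue
}
```

Because one control drives both levels of navigation, the inspector's hand can stay on a single spot of the screen for the whole review.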

Close the supervised learning loop for machine learning model training.
With machine learning embedded in the system, the robot makes the first pass/fail judgment for each position. To increase ML accuracy and improve future performance, we designed a step for the user's supervising input: inspectors can correct the robot's judgment based on their inspection experience (a sketch of the resulting feedback record follows).
This UX step also increased business value for the entire product. The human-in-the-loop selling point is more appealing to our customers because we do not eliminate human labor; we increase efficiency and provide a better working experience, which aligns with our customers' goals.
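For illustration, an inspector's correction could be captured as a small record like the one below and fed back into model training. The field names are assumptions for this sketch, not our actual schema.

```typescript
// Illustrative sketch: the feedback record produced when an inspector
// reviews the robot's judgment. Field names are assumptions.

type Judgment = "pass" | "fail" | "unknown";

interface JudgmentFeedback {
  inspectionId: string;
  positionId: string;
  aoiId: string;
  modelJudgment: Judgment; // what the ML model predicted
  humanJudgment: Judgment; // what the inspector decided
  reviewedAt: string;      // ISO-8601 timestamp
}

// Disagreements become new labeled training data; agreements can be
// logged as confirmations of the model's output.
function isCorrection(feedback: JudgmentFeedback): boolean {
  return feedback.modelJudgment !== feedback.humanJudgment;
}
```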

Problem
Customers balance accuracy and efficiency differently:
some only want to review failed images, while others review all images.
Solution
We provided a filter option to choose between "All" and "Need Review" (failed results) on the Areas of Interest and Positions panels.
When "All" is selected, we place "Failed" and "Unknown" positions at the beginning of the queue to match the user's priority, as sketched below.

2. Routine Creation
As customer needs and the company scaled, we started to offer our users the ability to create their own inspection routines with our interface product.

Easily toggle between light and dark modes in the routine creation UI.

Problem
Users are afraid of breaking the machine; making robot control more user-friendly is the key.
Challenge
This hardware model has 6 degrees of freedom, and describing robot movement is a fairly complex concept in the mechanical design world.
Solution
We worked closely with the hardware team to translate a complex concept from mechanical design into a camera coordinate system, where all robot movement is relative to the camera's perspective on the end effector.
The user only needs to care about which way they want the camera to look.
We also added an info tooltip explaining the pitch and yaw concepts. A sketch of the camera-frame idea follows.
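The underlying idea can be sketched as a change of coordinate frames: a jog the user expresses in the camera frame ("look this way", "move along the view axis") is rotated into the robot's base frame using the camera's current pose. The code below is an illustrative sketch under those assumptions, not our control API.

```typescript
// Illustrative sketch: rotating a camera-frame translation into the robot
// base frame. Conventions here (x = right, y = down, z = forward along the
// view axis) are assumptions for the example.

type Vec3 = [number, number, number];
type Mat3 = [Vec3, Vec3, Vec3]; // rotation of the camera frame in the base frame

function cameraToBase(rBaseCam: Mat3, deltaCam: Vec3): Vec3 {
  const out: Vec3 = [0, 0, 0];
  for (let i = 0; i < 3; i++) {
    for (let j = 0; j < 3; j++) {
      out[i] += rBaseCam[i][j] * deltaCam[j]; // out = R * delta
    }
  }
  return out;
}

// A "move forward" jog is just a step along the camera's z axis,
// regardless of how the 6-DOF arm happens to be posed.
const forwardStep = cameraToBase(
  [[1, 0, 0], [0, 1, 0], [0, 0, 1]], // identity pose for the example
  [0, 0, 0.01] // 1 cm along the view axis
);
console.log(forwardStep); // [0, 0, 0.01] in the base frame
```

With this framing, pitch and yaw reduce to "tilt the view up/down" and "turn the view left/right," which is what the tooltip explains.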
Trade-off
This is not the perfect UX; I also pitched the idea of dragging the live camera feed to drive some robot movements. But considering the size of the team and the project timeline, the team went with a faster solution for the MVP version of the feature.

Routine Creation UI
Live Inspection UI
By having similar panels and the same placement of the main action button across the app's 2 main features, we were able to reduce the learning curve for inspectors. It also helps them get over their fear of breaking the robot faster.
This design decision also helped our development team build the routine creation page faster.
This was the beginning of our own design system/guidelines; a sketch of the shared-layout idea follows.
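As an illustration of the shared-layout idea, assuming a React codebase (the component and prop names here are hypothetical, not our shipped design system):

```tsx
// Illustrative sketch: one layout shell shared by Live Inspection and
// Routine Creation, so panels and the main action button never move.

import React from "react";

interface StationLayoutProps {
  sidePanel: React.ReactNode;  // routine overview or routine editor
  cameraView: React.ReactNode; // live camera feed
  mainActionLabel: string;     // e.g. "Start Inspection" or "Save Routine"
  onMainAction: () => void;
}

export function StationLayout(props: StationLayoutProps) {
  return (
    <div className="station-layout">
      <aside className="side-panel">{props.sidePanel}</aside>
      <main className="camera-view">{props.cameraView}</main>
      <button className="main-action" onClick={props.onMainAction}>
        {props.mainActionLabel}
      </button>
    </div>
  );
}
```

Reusing one shell is also why the routine creation page was faster to build: only the panel contents changed.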
And we launched! Check out Elementary Robotics for more info about the product.

3. Learning
01. Roadmap
When a feature can't be built for the MVP but would achieve user goals, make a roadmap with the PM so the feature can be fully built later.
02. Flexibility and adaptability
Some of our customers already had established manual inspection workflows. It was important to think about how the new product could be adopted naturally, especially with inspection goals assigned to individuals.
03. Building trust is key
After external research and customer interviews were conducted, we found that a good number of robotic machines were purchased but left idle after the first few attempts at running them. The reasons: users were afraid of breaking them, or they didn't have anyone technical to program the machine the way the production line needed. Making trust one of our main design principles was key to user success metrics.
Thanks for viewing.😀
If you are interested in how we created a no-code machine learning training platform that lets users configure inspection routines, train the ML model, and deploy to inspection stations, please check out this project ↓
