
Create a "No code" machine learning platform for QA inspection.

My Role

Lead UX/UI designer, creating the flow of the no-code machine learning platform in collaboration with the ML team, CTO, and CEO.

Research, prototyping, user testing, wireframing, information architecture, UI design.

Result

Shipped the web app with three main functions:

1. Configure the ML model.

2. Label data.

3. Deploy the model.

Handed users the tools to create and train their own machine learning models.

Provided a no-code experience for a broader audience.


The Pipeline

This no-code machine learning flow gives users the tools to create customized, trained inspection routines that work in quality-control manufacturing environments.

It allows users to create customized routines with the robot hardware and assign tests to areas of interest.

The robot gives a Pass/Fail/Unknown judgment for each item being inspected. Inspectors supervise the robot's judgments and provide feedback based on their experience.
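To make the pipeline concrete, here is a minimal sketch of how the routine data might be modeled, assuming a simple hierarchy of routines, positions, and areas of interest. All names here (Routine, Position, AreaOfInterest, Judgment) are illustrative, not the product's actual schema:

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional

class Judgment(Enum):
    PASS = "pass"
    FAIL = "fail"
    UNKNOWN = "unknown"

@dataclass
class AreaOfInterest:
    name: str
    robot_judgment: Judgment = Judgment.UNKNOWN    # the model's initial call
    inspector_judgment: Optional[Judgment] = None  # filled in during review

@dataclass
class Position:
    name: str
    aois: List[AreaOfInterest] = field(default_factory=list)

@dataclass
class Routine:
    name: str
    positions: List[Position] = field(default_factory=list)
```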


Inspector

The inspector experience centers on using the robot to reach inspection goals while giving live feedback on the robot's judgments.

User Types

Inspectors are the main type of user who interacts with the live inspection UI. 

Managers supervise multiple inspection stations from a distance to make sure everything runs smoothly.

The Inspector's Standard Workflow in the Industry, Before Our Solution

1. Open a box of parts that need to be inspected and unwrap each part from its plastic bag.

2. Use an inspection magnifier to find all defects on each part.

3. Mark defects with a red sticker using a pair of tweezers.

4. Pack each defective part back into its plastic bag and mark the defect names on the bag.

5. Place all defective parts into a box and mark the box.

6. Place all good parts back in the original box and pass it down the line.


1. Live Inspection

The first big feature we launched for our customers was live inspection: using our solution to inspect their products. Internally, the ML engineering team took on a good amount of the work, setting up routines, collecting data, and training the model for each customer.

Relieve users of repetitive work and tiring postures.

Inspectors unwrap the product and place it on the robot's plate, then use the UI on top of the robot to run the inspection. While the robot inspects the product, users are relieved of repetitive work and tiring postures; they only need to make sure the inspection runs smoothly and that items are placed properly in the fixture.


Create trust between humans and robots.

Problem

Users wanted to understand what the robot was doing during the inspection; it was easy to lose track of the inspection's progress.

Solution

By offering a routine overview panel, we give users a live update on the robot's progress through each inspection. Bounding boxes with pass/fail results appear as the corresponding positions are scanned.

Most importantly, this creates a sense of trust between humans and machines.
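For readers curious how the live overlay might work under the hood, here is a minimal sketch, assuming the robot emits an event each time a position finishes scanning; the event and overlay structures are hypothetical, not the shipped code:

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ScanEvent:
    position_name: str
    bbox: Tuple[int, int, int, int]  # x, y, width, height in the live view
    result: str                      # "pass", "fail", or "unknown"

def on_position_scanned(event: ScanEvent, overlay: List[dict]) -> None:
    """Draw a colored bounding box for the scanned position so the
    routine overview panel reflects live progress."""
    overlay.append({
        "position": event.position_name,
        "bbox": event.bbox,
        "color": {"pass": "green", "fail": "red"}.get(event.result, "yellow"),
    })
```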


Increase efficiency

One pain point we discovered during on-site customer interviews was that the new system wasn't efficient enough to match their current workload. We designed the UI to let users review images while the robot runs the inspection: inspectors simply tap the "Review" page to start reviewing inspection results, and can toggle back to the "Inspection" page at any time to keep track of the inspection.

After this design update, we measured a 40% reduction in time spent on each inspection compared to the previous prototype.


Two elements need navigation on this review page: areas of interest (AOIs) and positions. One position might contain multiple AOIs that need to be reviewed.

Since navigating to the next defect is a repetitive action at this step, we placed the AOI navigation in the bottom-right corner of the screen. It also advances automatically to the next position once the previous position has been fully reviewed. This makes the hand movement for each inspection faster and is more comfortable ergonomically for inspectors.
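A rough sketch of that auto-advance logic, assuming each position carries a list of AOIs to review; the function name and data shape are illustrative:

```python
from typing import List, Optional, Tuple

def next_review_target(aois_per_position: List[List[str]],
                       pos_idx: int,
                       aoi_idx: int) -> Optional[Tuple[int, int]]:
    """One tap on 'Next': step to the next AOI in the current position,
    or auto-advance to the first AOI of the next position once the
    current position is fully reviewed. Returns None when done."""
    if aoi_idx + 1 < len(aois_per_position[pos_idx]):
        return pos_idx, aoi_idx + 1        # next AOI, same position
    if pos_idx + 1 < len(aois_per_position):
        return pos_idx + 1, 0              # auto-advance to next position
    return None                            # nothing left to review
```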


Closing the supervised learning loop for machine learning model training.

With machine learning embedded in the system, the robot makes the initial pass/fail judgment for each position. To increase the model's accuracy over time, we designed a step for the user's supervising input: inspectors can correct the robot's judgments based on their inspection experience.
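Conceptually, each correction becomes a labeled example for the next training run. A minimal sketch, with hypothetical names:

```python
from typing import List

def record_feedback(aoi_id: str, robot_judgment: str,
                    inspector_judgment: str,
                    training_examples: List[dict]) -> None:
    """Keep the inspector's (possibly corrected) label as ground truth
    so future training runs can learn from it, closing the loop."""
    training_examples.append({
        "aoi": aoi_id,
        "model_said": robot_judgment,
        "label": inspector_judgment,
        "was_corrected": robot_judgment != inspector_judgment,
    })
```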

This UX step also increased the business value of the entire product. The human-in-the-loop selling point is more appealing to our customers because we don't eliminate human labor; we increase inspectors' efficiency and provide a better working experience, which aligns with our customers' goals.


Problem

Customers balance accuracy and efficiency differently: some only want to review failed images, while others review all images.

Solution

We provided a filter on the Areas of Interest and Positions panels to choose between "All" and "Need Review" (failed results).

When "All" is selected, "Failed" and "Unknown" positions are placed at the front of the queue to match the user's priorities.


2. Routine Creation

As customer needs and the company scaled, we started offering users the ability to create their own inspection routines with our interface.


Easily toggle between light and dark themes in the routine creation UI.


Problem

Users are afraid of breaking the machine; making robot control more user-friendly is the key.

Challenge

This hardware model has six degrees of freedom. Describing robot movement is a fairly complex concept in the mechanical design world.

Solution

We worked closely with the hardware team to translate a complex mechanical design concept into a camera coordinate system, in which all robot movement is relative to the perspective of the camera on the end effector.

The user only needs to care about which way they want the camera to look.

We also added an info tooltip explaining the pitch and yaw concepts.
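The core of the camera-coordinate idea is a single change of frame: a jog expressed relative to the camera is rotated into the robot's base frame before being sent to the arm. A minimal sketch; the pose value and the move_linear call are hypothetical stand-ins for the real robot API:

```python
import numpy as np

def jog_in_camera_frame(camera_to_base: np.ndarray,
                        jog_camera: np.ndarray) -> np.ndarray:
    """Convert a motion expressed in the camera frame (e.g. 'move the
    way the camera is looking') into the robot base frame.

    camera_to_base: 3x3 rotation matrix, camera frame -> base frame.
    jog_camera:     3-vector, desired translation in the camera frame.
    """
    return camera_to_base @ jog_camera

# Example: nudge the camera 10 mm along its own viewing axis (+z),
# regardless of how the end effector is currently oriented.
R = np.eye(3)                                   # placeholder camera pose
delta_base = jog_in_camera_frame(R, np.array([0.0, 0.0, 0.010]))
# robot.move_linear(delta_base)                 # hypothetical robot call
```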

Trade-off

This is not the perfect UX; I also pitched the idea of dragging the live camera feed to drive some robot movements. But given the size of the team and the project timeline, we went with the faster solution for the MVP version of the feature.


Routine Creation UI vs. Live Inspection UI (side-by-side comparison)

By using similar panels and the same placement of the main action button across the app's two main features, we reduced the learning curve for inspectors. It also helped them get over their fear of breaking the robot faster.

This design decision also helped our development team build the routine creation page faster.

This was the beginning of our own design system and guidelines.


3. Learning

1. 

2. 

3. 

Thanks for viewing.😀

If you are interested in how we let users configure routines, train the ML model, and deploy to inspection stations, please check out this project ↓


