
As the sole (and lead) UX/UI designer at Elementary Robotics, I worked with the CEO to identify the market for industrial robot usage. After researching the products in the QA market, we found that most were expensive, unintuitive, and hard to customize.

We then developed a human-centric machine learning robot that allows users to automate repetitive tasks and increase work efficiency.

Game Plan


My Role

Lead designer for a human-centric AI product that allows users to streamline the Quality Assurance (QA) process.


Research, prototyping, user testing, wireframing, information architecture, UI design.


Shipped the web app with two main functionalities:
1. QA inspection

2. Control robots to create inspection routines

Increased customers' work efficiency.

Reduced labor costs.


What Is It

A tool that allows Quality Assurance (QA) inspectors to semi-automate their work by controlling robots with high-resolution cameras that scan items with preset routines and use pre-trained machine learning models to give Pass/Fail judgments. Inspectors remain an integral part of the process by providing feedback on every Pass/Fail judgment to improve the machine learning models over time.

User Types

Inspectors are the main type of user who interacts with the live inspection UI. 

Managers will be trained to create customized inspection routines with the robot.



The Inspector experience is geared around using the live inspection UI to achieve inspection goals and to provide live feedback on the robot's judgments.  



The manager experience centers around using the routine creation UI to generate customized inspection routines with the robot.

Managers can also supervise multiple inspection stations remotely to ensure smooth operation.

Example of a Traditional Inspection Workflow

After a few onsite customer interviews, here is a simplified journey map of their current process.



Unpack parts to be inspected from the external supplier.



Perform visual inspections to locate defects on parts.



Mark defects with red stickers.



Group, package, and label all defective parts that share the same defect.



Store defective parts in a separate location.



Place all good parts back in the original box and pass it down the line.

Pain Points Highlight


1. Live Inspection

The first major feature we launched for our customers was the ability to use our solution to inspect their products. To generate the machine learning models, our internal ML engineering team set up routines and collected data for our customers. The flow below is designed for inspectors and solves pain points 1, 2, and 3 mentioned above.

User Flow

Lift users from repetitive work and tiring postures.


Manual inspection requires inspectors to complete detailed repetitive tasks to locate defects under a magnifying lens.


Our product only requires inspectors to place parts being inspected in fixtures and to validate the results of the machine learning models on a screen. The robot completes the pre-programmed inspection routine to record images of areas with potential defects and the machine learning models identify the defects in those areas. 





Inspectors wanted to understand how the robot was making decisions during inspections. 


We included a routine overview panel that gives users a live update on the progress of the robot for every inspection. Bounding boxes with pass/fail results around areas of interest appear automatically when a certain position is being scanned. 


Increase efficiency


After an initial deployment of our product, we realized through onsite interviews that our software's workflow was not efficient enough to keep pace with customers' existing workloads.


We redesigned the UI to allow users to review images concurrently while the robot completed the inspection routine. With the new UI, inspectors could tap on the "Review" page to review the inspection results. They could also toggle back to the "Inspection Page" anytime if they wanted to keep track of the inspection.

By updating this design, we recorded a 25-30% reduction in time spent on each inspection compared to the previous prototype.


There are two elements that users can navigate on this “Review” page:

- Areas of interest (AOI)
- Inspection positions

One inspection position might contain multiple AOIs that need to be reviewed.

Since the inspector interfaces with a physical touch screen to navigate our product, the AOI navigator buttons are ergonomically placed in the bottom right corner of the screen to allow the inspector to quickly review AOIs. For inspection positions, the review screen automatically advances to the next position once the previous position has been fully reviewed. Together, these features let the inspector review ML model decisions more efficiently.
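The auto-advance behavior described above can be sketched as a small review queue. This is a hypothetical illustration, not production code; the `Position` and `ReviewQueue` names are invented for this example.

```python
# Hypothetical sketch of the review-queue behavior: each inspection
# position holds several AOIs; once every AOI at the current position is
# reviewed, the view advances to the next position automatically.
from dataclasses import dataclass, field


@dataclass
class Position:
    name: str
    aois: list                                  # AOI labels, e.g. ["Pass", "Fail"]
    reviewed: set = field(default_factory=set)  # indices of reviewed AOIs


class ReviewQueue:
    def __init__(self, positions):
        self.positions = positions
        self.current = 0  # index of the position shown on screen

    def review_aoi(self, aoi_index):
        """Mark one AOI reviewed; auto-advance when the position is done."""
        pos = self.positions[self.current]
        pos.reviewed.add(aoi_index)
        done = len(pos.reviewed) == len(pos.aois)
        if done and self.current < len(self.positions) - 1:
            self.current += 1


queue = ReviewQueue([Position("P1", ["Pass", "Fail"]), Position("P2", ["Pass"])])
queue.review_aoi(0)
queue.review_aoi(1)  # P1 fully reviewed -> view advances to P2
```

The key design point is that the inspector only ever taps "next AOI"; position-level navigation happens implicitly.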


Supervised machine learning model training

During inspection routines, our machine learning model takes a first pass at making Pass/Fail judgments for every AOI in each inspection position. To improve the ML model's accuracy over time, we included a step for the user to provide feedback on the model's results. This allows inspectors to use their experience to correct or validate the ML model's decisions.

This UX feature also increases business value for our product. The human-in-the-loop aspect of our product is important since our goal is to augment the inspectors’ capabilities to improve their efficiency for our customers.
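The feedback step can be sketched as a simple logging loop: every model judgment is paired with the inspector's verdict, producing labeled examples for retraining. The function and field names below are hypothetical, chosen only to illustrate the idea.

```python
# Hypothetical sketch of the human-in-the-loop feedback log: each entry
# pairs the model's prediction with the inspector's verdict, which serves
# as the ground-truth label for later retraining.
def record_feedback(log, aoi_id, model_verdict, inspector_verdict):
    """Store the inspector's decision alongside the model's prediction."""
    log.append({
        "aoi": aoi_id,
        "model": model_verdict,
        "label": inspector_verdict,                    # ground truth for retraining
        "corrected": model_verdict != inspector_verdict,
    })


training_log = []
record_feedback(training_log, "aoi-17", "Fail", "Pass")  # inspector overrides
record_feedback(training_log, "aoi-18", "Pass", "Pass")  # inspector confirms
```

Entries flagged `corrected` are the most valuable for retraining, since they capture exactly where the model and the inspector disagree.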



Customers have different preferences for accuracy vs. efficiency. 

Some customers will only want to review failed image results, some will want to review all images.


We provided a filter option to choose between "All" and "Needs Review" (failed results) on the “Areas of Interest” and “Inspection Positions” panels.

When "All" is selected, inspection positions with AOI labeled "Failed" and "Unknown" are placed at the beginning of the queue.
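The filter and ordering rule above can be sketched as follows. This is a hypothetical illustration of the queueing logic; the function names and data shapes are invented for this example.

```python
# Hypothetical sketch of the "All" vs "Needs Review" filter: under "All",
# positions containing any AOI labeled "Failed" or "Unknown" are placed
# at the front of the queue; under "Needs Review", only those positions
# are shown.
NEEDS_REVIEW = {"Failed", "Unknown"}


def needs_review(position):
    return any(label in NEEDS_REVIEW for label in position["aois"])


def build_queue(positions, mode):
    if mode == "Needs Review":
        return [p for p in positions if needs_review(p)]
    # "All": flagged positions sort first; Python's stable sort keeps
    # the original order within each group.
    return sorted(positions, key=lambda p: not needs_review(p))


positions = [
    {"name": "P1", "aois": ["Passed", "Passed"]},
    {"name": "P2", "aois": ["Failed", "Passed"]},
    {"name": "P3", "aois": ["Unknown"]},
]
queue = build_queue(positions, "All")  # P2 and P3 come first, then P1
```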


2. Routine Creation

As customer needs and the company scaled, we began offering customers the ability to create their own inspection routines through our product's interface.
Managers or inspection leads are the target audience for this feature.


Users can easily toggle between light and dark modes in the routine creation UI.



Users were afraid of breaking the robot, so an intuitive interface for controlling it was needed.


This specific robot has 5 degrees of freedom. It can be difficult to describe robot movement relative to the user.


We worked closely with the hardware team to devise a system in which the movement controls command the robot relative to the perspective of the camera on the end effector.

This way the user only needs to care about which way they want the camera to look. 

We also included an information tooltip for the pitch and yaw rotation concepts.
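The camera-relative control idea can be sketched as a frame transformation: a jog command expressed in the camera's frame is rotated into the robot's base frame before being sent. This is a simplified, hypothetical illustration (yaw-only, invented function names); a real implementation would use the full end-effector pose.

```python
# Hypothetical sketch: a jog command given in the camera's frame
# ("move right in the camera view") is rotated into the robot base frame,
# so users never have to reason about joint axes directly.
import math


def rot_z(yaw):
    """3x3 rotation matrix about the z-axis."""
    c, s = math.cos(yaw), math.sin(yaw)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]


def camera_to_base(jog_xyz, camera_yaw):
    """Rotate a camera-frame translation into the base frame.

    Yaw-only for brevity; a full solution would use the complete
    end-effector rotation (roll, pitch, yaw).
    """
    R = rot_z(camera_yaw)
    return [sum(R[i][j] * jog_xyz[j] for j in range(3)) for i in range(3)]


# Camera yawed 90 degrees: "+x in camera view" becomes "+y in base frame".
step = camera_to_base([10.0, 0.0, 0.0], math.pi / 2)
```

The design payoff is that the same on-screen arrow always moves the camera view in the same direction, regardless of how the arm is posed.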


This system may not be the perfect UX for our product. An alternative system that I pitched was the option of dragging the live camera feed to control some of the robot's movements. However, the team went with a faster solution for the MVP version of the feature due to the small size of the team and the timeline of the project.

I worked with the PM to create a roadmap for the next version of this feature.


Routine creation UI

Live Inspection UI

By making the routine creation and live inspection UI panels similar to each other, we were able to reduce the learning curve for inspectors. It also helped them more quickly overcome fears of breaking the robot.

This design decision also helped our development team build the routine creation page faster.

This was the beginning of our own design system and guidelines.



We ranked each feature by customer value (requests), business value, and development effort to organize them into releases.
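A minimal sketch of that ranking, assuming a simple additive score (the feature names, scores, and weighting below are purely illustrative, not the actual roadmap data):

```python
# Hypothetical sketch of release planning: score each feature on customer
# value, business value, and development effort, then sort so high-value,
# low-effort features surface first.
features = [
    {"name": "drag-to-move camera feed", "customer": 8, "business": 6, "effort": 8},
    {"name": "pitch/yaw tooltips",       "customer": 5, "business": 4, "effort": 2},
    {"name": "dark mode",                "customer": 4, "business": 3, "effort": 3},
]


def score(feature):
    # Simple illustrative weighting: value earned minus effort spent.
    return feature["customer"] + feature["business"] - feature["effort"]


roadmap = sorted(features, key=score, reverse=True)
```

In practice the weighting would be tuned with the PM, but even a rough score like this makes release trade-offs visible at a glance.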


3. Design System

- We used Ant Design as our initial library for general components.

- I created new components for the camera feed, images, positions, etc. on top of Ant Design for our Elementary Robotics UI.

- We modified colors and styles to match Elementary Robotics' branding.

- We did accessibility testing with the team to finalize the font sizes and colors that were easiest to use in the inspection work environment.


General Typography

And we launched! Check out Elementary Robotics for more info about the product.



01. Roadmap
When a feature couldn't be built for the MVP but would achieve user goals, a roadmap was created with the PM to build it in a later release.

02. Flexibility and adaptability
Some of our customers already had established manual inspection workflows. It was important to think about how to seamlessly integrate our product into our customers' workflows to help them achieve their inspection goals.

03. Building trust is the key

After external research and customer interviews, we found that robots often went unused after the first few runs. Users were afraid of breaking the robots, or they didn't have personnel to program the robots for their production lines. One of our main design principles, and a key success metric for our system, was to design a product that users could trust.

Thanks for viewing.😀

If you are interested in how we created a No-Code machine learning training platform that allows users to configure inspection routines, train ML models, and deploy to inspection stations please check out this project ↓


2018 - 2021



