InSight Assess – Fortune 100

Overhauling Critical Field Assessment & Reporting Tooling


UI/UX RESEARCH – 2021
introduction

InSight Assess

Field assessment applications are mission-critical for large technical service teams — whether inspecting turbomachinery, auditing assets, or gathering condition data on site. But many legacy tools are built for a single use case, lack flexibility, and make real work harder than it needs to be.

I set out to change that.


2018 – 2021      PRODUCT DESIGNER (solo)

overview

What is field assessment?

Field assessment is the process of documenting the condition, performance, and risks of equipment, assets, or environments while physically on site. It often involves capturing measurements, photographs, notes, grades, and observations — all while working in unpredictable conditions like loud facilities, tight spaces, heat, and time pressure.

The right solution can reduce errors, streamline reporting, respect how people actually work on site, and unlock better data for the business.


THE ASK

How might we transform a purpose-built field reporting tool into a flexible, modular, and user-friendly application that aligns with real field work, supports multiple use cases, and scales with future business needs?

problem

The original assessment app was built for a single purpose, with a narrow UI scope and no long-term vision. Workflows were rigid, usability was poor, and key interactions hid behind non-obvious UI elements like buried icons and hidden tabs. Field personnel couldn’t reorder tasks to match how they worked, and the architecture lacked modular flexibility.

HYPOTHESIS

If we rethought workflows around the real sequence of field tasks — letting users reorder lists, surfacing status clearly, and simplifying interactions — users would complete assessments faster with fewer errors.

My hunch was that the biggest unlock wouldn’t come from adding features, but from reframing how people actually perform their work on site — giving them control over task order.


Research 

Working within this large enterprise environment gave me the opportunity to collaborate closely with field engineers, attend their training sessions, and, most importantly, visit training facilities that mimicked real-world operations.

  • Observed how engineers and technicians actually record assessment data on site — the forms they fill, the photos they take, and how context influences sequencing.

  • Gathered direct input from field personnel about what slowed them down — confusing menus, hidden controls, and mismatches between workflow and screen order.

  • Catalogued all existing UI elements and interactions to identify overlap, redundancy, and patterns that no longer served real tasks.

  • Ran quick validation loops to see which interface patterns improved findability, task completion speed, and satisfaction.

Insights

Understanding that turbomachinery installations differ from facility to facility was a clear signal that I needed to design the task-completion workflow with the utmost flexibility.

“I have to adapt my workflow to fit the app, not the other way around. That causes a lot of time spent circling around the machinery, because every turbine installation is actually different.”

“Photos, grades, and notes should be grouped intuitively — right now it feels scattered.”

“Some buttons look like text — it took me weeks to understand where to tap.”

“I wear heavy gloves when working inside the package (turbine housing). I need the buttons to be bigger.”

“When I’m on site, I just want to get through the checklist in the most logical order.”


Principles

Match the work, not the system — workflows should follow how people actually perform assessments, not arbitrary orderings.

Make value visible — status and completion indicators must be obvious at a glance.

Empower the user — users should be able to reorder and control their task sequence.

Simplify interaction — reduce cognitive overhead by replacing hidden controls with clear, ergonomic elements.


Task-Centered Flow

Replaced rigid workflows with dynamic task lists that users could reorder to match real job patterns. This let field personnel assess in the sequence that made sense on site, not in a predetermined order.
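
To make the idea concrete: the core of this pattern is a small data model in which task order is user-controlled state rather than a fixed property of the schema. A minimal TypeScript sketch (all identifiers are illustrative, not the product's actual code):

    // Illustrative sketch: each task owns its status; display order is user state.
    type TaskStatus = "not-started" | "in-progress" | "complete";

    interface AssessmentTask {
      id: string;
      title: string;
      status: TaskStatus;
    }

    // Move a task from one position to another; the task data itself never changes.
    function reorderTasks(tasks: AssessmentTask[], from: number, to: number): AssessmentTask[] {
      const next = [...tasks];
      const [moved] = next.splice(from, 1);
      next.splice(to, 0, moved);
      return next;
    }

Because reordering only touches presentation state, the captured assessment data stays intact no matter how a user arranges their checklist.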

Tabular Views for Scalability

Shifted from oversized “card” layouts to table layouts with full column control, enabling sorting, reordering, and scaling to large datasets. This gave users control and better findability without sacrificing context.
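
Conceptually, the table treats columns as user preferences rather than a fixed layout. A hedged sketch of that model (names are hypothetical):

    // Hypothetical column model: order, visibility, and sort all belong to the user.
    interface ColumnConfig<Row> {
      key: keyof Row;
      label: string;
      visible: boolean;
    }

    interface TableState<Row> {
      columns: ColumnConfig<Row>[]; // user-controlled order and visibility
      sortBy?: { key: keyof Row; ascending: boolean };
    }

    // Sort rows according to the user's current sort preference.
    function sortRows<Row>(rows: Row[], state: TableState<Row>): Row[] {
      if (!state.sortBy) return rows;
      const { key, ascending } = state.sortBy;
      return [...rows].sort((a, b) => {
        const cmp = String(a[key]).localeCompare(String(b[key]));
        return ascending ? cmp : -cmp;
      });
    }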

Clear Interaction Patterns

Replaced hidden tabs and disguised buttons with clear, labeled controls placed ergonomically within thumb reach. Status indicators showed completion at a glance and reduced guesswork.
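
The at-a-glance status is derived state, computed from the tasks themselves rather than tracked separately. A tiny illustrative sketch (types are hypothetical):

    // Illustrative: derive the list-level summary directly from task statuses.
    type Status = "not-started" | "in-progress" | "complete";

    function completionSummary(statuses: Status[]): string {
      const done = statuses.filter((s) => s === "complete").length;
      return `${done} / ${statuses.length} complete`;
    }

    completionSummary(["complete", "in-progress", "complete"]); // "2 / 3 complete"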


OUTCOMES

Field personnel reported improved usability, reduced context switching, and better alignment between how they worked and how the tool supported them. Tasks that used to feel scattered became predictable and efficient.

Re-architecting to be modular and use-case agnostic gave the product flexibility to support new reporting scenarios without major rework — enhancing long-term viability.
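
One way to picture “modular and use-case agnostic”: an assessment becomes a composition of generic field modules rather than a hard-coded form. A speculative TypeScript sketch of that shape (not the actual data model):

    // Speculative: every assessment composes the same generic building blocks.
    type FieldModule =
      | { kind: "photo"; label: string }
      | { kind: "grade"; label: string; scale: number }
      | { kind: "note"; label: string };

    interface AssessmentTemplate {
      name: string;
      modules: FieldModule[];
    }

    // A new reporting scenario is just a new template; no application rework required.
    const turbineInspection: AssessmentTemplate = {
      name: "Turbine package inspection",
      modules: [
        { kind: "photo", label: "Inlet condition" },
        { kind: "grade", label: "Blade wear", scale: 5 },
        { kind: "note", label: "Observations" },
      ],
    };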

For the business, this translated into fewer support tickets, faster onboarding for new users, and stronger adoption because the tool finally respected how real work unfolds in the field.

By aligning the solution with real field patterns and giving users control over flow and status, Assess became more than a reporting app — it became a reliable extension of the user’s workflow.