Research and Design for Reference Data

While at Avanade, I led a User Experience workstream at JPMorgan for a suite of internal enterprise products. By upholding human-centered design practices, I advanced the user experience for roughly 14,000 users across 48 countries in North America, EMEA, and APAC. Through empathy-driven solutions and effective collaboration across lines of business, the workstream grew into a team of four senior and three mid-level consultants.

Opportunity

The opportunity arose to work on a deeply entrenched legacy application within the lengthy client onboarding process. The portal received upwards of 45,000 requests annually to apply regulatory due diligence. Many user experience issues stemmed from the deprecated Flex technology; migrating to HTML5 let us address those issues, and there was no longer a reason to ignore best practices. Identifying the cumbersome workarounds and redesigning the experience was imperative to satisfying both users and stakeholders.

User Groups

The bank was a complex network of departments. I worked with 13 user groups within the digital ecosystem who used these applications daily. Upwards of 125,000 cases would undergo rigorous data entry and administrative processing, and records circulated from the front office to the back office. For managers and analysts alike, these digital tools were critical to the day-to-day, and performance was always at the forefront.

Figure 1. The manager archetype

Research

To gain alignment across the lines of business and ensure the satisfaction of my users, I led focus groups to gather insights, ideate solutions, and vet designs for the KYC (Know Your Customer) application. I ran 36 user sessions with a team of two designers and showcased 11 times to an average of 45 stakeholders. The largest showcase had 116 people in attendance, and we opened a live poll to vote on design options.

In my second year, I carried the learnings from the KYC redesign over to its umbrella application (the client onboarding platform). The human-centered design practice proved successful thanks to collaboration with the business, development teams, and user groups, who diligently attended the working sessions and continually provided feedback.

Redefining the Problem

Before running the focus groups, a heuristic evaluation and contextual inquiry helped determine the high-level design goals. Although the business had prioritized the issues within the application, navigating through the experience firsthand allowed me to identify underlying structural problems. Additionally, a System Usability Scale (SUS) questionnaire administered to a user group yielded a score of 64.5, a C- at best in terms of perceived usability.
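
For context on that number: SUS is scored from ten items rated 1 to 5 and lands on a 0-100 scale, where the commonly cited average is 68. Below is a minimal sketch of the standard scoring formula in TypeScript; the sample responses are hypothetical, not actual study data.

  // Standard SUS scoring: odd-numbered items contribute (response - 1),
  // even-numbered items contribute (5 - response); the sum is multiplied
  // by 2.5 to land on a 0-100 scale.
  function susScore(responses: number[]): number {
    if (responses.length !== 10) throw new Error("SUS requires 10 responses");
    const sum = responses.reduce(
      (acc, r, i) => acc + (i % 2 === 0 ? r - 1 : 5 - r),
      0
    );
    return sum * 2.5;
  }

  // Hypothetical respondent; averaging scores like this across a
  // user group is how a figure such as 64.5 is reached.
  console.log(susScore([4, 2, 3, 3, 4, 2, 3, 2, 3, 3])); // 62.5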

Stakeholders and users alike were convinced that reducing the number of clicks was the answer to their problems and encouraged me to put more information on the screen wherever possible. There was such an apparent distaste for unnecessary mouse clicks that it became standard practice to display as much information on screen as possible, resulting in a bevy of nested horizontal scrolling windows.

The funny thing is, users did not complain about that; it never registered as a pain point, so the business never considered it a problem to solve. Meanwhile, the application would frequently ‘time out,’ causing pages of data loss, user frustration, and frantic efforts to input vast amounts of information accurately before the system cut them off again. I also observed users repeatedly clicking their mouse out of boredom or frustration while waiting for data to load. Reducing clicks should not be an evaluation criterion in itself; at times, purposeful friction justifies ‘clicks.’

Figure 2.1. Before (left) and After (right) the redesign with improved use of space, navigation, and branded UI

Figure 2.2. From requirements to design: breaking apart a dense interface without sacrificing the need for more data. The new design prevents errors by removing unnecessary inline editing and giving each task its own focus.

Design

To change their perception, I had to understand and specify the context of use. By defining task models, I could guide stakeholders through the proposed solution and redefine the success criteria. If the goal was to improve the efficiency of data entry, then time-on-task was a clearer definition of success than the number of clicks or steps.
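
To make that success criterion concrete, here is a minimal sketch of how time-on-task could be captured during a moderated prototype session; the task name is hypothetical.

  // Minimal time-on-task timer for prototype sessions.
  // Task IDs are illustrative only.
  class TaskTimer {
    private starts = new Map<string, number>();

    start(taskId: string): void {
      this.starts.set(taskId, performance.now());
    }

    complete(taskId: string): number {
      const start = this.starts.get(taskId);
      if (start === undefined) throw new Error(`Task ${taskId} was never started`);
      this.starts.delete(taskId);
      return (performance.now() - start) / 1000; // seconds on task
    }
  }

  const timer = new TaskTimer();
  timer.start("create-onboarding-agreement");
  // ...participant works through the flow...
  const seconds = timer.complete("create-onboarding-agreement");
  console.log(`Time on task: ${seconds.toFixed(1)}s`);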

The new design allowed users to focus on completing one task at a time, with tasks grouped by type. In the case of the onboarding agreements, users were either sending or receiving data packages. We borrowed extensively from the familiar pattern of composing an email so that any user would recognize the interactions. Following established patterns saved time learning new habits and made the experience intuitive for new users.

The prototype test proved the redesign successful. Users completed their tasks promptly, and their feedback revealed themes for near-term and future enhancements. Enhancements such as rolling up data in the table to make it easier to browse were low-hanging fruit, while automation features such as auto-fill and auto-save offered the greatest return on investment.
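
To illustrate why auto-save promised such a return given the timeout problem described earlier, here is a minimal debounced draft-saver; the endpoint and field names are assumptions for the sketch, not the application's actual API.

  // Debounced auto-save: persists the form draft a moment after the
  // user stops typing, so a session timeout no longer wipes pages of work.
  // `saveDraft` is a hypothetical persistence call.
  function createAutoSaver(
    saveDraft: (draft: Record<string, string>) => Promise<void>,
    delayMs = 2000
  ) {
    let timer: ReturnType<typeof setTimeout> | undefined;
    return (draft: Record<string, string>) => {
      if (timer !== undefined) clearTimeout(timer);
      timer = setTimeout(() => void saveDraft(draft), delayMs);
    };
  }

  const autoSave = createAutoSaver(async (draft) => {
    await fetch("/api/case-drafts", { // hypothetical endpoint
      method: "PUT",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(draft),
    });
  });

  // Wire to the form: every change schedules a save.
  document.querySelector("form")?.addEventListener("input", (e) => {
    const form = e.currentTarget as HTMLFormElement;
    autoSave(Object.fromEntries(new FormData(form)) as Record<string, string>);
  });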

“The UX design format showing the options and the workflow is extremely helpful for the business to see and conceptualize the expected tech delivery – a significant upgrade from mock-ups and walkthroughs that didn’t have the same layout previously...”

Executive Director, Prime Brokerage, J.P. Morgan

Figure 3.1. The framework used to define prototype testing included a clear problem statement, high-level journey maps, and task models.

Figure 3.2. The executive summary of prototype testing results

Figure 3.3. An example of qualitative measures of usability

Figure 3.4. An example of quantitative measures of usability

Collaboration Process

There were also many constraints on cross-department collaboration. Sourcing the user base was a big hurdle, as the business often protected its teams’ time. In such a large organization, I had to effectively communicate the value of UX to every new unit I encountered. It was essential to share work openly (successes and failures), initiate collaboration, and maintain smooth interpersonal relationships.

Design solutions stemmed from business requirements, which were often shifting targets; anticipating delays was part of working in a large organization. I needed a process to assess scope and manage the volume of work requests, and the user-centered design process needed planning to achieve quality results. Although I aimed to recommend the entire design cycle, completing every deliverable was unnecessary. My minimum requirements for user-centered methodology within the organization were as follows:

  • Observation or one-on-one interviews for quick turnarounds

  • Recurring focus groups for larger projects

  • Vision planning for open-ended requests

  • User testing for all new features

Figure 4. Framework for selecting user-centered methodologies

I worked to consolidate interaction patterns because, more often than not, similar problems occurred across different lines of business. I aligned closely with the Center of Excellence to reduce duplicated effort and refine the pattern library. My applications indexed high on table usage; the information had to be available at all times, at the user’s fingertips. As a result, mountains of data unintentionally filled the screen, causing information overload.

Without design interpretation, direct user feedback led to poor user experiences and crowded interfaces that could not scale. Adopting new interaction patterns offered focus, satisfied productivity goals, and changed the system for good. These principles guided the threshold for information density:

I. ADA compliance

  • Disarm design push-back with accessibility standards. Does the proposed solution meet AA accessibility standards?

II. Reduce visual noise

  • Limit distraction by asking these questions: Is this design simple? Can we achieve the same goals with less? Example: Do we need all these colors to add value? Can a single-column form take out the guesswork of data entry?

III. Make signposts visible

  • Guide the users through emphasis. Does the navigation blend into the content?

  • Provide context for the information displayed. Can I understand the data trail?

  • Emphasize actionable items. What do I do next?

IV. Break up monotonous behavior

  • Orchestrate interactions for a delightful experience. Does opening up a table within a table within a table spark joy? Here are some alternatives that kept the focus on the task at hand.

  1. Progressive Disclosure (adds narrative)

  2. Wizards (adds clarity)

  3. Pagination over lazy load (choice adds a moment of pause; a sketch follows below)
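
As a sketch of the pagination alternative, here is a minimal page slicer; the record shape and page size are illustrative assumptions.

  // Pagination over lazy load: the user explicitly requests the next
  // page, adding a deliberate pause instead of an endless scroll.
  interface Page<T> {
    items: T[];
    page: number; // 1-based page index
    totalPages: number;
  }

  function paginate<T>(records: T[], page: number, pageSize = 25): Page<T> {
    const totalPages = Math.max(1, Math.ceil(records.length / pageSize));
    const current = Math.min(Math.max(page, 1), totalPages);
    const start = (current - 1) * pageSize;
    return {
      items: records.slice(start, start + pageSize),
      page: current,
      totalPages,
    };
  }

  // Hypothetical usage: render 25 cases at a time instead of one long table.
  const cases = Array.from({ length: 120 }, (_, i) => ({ id: i + 1 }));
  const firstPage = paginate(cases, 1);
  console.log(firstPage.totalPages); // 5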

Key Findings

What was most rewarding was watching innovation unfold through user-centered practices. These methodologies can transform organizations, grease the wheels of collaboration, and break through organizational silos. In a large, established organization, innovation is best managed through evolution rather than revolution. What I found is that people are agents of change; what matters is the team you take the journey with. Below are two valuable insights I picked up along the way.