Custom Grants Management System: Enhancements and Redevelopment

TL;DR: We had a bundle of leftover support hours for a custom grants management system, so I decided to conduct some informal usability testing. This revealed more opportunities for improvement than we could address at the time, so we focused on light enhancements (e.g. streamlining and rewording form questions). We were able to use this information later when redeveloping the system. During redevelopment, I used informal surveys and contextual inquiry both for requirements gathering (for 6 user types) and for testing the developer's prototypes. I left the organization prior to project completion, but by that point, user feedback had been elicited and distilled into requirements and prototype revisions for the developers.

Challenge | Approach | Results | Tools + Methods

Challenge

The Cal Ripken, Sr. Foundation used a custom grants management system to manage grants provided to community organizations to run out-of-school-time youth programming. Data collection and grants management procedures needed to meet federal grant requirements. However, the program team, which managed the grants and worked with community partners (or grantees), had received plenty of anecdotal feedback that the system was not user-friendly, fielded many questions from grantees about how to submit their reports, and spent a significant amount of time helping them troubleshoot. Internal staff also used the system to manage the grants within their portfolios, and although it was custom-built around the business process, they found it clunky as well. The system created sticking points for a variety of stakeholders. We did not initially have the funds to make any major changes, so we needed to identify and prioritize issues that we could remedy quickly and inexpensively until more budget became available.

Approach

We had several types of users: external community partner staff, internal program staff, and internal finance and grant administration staff. As we were a small organization, it was easier for us to train and provide support to our internal staff, so we focused on improving usability for our community partners.

Usability Testing

As our community partners were often very busy, I decided to conduct a usability test with a new staff member who had no need to use the system but had previously worked for one of our community partners in the same role as our external users.

I created a list of tasks that our community partners would need to complete in the system (e.g. log in, submit an application, submit a report), paired with the e-mail instructions they would have received. I scheduled time with our new colleague to sit down and watch him work through the tasks, taking notes and screenshots whenever he had a question, found something confusing, or had a suggestion. Some of the issues that emerged from this testing, and from talking with program staff about frequently asked questions, included the following:

  • Wording on forms – It was unclear what many of the questions were asking; much of the language was taken directly from the aggregate reporting deliverables our organization owed to funders. In addition, some of the questions appeared to be redundant. In some cases, organization-specific terms were used, which was especially confusing on applications potentially filled out by partners new to our organization. Some reports were named “June” or “September” after a calendar that now had multiple report timelines, which also confused partners on a different reporting schedule.
  • Form input types – Some fields that should have added up to a total did not auto-calculate, and some sections were missing validation (e.g. demographic data that should add up to 100%). Monetary fields required numeric input but did not display commas after entry, making it easy to mistype an amount and difficult to spot the mistake at a glance. (A sketch of these fixes follows this list.)
  • Separate logins for partners with multiple grant types – The organization had added a new type of grant in the past year and set these up under separate accounts to avoid confusion, but since both grants were usually managed by the same person at a community site, the separate logins caused more confusion instead.
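
To illustrate the kind of input handling the forms lacked, here is a minimal sketch in TypeScript. This is not the actual system's code or stack; the function names and the field values are hypothetical, chosen only to show auto-calculation, percentage validation, and currency echo-back.

```typescript
// Hypothetical sketch of the input handling the forms were missing.

// Auto-calculate a total from its component fields instead of asking
// users to sum them by hand.
function autoTotal(values: number[]): number {
  return values.reduce((sum, v) => sum + v, 0);
}

// Validate that demographic percentages add up to 100%, allowing a
// small tolerance for rounding.
function validatePercentages(percents: number[]): string | null {
  const total = percents.reduce((sum, p) => sum + p, 0);
  return Math.abs(total - 100) < 0.01
    ? null
    : `Percentages must add up to 100% (currently ${total}%).`;
}

// Echo monetary input back with grouping separators so a mistyped
// amount (e.g. 250000 vs. 25000) is visible at a glance.
function formatCurrency(amount: number): string {
  return new Intl.NumberFormat("en-US", {
    style: "currency",
    currency: "USD",
  }).format(amount);
}

console.log(autoTotal([1200, 850, 300]));       // 2350
console.log(validatePercentages([40, 35, 20])); // "Percentages must add up to 100% (currently 95%)."
console.log(formatCurrency(250000));            // "$250,000.00"
```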

Enhancements

I reviewed all of the forms and did the following:

  • Removed questions that were no longer needed
  • Consolidated redundant questions or reworded them so that users could distinguish between them
  • Reworded the questions into plain language and/or included definitions where warranted
  • Renamed any confusingly named forms, sections, etc.

I also talked with the finance and grants administration staff about consolidating the logins for partners with multiple types of grants, which they implemented for the next program year.

From an internal staff perspective, we needed to be able to extract data from the system, as the canned reports had fallen out of sync with the current form fields. The temporary solution, until we had more of a budget, was to request a way to export all of the data.

I worked with the developers to get the system changes made, including conveying our requirements and performing user acceptance testing together with a colleague whose work I directed on this project.

Redevelopment: Requirements Gathering

I was the project manager for the redevelopment of the grants management system as well.

There were multiple user types to be considered in redevelopment:

  • Finance/Grant Approvers
  • Finance/Grant Reviewers
  • Program Approvers
  • Program Reviewers
  • Program Data Managers
  • Community Partner Staff (external)

This time around, because this was a full system redevelopment, I spent a lot of time making sure that we had a solid understanding of the business processes, since they were often closely tied to funding or compliance requirements. That understanding ensured the new system could meet organizational needs while still improving usability for everyone. Some of the issues in the existing system had stemmed from years of adding new items onto an old structure, a common hazard of custom-built systems.

In order to gather requirements from multiple stakeholders and synthesize these into a cohesive structure of actionable items for the developers, I informally conducted the following types of user research:

  • Surveys on workflow and need-to-know information – I followed up with individuals about further questions.
  • Contextual inquiries – In some cases, I also watched users complete parts of their workflow to see where the pain points were.

This helped me identify opportunities, such as the need to draw users' attention to outstanding items or deliverables nearing a deadline. It also clarified which needs were shared across all users, which were specific to a user type, and which were individual preferences rather than business requirements.

Note: At the time, I was myself a Program Data Manager user. As such, I was responsible for, and extremely familiar with, what our organization was required to report to funders. In addition, I had previously been a Program Reviewer and was fairly familiar with those needs as well. The other internal member of the core project team was both a Finance/Grant Reviewer and a Program Data Manager.

Redevelopment: User Acceptance Testing

Our contract included working prototypes as part of the development process. As the project manager, I was responsible for the user acceptance testing, but due to the wide range of users and the differences in how they used the system, I conducted the testing together with users from each group. Although this was technically acceptance testing, I ran it much like a usability test: I provided a list of tasks and watched users complete them.

I compiled all of the feedback and usability testing data and provided the developer with specific requests and changes based on that information.

Results

The initial enhancements made it easier for grant sites to interpret the questions correctly, so there was less back-and-forth with program staff asking about a number that looked like an extreme outlier, only to find that the partner staff member completing the report had understood the question differently. However, it was clear more improvements needed to be made.

Although I left for another job prior to completion of this project, the working prototypes were fairly well-developed and incorporated a number of very helpful features for increased usability. For example:

  • A homepage with a dashboard specific to the user type, which mapped to the account's permission level. Previously, all users logged in to the same screen, which had been designed around the needs of internal staff. The dashboard allowed users to see at a glance what required their attention, along with an aggregate count of items at each stage of the process.
  • Easier navigation for internal staff across their portfolio of grants, with visual status indicators for deliverables on the main grant record.
  • The ability for internal users to view all items (e.g. reports) at a particular stage of the process (e.g. pending approval) rather than having to click into each individual grant record to reach each form. This saved a lot of time, as internal users were responsible for anywhere from 30 to 200+ grants at a given time.
  • Differentiated grant types, so that based on the type of grant awarded, community partners (and internal staff!) would see only the relevant deliverables. Previously, all grants showed two semiannual reports due, even though a newer grant type only required a year-end report. (A sketch of this idea follows the list.)
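
As a rough illustration of the grant-type differentiation, here is a sketch in TypeScript of deliverables keyed to grant type. The type names, report names, and due months are hypothetical, not the system's actual data model.

```typescript
// Hypothetical sketch: deliverables keyed to grant type, so each grant
// surfaces only the reports its type actually requires.
type GrantType = "standard" | "yearEndOnly";

interface Deliverable {
  name: string;
  dueMonth: number; // 1-12
}

// Each grant type declares its own reporting schedule.
const deliverablesByType: Record<GrantType, Deliverable[]> = {
  standard: [
    { name: "Midyear Report", dueMonth: 1 },
    { name: "Year-End Report", dueMonth: 7 },
  ],
  yearEndOnly: [{ name: "Year-End Report", dueMonth: 7 }],
};

// A grant record shows only the deliverables for its own type, instead
// of every grant displaying two semiannual reports regardless of type.
function deliverablesFor(grantType: GrantType): Deliverable[] {
  return deliverablesByType[grantType];
}

console.log(deliverablesFor("yearEndOnly").map((d) => d.name));
// ["Year-End Report"]
```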

The prototypes were already a huge improvement over the existing system. Much of the work that remained to be done was either incorporating those changes at scale or related to building out the updated reporting functionality.

Tools + Methods

  • Project management
  • Requirements gathering
  • Contextual inquiry
  • Usability tests
  • Surveys