AI Smart Summary

Research Ideation

An AI-generated summary feature designed to help government agency staff quickly extract key information from record requests, improving processing speed and consistency.


Overview

This project explored the integration of AI technology to improve record request processing for government agencies. Through a mixed-methods research approach combining in-depth user interviews and large-scale feedback collection, we validated that AI-generated request summaries would significantly enhance workflow efficiency for agency staff. The research not only confirmed strong user interest with over 66% of surveyed users rating the AI summaries highly, but also provided critical insights into implementation considerations, allowing for informed decision-making about feature placement, permissions, and display options as we move toward implementation.

My Role

Lead UX Researcher

Timeline

3 months (March 2024 - May 2024)

Tools Used

Miro, Pendo

Deliverables

Research findings, AI prompts

The Problem

Government agencies process numerous record requests daily, requiring staff to manually review and extract key information from often lengthy and unstructured request descriptions. This manual process is time-consuming, prone to inconsistency, and creates bottlenecks in request processing workflows.

Key Issues


Information Overload

Staff must sift through verbose request descriptions to identify critical details, often spending excessive time on initial request assessment


Decision-Making Delays

The lack of quick access to essential request components slows down initial routing and assignment decisions, extending overall fulfillment timelines


Information Gaps

Requestors frequently omit crucial details (such as relevant date ranges or specific departments), forcing staff to rely on institutional memory or follow up for clarification; an AI summary could quickly flag these gaps

Research

To evaluate the potential value of AI-generated summaries for record request processing, I conducted a two-phase research study combining qualitative interviews with quantitative validation to understand user needs and reactions to the proposed feature.

Research Goals

Method 1: Personal Inventory Interviews

Method 2: AI Summary Evaluation Survey

Affinity diagram organizing key research findings

Key Insights

Critical Information Elements

Users consistently prioritize specific information types when processing requests, including date/time parameters, fulfillment possibility, appropriate department/assignee, "any and all" language, location details, and report numbers.
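The elements above map naturally onto a structured schema. The sketch below is a hypothetical illustration of what an AI summary might extract for staff review; `RequestSummary` and its field names are my own invention, not the product's actual data model:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class RequestSummary:
    """Hypothetical schema for the elements staff prioritize in a request."""
    date_range: Optional[str] = None        # e.g. "2023-01-01 to 2023-03-31"
    fulfillable: Optional[bool] = None      # can the agency likely fulfill it?
    department: Optional[str] = None        # appropriate department/assignee
    any_and_all_language: bool = False      # broad "any and all" phrasing present?
    locations: list[str] = field(default_factory=list)
    report_numbers: list[str] = field(default_factory=list)

# Example: a summary extracted from a broad records request
summary = RequestSummary(
    date_range="2023-01-01 to 2023-03-31",
    department="Police Records",
    any_and_all_language=True,
)
```

Representing missing details as `None` or an empty list (rather than a guessed value) mirrors the finding below that requestors often omit key information, and lets the interface surface those gaps explicitly.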

Strong User Interest

104 of 150 survey respondents (69%) indicated they would use the AI summary feature, and 100 (67%) rated it 4 or higher on a 5-point scale.

Workflow Enhancement

Users found the summary easy to read, appreciated its ability to reduce workload, valued it as a starting point, and noted it helped clarify next steps and internal processing.

Usage Concerns

Some users expressed hesitation about relying solely on AI summaries, with specific concerns about potential liability if important information were omitted or misrepresented.

Implementation Considerations

Users expressed interest in editing capabilities and indicated they would still review the original request text, suggesting the summary should complement rather than replace existing workflows.

Research Impact

This early-stage research provided crucial validation before significant development resources were committed. The strong positive response from users gave the product team confidence to move forward with implementation while highlighting specific design considerations around placement, permissions, and editing capabilities. By identifying both enthusiasm and concerns, the research enabled informed decision-making about feature scope and safeguards, ensuring the final implementation will maximize value while minimizing risk.

Reflection

This research project offered valuable insights not only about the specific AI feature but also about our product development process and user needs. The early-stage research approach allowed us to validate hypotheses, identify potential issues, and make informed decisions before significant resources were committed.

Technical Uncertainties

Balancing AI accuracy with user expectations required extensive experimentation with prompting approaches to generate consistently useful summaries
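As one illustration of that experimentation, a summarization prompt can explicitly enumerate the fields staff care about and instruct the model to say so when a detail is absent rather than guess. This is a hedged sketch; `build_summary_prompt` is a hypothetical helper, not the prompt used in the actual product:

```python
def build_summary_prompt(request_text: str) -> str:
    """Assemble a summarization prompt that surfaces the elements staff
    prioritize (dates, department, locations, report numbers, broad
    'any and all' language). Illustrative only."""
    instructions = (
        "Summarize the public records request below in 3-4 sentences. "
        "Call out: any date/time ranges, the likely responsible department, "
        "location details, report numbers, and whether the request uses "
        "broad 'any and all' language. If a detail is missing from the "
        "request, say 'not specified' rather than guessing."
    )
    # Delimit the request text so instructions and content stay separate
    return f"{instructions}\n\nRequest:\n\"\"\"\n{request_text}\n\"\"\""

prompt = build_summary_prompt(
    "I'd like any and all emails about the Main St. paving project from 2022."
)
```

Enumerating fields and forbidding guesses addresses the liability concern raised in the research: the summary should flag information gaps instead of papering over them.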

Research Scope

Determining the right combination of qualitative and quantitative methods to capture both depth of understanding and statistical confidence was a critical early decision

User Trust

Addressing concerns about AI reliability and potential liability required transparent communication about the system's capabilities and limitations

Feature Positioning

Balancing the promotion of the AI summary as a workflow enhancement with the acknowledgment that it should complement rather than replace human judgment

Stakeholder Alignment

Ensuring product teams shared a common vision for how AI could responsibly enhance the user experience without introducing new risks

User Value Confirmation

The strong positive response validated our hypothesis that AI summaries could significantly enhance workflow efficiency while identifying specific value drivers

Implementation Insights

User feedback revealed important considerations about placement, permissions, and editing capabilities that will directly inform our design approach

Trust Requirements

Users' concerns about liability highlighted the importance of transparent AI implementation with appropriate safeguards and human oversight

Research Impact

Early-stage research proved invaluable in reducing development risk by identifying both opportunities and concerns before committing to implementation

Workflow Integration

Understanding that users would still review original requests helped clarify that the feature should enhance rather than replace existing processes

Design Exploration

Develop and test multiple approaches to summary placement and visualization within the request workflow

Permission Framework

Create a comprehensive model for viewing and editing permissions that addresses user concerns while maintaining workflow efficiency

Implementation Planning

Success Metrics

Establish clear measures to evaluate the feature's impact on request processing efficiency and user satisfaction

Continuous Improvement

Create a system for ongoing evaluation of summary quality and relevance based on user feedback and usage patterns
