Simplify complex writing
Overview:
Organization: QRA Corp
Product: QVscribe - Server Integration Software
Role: Lead Product Designer
Industry: B2B Aerospace and Defence
Toolstack: Figma, FigJam, GitHub, Adobe Suite
Introduction
In this case study, we delve into QVscribe, a software solution designed to tackle challenges caused by poorly written engineering requirements. If you are not familiar with what exactly a requirement is, let me set the scene.
Imagine writing out every part of a spacecraft—its hardware, software, and calculations—with flawless precision to prevent critical failures. The clear, concise documentation you are authoring is known as requirements.
Here, QVscribe acts as your essential tool, reviewing and refining your precision requirements in real-time, highlighting ambiguities, and ensuring that every detail is communicated effectively to enhance project success and safety by lowering risk.
the starting point
I came on as the lead product designer when the first QVscribe concept was conceived by our Research and Development department. The initial concept was, of course, rough, but the value outcomes had already been outlined.
Our business objectives:
Increase Efficiency: Create an intuitive interface to simplify requirement scoring, boosting adoption and increasing sales potential.
Improve Decision-Making: Provide actionable insights that support better decision-making when authoring.
Enhance Analysis: Add additional features that could indicate risk, enhancing the product's appeal to new and current users.
QVscribe already had proven user needs through our existing R&D contracts and partnerships with organizations such as Lockheed Martin Skunk Works, the United States Air Force, and NASA. Automatic access to users through these contracts gave us the perfect starting point to learn more about what was required from the product.
First Initial Concept of QVscribe from the R&D Department.
understanding our users
Having immediate access to organizations with large global workforces specializing in engineering requirements allowed us to run focus groups. From these groups, we created user segments, mapped out user interactions, and collected analysis requests/features.
In these discussions, we identified the following four key user segments that could benefit from QVscribe:
Requirement Consumer
They interpret and distribute requirements for execution within their company and external sources. They review requirements for risks that occur when executing.
Standards and Compliance Enforcer
They assign standards, review requirements for compliance, and collaborate on developing new standards to maintain quality and regulatory compliance.
Requirement Author
They write detailed requirements aligned with the project scope and standards to ensure clear, actionable guidelines. They report directly to the project and review manager.
Project and Review Manager
They oversee the entire project from reviewing authors' requirement quality, working with enforcers to ensure high-quality standards and delivering requirements to consumers.
User Interactions Mapping
Each group has distinct tasks that they complete themselves but must collaborate effectively to ensure overall project success.
User interaction map that outlines the actions between each user group.
Finding the Problems
Using the interaction map and the information gathered in our focus groups, we started evaluating where QVscribe could provide value and improve the workflow.
Daily, all of these users were working in pre-existing Requirement Management (RM) tools such as IBM DOORS, Jama, and Polarion. To use QVscribe in its concept form, they would have to copy and paste between the RM tool and Microsoft Word.
The next insight was that every user's main job revolved around the idea of risk. Authors write low-risk requirements, enforcers define what risk is, project managers make sure the overall project is low-risk, and consumers assess risks that could surface during execution. We observed that the current star-based scoring system wasn't resonating with this risk mindset.
The last problem area: our project managers and enforcers were getting bogged down by the number of requirements they had to review individually when all they wanted to know was the overall quality of the document.
Linking Business Objectives to User Needs:
Objective 1: Increase Efficiency
Missing Requirement Management Tool Integrations
Objective 2: Improve Decision-Making
Replace Non-Risk-Based Scoring System
Objective 3: Enhance Analysis
Lack of Full Documentation Analysis
New Analytical Features
User Requests for New Analytical Features:
Ability to View:
Found Units
Found Terms
EARS (Easy Approach to Requirements Syntax) compliance while authoring a requirement
Similar Requirements
Ideation
After the information-gathering stage, we focused on addressing our users' problems while also leaving room for the new feature requests the development team had started working on. During initial flow sketching, we collaborated closely between the teams to ensure that the capabilities of the new features were consistent and feasible.
Initial Flow Diagram
The launching pad for the design was seamless integration with our users' existing Requirement Management (RM) tools. The current concept was inefficient, as users wasted time copying and pasting between tools. I determined that we not only needed to integrate directly into their RM tools, but could also accommodate a wider range of users working across different RM tools by maintaining consistent design grammar and standards.
We concluded that it was important to provide our managers and enforcers with the capacity to view requirements at the document level rather than individually. This approach would significantly reduce their review time for delivery to consumers.
I replaced the existing star-based system with a more traditional "stop-light" colour system and expanded it from 3 to 5 levels to offer more incremental feedback while writing a requirement. The red, yellow, and green colours are well-established symbols of risk in everyday life, which aligns better with our users' mindset.
Flow diagram presenting initial user actions, score system, and screen breakdown.
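As a minimal sketch of the idea, the 5-level stop-light scale can be modelled as a simple lookup from score to colour band and risk label. The specific colours and labels below are illustrative assumptions, not QVscribe's actual values:

```python
# Hypothetical sketch of a 5-level "stop-light" risk scale.
# Band colours and labels are illustrative, not QVscribe's actual values.

RISK_LEVELS = {
    1: ("red", "high risk"),
    2: ("orange", "elevated risk"),
    3: ("yellow", "moderate risk"),
    4: ("light green", "low risk"),
    5: ("green", "minimal risk"),
}

def risk_level(score: int) -> tuple[str, str]:
    """Map a 1-5 requirement quality score to a colour band and label."""
    if score not in RISK_LEVELS:
        raise ValueError(f"score must be 1-5, got {score}")
    return RISK_LEVELS[score]
```

Expanding from 3 to 5 levels gives authors finer-grained feedback as a requirement improves, while the red-to-green gradient keeps the at-a-glance risk reading intact.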
Low-Fi Wireframes
While completing the previous flow diagram, our development team finalized their exploration of interactions across various RM tools. We decided to integrate scores and our icon within these tools but opted to use an iframe to display additional information due to developmental restrictions. This approach was incorporated into our wireframes to ensure a consistent user experience across our reviewing and authoring flows.
For the review screens, I aimed to deliver document-level quality stats that provide clarity on document health at a glance. To streamline the information presented, I decided to create a secondary tab on the score summary panel for the problem type breakdown. This change was made because most users focused primarily on the score and its impact on them, rather than the detailed breakdown of the problems themselves.
For the authoring screens, I focused on efficient requirement editing by highlighting the most critical errors at the top and listing them in descending order of perceived risk. Introducing this listing UI card system allowed us to better organize and present all types of output information.
QVscribe low-fi concept illustrating document-level review and individual requirement authoring, starting from existing requirement management tools.
In this prototype, we opted for a simple, minimalistic colour scheme to enhance user focus and readability. The restrained use of colour prevents visual clutter while strategically highlighting errors in requirements. Different alerts—such as quality, warnings, units, terms, similarity, and EARS compliance—are colour-coded to effectively draw attention. This approach ensures critical issues are immediately noticeable without overwhelming the user. By using colour sparingly, we create a clear, distraction-free authoring and reviewing experience, making it easier for users to concentrate on content and efficiently address problems.
Design
QVscribe high-fi concept illustrating document-level review and individual requirement authoring, starting from existing requirement management tools.
usability testing
Task-based Testing
When testing our prototype, the first key question was whether the updated design still allowed requirement authors to improve their writing using QVscribe. This was crucial because the success of other user actions depended on seeing results improve or decline. I conducted user sessions where participants edited a requirement from a score of 1 to 4 or higher using the new design. We included authors from junior to senior levels. While most succeeded, two very junior users struggled, often stopping at a score of 3; they would need additional guidance in the product's future vision.
The second key question was whether project managers, compliance enforcers, and requirement consumers could detect the quality of their documents, as this information is crucial for their work. In these user sessions, I presented three different documents on our review screen and asked users to rate the documents as poor, fair, good, very good, or excellent. Documents A and C, representing the extremes of the quality scale, clearly demonstrated that users could detect quality levels. However, Document B, which was in the middle of the score scale, caused some difficulty for users.
Can requirement authors improve their quality using QVscribe?
18/20 users could accurately edit a requirement from score 1 to score 4 or above.
Can project managers, compliance enforcers, and requirement consumers detect the quality of a document?
Users could detect document quality, with extreme examples (Documents A and C) showing clear results, while the middle example (Document B) posed challenges.
Implementation
Development Handoff
During the development handoff for QVscribe, I emphasized design precision and consistency using Figma’s Developer Mode. This included providing exact hex codes for color accuracy and overseeing asset optimization for clarity and performance. I also built a comprehensive design library in Figma, standardizing UX/UI elements and facilitating smooth integration with existing RM tools. By incorporating feedback from our development team directly in Figma, I minimized usability concerns related to integrating multiple RM tools. This approach ensured clear communication and an efficient handoff between design and development teams.
Example of assets provided through Figma Developer Mode.
Launch Metrics
Upon launch, QVscribe received positive initial feedback for its enhanced capability to identify and refine engineering requirements. In the following year, QVscribe achieved a sales growth rate of 57%, reflecting improvements in the interface.
User satisfaction with the document review screens was notably high, making it a key feature for adoption. Over the following months, our adoption rate increased by 22.5%. This enhancement enabled users to review documents consistently across platforms with ease and facilitated sharing findings with colleagues without requiring detailed analysis of the entire document.
Additionally, the new risk scoring helped enforcers, consumers and managers by allowing them to provide critiques diplomatically, using QVscribe as an impartial source: “I’m not saying it’s bad; QVscribe is saying it is.” This approach effectively eased their workload and softened the impact of their feedback.
Reflection
Problems & Solutions
Problem 1: Increase Efficiency
Solution: Implemented integrations with popular requirement management tools such as IBM DOORS, Jama, and Polarion. By automating data exchange, QVscribe minimizes the need for manual transfers, reducing errors and saving time.
Problem 2: Improve Decision-Making
Solution: Replaced the current star-scoring system with a risk-based one. This enhanced decision-making by helping teams prioritize critical requirements that pose the greatest risks to the quality of their document, leading to more informed decisions.
Problem 3: Enhance Analysis
Solution: Added functionality that allows QVscribe to analyze entire documents for terms, units, similar requirements, and quality. This enhancement would enable a holistic review of all relevant documentation, improving the quality and completeness of the analysis.
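As a purely illustrative sketch of one piece of this analysis, similar requirements can be flagged with a simple token-overlap (Jaccard) comparison; the actual QVscribe analysis is more sophisticated, and the threshold here is an assumption:

```python
def jaccard(a: str, b: str) -> float:
    """Token-overlap similarity between two requirement texts (0.0-1.0)."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta | tb else 0.0

def similar_pairs(reqs: list[str], threshold: float = 0.6):
    """Flag index pairs of requirements whose similarity meets the threshold.

    The 0.6 threshold is an illustrative assumption, not QVscribe's value.
    """
    return [
        (i, j, round(jaccard(reqs[i], reqs[j]), 2))
        for i in range(len(reqs))
        for j in range(i + 1, len(reqs))
        if jaccard(reqs[i], reqs[j]) >= threshold
    ]
```

Surfacing near-duplicate requirements this way helps reviewers spot redundancy and contradiction across an entire document rather than one requirement at a time.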
Lessons Learned
Allowing users to conduct both document and individual requirement analysis is critical for viewing a document in its entirety. This approach provides significant value by potentially reducing risk.
Providing clear error indications while seamlessly integrating into the user’s workflow has been essential for successfully meeting business objectives.
Future Improvements
Following the project launch, we identified areas for improvement to focus on in future releases and the development roadmap:
Suggestion 1: During task-based user sessions, our junior users struggled to raise a requirement’s score above 4. I recommend expanding the guidance provided for each problem type to better support learners, which should allow further growth of the user base.
Suggestion 2: Implement a glossary system for term consistency analysis. Our users need specific industry terms to have definitions available inline with their editing to ensure correct usage.
This case study highlights QVscribe, emphasizing the product's design process and outcomes. Through detailed focus groups, task-based user sessions, and follow-through in execution, we successfully created a product that helps users write clear, concise requirements and reduces document-review overhead.