
Revolutionising Jira's usability and workflow with AI
Background
This is a project I did for the SUEDE x Atlassian 2023 Designathon. The design brief asked us to come up with a solution involving AI integration into Jira. My teammates and I researched, designed, and conducted usability tests within a week. Given the short timeframe, our goal was to build a solution that solves the problem without much interference with the existing interface.
Project type
Design challenge
Role
UX researcher
UI designer
Tools
Figma, Miro
Team
My Nguyen
Erien Simon
Taylor Lee
Ha Trang Trinh
Duration
1 week
Summary
Problem
How might we make managing work in Jira radically simpler for people with AI?
Outcome
We designed a prototype and wrote a case study that placed us in the top 5 of the competition.
THE BRIEF
Struggles with learning curve on Jira
In the current landscape of project management software, Jira stands out as a leader with a resolute mission: to unlock the full potential of every team. However, as more and more teams turn to Jira to streamline their work processes, a significant challenge emerges. The platform requires users to acquaint themselves with a myriad of new concepts and technical terms, demanding substantial time and effort. This learning curve can prove time-consuming and burdensome, hindering teams from achieving their objectives efficiently.
In response to this challenge, Jira is poised to leverage the power of artificial intelligence (AI) to simplify complex tasks, save users valuable time, and empower teams to advance their missions more efficiently. This strategic move holds the potential to make Jira an even more indispensable tool for project management.
01 RESEARCH
Investigate the factors that exacerbate the challenges faced by users
To address the challenges that users face when learning new concepts and technical terms in Jira, a comprehensive understanding of these issues was imperative. To achieve this, we conducted online ethnography and interviews. This research approach enabled us to collect valuable insights, which were subsequently organised into an affinity diagram.
Within this extensive dataset, we identified three primary pain points:
Inadequate guidance for users
Users are struggling due to the lack of clear and comprehensive guidance, making it challenging for them to effectively utilise the platform
Lack of clarity regarding functions and sections
The ambiguity surrounding functions and sections within the application has resulted in uncertainty about their intended purpose, compounding the learning curve
Lack of accommodation for different workflows
Jira predominantly caters to IT and software development, leaving other workflow management needs unaddressed. This limitation hinders its adaptability and utility for a broader range of users.
Transform user painpoints to user needs
Based on users’ pain points, there are two user needs that we should focus on when designing the product.
Clear and concise explanations
Users require easily accessible, clear, and concise explanations of terms, functions and sections within the platform to enhance their understanding
Personalised guidance
Users seek personalised and user-friendly guidance to navigate and address challenges within the platform effectively
02 DEFINE
Categorise user groups on Jira
Based on insights from our research phase, we developed two personas: Jill and Ryan. Their common goal is to enhance their efficiency in task and project management with a user-friendly interface, but their specific needs differ. Jill seeks simplified processes and contextual explanations, while Ryan looks for personalised guidance and recommendations to enhance his productivity.
Reframe perspectives to problem-solve
To inspire innovative solutions, we employed the ‘How Might We’ method to reframe user challenges and pain points as open-ended questions, laying the groundwork for our ideation process. This led to the generation of three distinct concepts:
How might we make the functions clearer to users?
AI assistant
Displaying pop-up explanations of a function's purpose and how it relates to the user's current task
How might we guide users to effectively use the platform?
AI Checklist
Providing a checklist of recommended actions needed to complete the task
How might we be more accommodating to users with diverse workflows?
AI-based recommendation system
Offering personalised suggestions on actions and tools based on the user's past interactions
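As an illustration only (none of this code was part of the project, and the names are hypothetical), the core of such a recommendation could be as simple as ranking actions by how often the user has performed them:

```typescript
// Hypothetical sketch: rank Jira actions by how often the user performed them,
// so the most frequent ones can be surfaced as shortcuts.
type Interaction = { action: string; timestamp: number };

function recommendActions(history: Interaction[], limit = 3): string[] {
  // Count how many times each action appears in the interaction history.
  const counts = new Map<string, number>();
  for (const { action } of history) {
    counts.set(action, (counts.get(action) ?? 0) + 1);
  }
  // Sort by descending frequency and keep the top `limit` actions.
  return [...counts.entries()]
    .sort((a, b) => b[1] - a[1])
    .slice(0, limit)
    .map(([action]) => action);
}
```

A production system would weigh recency and context as well, but frequency alone already captures the "tools users frequently follow" idea behind the concept.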
To assess these concepts, we utilised the ‘Design Matrices’ method, which involved mapping specific criteria against each idea to evaluate its suitability. Initially, the AI-based recommendation system emerged as the preferred concept through this evaluation. However, following extensive discussions and a deeper analysis, we recognised that this concept might not address all user pain points as comprehensively as desired. In response, we strategically decided to combine all three concepts into two primary features: ‘Explain AI’ and ‘QuickStart AI’. This approach ensures a more comprehensive and user-centric solution to streamline the Jira user experience.
Chosen solutions
Explain AI
Explain AI was designed to bridge the knowledge gap, aiding users in comprehending technical terms across different projects and providing clarity on sections and functions within Jira when needed.
When users hover over or click on specific functions, fields, or buttons, it activates contextual pop-up explanations, delivering clear and concise information about the selected element’s purpose and its relevance to the user’s ongoing task.
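The lookup step behind this interaction is simple to sketch. Assuming a small glossary of Jira terms (the entries and function name here are our own illustrative wording, not Jira's), a hover handler only needs to resolve the element's term and show the result in a tooltip:

```typescript
// Hypothetical sketch of Explain AI's lookup step: map a UI term to a short,
// plain-language explanation, falling back to a generic prompt when unknown.
const glossary: Record<string, string> = {
  sprint: "A fixed period in which the team completes a set amount of work.",
  backlog: "The ordered list of work items that have not been scheduled yet.",
  epic: "A large body of work that can be broken down into smaller issues.",
};

function explainTerm(term: string): string {
  // Case-insensitive lookup with a fallback for terms not yet covered.
  return glossary[term.toLowerCase()]
    ?? `No explanation available for "${term}" yet - ask the AI assistant.`;
}
```

In the real feature the explanation would also be tailored to the user's ongoing task; the glossary fallback is where the AI assistant would step in.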
QuickStart AI
The current 'Quickstart' button in Jira guides users in setting up projects, but it falls short of users' broader needs. To harness its full potential, we're introducing AI integration, optimising the interface without introducing new tools or spaces. This streamlines the design process and maximises the existing interface.
QuickStart AI is like a helpful assistant that personalises tasks based on the user's needs. It provides a checklist of steps that can be completed automatically or manually, with guidance. Additionally, QuickStart AI provides shortcuts to templates, features, or workflows that users frequently follow within a board to increase their efficiency.
04 PROTOTYPE
We started designing low-fidelity wireframes to illustrate the locations of the features and their workflows.
Explain AI
QuickStart AI
05 TEST
Following the creation of our wireframes, a comprehensive evaluation was conducted using the Cognitive Walkthrough method, complemented by the System Usability Scale, involving three participants. The key findings reveal the following:
Difficulty in Recognizing Available Features/Actions
Users encountered challenges in identifying available features and actions, underscoring the importance of providing clear guidance within the interface.
Unclear Representation of AI Capabilities through Icon Choice
The choice of an initial icon, represented by a “lightbulb”, proved inadequate in effectively communicating the automation capabilities of the AI to users. Clarity and transparency in iconography emerged as vital considerations for improved user understanding.
These findings were then central to our focus as we worked on enhancing the final design, ensuring a more user-friendly and intuitive experience.
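For readers unfamiliar with how the System Usability Scale is scored: each of the ten 1-5 responses is converted to a 0-4 contribution (odd-numbered, positively worded items score response minus 1; even-numbered items score 5 minus the response), and the sum is multiplied by 2.5 to yield a 0-100 score. A minimal sketch of that calculation:

```typescript
// Compute a System Usability Scale (SUS) score from ten 1-5 Likert responses.
// Odd items (positively worded) contribute (response - 1); even items
// contribute (5 - response). The sum is scaled by 2.5 onto a 0-100 range.
function susScore(responses: number[]): number {
  if (responses.length !== 10) {
    throw new Error("SUS requires exactly 10 responses");
  }
  const sum = responses.reduce(
    (acc, r, i) => acc + (i % 2 === 0 ? r - 1 : 5 - r),
    0,
  );
  return sum * 2.5;
}
```

A score around 68 is commonly treated as average usability, which gives a quick benchmark for interpreting the three participants' ratings.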
FINAL DESIGN
Conclusion
Participating in my first design competition taught me crucial lessons. It enabled me to apply and refine skills from university, particularly in understanding and designing for user groups I'm not familiar with. Unlike university projects, which allow students to choose a problem of interest, a design competition provides a restricted brief requiring us to investigate user groups of particular technology proficiency, age, and profession. While it may not always be viable due to resource and time constraints in a competition, I realised the importance of recruiting relevant participants for effective research and usability tests.
Another significant takeaway was the importance of success metrics. How do we know our design helps users navigate Jira? How do we measure it? As we entered the final round of the competition, this was the question the judges asked us. Reflecting on this, I realised that conducting usability tests on our final design would have been a straightforward and effective way to validate our work.
Next Steps:
Conduct more comprehensive user testing and further refine our prototype.
Expand QuickStart AI's potential beyond task assistance, possibly evolving it into a versatile tool that can respond to a range of user queries.
Enhance the user journey map to account for various potential user pain points, which will help inform the design.
Implement a feedback/rating feature for QuickStart AI so users can provide input on their experience, helping us continually enhance the tool.