
Revolutionising Jira's usability and workflow with AI
Overview
This is a project I completed for the SUEDE x Atlassian 2023 Designathon. The design brief asked us to come up with a solution involving AI integration into Jira. My teammates and I researched, designed, and conducted usability tests within a week. Given the short timeframe, our goal was to build a solution that could solve the problem with minimal interference to the existing interface.
Role
UX Researcher
UI Designer
Tools
Figma
Miro
Team
My Nguyen
+ 3 team members
Duration
1 week
Achievement
Our project placed in the top 5 of 23 teams in the competition.
The brief
Struggles with learning curve on Jira
In the current landscape of project management software, Jira stands out as a leader with a resolute mission: to unlock the full potential of every team. However, as more and more teams turn to Jira to streamline their work processes, a significant challenge emerges. The platform requires users to familiarise themselves with a myriad of new concepts and technical terms, demanding substantial time and effort. This learning curve can prove time-consuming and burdensome, hindering teams from achieving their objectives efficiently.
In response to this challenge, Jira is poised to leverage the power of artificial intelligence (AI) to simplify complex tasks, save users valuable time, and empower teams to advance their missions more efficiently. This strategic move holds the potential to make Jira an even more indispensable tool for project management.
Process
Primary research
Investigate the factors that exacerbate the challenges faced by users
Inadequate guidance for users
Users struggle due to the lack of clear and comprehensive guidance, making it challenging for them to use the platform effectively.
Lack of clarity regarding functions and sections
The ambiguity surrounding functions and sections within the application has created uncertainty about their intended purpose, compounding the learning curve.
Lack of accommodation for different workflows
Jira predominantly caters to IT and software development, leaving other workflow management needs unaddressed. This limitation hinders its adaptability and utility for a broader range of users.
Transform user pain points into user needs
To address these challenges, we focused on two key areas: providing real-time and clear explanations and offering personalised guidance for new users.
Clear and concise explanations
Users require easily accessible, clear, and concise explanations of terms, functions and sections within the platform to enhance their understanding.
Personalised guidance
Users seek personalised and user-friendly guidance to navigate and address challenges within the platform effectively.
Personas
Categorise user groups on Jira
Based on insights from our research phase, we developed two personas: Jill and Ryan. Their common goal is to enhance their efficiency in task or project management through a user-friendly interface, but their specific needs differ. Jill seeks simplified processes and contextual explanations, while Ryan looks for personalised guidance and recommendations to enhance his productivity.
Ideate
Reframe perspectives to problem-solve
To inspire innovative solutions, we employed the ‘How Might We’ method to reframe user challenges and pain points as open-ended questions, laying the groundwork for our ideation process. This led to the generation of three distinct concepts:
How might we make the functions clearer to users?
AI assistant
Displaying pop-up explanations of a function's purpose and how it relates to the user's current task.
How might we guide users to effectively use the platform?
AI Checklist
Providing a checklist of recommended actions needed to complete the task.
How might we better accommodate users with diverse workflows?
AI-based recommendation system
Offering personalised suggestions on actions and tools based on the user's past interactions.
To assess these concepts, we used the ‘Design Matrices’ method, mapping specific criteria against each idea to evaluate its suitability. Initially, the AI-based recommendation system emerged as the preferred concept through this evaluation. However, after extensive discussion and deeper analysis, we recognised that this concept might not address all user pain points as comprehensively as desired. In response, we strategically decided to combine all three concepts into two primary features: ‘Explain AI’ and ‘QuickStart AI’. This approach ensures a more comprehensive and user-centric solution to streamline the Jira user experience.
Chosen solutions
Explain AI
Explain AI was designed to bridge the knowledge gap, aiding users in comprehending technical terms across different projects and providing clarity on sections and functions within Jira when needed.
When users hover over or click on specific functions, fields, or buttons, it activates contextual pop-up explanations, delivering clear and concise information about the selected element’s purpose and its relevance to the user’s ongoing task.
QuickStart AI
The current 'Quickstart' button in Jira guides users in setting up projects, but it falls short of their comprehensive needs. To harness its full potential, we introduced AI integration, optimising the interface without adding new tools or spaces. This streamlines the design process and maximises the existing interface.
Quickstart AI acts as a helpful assistant that personalises tasks based on the user's needs. It provides a checklist of steps that can be completed automatically or manually, with guidance. Additionally, Quickstart AI provides shortcuts to templates, features, or workflows that users frequently follow within a board, increasing their efficiency.
Prototype
We started by designing low-fidelity wireframes to illustrate the placement of the features and their workflows.
Explain AI
QuickStart AI
Usability testing
Following the creation of our wireframes, we conducted a comprehensive evaluation using the Cognitive Walkthrough method, complemented by the System Usability Scale, with three participants. The key findings were as follows:
Difficulty in recognising available features/actions
Users felt lost and overwhelmed, struggling to identify key features. This reinforced the need for more intuitive navigation and clearer in-context guidance.
Unclear representation of AI capabilities through icon choice
The choice of an initial icon, represented by a “lightbulb”, proved inadequate in effectively communicating the automation capabilities of the AI to users. Clarity and transparency in iconography emerged as vital considerations for improved user understanding.
Final design
Next Steps
Conduct more comprehensive user testing and further refine our prototype.
Quickstart AI's potential could be expanded beyond task assistance, possibly evolving into a versatile tool that can respond to a range of user queries.
Enhance the user journey map to account for various potential user pain points, which will help inform the design.
Implement a feedback/rating feature for Quickstart AI, enabling users to provide input on their experience and helping us continually enhance the tool.
What I learned…
Designing for unfamiliar user groups: Competing in my first design competition pushed me beyond classroom learning and into real-world UX challenges. Unlike university projects, where I could choose problems based on personal interests, this competition required designing for a specific, unfamiliar user group with unique characteristics like technology proficiency, age, and profession. This experience reinforced the importance of user research and usability testing, especially when designing for a niche audience. I learned that recruiting relevant participants is essential for meaningful insights, even when working under tight deadlines. Balancing limited time, resources, and research depth has helped strengthen my ability to prioritise key testing and iterate efficiently.
The role of success metrics: One of my biggest takeaways was learning how to measure design impact. During the final round, the judges asked: “How do you know your design improves Jira’s usability? How do you measure success?” This question made me realise that usability metrics aren’t just optional but essential for proving a design’s effectiveness. Moving forward, I’ll incorporate clear benchmarks into my process, tracking task completion time, error rates, and user satisfaction. Defining success early on will help validate decisions and ensure my designs create measurable improvements.
Final thoughts: This project taught me how to design for diverse user needs, conduct research efficiently under constraints, and measure success through data-driven insights. I’ll carry these lessons forward to create more intuitive, impactful, and measurable UX solutions.