JTBD can be used in many UX research methods, from surveys to usability benchmarking. Once you validate your JTBD through interviews and create your JTBD Canvas, you can use pieces of the canvas to inform numerous GitLab processes. For example, you can use the needs and circumstances of your jobs to help you increase the realism and readability of your solution validation tests. Likewise, the main jobs can help you create recruitment screener options to find the right participants for your study.
An effective recruitment screener collects information about the respondent without revealing too much about the study they may participate in. Typically, a screener would have a question asking for the participant's job title and perhaps a question about common tasks they perform. JTBD can help by using the job performer and their main job to create valid screening options. When picturing an ideal candidate, try to identify the matching job performer. By using that performer's main job instead of a job title, you can ask fewer screening questions while revealing the same information as before.
For example, say we have a study that needs security professionals who use the vulnerability report page.
Using GitLab's list of user personas, we could compile more than 10 job titles used by security personas, such as Security Analyst, Security Operations Engineer, Security Consultant, Application Security Engineer, etc. This list would be cumbersome for a respondent to work through and may not reflect the actual day-to-day role of your ideal candidate.
Using the JTBD for the Secure and Govern stages, we could instead identify two or three ideal jobs/job performers for our study. Consider job statements such as: "I identify risk in my org's assets" and "I address detected business-critical vulnerabilities." Additionally, we can include jobs from other workflows in the Secure section as invalid options to weed out unfit candidates. The resulting screener question contains only a handful of options for participants to select from, and the participants it surfaces will accurately reflect your ideal candidate.
To make this easier, you can save all of the main jobs for your stage as a template and enable only selected ones for each study you run, like in this example. If participants do not match their main job, that can be a signal to revisit how you discovered your JTBD and verify its accuracy.
If you conduct solution validation to test a design, you will likely write a script containing tasks for your users to attempt and provide feedback on. A significant factor in the quality of that feedback is how realistic the tasks feel to users, without biasing them toward the obvious solution. One way to achieve this is by drawing on the circumstances, job process, and needs of the job you are building for.
You can follow these steps to incorporate JTBD within your script:
Heuristic evaluations can help team members understand the user's current experience with the product. These evaluations uncover insights that may require direct changes to improve usability, or further research to deepen organizational understanding of a problem. Heuristic evaluations usually take one to two meetings between a Product or UX stakeholder and a UX Researcher.
You can follow these general steps to integrate the JTBD Framework into your heuristic evaluation: