Create a counselor-facing tool that allows college counselors to track their clients' progress through the meditation lessons of our industry partner, ZenVR.
Cole Anderson
Suyash Junnarkar
Neharika Khandavalli
Research: User Interviews, Survey Design, Affinity Mapping, Personas
Design: Wireframing, Prototyping
Evaluation: Heuristic Evaluation, Usability Testing
Lead UX Researcher
We were asked by our industry partner, ZenVR, to create an application that allows college counselors to track their clients' progress on ZenVR's meditation lessons. Our goal was to provide high-level, meaningful and actionable insight to counselors about their clients' progress.
We created a desktop application that allows counselors to track their clients' progress on ZenVR's lessons, administer surveys and journal prompts about their clients' experience, and use system-suggested metrics to follow their clients' meditation journey.
"ZenVR wants to define the relevant characteristics of client progress on their meditation modules. Identifying these characteristics will enable counselors to track clients’ meditation journey and incorporate key insights into future counseling plans."
First, we used literature reviews to understand the context and the problem space. Then, we conducted surveys with 25 counselors to understand their attitudes and preferences about tracking their clients' meditation progress. Finally, we conducted interviews with 8 counselors to understand how involved counselors are with their clients' meditation progress and which parts of it would have clinical significance.
To better understand how counselors track their clients’ meditation, we turned to existing literature about the topic. For the literature review, we looked at research that studied the effects of employing mindfulness techniques in therapy and the benefits of tracking meditation progress to improve individual care. The exercise of reading and consolidating research allowed us to establish an empirical foundation for the project.
Through our literature review, we found that meditation has many positive physical, emotional, mental, spiritual, and interpersonal benefits. From stress alleviation to greater resilience to enriched relationships, meditation has a positive impact on the practitioner.
While many studies confirm the mental health benefits of meditation through longitudinal tracking, we initially could not find any research about how mental health counselors track their patients’ meditation progress through existing technology. Upon engaging further with the literature review process, we reviewed known strategies through which meditation progress is currently tracked. We studied the Five Facet Mindfulness Questionnaire (FFMQ), an instrument based on the five facets of mindfulness as it is conceptualized and understood today. Although widely accepted as a successful tracking instrument, the FFMQ is still limited, as many aspects of consciousness and experience are strongly individualized and cannot be measured with complete accuracy.
Studies conducted in 2017 found that users felt a reduction in perceived stress when they meditated in VR rather than in an in-person session (Anderson et al., 2017). Studies like this suggested that VR can be adopted as a useful tool for mindfulness and meditation practice.
I organized and led a literature review session for my teammates. Two of them had never done a literature review before, so I explained the process and helped them successfully complete their part of the review.
Immersing myself deeply in the existing literature improved my ability to empathize with our target users' pain points and to create technology that empowers their work. As with any literature review, ours answered many questions and raised many more. The most important insight for me was that tracking meditation progress is an underexplored and challenging area, one that could be uncovered by talking to counselors who actively use meditation in their therapy.
Through the literature review, we gained a broad understanding of the therapeutic benefits of meditation, how counselors use meditation in therapy, and how meditation progress is tracked in existing literature. After the literature review, we decided to survey counselors because (i) we wanted to validate what we had found in the literature review, and (ii) we wanted a general understanding of counselors’ attitudes towards meditation and meditation tracking.
The survey was primarily designed to fulfill the information goals stated above. We created the survey and had a counselor review the survey to ensure that we were not missing crucial information. Our survey was created using Qualtrics.
We sent emails to 300 college counselors across the United States inviting them to participate in our survey. We received 25 responses in total.
I wrote and iterated on the online survey created in Qualtrics. I paid attention to the sequence of the questions, the phrasing of the options and the overall coherence of the survey. I sought feedback on the survey design from teammates and other peers and participated actively in recruiting counselors.
Although this was not my first time creating a survey, I paid attention to the mechanics of each question, the grammar of my sentences, and the breadth of the options in a way that I hadn't before. Through the creation of the survey, I learned that a good survey is made through many iterations and a healthy obsession with the finer details. The findings from the survey gave me confidence that our research process was headed in the right direction. They also underscored, once again, the significance of building a solution that tracks something as personalized and unique as meditation.
After the survey, we decided that we needed a research method that would enable a more in-depth investigation into our problem space. Since we were interested in learning how counselors integrate meditation and mindfulness techniques into their practice and how they track client progress over time, we decided to conduct interviews.
1. What is your role at the counseling center?
2. Can you walk me through a typical session with a client?
3. How do you incorporate meditation into your counseling sessions?
4. Do you track your clients’ progress? If so, how?
5. Do you recommend any meditation or mindfulness applications to your clients?
Do they help with tracking the client’s progress?
6. What do you hope for your clients to get out of meditation?
7. What kind of information is useful to know about your client’s meditation journey?
How do you use that information to inform your therapy?
8. How do you know if meditation is positively or negatively affecting your client?
At the end of our survey, we invited counselors to participate in an hour-long interview about meditation tracking. Eight counselors responded, and we were able to interview all of them for the project.
We initially decided that we were most interested in gathering data from college counselors who specialize in mindfulness and meditation therapy. However, we quickly learned through our preliminary survey results that this persona was incredibly difficult to reach, as such counselors are limited in number. At this point, we broadened our scope to any counselors or therapists who integrate some level of meditation or mindfulness into their practice.
Upon completing our interviews, we came together to interpret the data as a team. At this stage, we organized and analyzed our findings through a series of interpretation sessions aimed at uncovering the meaning and implications of our users’ actions and language. As we went through our interview notes, we looked for repeating themes, patterns, intents, issues, and pain points under the topics of meditation and tracking client progress. By recording these emerging themes and patterns, we wanted to create a shared understanding of our data and inquire further into the meaning of our users’ words and actions.
After gathering our notes from the interpretation session and updating them to the desired format, we imported them into Miro to start our affinity modeling. The affinity diagram was a bottom-up process that enabled us to consolidate all of our interview data in a meaningful way.
All eight counselors reported that they track their patients' progress in some way or another. Two counselors mentioned that showing patients how they have progressed motivates them on their mental health journey when they feel stuck.
“Mindfulness helps students live their life according to their values and practices and I like to talk to them about how mindfulness can help them feel more meaningful about things they do.”
A total of six counselors stated the importance of gathering a mix of qualitative and quantitative meditation metrics. For instance, quantitative metrics included the number of times a client meditated and the duration of each meditation session. Qualitative metrics included mood tracking before and after meditation and written self-reports identifying the thoughts and feelings that emerged during meditation.
“I think I would be particularly interested in tracking the frequency with which they practice meditation and their mood before and after doing the exercise.”
Counselors tracked their patients’ progress formally, informally, or through a combination of both. Specifically, two counselors reported using formal methods of tracking, while the others mentioned tracking patient progress both formally and informally. Five counselors mentioned that they have had trouble tracking patient progress during the pandemic, as patients do not take assessments as seriously in a remote setting as they do when completing them during a counseling session.
“I verbally tell my clients what to do over the week, I don’t really record it anywhere.”
“How I conduct my future sessions depends on how well I can track their issues and find recurring problems in their day to day life.”
I created the interview questions and sought feedback from the team. I conducted one-hour interviews with college counselors. I also participated in the interview interpretation sessions and the ensuing affinity mapping. Finally, I compiled the insights above to push the project into the design stage.
I experienced the nuances of interviewing users first-hand as I navigated technological issues, found new ways to explain our project to users, and probed them appropriately to answer our research questions. Through the interviews, we built a rapport with the counselors that led them to lend their support throughout the project. I am exceptionally proud of my team and myself for building long-lasting relationships with our users even through a completely remote form of communication.
We used findings from literature reviews, surveys, and interviews to create our user personas and empathy maps. Once we described our users in detail, we formulated the key features of the application based on their requirements and preferences. Then, we created concept sketches and sought feedback on the sketches. We progressed to the wireframing stage and conducted two rounds of feedback. Finally, we arrived at our high-fidelity prototype.
Each persona differs primarily in their approach to utilizing meditation and mindfulness tracking. Other factors that distinguish them are their demographic information, job role, specialization, and individual frustrations. The personas and empathy maps were constructed to draw attention to the difference in tracking approaches, along with differences of opinion on utilizing meditation and mindfulness techniques. Our empathy maps were constructed to help express our users' thoughts, feelings, and needs. They were divided into the same two categories of users as the user personas.
These personas were developed and created by my talented teammates, Cole Anderson and Suyash Junnarkar.
Since this was an exploratory research project, we created our application from scratch. We started with concept sketches. The inspiration for the sketched concepts came from our exercise of creating user personas. After creating the sketches, we scheduled 45-minute feedback sessions with 5 counselors.
Check client details and lesson progress, view progress trends in the form of trackers, assign a new lesson, and view the upcoming lesson.
Research Finding: Counselors reported wanting a central location where they can quickly gauge the progress of the client and accomplish key tasks.
Set meditation-related short-term and long-term goals for your client. Add, edit, or delete goals as required.
Research Finding: Counselors mentioned that having a place to set and add goals would assist in assessing the client's progress from the baseline.
Create or add previously written open-ended or survey-style questions for your client to reflect on after a meditation lesson.
Research Finding: The ability to assign questions so that clients would reflect on their meditation practice was reported by counselors to be a reliable measure of progress.
View previously completed lessons and journals.
Research Finding: A way to check previously completed lessons and journals would allow counselors to identify the client's struggles and resolve them during therapy.
Choose performance indicators to share with your client or simply send an encouraging note.
Research Finding: Some counselors mentioned that clients often feel demotivated to continue on their meditation journey as they cannot see how far they've come. Being able to share quantifiable metrics would help motivate and engage clients.
The dashboard allows counselors to view client details, data visualizations, recently completed modules, client goals and upcoming lessons.
On the homepage, counselors can view the profiles of all their clients, access the side navigation bar, and access their own profile settings.
The assessment tracker keeps a record of all of the standardized assessments administered by the counselor for that particular client.
The share progress feature allows counselors to share the client's journey with the client as an artifact of progress and encouragement.
To collect feedback on the first iteration of our wireframes, we conducted feedback sessions with five college counselors. We showed them the wireframes and asked questions about the comprehensibility of the terms used, the usefulness of individual features and the cohesiveness of the application.
Issue 1: The customizability of the tracking tool
The tracking tool was created as a way for counselors to track certain metrics about their patients. The wireframe that was presented to counselors showed a mood tracker, but when counselors were asked which metrics they would like to track about their client, the responses differed greatly.
Design recommendation: Ask counselors about commonly tracked psychological metrics and include them in the application with the ability to create their own metric.
Issue 2: Standardized mental health assessments are not helpful in the context of a meditation tracking application.
Most counselors reported that they would not use the standardized assessment feature because it was not helpful in the context of a meditation tracking application.
Design recommendation: Remove the standardized mental health assessments. Instead, allow counselors the freedom of creating and administering their own questions dependent on the clients' needs and journey.
Issue 3: The phrase "keyword summary" is confusing.
Counselors were confused about what the phrase means and gave us feedback based on what they thought the phrase meant.
Design recommendation: Rephrase the feature in a way that leverages counselors' existing knowledge and lexicon.
This is the first page that counselors see when they log in. It contains a list of all of the counselor’s clients, with their student ID, name, and date of last visit. On this page, counselors can access functions such as sorting their list of clients and adding a new client. The top bar, with the help and profile icons, as well as the side navigation bar, is present on every page of the application.
Upon clicking a client’s name on the dashboard, the counselor is directed to that client’s dashboard, which contains all of the client’s information. This is where the counselor can view the client's demographic information, the goals set for the client, and the lessons the client has completed. The counselor can choose to assign a tracker to visualize the client's responses to the questionnaires assigned in each journal. Questions are assigned through the “assign new module” page when the counselor wants to assign a new meditation lesson to the client. "Up Next" is a static card at the bottom right of the screen that gives a brief description of the contents of the next module in ZenVR.
When counselors click on “create journal”, they will be prompted to either create a new journal or choose from saved templates and journals. Clicking on “create a new journal” will lead the counselor to the screen on the left. Here, the counselor can assign a new prompt, which is an open-ended question and/or assign a questionnaire, which is a survey-type question. In the event that the counselor adds a survey-type question, they can select a scale from the side navigation. The counselor also has the option of printing their template, previewing it before they send it, saving the template, and finally, sending it to the client.
Once the counselor has assigned a survey-type question a certain number of times, ZenTracker will identify it as a metric and ask the counselor if they would like to track it by placing it on the client's page. This would allow the counselor to monitor metrics about their client as they pertain to their meditation journey.
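To make the promotion logic concrete, here is a minimal TypeScript sketch of the idea described above. The threshold value, type names, and helper functions (METRIC_THRESHOLD, SurveyQuestion, countAssignments, maybePromoteToMetric) are hypothetical names chosen for illustration, not ZenTracker's actual implementation.

```typescript
// Sketch: promote a repeatedly assigned survey-type question into a tracked metric.
// All names below are illustrative assumptions, not ZenTracker's real API.

const METRIC_THRESHOLD = 3; // assumed number of assignments before suggesting a tracker

interface SurveyQuestion {
  id: string;
  text: string;
  scale: string;           // e.g. "1-5 Likert"
  trackedAsMetric: boolean;
}

interface Assignment {
  questionId: string;
  clientId: string;
  assignedOn: Date;
}

// Count how many times a survey-type question has been assigned to a given client.
function countAssignments(assignments: Assignment[], questionId: string, clientId: string): number {
  return assignments.filter(a => a.questionId === questionId && a.clientId === clientId).length;
}

// When a question crosses the threshold, ask the counselor whether to track it as a metric;
// if they agree, the tracker would then be placed on the client's page.
function maybePromoteToMetric(
  question: SurveyQuestion,
  assignments: Assignment[],
  clientId: string,
  askCounselor: (q: SurveyQuestion) => boolean // e.g. a confirmation dialog
): SurveyQuestion {
  const timesAssigned = countAssignments(assignments, question.id, clientId);
  if (!question.trackedAsMetric && timesAssigned >= METRIC_THRESHOLD && askCounselor(question)) {
    return { ...question, trackedAsMetric: true };
  }
  return question;
}
```

The key design choice this sketch reflects is that the system only suggests the metric; the counselor always makes the final decision about what gets tracked.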
This feature of the application allows the counselors to share their clients’ progress with them. The share progress feature can be accessed from the left navigation bar. When the counselors click on “share progress”, a pop-up allows them to add the attribute that they would like to share with their client. Once they have made their selections, they can preview the report and share it with their client.
First, we conducted three heuristic evaluations with usability experts. Then we conducted usability tests with five counselors to evaluate the usability, efficiency, and aesthetics of our application. Finally, we condensed the key issues and usability problems and shared design recommendations with our industry partner.
We took inspiration from Nielsen’s 10 usability heuristics and adapted them to fulfill our specific research goals. We formulated our heuristics based on key tasks and features within the application.
Initially, we had decided to use Nielsen's usability heuristics for the heuristic evaluation. However, we soon realized that some of the heuristics were not relevant to our research goals. Hence, we pivoted our strategy and used Nielsen's usability heuristics as a guideline for our own usability heuristics.
Research Goal 1: Discern usability problems that users may encounter while performing key tasks
Findings:
"Client Details, Journal History, and Share Progress shouldn't be part of global navigation menu if it only pertains to a single client. It should only be navigable from the client's dashboard."
"I see the share progress bar when I click on it but I do not know whose progress it is and who I'm sending it to."
"Everything else on the menu as its own page so the share progress bar opening as a pop-up is unexpected"
Research Goal 2: Ensure that the terms used in the application are self-explanatory
Findings: Some evaluators needed clarification on the difference between specific words such as prompt, journal, and questionnaire.
Research Goal 3: Ensure that the user flow is as the user would expect it to be.
Findings: Evaluators were able to move swiftly through the Dashboard, Data Visualization, Lesson History, and Share Progress features. Users needed help navigating through Client Goals and Create Journal features.
I single-handedly created the new heuristics for the application and sought feedback from my team. I also wrote the feedback session script and note-taking template to ensure that we could organize our feedback with ease. Since I had never conducted heuristic evaluations before, I was eager to do so and conducted two out of three heuristic evaluations for the project.
In the process of adapting Nielsen's 10 usability heuristics to our application and research questions, I was able to broaden my understanding of the usability requirements that an application must fulfill. Upon receiving feedback from the heuristic evaluators, I understood that providing design recommendations is about prioritizing the feedback received and transforming it into actionable items for the design team.
After getting feedback from usability experts, we iterated on our prototype and then conducted usability tests with five counselors.
Given the limited scope of available research methods during the pandemic, we opted to conduct remote usability sessions to test our prototype. We conducted hour-long usability sessions in the form of a think-aloud paired with a semi-structured interview with five counselors. We also had the counselors rate our application on the System Usability Scale.
While the main functionality and tasks within our application were completely prototyped, there were parts that were not clickable. Thus, counselors could not get a complete picture of what the fully interactive application would look like. Additionally, it was challenging to navigate the technological issues of remote usability testing, which meant we could not gather objective data such as accurate task completion times.
Users reported that being taken to a particular client's details from the global Client Details page, without first selecting a client, is confusing.
Recommendation: Remove the global navigation from the Client Details page and have it visible only when a counselor clicks on a specific client's profile.
"Clicking on client details shouldn’t have led me to this client’s page since I didn’t choose a client from the dashboard."
“Whose journal am I looking at here?”
Users commented that not knowing whose progress report they're sending or whose journal entry they are viewing is inconvenient and frustrating.
Recommendation: Specify the name of the client next to the heading of the page.
The use of different words for the same concept introduced confusion in the application.
Recommendation: Identify and fix verbiage inconsistencies. Test with users to ensure that the issue has been resolved.
“Looks like you’re using a client for the header and patient for the demographics. Is that accurate?”
Every other item on the global navigation opens up to a new page but the "share progress" feature opens as a pop-up, which is confusing to users.
Recommendation: The feature should be redesigned in the form of a page instead of a pop-up to maintain consistency in terms of interaction design.
Users were confused because the tracker had both a "back" button and an option that read "do not assign a tracker". Some users thought that the two buttons would yield different results but could not predict what those results would be.
Recommendation: Remove the "do not assign a tracker" button and the "back" button. Add a cancel button to the bottom right.
“Will the ‘do not assign box’ delete all the trackers I’ve selected so far?”
“No, it’s not super obvious that short-term goals are nested inside long-term goals. Maybe the indent should be different to make it more clear.”
It was not apparent to counselors that short-term goals were supposed to be nested within long-term goals because the indent was not large enough.
Recommendation: Increase the indent between the goals, and also allow the user to assign a short-term goal to a long-term goal.
The average SUS score across all our participants was 87.5. Research and industry standards suggest that any SUS score above 68 is considered above average, indicating that our system fares well in usability.
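For reference, a SUS score is computed per respondent from the ten 1-5 item responses and then averaged across participants. The sketch below shows the standard scoring calculation in TypeScript; the example responses are purely illustrative, not our participants' actual data.

```typescript
// Standard SUS scoring: odd-numbered (positively worded) items contribute (response - 1),
// even-numbered items contribute (5 - response); the sum is multiplied by 2.5 for a 0-100 score.
function susScore(responses: number[]): number {
  if (responses.length !== 10) throw new Error("SUS expects exactly 10 item responses (1-5).");
  return responses.reduce(
    (acc, r, i) => acc + (i % 2 === 0 ? r - 1 : 5 - r), // index 0 is item 1 (odd-numbered)
    0
  ) * 2.5;
}

// Average across participants.
function averageSus(allResponses: number[][]): number {
  const total = allResponses.reduce((acc, r) => acc + susScore(r), 0);
  return total / allResponses.length;
}

const example = [5, 1, 5, 2, 4, 1, 5, 1, 5, 2]; // one hypothetical participant
console.log(susScore(example)); // 92.5
```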
This project was unique in many ways. It was my first brush with the world of user experience research. While I had worked in research for many years, both in academic and professional settings, the user-centric approach of this project was an incredible learning experience for me. Looking back at four months of hard work, carried out under unusual conditions due to the pandemic, I can say that I am proud of our final deliverables.
Another reason this project was a unique experience for me was that I had never worked so closely with a team before. I believe that a large part of the success of this project came from the support that team members extended to one another. Over the course of the project, we learned how to leverage each other's strengths, learn technical and soft skills from one another, and account for individual differences. We also learned how to fine-tune our work philosophies to one another's, which allowed the project to be carried out with minimal disruption and conflict.
While this project has introduced me to a plethora of methods, processes, and frameworks, it has also made me take notice of how I could improve my own work practices. This project taught me the importance of deliberate and conscious decisions, from the sequence of questions in a survey to the roadmap of the entire project. Justifying to myself why I was including a question in a survey was one of the hardest, but most important, learning experiences of this project. As I get ready to undertake more UX projects in the future, I know I will carry the lessons of team communication, deliberate planning, and user obsession with me.