Generative research and UXR ops for JustFix



Overview

JustFix builds digital tools and databases for equitable housing, working in an agile environment with continuous collaboration. Its three main products were:

  • Who Owns What (WOW): building research and dashboard tool (built with JavaScript/React, Fullstory, and Python datasets in Git)
  • Letter of Complaint (LOC): web portal with login
  • Rent History Tool: chatbot and information retrieval



Main Challenge

We didn’t know who our users were, despite having hundreds of users per day. And because there had never been a UX research practice before, we had no baseline for when to send out surveys or how much to compensate participants.

Summary

  • Established a UX research practice from scratch, cutting overall time-to-insight by 40%; automated with Zapier wherever possible
  • Led survey research and multivariate analysis of approx. 1,000 users, plus follow-up user interviews, to inform the discovery phase; increased the overall survey completion rate by 500% in 6 weeks and created 5 new personas. The process was repeatable across all 3 products.



Role


Lead UX researcher.

Tools


Figma, Zapier, Front, JavaScript, Typeform, Tremendous, Dovetail

Methods


Surveys, usability testing, moderated interviews, contextual inquiry







1. Pre-project research


To avoid redundancies and narrow down research questions, I looked at my previous work:

  1. Continuous research interviews broken down into atomic nuggets
  2. Diary studies with developer-type users 


Identifying Key Needs + Methods

Since the project brief was so open-ended, I had several conversations with our internal stakeholders about their key needs, then chose research methods to match.

Leadership needs:
  • data/insights to share with investors
  • meet funding goals for the year
Solution: a quantitative survey with demographic questions.

Product team needs:
  • a way to compare users across products
  • new product improvements to make
Solution: make most of the survey questions repeatable across all three products, plus qualitative interviews with the survey’s outliers (by income and disability). Because we’re in the public sector, this allows for a more accessibility-focused viewpoint.



2. Research plan 

Problem: I had no idea how much to compensate participants or when to send out the surveys. I also did not know what survey format participants might respond best to.

Solution: 
  • 7–8 stages of survey rollout: this left room to test different UX operations schemes (compensation amounts, send times, etc.)
  • the quantitative part captured demographics, while the qualitative part allowed for more free-flowing insights



3. Stakeholder Workshop

Problem: I got a lot of conflicting feedback from the team on my plan. People kept wanting to add more survey questions.

Solution: I conducted a co-creation workshop to get their buy-in. Everyone’s perspectives were visible to one another, which made it easier to compromise.





4. Phase 1: Recruitment and surveys


The logistics of survey recruitment included:

Deciding on a survey platform:
I did a competitive analysis of several platforms, including Qualtrics, SurveyMonkey, and Typeform. Rating each platform by fit for our survey questions, user experience, ease of automation, and website integration, Typeform came out on top.

Deciding on and automating a rewards system:
I did not want the operations team to have to manually send out 800+ reward gift cards and thank-you emails. So I used an API to integrate Tremendous with Typeform and Front (our customer messaging system), so that participants would automatically receive a thank-you email with their Tremendous gift card and an option to opt in to further research.
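The glue between the three services is essentially a webhook handler: Typeform fires a webhook for each completed survey, and the handler turns it into a Tremendous reward order. A minimal sketch of the mapping step is below — the payload shapes are illustrative assumptions, not the exact Typeform or Tremendous schemas:

```javascript
// Sketch: map a Typeform-style webhook event to a Tremendous-style reward order.
// Field names are illustrative assumptions, not the real API schemas.
function buildRewardOrder(webhookEvent, rewardUsd) {
  // Pull the respondent's email out of the form answers.
  const answers = webhookEvent.form_response.answers;
  const emailAnswer = answers.find((a) => a.type === "email");
  if (!emailAnswer) {
    throw new Error("Survey response has no email answer; cannot send reward.");
  }
  return {
    recipientEmail: emailAnswer.email,
    rewardValue: { denomination: rewardUsd, currency_code: "USD" },
    deliveryMethod: "EMAIL", // the gift-card link is delivered by email
  };
}
```

Keeping this mapping as a pure function (no network calls) makes it easy to test the hand-off logic separately from the actual API requests to Tremendous and Front.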


Here is what my initial journey map looked like for a WOW survey participant, covering both user and JustFix actions. I prototyped the front-end survey integration with the website in Figma and worked with developers to implement it.




Phase 1 Results

  • Compensation system based on real data: optimized the reward amount and identified the best days and times to send surveys to different user groups
  • Increased survey completion rate by 500% by end stage
  • Surveyed 800+ users for one product + brushed up my multivariate statistics chops
  • Quantitative analysis report (bar graphs and summaries) + Presentations tailored to different stakeholders
  • Came up with 5 major user categories/personas
  • Was able to repeat the survey across 3 major products with minimal tailoring for each product

For example, 5 key user groups of the first product, Who Owns What (WOW), were fleshed out:

Tenants searching for their current building

A large majority (about 73%) had previously taken action to protect their rights as tenants, which tells us that many users have some familiarity with the housing system (maybe there’s a case for individual “power users”).

Individual housing seekers

WOW seemed to be the preliminary step in most housing seekers’ journeys, with 63% not yet having seen, applied to, or moved into the apartment they were searching for.

Government employees (non-lawyers)

A great one for fundraising: respondents rated WOW at an average of 3.8 (on a 1–5 scale) when asked how essential it was to their job performance supporting tenant outcomes.

Legal workers

75% of legal workers had used WOW to prepare for casework, with a staggering 85% of those cases being eviction cases.

Community organizers

There was about a 50/50 split between paid and unpaid organizers, though this may have been due to JustFix’s outreach model.



5. Phase 2: Qualitative Interviews

I conducted qualitative research interviews with the survey respondents we defined as “outliers” or “special cases” with respect to disability, childcare, and finances, to understand more about them. This optimized for accessibility and inclusivity.


Interview Goals

  • To understand the user’s journey and any glaring pain points that could be addressed by the product team.
  • To understand how the user’s background affected their usage of the product
  • To understand the aftermath of using the product: in the Letter of Complaint’s case, whether they experienced retaliation and whether they felt that it was worth it
Interview Questions

For the Letter of Complaint (LOC), the discussion guide covered:
  1. Consent, confirmation of reward, and summary of what I’d be asking them
  2. User's background + housing details
  3. Circumstances leading to sending LOC
  4. UX of LOC submission  
  5. Actions taken before/after
  6. Landlord's response and any potential retaliation
  7. Challenges faced + suggestions for improvement
Duration/Compensation

45 minutes
$50 gift card (automated email)
Number of Participants

5–7 per product (in the Letter of Complaint’s case, we ended up with 9 total thanks to some serendipitous extra capacity)



6. Synthesis 

I established a tagging taxonomy to highlight quotes from the interviews and crosslink them to the participant database, then formed insights from these tags. I initially used the 7 parts of the discussion guide listed above to group and create my tags. As I went along, I refined the tags further and eliminated redundant ones. I also grouped the insights into 9 categories, which was useful in structuring the research report.
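Mechanically, the rollup from tagged quotes to insight groups is a group-by over the tag taxonomy. A minimal sketch, assuming a hypothetical quote shape (this is not Dovetail’s actual data model):

```javascript
// Sketch: roll tagged interview quotes up into groups, one per tag.
// The quote shape ({ text, tags }) is a hypothetical stand-in for the
// real tagging tool's data model.
function groupQuotesByTag(quotes) {
  const groups = new Map();
  for (const quote of quotes) {
    for (const tag of quote.tags) {
      if (!groups.has(tag)) groups.set(tag, []);
      groups.get(tag).push(quote.text);
    }
  }
  return groups; // Map of tag -> array of quote texts
}
```

Scanning these groups is also how redundant tags surface: two tags that always collect the same quotes are candidates for merging.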


Shown above: some of the tagging groups, quotes from the interviews, and the 9 insight groups.


Phase 2 Results 

  • Documented a repeatable discussion guide and qualitative interview process using Dovetail and Figma, and developed a user research roadmap for the next quarter.
  • Around 5 glaring issues per product were identified during the follow-up user interviews and fixed within one to two weeks.
  • Ran a decision-matrix workshop with developers and the product manager to identify ideas for new research, which we filed in an agile tracker.
  • Insights and quotes were included in the presentation to funders. The funding goal was met for the year!