
Aggie Review - Designing a UC Davis Student-Validated Alternative to Rate My Professor to Improve Course Transparency

CLIENT

CodeLab

ROLE

UX/UI Designer

TIMELINE

5 months

COLLABORATORS

3 designers, 5 developers, 2 PMs

TOOLS

Figma, FigJam

Why was it done?

As a UX/UI Designer on the Rate My Professor (RMP) redesign at CodeLab, our team was initially tasked with creating a UC Davis course review site for professors' use. While RMP is widely used, it often creates unproductive experiences for instructors. Our research revealed that the platform would be more effective if the primary users were students, shifting the focus from individual professors to the overall course experience.

Our solution was to refocus attention on course structure by designing a platform that encourages students to leave constructive, unbiased reviews, while accommodating diverse learning and viewing preferences.

Impact

Highlights from our research and high-fidelity prototype:

  • 6/6 usability test participants praised the platform’s “user-friendly and digestible format.”

  • Delivered a fully annotated prototype to developers, enabling a successful launch.

  • Developed a metrics and roadmap plan to guide future designers at CodeLab for Aggie Review’s next stages.


Overview

Short & Snappy: What’s This All About?

Context

Assumptions We Heard from CodeLab

CodeLab tasked our team of 3 designers with creating a UC Davis-exclusive course review experience, prioritizing professors as the primary users and students as secondary. From our client, we heard:

Initial Thinking

Research-Led Pivot from Professors to Students

Our design team began by interviewing our primary users: professors. To better understand their needs, I took the lead on conducting and taking notes during five UC Davis professor interviews to:

Research

5 Interviews & 1 Survey to Understand Student Needs

I adopted a mixed-methods approach to gain a deeper understanding of the student experience. This included designing and distributing a survey that received 55 responses, and conducting one-on-one interviews with 5 students while taking detailed observational notes. Our research goal was to:

By triangulating these methods, I found:

Understand how professors perceive and manage their professional credibility and reputation on platforms like RateMyProfessors.

Understand how students engage with RMP, what information they value most in reviews, and how they assess review credibility.

Explore the thought process professors experience when reading, or being reviewed on RMP.

Early research and interviews revealed that professors had little motivation to engage with or use such a platform. This insight led us to pivot our focus toward students — recognizing that by improving the experience for students, we could indirectly create a more constructive and meaningful feedback environment for professors.

Designed to make student feedback clearer, more credible, and easier to act on.

“Students lack awareness on what makes a good classroom in college and education.”

UC Davis Professor

“I rely on the University’s end-of-course student evaluation to assess and refine my course structure, as this feedback is often more insightful and actionable than other forms of input like RMP.”

UC Davis Professor

“Inaccurate. The flaw of the system is it favors one w/ a disagreement and grudge.”

UC Davis Professor

“I use RMP but I keep in mind it’s not objective. I look at various reviews to compare accuracy. It’s very open and students or non-students can abuse.”

UC Davis Student

“RMP doesn’t ask specific questions, you click a number and maybe write a review. I don’t just want to see ‘3/5’ or ‘this is a good class.’ I want context that helps me understand the course.”

UC Davis Student

“When browsing reviews, I want to skim ratings to get a sense of the class, then dive deeper if I need more context like the workload, teaching style, and difficulty.”

UC Davis Student

🧑🏻‍🏫 

Make Peer Reviews Actionable for Professors

UC Davis professors want to enhance the value of reviews to accurately reflect the course and use them as a tool for evaluation.

📝 

Rethink the Current Reviewing Forum

Rate My Professor helps students choose instructors through peer reviews, but professors criticize its lack of moderation, citing biased & misleading feedback.

Where do you find information about a professor's course? (55 responses)

  • Rate My Professor: 65%

  • No research, Ask peers, and Other: the remaining 35% (9–14% each)

Have you written a review on RMP? (55 responses)

  • No: 78%

  • Yes: 22%

How accurate do you believe Rate My Professor reviews are? (55 responses)

  • Inaccurate: 6%

  • Somewhat Inaccurate: 15%

  • Neither Accurate nor Inaccurate: 63%

  • Accurate: 10%

  • Very Accurate: 6%

Validating RMP as the Primary Tool

Of the 55 responses, 36 students reported that their primary tool is Rate My Professor.

Students Question Review Accuracy

35 of the 55 respondents rated RMP neither accurate nor inaccurate.

Students were Unmotivated

Students were unmotivated to write reviews. Reframing the process to be more rewarding and supportive could encourage more balanced participation.

Reviews don’t Reflect Diverse Experiences

Current review formats didn’t capture the variety of student experiences and needs. We identified a need for more flexible ways to express feedback—through quick ratings, prompts, or tags.

Existing Review Structures Reinforce Bias

Existing review structures reinforced personal bias. We saw an opportunity to shift the focus toward course experience to reduce bias and improve the overall quality of reviews.

Passive Use Dominates RMP

RMP drives course research, but student engagement is low—over three-quarters of students read reviews without writing any themselves.

Shifts from professor to course-centered search with professor filters

Encourages richer discussion by breaking down course aspects with a Likert-style rating system

Breaks down overall scores and keyword highlights from top-selected tags

Promotes balanced reflection by sharing what worked well and what could be improved

Builds student background to contextualize scores and experience

Identifies biased language in reviews and encourages students to revise feedback

Learnings

Growing Through Collaboration and Constraints

Collaboration and Constraint-Driven Problem Solving

Working closely with six developers taught me how to collaborate across skill levels and adapt designs to real implementation constraints. Time and technical limitations pushed our design team to be resourceful—using constraints as opportunities for creative problem-solving.


Design Validation through Measurable Impact

I learned the importance of defining measurable next steps to validate design success. Our team outlined metrics to track engagement and usability after launch—testing site flow clarity, measuring behavior changes, and exploring gamification strategies to motivate students to write reviews. This process helped me think beyond handoff and consider how design translates into sustained impact.

The Goal:

Create a student-based platform where students can accurately and objectively learn about and discuss course structure.

Opportunities we identified through affinity mapping:

Defining Success Metrics for Future Design Teams

How we drove impact for CodeLab

Impact

Measuring Impact Through Testing and Delivery

I validated our impact through 5 usability sessions and a successful handoff to developers, resulting in a seamless launch. Defined success metrics now help track usability, retention, and student satisfaction—setting a foundation for sustainable growth.

User Growth & Retention

Sign Up: Percentage of students who successfully create and verify their accounts

User Retention: How many students continue using the platform over time, measured through Net Promoter Score (NPS) and account activity

User Loyalty: Percentage of users returning to leave additional reviews

Validated Through User Testing

6/6 usability test participants said they would adopt our platform for its “user-friendly and digestible format.”

Prototype Handoff & Launch

Delivered a fully annotated prototype to developers for handoff and successful launch.

Design Roadmap for Future Iterations

Created a metrics and roadmap plan for future designers at CodeLab to guide next stages of Aggie Review.

Experience & Engagement

Student Engagement: Number of sessions, clicks, and page visits per student

Reviews: Total number of reviews submitted (tracked via survey and analytics)

Task Completion: Task success rate and time-on-task during usability testing

Authenticating Student Feedback via Sign-Up Flow

Account creation requires students to use their school email and provide contextual information about themselves—all on a single page—while staying transparent about data use and avoiding overly personal details.

Designing Search to Center on Courses, Not Professors

Search bar interactions are designed to immediately allow students to search by courses, professors, or both, streamlining access to relevant information.

Streamlining Course Reviews for Quick Insights and Depth

Reviews focus on the course, mix quantitative and qualitative data, show reviewer context, include filters, and support reactions—improving clarity, consistency, and control.

Structuring Review Writing for Balanced, Reflective Feedback

Students track their progress, reflect on course strengths and weaknesses, and assign scores that form the overall review rating.

Proposed Solution

Refining the Concepts through Testing

I collaborated with our lead designer to build high-fidelity components and actively contributed to design critiques. I translated user testing feedback into design improvements, ensuring visual consistency and clear, user-focused feedback throughout the interface.

Early Ideas

Exploring Ideas to Align with the Project Vision

I designed low-fidelity solutions to create a productive forum. This involved shifting the focus from the professor to the course structure, breaking questions down to capture positives and negatives, and building context about students’ backgrounds. I then presented early design concepts to developers at CodeLab to gather feedback.

Testing

Validating Concepts with 6 Users at the Mid-Fidelity Stage

I led the writing review and search flows for the mid-fidelity prototype, while supporting the team on profile creation and course review access, allowing us to split work efficiently and accelerate the design process.

Lack of Structure Fuels Unreliable Impressions

Without prompts or standardized criteria, RMP reviews vary in depth and tone. Students struggle to find consistent, trustworthy information.

💡

Scoping early with a cross-functional team informed us that:

Implementing a bias checker was out of scope due to time constraints and limited developer expertise.

Incentivizing review writing required a clearer value exchange and a more intentional account creation flow.

What Worked

Users valued the combination of numerical and written responses, confirming that mixing quantitative and qualitative data improves clarity and trust.

Enable users to set up and manage a personal account

6/6 users felt the progress bar didn’t accurately represent remaining steps, suggesting a need for clearer visual feedback.

The multi-screen flow increased perceived effort, leading 5/6 users to ignore the written text—suggesting that pacing and content delivery needed refinement.

4/5 users were uncertain about the value of creating an account, suggesting the need to better communicate its benefits.

Search by course, professor, or a combination of both

4/6 users expressed confusion when the search bar shifted to a two-step interaction. The lack of contextual cues or prompts led to hesitation, as users were unsure what type of input was required in each step.

Access existing course reviews for informed decision-making

4/6 users felt the UI screens seemed disconnected from the rest of the flow, suggesting a need for stronger visual consistency.

4/6 users wanted the ability to filter reviews, highlighting the importance of customization and control in browsing feedback.

6/6 users appreciated the mix of numerical and written responses, confirming quantitative and qualitative data improves clarity and trust.

Submitting structured, clear, and meaningful feedback

5/6 users found one question per screen inefficient and hard to review, suggesting a need for a smoother flow.

6/6 users found the phrasing unclear, suggesting simpler wording.

6/6 users said the progress bar was misleading, suggesting clearer visual feedback was needed.

Guiding student feedback with a bias checker

The bias checker flags inappropriate language and promotes constructive feedback, but was deferred to the backlog due to time constraints.

What Did Not

Confusing navigation and pacing revealed the need for clear hierarchy and better feedback.

Unclear content and interactions highlighted the importance of simplified language and contextual guidance.

Disconnected screens and limited controls emphasized the need for visual consistency and user control.

Synthesizing the insights from concept testing, I discovered:


Last Updated: Nov 2025
