Westwood Plateau Golf and Country Club
This project evaluates the Westwood Plateau Golf and Country Club's application, which is underutilized compared to the club's website. The goal is to identify and address the challenges within the app that hinder user interaction and to create a more consistent, user-friendly experience across the app and website. By analyzing user data and the app's interface, the project offers recommendations to enhance usability and improve the overall experience for both guests and staff, with a particular focus on reducing delays caused by excessive videos.
Results
A controlled study compared the usability of the club's website and app across three tasks: ordering food from the Fairways Grill and Patio, booking a tee time on the golf course, and booking a golf simulator.
Key Findings:
Quantitative: Only the golf simulator booking task showed a statistically significant difference in completion times (p = 0.04), with the task proving more difficult on the app (a sketch of this kind of comparison follows the key findings).
Qualitative: Users found the website more intuitive overall, but cluttered. The app had a confusing, unlabeled "plus" icon for booking.
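For context, a completion-time comparison like the one above is typically run as a two-sample test. The snippet below is a minimal, hypothetical Python sketch using Welch's t-test from SciPy; the timing values and sample size are invented for illustration and are not the study's actual measurements.

from scipy import stats

# Hypothetical completion times (seconds) for the golf simulator booking task.
# These values are illustrative only, not the measurements collected in the study.
website_times = [48, 52, 61, 45, 57, 50, 63, 49]
app_times = [72, 95, 88, 70, 102, 91, 84, 77]

# Welch's t-test: compares the two group means without assuming equal variances.
t_stat, p_value = stats.ttest_ind(app_times, website_times, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

# A p-value below 0.05 would indicate a statistically significant difference,
# mirroring the p = 0.04 result reported for the simulator booking task.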
Reflection
Solution Mockups:
Issues and Solutions:
App:
Issue: Poor visibility of golf simulator booking; unclear "plus" icon.
Solution: Improve simulator booking visibility (place it with other golf bookings); label the "plus" icon.
Website:
Issue: Cluttered interface, redundant "tee-time" button.
Solution: Streamline the interface; remove or improve the "tee-time" button (fewer clicks to booking).
Both:
Issue: Lack of consistent features across platforms.
Solution: Add features like user profiles to both.
The proposed design changes aim to improve navigation, address user confusion, and create a more consistent and user-friendly experience across both platforms.
Process
The assignment evaluated the Westwood Plateau Golf and Country Club's application and website. The evaluation process included the following steps:
First, the team established a collaborative relationship with the company, discussing its concerns and what it wanted analyzed: the underutilization of the application compared to the website.
The next step was to identify the company's pain points; the evaluation aimed to understand the challenges within the application that hinder user interaction.
After identifying the company's pain points, the team conducted heuristic evaluations, rating each interface against recognized UX/UI principles (a sketch of how such ratings can be aggregated appears after the process steps).
The team then discussed the implications of the data and identified which issues were most prevalent in the scoring.
Using this information, the team formulated a hypothesis. The hypothesis combined the insights from the heuristic evaluation and a focus question provided by the client.
The team then conducted a controlled study. This study aimed to compare the two interfaces and provide deeper insight and quantitative data.
After the controlled study, the team held qualitative interviews with participants. The goal of the interviews was to gather qualitative data describing the participants' overall impressions of the interfaces.
The team then analyzed the data in three parts: the heuristic evaluation, the controlled experiment, and the qualitative interviews.
Finally, the team proposed recommendations on how to enhance usability and craft a more cohesive and enjoyable interface for both guests and staff.
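As referenced in the heuristic evaluation step above, ratings from multiple evaluators are usually tallied before the team discusses which issues matter most. The sketch below is a hypothetical Python example of that aggregation step; the heuristic names, 0-4 severity scale, and scores are placeholders, not the team's actual ratings.

# Hypothetical severity ratings (0 = no issue, 4 = usability catastrophe)
# from four evaluators. Heuristic names and scores are placeholders only.
ratings = {
    "Visibility of system status": {"app": [3, 2, 3, 3], "website": [1, 2, 1, 1]},
    "Consistency and standards": {"app": [2, 3, 2, 2], "website": [2, 2, 3, 2]},
    "Recognition rather than recall": {"app": [4, 3, 3, 4], "website": [2, 1, 2, 2]},
}

# Average each heuristic per platform so the most severe issues stand out.
for heuristic, by_platform in ratings.items():
    for platform, scores in by_platform.items():
        mean_severity = sum(scores) / len(scores)
        print(f"{heuristic:32s} {platform:8s} mean severity = {mean_severity:.2f}")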
This project with Westwood Plateau Golf and Country Club was a defining moment in my journey as a UX/UI designer. It pushed me to bridge the gap between theoretical principles and real-world impact, all while navigating the complexities of client collaboration. The challenge—revitalizing an underused app—taught me how to balance business objectives with user needs, a skill I now consider foundational to effective design.
Working directly with the client was both exhilarating and humbling. Early on, I realized how crucial it is to translate their vision into actionable design goals while advocating for user-centricity. There were moments of tension—like when stakeholder priorities initially overshadowed user pain points—but these became opportunities to refine my communication skills. By framing design decisions through the lens of data, I learned to build trust and align the team around shared objectives.
Our mixed-methods approach (heuristic evaluations, controlled studies, and interviews) was a masterclass in the power of triangulation. The heuristic evaluations grounded us in best practices, but it was the controlled study that surprised me most. Seeing quantitative data validate our hypotheses—like how confusing navigation led to app abandonment—made the abstract tangible. However, the qualitative interviews were where the real magic happened. Hearing guests describe their frustrations in their own words, like struggling to book tee times or feeling overwhelmed by cluttered interfaces, transformed raw data into empathy. It reminded me that behind every statistic is a human experience waiting to be understood.
Synthesizing these insights was like solving a puzzle. One memorable breakthrough came when we mapped interview quotes to usability metrics, revealing a pattern: users valued simplicity but craved personalized features. This duality forced me to think creatively—for example, designing a minimalist interface with smart, context-aware recommendations. Collaborating with my team to turn these insights into actionable recommendations taught me the art of compromise and the importance of grounding every choice in evidence.
This project also highlighted the importance of storytelling in UX. Presenting our findings to the client wasn’t just about sharing data—it was about crafting a narrative that connected their business goals to user pain points. When we proposed simplifying the booking flow, we didn’t just cite drop-off rates; we shared a guest’s story about missing a tee time due to a confusing interface. This human-centric approach resonated deeply and ultimately drove stakeholder buy-in.
Looking back, I’m proud of the solutions we delivered, but I also recognize areas where I could have pushed further. For instance, I wish we’d advocated for more iterative testing with staff members earlier in the process to uncover workflow inefficiencies sooner. This experience solidified my belief that UX research is never “done”—it’s a cycle of learning, adapting, and refining.
Above all, this project taught me that great design isn’t just about solving problems—it’s about fostering connections. Seeing the client’s excitement as we unveiled the redesigned app, and later hearing feedback about improved engagement, was incredibly rewarding. It reinforced my passion for creating experiences that serve both users and businesses harmoniously. Moving forward, I’ll carry this lesson with me: that empathy, collaboration, and resilience are the cornerstones of meaningful design.
User Testing and Research
Controlled study, user research, testing and solutions
My role:
Design, development, user research, project management, usability tests
Team: Russell Yuen, Ethan Tang, Delai Gao, and Mark Duinkerke
Timeline:
May 2024