Organization: Student Outlets
Product: Koupon.ai website
Methods: Survey, Observation, Affinity Mapping
Tools: Google Analytics, Hotjar, Excel, Miro
The Context...
Koupon.ai is a website that uncovers deals and coupons for Amazon buyers. Our product team updated the website design and product features for the Halloween Champion campaign. However, the Halloween Champion did not deliver significantly higher engagement or customer conversion; in fact, the conversion rate dipped slightly.
We wanted to know why, and how to improve,
so that Koupon.ai could maximize commerce profit for the Christmas season.
The Approach...
Because specific data points are not instrumented with every design update, and given the frequency of these updates, I opted for a bottom-up approach. This decision was driven by the recognition of a general problem, with no specific user flow taking precedence at the moment.
Step 1: General On-Site Survey
I reviewed the website's general on-site survey, which the research team had implemented several months earlier to track users' overall satisfaction and experience over time. The goal was to quickly surface any issues that might have eluded us, catching the most prominent factors directly from users' own words.
On the quantitative side, here is what we got:
The finding...
Here we observed a decline across all three measurements starting in mid-October. However, tracing back the events of that week proved challenging due to a lack of updated documentation. To address this, I proposed the first actionable item – a shared documentation system for product feature updates. This system aims to be easily understandable and transparent for the entire team, fostering collaboration from research to engineering.
While our quantitative data covers broad traffic trends, delving into more qualitative insights requires understanding the context. To bridge this gap, I organized qualitative data gathered from the general on-site survey. I grouped users' suggestions and comments based on different themes, utilizing affinity mapping on Miro.
This process enabled us to pinpoint aspects that users appreciate and areas where we can make improvements.
More finding...
From the affinity mapping, the two themes that stood out most among the complaints were
Variety of the Deal
Ease of Navigation
Users showed a strong appetite for expanded product variety: 6 out of 16 constructive comments mentioned wanting to see more deals. The second most common request was an improved navigation experience, with more sort options, related products, and more personalized recommendations.
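The tallying step behind these counts is easy to mechanize. A minimal sketch, assuming each constructive comment was tagged with one theme during affinity mapping (the comments and theme labels below are illustrative, not the actual survey data):

```python
from collections import Counter

# Illustrative (comment, theme) pairs produced by affinity mapping;
# these are made-up examples, not the real survey responses.
tagged = [
    ("Show more deals overall", "deal variety"),
    ("More electronics deals please", "deal variety"),
    ("Deals feel repetitive", "deal variety"),
    ("Add a sort-by-discount option", "navigation"),
    ("Show related products", "navigation"),
    ("Pages load slowly", "performance"),
]

def theme_counts(tagged_comments):
    """Count how many comments fall under each affinity theme."""
    return Counter(theme for _, theme in tagged_comments)

counts = theme_counts(tagged)
top_two = [theme for theme, _ in counts.most_common(2)]
```

In practice the theme labels emerge from the Miro board rather than being predefined, so this kind of count is the final step, not the starting point.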
What we found in the general survey gave us a starting point. However, we still lacked the context to understand users' complaints and tie them to the general decline in user conversion. The two findings were big, general, and shallow; they didn't offer a concrete, clear direction for improvement. To address this, we grappled with how to enhance the user experience effectively and ensure better user conversion in future iterations.
So... we wanted more context on how our users use our product.
Step 2: Observation via Hotjar Recordings
The two researchers, Lexi and I, debated whether to collect data via contextual inquiry – in other words, interviewing participants while observing them. Eventually, we decided to move forward with naturalistic observation, the opposite of intentional observation, in which participants would be fully aware of our presence.
This was because the problem we were targeting was the lower conversion that occurred in October, the past month. If we invited participants to demonstrate how they browse and use our website,
they would not be the same people, nor in the same usage scenario.
So, we decided to move forward with watching user session recordings.
We gauged the success of a user session based on their progression through the buy-in funnel, a metric established by our product success measure team. Our assumption was that users who derived more value, meaning they engaged more with the website and progressed further in the funnel, were indicative of successful user conversion.
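Under that framing, a session's success can be scored by the deepest funnel step it reaches. A minimal sketch, assuming three steps, P1 (browse/search), P2 (product detail view), and P3 (click-through to Amazon); the step names follow this write-up, but the session-event representation is an assumption for illustration:

```python
# Hypothetical buy-in funnel steps; the event-set format is invented.
FUNNEL = ["P1", "P2", "P3"]

def furthest_step(events):
    """Return the deepest funnel step a session reached, or None."""
    reached = [step for step in FUNNEL if step in events]
    return reached[-1] if reached else None

def step_conversion(sessions):
    """Fraction of sessions that reached each funnel step."""
    total = len(sessions)
    return {step: sum(step in ev for ev in sessions) / total for step in FUNNEL}

sessions = [
    {"P1"},               # browsed only
    {"P1", "P2"},         # viewed a product detail page
    {"P1", "P2", "P3"},   # clicked through to Amazon
    {"P1", "P2", "P3"},
]
rates = step_conversion(sessions)
```

Scoring sessions this way makes "more value derived" comparable across recordings, which is what lets the 84 coded sessions be aggregated later.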
Moving a step up in the bottom-up process, we had better-defined research questions to explore this time:
In total, our research team of two watched 84 session recordings in a week. We documented each use case and coded it according to pre-set measurements.
Coding book:
Did they start the session by searching for a product or by browsing the category/home page?
Was their main activity searching or browsing?
Did they use the filter/sort options when searching for a product?
Did they click a product card to view the details? (Step P2)
Which product card did they click? Was it ranked high or low on the list?
Did they click through to Amazon? (Step P3)
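To keep the coding consistent between two researchers, each session can be captured as one structured record matching the coding book. A minimal sketch (the field names, values, and tallies are my own shorthand for the questions above, not the actual coding sheet):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class SessionCode:
    started_with: str         # "search" or "browse"
    main_activity: str        # "search" or "browse"
    used_filters: bool        # filter/sort options used?
    clicked_card: bool        # viewed product details (Step P2)
    card_rank: Optional[int]  # rank of the clicked card, None if no click
    went_to_amazon: bool      # clicked through to Amazon (Step P3)

def tally(sessions):
    """Aggregate coded sessions into simple counts."""
    return {
        "started_by_search": sum(s.started_with == "search" for s in sessions),
        "reached_p2": sum(s.clicked_card for s in sessions),
        "reached_p3": sum(s.went_to_amazon for s in sessions),
    }

# Three illustrative coded sessions (not real recordings).
coded = [
    SessionCode("search", "search", True, True, 2, True),
    SessionCode("browse", "browse", False, True, 14, False),
    SessionCode("browse", "browse", False, False, None, False),
]
summary = tally(coded)
```

In our case the coding lived in a spreadsheet rather than code, but the fixed schema served the same purpose: two coders producing comparable rows.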
The Actionable Item...
The two big insights from observing the session recordings overlapped with, and extended, the survey findings of more product variety and better navigation.
1. We should work like a salesperson – “You don’t like these? I got others for you!”
When users complained about our website lacking sufficient deals, they were essentially expressing frustration with the challenge of finding the deals they were interested in, given the current user experience. Through observation, we noted that 21 users exclusively browsed the 'Recommends' section without clicking on anything. Among those who explored further, over half checked a product that ranked very low on either the recommended list or the search result list. This implies a risk of losing these users if they run out of patience with the current browsing experience.
Specifically, we want users to stay longer so they are exposed to more deals, and to show them precisely the deals they are most likely to be interested in.
2. We should urge our users toward Amazon once they find deals they are interested in
Surprisingly, I found that few participants spent time reading product details on the product detail page (Step P2). Instead, they stopped and read the product card. Nearly half (n=9) of users spent less than 3 seconds on the product detail page before clicking "copy code to Amazon." Based on observation, only 2 users actually read the product details, spending 16 and 23 seconds on the page. 89% of users (n=17) spent less than 10 seconds on the detail page, mostly scanning the pictures.
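The dwell-time breakdown here is simple threshold arithmetic over per-session timings. A minimal sketch (the timing values are illustrative, not the recorded data):

```python
def dwell_share(times, max_seconds):
    """Fraction of sessions whose detail-page dwell time is under max_seconds."""
    return sum(t < max_seconds for t in times) / len(times)

# Illustrative detail-page dwell times in seconds (not the real recordings).
times = [1, 2, 2, 3, 5, 8, 16, 23]
under_3s = dwell_share(times, 3)    # quick code-copiers
under_10s = dwell_share(times, 10)  # screeners who skim the pictures
```

A histogram of these times makes the bimodal pattern (fast copiers vs. the rare readers) obvious at a glance.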
Hence, our solution is to streamline the decision-making process and offer swift access to Amazon. We propose adding a 'Go to Amazon' button directly on the product card. This enhancement eliminates the need for users to navigate into the product detail page, providing a quicker path to the purchasing platform.
The Impact...
While the design and engineering teams are actively implementing the recommended product feature updates, we have already seen user conversion improve by 24.2% as of December 2023, compared with the previous month.
The Next Step...
As a follow-up to our current findings from the survey and observations, our research team is embarking on an in-depth exploration of user psychology and behaviors on Koupon.ai. This long-term research initiative aims to establish a comprehensive understanding of users' jobs-to-be-done, contributing to the development of foundational user profiles.