Enhancing The Information Lab’s Showcase Page
Project Overview
About The Client
The Information Lab is one of Europe’s leading data analytics consultancies, specializing in Tableau and Alteryx. Their Showcase page highlights real client case studies, designed to demonstrate their dashboard solutions across various industries.
The Problem
Despite its visual appeal, the Showcase page wasn’t serving its core purpose: helping prospective clients quickly find relevant case studies and understand the value of the dashboards. Users struggled with vague industry categorization, hard-to-scan layouts, and a lack of summaries or context for the dashboards.
Research Goals
Our team was tasked with evaluating the usability of the Showcase page and providing actionable design recommendations to improve content discoverability, enhance comprehension of dashboard value and support confident decision-making.
My Role
As the Strategy Lead and UX Researcher,
I designed the testing protocol and user tasks, led user testing and synthesis, and co-developed design recommendations with the design and project leads.
Team
Myra Chen (Strategy Lead), Rosaline Lee (Design Lead), Rutuja Nagulpelli (Project Manager)
Timeline
7 Weeks
Methodology
Moderated Remote Usability Testing
To uncover usability issues and inform design improvements for The Information Lab’s Showcase page, we conducted moderated remote user testing with participants in data-related roles.
We chose this method because it allows users to interact with the product in a natural, distraction-free environment, leading to more authentic behavior (Schade, 2013). Remote testing also offered logistical advantages like greater scheduling flexibility and reduced overhead, while maintaining high insight quality (Whitenton, 2019).
Testing Structure
Each session lasted approximately 45 minutes and was conducted on desktop devices. We focused specifically on the Showcase section of the website. Participants were guided by a moderator and accompanied by a silent observer who took detailed notes and managed screen recordings.
We followed a structured four-step process to ensure consistency and clarity throughout the testing sessions:
Pre-test Questionnaire
Collected participant demographics, professional experience, and familiarity with dashboards and data tools.
Task Execution
Participants completed two core tasks based on realistic scenarios while using the think-aloud protocol, allowing us to capture their immediate thoughts and behaviors.
Post-task Reflection
We asked follow-up questions to uncover users’ reasoning, clarify their decisions, and identify usability pain points.
Post-test Questionnaire
Included the System Usability Scale (SUS) to quantitatively assess users' perceptions of the Showcase page’s usability.
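As background on how SUS scores are computed (the standard formulation, not a description of our exact analysis): the questionnaire consists of ten alternating positively and negatively worded statements rated 1–5, which are typically combined into a 0–100 composite. The results later in this report are presented on the raw 1–5 scale.

```latex
% Standard SUS composite (Brooke, 1996), where r_i is the 1-5 response to item i.
% Odd items are positively worded, even items negatively worded.
\mathrm{SUS} = 2.5\left[\sum_{i \in \{1,3,5,7,9\}} (r_i - 1) + \sum_{i \in \{2,4,6,8,10\}} (5 - r_i)\right]
```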
Additionally, we conducted a pilot session prior to formal testing to fine-tune the task structure, session timing, and moderator flow.
Participants
We recruited a total of 7 participants via Panelfox and LinkedIn, with a mix of internal and external users:
5 potential clients from various industries and geographies
2 internal sales team members from The Information Lab, included to provide an insider perspective
All participants worked in data-oriented roles and had varying levels of exposure to dashboards, ranging from rare to daily use.
The participant pool closely mirrored the client’s target audience, ensuring findings were relevant and actionable.
Scenario & Tasks
To simulate realistic user flows, we designed a scenario based on how a prospective client might explore the Showcase page:
📘 Scenario:
“You’re evaluating data consultancy firms to support your company’s internal reporting needs. You land on The Information Lab’s Showcase page and want to see if their work aligns with your industry and goals.”
🧭 Task 1: Find a Case Relevant to Your Industry
Participants were asked to browse the Showcase page and select a case that felt most applicable to their work. We observed how they navigated the site and what filters, labels, or page elements guided their decision.
📢 Task 2: Imagine a Team Pitch
Participants chose another case they might present to internal stakeholders as justification for partnering with The Information Lab. This task assessed clarity of content, value communication, and readiness for decision-making.
This mixed-method approach, combining structured tasks, observational data, and SUS scoring, helped us uncover both surface-level and deeper usability issues and paved the way for focused, evidence-backed design recommendations.
Findings
Evaluation Results
To assess usability, we used a 1–5 Likert scale, with 1 indicating low usability and 5 indicating high usability. The results revealed a gap between visual appeal and informational clarity:
Overall Usability - 3.45 / 5
Task-Specific Usability - 2.93 / 5
While users generally found the interface visually appealing, performance on specific tasks revealed deeper usability concerns.
Key Findings from Testing
Real-time feedback and post-test responses revealed consistent patterns:
65% of participants found the tasks easy to complete
58% found the site easy to navigate, attributing this to its clean visual design
85% experienced confusion due to missing context or vague content
These results point to a disconnect between visual design and informational clarity: the site looks good, but it lacks the information users need to complete key tasks with confidence.
Takeaway from Testing
The Showcase page’s visual design supports basic navigation, but users struggled to understand the content and value of the dashboards. This highlights the need for clearer context, better guidance, and more intuitive categorization to bridge the usability gap.
Recommendation 1: Introduce search filters to enhance the “Find your industry” functionality on the Showcase page.
Finding #1: Participants struggled to find relevant content in their field due to vague categorization on the Showcase page.
Participants had difficulty finding relevant case studies due to vague, overly broad categories. The “Find your industry” button caused confusion, and the generic labels led to speculative browsing rather than clear, purposeful navigation. Many users were unsure whether the content would align with their specific industry or department needs, which disrupted their workflow and reduced confidence in the site’s organization. This lack of precise categorization made it harder for users to evaluate how The Information Lab’s solutions could apply to their business context.
Figure 1: Participants expressed confusion about the “Find your industry” button, and were unsure of its purpose.
Figure 2: Participants frequently searched for specific industries but found that the available categories did not align with their departmental or business needs.
Solution:
To improve content discoverability and reduce user frustration, we recommended implementing a robust search filter system. Filters based on industry, department, use case, and data tools would allow users to refine results more precisely than the current broad categories. This enhancement would streamline navigation, lower cognitive load, and make the Showcase page more effective as a sales and educational tool.
Figure 3: Introducing a search bar and an “Add filter” control.
Figure 4: An example of how the “Add filter” panel could look.
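To make the filter recommendation concrete, below is a minimal sketch of the faceted filtering logic in TypeScript. The CaseStudy shape, facet names, and sample data are illustrative assumptions rather than The Information Lab’s actual data model.

```typescript
// Minimal sketch of faceted filtering for Showcase case studies.
// Data model and facet names are illustrative assumptions.

interface CaseStudy {
  title: string;
  industry: string;    // e.g. "Retail", "Healthcare"
  department: string;  // e.g. "Sales", "Finance"
  tools: string[];     // e.g. ["Tableau", "Alteryx"]
}

interface Filters {
  industry?: string;
  department?: string;
  tool?: string;
  query?: string;      // free-text search over titles
}

function filterCases(cases: CaseStudy[], f: Filters): CaseStudy[] {
  return cases.filter((c) =>
    (!f.industry || c.industry === f.industry) &&
    (!f.department || c.department === f.department) &&
    (!f.tool || c.tools.includes(f.tool)) &&
    (!f.query || c.title.toLowerCase().includes(f.query.toLowerCase()))
  );
}

// Example: narrowing the showcase to Finance cases built with Tableau.
const showcase: CaseStudy[] = [
  { title: "ARR Overview", industry: "Software", department: "Finance", tools: ["Tableau"] },
  { title: "Global Pipeline", industry: "Software", department: "Sales", tools: ["Tableau", "Alteryx"] },
];
console.log(filterCases(showcase, { department: "Finance", tool: "Tableau" }));
```

The same predicate could power both the filter controls and the free-text search box shown in Figures 3 and 4.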
Recommendation 2: Introduce a new layout to display the cases on the topic page.
Finding #2: The topic page makes it difficult for participants to scan content quickly and view multiple cases at once.
Participants struggled to scan and compare content on the topic page due to a layout that required excessive scrolling and clicking. The lack of a high-level overview made it difficult to quickly identify relevant or standout case studies, ultimately hindering engagement and efficient decision-making.
Figure 5: Participants wanted to see the content on a single page, as the current layout increases scrolling.
Solution:
To enhance usability and visual appeal, the topic page should adopt a grid-based layout for displaying cases such as “ARR Overview,” “Global Pipeline,” and “Account Management Performance.” Each case can be presented in a uniform card format featuring a consistent image thumbnail, a short title, and a brief description. Clicking a card could open a modal with more details, including embedded dashboards, KPIs, or links to the full report.
Figure 6: Rearranging all the case studies onto one page reduces scrolling and helps participants view all the cases at once.
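As a rough illustration of this layout (the markup, class names, and dialog-based modal are assumptions, not a specification of the final design), the cards could be rendered into a CSS grid and expanded in place on click:

```typescript
// Sketch of the proposed card grid with a click-to-expand detail view.
// Markup, class names, and the <dialog> modal are illustrative choices.

interface CaseCard {
  title: string;        // e.g. "ARR Overview"
  thumbnailUrl: string;
  summary: string;      // short description shown on the card
}

function renderCaseGrid(container: HTMLElement, cards: CaseCard[]): void {
  container.classList.add("case-grid"); // CSS: display: grid; grid-template-columns: repeat(3, 1fr); gap: 1rem;
  for (const card of cards) {
    const el = document.createElement("article");
    el.className = "case-card";
    el.innerHTML = `
      <img src="${card.thumbnailUrl}" alt="${card.title}">
      <h3>${card.title}</h3>
      <p>${card.summary}</p>`;
    // Opening details in a modal keeps the full overview in view,
    // instead of navigating users away to a separate page.
    el.addEventListener("click", () => openCaseModal(card));
    container.appendChild(el);
  }
}

function openCaseModal(card: CaseCard): void {
  const modal = document.createElement("dialog");
  modal.innerHTML = `<h2>${card.title}</h2><p>${card.summary}</p>`; // embedded dashboard, KPIs, or report link would go here
  document.body.appendChild(modal);
  modal.showModal();
}
```

Uniform card sizes and an in-place detail view preserve the at-a-glance comparison that participants asked for.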
Recommendation 3: Add a short summary of the data displayed so that users don’t skip important information.
Finding #3: Participants had a hard time understanding the value of each case and how to interact with the dashboards due to a lack of information and unclear instructions.
Participants had difficulty understanding the purpose and value of the case studies due to a lack of context and clear instructions. Many were unsure which metrics mattered or how to interpret the dashboards in real-world terms. This lack of guidance reduced engagement and weakened the overall usability of the Showcase page.
Figure 7: It was hard for some participants to understand the data displayed on the dashboard.
Solution:
To ensure users don’t overlook key insights, we recommended adding a concise summary at the top of the page that highlights the most important takeaways from the data visualizations. This overview should state what the charts collectively reveal, such as overall performance trends, critical metrics like ARR value and overdue contracts, and notable patterns or outliers. Presenting this context up front helps users quickly grasp the significance of the dashboard content, engage more meaningfully with the visualized data, and understand how the metrics relate to overall business performance.
Figure 8: The Key Results adds narrative context to the data, helping highlight trends and key contributors that might be overlooked in raw visualizations.
Conclusion
Client Reaction
The Information Lab team was highly receptive to our findings. They valued the clarity of insights and the actionable nature of our recommendations. Specific ideas—such as a grid-based layout and the addition of contextual summaries—were identified as priorities for future implementation.
Next Steps
If we were to continue the project, our next phase would focus on implementation and further validation:
Prototype and test the revised Showcase page layout
Conduct A/B testing to evaluate the effectiveness of search filters
Explore mobile responsiveness, which was out of scope in the initial study
Key Takeaways
Contextual storytelling is crucial for communicating value in dashboard UX.
A visually polished interface can still fall short if the information architecture doesn’t align with user needs.
Remote moderated testing combined with think-aloud protocols is an effective, low-cost method for uncovering deep usability insights.