MCIT Website Usability Testing


The Canadian Tourism Commission's Meetings, Conventions and Incentive Travel (MCIT) team provides comprehensive services for its clients, helping them explore, discover, plan and book events and itineraries in Canada. The site provides a complete listing of hotels and convention facilities, along with suggested itineraries, local area information and more.

The majority of their clients are in the US, but they also serve the UK, Europe and beyond.

The CTC MCIT team came to FCV to usability test their redesigned website after the launch of several new features.


At FCV, we believe that delivering a great customer experience makes business sense. By testing with real people, early and iteratively, you focus on what matters to your customers, and ensure they come back for more.

From a list of 100 of CTC MCIT's real-life contacts, ranging from previous customers to entirely new prospects, FCV screened, recruited and scheduled tests with qualified users. We established the times, technology and incentives for the tests, and handled follow-up interviews, data analysis and report writing. Using a methodical approach, we applied a range of techniques to uncover usability issues with the website. The components included:

  1. Expert review

    An FCV User Experience (UX) Analyst applied industry standard guidelines to look for usability issues and measure their severity and frequency. This has the added benefit of highlighting areas to explore in detail with real users.

  2. Competitor review

    We compared the CTC MCIT website with five industry competitors to establish benchmarks for navigation, language, feature set, content focus and depth.

  3. Remote unmoderated test

    Using a set of screenshots from the new site under construction and a heat-mapping tool, we asked test users to respond to prompts that tested the effectiveness of the proposed high-level navigation, page layouts and consistency. Users took this test in their own time, and the results were amalgamated into website heat maps. Analysis of this activity uncovered some assumptions about the site and identified areas for enhanced usability.

  4. Remote moderated test

    Based on the findings of the previous three components and discussion with the project team, FCV developed a set of test scenarios designed to optimize on-site conversions. The scenarios mimicked the tasks and functions of a typical user so we could identify whether users encountered any obstacles in fulfilling on-site goals. For example: Was the site's content findable? Could the test meeting planners find facilities matching their specific requirements? Did the site have enough useful information for users who had not been to Canada?
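The heat-map step in the unmoderated test comes down to binning many users' click coordinates into a grid and counting hits per cell. The sketch below is a minimal illustration of that aggregation, not the actual tool FCV used; the coordinates and screenshot dimensions are hypothetical:

```python
from collections import Counter

def aggregate_clicks(clicks, width, height, cell=50):
    """Bin raw (x, y) click coordinates into a grid of `cell`-pixel
    squares and count clicks per square - the basis of a heat map."""
    grid = Counter()
    for x, y in clicks:
        if 0 <= x < width and 0 <= y < height:  # ignore out-of-bounds clicks
            grid[(x // cell, y // cell)] += 1
    return grid

def hottest(grid, n=3):
    """Return the n most-clicked cells, e.g. to flag where users
    expected a navigation element to be."""
    return grid.most_common(n)

# Example: clicks from several users on one 800x600 screenshot.
clicks = [(120, 40), (130, 45), (125, 50), (610, 300), (615, 310)]
grid = aggregate_clicks(clicks, width=800, height=600)
print(hottest(grid, 2))
```

Clusters of clicks on elements that are not actually links, or sparse clicks on the intended call to action, are exactly the kind of assumption-breaking findings this activity surfaces.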

It would have been especially difficult to perform a lab test in this case, because the highly specialized target audience is spread across different time zones and geographic locations. Coordinating these busy people to be tested in a room with an accredited UX moderator would have significantly delayed the project.

FCV is a fervent advocate of remote, moderated usability testing for several reasons:

  • For our clients, the overall costs are significantly lower than a formal on-site lab test.
  • Users are in a familiar environment – at work or at home – where they are relaxed and comfortable, using their favourite browser and keyboard. This makes the test results more authentic.
  • Testing is conducted with fewer resources than in a lab. Installing and uninstalling screen-recording tools takes just five minutes before and after each session, and the recordings are kept for posterity – not just notes on real-time observations. The process also has the added benefit of signalling to a list of potential customers that the client cares about optimizing their online experience and saving them frustration in completing a task. It's a good brand builder.
  • Tests can be reviewed later and the sample size widened to continue optimizing for on-site conversions.


The review and testing methodology allowed us to uncover a wide range of usability issues. These issues were prioritized for the CTC MCIT team based on their severity and frequency. We delivered:

  • A detailed 68-page report of all findings from the tests, including:

    a) 26 key usability issues, rated by severity and frequency levels.

    b) Click heatmaps, and full analysis.

    c) User comments.

  • A one-page summary of optimization recommendations for usability improvement.
  • The video recordings of the moderated tests and a summary of highlights.
  • A PowerPoint presentation including video highlights reel to communicate the major findings to key CTC MCIT stakeholders.

The remote testing worked flawlessly. It was simple for users to participate in, and for FCV's UX team to record, annotate and share results. We even provided a toll-free number so that US participants would not incur costs when dialling in.

We look forward to seeing our recommended site modifications improve the experience of CTC's MCIT website customers, so they can discover, search for and book facilities and itineraries in Canada.
