OpenIDEO Usability Assessment | Research

Assignment for "Needs and Usability Assessment," Spring 2014.

Teammates: Ivan Buchanan, Pierce Gordon, Ronnie Jakobsen



OpenIDEO purports to have 60,000+ accounts from over 170 countries [1], but a November 2013 study found only 3,959 users who had actually contributed. Many members participate in only a single challenge, with a small core of members providing input across a collection of challenges [2]. How can we make OpenIDEO friendly to new users and encourage ongoing participation?


  • Developing focus group guide and protocol
  • Moderating usability tests
  • Heuristic evaluation of the OpenIDEO website
  • Recruiting
  • Report: web version of written report, compiling and editing the video

Summary of Findings

We used three research methods for this assessment: heuristic evaluation, usability tests with novice users, and interviews with expert users. Through these methods, we found three main points of tension:

Issue 1: The main function of the website, the contribution process, was overly complex.

To remedy this, we suggest:

  • Reduce the number of clicks from first page to the contribution form.

  • From the “Challenges” page, clicking on a challenge should lead to the current stage of the challenge.

  • Offer suggestions about the recommended length of text for ALL boxes in the contribution form.

Issue 2: Language inconsistency on the OpenIDEO website confused users.

To remedy this, we suggest:

  • Make language around challenge involvement consistent.

  • Rename the “Forum” tab to “Help”.

Issue 3: Users have difficulty understanding important facets of the website, such as the roles of sponsors, staff, and other designers, and the purpose of the website altogether.

To remedy this, we suggest:

  • Explicitly state the different actors’ involvement at each stage. Who picks the “winners”? What’s the sponsor’s role?

  • Explain the site concept with concrete examples using the challenge timeline as the main thread.

Regardless of the usability issues, our participants did like the website. In the spirit of the site’s tenet that it is ‘always in beta,’ we offer these suggestions to improve the website and to keep users involved in the process of designing better, together.

Research Questions and Goals

Our main research goal was to understand what contributes to and detracts from designer engagement with the site. We wanted to tackle this from two angles. The first was, “What might be preventing people from contributing to the site?” The second was, “What keeps the power users engaged?” Within these two larger questions, we had subsets of questions that determined our methods and focus.

What prevents people from contributing to the site?

  • How well does a first-time user understand what the site is about?

  • How well can a novice user navigate the site?

  • How easy is it for a novice user to add her first contribution?

  • How does the first-time user feel about the site?

What keeps power users engaged?

  • What do people love about the site?

  • What motivates power users to keep coming back, and is the Design Quotient part of it?

  • How often do power users contribute?

  • Do power users feel a sense of community?

Methods, Sampling and Recruiting

Heuristic Evaluation

In reviewing the site, we had a sense that overall navigation might be one of the key issues preventing more people from contributing to the site after they sign up. For this reason, we decided to perform a heuristic evaluation. We also believed that the heuristic evaluation would guide our understanding of what issues we should look for in the usability tests with novice users.

Recruiting in this case was simple: each member of the team performed a heuristic evaluation (link), evaluating the site against Jakob Nielsen’s ten usability heuristics. This included assigning a score for the severity of each issue and suggesting possible fixes. More details about how we analyzed the results are in the “Analyzing the data” section of the report.
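The consolidation of individual severity ratings into one score per issue can be sketched as simple arithmetic. The issue names, ratings, and use of a median below are all hypothetical illustrations (our actual process was a group discussion); the ratings assume the common 0–4 severity scale, where 4 is a usability catastrophe.

```python
# Consolidate per-evaluator severity ratings for shared issues into one score.
# Issue names and ratings are hypothetical, on a 0-4 severity scale (4 = worst).
# A median is a reasonable automated stand-in for a group discussion that
# settles on a single agreed value.
from statistics import median

ratings = {
    "Hidden call-to-action buttons": [3, 4, 3],
    "Inconsistent follow/join wording": [2, 3],
}

consolidated = {issue: median(scores) for issue, scores in ratings.items()}
print(consolidated)
```

With these example ratings, disagreement between two evaluators yields a half-point score, which the group would then round or debate.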

Usability Tests with First Time Users

As mentioned in the previous section, one of our main research questions was what prevents more people from contributing to the site. To best tackle this, we decided to perform usability tests with first time users. This would also tell us more about the general navigability of the site, and what types of issues might be preventing users from signing up.

We aimed to recruit 3–5 first-time OpenIDEO users, all of whom work in design or technology and are computer literate. We also wanted an even gender balance and an age range spanning at least 20 years. OpenIDEO is ostensibly built to be usable by anyone who’s interested, so we wanted to capture potential age and gender differences.

To recruit the users, we used the “friends-of-friends” approach. Meredith and Ronnie emailed friends with a description of the project, a timeline in which the tests would happen, and a request for recipients to forward the message to “designers and design-minded” friends. We received six responses and selected five of them to maximize the age range. We ended up with two men and three women between the ages of 22 and 33. Their professions included software engineer, interaction designer, and app developer relations manager.

We scheduled hour-long usability tests at the five users’ houses. We felt that this location would be the most comfortable for them, as well as one of the likely places in which they would log onto the site. We recorded the sessions using TechSmith’s Camtasia, which captured audio, screen activity, and the tester’s facial expressions via the laptop webcam. The usability tests had five main tasks:

  1. Get familiar with the website and explain what you think it does.

  2. “Follow” a challenge.

  3. Browse through the challenges section and access a pre-defined challenge.

  4. Find other users.

  5. Navigate to and within the site’s forum.

At the end of the session, we asked users about their general impressions of the site. How well did they feel they understood it? Did they like it? (See the full usability test guide in the “Protocols and Supplementary Information” section.)

Interviews with Power Users

Given that our second major research question was what keeps power users engaged, we decided the most effective way to understand this would be to lead a focus group with about five experienced users. They could give us an idea of how they felt about the site and what their experience was, as well as generate ideas to improve the site experience. Since power users are located all over the world, we initially planned to conduct a videoconference focus group via Google Hangouts. Meena had suggested that we leave the chat option on during the Hangout, which would allow quieter users to add ideas while others were speaking.

Pierce left comments on the profiles of 20 of OpenIDEO’s most active users. The message included an overview of the project and a request to participate in a virtual focus group. Three users volunteered and sent us their availability. Unfortunately, we were unable to find a time that worked for all of them.

Instead, we scheduled three videoconference interviews with power users; one participant had technical issues, so her interview was conducted over the phone. Of the interviewees, one was a professor, one was a graduate student, and one was a visual designer.

Analyzing the data

For the heuristic evaluation, we did individual evaluations, then met as a group to consolidate our observations into a single evaluation. If multiple people noticed the same issue but disagreed about the severity rating, we discussed it and agreed on one score. In the end, we identified 21 total issues.

We then used the KJ technique described by Jared Spool to prioritize our findings from the usability tests and interviews [3]. Our focus question was, “What prevents people from contributing to the site?” We all met in the co-lab of the I School with sticky notes; referencing our notes from the interviews and usability tests, we individually wrote down our observations and opinions. One member of the group then presented his notes, explaining them out loud as he put the stickies on the board; if anyone had the same note, we added it nearby. We repeated this for each member of the group until all of the sticky notes were up. We then discussed the themes we each noticed and grouped the sticky notes accordingly. Next, we named each group, consolidating some groups in the process.

In the end, we had identified nine distinct groups. We then voted on severity and “ease of fix” scores for each issue. Finally, we assigned priorities to the issues based on the combined scores. Our results are below in Table A.
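The prioritization arithmetic can be sketched as follows. The issue names and scores below are hypothetical, and we assume for illustration that severity and ease-of-fix votes were simply summed, with higher totals ranked first (so urgent issues that are also cheap to fix come out on top):

```python
# Rank usability issues by a combined severity + ease-of-fix score.
# Issue names and scores are hypothetical, on assumed 1-5 scales where a
# higher severity is worse and a higher ease-of-fix score means a cheaper fix.
issues = {
    "Complex contribution form": {"severity": 5, "ease_of_fix": 4},
    "Follow/join wording": {"severity": 3, "ease_of_fix": 5},
    "Slow challenge pages": {"severity": 4, "ease_of_fix": 1},
}

def priority(scores):
    # Combined score: severe issues that are also easy to fix rank first.
    return scores["severity"] + scores["ease_of_fix"]

ranked = sorted(issues, key=lambda name: priority(issues[name]), reverse=True)
for rank, name in enumerate(ranked, start=1):
    print(rank, name, priority(issues[name]))
```

Under this scheme a severe but expensive issue (like site performance) can rank below a moderate issue with a trivial fix, which matches how we deprioritized findings whose solutions were impractical.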

Table A: Findings in prioritized order

Key Research Findings

Although OpenIDEO has already been upgraded from its initial launch version to simplify its usability and layout, our heuristic evaluations, usability tests, and one-on-one interviews identified a series of key findings. We have chosen to focus on the three most urgent: the contribution process, semantic consistency, and general understanding of the website.

Contribution Process

Within the Contribution Process category, we identified two main opportunity areas: (1) the complexity behind the process of adding a contribution, and (2) a lack of guidance on what is expected from the user when he or she tries to contribute an idea or solution.

We found that the complexity of adding a contribution partly stems from the fact that the general layout of the website is overwhelmingly full of content, and lacks directions and explanations for new users. For example, upon accessing OpenIDEO’s home page, visitors are presented with only a short, one-sentence explanation of what OpenIDEO is, and are not offered any other explanation of how the site actually works. Once they reach the “Challenges” page, visitors are immediately presented with a long challenge description set in a very small font (see figure C). Rather than creating excitement and positive expectations about the platform, the block of text deters visitors from reading further and engaging with the site.

Figure C: Example of a challenge description.

Through the usability tests that we performed, we learned:

  1. The amount of text throughout the site overwhelms the users. It gives them the impression that in order to catch up to the current status of the project and make valuable contributions, they will be forced to read and process pages and pages of content.

  2. Users had trouble figuring out how to contribute. It is hard to engage in projects because calls to action are effectively hidden, and require many clicks to find (see figure D).

  3. The contribution forms are complex and lack guidance. Even when users reached the desired contribution page, they found the forms complex, lengthy, and lacking guidance. Four users complained or sighed after they got about five boxes in. One user also mentioned that based on the contribution snippets shown on the individual challenge page, he had thought he could make a “quick contribution.” All of this may discourage visitors from contributing their ideas, thinking that much more is expected of them when contributing to a project.

Figure D: The steps towards making a contribution.


We think that improving the user experience around the user flow to make contributions and collaborate within the platform should be something relatively easy for OpenIDEO to implement, and could have a very positive impact in the short run. In order to achieve this, we recommend:

  • Make sure that each challenge redirects to its current phase. In the current version of the platform, clicking on a given challenge within the “Challenges” section takes the user to the project page, but does not redirect the user to the current phase.

  • Include “call to action” buttons on the “Challenges” page. For example, if a project is in the “Ideas” phase, add the “+ Add Idea” button within that project’s section in the “Challenges” list so that users who are interested in immediately contributing can do so. Power users could also benefit from having easier access to the sections they normally use.

  • Guide new users through the required steps and clicks to browse the main parts of the site and make their first contribution. To do this, OpenIDEO could use tutorials, hints and tooltips throughout the site. We also suggest moving the navigation not related to contributing to challenges, such as “Blog,” “Forums,” and “Impact” to the right side of the navigation bar. This would clarify what the most useful tabs are to learn about the site.

  • Set expectations for each text box on the contribution page. Specifically, we recommend: describing the recommended length for each text box (e.g., “approximately 400 words” or “max 1000 words”), adding examples or tips above each field to inspire or clarify, and revising the forms to clarify which fields are required.

Semantic Consistency

Our second top finding was the inconsistent use of language and terms around the site. Through our usability tests and heuristic evaluations, we found two main problem areas: language inconsistencies throughout the site, and the lack of clear differentiation between sections of the website. The main issues in this area were the following:

  • “Follow” and “join” are used interchangeably. One of the main findings was that the phrases “Follow this challenge” and “Join this challenge” were presented to the user interchangeably within the same process flow, creating confusion and doubt about the site’s intention. Multiple users reacted strongly to this inconsistency: when they read the word “follow,” they immediately related it to its social-media sense, where “follow” means you will be regularly updated on that subject. However, OpenIDEO lacked both confirmation that the user was now following a challenge and any explanation of what would happen if you followed that project.

  • Users didn’t understand the difference between “Community” and “Forums.” Multiple users said that they had ignored the “Forums” section entirely before being asked about it, assuming that much of its discussion functionality was under “Community.” One asked, “Isn’t a forum a community and a community a forum? Then why do two sections exist? When should I visit each one?”

We strongly recommend:

  • Use a standard set of words or phrases throughout the website.

  • Rename the platform’s main sections so that they really describe their content. For example, “Forums” is a help-oriented section, so rename it “Help.” While these issues are not as critical as the ones detailed previously concerning the contribution process, they do present inconsistencies that confused users. A confused visitor is likely to give up and spend their precious time elsewhere.

General understanding of website

Our third key finding was that people generally find it difficult to determine exactly what the site is about. Although the front page clearly guides the user to learn “how it works” (through the top navigation item and the video), some of our participants were still confused after spending 30–40 minutes on the site. We attribute this problem to a couple of different usability issues.

Winning / impact

  • Participants who visited the site for the first time had a hard time figuring out what it means to win a challenge and what impact a winning idea would have. They found it hard to distinguish between “winning,” “realization” and “impact,” which are all words used to describe the final phase of a challenge.

  • Users couldn’t figure out if there were any prizes for the winning ideas. As far as they could tell, the incentives for participating are not explicitly described anywhere on the site, which made it hard for the participants to fully grasp the concept. One of our expert user interviewees later told us she had “won” a trip to Brussels, where she got to pitch her idea at the European Commission. This had been a great experience for her, and she especially enjoyed meeting other OpenIDEO contributors in person. Although this was a great prize which empowered her to get her idea realized, she was not informed about it before her idea was chosen as one of the top ten ideas in the challenge.

We suggest that OpenIDEO advertise their prizes both to show potential contributors “what’s in it for them” and to make the idea of the open innovation challenge more tangible as a concept. We also suggest updating the documentation in “How It Works” to offer more explicit language about what “winning” means, and what the different types of contributions are.


User roles

Another source of confusion for our participants was the unclear roles of the different users on the site.

  • People listed in the “Community” section had roles users didn’t understand and didn’t know how to learn more about. A description of the role types (such as ‘contributors’, ‘OpenIDEO team members’, ‘challenge managers’, ‘community champions’ and ‘sponsors’) and their respective responsibilities and permissions is nowhere to be found. One participant assumed that people could only make one type of contribution.

  • The “we” in the challenges’ stage descriptions is ambiguous, and changes. One of our participants found it particularly frustrating not knowing who “we” refers to in the challenge descriptions. This could be either of the above-mentioned role types and he was unable to figure out who had the final vote in the elimination process. Another instance where this confusion became apparent was when we asked the participants to verbalize their expectations of what would happen if they clicked each of the top navigation items. When asking about the “Blog” section, we heard many different expectations about who would actually be writing the blog posts.

OpenIDEO has made an attempt to fix this problem by adding a label to all OpenIDEO team members’ profile pictures (see figure E). This handles the problem on a micro level, so a user can identify the role of another user writing a specific contribution or comment, but it does not address the macro-level understanding problem we identified. We suggest that OpenIDEO explicitly state the different actors’ involvement at each phase; to avoid confusing first-time users, it should always be clear who does what.

Figure E: Profile picture for a member of the OpenIDEO team.


Content abstraction

When tasked with learning more about how the site works, our participants were drawn either to the “How OpenIDEO Works” video on the front page or to the “How It Works” top navigation item, which took them to a long infographic explanation. The explanatory documentation made users feel that OpenIDEO has an impact, but they didn’t understand how the site works. Each user spent a lot of time trying to understand the concept, but gave up and moved on before finishing, as both the video and the infographic are quite lengthy. Only one user made it to the end of the video, and only one felt she understood how she could participate based on the “How It Works” page. Most of them got a much clearer idea of the OpenIDEO concept when exploring the individual challenge pages. We mainly attribute this problem to the level of abstraction and designer jargon used in the “How It Works” elements. An illustrative example can be found in the introductory paragraph of the infographic:

"OpenIDEO is an open innovation platform for social good. We’re a global community that draws upon the optimism, inspiration, ideas and opinions of everyone to solve problems together."

We would categorize this quote as more of a vision statement than an explanation of what OpenIDEO actually does and how it works. One of our participants said he experienced the video more as a sales pitch than as an introduction for new users.

To ensure that new visitors and potential contributors do not drop out of the site due to confusion about how it works and what they are supposed to do, we suggest that OpenIDEO explain the concept with concrete examples, possibly using the challenge timeline (see figure F below) as the main thread. The challenge timeline seemed to trigger “aha moments,” bringing some of our participants a big step closer to “getting it.”

Figure F: The challenge timeline.

Additional findings

As stated earlier, we chose to focus on the three most urgent findings. However, when analyzing our data using the KJ method, we arrived at nine findings in total. Below are brief descriptions of the less urgent findings, which can be revisited once the suggestions for the top three findings have been considered and, hopefully, implemented.

Content presentation

We found that the content of the front page was of little use to both new and experienced users. It has the format of news but does not seem particularly relevant. In addition, a lot of content on the site can be characterized as a “wall of text,” which is time-consuming to get through and may scare off new users who just want to get started as quickly as possible.

We recommend personalizing the main page for users who are logged in, showing them recent activity and offering them links to the open challenges.


Navigation

We found that the main navigation bar is split into two bars, with no clear visual hierarchy guiding the user to the most important items. In the “Challenges” section, it was difficult for participants to get an overview of which challenges were active, and when they opened or closed, due to the lack of contextual time information.

We recommend adding a timeline to each challenge in the “Challenges” list. We also recommend flipping the top two navigation bars so that the “OpenIDEO” home button is on the top. Several users mis-clicked on “Challenges” while attempting to return to the home page.


Community

The OpenIDEO community allegedly comprises tens of thousands of people from around the world. However, this vast number of people collaborating for the greater good is hardly visible on the site. The dedicated community page consists of four top-8 lists with many repeat entries. Furthermore, the forum tab takes users to a separate Zendesk domain.

We recommend including information about more than the top 8 contributors for the different contribution types, such as recent activity or users who are gaining in their Design Quotient.

Collaboration style

From the interviews with the expert users, we found that collaboration mainly takes place in the comment section of each contribution. According to one interviewee, this format works well among the engaged designers, but for a new visitor the collaboration is hidden several clicks deep in the information architecture. Thus the collaboration on the site becomes opaque and intangible to potential new contributors.

Design quotient

All of our participants seemed to have an opinion about the design quotient – essentially a pie chart displaying a user’s activity on the site as an attempt to gamify engagement on the platform. Some of them criticized it for measuring quantity instead of quality while others thought of it as a valid sign of design experience which could be used on their resume. In general the design quotient was well understood and accepted as a part of the experience.

Site performance

The site has considerable waiting times when opening a challenge page or sorting through contributions. Although this aspect of a site’s usability is highly prioritized by usability expert Jakob Nielsen, we chose not to pursue it as a key finding due to the technical complexity of its solution and the vagueness of our resulting suggestion. Simply suggesting “make your website faster” is neither inspiring nor actionable. We also recognize that the site has a lot of content to handle, and suspect that a fix would be both expensive and radical.


Conclusion

OpenIDEO is a platform whose purpose and objective were very well received by our testers. Even after struggling through tasks, users were still excited about the site when we interviewed them at the end. The website already generates a lot of positive energy, and we believe OpenIDEO can capitalize on this by reducing the friction around (1) contributing and (2) understanding how the challenge process works.

Our evaluations, tests, and interviews allowed us not only to gain better insight from power users, but also to gather important information on the impressions new users form upon visiting the site for the first time.

For new users, usability issues create high barriers to entry that end up discouraging participation and engagement. First, while users grasp the general idea of the site, the overwhelming amount of text and the small fonts make it difficult to fully commit to it. Second, the complexity of accessing important parts of the site, such as the contribution page, compromises the initial user experience. Third, once a user reaches the actual contribution form, the process remains intimidating because of a lack of guidance and seemingly lengthy requirements for adding a contribution. We believe that by implementing the top three suggestions detailed in this report, OpenIDEO will lower these barriers without drastically changing the website’s design and layout.

For power users, the site provides a great platform to exchange ideas and engage in projects. In general, constant use of the site has made it deeply familiar to them. However, through our interviews we were also able to identify the main issues and nuances power users experience on the site. By streamlining the flows that power users go through and making the contribution process more easily accessible, we think engagement from power users could improve as well.

Finally, the overall project and experience allowed us to go more deeply into the techniques and processes involved in a complete usability assessment. We also came to understand the difficulties that activities such as focus group recruiting and scheduling can present, and that adaptability in those situations is key to any usability assessment project.


References

[1] Lakhani, Karim R., Anne-Laure Fayard, Natalia Levina, and Stephanie Healy Pokrywa. "OpenIDEO." Harvard Business School Case 612-066, February 2012. (Revised October 2013.)

[2] "OpenIDEO - Home." OpenIDEO. N.p., n.d. Web. 08 May 2014.

[3] “The KJ-Technique: A Group Process for Establishing Priorities.” User Interface Engineering, 11 May 2004. Web. 7 May 2014.

Protocols and Supplementary Information

Heuristic Evaluation

The full heuristic evaluation spreadsheet is available here.

Usability Test Guide

The protocol and script for the usability tests is available here.

Interview Guide

The interview guide is located here. The interview notes are located here.


Ranking Spreadsheet

The KJ ranking spreadsheet is available here.