Ivan Buchanan, Pierce Gordon, Meredith Hitchcock, Ronnie Jakobsen

Individual Contributions

Ivan Buchanan: elaborated usability test guide and protocol, performed heuristic evaluation of the OpenIDEO website, observer and note-taker in a usability test, compiled and edited video examples for the project presentation.

Pierce Gordon: communicated with OpenIDEO, performed in-depth research upon OpenIDEO’s past iterations and future directions, identified and contacted potential one-on-one interviewees, performed one-on-one interviews.

Meredith Hitchcock: project management, developed focus group guide and protocol, wrote consent forms, moderated some usability tests, performed heuristic evaluation of the OpenIDEO website, identified and contacted potential usability testers, prepared the web version of the report, compiled and edited the video for the final written report.

Ronnie Jakobsen: moderated some usability tests, note-taker for some usability tests, performed heuristic evaluation of the OpenIDEO website, note-taker for one-on-one interviews, prepared software setup for usability tests.

Group Activities

  • Analyzing the data, notes and observations on heuristic evaluation, usability tests and one-on-one interviews

  • Highlighting and commenting each team member’s experience and insights

  • Identifying and voting on the main usability pain points

  • Writing the final report

  • Elaborating conclusions and recommendations

Executive Summary

OpenIDEO is an open collaboration platform focused on social good. Using the framework of Human-Centered Design and harnessing the capabilities of its 60,000+ members (at the time of this writing), it aims to address problems in many disparate fields by collecting rich research, brainstorming ideas, and inviting community feedback. However, the community that actually contributes to the challenges is drastically smaller: approximately 4,000 members. Moreover, research shows that a large percentage of those 4,000 participate in only one challenge, and barely contribute after that challenge is complete. Therefore, for this usability study, we decided to focus on understanding user retention and user collaboration issues in OpenIDEO.

We used three research methods: heuristic evaluation, usability tests with novice users, and interviews with expert users. Through these methods, we found three main points of tension:

Issue 1: The main function of the website, the contribution process, was overly complex.

To remedy this, we suggest:

  • Reduce the number of clicks from first page to the contribution form.

  • From the “Challenges” page, clicking on a challenge should lead to the current stage of the challenge.

  • Offer suggestions about the recommended length of text for ALL boxes in the contribution form.

Issue 2: Language inconsistency on the OpenIDEO website confused users.

To remedy this, we suggest:

  • Make language around challenge involvement consistent.

  • Rename the “Forum” tab to “Help”.

Issue 3: Users have difficulty understanding important aspects of the site, such as the roles of sponsors, staff, and other designers, and the purpose of the site altogether.

To remedy this, we suggest:

  • Explicitly state the different actors’ involvement at each stage. Who picks the “winners”? What’s the sponsor’s role?

  • Explain the site concept with concrete examples using the challenge timeline as the main thread.

Regardless of its usability issues, our participants did like the website. In the spirit of the site’s tenet that it is ‘always in beta,’ we offer these recommendations to improve the website and to keep users involved in the process of designing better, together.

Background About the Project

OpenIDEO, a subsidiary program of the design firm IDEO, is an online collaborative design community that aims to address interdisciplinary challenges in varied fields. IDEO is well known as an innovative consulting company that uses user-centered design to address issues in multinational enterprise, international development, community engagement, and technological need through product-, service-, and experience-based interventions; OpenIDEO aims to use the same framework with many more designers able to contribute. Organizations such as Amnesty International, Steelcase Inc., and Case Western Reserve University’s Power Center for Sustainable Value have all sponsored challenges with varied and ingenious solutions. As of this writing, the program has 22 completed challenges and two challenges in progress, in fields ranging from affordable learning tools for the developing world to increasing bone marrow donors to celebrating companies who innovate for world benefit.

OpenIDEO differs from a standard “Question and Answer” type forum in several key ways: the process of developing solutions is split into several key phases, and the process of submitting a solution encourages iteration and building off of prior submissions.

Each challenge goes through five main phases on the website: “Research,” “Ideas,” “Evaluation,” “Applause,” and announcing the winners (see figure A). During the “Research” phase, designers are encouraged to “share existing stories, tools, case studies, and examples.” During the “Ideas” phase, OpenIDEO asks designers questions such as “How would you solve this problem?” and suggests they consider “…areas of opportunity where new ideas might flourish – to spark our creative efforts.” After the new ideas are developed, designers have a chance to evaluate them on a collection of qualitative metrics, including their level of innovation, sustainability, and capacity to address the problem at hand. After this, IDEO, the OpenIDEO staff, and the challenge sponsors come together and decide which of the designs are strong enough to back; usually nine to ten are chosen [1].

Figure A: Phases of OpenIDEO challenges

The entire experience gives designers the opportunity to interact with each other in interesting ways. One example is the Design Quotient, shown in Figure B, which appears on designers’ profiles and gives a snapshot of their activity on the website. The site breaks this activity into four categories: research, ideas, evaluation, and collaboration; the more you interact, the higher your Design Quotient score. Another example is the “Applause” function, where designers can give a community idea their recognition in a quantified manner, akin to a “like” on Facebook or a “favorite” on Twitter. Comments are used to suggest changes, applaud ideas, and pose questions; they are the main way designers interact around a given contribution. The website also sends periodic emails when comments are left on your profile or a particular design, giving designers awareness of activity on OpenIDEO and another way of deciding whether to engage.

Figure B: The OpenIDEO design quotient (DQ)

The OpenIDEO platform has five main design principles that it urges its designers to abide by, as expressed in the “About Us” section of the website: being inclusive (recognizing and enabling all levels of participation from different disciplines), staying community-centered (remembering the core strengths of the community and playing to them), being collaborative (promoting teamwork among individuals and teams by recognizing the many roles that are crucial to each step of the design process), staying optimistic (you never know when a wild idea might enable others to get closer to a viable solution), and remaining always in beta (designing for continuous improvement and iteration, and scaling deliberately).

We notice that the program purports to have 60,000+ accounts from over 170 countries [2], but far fewer users actively participate. A recent study, based on a November 2013 sweep, lists only 3,959 users who have actually contributed in any way to the OpenIDEO community. Many members participate in only a single challenge, with a small core of members providing input across a collection of challenges [3]. One telling proxy is the relatively low number of contributions for completed challenges: the site lists 8,173 inspirations and 4,219 concepts in total, drastically fewer than the full roster of OpenIDEO designers. This is where our study begins!

Research Questions and Goals

Our main research goal was to understand what contributes to and detracts from designer engagement with the site. We wanted to tackle this from two angles. The first was, “What might be preventing people from contributing to the site?” The second was, “What keeps the power users engaged?” Within these two larger questions, we had subsets of questions that determined our methods and focus.

What prevents people from contributing to the site?

  • How well does a first-time user understand what the site is about?

  • How well can a novice user navigate the site?

  • How easy is it for a novice user to add her first contribution?

  • How does the first-time user feel about the site?

What keeps power users engaged?

  • What do people love about the site?

  • What motivates power users to keep coming back, and is the Design Quotient part of it?

  • How often do power users contribute?

  • Do power users feel a sense of community?

Methods, Sampling and Recruiting

Heuristic Evaluation

In reviewing the site, we had a sense that overall navigation might be one of the key issues preventing more people from contributing to the site after they sign up. For this reason, we decided to perform a heuristic evaluation. We also believed that the heuristic evaluation would guide our understanding of what issues we should look for in the usability tests with novice users.

Recruiting in this case was simple: each member of the team performed a heuristic evaluation (link), looking at Jakob Nielsen’s ten guidelines [4]. This included assigning a score for the severity of the issue and suggesting possible fixes. More details about how we analyzed the results are in the “Analyzing the data” section of the report.
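The individual evaluations can be thought of as structured records that are later merged and ranked. The sketch below is our own hypothetical illustration, not the team’s actual tooling; the heuristic names and the 0-4 severity scale follow Nielsen’s published conventions, while the example issues are drawn from findings reported later in this document.

```python
from dataclasses import dataclass

# Nielsen's severity scale: 0 = not a problem ... 4 = usability catastrophe.
@dataclass
class Finding:
    heuristic: str        # which of Nielsen's ten guidelines is violated
    description: str      # what the evaluator observed
    severity: int         # 0-4 rating assigned by the evaluator
    suggested_fix: str    # proposed remedy

findings = [
    Finding("Consistency and standards",
            "'Follow' and 'Join' used interchangeably for challenges",
            3, "Pick one term and use it throughout"),
    Finding("Match between system and the real world",
            "'Forums' tab actually contains help content",
            2, "Rename the tab to 'Help'"),
]

# Consolidation: when evaluators disagree on a rating, they discuss and
# settle on one; here we simply sort to surface the worst issues first.
worst_first = sorted(findings, key=lambda f: f.severity, reverse=True)
print(worst_first[0].heuristic)
```

Keeping each finding as a uniform record makes the later group consolidation step (merging duplicates, reconciling severity ratings) straightforward.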

Usability Tests with First Time Users

As mentioned in the previous section, one of our main research questions was what prevents more people from contributing to the site. To best tackle this, we decided to perform usability tests with first time users. This would also tell us more about the general navigability of the site, and what types of issues might be preventing users from signing up.

We aimed to recruit 3–5 first-time OpenIDEO users, all of whom work in design or technology and are computer literate. We also wanted an even gender balance and participants spanning at least a 20-year age range. OpenIDEO is ostensibly built to be usable by anyone who’s interested, so we wanted to capture potential age and gender differences.

To recruit the users, we used the “friends-of-friends” approach. Meredith and Ronnie emailed friends with a description of the project, a timeline in which the tests would happen, and a request for recipients to forward the message to “designers and design-minded” friends. We received six responses and selected five of them to maximize the age range. We ended up with two men and three women between the ages of 22 and 33. Their professions included software engineer, interaction designer, and app developer relations manager.

We scheduled hour-long usability tests at the five users’ houses. We felt that this location would be the most comfortable for them, as well as one of the likely places in which they would log onto the site. We recorded the sessions using TechSmith’s Camtasia, which captured audio, screen activity, and the tester’s facial expressions via the laptop webcam. The usability tests had 5 main tasks:

  1. Getting familiar with the website and explaining what you think it does.

  2. “Following” a challenge.

  3. Browsing through the challenges section and accessing a pre-defined challenge.

  4. Finding other users.

  5. Navigating to and within the site’s forum.

At the end of the session, we asked users about their general impressions of the site. How well did they feel they understood it? Did they like it? (See the full usability test guide in the “Protocols and Supplementary Information” section.)

Interviews with Power Users

Given that our second major research question was what keeps power users engaged, we decided the most effective way to understand this would be to lead a focus group with about five experienced users. They could give us an idea of how they felt about the site and what their experience was, as well as generate ideas to improve the site experience. Since the power users are located all over the world, we initially planned to conduct a videoconference focus group via Google Hangouts. Meena had suggested that we leave the chat option on during the Hangout, which would allow quieter users to add ideas while others were speaking.

Pierce left comments on the profiles of 20 of OpenIDEO’s most active users. The message included an overview of the project and a request to participate in a virtual focus group. Three users volunteered and sent us their availability. Unfortunately, we were unable to find a time that worked for all of them.

So, we instead scheduled three videoconference interviews with power users; one participant had technical issues, so her interview was conducted over the phone. As for the interviewees: one was a professor, one a graduate student, and one a visual designer.

Analyzing the data

For the heuristic evaluation, we did individual evaluations, then met as a group to consolidate our observations into a single evaluation. If multiple people noticed the same issue but disagreed about the severity rating, we discussed it and agreed on one. In the end, we identified 21 total issues.

We then used the KJ-technique discussed by Jared Spool to prioritize our findings from the usability tests and interviews [5]. Our focus question was, “What prevents people from contributing to the site?” We all met in the co-lab of the I School with sticky notes; referencing our notes from the interviews and usability tests, we individually wrote down our observations and opinions. One member of the group then presented their notes, explaining them out loud while putting the stickies on the board. If anyone had the same note, we added it nearby. We repeated this for each member of the group until all of the sticky notes were up. We then discussed what themes we each noticed and grouped the sticky notes accordingly. Next, we named each group, consolidating some groups in the process.

In the end, we had identified nine distinct groups. We then voted on severity and “ease of fix” scores for each issue. Finally, we assigned priorities to the issues based on the combined scores. Our results are below in Table A.
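The prioritization step can be sketched as a small scoring computation. This is an illustrative reconstruction under assumed scales: we assume severity and ease-of-fix are each voted on a numeric scale and summed, with higher ease-of-fix meaning cheaper to address; the report does not specify the actual scales or weighting, and the example issues and scores are hypothetical.

```python
# Each issue carries two group-voted scores: how severe it is, and how
# easy it would be to fix (higher = easier). Priority favors issues that
# are both severe and cheap to address.
issues = {
    "Contribution process too complex": {"severity": 4, "ease_of_fix": 3},
    "Inconsistent language (follow/join)": {"severity": 3, "ease_of_fix": 4},
    "Unclear roles of sponsors and staff": {"severity": 3, "ease_of_fix": 2},
}

def priority(scores):
    # Simple additive model; a real study might weight severity higher.
    return scores["severity"] + scores["ease_of_fix"]

# Rank issues from highest combined score to lowest.
ranked = sorted(issues, key=lambda name: priority(issues[name]), reverse=True)
for name in ranked:
    print(name, priority(issues[name]))
```

An additive model like this is the simplest way to combine the two votes; a team could also plot severity against ease-of-fix and tackle the high-severity, high-ease quadrant first.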

Table A: Findings in prioritized order

Key Research Findings

Even though OpenIDEO has already been upgraded from its initial launch version to simplify its usability and layout, our heuristic evaluations, usability tests and one-on-one interviews identified a series of key findings. We have chosen to focus on the three most urgent: the Contribution Process, Semantic Consistency and General Understanding of Website.

Contribution Process

Within the Contribution Process category, we identified two main opportunity areas: (1) the complexity behind the process of adding a contribution, and (2) a lack of guidance on what is expected from the user when he or she tries to contribute an idea or solution.

We found that the complexity of adding a contribution stems partly from the fact that the general layout of the website is overwhelmingly full of content and lacks directions and explanations for new users. For example, upon accessing OpenIDEO’s home page, visitors are presented with only a short, one-sentence explanation of what OpenIDEO is, and are not offered any other explanation of how the site actually works. Once they reach the “Challenges” page, visitors are immediately presented with a long challenge description in a very small font (see figure C). Rather than creating excitement and positive expectations about the platform, the block of text deters visitors from reading further and engaging with the site.

Figure C: Example of a challenge description.

Through the usability tests that we performed, we learned:

  1. The amount of text throughout the site overwhelms the users. It gives them the impression that in order to catch up to the current status of the project and make valuable contributions, they will be forced to read and process pages and pages of content.

  2. Users had trouble figuring out how to contribute. It is hard to engage in projects because calls to action are effectively hidden, and require many clicks to find (see figure D).

  3. The contribution forms are complex and lack guidance. Even when users reached the desired contribution page, they found the forms complex, lengthy, and lacking guidance. Four users complained or sighed after getting about five boxes in. One user also mentioned that, based on the contribution snippets shown on the individual challenge page, he had thought he could make a “quick contribution.” All of this might discourage visitors from contributing their ideas, thinking that much more is expected of them when contributing to a project.

Figure D: The steps towards making a contribution.

We think that improving the user experience around the user flow to make contributions and collaborate within the platform should be something relatively easy for OpenIDEO to implement, and could have a very positive impact in the short run. In order to achieve this, we recommend:

  • Make sure that each challenge redirects to the current phase. In the current version of the platform, clicking on a given challenge within the “Challenges” section takes the user to the project page, but does not redirect them to the current phase.

  • Include “call to action” buttons on the “Challenges” page. For example, if a project is in the “Ideas” phase, add the “+ Add Idea” button within that project’s section in the “Challenges” list so that users who are interested in immediately contributing can do so. Power users could also benefit from having easier access to the sections they normally use.

  • Guide new users through the required steps and clicks to browse the main parts of the site and make their first contribution. To do this, OpenIDEO could use tutorials, hints and tooltips throughout the site. We also suggest moving the navigation not related to contributing to challenges, such as “Blog,” “Forums,” and “Impact” to the right side of the navigation bar. This would clarify what the most useful tabs are to learn about the site.

  • Set an idea of what is expected in each text box on the contribution page. In order to achieve this, we specifically recommend: Describing the recommended length for each text box (e.g. “approximately 400 words” or “max 1000 words”), adding examples or tips above each field that might inspire or clarify, and revising the forms to clarify what fields are required.

Semantic Consistency

Our second top finding was the inconsistent use of language and terms around the site. Through our usability tests and heuristic evaluations, we found two main problem areas: frequent language inconsistencies throughout the site, and a lack of clear differentiation between sections of the website. The main issues were the following:

  • “Follow” and “join” are used interchangeably. The phrases “Follow this challenge” and “Join this challenge” were presented to the user interchangeably within the same process flow, creating confusion and a sense of doubt about the site’s intention. Multiple users reacted strongly to this inconsistency: when they read the word “follow,” they immediately related the action to the social media convention, where “follow” means you will be regularly updated on that subject. However, OpenIDEO lacked both confirmation that the user was now following a challenge and any explanation of what would happen once they did.

  • Users didn’t understand the difference between “Community” and “Forums.” Multiple users said that they had ignored the “Forums” section entirely before being asked about it, assuming that much of its discussion functionality was under “Community.” One asked, “Isn’t a forum a community and a community a forum? Then why do two sections exist? When should I visit each one?”

We strongly recommend:

  • Use a standard set of words or phrases throughout the website.

  • Rename the platform’s main sections so that they truly describe their content. For example, “Forums” is a help-oriented section, so rename it “Help.”

While these issues are not as critical as those concerning the contribution process, they do present inconsistencies that confused users. A confused visitor is likely to give up and spend their precious time elsewhere.

General understanding of website

Our third key finding was that people generally find it difficult to determine exactly what the site is about. Although the front page clearly guides the user to learn “how it works” (through the top navigation item and the video), some of our participants were still confused after spending 30–40 minutes on the site. We attribute this problem to a couple of different usability issues.

Winning / impact

  • Participants who visited the site for the first time had a hard time figuring out what it means to win a challenge and what impact a winning idea would have. They found it hard to distinguish between “winning,” “realization” and “impact,” which are all words used to describe the final phase of a challenge.

  • Users couldn’t figure out if there were any prizes for the winning ideas. As far as they could tell, the incentives for participating are not explicitly described anywhere on the site, which made it hard for the participants to fully grasp the concept. One of our expert user interviewees later told us she had “won” a trip to Brussels, where she got to pitch her idea at the European Commission. This had been a great experience for her, and she especially enjoyed meeting other OpenIDEO contributors in person. Although this was a great prize that empowered her to get her idea realized, she was not informed about it before her idea was chosen as one of the top ten ideas of the challenge.

We suggest that OpenIDEO advertise their prizes both to show potential contributors “what’s in it for them” and to make the idea of the open innovation challenge more tangible as a concept. We also suggest updating the documentation in “How It Works” to offer more explicit language about what “winning” means, and what the different types of contributions are.


User roles

Another source of confusion for our participants was the unclear roles of the different users on the site.

  • People listed in the “Community” section had roles users didn’t understand and didn’t know how to learn more about. A description of the role types (such as ‘contributors’, ‘OpenIDEO team members’, ‘challenge managers’, ‘community champions’ and ‘sponsors’) and their respective responsibilities and permissions is nowhere to be found. One participant assumed that people could only make one type of contribution.

  • The “we” in the challenges’ stage descriptions is ambiguous, and changes. One of our participants found it particularly frustrating not knowing who “we” refers to in the challenge descriptions. It could be any of the above-mentioned role types, and he was unable to figure out who had the final vote in the elimination process. Another instance where this confusion became apparent was when we asked participants to verbalize their expectations of what would happen if they clicked each of the top navigation items. When asked about the “Blog” section, we heard many different expectations about who actually writes the blog posts.

OpenIDEO has made an attempt to fix this problem by adding a label to all OpenIDEO team members’ profile pictures (see figure E). This handles the problem on a micro level, so a user can identify the role of another user writing a specific contribution or comment, but it does not address the macro-level understanding problem we identified. We suggest that OpenIDEO explicitly state the different actors’ involvement at each phase; to avoid confusing first-time users, it should always be clear who does what.

Figure E: Profile picture for a member of the OpenIDEO team.

Content abstraction

When tasked with learning more about how the site works, our participants were drawn either to the “How OpenIDEO Works” video on the front page or to the “How It Works” top navigation item, which took them to a long infographic explanation. The explanatory documentation made users feel that OpenIDEO has an impact, but they still didn’t understand how the site works. Each user spent a lot of time trying to understand the concept, but most gave up and moved on before finishing, as both the video and the infographic are quite lengthy. Only one user made it to the end of the video, and only one felt she understood how she could participate based on the “How It Works” page. Most got a much clearer idea of the OpenIDEO concept when exploring the individual challenge pages. We mainly attribute this problem to the level of abstraction and designer jargon used in the “How It Works” elements. An illustrative example is the introductory paragraph of the infographic:

"OpenIDEO is an open innovation platform for social good. We’re a global community that draws upon the optimism, inspiration, ideas and opinions of everyone to solve problems together."

We would categorize this quote as more of a vision statement than an explanation of what OpenIDEO actually does and how it works. One of our participants told us he experienced the video as more of a sales pitch than an introduction for new users.

To ensure that new visitors and potential contributors do not leave the site out of confusion about how it works and what they are supposed to do, we suggest that OpenIDEO explain the concept with concrete examples, possibly using the challenge timeline (see figure F below) as the main thread. The challenge timeline seemed to trigger “aha moments” in which some of our participants got a big step closer to “getting it.”

Figure F: The challenge timeline.

Additional findings

As stated earlier, we chose to focus on the three most urgent findings. However, our analysis with the KJ method produced nine findings in total. Below are brief descriptions of the less urgent findings, which can be revisited once the suggestions for the top three have been considered and, hopefully, implemented.

Content presentation

We found that the content of the front page was of little use to both new and experienced users. It has the format of news but does not seem particularly relevant. Also, a lot of content on the site can be characterized as a “wall of text,” which can be time-consuming to get through and may scare off new users who just want to get started as fast as possible.

We recommend personalizing the main page for users who are logged in, showing them recent activity and offering them links to the open challenges.


Navigation

We found that the main navigation bar is split into two bars with no clear visual hierarchy guiding the user to the most important items. In the “Challenges” section, it was difficult for participants to get an overview of which challenges were active, and when they opened and closed, due to the lack of contextual time information.

We recommend adding a timeline to each challenge in the “Challenges” list. We also recommend flipping the top two navigation bars so that the “OpenIDEO” home button is on the top. Several users mis-clicked on “Challenges” while attempting to return to the home page.


Community

The OpenIDEO community allegedly comprises tens of thousands of people from around the world. However, this vast number of people collaborating for the greater good is hardly visible on the site. The dedicated community page consists of four top-8 lists with many repeat entries. Furthermore, the “Forums” tab takes you to a separate Zendesk domain.

We recommend including information about more than the top 8 contributors for the different contribution types, such as recent activity or users who are gaining in their Design Quotient.

Collaboration style

From the interviews with the expert users, we found that collaboration mainly takes place in the comment section of each contribution. According to one interviewee, this format works well among the engaged designers, but for a new visitor the collaboration is hidden several clicks into a deep information architecture. Thus, collaboration on the site becomes opaque and intangible to new potential contributors.

Design quotient

All of our participants seemed to have an opinion about the Design Quotient, essentially a pie chart displaying a user’s activity on the site as an attempt to gamify engagement on the platform. Some criticized it for measuring quantity instead of quality, while others saw it as a valid sign of design experience that could be used on their resume. In general, the Design Quotient was well understood and accepted as part of the experience.

Site performance

The site has some considerable waiting times when opening a challenge page or sorting through contributions. Although this aspect of usability is highly prioritized by usability expert Jakob Nielsen, we chose not to pursue it as a key finding due to the technical complexity of a solution and the vagueness of our resulting suggestion: simply saying “make your website faster” is neither inspiring nor actionable. We also recognize that the site has a lot of content to handle, and we suspect that a fix would be both expensive and radical.

Reflective analysis

We all started out with a lay understanding of user research. Going through all the steps of an actual research process has given us a better understanding of research on at least three different levels.

Firstly, it has given us hands-on experience with the tools needed to elicit user attitudes and behavior. These include the purely technical means you must master to record usability tests and interviews, which is critical to documenting your findings. We certainly had some difficulties along the way, but now we know a few pitfalls to avoid next time.

Secondly, we were surprised by how differently our participants perceived and used a product like OpenIDEO, and by how many valuable insights that leads to. We really got to experience how you can become blind to certain things when you work on a project, and how observing the user experience can reveal those blind spots. Watching new users struggle with simple tasks is a painful (it is so hard not to speak up) but enlightening exercise.

Thirdly, we have come to recognize and respect the amount of work it takes to get real, unbiased and useful user input. This seems difficult for outsiders to understand, and we didn’t understand it before diving in. From the outside, user research can seem straightforward, and the general attitude toward it seems to be: “Just go talk to the users and see what they think about our product. How hard can that be?” But getting rich and purposeful data takes many hours of planning, executing, analyzing and reporting.

Setting our goals

Planning our research went fairly smoothly, and we quickly agreed on what would be interesting to learn about OpenIDEO. We decided to take a holistic view of the website, including both new and experienced users in our study. In hindsight, we might have gained a deeper understanding of a more specific problem had we focused on only one of the two groups. Closer collaboration with OpenIDEO could have been decisive in this situation.

We deliberately chose three very different methods to attack OpenIDEO's problem from different angles. The expert review was meant to create an outline for structuring the usability studies, which worked very well: by first assessing which parts of the site could potentially hinder a good user experience, we were able to construct tasks that specifically targeted the functions in question. The goal of the usability tests was ultimately to observe the actual user experience and detect usability problems. Finally, the aim of the focus group was to get a sense of the community spirit, which is highlighted as one of the most important features of the site.


After setting our goals we began recruiting for both the usability tests and the focus group. Although the focus group fell through due to transnational coordination problems, we still had contacts with eager OpenIDEO contributors who were willing to share some of their time with us. When recruiting for the usability tests we had a bigger pool of potential participants, because we were looking for people with no prior experience with the site. Testing with designers turned out to have both pros and cons. They were generally familiar with usability testing, so they knew how to be useful participants; this was especially apparent in their ability to work slowly and think aloud. A risk of having designers test the site is that they might not represent the average user, but since OpenIDEO's target group is in fact designers (amateurs and professionals), there was no way around it. The result, of course, was that the designers began to design, and we got a lot of suggested solutions thrown in.


The method that caused us the most frustration was without a doubt the focus group / interviews. After failing to gather our participants in a Google Hangout, we agreed that the best alternative was individual interviews. We would still get the qualitative information about incentives and the community, but would lack the group-dynamic aspect that would have been optimal. Conducting the individual remote interviews turned out to be tricky as well: because of unforeseen technical issues, one of them had to be carried out over the phone. This made us unable to record the interview, so we had to rely on handwritten notes. Next time we plan a remote focus group, we need to get in contact with our participants well in advance. We are very disappointed that it didn't work out, because it would have been exciting to experiment with conducting a remote digital focus group.

Analyzing and reporting

Using the KJ method for grouping and prioritizing our findings proved very efficient, and we almost kept within the 40-minute time frame suggested by Spool [5]. The themes that emerged were well in line with what we experienced in the field and got us on the same page when discussing which suggestions to focus on for the presentation. For the presentation we chose to focus on the top three findings and suggestions, backed up by video evidence from our usability studies. This format seemed to work well, as it demonstrated the points we were making and held the attention of the audience.


OpenIDEO is a platform whose purpose and objective were very well received by our testers. Even after struggling through tasks, users were still excited about the site when we interviewed them at the end. The website already generates a lot of positive energy, and we believe OpenIDEO can capitalize on this by reducing the friction around 1) contributing and 2) understanding how the challenge process works.

Our evaluations, tests, and interviews allowed us not only to get better insight from power users, but also to gather important information on the impressions new users form when visiting the site for the first time.

For new users, usability issues create high barriers to entry that end up discouraging participation and engagement. First, while the general idea of the site is well understood, the overwhelming amount of text and the small fonts make it difficult to fully commit to the site. Second, the difficulty of accessing some important parts of the site, such as the contribution page, compromises the initial user experience. Third, once a user reaches the actual contribution form, the process remains intimidating because of a lack of guidance and the seemingly lengthy requirements for adding a contribution. We believe that by implementing the top three suggestions detailed in this report, OpenIDEO can lower these barriers without drastically changing the website's design and layout.

For power users, the site provides a great platform to exchange ideas and engage in projects. In general, constant use of the site has made its workflows familiar and easy to recall. However, through our interviews we were also able to identify the main issues and nuances power users experience on the site. By streamlining the process power users go through and making the contribution process more easily accessible, we think engagement from power users could improve as well.

Finally, the overall project and experience have allowed us to go deeper into the techniques and processes involved in a complete usability assessment. We also came to understand the difficulties that activities such as focus-group recruiting and scheduling can present, and that adaptability to such situations is key in any usability assessment project.


References
[1] Kuniavsky, Mike. Observing the User Experience: A Practitioner's Guide to User Research. San Francisco, CA: Morgan Kaufmann, 2003. Print.

[2] Lakhani, Karim R., Anne-Laure Fayard, Natalia Levina, and Stephanie Healy Pokrywa. "OpenIDEO." Harvard Business School Case 612-066, February 2012. (Revised October 2013.)

[3] "OpenIDEO - Home." OpenIDEO. N.p., n.d. Web. 08 May 2014.

[4] OpenIDEO University Toolkit. Rep. N.p.: OpenIDEO, n.d. Web. 7 May 2014. <http://documents.openideo.com/openideo_university_toolkit.pdf>.

[5] Spool, Jared M. "The KJ-Technique: A Group Process for Establishing Priorities." User Interface Engineering. May 11, 2004. Web. 7 May 2014. <https://www.uie.com/articles/kj_technique/>

[6] Fuge, Mark, and Alice Agogino. "How Online Design Communities Evolve Over Time: The Birth and Growth of OpenIDEO." Proceedings of the ASME 2014 International Design Engineering Technical Conferences & Computers and Information in Engineering Conference, August 17-20, 2014, Buffalo, NY, USA.

Protocols and Supplementary Information

Heuristic Evaluation

The full heuristic evaluation spreadsheet is available here.

Usability Test Guide

The protocol and script for the usability tests are available here.

Interview Guide

The interview guide is located here. The interview notes are located here.


Ranking Spreadsheet

The ranking spreadsheet from our KJ analysis is available here.

Statement of Informed Consent - Usability Test

The statement of informed consent that we had usability test participants sign is available here.

Statement of Informed Consent - Interview

The statement of informed consent that we had interview participants sign is available here.