[Image: Complex platforms ratings]

[Image: Simple platforms ratings]

[Image: Specialty platforms ratings]

[Image: Additional participation platforms to consider]

Methodology

To evaluate the top digital participation platforms, we issued an open call to form an independent expert committee. Besides expertise in the use of digital platforms for public participation, we prioritized a diversity of gender and geography so committee members could bring different perspectives on what is most important and helpful for people in a range of contexts. At least half of the committee members are women and at least half are from Africa, Asia, Eastern Europe, or Latin America.

Committee members identified three main categories of platforms and six categories of evaluation criteria (see below). Each member then used these criteria to evaluate the 50 top platforms selected by Matt Stempeck, the principal author of our Guide to Digital Participation Platforms. Each committee member nominated up to 15 platforms that they considered best, and then the full committee rated all of the platforms that were nominated.

Each set of criteria was reviewed and rated across all platforms by two committee members; when the two reviewers disagreed on a rating, the full committee reviewed it further. Each committee member rated platforms on at most two categories of criteria, and members did not know the final overall ratings until the scores were compiled across criteria.

It is important to note:

  • The committee focused on platforms specifically designed to support public-participation programs. There are many other digital tools, like messaging apps, that are useful for community engagement, but that was beyond the scope of this research.

  • Platforms evaluated also were limited to those available in the most common global languages. There are many other platforms in languages that are not as widely used, and that could be excellent options in specific countries. 

  • Committee members based their evaluations mainly on publicly available information on the platforms’ websites and elsewhere, in addition to their own expertise and responses to a user survey.

The committee was not able to review all platforms, in part due to language limitations and the ever-changing nature of the field. This is just the first edition of the ratings; we hope to improve and expand the list in the future.

With enough resources, we hope to further develop the guide and ratings by:

  • Expanding the assessment of the best platforms to include more users and experts.

  • Testing platforms to assess performance.

  • Expanding the ratings to online participation tools beyond comprehensive platforms.

  • Developing supplemental online courses to help users get the most out of the guide.

  • Implementing other ideas from you!

Platform categories

  • Complex: Platforms that work well for complex participatory processes and/or use multiple features in one process. For example, a platform that includes features for idea collection, proposal development, and voting would be considered complex. Platforms in this category can be used for different kinds of processes in different contexts.

  • Simple: Platforms that work well for simple participatory processes, such as those that focus on one type of participation, e.g. proposal development. They can be used for different kinds of processes in different contexts.

  • Specialty: Platforms that work well for a particular process or linguistic or geographic context, such as platforms specifically for participatory budgeting or available in Czech.

Evaluation criteria

The ratings are based on six criteria: cost, capacity requirements, features, accessibility, ethics and transparency, and track record and reliability. The score for each criterion was standardized to a scale of 1-100, giving each equal weight. The overall score for each platform is the average of the six scores.

  • Platform cost assumes the price for a local government that requires some customization of the tool. Ratings range from $ (free) to $$$$$ (more than $20,000/year).

  • Capacity requirements includes four variables:

    • Tech expertise required for configuration. Platforms that can be configured without an expert received a higher score.

    • Tech expertise required for maintenance. Platforms that can be maintained without an expert received a higher score.

    • Hosting capacity required. Platforms that are fully hosted locally received a lower score, while platforms on the cloud received a higher score. 

    • Tech support available. Platforms offering full service support received a higher score.

  • Features includes 13 variables, each scored according to whether the platform offers it:

    • Idea collection. 

    • Survey.

    • Proposal-drafting.

    • Voting.

    • Discussion forum.

    • Sentiment analysis (categorizing the emotional tone of discussions).

    • Commenting and sharing.

    • Mapping (allowing projects and user contributions to be connected visually to a particular location).

    • Process planning. 

    • Communication with participants. 

    • User verification and security.

    • Quantitative data analysis.

    • Heat mapping (using data visualization to indicate strength of support for an idea).

  • Accessibility includes six variables:

    • Number of countries where the platform has been used.

    • Functionality in multiple languages.

    • Hybrid integration with in-person activities (platforms that better integrate with in-person activities received a higher score).

    • Browser and technology requirements for users (platforms compatible with the most-used browsers or technology received a higher score). 

    • Connectivity requirements (platforms suitable for communities with connectivity challenges received a higher score).

    • Degree of mobile friendliness (platforms that are fully functional on mobile devices received a higher score).

  • Ethics and transparency includes three variables:

    • Open source: Is the platform’s code visible and customizable?

    • Data protection: Are collected data protected from leaks and outside use?

    • Content moderation: Is this service offered?

  • Track record and reliability includes two variables:

    • Time available on the market.

    • Number of institutional users.
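The scoring arithmetic described at the top of this section (each criterion standardized to a 1-100 scale, then averaged with equal weight) can be sketched as follows. The function names, the min-max rescaling method, and the example values are illustrative assumptions, not the committee's published rubric:

```python
def standardize(raw, raw_min, raw_max):
    """Rescale a raw criterion score onto the 1-100 scale.

    Min-max rescaling is one plausible way to standardize;
    the committee's exact method is not specified here.
    """
    return 1 + 99 * (raw - raw_min) / (raw_max - raw_min)

def overall_score(criterion_scores):
    """Overall platform score: the unweighted average of the six
    standardized criterion scores (equal weight per criterion)."""
    assert len(criterion_scores) == 6, "expected one score per criterion"
    return sum(criterion_scores) / len(criterion_scores)

# Made-up standardized scores for one hypothetical platform:
scores = {
    "cost": 80,
    "capacity_requirements": 65,
    "features": 90,
    "accessibility": 70,
    "ethics_and_transparency": 85,
    "track_record_and_reliability": 60,
}
print(overall_score(list(scores.values())))  # 75.0
```

Because every criterion is first mapped onto the same 1-100 scale, a simple average gives each of the six criteria equal influence on the overall score.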

Send us your questions and feedback

Do you believe we have not addressed an important issue or question related to online participation platforms? Did we miss a valuable platform in our ratings matrix? Is some of our information out of date? Contact our team with your feedback.

Technology Review Committee members

  • Emiliano Arena, Argentine Association for Participatory Democracy, Argentina. Emiliano is the coordinator of monitoring and evaluation at the Center of Public Policy Implementation for Equity and Growth. Emiliano’s research focuses on participatory budgeting, planning, monitoring and evaluation, urban social policies, governance, and open government.

  • Stéphane Dubé, Institut du Nouveau Monde, Canada. Stéphane has been active in civic participation since 1992. Currently civic-tech and special project director at Institut du Nouveau Monde, a non-partisan NGO based in Montréal aiming to foster citizens' participation in democratic processes, he advises public institutions in the design and organization of public participation processes. 

  • Kelly McBride, TPXImpact, United Kingdom. Kelly designs and facilitates community-based participatory and deliberative processes. She supports government and organizations as they work to deliver on their commitments to improve governance. Until recently, she led the Scotland office of Democratic Society and was director of policy and practice for a multinational team working across Europe.

  • Charlie Martial NGOUNOU, AfroLeadership, Cameroon. Charlie is an activist for democracy, human rights, digital rights, and open data with AfroLeadership, which he founded in 2007. He also works as a public finance expert for the Association Internationale des Maires Francophones.

  • Katya Petrikevich, Participation Factory, Czech Republic. Katya is co-founder and international director of Participation Factory, a social enterprise that helps communities improve governance systems and quality of life through better participation and robust data. She has conducted research on current trends in tech and innovation for the EU Commission and delivered numerous training sessions on public participation and civic engagement.

  • Melissa Zisengwe, Civic Tech Innovation Network (CTIN), South Africa. Melissa is a journalist specializing in civic tech in Africa. She was appointed as a journalist at JamLab and CTIN in 2018, where she currently serves as program officer. As a former research fellow at the Collaboration on International ICT Policy for East and Southern Africa, she examined the use of civic tech during the COVID-19 pandemic in Africa.