How we developed the ratings

A youth volunteer in China shows a senior citizen how to use WeChat, a popular social media platform, to vote on a participatory budgeting proposal.

We invited a diverse and independent group of digital participation experts to serve on the technology review committee that evaluated the leading digital participation platforms. Committee members did not interact directly with the platforms’ developers while evaluating the products. In addition to expertise, we prioritized diversity of gender and geography, so that committee members would bring different perspectives on what is most important and helpful for people in various contexts. The six committee members are from Africa, Asia, Eastern Europe, Latin America, North America, and Western Europe.

This year, our ratings include 30 platforms, all of which were included in the 2022 edition. Some platforms have been removed from the list due to inactivity. The committee agreed upon the list of platforms. For the next edition, the committee is exploring new ways to include more platforms from underrepresented regions in the Global South.

Two of the committee members applied the evaluation criteria for each category to the platforms. When there was disagreement on the result, the committee members responsible for the rating discussed and reached a consensus on the final rating. Before the ratings were finalized, developers were provided with the draft version of the ratings of their platform and given the opportunity to complete any missing information or provide evidence on disputed ratings. None of the committee members knew the final overall rating until the total score was calculated.

User and developer survey

In addition to the expert review, the ratings were informed by surveys of both users and developers. The user survey was conducted online from November 9 to 23, 2023. It asked users of digital participation platforms to evaluate their experience against defined criteria. The survey instrument was shared through our newsletter, social media, and our members and networks. We received 42 responses from 14 countries in South America, Africa, and Eastern and Western Europe. The respondents included staff members from local and federal governments, civil society organizations, universities, and businesses. Because of the limited response this year, we also used survey responses from 2022 to inform the ratings.

In addition, we surveyed platform developers to collect useful information that was missing. The survey was open from November 9 to 23, 2023. We also extended our due diligence process so that platform developers could view the draft ratings and add missing information. More than 60% of the platform developers engaged in the process.

It is important to note:

  • The ratings compare digital participation platforms across a range of criteria to help governments and organizations decide on a suitable platform for their needs.

  • Ratings are not based on testing of platforms, but on an assessment of their features, pricing, track record and reliability, etc.

  • Committee members based their evaluations mainly on publicly available information on the platforms’ websites and elsewhere, in addition to their own expertise and responses to the developer and user surveys.

What changed from last year’s ratings

Each change below compares the 2023 ratings with the 2024 ratings.

Average score 

In 2022 and 2023, platforms received a standardized score across all the criteria.

We have removed the overall score this year, presenting only the standardized number for each criterion.
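The report does not define how scores were standardized. As a purely illustrative sketch, and not the committee’s actual method, a common way to standardize a per-criterion score is a z-score across platforms; all numbers below are hypothetical:

```python
def standardize(scores):
    """Convert raw criterion scores to z-scores (mean 0, std 1).

    This is an assumed interpretation of "standardized score";
    the report does not specify the formula used.
    """
    n = len(scores)
    mean = sum(scores) / n
    variance = sum((s - mean) ** 2 for s in scores) / n
    std = variance ** 0.5
    if std == 0:
        # All platforms scored identically on this criterion.
        return [0.0] * n
    return [(s - mean) / std for s in scores]

# Hypothetical raw scores for one criterion across four platforms
raw = [2, 4, 6, 8]
print([round(z, 2) for z in standardize(raw)])
# → [-1.34, -0.45, 0.45, 1.34]
```

A z-score makes criteria with different raw scales comparable, which is one reason a ratings exercise might standardize per criterion rather than sum raw scores.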

Categories 

In 2023, platforms were divided into three categories:


  • Toolbox: Platforms offering tools for designing complete participatory processes. For example, a platform that includes features for idea collection, proposal development, and voting would be considered a toolbox. Platforms in this category can be used for different kinds of processes in different contexts.

  • Specific tool: Platforms that work well for a specific approach to participatory processes, such as those that focus on one type of participation (e.g., proposal development). They can be used for different kinds of processes in different contexts.

  • Specialty: Platforms offering a unique or original approach to participatory and deliberative methodology, which usually work for a specific context, such as participatory budgeting or climate assemblies.

We addressed a mismatch between categories and rating criteria:

In 2024, the committee decided to change the number of categories and to reduce their prominence in the presentation of the ratings, as the previous structure caused confusion.

There are now just two categories, used for descriptive purposes only.

  • Multipurpose: Platforms offering tools for designing participatory processes for different kinds of processes, in different contexts, and/or for different stages of a process. For example, a platform that includes features for idea collection, proposal development, and voting would be considered multipurpose.

  • Specific: Platforms that work well for a specific approach to participatory processes, such as those that focus on one type of participation (e.g., proposal development), or those that work for a specific context.

Airtable presentation

In 2023, the presentation led to several issues, including some developers viewing and using the ratings as “rankings,” i.e., marketing their platform as the number 1 or best platform in the ratings. Our tech expert committee recommends treating them as noncomparative ratings, since a platform’s usefulness depends on a number of factors.

In 2024, the ratings presentation has been changed in the following ways:

  • The tools are presented in alphabetical order.

  • The criteria are also listed in alphabetical order.

  • Because the categories no longer structure the presentation, the tools are presented in one Airtable instead of three.

Changes to the criteria

  • Some variables’ definitions were updated to improve clarity.

  • “Additional costs” was removed from the “Cost” category, as it is covered under “Capacity requirements.”

  • “Open core” has been removed as a platform feature.

  • Some platforms have been removed due to inactivity or lack of information:

    • Neighborland

    • Hromadskyi Project (Ukrainian)

    • Ideabox (Agorize)

    • Congreso Virtual

    • Liquitous

  • A new platform has been added: Citizen OS

  • “Process planning guidance” was moved from “Features” to “Capacity requirements”.

  • A new variable has been added in the ethics and transparency criterion: “Raw data export”

  • The variable “Number of institutional users” was updated to “Diversity of contributors”.

Due diligence process

In 2023, platform developers had one opportunity to share information with the committee. The information they shared via email and the developer survey was used to improve the ratings, especially in variables where the committee could not find information online. 

In the 2024 ratings, platform developers had two opportunities to share information in order to improve their scores.

  • First, we released and widely shared a developer survey in November 2023.

  • Second, we shared their draft ratings with them so they could supply information where we had gaps. Developers had to provide evidence for a score to improve.

  • This new due diligence process required the committee to revisit their initial scores at least three times, adjusting or validating them to improve the accuracy of the ratings.