<aside> 🪧 This rubric shows publicly how all applications received via the SSF form are evaluated. The ADPC will use these criteria to score each application received, or discard it immediately if it violates one or more of the conditions mentioned.

As of July 22nd, our application process for new security service provider submissions is closed. We have whitelisted the Security Service Providers, whose details can be found here.

</aside>

Means Test: Criteria for Evaluation

[See full Means Test Rubric here]

This Means Test provides a structured approach for the ADPC to evaluate applications for financial assistance. It is designed to identify the applicants who would benefit most from support, ensuring equitable access to subsidies within the Arbitrum Ecosystem, particularly for smaller entities with valuable contributions.

The intent is to allocate subsidies to those most in need, avoiding exploitation by larger players looking for a ‘free lunch/handout’. Such exploitation could give recipients an unfair advantage over their competitors, or be an inefficient use of the DAO’s funds if they do not bring about a net positive change.

This approach was chosen over a purely quantitative one because most projects, especially the smaller ones targeted by this subsidy program, do not have obvious, immediately measurable metrics.

The Means Test includes a scoring system ranging from 1 to 5, reflecting the merit of each application.

The sub-criteria in the Means Test have varying levels of importance, and each has a weighting attached: a weighting of 1 indicates low importance, 2 indicates neutral importance, and 3 indicates high importance.
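To make the weighting concrete, below is a minimal sketch of how a weighted application score could be computed from per-sub-criterion scores. The weighted-average aggregation, the weight dictionary (taken from the rubric table below), and the function names are illustrative assumptions; the ADPC does not prescribe an exact formula.

```python
# Illustrative sketch only: combines 1-5 sub-criterion scores using the 1-3
# weights from the rubric table. The weighted average is an assumption; the
# ADPC does not publish an exact aggregation formula.

# Assumed weights for one criteria group ("Arbitrum Ecosystem Contribution").
SUB_CRITERION_WEIGHTS = {
    "Ecosystem Contribution": 3,
    "Transparency Practices": 2,
    "Community Engagement": 1,
    "Accountability Measures": 3,
}

def weighted_score(scores: dict[str, int]) -> float:
    """Collapse per-sub-criterion scores (1-5) into a single weighted average."""
    total_weight = sum(SUB_CRITERION_WEIGHTS[name] for name in scores)
    weighted_sum = sum(SUB_CRITERION_WEIGHTS[name] * s for name, s in scores.items())
    return weighted_sum / total_weight

# Example: an application scored on this criteria group only.
example_scores = {
    "Ecosystem Contribution": 4,
    "Transparency Practices": 3,
    "Community Engagement": 5,
    "Accountability Measures": 4,
}
print(round(weighted_score(example_scores), 2))  # 3.89
```

Under a scheme like this, a rating on a weight-3 sub-criterion moves the overall score three times as much as the same rating on a weight-1 sub-criterion.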

Each application will be scored by ADPC members, followed by a collective decision on the most deserving grant recipients, taking into account the rating against the eligibility criteria, a value-for-money evaluation and the funds available. The ADPC may make other decisions in relation to the operation of the fund and selection of applicants as further detailed in the Application Process Terms.

If an applicant receives a high score but is not chosen as a grant recipient, explanatory feedback will be provided either individually or collectively to the cohort.

| Criteria | Sub-Criteria | Description | Weight |
| --- | --- | --- | --- |
| **Arbitrum Ecosystem Contribution**: How aligned is the project with the Arbitrum ecosystem, and how easy will it be to track the applicant’s use of the subsidy funds? | Ecosystem Contribution | How does the applicant’s project contribute towards the growth of the Arbitrum ecosystem? | 3 |
| | Transparency Practices | To what extent does the applicant demonstrate transparency in its operations? | 2 |
| | Community Engagement | How does the applicant engage with the DAO community and solicit feedback/input on its project, incorporating this into its decision-making? | 1 |
| | Accountability Measures | What mechanisms does the project have in place to ensure accountability and responsible stewardship of subsidy funds, including the governance structures in place? | 3 |
| **Business Model & Need for the Subsidy**: How effectively does the applicant’s business model align with their need for the subsidy? | Clarity of Business Model | How well-defined and understandable is the applicant’s business model? | 2 |
| | Team Experience | What is the track record of the team and their ability to execute their plan? | 2 |
| | Funding Gap Rationale | Is there a clear explanation of the funding gap the applicant is facing, along with the rationale for why additional subsidy funding is necessary to achieve its objectives? | 3 |
| | Reasonableness of Subsidy Amount Requested | Does the requested subsidy amount make sense within the context of the project’s needs and potential impact? | 3 |
| | Scalability Potential | What is the scalability potential of the applicant’s business model following the support of the subsidy? | 1 |
| **Financial Analysis**: How realistic and stress-tested are the applicant’s financial status and projections, and is their plan for the use of the subsidy funds clearly outlined? | Accuracy of Projections | How realistic and well-supported are the financial projections provided by the applicant, inclusive of revenue forecasts and cost analysis? | 1 |
| | Sensitivity to Scenarios | To what extent does the applicant’s financial analysis consider different scenarios, such as base, target and stress scenarios, to assess the project’s resilience and adaptability to changing market conditions? | 1 |
| | KPIs | Are there clearly defined KPIs that will be used to track the project’s performance and measure progress towards achieving its goals? | 1 |
| | Preferred Funding Distribution | Does the applicant have a preferred distribution plan for the subsidy funds, and is there a rationale provided for this distribution approach, such as front-loading funds for critical start-up costs or phased funding based on project milestones? | 2 |
| **Risk Analysis**: Is the applicant aware of the risks associated with their project, and what is their plan for mitigating these risks? | Risk Identification | How effectively does the applicant identify and assess potential risks and vulnerabilities that the project may have? | 2 |
| | Security Requirements | Does the applicant have a clear understanding of its security requirements and the measures needed to protect against security breaches, such as through the conducting of a security audit? | 3 |
| | Mitigation Strategies | What strategies does the applicant have in place or intend to implement to safeguard against the aforementioned risks? | 2 |

Regarding the ‘Ecosystem Contribution’ criterion above, we have conducted an initial assessment of the types of projects currently building in the Arbitrum ecosystem and identified a few verticals whose funding would benefit the ecosystem. These are set out below, along with the rationale for choosing them. We will give more weight to these areas and welcome input from the community on our selection.

RWAs & Tokenization