Reporting functionality: Discussion and clarity

In the Maintainers call on the 14th of September we spoke briefly about reporting. This led into a larger discussion about what reporting means for each adopter of Hypha, and a decision to hold a space (a meeting plus an async space) to understand ‘Reporting’ better.

The points from the Maintainers call on the 14th read as follows:

  • Reporting functionality: hoping to have updates and wireframes next week (this is the monitoring and evaluation type of reporting)
  • Need to decide on terms for reporting on system analytics, manager KPIs, and partner monitoring and evaluation
    • @eriol will take on working with folks to define this
  • Doing some research with @rosy and her team on workflow changes

Eriol’s proposal:

  1. We use a space in the Adopters meeting on the 22nd (and any subsequent Adopters meetings) to begin to define ‘Reporting’
  2. We hold additional meeting time if we do not resolve clarity around reporting
  3. Eriol creates a Miro board to collect and collate all information that can also be added to asynchronously
  4. We stop when we feel we have a ‘good enough’ amount of info to move forward with reporting
  • Is the above proposal understandable, or does it require any clarifications?
  • What are people’s quick reactions to the above proposal?
  • Does anyone have any critical concerns about the above proposal?
  • Is it good enough or safe enough to try for now?
  • Would accepting this proposal hinder us working together to meet our stated aims?
  • How can we ‘measure’ the concern?

Please note the links below to other reporting conversations (and add any additional links about reporting in Hypha in comments to this thread).


In the Maintainers call on the 28th of September 2021 we briefly discussed one part of reporting from the perspective of applicants/grantees reporting to grant-giving organisations/funders. We also briefly talked about to whom, and how, grant-giving organisations/funders report (e.g. government, donors, etc.).

We’d like to collect responses from adopters on the following questions before the next adopter call at the end of October 2021.

Can the design and product team get responses to these, either written or recorded as a quick screen recording (https://www.loom.com/ is good!), by Thursday 14th October 2021, please?


  1. What kinds of reports do you currently ask for from applicants? (e.g. written long form, written short, automated surveys via SMS or email etc.)

  2. What kinds of questions do the applicants answer in these reports and to what extent do you value different kinds of questions/responses? (e.g. quantitative, qualitative and you value each equally)

  3. What kinds of methods or tools do you either ask applicants to use to send reports or do applicants request to use to send you reports? (e.g. attachment files on emails, SMS on Signal etc.)

  4. What kind of follow-up do you do as a funder to reports from applicants? (e.g. we offer support, sign post for support from other places etc.)


Tagging adopters to respond to the questions when they can before the deadline. We briefly talked about to whom, and how, grant-giving organisations/funders report, but for now we’ll focus on applicants reporting to grant-giving organisations/funders.

@dluong @blah @allan @rosy @vinkthom @CParraga


For DFF:

  1. We currently have templates prepared in word (for narrative reporting) and excel (for financial reporting). There are several different types of narrative template depending on the type of project and the stage of the project, with the most detail being asked at the end of the project. The grantee fills out an answer to each question and returns the form. Alongside the narrative reports, we have what we call an “outcomes log”, where grantees can add different changes/impacts that have taken place throughout the project and list evidence for each impact.

  2. Heavily weighted towards qualitative and focused on impact rather than activities and outputs. We want to hear about progress towards objectives, what kind of outcomes/impacts were achieved, lessons learned, types of collaboration, suggestions for us and others, next steps after the project, etc. Quantitative information can sometimes help to support this, but is not requested specifically (e.g. the number of people who signed onto a campaign).

  3. Up until now, we upload the templates into a password-protected Nextcloud folder (though we plan to start using the projects space on Hypha for this). The grantee then completes the reports and uploads them into the same folder for us to download and copy into our internal folders. We also use email and Signal sometimes for communications and follow up questions about reports, but prefer to try and keep most of that type of information out of our emails, and keep email attachments to a minimum.

  4. Provide direct feedback and ask questions over email or a call, and provide suggestions/support for next steps. Update the case study page on our website, and attach relevant documents and links. Post blogs by the grantee on our website. Feed any results and outcomes into a central outcomes log we keep of all outcomes achieved by our grantees. This is used to measure the overall impact of our funding, measure progress towards our own goals and commitments to our donors, and to feed into reporting to our own donors (who are currently all foundations).

Hope that helps and happy to answer any follow up questions.
Thomas


“Can the design and product team get responses to these”

I have realized in reading @eriol’s message that I cannot confidently say who comprises this team.

My uncertainty here makes me think we need to do a better job of documenting who is responsible for different aspects of Hypha’s development. Some of this information exists on the Hypha main page and the main documentation page, but we could make the division of labour more explicit (in the sense of saying “here’s who to contact if you have questions about:”).

It might also be a good idea to list the means by which we can contact each of those folks (perhaps a link to WeChat and their handle on the platform).

Questions for folks:

  • Do you agree that we should make this more explicit to the community? (For ease of generating consensus, please like the post if you agree)
  • Where should it go (main page only, main page + main docs page)?

Once we have answers to those two questions, I can create GitHub tickets in the relevant repository/repositories to begin the process.

  1. What kinds of reports do you currently ask for from applicants? (e.g. written long form, written short, automated surveys via SMS or email etc.)

Every 6 months we do an informal check-in via Zoom, phone call, or email. In this case the report end product is my notes on the call or the text of the email update.

When the project is completed they fill out this form: ARDC Final Grant Report

Every project is different, and we ask for different things at the 1-year mark. For example, a large satellite project may require a written report on each deliverable, whereas a scholarship grant may (a) be done already and (b) just need a list of recipients, where they went to school, etc.

  2. What kinds of questions do the applicants answer in these reports and to what extent do you value different kinds of questions/responses? (e.g. quantitative, qualitative and you value each equally)

You can view the questions here: ARDC Final Grant Report

The value of each question type depends on the project, so for the purposes of building reporting in Hypha, both are equally valuable.

For example, for scholarships, the number and amount of scholarships distributed by our partners is the most important measure of success. For a project where a club is putting up a repeater antenna, describing that they successfully installed the antenna is the most important part.

  3. What kinds of methods or tools do you either ask applicants to use to send reports or do applicants request to use to send you reports? (e.g. attachment files on emails, SMS on Signal etc.)

Attachments on email are fine, but we prefer to use a form (which right now is in Airtable). The ability to upload attachments is helpful for photos and for people who have long publications they want to share.

  4. What kind of follow-up do you do as a funder to reports from applicants? (e.g. we offer support, sign post for support from other places etc.)

We use reports for:

  • Publicizing information in our annual report, in blog posts, and at public meetings where we share about the work we’ve been doing. Please note that we ask specifically for information that can be shared publicly. This may be a subset of the report, or the whole thing, or none of it.

  • Using internal report info about lessons learned and challenges to talk with our board and reviewers about which projects need additional support, or which types of projects we should think about funding in the future.

  • (In future) advising new applicants about what makes a project successful and about common pitfalls.


@allan when you have a chance, do you think you could answer these questions from a DDP point of view?

  1. What kinds of reports do you currently ask for from applicants? (e.g. written long form, written short, automated surveys via SMS or email etc.)

  2. What kinds of questions do the applicants answer in these reports and to what extent do you value different kinds of questions/responses? (e.g. quantitative, qualitative and you value each equally)

  3. What kinds of methods or tools do you either ask applicants to use to send reports or do applicants request to use to send you reports? (e.g. attachment files on emails, SMS on Signal etc.)

  4. What kind of follow-up do you do as a funder to reports from applicants? (e.g. we offer support, sign post for support from other places etc.)