Day 2 - April 24 - Methods and Resources


Hello, and welcome to day two!

Today I am interested in specific tools for measuring the success of advocacy initiatives in the global South. Feel free to answer one or both of these questions. Thank you!

 

  1. What have you asked activists to do to measure the success of their campaigns?

  2. What kind of resources for evaluating advocacy campaigns in the global South have you used or pointed your grantees towards?

So I answered day 2 on day 1!

I have a related follow-up to something David said yesterday: "despite the long delay in seeing the legislation actually approved, I already see the campaign as something of a success." This is common: the long-term nature of policy change. I am interested in resources and tools you have found useful for making the connection between the work you support and shorter-term situations and outcomes when the ultimate goal is not yet in play or in sight.

 


Yes, you did provide some great resources yesterday! And yes, we have also had many discussions of how very long-term the outcomes of advocacy campaigns can be.

Some thoughts

The short answer is that I don't ask partners a lot of questions about measuring their success! Mainly because achieving success is usually such a long-term endeavor compared to the timeframe of the proposal and the grant, which tends to be short even when it is a two- or three-year grant ... Occasionally, as mentioned before, there is an advocacy win in the form of a piece of legislation or a case won in court, but in the big picture much more is required to make a real difference.

Thinking about tools, opinion polls are one thing some partners are using to get a feel for the views of the general public on certain issues, and whether positions have evolved. In terms of campaigning and communications, I usually ask about numbers - how many views of a video clip, how many followers on social media. But that's not the same as measuring what change and impact will result from the campaign.

I'd be keen to hear what others' experiences are.


Thank you, Soheila! Yes, as others have stated, the policy change goals are so long-term that they are impossible to track during a regular grant cycle. It sounds like you have used some measures for tracking public opinion and engagement with a campaign, which is an outcome as well, even if not the ultimate change you are looking for.

Have you seen any of your partners come up with their own measures of success during their campaigns?

Opinion polls

Hi, opinion polls sound like a big undertaking. Can you give an example of who has used them, who did the actual polling, and briefly what is involved? We support work in areas where this could be a useful component of a learning agenda.


I would also be interested in details of how this works, thanks!


I came across two examples of opinion polls over the past few months, and you're right that they were big undertakings. In both cases, the organization actually worked with a polling company (the polls were about the use of drones and how the public and political parties felt about it), and I think that came at a cost. I am not sure if these companies, like advertising companies, are sometimes willing to work pro bono. But resorting to their expertise helped ensure the polls were done correctly and efficiently in terms of the question grids, sampling, and presentation of the findings.

Follow-up on Irit's comment from Day 1

I saw this comment from Irit yesterday about her process for gathering data from grantees that is relevant to today's discussion:

Thanks for the question, and the discussion. At AJWS we have in-country staff and consultants. They often come from the movements our grantees are engaged in. They provide grantees with what we call 'accompaniment', which includes a broad range of tailored, needs-based support. They know our grantees' work well. As part of their work, they also document and report on progress in a narrative form on an annual basis. These staff and consultants, along with our Program Officers, provide the information, reaching out to grantees as needed. We developed this tool as a complement to our other reporting; it provides qualitative progress updates on how the movements and communities are coming together and advancing their issues.


Irit, to comment on your note here - would you be willing to share with the group the reporting tool (or parts of it) that your in-country staff and consultants use for gathering this data? It sounds like it has worked very well for gathering the data you need (in combination with the right people administering it)!


Of course. The reporting tool, which we call the Outcome Monitoring Form, is currently being revised. The form itself is being simplified, but the process will be more structured. I am happy to share the first iteration as well as the revised version. I am also happy to share the mapping tool, and would welcome feedback. We have rolled it out, but anticipate staff updates each year. This will provide us with an opportunity to make minor modifications.


I would love to see the first iteration, and the mapping tool, as I'm sure others would as well. It sounds like your tools have some concepts in common with outcome mapping, which is another method we have been looking into. Thank you!

Today's questions

In response to today’s questions – sorry for my delayed input today! We try to minimize our asks of grantee partners beyond their proposal and annual report on progress. We also recognize that the change they seek is something to which the constellation of our grantee partners and many others contribute. We really rely on our in-country staff and consultants for measurement. These colleagues and our Program Officers provide grantee partners with accompaniment, including relevant resources. Resources are usually understood broadly, and include facilitating introductions to other organizations and/or stakeholders, peer exchanges / learning opportunities, capacity-building grants, etc. I’m actually not sure which written or digital resources are shared, but I will see what I can find out…


Thank you that would be great!

One more resource for the global South

Advocacy Accelerator is worth a look (advocacyaccelerator.org). I was involved in some conversations during the planning phase and was not sure they'd gain traction. However, the website seems useful; they have implemented a series of webinars, and a job description focused on curriculum development and facilitation was posted on Pelican yesterday. So they seem to be emerging as a viable educator/trainer/coach around advocacy and advocacy evaluation.


Thank you, Jackie! Yes, I have looked at the Advocacy Accelerator a bit as well. It looks like they are building some momentum in the field. They could also be a potential partner in what we are building.

Adding some info on our tools...

A few additional responses I wanted to add regarding how we measure our work:

  • For the Lifeline short-term advocacy grants I wrote about in the first day's response section, we use a mixed-method monitoring and evaluation approach: Freedom House and our partners conduct ongoing data collection to assess the outcomes and impact of the Lifeline program. Narrative reports are completed by all partners and grantees immediately following the advocacy work, and surveys are filled out by all key stakeholders 3-9 months after completing the grant. These reports continue to provide insight into the types of short-term advocacy initiatives and strategies that are most effective in the restrictive environments in which Lifeline offers assistance. The use of two follow-up reports has also been beneficial, as impact reports are now yielding information on outcomes that had not yet been achieved during the narrative reporting stage (upon completion of the grant). Initiatives have resulted in a number of outcomes, including improvements in organizational capacity, formation of partnerships or coalitions, media coverage, and policy advances.
  • We also use media analysis quite a bit to try to understand the effect our advocacy or awareness campaigns are having. For example, in Venezuela, FH supports organizations to document cases of human rights violations and to report and disseminate the findings more effectively, in order to better inform the Venezuelan public and international community about the deteriorating human rights situation and advocate for policy changes. To adequately assess the impact of the project and ensure it benefits from evidence-based adaptive implementation, Freedom House undertook an effort to more consistently monitor media coverage of human rights organizations' focus issues and themes. The team developed a media monitoring tool to track coverage in specific outlets on a weekly basis. Outlets were selected with the goal of representing a diverse range of editorial leanings, including outlets that typically present pro-government messages. Freedom House currently tracks 5 print and 5 digital media outlets. This style of monitoring allows us to understand when and how the human rights organizations (HROs) we support are being cited and the “reach” and accessibility of our information products, to examine linkages with investigative and professional journalists, and to determine the influence our work is having on decision makers and policies, both locally and abroad. We have found that over the last two years, supported HROs have documented more than 1,500 human rights violations, increased domestic media coverage of human rights issues in monitored outlets by more than 50 percent, and increased the number of times they’ve been cited in recommendations and reports issued by the UN, OAS, and international NGOs like Human Rights Watch and Amnesty International. For instance, reflecting the improved quality of the evidence-based reports partners presented to the Inter-American Commission on Human Rights (IACHR) over the past year, supported HROs achieved an increase of more than 90 percent from 2014 to 2015 in the number of citations of their products in the IACHR’s annual report on Venezuela (from 144 to 278). These are obviously not policy changes, but we see them as incremental steps toward achieving a policy change.