Digital Community Stewards Training

With growing access to reliable mobile and Internet connections, we've seen a related increase in the use of social media, including Facebook, Instagram, and Twitter. These platforms all offer a solution to the same need: the desire for human connection. Yet these platforms also come with responsibilities for users and administrators (admins), such as the duty to protect the human rights of those they interact with in virtual spaces.

That's why the nonprofit Search for Common Ground joined forces with Meta to host a workshop for social media administrators to better understand their rights and duties in moderating connections on these platforms and maintaining the safety and security of users who join their groups. (Note: Meta owns three of the most popular social media apps: Facebook, Instagram, and WhatsApp.) New Tactics' Senior Digital Media Coordinator, Ayman Malhis, attended the training in Dubai, along with 17 other admins from various countries and cultures, including Jordan, Lebanon, Egypt, Sri Lanka, Kyrgyzstan, Indonesia, Cameroon, Nepal, and Sierra Leone.

The workshop focused on the duties of administrators of digital communities (groups created through social networking sites) to protect their members. Facilitators also clarified the rights of members and suggested tools for maintaining the safety of these members. The training included nine units:

  1. Member Engagement: The aim of this unit was to hone technical skills to increase participation of members in their online groups. When members join an online community, they do so because they are interested in sharing their opinions on a specific topic or topics. Some want to listen and learn, and others want to actively share. The group administrator should be aware of and respect the diversity of interests and personalities in the group. On Facebook, one way to encourage member engagement is to use the badge system to distinguish members based on their participation in the group. For example, members can receive “New Member,” “Top Contributor,” or other badges that tend to increase activity in the group. 
     
  2. Trust And Connection: After identifying an online group’s audience, admins should work to build trust and communication within the digital community. Setting rules for communication in the group (e.g., transparency, responsibility, privacy, protection) can encourage appropriate interactions among members. Explicitly defining these rules demonstrates the group administrator’s commitment to promoting a safe space for exchange. Rules also increase the confidence of participants to communicate transparently and safely with other members. Some examples of group rules include:
    • Respect the opinions of others.
    • Do not spread hate speech.
    • Do not publish information without verifying it first.
    • Do not publish sensitive or private information.
    • Do not post information to harm or bully another person. 
       
  3. Understanding Disinformation: Another important aspect of group administration is understanding what constitutes false, misleading or harmful information that might be posted in the group. This information can be divided into three categories: 
  • Misinformation: This is false information that people share without realizing it is false or misleading. Members who share misinformation think they are sharing credible new information with the group. For example, during the COVID-19 pandemic, rampant misinformation circulated on Facebook about how the virus spreads.
     
  • Disinformation: This is false or misleading information that is intentionally shared for harm or gain. The motive may be the user’s financial benefit, political influence, a desire to cause trouble, or an intent to harm an individual person or entity. An example of disinformation shared in the training was the “Birds Aren’t Real” campaign. This campaign started as a joke claiming that the US government had replaced real birds with bird-shaped surveillance drones to spy on citizens (false information). However, this satirical campaign morphed into an advocacy movement demanding that governments stop spying on their citizens. Although the campaign ultimately had a political impact, deterring government surveillance and protecting citizens’ privacy, the information itself is misleading and can therefore be categorized as disinformation.


  • Malinformation: This is genuine (accurate) information that is shared with the intent to cause harm. This could include personal details, sexual images published without consent, or leaked emails intended to damage someone’s reputation. An example of malinformation is the controversial anti-abortion website the Nuremberg Files. The creator of the Files publicly posted the names, photos, and home addresses of more than 200 abortion providers and clinic staff members in the United States (a practice known as doxxing), which led to the murder of eight doctors.
  4. Navigating Disinformation: Once group admins understand the false, misleading, or harmful information that could be shared, they should develop a content strategy to address it. Information can be posted in various formats: images, videos, text, etc. This can make verification difficult because sometimes the text is accurate but the attached image or video is not, or vice versa. During the workshop, facilitators introduced tools for verifying the accuracy of information and the authenticity of photos and videos.

The important takeaway about mis-, dis-, or malinformation: users and administrators should always verify the correctness of any information shared and cite its source when possible. If incorrect or harmful information is shared in a group, the administrator may employ content moderation tools such as keyword alerts, removing the questionable content, removing the group member, and/or attaching links to verified information. Ultimately, if group admins have established appropriate rules (see #2), those rules will act as a content moderation strategy that allows for the removal of false, misleading, and potentially harmful information.

  5. Non-Violent Communication: Facilitators also discussed the difference between hate speech and freedom of expression.
    • Free Speech: Free speech refers to the right to seek, receive, and share information and ideas with others. But this freedom must be exercised responsibly, and it can be restricted when speech threatens others or encourages hateful activity.
    • Hate Speech: Hate speech, particularly online, targets specific groups of people, often minorities, and dehumanizes them. Hate speech perpetrators often see “the other” as the enemy.

Training participants learned methods of nonviolent communication with group members to respect differences and diversity.

  • Observations: Nonviolent communication emphasizes what you have noticed without passing judgment on the member of your group. For example, instead of saying, “You have abandoned our group and never post anything anymore,” you could say, “I noticed that you don’t participate in the group as much as you used to.” This is a more friendly and acceptable way of communicating with other members.
     
  • Needs: As a group admin, you need to understand the needs of your group, starting with the other admins. Are they feeling anger or frustration? For example, if one of the admins feels angry about some group members' interactions, you may need to dig deeper and address the causes of that feeling. Is that admin unsupported by the other admins? Is that admin overwhelmed by the responsibilities? Does that admin feel unseen or unappreciated?
  6. Digital Safety: It is nice to live in a day and age where we can exchange ideas virtually without having to be in the same place. However, with the ease of digital community formation, there is added exposure to intrusions and attacks from unwelcome individuals. Therefore, online group administrators must hone their digital safety skills to mitigate risks to their community members. It is important that admins be aware of the types and methods of attacks and ways to protect group members from them. For example:
    • Lurkers: These are the silent members in groups. They are not active; they read or watch community content but don’t contribute. Lurkers can comprise as many as 80% of group members. Some lurkers may not be friendly and could potentially be leaking information about group members. It is important that group administrators be aware of this possible security risk.
       
    • Termination: While leaders have a great deal of control over their group’s activities, the social media platforms themselves ultimately decide what is allowed to happen on their platforms, including whether groups can exist at all. A common fear is that a platform could unilaterally remove a community. Some common concerns shared by admins of social media groups include: “They can just literally wake up and say, ‘OK, this isn't going to happen anymore’”, “Two years of work and contacts would be lost”, and “Alright, you can't have a group anymore, or you have to start paying (like Twitter)”.
       
  7. Digital Leadership: Anyone can create a digital group, but not everyone can, or plans to, manage one. Therefore, it is necessary to understand how to exercise leadership in the digital community and respect diversity without bias or discrimination. The training suggested that administrators find suitable people within their online groups to serve as co-administrators or moderators, rather than remaining just members. The larger the content management team, the better the user experience will be.
     
  8. Growth & Inclusivity: The training also recommended tools to increase the number of group participants and their engagement. Some examples of these include: 
    • Using distinctive designs, such as adopting a specific image or logo for the group, and common “branding” guidelines;
    • Finding ways to collaborate with other groups that are interested in similar topics to encourage cross-group exchange.
  9. Digital Rights: At the end of the training, we reviewed users’ digital rights and the standards set forth by companies like Meta to protect them. For example, Facebook has 22 standards for online communities that are divided into five categories: 
  • Violence and Criminal Behavior
  • Safety
  • Objectionable Content
  • Integrity and Authenticity
  • Respecting Intellectual Property

    These standards are continuously updated by Facebook to serve the users of digital communities, protect their rights, and ensure their access to information.

Across the nine units described above, we built on our understanding of digital communities to improve our own roles as administrators, as well as to better understand the responsibility of the companies that provide these platforms to protect digital rights. Ultimately, all parties involved have a duty to protect the human rights of those they interact with in virtual spaces. This can and should be done at every phase of the process, from online group creation to moderation. It means having a clear purpose for creating an online group, setting forth rules that members must abide by, and continually cultivating a safe space for members to share within the group. Digital communities are important spaces for activists and human rights defenders; these spaces must be preserved.

 

This perspective was contributed by Ayman Malhis | MENA Digital Media Senior Coordinator