Behind the Scenes

Furthering Action For Community Safety: the Anti-Harassment Working Group at Art+Feminism

Mohammed Sadat Abdulai is a statistics tutor at the Madina Institute of Science and Technology. He is a longtime Wikipedian and Wikimedia community leader in Ghana. He has been involved in various governance bodies of the Wikimedia Foundation, including the Wikimedia Foundation Board Election Committee, the WikidataCon 2017 Program Committee, the Wikimedia Volunteer Response Team, and the Wikimedia Foundation Communication Committee. He is a co-lead of the Art+Feminism Wikipedia campaign and co-editor of the Wikimedia Research Newsletter.

Amber Berson is a writer, curator, and PhD candidate conducting doctoral research at Queen’s University on artist-run culture and feminist, utopian thinking. She is the programming coordinator at articule, an artist-run centre in Montreal. She is a co-lead of Art+Feminism and is the current Wikipedian in Residence at Concordia University. 

Melissa Tamani is an art historian who graduated from the National University of San Marcos (Lima). She’s developed a career in the fields of cultural management and museum education at various cultural institutions in Peru. She’s been part of Art+Feminism since 2015, formerly as Regional Ambassador in Latin America and currently as Lead co-organizer. She is also a promoter of Wikimedia culture in Peru, organizing and taking part in campaigns focused on gender, human rights, and ecology.

Art+Feminism founded the Anti-Harassment Working Group (AHWG) in the Fall of 2019 in response to the different forms of harassment experienced on Wikipedia by female-identifying, trans and non-binary contributors and by members of our leadership collective, as well as to the reports we have received from members of our community over the course of our organization’s history. We know from experience that Wikimedia’s online community is far from being a safe space for everyone, which is why we decided to create a group specifically charged with developing tools to prevent and document incidents of harassment or misbehavior, and with guiding users through the tools the Wikimedia Foundation already has in place.


Grafted into the leadership and objectives of Art+Feminism, a non-profit organization founded in 2014 that addresses inequality in the coverage of gender, feminism and the arts on Wikipedia, the AHWG empowers and protects contributors through strategies and tools that help our community deal with and prevent online harassment that may occur as a result of their involvement with Art+Feminism projects. This includes our Safe/Brave Space Policy, which must be honoured by all those who organize an editathon under the Art+Feminism banner.

Art+Feminism Wikipedia Edit-a-thon at The Museum of Modern Art, New York, 2019. CC BY-SA (https://commons.wikimedia.org/wiki/File:MoMA_Art_Feminism_2019_90.jpg)

As the work of Art+Feminism and the AHWG enters a new cycle, we have initiated a line of work to develop these strategies and tools. An important question for the Art+Feminism AHWG is: what are the meeting points between the gender gap in participation on Wikipedia and the conflictual nature of interactions between Wikipedians? A recent study by researchers Amanda Menking, Ingrid Erickson and Wanda Pratt suggests why this question is central to our work[1]. The researchers conducted a series of interviews with female-identified Wikipedians and found that the respondents’ perception of safety directly shaped their editing practices. Experiences such as having their edits reverted, conflict, criticism and harassment on Wikipedia led the interviewees to adopt a range of tactics. These included avoiding controversial topic pages in order to sidestep conflict or “Wikidrama,” and doing emotional work by adapting the outward expression of their emotions so as to navigate spaces on Wikipedia that they found unsafe. By making visible the difficulties women face within Wikipedia, the authors posit that the gender gap is also a consequence of a culture that does not prioritize the safety of all its members.


Discussion and respectful conflict are not just a reality of the editing culture on Wikipedia – they are encouraged behavior. That is why there are codes of conduct for all language versions of Wikipedia, such as the English-language Wikiquette, and why the interface includes a discussion page for each article. One of the pillars of Wikipedia (there are no firm rules!) is to operate in good faith. This means that one should never make personal attacks, or specifically and disrespectfully target new users. Yet in many of these etiquette guides, respect between editors is framed more as a way to edit more effectively than as an objective in itself. For example, the English Wikipedia etiquette page reads, “treating others with respect is key to collaborating effectively in building an international online encyclopedia”[2].


The community of volunteer editors contributing to Wikipedia is a self-regulating body. Therefore, the various codes of conduct, reporting processes and sanctioning measures are defined by the community itself. In most cases, the sanctioning role is assumed by the administrators: users who are chosen by the community and who have special technical permissions that allow them, among other things, to block accounts of editors who repeatedly violate conduct policies.


It should be noted that not all language versions of Wikipedia have a specific policy on harassment. The English, French and Italian versions, for example, have such a policy, which defines harassment, describes its types, explains how to proceed if one is being harassed, and sets out the sanctions that exist for harassers. In other cases, for example the Spanish and Portuguese versions, users only have access to translations of the English policy in the form of “wikiessays” or policy proposals, though there have been attempts to generate discussion leading to the adoption and implementation of translated versions.


The first action that we embarked on as the AHWG was to think about how we would handle the harassment reports we receive. This starts with creating a clear channel of communication that is easily accessible to anyone who wants to contact us to describe an incident they have experienced or witnessed, thereby creating an enabling environment that allows victims to voice their feelings. For this purpose, we created an email hotline (safety@artandfeminism.org) where people can report incidents of harassment. The hotline information and security toolkit can be found on our website, along with descriptions of the types of help we are able to offer. The AHWG team collectively speaks English, Spanish, French and Hausa, and we will make the effort to find translators to help with reports we receive in languages we do not know. We believe that providing clear communication paths for victims to reach out to somebody, irrespective of their geography, will make them feel less isolated. We will regularly share the weblink (https://www.artandfeminism.org/#safety) on our social media pages to increase awareness of our work and of existing volunteer-led platforms addressing harassment on Wikipedia that editors may not know about. We know that many security tools and guides exist, but they are difficult to locate and are written in the same interface that new users are still learning, which makes them inaccessible to many. Our aim is to make this information radically clear and accessible to users of any knowledge level.


When faced with reports of harassment, our previous advice has included reaching out to Trust & Safety, filing a harassment claim, using the available Wikimedia tools for harassment claims (such as the Editor Interaction Analyzer or the Interaction Timeline) and documenting the experience. We believe that documentation, as well as transparency and clear, respectful communication – all pillars of Wikipedia and of Wikimedia more broadly – are important tools for moving forward with this and any future situations. We recognize that in some instances harassment can persist even when a user makes use of the tools at their disposal. It is not part of our mandate, nor is it within our capabilities, to change the guidelines and policies of Wikipedia (or of other Wikimedia projects). Single users, chapters or groups such as ours cannot change policy without community consultation. What we can do is advocate for changes to these guidelines and policies.


At the same time that we are developing a protocol to guide our steps in handling harassment reports, we are also putting together a toolkit on security on Wikipedia, which will hopefully help mitigate future incidents. The toolkit advises editors on what personal information to share about themselves, beginning with their choice of username and the amount of content they place on their user pages that could be used to identify them. After defining what types of cases qualify as harassment, and as a way to attach degrees of urgency and response turnaround times to each case, the AHWG team deliberated on when to ask for help or make an enquiry, providing a list of tools currently available for making a report to the community or to the Wikimedia Foundation. The document will also outline common triggers of disputes on Wikipedia, such as nominations for deletion and “edit warring,” and will include a section on how to avoid these situations and what to do if you are already involved in one. Finally, among our objectives is to provide a disclaimer that succinctly clarifies our capacity and ability to take action on the reports we receive.


The internal protocol for receiving and responding to harassment reports is currently under review by the lead co-organizers, and the Wikipedia security toolkit has been circulated for comment to the entire Art+Feminism leadership collective, made up of regional ambassadors, lead co-organizers and staff. We are looking forward to offering these documents to the Art+Feminism community for the 2020 editathon season.
