Without Fear?

Exploring online civic space participation by marginalised women in India 

Women activists and political organisers who belong to marginalised groups and challenge oppressive social orders often face state scrutiny, identity-based delegitimisation, sexual harassment and abuse in India’s online civic space.

This online civic space also seems to be ‘shrinking’ due to the increased criminalisation of dissent, social media censorship, internet shutdowns, troll and bot manipulations, and widespread hate against religious minorities and oppressed caste groups.

While such ‘shrinking’ is often assumed to repress all civic space actors equally, women organisers belonging to marginalised groups often bear disproportionate impacts and heightened abuse. This is likely due to the reproduction of social power structures within the civic space (including online), and to marginalised groups’ limited access to legal, medical and financial aid, political power and social networks of influence.

Marginalised women have been historically excluded by the mainstream Indian feminist movement, which is framed around an archetypal Hindu, upper-caste, cis-gendered, urban, middle-class woman. Since proportionally few marginalised women have access to participate in India’s online civic space, any shrinking of that space disproportionately affects them, as they are already underrepresented.

This qualitative, exploratory study examines marginalised women’s participation in the online civic space through in-depth interviews with 12 participants.

 

Findings

Censorship and self-censorship

One participant reported censorship attempts by state actors, while another described feeling a direct and indirect state presence through the surveillance of her livelihood. Nearly all participants reported practising ‘self-censorship’ due to state surveillance, criminalisation and online speech repression. Such ‘self-censorship’ was directed not by their ‘free’ will but by the fear of possible state repression. Participants were habituated to being hypervigilant about the content they shared in the public domain and its tone, constantly carrying out mental risk assessments of the limits within which they could express their opinions without getting into trouble or facing further repression.

Delegitimisation and harassment

Two-thirds of the participants faced online sexual harassment from platform users. Participants reported attacks on their identity through casteist, Islamophobic, homophobic and transphobic remarks; misogyny and collective trolling; unauthorised access to and use of personal information (e.g. morphed photos); and hateful messages in their inboxes. Participants reported increased harassment when the content they shared received more visibility or had higher reach.

Powerlessness and impact on personal life

Participants reported feeling varying degrees of fear and powerlessness, inseparable from their marginalised identities and their lack of access to capital or influential networks. Several participants expressed the fear that they may be subjected to legal proceedings or unjust incarceration. They raised concerns about risks to their family and friends by association; doxing; account takedowns and the consequent loss of networks; and the wider implications of state persecution, such as impacts on livelihood, future employment and the pursuit of higher education.

Impact on mental health

A majority of participants reported adverse impacts on their mental health due to online harassment by platform users and hostile interactions with state actors. They described experiencing trauma, triggers, hurt, depression, anxiety and shock. Some participants had taken social media breaks for their mental health. Without support systems such as publicly funded mental health facilities, participants’ mental health risks remained largely unaddressed.

Inadequate support from reporting mechanisms

All participants reported receiving inadequate redressal from online reporting mechanisms. They highlighted that reporting mechanisms do not account for context, are limited by being designed to censor only specific words or phrases, and are content-agnostic, which enables the censoring of human rights abuse documentation.

On approaching law enforcement

A majority of participants reported that they did not feel comfortable approaching the police for online harassment. This is unsurprising given the police’s historical and present role in enforcing social hierarchies.

Precautionary measures

In order to navigate the unsafe online civic space, participants reported making their accounts private and refraining from sharing their personal information, work or field information, and physical location. Participants did not necessarily have greater awareness of, or access to, digital safety and privacy.

Steering online discourse

Participants reported that the mainstream Indian feminist movement was exclusionary. They shared that online civic spaces were often captured by privileged persons who offered conditional allyship or spoke on behalf of marginalised women. Some participants shared that they were slotted into specific, narrow categories and work domains. Participants also reported the risk of having their labour appropriated by bigger accounts run by privileged persons, and identified algorithmic features and technological tools as facilitators of such erasure and appropriation. Lastly, participants reported that online discourse on specific movements has only recently begun to be steered by the communities themselves.

 

Way forward

This exploratory study recommends:

    1. Systematic, comprehensive and disaggregated documentation of abuse which captures the particular experiences of organisers in their self-determined, intersectional identities;
    2. A disaggregated and longitudinal study of vulnerabilities and risks from online abuse to help determine appropriate support and redressal strategies;
    3. Further research about platform governance (including its purpose), platform architecture and the political economies of platform profits and state patronage;
    4. Building diverse and specialised networks that provide safety, legal, medical and technological support to the different groups of marginalised women online;
    5. Studying access and power within the online civic space and the feminist movement to help dismantle power hierarchies; and
    6. Studying the exercise of police powers, including police discretion, online.

The complete report can be freely accessed here under CC-BY-SA 4.0.

What we owe to each other: a user-focused model of tool development – TBP at RightsCon

We presented our learnings on the usability of Tails in a country with heightened surveillance and reduced freedom of expression, where many users struggle with unreliable internet connectivity. We hope that this and other such efforts will encourage tool builders to do similar work on their own tools, to see whether they are built to suit the needs of their users.

TBP at UX Forum 2022

Members of our collective participated in the UX Forum 2022, held from 25 to 30 April 2022. The UX Forum is an effort to bring together human rights defenders, digital security trainers, auditors, software developers, designers, and funders to explore human-rights-centered design in the open source privacy and security community.

The members of our collective hosted the following sessions on Day 1 of the UX Forum:

  1. Tailormade? – Chinmayi S K and Vasundhra Kaul delivered a lightning talk on learnings from the usability study of Tails in India. This was based on the study “Tailormade? A study on the usability of Tails in India”.
  2. Low Connectivity UX – Chinmayi S K hosted this session along with Evie Winter. This workshop focused on the experiences of practitioners in low-connectivity regions. There were discussions around tools and platforms and their effectiveness in low-connectivity regions, as well as conversations on how these could be improved further.

To read more highlights from the Day 1 discussions at the UX Forum, please check this Internews blog post.

Tailormade? A study on the usability of Tails in India

Ordinary citizens are now more conscious of the need for additional security, and have started thinking about privacy in more concrete terms. The need for accessible tools and training is most evident among those who use shared devices and want to keep their time on the device secure and private, as well as among other, more vulnerable groups of persons. Tools for anonymous and secure conversations, especially when used on shared devices, can benefit all of these users.

Keeping in mind the varied needs that arise from a multifaceted population, we decided to train a selected group of persons to use Tails for their work. In the process, we also conducted a study to document the usability of Tails.

Although our study participants came from different backgrounds, there were a few unifying factors: they were all non-technical or casual digital users. Over four months, we familiarised participants with the basics of Tails and recorded their usage of the platform for their work.

In our study we found that Tails is a great platform for non-technical users in most instances and has significant potential for use in a country like India. However, adoption will depend on how well Tails adapts to local circumstances. As a result of our study, we made the following recommendations:

  1. Chipset Support

Participants with Apple M1 processor chips were unable to install or run Tails on their systems. We recommended that support for these chips be added in the future.

  2. WiFi

Multiple participants faced problems while connecting to the internet, and two of them were unable to connect at all. We recommended that the documentation cover troubleshooting for Wi-Fi connections, and that the Wi-Fi symbol appear even when there is no connection.

  3. Search feature

Participants wanted an easily visible search bar to filter through files and folders.

  4. Keyboard shortcuts

The participants requested more shortcuts that are commonly found in other operating systems. These included the option to cut, copy, paste, undo, redo, switch between applications, select all, etc.

  5. Glossary or basic technical information in the Tails documentation (preliminary sessions)

We received feedback requesting a separate session to go over basic concepts before the actual training began, or additional resources for reference.

  6. Verified Tor Bridges

There is a need for more verified bridges for this region. The bridges obtained from https://bridges.torproject.org/ failed to connect on the Tails platform, as did the bridges we obtained from the Tor team for the study participants. We recommended that more verified bridges be made available. (A sample bridge line is shown after this list.)

  7. Video conferencing or chat support

Across the participant group, several spoke about the need for a chat platform (Signal, for example) and video-conferencing support.

  8. Graphics

There was a persistent concern that using Tails on a public computer would make the user stand out as it looks considerably different from current operating systems. We passed this along to the Tails team.

  9. Startup tutorial

While circulating training material is a possibility, some participants wondered if it would be possible to have a tutorial option on the screen. This tutorial could take the user through some of the features we covered in the training sessions and make the initial use of Tails less intimidating.

  10. Localization

Additional translation for at least five commonly spoken Indian languages would be a good start to introducing Tails to more regional users.
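For context on recommendation 6: a Tor bridge is specified as a single line that the user pastes into Tails’ Tor connection settings. An obfs4 bridge line has roughly the following shape; the address, fingerprint and certificate below are placeholders, not a working bridge:

    obfs4 192.0.2.10:443 0123456789ABCDEF0123456789ABCDEF01234567 cert=<base64 certificate> iat-mode=0

Lines of this form, obtained from https://bridges.torproject.org/ and from the Tor team, were what failed to connect during the study.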

Once the draft report was ready, we shared it with the Tails team. We then had a conversation with Sajolida, a Tails team member, to discuss the recommendations. Some of our comments related to topics that the Tails team had already flagged and were working on, including making the documentation more relevant to non-technical users and localizing the material for India. They were open to adding a glossary to their documentation and to looking into onboarding. We discussed the possibility of introducing custom themes and backgrounds, incorporating keyboard shortcuts across applications into the documentation, and working on other general usage issues. They expressed hope of being able to work on running Tails on M1 chips once the Linux community takes the lead.

 

Tailormade? A study on the usability of Tails in India

The complete report can be freely accessed here under CC-BY-SA 4.0.

Community fellowship program for grassroots trainers

The Bachchao Project conducted a training of trainers for women, trans* and queer individuals, and others from LGBTQIA+ spaces. Trainees were chosen through a closed call circulated in various underrepresented communities. Nine trainees were initially chosen, of whom six completed their training. Five of them went on to do the Safe Sisters fellowship program with The Bachchao Project.

The Bachchao Project, in partnership with Safe Sisters, conducted the India Fellowship Program from August to November 2021. This cohort included five fellows, shortlisted for their diverse areas of work with at-risk and/or underserved communities, who by virtue of their identities are exposed to many unique risks.

An initial training period allowed the fellows to begin understanding and responding to security challenges they may face in their work and daily lives. The aim of the program was to enable them to secure themselves and to pass on these learnings to the communities they work in. The focus was on holistic security practices rather than tool usage, and fellows undertook activities on needs assessment, risk assessment and threat modelling before moving on to possible interventions. Since the fellowship was based in India, the initial training was localised to the country as far as possible. We also held a session with prior Safe Sisters fellows from other countries to underline the feeling of community and support. As the fellows spoke to each other, common interests became apparent, and resources and advice were exchanged.

The common thread that bound everyone together was the sharing of stories of online harassment, the lack of access to justice, and the positive impact that fellows could see their work having, especially in underserved communities. Fellows also exchanged notes on challenges faced while training diverse stakeholders, and underlined the importance of being mindful of the biases and shortcomings we may carry as individuals into the training space. This again highlighted the importance of being intersectional in every aspect of our work, including how we speak, train and otherwise engage with different communities.

After the initial training period, we conducted additional trainings for fellows who wanted more information on certain topics. The fellows were encouraged to set up their own trainings while keeping the fellowship trainers in the loop. We held follow-up calls with all the fellows, discussed their plans for the communities they work with, and assisted them in setting up workshops where they needed help. All of the fellows successfully carried out needs assessments for their target groups and were able to carry out trainings where required. They can now take this knowledge and skill back to their communities.

AK:

“Given the current pandemic, when more and more people are online and the internet is used in a variety of ways – I feel I have benefited immensely by the fellowship as it has not only made me understand how to be safe in the digital space but also empowered me to help others from more vulnerable communities and spaces. The fellowship helped me understand how to give support in a structured fashion and I am equipped to assist others in being equipped and safe in the digital world. While firefighting skills are necessary, it’s much better to take certain precautions from the beginning in order to minimise risk.”

Arunima N:

“The module on online dating and gender-based violence was entirely new to me. I liked the tools we were given to express ourselves in the context of dating while keeping parts of our digital identity safe from being mined by dating companies. I also liked the conversations we had as a part of this module, particularly on communicating to a potential romantic partner why digital privacy is important to you, and to see if the other person respects this principle of ours. That was a personally illuminating conversation to witness, between the trainers […] and the participants”

Brindaalakshmi K:

“Most importantly, I’m grateful for the space that we had as fellows during the course of this fellowship to ask questions and ask for extra resources. It was a safe learning environment. Feeling safe in a learning environment is a high priority for me while learning anything. I appreciate the patience and the effort of the trainers in always holding space for the fellows. That made a huge difference to me especially while learning tools that are absolutely new. It made the process less intimidating”

“I have had many learnings from this fellowship, not one. But the most important lesson that I have learnt is that digital security and consciously practicing safer methods is a way of life and a lifestyle change. It is taking me time and I cannot expect the people that I do workshops for to change their habits overnight. Over time, I have known this even through my work. But this fellowship has made me realise that digital security doesn’t have to be a dark and bleak thing. We as individuals have more power than we realise even while using automated technology, which often feels larger than life to most people. Many generations of people are still adjusting to using technology. Safety rests in recognising this power that we hold and making conscious choices”

Ravalisri V:

“The 2-day program on using dating applications was one of the major learnings. It provided a platform to share our experiences and all the safety measures to be taken while using them, along with the necessary action to take when someone faces problems from others”.

Chinmayi Shrivastava:

“A fantastic experience!

In addition to all the insights I gained on digital security, I have also walked away with a newfound confidence for digital security challenges that I might face in the future in my work and daily life. Having practised the tools myself along with the training sessions conducted as part of the Safe Sisters fellowship, I definitely feel more secure online, which is the first step for me to conduct my daily work-related and personal activities online without being scared and anxious at the thought of losing my data or my data ending up in the wrong person’s hands.”

[Event Report] India, Let’s build the list

The Bachchao Project, in partnership with OONI, hosted an online event on 9 and 10 October 2021 to update the Citizen Lab test list for India. The event, called “India, Let’s build the list”, was organised to help strengthen community-based monitoring of internet censorship in India. It allowed experts from different fields to contribute to a curated list of websites that are relevant to India and are regularly tested for censorship by volunteers in India.

Censorship in India, specifically online, has been evolving steadily since the notification of the Information Technology Act of 2000 and its associated rules. Though the Act itself offers multiple ways in which the Government can remove content and/or block access to it (including shutting down internet services), very little data is available to confirm whether due process is regularly followed in these matters. This raises serious concerns about its impact on Indian citizens’ right to freedom of expression and access to information.

While many such blocked sites may fall into the expected categories of illegal streaming, adult content, file sharing and so on, research has shown that internet censorship in India also impacts a wide variety of other sites, such as news media and human rights sites. This list-building and monitoring activity is therefore crucial for us, as citizens and as a community of digital rights practitioners, to safeguard the essence of a free internet and uphold the rule of law.

One open software project that aims to increase the transparency of internet censorship (and other forms of network interference) around the world is the Open Observatory of Network Interference (OONI). To this end, the project builds free and open source software – called OONI Probe – designed to measure various forms of network interference.

A recent study used the OONI Probe testing software to measure the blocking of websites in various regions of India (such as Manipur and Bangalore) from January 2019 to January 2020. It found that while 136 sites from the Citizen Lab test list for India were confirmed to be blocked, the major discrepancies in access were between ISPs rather than between regions. A large number of media outlets also seemed to be targeted for blocking.
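Measurements collected by OONI Probe users are published as open data, so anyone can check how a given site has been testing from India. The sketch below queries OONI’s public measurement API for recent web-connectivity results; the endpoint, parameters and response fields are assumptions based on the API as documented at the time of writing, and may change.

    import json
    import urllib.request

    # Fetch recent web-connectivity measurements reported from probes
    # in India (probe_cc=IN) via OONI's public measurement API.
    url = (
        "https://api.ooni.io/api/v1/measurements"
        "?probe_cc=IN&test_name=web_connectivity&limit=5"
    )
    with urllib.request.urlopen(url) as response:
        data = json.load(response)

    # Each result notes when and from which network (ASN) the
    # measurement was taken, and whether an anomaly was flagged.
    for result in data.get("results", []):
        status = "anomaly" if result.get("anomaly") else "ok"
        print(result.get("measurement_start_time"),
              result.get("probe_asn"), result.get("input"), status)

An anomaly flag is only a signal of possible interference; individual measurements still need review before a site is counted as confirmed blocked.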

As of now, a relatively small community in India reviews and contributes to the Citizen Lab test list for India, which means that it’s entirely possible that we are not looking at all the possible thematic areas in which website censorship may be happening.

It therefore becomes essential that more people from varied backgrounds and fields of interest support such open source testing for censorship. By reviewing and contributing to the Citizen Lab test list for India, you can help ensure that a broad range of relevant websites is tested, and that the censorship measurement data collected from the testing of these websites is more comprehensive, robust, and timely. This will enable citizens to ask important questions of lawmakers and even mount legal challenges when necessary.

To this end, on Day 1 of our 2-day workshop, our OONI partners facilitated a session (“Introduction to Internet censorship”) which introduced participants to key concepts around internet censorship and how website censorship is implemented, with the goal of ultimately highlighting the importance of contributing to the Citizen Lab lists of websites that are measured for internet censorship. For the purposes of this workshop, the following forms of censorship were kept out of our scope:

  • Censorship on social media platforms
  • Internet outages/blackouts/shutdowns
  • Takedown requests
  • Online trolling
  • Self-censorship

We used these two days to look specifically at websites that may have been, or could be at risk of being, blocked by Internet Service Providers (ISPs). The group discussed the recent history of internet censorship, particularly the blocking of sites under Section 69(A) and Section 79 of the IT Act. We also reviewed existing research and public advocacy efforts with regard to internet censorship in India.

The concept of the Citizen Lab Global Test List and India Test List, both hosted on GitHub, was introduced to the group. These lists are compiled and maintained as a voluntary global effort to monitor website censorship. The India test list has over 600 URLs, which fall under many of the Citizen Lab’s 30 standardized categories.
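Each list is a plain CSV file in which every row holds a URL, its category and some metadata. As a minimal sketch, assuming the standard test-lists column layout (url, category_code, category_description, date_added, source, notes), the following tallies how many URLs the India list holds per category, a quick way to spot the kind of imbalance described below:

    import csv
    from collections import Counter

    # Count URLs per Citizen Lab category in the India test list
    # (lists/in.csv in the citizenlab/test-lists repository).
    counts = Counter()
    with open("lists/in.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row["category_code"]] += 1

    for category_code, total in counts.most_common():
        print(f"{category_code}: {total}")

Running such a tally before and after an editing session is one quick way to see how balanced the list is across categories.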

A review of this list showed that it was not balanced in terms of the number of URLs in each category, and that it needed an update based on recent events in the country. Our workshop was specifically aimed at rectifying this and making the list more comprehensive and inclusive of the myriad concerns of the citizens of our country.

A few of the participants shared their own experiences with state censorship and their work on these issues. One of them presented a list that they had compiled by testing for DNS hijacking of sites, specifically on the ACT Fibernet network. Another participant found that many official government websites are not accessible to people outside the nation, and shared their own work on creating a proxy that allows researchers and others to access Indian government websites from other countries. Such geo-blocking also prevents archival by the Internet Archive, which many researchers depend on.

Participants also shared their experience of studying internet access in conflict zones in India: even though access to the internet is recognized as a human right, it is often at the very bottom of the priority list for communities facing very intense threats on the ground. They also shared that helping these communities understand the role the internet can play in responding to some of the other threats they face (and providing the tools to enable this, while foregrounding their safety needs) had been a very positive, empowering experience for all involved.

To end Day 1, we dove into the methodology of list building and list pruning, developed and presented to the group by our friends at Netallitica. This session was specifically aimed at preparing us for Day 2, during which we (the organizers and the attendees) split into groups and co-worked on updating the India test list.

We started Day 2 with practical inputs on how to make changes to the list, including important points to remember so that anyone who later looks at the list to test or clean it understands what changes were made and why. Our partners from OONI also showcased their beta tool, which, once launched, will make updating the Citizen Lab test lists much easier (through a web platform, without requiring GitHub accounts).

A total of 10 participants split into two online co-working groups, and each group selected a single theme to work on in 30-minute hands-on sessions. The participants selected themes based on their areas of knowledge and interest, and on how much information the list for that theme already contained. The focus was to make each theme list cover a wider base, making it representative of the platforms, sources of information and spaces of interaction that are currently important in our country.

In each group there were discussions to decide which sites needed to be added and/or removed, and how websites should be categorized. An important part of this exercise was to ensure that we included sites catering to various schools of thought, so that the list is not skewed in its representation. This matters because we want to measure censorship across the board, and not only of sites that may be important to the world view of the people building and testing these lists.

Day 2 of the workshop resulted in the following changes to the India test list:

Category Code (Name)      New URLs added   Updated to HTTPS   Moved to Global list   Recommended for deletion   Domain Updated   Category Updated
ECommerce                 7                1                  0                      3                          1                0
LGBT                      15               0                  0                      1                          0                0
Human Rights              8                0                  0                      0                          0                0
Environment               31               1                  0                      0                          0                1
Public Health             26               1                  0                      1                          0                0
News Media                11               0                  0                      0                          0                0
Terrorism & Militancy     0                0                  0                      1                          0                0
Culture                   19               1                  0                      0                          1                1
Hate Speech               0                0                  0                      0                          0                0
Political Criticism       4                0                  0                      1                          0                1
Government                1                0                  0                      0                          0                0
Pornography               5                0                  0                      0                          0                0
Total                     127              4                  0                      7                          2                3

The participants were able to significantly add to the LGBT, Environment, Culture and Public Health categories, which were very sparsely populated earlier.

Accomplishing this took time and effort to ensure that no sites were repeated, that URLs were added correctly, and that existing URLs in the list were still relevant. Our workshop focused specifically on contributing new URLs; we did not set out to prune the existing list (though some of us took the initiative to look at this aspect too). Here is the pull request for this update: https://github.com/citizenlab/test-lists/pull/864

At the end of the workshop, the participants and we as organizers were enthused by the understanding built of the importance of community-based monitoring of internet censorship, and of the huge role that people from all walks of life can (and, in our opinion, should) play in helping technologists and digital rights advocates around the world stand guard over a free Internet.

We hope that this effort will give impetus to more people to engage in this sort of open source list building and testing, which will enable the generation of in-depth and representative data on the true nature of the Internet that citizens in India experience.

Report on Telecom Consumer Rights Education Program (2018-2019)

Authors: Chinmayi S K and Rohini Lakshané*
The “Report on Telecom Consumer Rights Education Program (2018-2019)” presents the highlights from a year-long education program for women telecom consumers conducted by The Bachchao Project in Manipur from December 2018 to August 2019. The program was made possible with support from Internews, and was conceived as a result of our experiences and observations from the study “Of Sieges and Shutdowns”. This report elucidates the objectives of the program, the programmatic activities we conducted, the curriculum and design of the consumer education workshops, and our lessons and challenges. We hope that this report will benefit similar endeavours in Manipur and in the field of consumer education.

 

Download the report here:

Report on Telecom Consumer Rights Education Program (2018-2019)

*in alphabetical order

[Event report] Participation in APrIGF 2021 and APrIGF Fellowship

Rohini Lakshané and Mythri Prabhakara participated in the hybrid edition of the Asia Pacific Regional Internet Governance Forum (APrIGF 2021), that is, the online conference as well as the Local Hub activities held in Hyderabad from 27 to 30 September 2021. Ms Prabhakara also received the APrIGF 2021 Fellowship, which is documented in this report. They also attended the launch event of the Internet Society (ISOC) Hyderabad Chapter at the APrIGF Local Hub, and made several contributions to the APrIGF 2021 Synthesis Document.

Fellowship

Mythri Prabhakara received the APrIGF Fellowship, which ran from 1 August 2021 to 15 October 2021. It was an intensive fellowship involving courses, peer interaction, delivering talks and presentations, and several hours of mentorship. The mentor assigned to Ms Prabhakara was Mr Eun Chang Choi.

She was the rapporteur for a session entitled “Transnational conversations on reclaiming freedom of expression online”, where she also made an intervention about the conceptualisation of online consent and the legal framing and categorisation of cybercrime victims. https://www.youtube.com/watch?v=mTq1Mc3zDoQ

She made two presentations: one on her professional interests in law, gender and tech-feminist spaces, and the other on behalf of the fellows cohort. Representing the cohort, she presented a summary of the mentorship program on the final day of the APrIGF. The presentation covered the assignments completed and presentations made by the group, an e-course on Internet governance offered by the ISOC Foundation, and mock session proposals made to the UN Internet Governance Forum. https://www.youtube.com/watch?v=bpCJISLJvOQ

“The Gendered Impact of Intentional Internet Shutdowns”: Panel at the Global Digital Development Forum 2021

Rohini Lakshané moderated a session entitled “The gendered impact of intentional Internet shutdowns” at the Global Digital Development Forum (GDDF) on 5 May 2021. The speakers were Felicia Anthonio (Access Now), Sandra Aceng (Women of Uganda Network), Deborah Brown (Human Rights Watch) and Zaituni Njovu (Zaina Foundation).

Description: Women, gender-diverse persons, and marginalised sections of society have been using the internet to overcome the obstacles posed by imbalances of power and social restrictions. Internet shutdowns, a tool increasingly used by governments across the world, deprive these populations of the ways in which the internet acts as a leveller. Our panelists represent research, advocacy, and policy groups exploring the impact of intentional internet shutdowns on women and gender-diverse persons in communities across Africa. They will speak from the perspectives of free and fair elections, cybersecurity, freedom of speech and expression, and digital rights, and discuss the coping strategies these populations use when they are digitally disconnected.

Click here for the GDDF 2021 agenda.

A recording of the session is available at: https://digitaldevforum.course.tc/t/2021/events/the-gendered-impact-of-intentional-internet-shutdowns-edXz4h26pBmDUJokkFGRXN

Panel on Digital Speech, Disinformation, Censorship: Social Media and Democracy

Rohini Lakshané was a panelist at the event “Digital Speech, Disinformation, Censorship: Social Media and Democracy” organised by Tufts University and held on 23 February 2021. The panel was a part of the “Critical Times, Critical Thinkers” series of talks held by Tufts Global Education.

Details about the session: https://students.tufts.edu/digital-speech-disinformation-censorship-social-media-and-democracy

Description: “This panel will explore the role of digital technology as social experimentation, the controversial role of Big Tech in policing and censoring speech, and provide a dynamic discussion about EU regulation of Big Tech and the lack of it in the US. Special attention will be paid to exploring differences in the impact of these practices in the Global North and Global South, particularly the ways in which these practices may support rather than challenge authoritarian regimes in the Global South (e.g. India and Myanmar). Each speaker will offer 5 minute introductory remarks followed by a moderated discussion among all participants. Questions from the public are welcome!”

Speakers:
Rohini Lakshané, Director, Emerging Research, The Bachchao Project, Mysuru, India
Manuela Kasper-Claridge, Editor in Chief, Deutsche Welle, Berlin, Germany
Adam Moe Fejerskov, Senior Researcher, Danish Institute for International Studies

Moderator: Josephine Wolff, Assistant Professor of Cybersecurity Policy, The Fletcher School of Law and Diplomacy, Tufts University

Click here for a video recording of the session