
Discussion on Protecting Human Rights Defenders Online | C-SPAN | March 29, 2024, 10:43pm-12:15am EDT

10:43 pm
announcer: get contact information for members of government right in the palm of your hand when you preorder your copy of the congressional directory, with bio and contact information for every house and senate member of the 118th congress, plus information on congressional committees, the president's cabinet, federal agencies and more. it costs $32.95 plus shipping and handling, and every purchase helps support our nonprofit operations. scan the code on the right or go to c-spanshop.org to preorder your copy today for delivery this spring. announcer: coming up, officials from the national security council and the state department talk about the importance of protecting human rights defenders who operate online. topics include authoritarian repression in myanmar, and how online platforms are evaluating protecting the speech of human
10:44 pm
rights defenders. hosted by the center for strategic and international studies. [chatter] >> good morning, everyone. welcome to the center for strategic and international studies. i'm so delighted to have you here with us today and delighted to have those who are all tuning in online for this very important discussion of the launch of the united states guidance for online platforms on protecting human rights defenders online. i want to thank our partners in
10:45 pm
this effort, the department of state, as well as access now and the atlantic council's digital forensic lab, for all of their support in putting this event together today. my name is michelle strucke. i'm the director of the human rights initiative and the humanitarian agenda here at csis. today, according to the 2025 global overview report, 5 billion active social media profiles exist in the world. as a percentage of global population, that is 62% of the world engaging in social media. mobile phone users are up to 70% of the global population, and more than 66% of all people on earth now use the internet. so this issue is affecting the daily fabric of all of our lives as they change and become increasingly reliant on social media platforms for our commerce, our entertainment, our utilities, even how we pay our
10:46 pm
bills, all the way to how we engage with networks. human rights defenders are experiencing some of the positives of this, but also some of the negatives: the dark underside of the internet is being used to track, threaten, target and harm people, including human rights defenders, the very people who uphold the values we all care about, human rights, democracy and other important issues. human rights defenders include everyone from members of ngo's and trade unions, environmental advocates, land rights activists, women's rights champions, anti-corruption activists and representatives of indigenous peoples. some of the most important people we rely on to fight for human rights are being impacted, positively and negatively, by this issue. it couldn't be more important than it is today to have this discussion, and i am really glad you decided to join us. for those online and in the room, we have the recommendations listed
10:47 pm
online, here on the screen, and you can also use your phone to scan the qr code if you want to look at the full report. for those joining online, you can access it there as well. i will give a couple of announcements. if there is an emergency, the emergency exits are behind you. the bathrooms are down the hall. again, it is my pleasure to be with you here today. now i will introduce our special guests who are going to share with us some important thoughts. first, we will have remarks by kelly razzouk, who serves at the white house national security council as a special assistant to the president and senior director for democracy and human rights. prior to this, she was acting chief of staff and deputy chief of staff for policy for the u.s. ambassador to the united nations, linda thomas-greenfield. from 2018 to 2020, she worked
10:48 pm
in advocacy for the international rescue committee, and she has held numerous government roles throughout her career advancing key u.s. priorities at the u.n. in new york. she served under ambassador susan rice, and then as a senior policy advisor to ambassador samantha power, where she led key human rights initiatives for the obama administration, including efforts to secure the release of political prisoners around the world. she has worked in a variety of important state department bureaus. she has a juris doctor from depaul university college of law, where she was a sullivan human rights law fellow. i couldn't be more delighted to introduce kelly to kick us off with some keynote remarks. thank you. [applause] >> thank you so much, michelle, for that introduction, and
10:49 pm
thanks to all of you for being here today. it is such an honor to be at the center for strategic and international studies. it is my first time here, and i am so thrilled to be in the room with experts from civil society, from technology companies, and governments, and to host this launch in partnership with access now and the atlantic council's digital forensic lab. technology, as michelle said and as a lot of you know, has fundamentally transformed the fight for human rights around the world. online platforms enable activists to mobilize and share information more quickly and more widely than ever before. but human rights defenders across the globe too often face technology-facilitated threats and attacks such as targeted surveillance, censorship, and harassment. these attacks can also move from the digital to the physical world. the misuse of commercial spyware, for example, has been
10:50 pm
linked to arbitrary detentions, disappearances, extrajudicial killings, and transnational repression. a woman was physically attacked and sexually assaulted for her advocacy highlighting the growing hate against lgbtqi+ people online. a reporter focused on exposing acts of corruption was murdered at a carwash after being targeted by commercial spyware. and these are just two of the stories. last week, at the third summit for democracy in seoul, the united states convened a high-level side event focused on the misuse of commercial spyware. we highlighted this at the summit for democracy because it is both a national security and counterintelligence threat. but it's also a very real threat to democracy. my colleague maher bitar, who serves as coordinator for intelligence and defense policy at the white house, and i co-moderated a panel at the summit that brought together
10:51 pm
ministers from countries, journalists, civil society, and private sector experts like representatives from the investor community, to discuss the importance of exposing the misuse of commercial spyware and protecting human rights defenders. the journalists talked about the fear for their loved ones, who faced profound risks to their safety and security. their comments reflected what i have heard over and over again as i have had the honor of meeting with heroic journalists and human rights defenders from around the world who have been victims of online attacks. they described the chilling effect that commercial spyware intrusions have had on their ability to continue their reporting and their activism, and the isolation they faced from colleagues and counterparts who now fear any contact with them. one prominent russian journalist has publicly described an intrusion by commercial spyware
10:52 pm
as feeling like she was stripped naked in the street. online attacks are all too often gendered as well. around the world, an estimated 85% of women and girls have experienced or witnessed some form of online harassment and abuse. indeed, gendered information manipulation and nonconsensual synthetic content are frequently designed to silence and suppress women political and public figures, by forcing women to self-censor or limit their online activity. the growing access to and use of ai further exacerbates these harms by expanding the speed and scale of intimidation, manipulation, and synthetic content, including nonconsensual intimate images, facilitating pervasive surveillance, and enabling enhanced and refined online censorship. the companies and civil society
10:53 pm
organizations here know these harms all too well. many of you have reported on the insidious tactics of nefarious actors that use online platforms to target members of civil society, journalists, and activists. addressing these threats is critical not just for the individual survivors of attacks; it is also a global imperative for the defense of inclusive, representative democracies. the united states government remains resolute in our commitment to address these harms. as president biden said at the second summit for democracy, we must ensure that technologies are used to advance democratic governance, not to undermine it. the united states helped develop the guidance we are launching here today for online platforms, to support governments, civil society, and the private sector in coming together to fight back against these abuses. this guidance is part of a whole-of-government effort to, as
10:54 pm
secretary blinken explained last week at the third summit, build an inclusive, rights-respecting technological future that sustains democratic values and democratic institutions. last march, the president issued an executive order prohibiting u.s. government use of commercial spyware that poses risks to national security or has been misused by foreign actors to enable human rights abuses around the world. over the past year, the united states has leveraged sanctions, export controls, foreign assistance programs, and visa restrictions to support victims and hold governments and firms accountable. we have built a global coalition of countries committed to this cause. in fact, last week at the summit, in addition to hosting the side event i just mentioned, we also announced that six new countries would join a joint statement on efforts to counter the proliferation and misuse of
10:55 pm
spyware, adding to the inaugural 11 countries. additionally, the cybersecurity and infrastructure security agency at the department of homeland security has been partnering with many other organizations and companies in this room to protect high-risk communities, including civil society organizations and human rights defenders, through their joint cyber defense collaborative. at the first summit for democracy in 2021, the biden administration launched the global partnership for action on gender-based online harassment and abuse, which brings together governments, international organizations, civil society, and the private sector to accelerate progress toward safety and accountability for women and girls online. one thing we know to be true is that we can't do any of this alone. we need the expertise of civil society actors, the private sector, and online platforms. today's launch is a starting point for these important
10:56 pm
conversations that will take place going forward as we look to continue to strengthen these partnerships. thank you for your time today, and we look forward to the conversations and work ahead to address these critical issues. [applause] >> thank you so much. it is now my pleasure to introduce ambassador robert gilchrist. he is the senior bureau official in the bureau of democracy, human rights and labor at the u.s. department of state, and has served as principal deputy assistant secretary since december of 2023. he is a senior member of the foreign service, class of minister-counselor. his last position was ambassador to the republic of lithuania from 2020 to 2023. prior to being ambassador, mr. gilchrist served as director of
10:57 pm
the department of state's operations center, deputy chief of mission at the u.s. embassy in sweden, deputy chief of mission at the u.s. embassy in estonia, and director of nordic and baltic affairs at the state department. he was also deputy political counselor at the u.s. embassy in iraq, chief of the political section at the u.s. embassy in romania, and a special assistant in the office of the deputy secretary of state. join me in welcoming ambassador robert gilchrist. [applause] >> thank you to all of you for joining us this morning as we officially launch the u.s. guidance for online platforms on protecting human rights defenders online. i would like to extend my thanks to csis and to the atlantic council and to access now for hosting this event, and to many of you in this room for your
10:58 pm
insights, inputs and reflections during consultations over the last year as we developed this guidance. i'm robert gilchrist, a senior bureau official in the bureau of democracy, human rights and labor at the department of state. human rights defenders, or hrd's as we call them, play an integral role in promoting and protecting human rights, and governments and private sector companies should take steps to protect hrd's and respect the rights and values for which they so fearlessly and tirelessly advocate. the united states developed the guidance for online platforms we are launching today in response to the rapid growth of online threats against hrd's around the world. we remain resolute in our commitment to put rights at the center of our foreign policy and to condemn attempts to silence human rights defenders' voices through threats, harassment, criminalization, or violence. we are grateful that the eu
10:59 pm
shares this commitment and is working with us to elevate the voices of defenders and underscore their essential role as individuals on the front lines defending human rights. on march 11, we released a joint u.s.-eu recommendation outlining 10 steps companies can take to better identify, mitigate and provide access to remedy for digital attacks targeting hrd's. the u.s. guidance builds on those recommendations by providing specific actions and best practices that companies can take to protect hrd's who may be targeted on or through platforms, products, or services. this guidance is addressed broadly to online platforms that host third-party content, but we also hope it can support collaboration within the broader ecosystem, including with civil society organizations that directly work with human rights defenders and act as trusted-partner organizations for platforms. i also want to clarify whom this
11:00 pm
guidance is designed to protect. the united states defines human rights defenders as individuals, working alone or in groups, who nonviolently advocate for the promotion and protection of universally recognized human rights and fundamental freedoms. defenders can be of any gender identity, age, ethnicity, sexual orientation, religious belief or nonbelief, disability, or status, and some identify as journalists, lawyers, or researchers. hrd's continue to face threats and attacks including arbitrary or unlawful online surveillance, censorship, harassment, smear campaigns, disinformation, including gendered disinformation, internet shutdowns and doxing. and online attacks pave the way for physical attacks, including
11:01 pm
beatings, killings, disappearances and arbitrary detention. as threats to defenders and democratic institutions escalate, preserving safe spaces online is more important than ever. as kelly mentioned, the u.s. government takes a broad approach to democratic values online. at the state department we are committed to supporting and protecting hrd's, so they can carry out their essential work without hindrance or undue restriction and free from fear of retribution against them or their families. in washington and throughout the world, department staff maintain regular contact with human rights defenders. our embassies have dedicated human rights officers who regularly monitor human rights developments, meet with defenders and their families, and advocate for our shared principles. over the past decade, the department has provided $60
11:02 pm
million to directly support almost 10,000 human rights defenders from more than 130 countries and territories, from journalists to anticorruption activists and environmental defenders to labor activists. this often life-saving assistance has enabled over 90% of recipients to safely return to their work. while we are committed to enabling the work of cso's as a government, this is a collective responsibility. online platforms provide important tools to enable the work of hrd's, and they can do more to help them do that work safely. as companies, they have a responsibility to respect human rights in line with the u.n. guiding principles on business and human rights. we hope our guidance, which was developed in consultation with hrd's, civil society organizations and platforms, many of you in this room, we
11:03 pm
hope our guidance will provide companies with a blueprint of actions they can take to better protect hrd's. we ask that you adopt and adapt the recommendations in this guidance to improve your own policies and processes. thank you to our esteemed panelists for being here today and for sharing your perspectives on opportunities for continued collaboration. collective problems demand collective solutions, and i am heartened to see so many stakeholders engaging with us today. we are all partners in this effort. thank you so much. [applause]
11:04 pm
>> thank you, everyone. i'm now very pleased to introduce a panel that i will moderate. i will have a conversation with each of them, and then we will have about 15 minutes at the end for a question and answer session. you can raise your hand in the room if you would like to ask a question, and we have a microphone that will be coming around to you. if you're watching online, you can submit a question and i will be able to see it and ask it in the room. we have full participation from both online and in-room audience members. without further ado, i will introduce the distinguished panelists. starting to my left is jason pielemeier, the executive director of the global network initiative, or gni. jason leads a dynamic multi-stakeholder human rights collaboration building consensus for the advancement of freedom
11:05 pm
of expression and privacy among technology companies, academics, human rights and press freedom groups, and investors. prior to joining gni, he was a special advisor at the department of state, where he led the internet freedom and business and human rights section in the bureau of democracy, human rights and labor. in that role he worked with colleagues across the u.s. government, counterparts in other governments, and stakeholders around the world to promote and protect human rights online. next to him is the asia pacific policy analyst at access now. she has been working in myanmar for about 10 years and has previous experience as a political advisor and in media and communications. she has served as one of the leading organizers of the myanmar digital rights forum since it was first held in 2016, and it is and was the cornerstone of the digital rights movement. then we have alex walden, who is
11:06 pm
google's global head of human rights and free expression. she builds partnerships to promote free speech and expression and to combat internet censorship and filtering across the globe, and she coordinates policy and strategy around these issues for the company. prior to joining google, she worked at the raven group, the center for american progress, and the naacp legal defense and educational fund. she also served as a law clerk for the senate committee on the judiciary and for the house judiciary subcommittee on the constitution, civil rights and civil liberties, and worked for the u.s. department of labor and bay area legal aid during law school. thank you for being here. and last but not least, i have cheryl mendez, who is the senior program manager of freedom house's emergency assistance program. she has worked for over a decade advocating for and providing emergency assistance and logistical support to human rights defenders and journalists at risk worldwide. in her role at freedom house,
11:07 pm
she provides support to human rights defenders and civil society organizations who are at risk due to their human rights work. prior to joining freedom house, she worked at the committee to protect journalists, founded the crimes of war project, and helped launch a culture of safety alliance, a collaboration between news organizations, press freedom ngo's and journalists to promote journalist safety. mendez is a photojournalist who has trained documentary photographers in the middle east and north africa. thanks so much for joining this discussion today. i want to start with a couple of questions to set the stage for the participants watching. i will start with alex and jason. you have worked closely with the private sector on addressing the human rights risks associated with technology for a number of years, and therefore have a very unique perspective on how companies have worked to address threats to human rights defenders over the years. so, what would be interesting i
11:08 pm
think for the audience is to know, how have you seen these risks change over time? are there private-sector actions that have worked really well in combating these issues, and are there any hard lessons you can share? are there common standards today that maybe you would push for that once seemed unachievable? help us understand that journey of how the situation has changed over time and how you think this guidance fits into the broader efforts. alex: thanks for including us in the conversation today. it's a great question, because i have been at google for almost nine years, and if i look at how the industry has changed, how technology has changed over that time, and the way the ecosystem of civil society has coalesced around these issues, we are still dealing with permutations of the same thing, right? there are people who are trying
11:09 pm
to defend human rights or exercise their human rights, and there are bad actors, governments or otherwise, who are targeting them and trying to stop them from doing that good work. the quintessential problem remains the same, but the tools available to bad actors have evolved, and the tactics the bad actors use have also evolved. so, teams within companies have over time, i think, become more resourced to look at these things and become attuned to tracking them, and we have developed intel teams inside companies that are looking at how bad actors are trying to manipulate our technology and our platforms to target our vulnerable users. and i think companies have also made more efforts to engage with civil society to understand exactly how the problems are manifesting and how they look different in different places around the world. and different and the same.
11:10 pm
and ultimately, yes, we also have a.i. and generative a.i., and these things are morphing the ways in which people are using technology. all of those things are changing, and still the fundamental problem is the same: how do we collectively work together to understand how problems are evolving and to fight back against them? but it is constant, right? the threats are evolving and so are our responses. >> jason, i would love to hear your thoughts. jason: thank you, michelle, and thank you to csis for hosting, and to the state department and access now, and to everyone who did the hard work to put this guidance together. i think it is really a wonderful set of principles, and i'm looking forward to it having a meaningful impact. to come to the question, and
11:11 pm
building off of what alex said, it is worth taking a step back, because i think these principles build nicely on a foundation that is in some ways 20 or more years in the making. i think for the purposes of this conversation, it's worth starting with the u.n. guiding principles, which were developed in a really creative way by john ruggie, the special representative on human rights to secretary-general kofi annan, who carried out a multi-year, multi-stakeholder series of negotiations that produced the protect, respect and remedy framework we now know as the u.n. guiding principles, endorsed by the human rights council in 2011. and that framework really was a pivot point in that it created a foundational shared understanding of the respective duties of states and
11:12 pm
responsibilities of companies, highlighting in addition the importance of access to remedy. the gni principles were negotiated in parallel to that process, and in fact john had an advisor who was embedded in the gni negotiations, and there was a lot of crosstalk at the time. the gni principles came out of a series of incidents that were the result of persecution of human rights defenders using data that had been given to a number of authoritarian governments by u.s. tech companies. and so, as for the concerns about human rights defenders, i do not think that was a term that was necessarily used at that point in time, but the underlying situation of journalists and activists who were trying to use these new technologies and tools to do their work being
11:13 pm
threatened and persecuted as a result of that was very much what animated the gni principles. where the u.n. guiding principles apply broadly, the gni principles apply to the tech sector, and since then, over the last 15 years, we have seen i think a tremendous growth in the amount of attention and the amount of resources that are put into questions around technology and human rights. so, alex mentioned that a number of tech companies now have put more time and resources into this. at gni, we have a set of principles and application guidelines that member companies like google commit to implementing, and then we have a unique assessment process that holds them to account to those commitments. at a very high level, those involve creating a human rights policy and embedding that policy
11:14 pm
through trainings across the company, having senior-level oversight of that work, creating appropriate escalation channels, putting in place the human rights due diligence processes that are needed to become aware of and be able to respond to human rights risks, including risks to defenders, and then having appropriate remedy and transparency throughout. so, that framework, if it sounds familiar, it's because it is very similar to the one that is in these guidelines, and i think that is a really helpful way to structure this, because it takes advantage of what companies have already been working on now for over a decade. it hits all of those points, so that coherence will be really useful in terms of the ability of companies to take and implement this guidance. it will also help civil society organizations and advocates on the outside who are familiar with these frameworks be able to advocate and hold companies accountable in a more consistent way.
11:15 pm
michelle: thank you both for this great overview of how we ended up here and how these guiding principles are such an important practical step forward. i want to turn now to you -- you have such a unique perspective on these issues, having worked as a human rights defender in myanmar, and also for an organization dedicated to helping defenders. i want to hear about the context you are working in. and if you can take us through a hypothetical: if there were a human rights defender today in myanmar who suddenly began receiving threats and being doxed on several platforms, what steps could they realistically take to protect themselves? and then, moving to the platforms, what are the most urgent types of threats that you see that platforms need to have processes to address, from that on-the-ground perspective? >> thank you. thank you so much.
11:16 pm
i am here not just as a member of the access now team; i am also here as a human rights defender from myanmar. just to give you background about what is going on in myanmar: for over three years now, people in myanmar have been tirelessly resisting the military coup of 2021. what we see on the ground on a daily basis, for over three years, is human rights abuses across the country committed by the military. on a daily basis what we have seen is air strikes, dropping bombs on civilians in towns, and many of the villages -- literally, a whole village has been
11:17 pm
wiped out after being bombed. even this morning, when i was on the way here, i received a message that a township has basically been set on fire, and that it has been shut down. those are the kinds of things we have seen on a daily basis, both online and offline. people have been killed and detained, on a daily basis. and when you look online, it is the same: internet shutdowns. the military has basically shut down the internet since day one. currently, we are seeing over 80 townships -- more or less one part of the country does not have the internet at all. in some of these townships, around 30 townships, it has been over two years without internet. and in some areas, not just the internet --
11:18 pm
mobile communication has been shut down as well. and then, on the other side, websites, including all of these social media and messaging platforms, have been blocked for almost three years now. even the use of vpn's has been banned in myanmar. on the other side, the military has been trying to abuse all sorts of digital tools against civilians. they have been abusing all sorts of collected data, from surveillance footage to sim card registration data and the biometric data they collected for i.d. projects, etc. they have been abusing all of this data to identify these
11:19 pm
individuals -- where these people are exactly and what they are doing -- and to monitor their online activities and their financial transactions, violating their privacy and so on. that is what we have seen on a daily basis as well. at the same time, the military has been actively monitoring all of these social media platforms. even though they banned social media and messaging, people are trying to find ways around the bans, but the military is still monitoring all of these digital footprints across different platforms, and doxing people, abusing people. we have seen this, literally, as part of a campaign of terror. that is basically what we are
11:20 pm
seeing -- people have been arrested, arbitrary detention has been happening, even killings. two weeks ago, one of our close friends basically got killed after she was arrested. on that day there were fires, and right after the fire we didn't get information about where she was taken. then, a few days later, on the way back, she just got killed. those are the kinds of reports we are seeing on a daily basis. this is the level of threat that people in myanmar are experiencing. here we use this term, human rights defenders, and i thank our keynote speaker, who defined it. but in myanmar, the majority of the people in
11:21 pm
myanmar are currently right now, i mean, basically kind of like, they are standing up. they are taking risk to be even to defend their rights, to even defend the rights of -- they basically, so when we talk about sometime, we talk about the human rights defenders and etc. people in myanmar like every individual is a human rights defenders and the risks they are happening, the risks, the threats levels they are experience and is pretty much like at the same level on a daily basis, we are facing it. even a public school teacher and also the schoolchildren who are in the public school system which is under the control of the military, they refused to attend, they refused to go to the public school. they are arrested. many of them are even got killed as well. just for refusing the school system, controlled by the military.
this is the threat level. it is not only politicians or high-profile human rights defenders; even students can be tracked, and their activities can be monitored. i work closely with the local groups who have been monitoring digital rights abuses. it is a pretty small group, but we have built a network extending to civil society organizations, labor unions, student unions, and so on. on a daily basis, the message we receive is that somebody from one of these groups has been arrested, and they come to us because they want that person's digital footprint taken down. if that person has a facebook account, they want to secure the facebook page, or a telegram channel; they want us to assist them. those are the messages we receive on a daily basis: whoever in this network gets arrested, can you please help us secure this? we do that work with the other digital rights groups, and we also work with the different platforms. these are important steps, but they do not necessarily mitigate the risk, because we can take them only after somebody has been arrested, to secure their communications and the network around them. those are the immediate steps we have to take, right? beyond that, our organization, like other digital rights organizations, also works on mitigation. that is something we do with the different platforms: trying to foresee what the risks are and what the mitigation plans should be for human rights defenders, and also for media and journalists. that is the day-to-day work we have to do, and it is a collaboration with a wider network, including the platforms. thank you.
michelle: thank you so much for the work of the human rights defenders you are in contact with, who are facing these conditions every day. it was so powerful hearing what you just said, because it captured both sides: authorities block the internet, which is a lifeline, so defenders cannot access the tools they need, and isn't that itself an indication of how powerful the internet is as a tool for human rights defenders? and at the same time, the platforms are used to track and monitor people. i think it raises a good question for you and for cheryl about secure and accessible reporting channels. i know it's extremely complicated, because even when someone takes the step of sending a report through something that could be monitored, they are putting themselves at risk. so, are there lessons that freedom house and access now can share with platforms on steps they can take to develop secure channels that support human rights defenders when they are facing harms in the places where they defend human rights? maybe i will start with cheryl.
cheryl: i think one of the first steps is to consult and co-create with defenders and others, which will also speak to accessibility issues. a platform needs to understand what the operating environment for a defender is: how they are able to access any type of channel, and, when they have used channels in the past to report abuse, abusive language, or threats, whether they got a response. defenders and others are trying to understand the lifecycle of reporting, and there should be greater education and training about it. what will happen if a defender, or anybody, does report something? what are the stages, and what can they expect? because often what happens is that defenders experience what the american bar association's center for human rights and many others have called flagging fatigue, and it is overwhelming. also, when defenders are documenting these threats, it would help to have more information on how to document threats and what to document. and across social media platforms and companies, if there were a better approach where, when a defender is attacked across multiple platforms, the reporting process could be streamlined, they would not be starting from scratch every single time they report something to one channel or another.
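one way to picture the streamlined, cross-platform reporting described here is a single incident record a defender fills out once and submits everywhere the attack appears. this is a hypothetical sketch: the field names and structure are illustrative assumptions, not any platform's actual reporting schema.

```python
from dataclasses import dataclass, field, asdict
import json

@dataclass
class IncidentReport:
    """Hypothetical platform-agnostic abuse report, filed once and
    forwarded to every platform where the attack appears."""
    reporter_id: str                 # pseudonymous handle, not a legal name
    threat_type: str                 # e.g. "doxxing", "credible threat"
    platforms: list = field(default_factory=list)    # where the content appears
    evidence_urls: list = field(default_factory=list)
    local_context: str = ""          # why this is threatening in this context

report = IncidentReport(
    reporter_id="hrd-0042",
    threat_type="doxxing",
    platforms=["platform-a", "platform-b"],
    evidence_urls=["https://example.org/post/123"],
    local_context="phrase is a known coded threat in this region",
)

# the same record serializes once and goes to each platform's channel,
# instead of the defender re-entering everything from scratch each time.
payload = json.dumps(asdict(report))
print(payload)
```

the design point is the `local_context` field: it carries the contextual knowledge, discussed later in the panel, that a reviewer outside the region would otherwise lack.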
this is also really important because, as we all know from defenders and others at risk, when something happens online it can move offline, and when it happens offline it can move online. it can lead to the criminalization of defenders, or to different types of legal harassment, because of orchestrated campaigns to defame and discredit them in the public eye. when that happens, governments or other actors can use it to bring frivolous suits against defenders, or to charge them or jail them, which takes them out of their human rights work and affects their families. the costs are extreme when there is legal action, and even when a case is eventually dismissed, the costs over the life of the case and the psychosocial impact are tremendous. again, it has consequences for their family members and others. i think it is very important, with secure and accessible channels, that people know where the points of contact are, offline and online, because, for example, there may be defenders who do not have an online presence but who are still attacked online, such as indigenous defenders. there has to be an understanding of what the safe channels are and who the points of contact are. if a defender is going to report, are they able to have contact with human beings who are trauma-informed, who can give a real-time response or at least help them understand the steps, and, again, how they should be documenting things? there is also the need for resourcing local and national networks and civil society organizations, and for training of trainers. by the time defenders go through this entire reporting process, they have already been put under really difficult psychological strain and stress, and every type of threat that has been described here; often they are reporting just as they are asking for emergency assistance from other networks. we don't work in isolation. whether a defender is a land or environmental defender, a journalist, and so on, there are a variety of different networks, and one of the things that we and our partners do is try to bring those networks into the same spaces. that is partly because it is very difficult when you are constantly contacting different sources for help, and working together lets us provide support for defenders. but it is also because, across the broad diversity of human rights work, the abuse is literally cut and paste. i used to work at the committee to protect journalists. when i came to work with human rights defenders more broadly, it was the same thing: the same tactics, the same approach, like you mentioned. it is the same scenario, except the tools are more insidious, quicker, you name it. i also think it is important for social media companies and others, all of us, to understand when defenders take preventative and cautionary measures, which include self-censorship, going into exile, and sometimes leaving the human rights field entirely. i bring this up because it should inform how you think about safe, secure, and accessible channels. it depends on how you respond: what is the manner of response, how quick and efficient is it? if there is no response, then this is what happens: defenders leave. that's a problem. and it involves their families too, because, as was mentioned, families are often attacked and threatened with kidnapping. and these are not just threats. i have literally been working in this field, between cpj and freedom house, for 12 years or more, and threats of abduction are carried out. for everything that everybody has said, including the keynote speakers, i can name every single person we have dealt with, in what country, in what context, and how it has evolved, as you mentioned, in the way it is happening now. one of the other things about accessibility is to include, for example, persons with disabilities who are defenders working in this space, and other marginalized communities, when you are thinking about different reporting channels and how a person would access them. and then one of the things that defenders have brought up, and i wrote it down because i want to make sure i express this, is that they want to understand, and
they want guidance from social media companies and protection groups: for example, if they experience online abuse or hateful speech, what do they do? do they block the speech, mute it, or report the threats? they have seen, for example, that sometimes when they blocked an account, the threats accelerated or escalated, right? when they muted it, they may have lost access to something they needed to monitor in order to document and report what was happening. and when they report threats and do not receive a response, or will not receive one in the time frame they need, then what do they do? there could be advice: these are some of the things you can do along the way. it is also, again, about training and resources, because often defenders who are subjected to these types of threats and hate speech may need colleagues and other trusted people in their networks to monitor these things for them, right? so they need this kind of education, and these trainings of trainers. everything that access now and other groups in the protection landscape have evolved in these networks, whether front line defenders or the american bar association's justice defenders or many others, including processes for working collectively, is something social media companies are starting to go down the road of, but there is a lot to learn from these networks, particularly at the local and national levels. and also consider defenders who are not near capitals, who are in remote situations: what are the challenges then, if they are deep in a context where they may be at risk? i'll just --
michelle: thank you. you presented such a rich picture of the tensions people are facing when they are trying to report, and of the networks that are supporting them. the common theme i heard is the idea of having common processes from the platforms, so that a defender can understand who the point of contact is and what to do, online or in person. i want to ask if you want to add anything to this picture, especially when it comes to what platforms are actually able to do that human rights defenders on the ground need, that would help them as they try to report.
>> definitely, i agree about the channels. the issues human rights defenders on the ground are facing are constantly changing, and the situations are quite different from one country to another. that is why a global platform policy, one policy that fits all, does not really work in reality. so we actively encourage all the platforms to engage with local csos and local activists: first to identify the risk level, and then to be part of the mitigation. in myanmar, as i always say, this is not new. over the last decade we have had a number of issues, like hate speech and the genocide. because of that we got some international attention, and attention from the different platforms. but still, i would say very few platforms regularly engage with the csos on the ground. that definitely needs to change. the other thing we have been seeing is that what the platforms usually do is reactive. in terms of addressing the threats human rights defenders on the ground are experiencing, we do not often see a proactive approach, a mitigation plan. that is definitely an area where we would like to see much more progress. the other recommendation is about the exchange of information, which is quite key as well. from time to time we engage with the different platforms. local organizations monitor an online platform and flag issues: these are the problems, these are the links, and so on. but they have limited capacity, and the information flow is one-way. we feed this information in, but often we do not know where it goes, how it will be taken into consideration by the platforms, or how it will be reflected in policy changes. we do not have that kind of visibility, and often we do not see that kind of reflection either. at the same time, the different platforms run their own investigations. exchanging that information with the csos, comparing what they see on the ground with what the platforms see, a regular exchange of information, especially for the csos who are working on protection and physical security, means they can at least foresee what is going on on that platform, and who the bad actors are.
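the one-way information flow described here, where csos feed flags in and never learn the outcome, could in principle be closed with a simple status lifecycle that platforms report back on. this is a hypothetical sketch; the states and field names are assumptions for illustration, not any platform's actual process.

```python
from dataclasses import dataclass

# hypothetical lifecycle a platform could expose to the CSO that
# submitted a flag, so local groups can see where their reports went.
STATES = ["received", "under_review", "actioned", "dismissed"]

@dataclass
class Flag:
    flag_id: str
    submitted_by: str          # the CSO or local monitoring group
    state: str = "received"

    def advance(self, new_state: str) -> None:
        """Move the flag forward in the lifecycle; a real system would
        notify the submitter on each transition instead of letting the
        report disappear into a one-way channel."""
        if STATES.index(new_state) < STATES.index(self.state):
            raise ValueError("flags only move forward")
        self.state = new_state

f = Flag(flag_id="mm-2024-001", submitted_by="local-cso")
f.advance("under_review")
f.advance("actioned")
print(f.state)  # actioned
```

the point of the sketch is the notification on every transition: even a terse "received / under review / actioned" signal would give local groups the visibility the speaker says is missing.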
sometimes these bad actors are not within their own country's territory. they have links to russia or china, or to other countries. the platforms have this information, right? for the csos and local organizations working on the ground, having that knowledge and that information is very helpful, both in their own planning and in educating their public. so we would also like to see more of that kind of information exchange in the future. and definitely, regular engagement can solve most of these issues.
michelle: thank you. i want to definitely get to some of the really important questions you have raised and give alex a chance.
maybe you can help walk us through, from the example of a platform, how you look at these reports when they come in. what do the processes look like? what messages do you want human rights defenders to hear? and in that sense, what factors go into deciding how you structure protection mechanisms? that behind-the-scenes view might be helpful for people to hear.
alex: on information sharing, i know it is something we have long talked about, and there is a lot more work to do between the platforms generally on what information sharing mechanisms might look like. there are obviously lots of challenges around privacy, security, and our ability to validate information that we know is useful to those working on the ground, so there is certainly lots more opportunity to think about what is possible in that space. but to back up and talk about how we think about these issues and how they fit into the structure of the company: to jason's point, it is a big global company, with a lot of people, a lot of teams, and a lot of products and services. so how do we make sure we are paying attention to the issues that are coming up in myanmar, across latin america, across all the regions of the world? first, we really do have to focus on security by design. that means the teams focused on understanding the threats coming at our platforms all day, everywhere, are informing how we build all of our products, because in a perfect world, the way any user in the world would flag an issue in a product would also solve the problems of our most vulnerable communities. if you design for the most vulnerable users, you are also solving problems for everyone. that is why security by design is so important, and when we do it, it gets us a little closer to solving these problems. but obviously, in addition to building security into the products at their foundation, we think about the specific features that particular vulnerable users might need. for us, that looks like features such as warnings: anybody who has gmail might have received a notice from us that says "it looks like you may have been targeted." we may not have a lot more information we can share, but we flag for someone that they might have been targeted, and we provide links to ways to check the security on their account and some other resources, making sure people can do some of those checks themselves. and then we also spend time thinking about specific products that might be useful to groups that are being targeted online. so we have things like the advanced protection program, which is a super secure version of our accounts, and we have engaged with journalism and rights defender organizations about it, as well as candidates running for office, because we know they are also attacked, making sure those groups have access to the tool. and it is not just individual accounts, it is also sites: we have a program called project shield where we provide additional security to sites that might be subject to attacks. so we are really investing in additional tools that we know people might need. and then, obviously, engaging with stakeholders is something we have to continue to do, to make sure we understand how the threats are evolving, and that needs to feed into the structure, the programs, the processes, and the teams across the company that
are working on these issues. that really runs the gamut, again, from engineers and product developers to those writing and enforcing the policies, to our public policy teams doing advocacy on things like fighting internet shutdowns and making sure there are adequate protections around censorship and privacy. it has to be a comprehensive approach across the company, and it has to be built into the way you build products. and you have to have particular mechanisms for rights defender groups to come directly to your human rights team. that alone does not make it perfect, but if you do not have it, the company has no way to begin to understand its responsibility to address these issues, and to start thinking about how to do that in concert with other stakeholders.
michelle: thank you for that really fascinating answer. it is definitely a lot to think about, in terms of how big an issue this is, across how many countries, including here in the united states, and how you have to respond. i wanted to ask cheryl about this: as we're thinking about what platforms do and need to do, and the teams and human resources they have in place, some of the most dangerous threats to human rights defenders often rely on coded language or images that can bypass detection by automated systems, or by human reviewers who lack knowledge of local languages, context, slang, and symbols, things people cannot spot without very localized information. cheryl, can you tell us more about how you have seen this issue arise in your work with human rights defenders around the world? and specifically, taking what alex said about needing to communicate more with stakeholders and engage different teams, how can platforms engage civil society and others with that kind of contextual knowledge, to build it up and bring it in, not just to the human rights teams but also, as alex mentioned, to the people designing products and features for companies?
cheryl: coded speech, whether hate speech or otherwise, is very contextual. one of the things that is really important, as was said, is that companies have a physical presence, a field presence, and that they have teams not only on human rights but also, linguistically, teams that understand historical narratives and historical violence, teams who can monitor political language, political histories, culture, and so on. one of the difficulties is that when speech does not seem direct, when it is not a direct targeted attack, it often not only does not get reported but is not seen as something that can lead to offline violence or other types of violence. and without understanding the local context, the language and how it is used, the different expressions and allusions, there is no understanding that something is in fact a direct threat. often this indirect language takes forms like talking about the longevity of a defender's life, saying that something should be done, wishing that something would befall them, or dehumanizing them. we have had many reports from defenders where they are basically not considered human, where they are called a virus, and so on.
again, in some contexts this refers back to language that was used historically during political violence, for example. there is also language that calls a defender's gender into question, and language that accuses them of corruption or of belonging to criminal groups, which then leads to threats in the legal sphere, jailing, things like that. on the other part of your question, about how platforms can inform and build their contextual knowledge: as has been mentioned, it takes cross-functional, interdisciplinary teams, and also a focus on the abuser rather than on the content, because that decreases the burden on defenders to flag content themselves, which matters because of the flagging fatigue we were talking about earlier. besides having human rights experts, having people experienced in journalism and in political contexts also helps to unearth these kinds of nuanced or coded threats. and platforms need to get away from standard lists of hate speech, the fixed vocabularies that are used, because, as was said, it is not one-size-fits-all. there are types of speech those lists do capture, but if we stick only to those, then we are dealing with speech that is invisible and not measurable, because we do not even know it exists. within the local context, though, defenders and others do know it exists, and they constantly flag it, and again it is not seen as rising to the level of a direct threat. in context it does, including what defenders are called: in certain contexts, when a defender is called a criminal, or an enemy of the state, or a variety of other things, that is a sign that these orchestrated campaigns against them are being activated. and then the catch-22 is that defenders are overwhelmed by the sheer number of posts, more than any one individual can reasonably handle, never mind report.
michelle: thank you. you have raised such important points for platforms to consider. i'm hearing a couple of themes: the need for engagement with human rights defenders at a very localized, community level, and perhaps, as you mentioned, with people who are marginalized or vulnerable themselves, local people. even in the u.s. context you may have a majority group that, as you said, can list common hate speech indicators, but they do not experience it themselves every single day in a way where they understand exactly what those insinuations mean, or exactly what a symbol or an allegory or an allusion means. i'm hearing that, and an opportunity for platforms to look at those patterns and to
figure out what is not rising to the level of a direct threat but is maybe exposing an emerging trend, which sounds like something they need local people to help navigate. go ahead, alex.
alex: certainly at google this is something we are deeply invested in. our trust and safety team is global, and they do exactly this: for our hate speech policy in particular, they look at what hate terms and thematic narratives are coming up, and how those evolve, and they try to make sure our taxonomies keep up with the ways bad actors are dehumanizing people and inciting violence. but it is a huge investment for companies. i see my colleagues from meta here; they do the same thing. i think one of the things we need to think about is ways to resource the diversity of the industry, because it is something the big companies are invested in, and could still improve on, but how do we make sure that smaller companies, which can have a big footprint, get access to this kind of intelligence from experts about what narratives exist and how to interpret them in a particular context? that takes a lot of people and resources that not all companies, in particular small companies, have yet. so i think that is an important ongoing challenge for the ecosystem.
michelle: that is a really important point. as we all know, resourcing can be the difference between being able to solve and address a challenge, and having a great framework that cannot be actualized. i want to turn to jason with a question. hearing this conversation, it is easy to feel that this is an overwhelming problem, extremely complex: what can we really do? seeing these guidelines and recommendations, with clear categories tied to the u.n. standards, is a good reminder that there are things that can be done, and that are being done. but how do you monitor benchmarks and measure success? what does success look like? gni has established assessments for its members on their progress in implementing the principles. how do you think companies can start to benchmark their progress on improving protection for human rights defenders, and how can multi-stakeholder initiatives like gni facilitate that?
jason: thanks. this has been a really rich conversation, and i think the question of how we take this forward and measure progress is a really important one. gni has had a framework for 15 years. we are a multi-stakeholder organization, and we have tech companies, not just internet companies but also telecommunications and infrastructure providers, all of whom interface with free expression and privacy slightly differently, depending on the products and services they provide. but all of them, by virtue of being part of gni, have made a commitment, and that commitment is broken down in the principles and our implementation guidelines into specific measures. then we have a periodic assessment process that each company goes through. in that process, the company brings in an independent third party to review the steps they have taken against each of the categories of obligations and to document them. they do that by looking at the paper trail, at what systems and policies are there, not just the public-facing ones but the internal ones. they talk to key employees, from senior management all the way down to the people in the field who are dealing with threats on a daily basis. and then we look at case studies, actual examples of how free expression and privacy challenges manifest, with particular focus on the interaction between companies and governments, because that is where a lot of these threats arise. all of that goes into a report, which is then shared with our multi-stakeholder membership for review, discussion, and recommendations. we evaluate companies against a benchmark, which is good-faith implementation with improvement over time, and the cycle repeats: the recommendations provided to the companies are reviewed at the next assessment, in order to make sure that progress is happening. that improvement-over-time concept, which is also baked into this guidance, is really critical, and having a concrete framework and process to be able to measure improvement is really important. improvement, of course, can mean lots of different things. i think what is interesting
11:59 pm
today is the fact that we are now seeing a number of emerging laws an regulations that are effectively takingd a lot of these recommendations and guidance from the u.n. guiding principles, the oecd guidelines and baking them into hard law. so, we have things like the corporate sustainability due diligence directive, which is a new process -- in the process of passing in the e.u., which applies to all companies of a certain size, with certain presence in europe regardless of sector. and that basically is a mandatory human rights due diligence law that requires covered companies to conduct human rights due diligence and demonstrate how they are doing that, how they are addressing the risks that they uncover through that. then we have tech specific regulations like the digital services act that apply to segments of the technology ecosystem and there are obligations for all kinds of
different companies; the very largest online service providers and search engines have to go through a pretty rigorous risk assessment process, and they have to have that audited by a third-party auditor. so there are certain elements of the human rights framework that are being codified, which hopefully will mean that not just the very largest providers, many of which are members of gni and have been doing this through a framework of co-governance, but also the smaller providers, the providers who have not committed as much to human rights as a policy, will now have to raise their game, at least to a minimum level -- and over time, hopefully, that floor will rise. there are a lot of questions about how this will work in practice. i do want to say, just taking stock of a lot of the conversation and knowledge and experience we have shared here,
i think when we think about human rights defenders specifically, it may be helpful to think of two kinds of defenses that companies can mount against attacks. it was poignantly pointed out how in myanmar, basically the entire civilian population has become human rights defenders in one way, shape, or form. individual account actions are not something you can do for every single person in myanmar. that's where it's really important to have general corporate practices that can protect against threats, and i think there is none more important in that context than end-to-end encryption, which is a vital security and privacy measure that, fortunately, we have seen tech companies taking steps to mainstream across products and services. you really cannot underscore
enough how important that can be, not just for the explicit human rights defenders but for everyday people who might find themselves, from one day to the next, becoming a target. and you know, that's not an obvious thing, because there are a lot of governments and law enforcement agencies who see end-to-end encryption as an obstacle -- not just the authoritarian ones, but democratic ones too. cybersecurity and information security, security-by-design measures -- these are things that can apply across the board in addressing threat actors. a lot of companies have been putting time into understanding these threats, because they are not just bad for individual users; they are really bad for the product and service if it becomes so pervasively influenced by these threat actors. that's at the general level.
and then we have the specific projects or programs or mechanisms that can be made available to people who are known activists, known targets -- that's like alex was talking about with advanced protection and project shield, and there are a bunch of other programs like that. i think those are really important for those specific actors, but they are never going to be available or useful or really even relevant to the general population. so as we think about measuring progress over time, it is worth looking both at the systemic efforts companies can take -- how those are progressing, and how we are learning across the industry and across stakeholder groups about their impact -- and at the specific measures: how some of those might become not just specific to a particular company and a particular user, but ways that, across the industry, these protections can be provided more consistently. because it was pointed out, i
think really appropriately, that these threat actors don't just play in one space. the threats migrate, as do the users, across different platforms. >> thank you. i appreciate how you talked about the systemic, benchmarking level of companies and their realization of improvements over time, but also how, from the activist or on-the-ground perspective, people can feel impatient for that gradual realization to happen, and the concrete things that are proactively happening in between. in the interest of time, we will definitely turn to some audience q&a. i will start with the first one online, and that way it also gives time for more people to raise their hands, in case anyone is being shy. the question from online that we received: is there anything specifically recommended around gendered online harm? especially given the harm faced
by women human rights defenders. i heard people discussing the fact that gender has been specifically called out. obviously, there are many vulnerabilities that would cause a person to face this outsized harm, and gender is just one of them. i was curious if there's anything specific on gender that anyone wanted to share about the guidance that could be applied to platforms? >> i think it's worth considering that when someone is attacked because of their gender, their family may be attacked as well. and what are the protection mechanisms that already exist? for example, through social movements that work for the protection of women and other marginalized defenders who may be attacked because of their gender or because of hate speech
or otherwise. it's really important to look at who is working within this area, because they have a lot of experience with what gender-specific attacks and threats, and protection and mitigation, look like. it's also not something new. for people who identify as women and others who are marginalized, a lot of this work is already down the road quite a way. but again, the threat actors are using different means to go after them, and in many cases they go after their families. and on support -- say somebody needed to relocate for their safety because of threats -- often, some protection measures do not work because they do not consider their family or other
people around them who may be at risk. protection also includes cases where someone may be at risk from their own family, because of the impact of the threats on them, or from their community or otherwise. >> thank you. i think it's really important, as we are discussing this, for people to remember that these threats can emanate from really anywhere -- unlike a physical threat, where someone is right in front of you -- and that they can affect people in the person's family in addition to themselves. those are great points. >> i would just also add the psychosocial impact, because the threats often speak to denigrating the person and their reputation -- threats against women calling into question her womanhood or being a mother or otherwise. again, i think psychosocial support matters, both in remedy, in responding
and in understanding the impact this has on a person -- besides the fact that they may not be able to navigate online spaces because of threats. it comes from literally every sphere close to a person -- themselves, their family, friends, on out to their society -- and it shapes how they can navigate those different relations, including offline in the physical world. >> i will turn to an audience question. take the microphone from elliott. thanks. >> thank you so much for this panel. it's been really wonderful and instructive, a rich set of conversations, and i'm really excited about the guidance as well. i wanted to go back to something that you, cheryl, said earlier,
which really struck me, about the lack of coordination between platforms on the ground and how that affects human rights defenders. i have been traveling to different contexts over the course of the past year, and it has struck me that in certain spaces, civil society organizations will say, we have a great relationship with youtube, we have direct channels, but meta we cannot get on the phone for anything. in other places, it's, we can talk to tiktok, but we cannot talk to google, or whoever it is. so the lack of coordination across different spaces and geographies is just so apparent. and to jason's point, when there isn't coordination, the threat actors will just migrate -- the threats will migrate to the platforms where there is the least protection. so the question, really for alex mostly: to what extent are there conversations being had among different platforms
about generating some kind of more streamlined approach? and also, to the point about resourcing, it seems like it would be smarter from a resourcing perspective to cooperate with one another on the ground. what are some of the barriers you see to that happening? >> well, i think there's a lot of conversation that happens in the companies, particularly the gni companies, about how we are engaging with intermediary organizations and organizations on the ground in different localities and globally, what programs we have to do that, and what best practices are around implementing those internal guidelines. some of the barriers to more coordination are that the platforms are not
necessarily similarly situated in any given country. the user base, the company's market share, and so on all look really different, and the way people are using their platforms can look different. we won't always have the same amount of resources or energy or the same approach to put into something at a given time. and obviously, as we've talked about, that can evolve: things that might be a priority at one time might become less of a priority as the situation changes or as our user base changes somewhere. but that is a reality of how and why it might look different for different companies in a particular country -- because our services are not being used in the same way, and so our approach to managing the issues might look a little bit different. >> well, thanks. we are, unfortunately, at time, and i want to let all of you get out of here. i felt a palpable sense in the room of how many questions we could have asked,
here and online. i want to thank all of our panelists for all the work they are doing, from each place they sit, in advocating for greater protections for human rights defenders. i want to thank the state department for releasing these really incredible, concrete recommendations for online platforms on protecting human rights defenders. and of course, thank you to the atlantic council for their partnership in this. we will be continuing these conversations; these are important topics. you can contact us at the center for strategic and international studies and the human rights initiative if you have more questions. it is certainly an important issue that affects people from the bottom all the way to the top of our society, in terms of decision-makers and those on the front lines fighting for human rights. thank you again for coming today. please join me in thanking our panelists and our keynote speakers. [applause]
[indiscernible chatter]
[indiscernible chatter] ♪ >> do you solemnly swear that the testimony you are about to give will be the truth, the whole truth, and nothing but the truth, so help you god? >> saturday, watch american history tv's new series "congress investigates," as we explore major congressional investigations. authors and historians will tell these stories, and we will see historic footage and examine the impact and legacy of key congressional hearings. in the 1912 special senate committee investigation into the sinking of the titanic, witnesses testified about ice warnings that were ignored, the inadequate number of lifeboats, and the treatment of different
classes of passengers. we will find out what congress did and how those changes impact travel on the seas today. watch "congress investigates" saturdays on c-span2. ♪ monday, u.s. ambassador to the united nations linda thomas-greenfield discusses diplomacy in the pacific islands, hosted by the center for strategic and international studies in washington, d.c. watch live coverage on c-span, c-span now, or online at c-span.org. next, a conversation on democracy and truth in journalism hosted by the american enterprise institute. journalists examine bias in the media and declining public trust in institutions. this program is just over an hour. [indiscernible chatter]
