The dynamics of online hate: four strategies that could be effective in stamping it out
http://www.asyura2.com/14/it12/msg/297.html
Posted: 29 November 2019 20:09:25 (poster ID: CYdJ4nBd/ys76)

The dynamics of online hate
Nature _CWFXg Vol. 16 No. 12 | doi : 10.1038/ndigest.2019.191233
FNature (2019-09-12) | doi: 10.1038/d41586-019-02447-1 | Strategies for combating online hate
Noemi Derzsy
An analysis of the dynamics of online hate groups on social media platforms has revealed why current approaches that simply ban hate speech are unlikely to succeed, and has laid the groundwork for four strategies that could be effective in stamping out online hate.

metamorworks/iStock / Getty Images Plus/Getty

Hate groups on online social media platforms are notoriously persistent: they keep reappearing even after crackdowns. How do these networks operate, and what countermeasures could eliminate them effectively? To address these questions, Neil Johnson of George Washington University (Washington D.C.) and colleagues studied the online hate communities that span multiple social media platforms, and report their findings in the 12 September 2019 issue of Nature (page 261). Based on what they learned about the structure and dynamics of online hate groups, Johnson and colleagues propose four policies for ridding online social media of hate content.

• Letter
• Published: 21 August 2019
Hidden resilience and adaptive dynamics of the global online hate ecology
• N. F. Johnson,
• R. Leahy,
• N. Johnson Restrepo,
• N. Velasquez,
• M. Zheng,
• P. Manrique,
• P. Devkota &
• S. Wuchty
Abstract
Online hate and extremist narratives have been linked to abhorrent real-world events, including a current surge in hate crimes1,2,3,4,5,6 and an alarming increase in youth suicides that result from social media vitriol7; inciting mass shootings such as the 2019 attack in Christchurch, stabbings and bombings8,9,10,11; recruitment of extremists12,13,14,15,16, including entrapment and sex-trafficking of girls as fighter brides17; threats against public figures, including the 2019 verbal attack against an anti-Brexit politician, and hybrid (racist–anti-women–anti-immigrant) hate threats against a US member of the British royal family18; and renewed anti-western hate in the 2019 post-ISIS landscape associated with support for Osama Bin Laden's son and Al Qaeda. Social media platforms seem to be losing the battle against online hate19,20 and urgently need new insights. Here we show that the key to understanding the resilience of online hate lies in its global network-of-network dynamics. Interconnected hate clusters form global 'hate highways' that—assisted by collective online adaptations—cross social media platforms, sometimes using 'back doors' even after being banned, as well as jumping between countries, continents and languages. Our mathematical model predicts that policing within a single platform (such as Facebook) can make matters worse, and will eventually generate global 'dark pools' in which online hate will flourish. We observe the current hate network rapidly rewiring and self-repairing at the micro level when attacked, in a way that mimics the formation of covalent bonds in chemistry. This understanding enables us to propose a policy matrix that can help to defeat online hate, classified by the preferred (or legally allowed) granularity of the intervention and top-down versus bottom-up nature. We provide quantitative assessments for the effects of each intervention.
This policy matrix also offers a tool for tackling a broader class of illicit online behaviours21,22 such as financial fraud.
Main
Current strategies to defeat online hate tend towards two ends of the scale: a microscopic approach that seeks to identify 'bad' individual(s) in the sea of online users1,14,16, and a macroscopic approach that bans entire ideologies, which results in allegations of stifling free speech23. These two approaches are equivalent to attempts to try to understand how water boils by looking for a bad particle in a sea of billions (even though there is not one for phase transitions24), or the macroscopic viewpoint that the entire system is to blame (akin to thermodynamics24). Yet, the correct science behind extended physical phenomena24 lies at the mesoscale in the self-organized cluster dynamics of the developing correlations, with the same thought to be true for many social science settings25,26,27.
A better understanding of how the ecology of online hate evolves could create more effective intervention policies. Using entirely public data from different social media platforms, countries and languages, we find that online hate thrives globally through self-organized, mesoscale clusters that interconnect to form a resilient network-of-networks of hate highways across platforms, countries and languages (Fig. 1). Our mathematical theory shows why single-platform policing (for example, by Facebook) can be ineffective (Fig. 2) and may even make things worse. We find empirically that when attacked, the online hate ecology can quickly adapt and self-repair at the micro level, akin to the formation of covalent bonds in chemistry (Fig. 3). We leave a detailed study of the underlying social networks to future work because our focus here is on the general cross-platform behaviour. Knowledge of these features of online hate enables us to propose a set of interventions to thwart it (Fig. 4).

Fig. 1: Global ecology of online hate clusters.

a, Schematic of resilient hate ecology that we find flourishing online, mixing hate narratives, languages and cultures across platforms. A1, A2 and A3 denote three types of self-organized adaptation that we observe that quickly build new bridges between otherwise independent platforms (see main text). We focus on Facebook (FB) and VKontakte (VK) clusters, shown as large blue and red symbols, respectively; different shapes represent different hate narratives. An undirected (that is, no arrowhead) coloured link between two hate clusters indicates a strong two-way connection. Small black circles indicate users, who may be members of 1, 2, 3, … hate clusters; a directed (that is, with arrowhead) link indicates that the user is a member of that hate cluster. b, Placing hate clusters at the location of their activity (for example, 'Stop White Genocide in South Africa' (SA)) reveals a complex web of global hate highways built from these strong inter-cluster connections. Only the basic skeleton is shown. Bridges between Facebook and VKontakte (for example, A1, A2 and A3 in a) are shown in green. When the focus of a hate cluster is an entire country or continent, the geographical centre is chosen. Inset shows dense hate highway interlinkage across Europe. c, Microscale view of the actual KKK hate-cluster ecosystem. The ForceAtlas2 algorithm used is such that the further two clusters are apart, the fewer users they have in common. Hate-cluster radii are determined by the number of members. d, Schematic showing the synapse-like nature of individual hate clusters.
Source Data
Full size image
Fig. 2: Mathematical model showing resilience of hate-cluster ecology.

Full size image
Fig. 3: Adaptive dynamics of online hate at the microscale.

a, The KKK ecosystem on VKontakte before and after the school shooting in Parkland, Florida, on 14 February 2018. During subsequent weeks, rapid microscale rewiring due to individual cluster-joining initiated 'bonds' between previously distinct KKK clusters. For clarity, only users (white circles) that change status in the next time step are shown; otherwise the image would be as dense as in b. Larger red nodes are clusters that are closed (that is, closed VKontakte groups); green nodes are open (that is, open VKontakte groups). Yellow links mean that the user will leave the cluster between day t and day t + 1, meaning that the link will disappear. Blue links mean that the user joins the cluster on day t. b, Full KKK ecology on VKontakte after the shooting in Parkland, showing a strong 'bond' between the largest KKK clusters. c, Remarkably similar bonding suddenly emerges in the anti-western (jihadi) hate-cluster ecology around 18 March 2015, a few days after a coalition strike appears to have wounded ISIS leader Abu Bakr al-Baghdadi. This coincides with rumours immediately circulating among these hate clusters that top ISIS leaders were meeting to discuss who would replace him if he died, suggesting that his injuries were serious. However, none of this became public knowledge in the media—that is, the observed rewiring and self-repair that fuses two clusters into one (that is, two disappeared, shown in yellow, and one appeared, shown in blue) is a self-organized, adaptive response of the online hate system. Although b mimics electronic covalent bonding, c is a more extreme version of bonding, more akin to nuclear fusion. The ForceAtlas2 algorithm used to plot b and c is such that the further two clusters are apart, the fewer users they have in common.
Full size image
Fig. 4: Policy matrix from our findings.

Descriptions of policies 1–4 are supplied in the main text, and each policy intervention is shown in green. The best policy for a given setting can be chosen according to the required balance between legally allowed (preferred) granularity and the legally allowed (preferred) nature of intervention.
Full size image
Our analysis of online clusters does not require any information about individuals, just as information about a specific molecule of water is not required to describe the bubbles (that is, clusters of correlated molecules) that form in boiling water. Online clusters such as groups, communities and pages are a popular feature of platforms such as Facebook and VKontakte, which is based in central Europe, has hundreds of millions of users worldwide, and had a crucial role in previous extremist activity27. Such online clusters allow several individual users to self-organize around a common interest27 and they collectively self-police to remove trolls, bots and adverse opinions. Some people find it attractive to join a cluster that promotes hate because its social structure reduces the risk of being trolled or confronted by opponents. Even on platforms that do not have formal groups, quasi-groups can be formed (for example, Telegram). Although Twitter has allowed some notable insights26, we do not consider it here as its open-follower structure does not fully capture the tendency of humans to form into tight-knit social clusters (such as VKontakte groups) in which they can develop thoughts without encountering opposition. Our online cluster search methodology generalizes that previously described27 to multiple social media platforms and can be repeated for any hate topic (see Methods for full details).
The global hate ecology that we find flourishing online is shown in Fig. 1a, b. The highly interconnected network-of-networks28,29,30 mixes hate narratives across themes (for example, anti-Semitic, anti-immigrant, anti-LGBT+), languages, cultures and platforms. This online mixing manifested itself in the 2019 attack in Christchurch: the presumed shooter was Australian, the attack was in New Zealand, and the guns carried messages in several European languages on historical topics that are mentioned in online hate clusters across continents. We uncover hate clusters of all sizes—for example, the hate-cluster distribution for the ideology of the Ku Klux Klan (KKK) on VKontakte has a high goodness-of-fit value for a power-law distribution (Extended Data Fig. 1). This suggests that the online hate ecology is self-organized, because it would be almost impossible to engineer this distribution using top-down control. The estimated power-law exponent is consistent with a sampling of anti-western hate clusters as well as the online ecology of financial fraud21, suggesting that our findings and policy suggestions can help to tackle a broader class of illicit online behaviours21,22.
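The power-law point above can be illustrated with a self-contained sketch. This is not the authors' pipeline (they fit real cluster-size data); it simply draws synthetic cluster sizes from a power law by inverse-transform sampling and recovers the exponent with the standard maximum-likelihood estimator of Clauset-style fitting.

```python
import math
import random

def sample_power_law(n, alpha=2.5, xmin=1.0, seed=0):
    """Draw n samples from a continuous power law p(x) ~ x^(-alpha), x >= xmin,
    via inverse-transform sampling."""
    rng = random.Random(seed)
    return [xmin * (1.0 - rng.random()) ** (-1.0 / (alpha - 1.0)) for _ in range(n)]

def mle_exponent(xs, xmin=1.0):
    """Maximum-likelihood estimate of the exponent: alpha_hat = 1 + n / sum(ln(x/xmin))."""
    xs = [x for x in xs if x >= xmin]
    return 1.0 + len(xs) / sum(math.log(x / xmin) for x in xs)

sizes = sample_power_law(50_000, alpha=2.5)
alpha_hat = mle_exponent(sizes)  # close to the true exponent 2.5
```

A goodness-of-fit test (for example, a Kolmogorov–Smirnov comparison against the fitted law) would follow the same estimate in a fuller treatment.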
We observe operationally independent platforms—that are also commercial competitors—becoming unwittingly coupled through dynamical, self-organized adaptations of the global hate-cluster networks. This resilience helps the hate ecology to recover quickly after the banning of single platforms. The three types of adaptation bridging VKontakte and Facebook that enabled hate to re-enter Facebook through the 'back door' are shown in Fig. 1a: (A1) hate-cluster mirroring; (A2) hate-cluster reincarnation; and (A3) direct inter-hate-cluster linkage (see Supplementary Information). We observed A2 after Facebook banned the KKK. An ecology of nearly 60 KKK clusters remained on VKontakte (Fig. 1c) that included posts in Ukrainian. When the Ukrainian government banned VKontakte, the VKontakte-based KKK ecosystem (Fig. 1c) reincarnated KKK cluster(s) back on Facebook, but with "KuKluxKlan" written in Cyrillic, making it harder to catch with English-language detection algorithms. Hence, adaptation A2 enabled the hate ideology to implant cluster(s) with thousands of supporters back into a platform in which it was still banned.
A sample of the hate-cluster network placed on a global map using self-reported location information of each cluster is shown in Fig. 1b. This shows how clusters connect across different continents creating one-step highways for hate content. The Facebook and VKontakte hate bridges occur in Europe, the United States and South Africa, even though VKontakte is often thought of as being local to central Europe. Europe (Fig. 1b, inset) shows a particularly complex hate ecology, which reflects intertwined narratives that cross languages and declared causes—for example, neo-Nazi clusters with membership drawn from the United Kingdom, Canada, United States, Australia and New Zealand feature material about English football, Brexit and skinhead imagery while also promoting black music genres. So although the hate may be pure, the rationale given is not, which suggests that this online ecology acts like a global fly-trap that can quickly capture new recruits from any platform, country and language, particularly if they do not yet have a clear focus for their hate.
Our mathematical model in Fig. 2 predicts additional resilience and its negative consequences for the current battle against online hate. It considers the fairly common observation in our data of a ring of c connected hate clusters within a given platform (for example, platform 1, see Extended Data Fig. 2). In our model, each hate cluster is attempting to spread its hate material to other clusters in the ring through links such as A1, A2 and/or A3 (Fig. 1a), but incurs a cost R when its material passes between platforms 1 and 2 because of the risk of sanctions on platform 2 (Facebook is better policed). We assume a probability q of a given hate cluster on platform 1 sending its hate material on a path via platform 2. The following formula, derived in the Supplementary Information, then gives the cluster-averaged value of the shortest path (that is, the average length of the hate highway)30 between the c hate clusters on platform 1:
\bar{\ell} = \frac{R(R-1)}{2(c-1)} + \frac{(1-q)^{c-R}\,[3+q(c-2-R)]}{q^{2}(c-1)} + \frac{q\,[2-2R+2c-q(R-1)(R-c)]-3}{q^{2}(c-1)}
(1)
Figure 2b shows ℓ̄ as a function of the number of links between platforms 1 and 2 ( = cq with c fixed) when R increases linearly with , which is consistent with more links carrying more risk. The minimum in ℓ̄ has an important negative consequence. Suppose platform 2 finds a large number of hate links from 1, and manages to find some and shut them down, hence reducing . It can inadvertently decrease the average shortest path ℓ̄ between hate clusters on platform 1 (for example, VKontakte), hence accelerating how hate content gets shared within platform 1. The existence of several operationally independent platforms (Fig. 2a) with their own moderators and no coordinated cross-platform policing gives rise to a further resilience: our mathematical model (see Supplementary Information) shows that sections of the less policed platforms can then become isolated, creating spontaneous 'dark pools' of hate highways (dark region in Fig. 2a).
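The ring-plus-shortcuts geometry behind equation (1) can be explored numerically. The toy below is my own construction, not the authors' model or derivation: c clusters sit on a ring with unit-weight links, and each cluster, with probability q, gains a link into a second-platform hub through which any two linked clusters can route at total cost R. Because both graphs are drawn with the same random seed, the sparse link set is a subset of the dense one, so in this toy pruning inter-platform links can only leave shortest paths equal or longer.

```python
import heapq
import random

def avg_shortest_path(c, q, R, seed=0):
    """Mean shortest-path length between the c platform-1 clusters.
    Node c represents platform 2; each half-hop into it costs R/2, so a
    via-platform-2 route between two linked clusters costs R in total."""
    rng = random.Random(seed)
    adj = {i: [] for i in range(c + 1)}
    for i in range(c):                      # unit-weight ring on platform 1
        j = (i + 1) % c
        adj[i].append((j, 1.0)); adj[j].append((i, 1.0))
    for i in range(c):                      # probabilistic links into platform 2
        if rng.random() < q:
            adj[i].append((c, R / 2)); adj[c].append((i, R / 2))
    total, pairs = 0.0, 0
    for s in range(c):                      # Dijkstra from every cluster
        dist = {s: 0.0}; pq = [(0.0, s)]
        while pq:
            d, u = heapq.heappop(pq)
            if d > dist.get(u, float("inf")):
                continue
            for v, w in adj[u]:
                if d + w < dist.get(v, float("inf")):
                    dist[v] = d + w
                    heapq.heappush(pq, (d + w, v))
        for t in range(c):
            if t != s:
                total += dist[t]; pairs += 1
    return total / pairs

dense = avg_shortest_path(c=40, q=0.5, R=4.0)    # many inter-platform links
sparse = avg_shortest_path(c=40, q=0.05, R=4.0)  # after heavy link removal
```

The text's non-monotonic effect arises when R itself grows with the number of links; this sketch fixes R and only shows the path-length machinery.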
Further resilience at the micro level occurs in the form of rapid rewiring and self-repair that mimics covalent bonding from chemistry, in apparent response to real-world events (Fig. 3). The ecology of the KKK on VKontakte (Fig. 3a) rewired around accusations just after the school shooting in Parkland, Florida. We do not know of any evidence that these clusters were involved, but news reports discussed the presumed shooterfs interest in the KKK, and its themes and symbols, hence these clusters probably worried about increased scrutiny. Links like chemical bonds quickly form between KKK hate clusters in a bottom-up, self-organized way. This adaptive evolutionary response helps the decentralized KKK ideological organism to protect itself by bringing together previously unconnected supporters. The network is presented on a larger scale in Fig. 3b, with the bonding density of common users clearly visible (white cloud between the green clusters). We also see this same bonding (Fig. 3c) emerge in the response of anti-western jihadist hate groups in 2015 when the leader of the Islamist terrorist group ISIS was reportedly injured in an air strike. We speculate that this covalent bonding is a general adaptive mechanism for online hate, and maybe for other illicit activities.
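The 'bond' of shared members between two clusters can be quantified in several ways; the sketch below uses Jaccard overlap of member sets, which is an illustrative choice of mine (the paper itself visualizes counts of common users with ForceAtlas2). The user IDs are hypothetical.

```python
def bond_strength(cluster_a, cluster_b):
    """Jaccard overlap of two clusters' member sets -- a simple proxy for the
    covalent-bond-like coupling created when users join both clusters."""
    a, b = set(cluster_a), set(cluster_b)
    return len(a & b) / len(a | b)

# Hypothetical snapshots of two clusters before and after a real-world event:
before = bond_strength({"u1", "u2", "u3"}, {"u4", "u5"})
after = bond_strength({"u1", "u2", "u3", "u9"}, {"u4", "u5", "u9"})  # u9 joins both
```

A rising overlap between previously disjoint clusters is the micro-level signature of the rewiring described above.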
These insights suggest a matrix of interventions (Fig. 4 and Extended Data Fig. 3) according to the preferred top-down versus bottom-up approach on a given platform and the legal context in a given country. Each policy can be adopted on a global scale simultaneously by all platforms without them needing to share sensitive information. Policy 1 reduces the number of large hate clusters. One might assume that this can be achieved by banning the largest clusters, but the approximate power-law distribution of hate-cluster sizes means that others of similar size will quickly replace them. Instead, policy 1 exploits the underlying self-organizing mechanism by which large clusters form from smaller ones: large clusters can hence be reduced by first banning smaller clusters, with the advantage that smaller clusters are more numerous and easier to find. Policy 2 randomly bans a small fraction of individual users across the online hate population. Choosing a small fraction lowers the risk of multiple lawsuits, and choosing randomly serves the dual role of lowering the risk of banning many users from the same cluster and of inciting a large crowd. Policy 3 exploits the self-organized nature of the system by setting clusters against each other in an organic, hands-off way, akin to a human's immune system. Our data show that there are a reasonable number of active anti-hate users online. Platform managers can encourage anti-hate users to form clusters, for example, through artificial anti-hate accounts as a nucleating mechanism, which then engage in narrative debate with online hate clusters. Online hate-cluster narratives can then be neutralized, with the number of anti-hate users determined by the desired time to neutralization. Policy 4 can help platforms with multiple, competing hate narratives. In our data, some white supremacists call for a unified Europe under a Hitler-like regime, and others oppose a united Europe.
Similar in-fighting exists between hate clusters of the KKK movement. Adding a third population in a pre-engineered format then allows the hate-cluster extinction time to be manipulated globally (Extended Data Fig. 3).
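Policy 1's logic—starve large clusters by removing the small ones they coalesce from—can be sketched with a minimal coalescence–fragmentation toy model, in the spirit of the aggregation dynamics of ref. 27. The mechanics and parameters here are illustrative simplifications of mine (for example, clusters are picked uniformly rather than in proportion to size).

```python
import random

def simulate(steps, n=1000, nu_frag=0.05, ban_below=None, seed=1):
    """Toy coalescence-fragmentation dynamics over n users.
    Each step: with probability nu_frag a random cluster shatters into
    singletons; otherwise two random clusters merge. If ban_below is set,
    clusters of size 2..ban_below-1 are dissolved ('banned') every step,
    their members re-entering as isolated users -- a caricature of policy 1."""
    rng = random.Random(seed)
    clusters = [1] * n
    for _ in range(steps):
        if rng.random() < nu_frag or len(clusters) < 2:
            size = clusters.pop(rng.randrange(len(clusters)))
            clusters.extend([1] * size)           # fragment into singletons
        else:
            i, j = rng.sample(range(len(clusters)), 2)
            merged = clusters[i] + clusters[j]    # coalesce two clusters
            for k in sorted((i, j), reverse=True):
                clusters.pop(k)
            clusters.append(merged)
        if ban_below is not None:                 # dissolve small (non-singleton) clusters
            banned = sum(s for s in clusters if 1 < s < ban_below)
            clusters = [s for s in clusters if s == 1 or s >= ban_below]
            clusters.extend([1] * banned)
    return clusters

free = simulate(5000)                  # unpoliced ecology
policed = simulate(5000, ban_below=5)  # small clusters banned each step
```

Under policing, the population is conserved but the intermediate sizes that feed large-cluster formation are continually removed.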
Limitations to our study include the fact that we cannot yet include all platforms because of a lack of public access. Also, our quantitative analysis is highly idealized in order to generate quantitative answers. Although not intended to capture the complications of any specific real-world setting, the benefit of the modelling approaches in Figs. 2a and 4 is that the output is precisely quantified, reproducible and generalizable, and can therefore help to frame policy discussions as well as probe what-if intervention scenarios. Our findings can also shed light on how illicit networks operate under similar pressures—that is, networks that similarly need to remain open enough to find new recruits yet hidden enough to avoid detection21,22.


Methods

Our online cluster search methodology is a direct generalization of that previously described27, but now looking at several social media platforms. It can be repeated for any hate topic, but we focus here on extreme right-wing hate because it is prevalent globally and has been linked to many recent violent real-world attacks. We observe many different forms of hate that adopt similar cross-platform tricks. Whether a particular cluster is strictly a hate philosophy, or simply shows material with tendencies towards hate, does not alter our main findings. Our research avoids the need for any information about individuals, just as information about a specific molecule of water is not needed to describe the bubbles (that is, clusters of correlated molecules) that form in boiling water. Our hate-cluster network analysis starts from a given hate cluster 'A' and captures any hate cluster 'B' to which hate cluster A has shared an explicit cluster-level link, and vice versa from B to A (see Supplementary Information).
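The cluster-level search just described is, in graph terms, a snowball (breadth-first) crawl over explicit cluster-to-cluster links. A minimal sketch, with `outlinks` standing in for the platform crawling that the authors' software performs and the cluster IDs purely hypothetical:

```python
from collections import deque

def snowball(seed_cluster, outlinks):
    """Breadth-first snowball search over cluster-level links: starting from a
    known hate cluster, follow explicit links until no new clusters appear.
    `outlinks` maps a cluster ID to the cluster IDs it links to.
    Returns the discovered nodes and directed edges."""
    seen, edges = {seed_cluster}, set()
    frontier = deque([seed_cluster])
    while frontier:
        a = frontier.popleft()
        for b in outlinks.get(a, ()):
            edges.add((a, b))
            if b not in seen:
                seen.add(b)
                frontier.append(b)
    return seen, edges

# Toy link map (hypothetical cluster IDs spanning two platforms):
links = {"kkk_1": ["kkk_2", "vk_77"], "vk_77": ["kkk_1"], "kkk_2": []}
nodes, edges = snowball("kkk_1", links)
```

Iterating daily until the frontier returns only already-listed clusters, as the next paragraph describes, corresponds to rerunning this crawl to a fixed point.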
We also developed software to perform this process automatically and, after cross-checking the findings with our manual list, were able to obtain approximately 90% consistency between the manual and automated versions. Each day, we iterated this process until the search led back to hate clusters that were already in the list. For the global hate network, we identified 768 nodes (that is, hate clusters) and 578 edges. This is larger than the number of clusters obtained in the previous study27 of anti-western hate (specifically, pro-ISIS aggregates, which numbered a few hundred on VKontakte). But the fact it is of similar magnitude suggests that the process by which billions of users cluster globally online into hate communities is such that it produces hundreds of clusters—not tens of thousands but also not ten or so.
Reporting summary
Further information on research design is available in the Nature Research Reporting Summary linked to this paper.
Data availability
The dataset is provided as Source Data. The open-source software packages Gephi and R were used to produce the networks in the figures. No custom software was used.
References
1. The UK Home Affairs Select Committee. Hate Crime: Abuse, Hate and Extremism Online. Session 2016–17, HC 609. https://publications.parliament.uk/pa/cm201617/cmselect/cmhaff/609/609.pdf (2017).
2. Cullors, P. Online Hate Is a Deadly Threat. https://edition.cnn.com/2018/11/01/opinions/social-media-hate-speech-cullors/index.html (2018).
3. Beirich, H., Hankes, K., Piggott, S., Schlatter, E. & Viets, S. The Year in Hate and Extremism. https://www.splcenter.org/fighting-hate/intelligence-report/2017/year-hate-and-extremism (2017).
4. Hohmann, J. Hate Crimes Are a Much Bigger Problem than Even the New FBI Statistics Show. https://www.washingtonpost.com/news/powerpost/paloma/daily-202/2018/11/14/daily-202-hate-crimes-are-a-much-bigger-problem-than-even-the-new-fbi-statistics-show/5beba5bd1b326b39290547e2/?utm_term=.e203814306e8 (2018).
5. Reitman, J. U.S. Law Enforcement Failed to See the Threat of White Nationalism. Now They Don't Know How to Stop It. https://www.nytimes.com/2018/11/03/magazine/FBI-charlottesville-white-nationalism-far-right.html (2018).
6. Southern Poverty Law Center (SPLC). Extremist Groups. https://www.splcenter.org/fighting-hate/extremist-files/groups (2018).
7. John, A. et al. Self-harm, suicidal behaviours, and cyberbullying in children and young people: systematic review. J. Med. Internet Res. 20, e129 (2018).
8. Berman, M. Prosecutors Say Accused Charleston Church Gunman Self-Radicalized Online. https://www.washingtonpost.com/news/post-nation/wp/2016/08/22/prosecutors-say-accused-charleston-church-gunman-self-radicalized-online/?utm_term=.4f17303dffd4 (2016).
9. Pagliery, J. The Suspect in Congressional Shooting Was Bernie Sanders Supporter, Strongly Anti-Trump. http://www.cnn.com/2017/06/14/homepage2/james-hodgkinson-profile/index.html (2017).
10. Yan, H., Simon, D. & Graef, A. Campus Killing: Suspect Is a Member of 'Alt-Reich' Facebook Group. http://www.cnn.com/2017/05/22/us/university-of-maryland-stabbing/index.html (2017).
11. Amend, A. Analyzing a Terrorist's Social Media Manifesto: the Pittsburgh Synagogue Shooter's Posts on Gab. https://www.splcenter.org/hatewatch/2018/10/28/analyzing-terrorists-social-media-manifesto-pittsburgh-synagogue-shooters-posts-gab (2018).
12. Gill, P. & Corner, E. in Terrorism Online: Politics, Law, Technology (eds Jarvis, L. et al.) Ch. 1 (Routledge, 2015).
13. Gill, P. et al. Terrorist use of the internet by the numbers: quantifying behaviors, patterns, and processes. Criminol. Public Pol. 16, 99–117 (2017).
14. Gill, P. Lone Actor Terrorists: A Behavioral Analysis (Routledge, 2015).
15. Gill, P., Horgan, J. & Deckert, P. Bombing alone: tracing the motivations and antecedent behaviors of lone-actor terrorists. J. Forensic Sci. 59, 425–435 (2014).
16. Schuurman, B. et al. End of the lone wolf: the typology that should not have been. Stud. Conflict Terrorism 42, 771–778 (2017).
17. Panin, A. & Smith, L. Russian Students Targeted as Recruits by Islamic State. https://www.bbc.co.uk/news/world-europe-33634214 (2015).
18. Foster, M. The Racist Online Abuse of Meghan Markle Has Put Royal Staff on High Alert. https://www.cnn.com/2019/03/07/uk/meghan-kate-social-media-gbr-intl/index.html (2019).
19. Wakefield, J. Christchurch Shootings: Social Media Races to Stop Attack Footage. https://www.bbc.com/news/technology-47583393 (2019).
20.
21.
22. Wong, J. C. Anti-Vaxx Mobs: Doctors Face Harassment Campaigns on Facebook. https://www.theguardian.com/technology/2019/feb/27/facebook-anti-vaxx-harassment-campaigns-doctors-fight-back (2019).
23. Martínez, A. G. Want Facebook to Censor Speech? Be Careful What You Wish For. https://www.wired.com/story/want-facebook-to-censor-speech-be-careful-what-you-wish-for/ (2018).
24. Stanley, H. E. Introduction to Phase Transitions and Critical Phenomena (Oxford Univ. Press, 1988).
25. Hedström, P., Sandell, R. & Stern, C. Meso-level networks and the diffusion of social movements. Am. J. Sociol. 106, 145–172 (2000).
26. Grinberg, N., Joseph, K., Friedland, L., Swire-Thompson, B. & Lazer, D. Fake news on Twitter during the 2016 U.S. presidential election. Science 363, 374–378 (2019).
27. Johnson, N. F. et al. New online ecology of adversarial aggregates: ISIS and beyond. Science 352, 1459–1463 (2016).
28. Havlin, S., Kenett, D. Y., Bashan, A., Gao, J. & Stanley, H. E. Vulnerability of network of networks. Eur. Phys. J. Spec. Top. 223, 2087 (2014).
29. Palla, G., Barabási, A. L. & Vicsek, T. Quantifying social group evolution. Nature 446, 664–667 (2007).
30. Jarrett, T. C., Ashton, D. J., Fricker, M. & Johnson, N. F. Interplay between function and structure in complex networks. Phys. Rev. E 74, 026116 (2006).
Acknowledgements
N.F.J. is supported by US Air Force (AFOSR) grant FA9550-16-1-0247.
Author information
Affiliations
1. Physics Department, George Washington University, Washington, DC, USA
o N. F. Johnson
o , R. Leahy
o & N. Johnson Restrepo
2. Elliott School of International Affairs, George Washington University, Washington, DC, USA
o N. Velasquez
3. Physics Department, University of Miami, Coral Gables, FL, USA
o M. Zheng
o & P. Manrique
4. Computer Science Department, University of Miami, Coral Gables, FL, USA
o P. Devkota
o & S. Wuchty
Contributions
All authors contributed to the research design, the analysis and writing the paper.
Corresponding author
Correspondence to N. F. Johnson.
Ethics declarations
Competing interests
The authors declare no competing interests.
Publisher's note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Peer review information Nature thanks Paul Gill, Nour Kteily and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.
Extended data figures and tables
Extended Data Fig. 1 Power laws.
a, b, Power laws for the KKK ecology (a) and the ecology of illicit financial activities (b). Their power-law exponents () are similar in a and b, and also consistent with c. c, The results from aggregating data from different thematic subsystems, each of which has a power-law distribution with an exponent (i) distributed around 2.5. d, Summary of the simulation procedure. N power-law distributions are created with a power-law exponent distributed around 2.5. Power-law exponents were then sampled, followed by a power-law test. e, Distribution of the resulting power-law exponents from this simulation procedure, for different values of the mean number of points in the underlying distributions (mu values). The resulting power law exponents are centred near 1.7, as observed in a and b.
Extended Data Fig. 2 Cluster loop.
a, Cluster loop from Fig. 2. b, Example of a loop of clusters.
Extended Data Fig. 3 Predicted policy effects.
a, The effects of policy 1, with on average more than 550 widely spaced time steps for = 10 and N = 10^4. If the size of an aggregate remains within the range smin to smax for a particular time period , that aggregate then fragments. b, The effects of policy 2. Colours represent different intervention starting times (tI) in units of days (vertical grey lines): green tI = 80, red tI = 120, blue tI = 200. Line types represent different percentages of individuals randomly removed (that is, banned) at time tI: dashed line 10%, dotted line 30%, solid line 50%. c, Results for policy 3: the time to extinction (T) as a function of the initial population partition (N + P = 1,000 fixed, with N being the initial size of the hate population and P being the initial size of the anti-hate population) and the banning rate of the platform, from numerical simulations and also analytic theory. d, Policy 4: the effect of different allocations of 100 peacekeepers in the hate-cluster versus anti-hate-cluster scenario. nc is the number of clusters of peacekeepers (that is, individuals of type C) that have size sc.
Supplementary information
Supplementary Information
Supplementary Methods, Supplementary Discussion, Supplementary Equations and Supplementary Notes.
Reporting Summary
Source data
Source Data Fig. 1
