Digital Feudalism: Information Warfare and the Rise of Political Kingmakers
Diagnosing Democracy's Digital Disease and Charting a Path Forward
I. The Crisis of Truth
“So you are saying that human agreement decides what is true and what is false?”—It is what human beings say that is true and false; and they agree in the language they use. That is not agreement in opinions, but in form of life.
- Wittgenstein, Philosophical Investigations §241
This piece is intended as a loose political and sociological rant on how social media and our information landscape have led to our current (horrible) political landscape. It outlines some of my main motivations for a project I'm working on: Building Resilient Ecological Computational Infrastructure for Preserving Liberal Democracies.
In our digital age, something fundamental has shifted in how political power operates and how truth is constructed. This piece traces that transformation, beginning with how social media platforms have created an attention economy[2] that prioritises engagement over truth. From there, we'll see how this attention-capture system became the perfect vehicle for modern propaganda, turning social media into a sophisticated theatre of political manipulation. Understanding this requires examining how our minds process information - through fast, intuitive reactions and slower, deliberative thinking - and how digital influencers exploit these cognitive mechanisms.
This exploitation has given rise to a new political framework called Diagonalism, where traditional ideological divisions dissolve into networks of influence centred around powerful digital figures - our new feudal lords. What emerges is a picture of modern politics where truth matters less than narrative control, and where power flows not through traditional institutions but through the ability to shape perception itself. By understanding this transformation, we can better grasp both the crisis facing liberal democracy and potential paths toward building more resilient democratic systems.
II. The Attention Economy and Brainrot
Oxford's 2024 Word of the Year was “brain rot”[1], a term which has emerged out of online culture, from the algorithmic recommendation feeds of TikTok, Instagram and YouTube, describing the selection pressures for completely nonsensical, attention-grabbing gibberish: content that often combines various themes touching on political and social trends, and uses cutting-edge technology like AI video generation, to produce absolutely cursed slop.
This meme seems to be somewhat organic, not promoted by any particular political interest group or corporation to sell anything, but rather an expression of the sense of being openly manipulated by, addicted to, and dependent on our devices and modern social media, and of feeling their negative effects on our ability to focus, think and spend our time freely.
Using our phones and social media apps, we are not guests in a well-intentioned guest house, or even customers being sold something; rather, we are the product. We engage with sites whose sole purpose is to capture our attention, keep us glued and addicted to our screens, and bring advertising and political content into our psyches to control and manipulate us.
This has been described as the attention economy[2], and this technological landscape combines with earlier tools of psychological manipulation and control, developed under corporate capitalism, to at least partially create … the complicated mess we are in.
Corporations first became interested in psychologically manipulating people en masse in order to leave the population with insatiable desires and a sense of inadequacy, so that we buy the next version of the thing we already have because we emotionally identify with it—forevermore eliminating the demand-side problem of satiated desire [3].
These techniques were adopted by politicians and state institutions with political and geopolitical objectives to control and direct the power of crowds both inside the voting booth and outside of it, towards the inconvenient apparatus of state institutions that might control and keep in check authoritarian power.
The aims of Silicon Valley entrepreneurs were somewhat utopian. The idea was that with access to information, the public would be less susceptible to control by authoritarian governments, and that these technological advancements would therefore enable mankind's inevitable movement towards some kind of utopian society (a vision often influenced by the ideology of Ayn Rand).
The problems became twofold: First, the unintended consequences of these new information technologies exposed the shortcomings of Randian individualist philosophy. The assumption that individuals, given access to information, would naturally arrive at truth failed to account for how information itself could be weaponized. Second, capitalist incentives to generate revenue and growth transformed these platforms into addictive engagement machines, hijacking human psychology to sell advertisements.
This convergence created a perfect storm: platforms optimized for emotional engagement met political actors ready to exploit these psychological vulnerabilities. While Silicon Valley dreamed of digital liberation, they had inadvertently built the most sophisticated propaganda delivery system in human history. The very features that made these platforms effective at capturing attention—algorithmic amplification, viral sharing, targeted content—became powerful tools for those seeking to shape public opinion and undermine democratic discourse. What emerged was not the promised marketplace of ideas, but rather a battlefield of competing narratives where truth became secondary to engagement, and where the lines between advertising, entertainment, and propaganda increasingly blurred.
III. The Evolution of Propaganda in the Digital Age
While propaganda has existed throughout history - from ancient military deception to wartime posters - the digital age has transformed both its reach and sophistication. Classical propaganda relied on centralised control of information channels: newspapers, radio, and television. These methods required significant resources and could be countered through competing information sources. Digital propaganda, however, operates through decentralised networks, using algorithmic amplification and psychological targeting that would have been impossible in earlier eras.
At its core, propaganda remains an attempt to "influence people's attitudes and behaviours, either by promoting a particular ideology or by persuading them to take a specific action"[4].
However, in the Information Age, its implementation has evolved dramatically. The first states to fully grasp and exploit these new capabilities were the Russians, while Western countries remained captivated by an optimistic narrative about digital media's democratising potential - a belief that universal access to information would inevitably lead to a more informed and discerning population.
This liberal, democratic assumption suffered from two critical flaws. First, as noted by scholars, "no historical law ensures that every such transformation will be more creative than destructive from the standpoint of liberal democratic values, especially where the market alone cannot be expected to produce a public good at anything like an optimal level"[5]. Second, it naively assumed a population of rational actors encountering neutral information in good faith, equipped with the tools to evaluate source credibility.
The Russians, unburdened by such idealistic pretenses, embraced new technologies as instruments of social control. Their approach was epitomised by Vladislav Surkov, a theatre director with military experience who reconceptualised politics as performance art. As researchers note,
"Surkov is an excellent dramaturg; he writes scripts, casts actors, analyzes their performance and narratives, runs promotions, and puts the repertoire into motion to achieve intended reactions of the target audience. Methods... [help] the Kremlin to manipulate public opinion as well as election systems using pseudo-experts, technical parties, fake civic organizations and youth movement such as Nashi, and covert media techniques"[6].
When combined with Russia's broader information warfare strategy, Surkov's theatrical approach revealed a sophisticated attack on democracy's key vulnerability: its commitment to open information exchange. As documented in military analyses, Russia's explicit goal became "damaging the influence of Western democratic values, institutions, and systems in order to create a polycentric world model." To achieve this, they deployed "hackers, its increasingly powerful intelligence community, the use of state-owned media (i.e. Russia Today, or RT, and Sputnik), troll farms, and bots"[7].
The ultimate objective was to create an information environment where truth becomes impossible to discern. In this constructed reality, even fact-checking efforts by democratic institutions become suspect - just another competing narrative, potentially orchestrated by "deep state" propagandists. Meanwhile, voices challenging established institutions position themselves as defenders of objective truth, creating a paradox where truth itself becomes a weapon of disinformation (as brilliantly illustrated in Adam Curtis's "Hypernormalisation" 2:21:37 ).
IV. The Mechanics of Modern Mind Control
A. Dual Process Theory
The theory behind how modern political rhetoric works in our current social media landscape depends on the "dual process" models of rationality prevalent in the reasoning literature. Within this literature, what is sometimes called System 1 thinking is characterised as fast, intuitive and energy-efficient. While there is much debate about whether or not this is irrational (in fact there are many cases where it is optimal), it is certainly exploitable: if people are furnished with bad heuristics for judging things, they make these fast judgements in line with malevolent actors' goals. In contrast, System 2 thinking is slower, reflective and serial, and also more energy-intensive. As such, this type of thinking is less prone to exploitation. There are still many problems, such as appropriate contextualisation; however, System 2 is less prone than System 1 to these kinds of manipulation.
B. Schema Formation and Reality Perception
The way in which System 1 gets “hijacked” is through being provided with cognitive schemas: gestalts for handling and interpreting information. These framing devices bring people to slowly view the world from one totalising perspective — eventually, perceptions that would be viewed as politically neutral get assimilated by this way of seeing, such that an individual's entire worldview becomes politicised.
“…it is not isolation from opposing views that drives polarization but precisely the fact that digital media bring us to interact outside our local bubble. When individuals interact locally, the outcome is a stable plural patchwork of cross-cutting conflicts. By encouraging nonlocal interaction, digital media drive an alignment of conflicts along partisan lines, thus effacing the counterbalancing effects of local heterogeneity. The result is polarization, even if individual interaction leads to convergence. The model thus suggests that digital media polarize through partisan sorting, creating a maelstrom in which more and more identities, beliefs, and cultural preferences become drawn into an all-encompassing societal division.” [9]
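The dynamic Törnberg describes can be illustrated with a toy, Axelrod-style simulation. To be clear, this is my own simplified sketch for illustration, not Törnberg's actual model: the ring topology, the parameters, and the `bloc_size` measure are all assumptions of mine.

```python
import random
from collections import Counter

def bloc_size(nonlocal_prob, n=100, dims=5, q=2, steps=50_000, seed=1):
    """Axelrod-style toy model: agents hold `dims` cultural traits, each
    taking one of `q` values, and sit on a ring. Each step, agent i meets
    either a ring neighbour (local interaction) or a uniformly random agent
    (nonlocal, 'digital' interaction); with probability equal to their
    similarity, i copies one trait on which they differ. Returns the
    fraction of agents sharing the single most common full profile."""
    rng = random.Random(seed)
    agents = [[rng.randrange(q) for _ in range(dims)] for _ in range(n)]
    for _ in range(steps):
        i = rng.randrange(n)
        if rng.random() < nonlocal_prob:
            j = rng.randrange(n)               # digital media: anyone, anywhere
        else:
            j = (i + rng.choice([-1, 1])) % n  # local: a ring neighbour
        if i == j:
            continue
        diff = [d for d in range(dims) if agents[i][d] != agents[j][d]]
        similarity = (dims - len(diff)) / dims
        if diff and rng.random() < similarity:
            d = rng.choice(diff)
            agents[i][d] = agents[j][d]        # social influence on one trait
    return Counter(map(tuple, agents)).most_common(1)[0][1] / n
```

With purely local interaction (`nonlocal_prob=0.0`) the population tends to settle into a patchwork of distinct profiles; raising the nonlocal mixing rate tends to merge these into a few large, internally aligned blocs, which is the partisan-sorting effect described in the quote above.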
Effective politicians are able to imbue a constellation of facts with a new meaning, providing voters with a fresh cognitive shortcut to make sense of the political world…
…winning the argument in politics isn’t often about finding more or better facts. It’s about perception and the cognitive shortcuts we use to process information as we sort our world into neat categories that make sense…
…facts are most effective when they’re nestled within a ready-made intellectual framework for how to make sense of the world. Effective political movements use facts to reinforce schemas, but they understand that the schemas are what matter most. It’s a depressing truth, but getting the right taglines, slogans, and vivid ways of presenting political opponents is often far more important than being right. [10]
V. Diagonalism and the New Feudalism
The way that this works in contemporary politics, with our existing information infrastructure, has been described as Memetic Feudalism, wherein Lords and Ladies, the influencers, shape and create our perceptual reality through "feudal hierarchies of influence."[11] Recognising the feudal nature of our social and political landscape makes it easier to understand some of the radical changes in politics over the past twenty or so years. While traditional political theory thinks in terms of a taxonomy of differing theoretical commitments, modern politics no longer works like this. Political teams are simply collectives of allegiances with common goals using the apparatus of power, regardless of doctrinal commitment.
In the Information Age, feudalist meme influencers play with symbolism to build influence and allegiances convenient to manipulating the information landscape in ways that further their goals, consolidate their power and influence the public; this type of politics is called Diagonalism.
'Diagonalism,' as Klein says, is the word William Callison and Quinn Slobodian have used to characterise these new alliances, 'born in part from transformations in technology and communication' and 'generally arcing towards far-right beliefs', while also contesting 'conventional monikers of left and right'. Diagonalists, in this typology, mostly self-identify as middle-class and are disproportionately self-employed.
This can be seen in the transition of Elon Musk's political commitments, going from supporting the memes of modern moral progress and equality...
To using memes to undermine the legitimacy of the “other side’s” memes as a mark of identifying against that group…
To full blown memetic-fascism…
VI. Case Studies in Digital Feudalism
A. Platform Manipulation
Prior to Musk's takeover, Twitter was known to have a bias favouring sensationalist right-wing content.
There has also been investigation into how the recommendation feeds on other platforms work to radicalise people towards extreme views.
"According to the aforementioned radicalization hypothesis, channels in the I.D.W. and the Alt-lite serve as gateways to fringe far-right ideology, here represented by Alt-right channels. Processing 72M+ comments, we show that the three channel types indeed increasingly share the same user base; that users consistently migrate from milder to more extreme content; and that a large percentage of users who consume Alt-right content now consumed Alt-lite and I.D.W. content in the past. We also probe YouTube's recommendation algorithm, looking at more than 2M video and channel recommendations between May/July 2019. We find that Alt-lite content is easily reachable from I.D.W. channels, while Alt-right videos are reachable only through channel recommendations. Overall, we paint a comprehensive picture of user radicalization on YouTube."
YouTube content aimed at teenagers talking about video games turned into discussions about the introduction of female characters and the overreach of feminism. Before you knew it, you had YouTube celebrities like Sargon of Akkad running with UKIP and appearing on TV talking about how he “wouldn’t even rape” Jess Phillips.
Now I understand how this looks to normal people, but to edgy teenagers radicalised into this way of viewing things, this was a HUGE win. The sheer outrage, and the inadequacy of “the establishment” to win these engagements, merely bolstered the extreme political ideology. Getting upset, frustrated, showing emotion was just “liberal tears” and showed how “The Left can't meme” and wanted to shut down jokes.
A sentiment reflected by those down the radicalisation pipeline in many ways, often under the guise of the debate around “Free Speech”.
There is a quote by Sartre which quite accurately captures this use of memetic humour by the right to normalise moral taboos. Sartre talks about antisemitism, but I believe these remarks can be extended to “the edgelord memer”.
“Never believe that anti-Semites are completely unaware of the absurdity of their replies. They know that their remarks are frivolous, open to challenge. But they are amusing themselves, for it is their adversary who is obliged to use words responsibly, since he believes in words. The anti-Semites have the right to play. They even like to play with discourse for, by giving ridiculous reasons, they discredit the seriousness of their interlocutors. They delight in acting in bad faith, since they seek not to persuade by sound argument but to intimidate and disconcert. If you press them too closely, they will abruptly fall silent, loftily indicating by some phrase that the time for argument is past.”
None of the politicians from The Old World were able to see this happening or combat it. In fact, their very attempts to combat this stuff using the tools of the Old World were seen as a pathetic effort to shut down free speech by feminists, gay rights supporters and progressives whose views couldn't compete in “The Free Marketplace of Ideas”. — I can tell you that's exactly what the optics were, because I was one of the teenagers who got radicalised by this stuff.
The problem with the marketplace analogy is that it by no means guarantees the victory of superior goods, only of goods the consumer wants to buy. The problem is multifaceted: the desires of consumers have been manipulated by marketing techniques, and which goods they are even aware are on sale is influenced by things like popularity, or endorsement in alignment with a company's recommendation feed (which promotes engagement). These dynamics are not truth-conducive, and are exactly why we end up with the shoddiest Big Mac of political ideologies, produced for mass appeal and consumption to generate a profit.
There were many investigations into the negative ways that social media was influencing our beliefs, and into its bias for sensationalist, right-wing propaganda.
There are many more examples of state actors explicitly exploiting this sort of thing, particularly in the cases of the Brexit referendum campaign and Trump's 2016 presidential campaign, as documented in an exposé written by a former Cambridge Analytica employee, Christopher Wylie.
B. Information Warfare in Action
Feudal Lords are overpowered nodes in the information network who can boost any kind of information expedient to them in order to make reality what they want and achieve their goals.
One such example of this was when Musk got involved in UK politics following a tragic event involving "The fatal stabbing of three young girls at a dance class in the seaside town of Southport, in the north of England, [leading to] the worst unrest the UK has seen in more than a decade."
The stabbing, as tragic and terrible as it was, became another example of the way that the schemas people have been cultivated with warp and distort their engagement with information. The public's rightful outrage and anger at the crime was hijacked, and their reality distorted, by a collection of System 1 heuristics around illegal immigration into the UK and a clash of civilisations between “Islamic Culture” (ah yes, that homogenous thing…) and “Western Culture” (that other homogenous thing).
The anti-immigration, anti-Islam memes were promoted by all of the Feudal Lords of the right: Musk, Libs of TikTok (a propaganda account frequently boosted by these guys), Andrew Tate, Farage and others.
The claims contained in these memes were completely off base. The perpetrator of the crime, Axel Rudakubana, was born in Wales (UK) in 2006 and is a British citizen; his parents are Rwandan Christians. Insofar as this evil guy was influenced by culture, it was British culture and Christian values… This ought to completely undermine the “migrants coming off boats and doing these horrific crimes” narrative, but no. Despite this having nothing to do with Islam, or anything else, the fact that Rudakubana is black is enough to hijack other racist memes and lead an already emotionally incensed public into still believing this has something to do with Islam (most of whose adherents happen to be… dark-skinned — “BuT iSlaM iSnT a RaCe” — nobody ever said racism had to be consistent).
This is just one example of how Musk used his place in the Feudal influencer network to shape public perception and encourage violence in the UK.
This patterned behaviour has been observed and documented by Jones across different incidents involving the promotion of harmful politics such as racism and antisemitism. One example is the boosting of misrepresented crime statistics, toward the end of further colonising his audience with politically convenient schemas. The schema in this case is “the left are always lying to you, misrepresenting statistics”; the irony, or even the projection, being that that is precisely what Musk is doing here… The real data, which Musk makes no attempt to engage with, shows this, by the way…
Jones's analysis shows the way in which Musk's engagement with tweets such as this, often by replying “!!concerning” or something similar, boosts the misinformation through the network.
These case studies demonstrate the precise mechanisms outlined in our theoretical framework. The YouTube radicalization pipeline perfectly illustrates how System 1 thinking gets exploited through recommendation algorithms, creating a path of least resistance toward increasingly extreme content. Each step provides new schemas that reshape how viewers interpret subsequent information, exactly as our dual process model predicted.
The Southport incident shows how modern feudal information control operates in practice. When Musk and other digital influencers amplified misleading narratives about the stabbing, they weren't just spreading misinformation - they were exploiting pre-existing schemas about immigration and race, triggering System 1 responses that bypassed rational evaluation. The fact that these narratives persisted even after being definitively disproven demonstrates how deeply embedded these cognitive frameworks become.
Most importantly, both cases exemplify the core principles of Diagonalism: the way traditional political categories dissolve into networks of influence, how truth becomes secondary to narrative control, and how digital platforms serve as force multipliers for these dynamics. The speed and scale of these information operations - from YouTube's algorithmic nudges to Musk's viral messaging - showcase why traditional institutional responses are increasingly ineffective against these new forms of information warfare.
VII. Co-Ordinated Attacks on Democratic Information
The Theatre of Information Warfare
The social media landscape has become a carefully orchestrated stage, where information warfare is conducted through theatrical performance. This approach, pioneered by Surkov in Russia, has been explicitly adopted by Western right-wing strategists. Steve Bannon's "Flooding the Zone" strategy exemplifies this evolution, transforming information warfare from targeted propaganda to overwhelming information chaos.
Flooding the Zone: A Strategic Framework
The strategy operates on multiple levels:
Network Exploitation: Using both legacy and social media to generate constant noise
Dual Amplification:
Positive: Through allied influencers promoting the message
Negative: Through opponents' attempts to fact-check and debunk
Velocity of Information: Overwhelming traditional media's capacity to respond
As Bannon himself noted:
"...they understand flood the zone, and how they're trying to combat it with 'focus focus focus'. They're still getting overwhelmed, and they will continue to be overwhelmed, because President Trump is taking actions in every different vertical..."[14]
The Asymmetric Battle for Truth
This creates a fundamental power asymmetry in democratic discourse:
Defensive Position: Democratic institutions must:
Verify claims
Provide evidence
Follow journalistic standards
Maintain credibility
Offensive Position: Bad actors can:
Generate constant controversies
Shift narratives rapidly
Ignore fact-checking
Exploit emotional responses
As Brandolini's Law states, "The amount of energy needed to refute bullshit is an order of magnitude bigger than that needed to produce it."[15]
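To see why this asymmetry compounds over time, consider a deliberately crude back-of-the-envelope model. The daily budgets and the 1-versus-10 cost figures here are illustrative assumptions for the sake of arithmetic, not measurements.

```python
def unrefuted_backlog(days, produce_budget=50, refute_budget=50,
                      produce_cost=1, refute_cost=10):
    """Toy model of Brandolini's Law: each day both sides spend an equal
    effort budget, but refuting a claim costs an order of magnitude more
    than producing one. Returns the number of claims left standing."""
    backlog = 0
    for _ in range(days):
        backlog += produce_budget // produce_cost              # 50 new claims per day
        backlog -= min(backlog, refute_budget // refute_cost)  # only 5 refuted per day
    return backlog
```

With equal effort on both sides, the backlog of unrefuted claims grows by 45 per day; after a month the fact-checkers are over a thousand claims behind, however diligent they are.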
The Collapse of Traditional Safeguards
The traditional mechanisms of democratic information processing - journalism, academic research, fact-checking - find themselves increasingly impotent. Their commitment to truth and verification becomes a vulnerability when faced with actors unconcerned with factual accuracy. By the time thorough analysis and debunking is complete:
The narrative has shifted
Public attention has moved on
New controversies have emerged
The damage is already done
The End Game
The ultimate consequence is the systematic undermining of democratic information systems. When truth-seeking institutions fail to keep pace with the flood of misinformation, power gradually shifts to those controlling the narrative. Once this power transfer is complete, the pretence of democratic discourse can be abandoned entirely - dissenting voices can be silenced, imprisoned, or erased, and truth becomes whatever those in power declare it to be.
This framework shows how "Flooding the Zone" isn't just a media strategy - it's a sophisticated attack on the fundamental information processing capabilities of democratic societies, exploiting their commitment to truth and verification as a structural weakness.
VIII. The Extent of Information Technology
This is all rather negative — and I believe we ought to be dismayed. There are many ways that good people can combat this shite, though. One approach is through individual interactions with people that aim to deradicalise them through dialogical interventions (I have talked about this, and try to model it, a lot on my YouTube channel). However, a lot of that only works within a liberal democratic society that protects open debate, journalism and academic freedoms. Under conditions of authoritarianism, these tools can only do so much: dissenting voices are shut down or imprisoned, and (as I outlined in this piece) due to the ownership, incentives and strategic use of information technology, Old World tools of “debunking” and “rational discourse” cannot out-compete malevolent actors.
Fundamentally, You Can't Post Your Way out of Fascism. U.S. citizens who are outraged and worried about the ways in which constitutional democracy is being dismantled should, I believe, be consulting their consciences about how much they care about maintaining it, about what they consider absolute red lines for an authoritarian federal government, and about what rights and responsibilities their constitution has given them with respect to whether or not they will have to “fight like hell” or they might “[not] have a country anymore”. I promise that I am not saying these sorts of things incautiously. Americans who care about protecting democracy must also be smart and careful, because there are many MAGA supporters and domestic terrorist organisations, such as the Oath Keepers, champing at the bit for political violence to begin, and similarly GOP administrators looking for an excuse to escalate, exert power and declare martial law. My only point is this: if Americans truly believe that the purpose of the Second Amendment is to protect against a tyrannical federal government, then they need to take that responsibility seriously, organise, and clearly articulate what sort of thing would constitute an absolute red line for an incumbent President. Suggestively, I include the following:
“So there’s this guy Curtis Yarvin, who has written about some of these things,” [J.D.] Vance said. [Jack] Murphy chortled knowingly. “So one [option] is to basically accept that this entire thing is going to fall in on itself,” Vance went on. “And so the task of conservatives right now is to preserve as much as can be preserved,” waiting for the “inevitable collapse” of the current order.
He said he thought this was pessimistic. “I tend to think that we should seize the institutions of the left,” he said. “And turn them against the left. We need like a de-Baathification program, a de-woke-ification program.”
“I think Trump is going to run again in 2024,” he said. “I think that what Trump should do, if I was giving him one piece of advice: Fire every single midlevel bureaucrat, every civil servant in the administrative state, replace them with our people.”
“And when the courts stop you,” he went on, “stand before the country, and say—” he quoted Andrew Jackson, giving a challenge to the entire constitutional order—“the chief justice has made his ruling. Now let him enforce it.” . . .
“We are in a late republican period,” Vance said later, evoking the common New Right view of America as Rome awaiting its Caesar. “If we’re going to push back against it, we’re going to have to get pretty wild, and pretty far out there, and go in directions that a lot of conservatives right now are uncomfortable with.”
“Indeed,” Murphy said. “Among some of my circle, the phrase ‘extra-constitutional’ has come up quite a bit.” [21]
IX. Conclusion: Building Resilient Democratic Systems - What Comes Next?
This piece is very loose political and sociological commentary in support of a project I am undertaking, aimed at governments, institutions and technologists who still believe in liberal democracies and see the value and importance of defending them using information technology.
This project is a revival of some older ideas from Systems Theory, Ecology, Sociology and Computer Science that I shall be articulating in forthcoming posts on Building Resilient Ecological Computational Infrastructure for Preserving Liberal Democracies.
Of course, much more analysis and many more solutions are required. There is not simply one single cause for our current political hellscape. This is one analysis of one of the causes.
I will aim to write on these topics and my more practical ideas in the next few months. If you are interested in these then please subscribe to my Substack.
You can also find links to join the Discord server or to donate to me on Patreon here: https://linktr.ee/digitalgnosis
References
[1] Oxford Languages. "Word of the Year 2024." Oxford University Press, 2024.
[2] Wikipedia. "Attention Economy." Last modified 2024.
[3] Suman, Swati. "How Edward Bernays' Manipulation Through Propaganda Became Marketing History." Medium, 2023.
[4] Wikipedia. "History of Propaganda." Last modified 2024.
[5] Starr, Paul. "The Flooded Zone: How We Became More Vulnerable to Disinformation in the Digital Era." In The Disinformation Age, edited by W. Lance Bennett and Steven Livingston, 67-92. Cambridge: Cambridge University Press, 2020.
[6] Hosaka, Sanshiro. "Welcome to Surkov's Theater: Russian Political Technology in the Donbas War." Nationalities Papers 47, no. 5 (2019): 750-773.
[7] Jackson School of International Studies. "A Russian Federation Information Warfare Primer." University of Washington, 2024.
[8] Huhn, Vilgot. "Decoupling Decoupling from Rationality." Unconfusion (blog), 2023.
[9] Törnberg, Petter. "How Digital Media Drive Affective Polarization through Partisan Sorting." Proceedings of the National Academy of Sciences 119, no. 42 (2022): e2207159119.
[10] Klaas, Brian. "Schemas and the Political Brain." The Garden of Forking Paths (newsletter), 2022.
[11] Jung, PF. "Applied Sociology 201: 'Memetic Feudalism.'" PF Jung's War Room (newsletter), 2024.
[12] Kabas, Marisa. "Elon Musk Crosses the Border to Nazi." The Handbasket (blog), 2023.
[13] Hudson, Marc. "What Are Words For: Diagonalism." Personal blog, 2024.
[14] "Episode 4251: The War on Christians is Over Under the Trump Administration." Rumble video, 2024.
[15] Wikipedia. "Brandolini's Law." Last modified 2024.
[16] Stewart, Rory. "The Long History of Argument." 2024.
[17] Wikipedia. "2024 Tenet Media Investigation." Last modified 2024.
[18] BBC News. "Southport Incident Coverage." 2024.
[19] Wikipedia. "2024 Southport Stabbings." Last modified 2024.
[20] "Read Trump's Jan. 6 Speech, A Key Part of Impeachment Trial." NPR, February 10, 2021.
[21] Nguyen, Tina. "Inside the New Right, Where Peter Thiel Is Placing His Biggest Bets." Vanity Fair, April 20, 2022.