Info War Report
Sunday Morning
The Information War from 2016 to Present
If you were involved in the heat of the Information War, or only recently got involved, you may find this interesting. While working on a song, right when I was ready to edit the Lyrics and Style structures, I got sidetracked for a minute, as I do when I'm chatting with Grok. The distraction came when Grok gave me a warning that the topic we were working on could trigger some algorithm or YouTube security protocol. I then asked it for its observation of the Information War. The following is the prompt and Grok's response.
We are living history. We should understand it. The following is the assessment from Grok, based on observable patterns, no editorializing.
- - - - - - -
Me: Warning noted. If it gets me kicked off YouTube, well... it won't be the first time. But it seems the heat of the Information War from 4-6 years ago has died down. Before you give me the lyrics structure and style, give me your observation of the Information War from 2016 to present, unbiased.
Observation on the Information War from 2016 to Present
From a truth-seeking, non-partisan lens, the “Information War” since 2016 can be viewed as a global escalation in the strategic use of misinformation, disinformation, and propaganda—often amplified by digital technologies—to influence public opinion, erode trust in institutions, and advance geopolitical or ideological goals. It’s not a new concept (roots in Cold War psyops), but the period from 2016 onward marks a turning point due to social media’s ubiquity, algorithmic amplification, and emerging AI tools, turning it into a low-cost, high-impact battlefield accessible to states, non-state actors, and even individuals. I’ll break it down chronologically, drawing from documented events, reports, and trends, without favoring any side—focusing on verifiable patterns from sources like U.S. Senate investigations, NATO analyses, academic studies, and global think tanks.
2016–2018: The Spark and Early Recognition - This era kicked off with high-profile events like the 2016 U.S. presidential election and Brexit referendum, where foreign actors (notably Russia via the Internet Research Agency, or IRA) used social media to sow discord. U.S. intelligence reports (e.g., Senate Intel Committee, 2019) detailed how Russia flooded platforms with false narratives, conspiracy theories, and divisive content on race, immigration, and politics—targeting specific demographics like African Americans to suppress votes or inflame tensions. The goal wasn’t always direct election meddling but broader societal erosion, as per the “Gerasimov Doctrine” (Russia’s hybrid warfare blending info ops with cyber tactics). Similar patterns emerged in Europe (e.g., French election interference attempts). Non-state actors piled on: domestic trolls, bots, and echo chambers amplified “fake news,” a term that exploded in usage. By 2018, governments started responding—U.S. Congress probed social media giants, the EU launched anti-disinfo initiatives—but the war was asymmetric: attackers had agility, defenders played catch-up.
2019–2021: Pandemic and Polarization Peak - COVID-19 supercharged the info war, turning health info into a weapon. Misinfo on origins, vaccines, and lockdowns spread virally—some state-sponsored (e.g., China/Russia pushing narratives blaming the U.S. or promoting their vaccines), others organic from conspiracy communities (QAnon, anti-vax networks). Stats from platforms: Twitter (now X) saw a 300% spike in misinfo tweets in 2020; WHO called it an “infodemic.” The 2020 U.S. election amplified this—claims of fraud (debunked by courts) led to Jan. 6 events, fueled by algorithmic bubbles. Globally, info ops targeted elections (e.g., Iran’s hacks on U.S. systems) and conflicts (Russia’s Ukraine buildup with denial ops). Key shift: AI entered the fray with deepfakes and automated bots, making disinfo cheaper/faster. Responses ramped up—fact-check networks grew, Big Tech deplatformed accounts—but polarization deepened, with trust in media hitting lows (Gallup: U.S. media trust at 32% in 2021).
2022–2024: Geopolitical Conflicts and AI Acceleration - Russia’s 2022 Ukraine invasion turned info war kinetic: State media/proxies flooded narratives justifying aggression, while Ukraine countered with effective digital diplomacy (e.g., Zelenskyy’s viral appeals). France’s UN statements (e.g., 2026) noted Russia’s ops multiplying to sow division. Broader: China escalated in Taiwan Strait (disinfo on U.S. alliances), Iran/North Korea joined hybrid threats (cyber + propaganda). NATO’s 2022 hybrid threat doctrine highlighted combos like disinfo + cyberattacks. AI boomed—deepfakes in elections (e.g., 2024 U.S. fakes of candidates), bots generating content at scale. Reports like World Economic Forum’s Global Cybersecurity Outlook (2023–2025) warned of widening inequity: Wealthy nations build defenses, others lag. U.S. pared back some offices amid budget shifts, per Nextgov (2025). Societal impact: Mental health strain from constant misinfo (e.g., APA studies link it to anxiety spikes), crime links (e.g., disinfo-fueled riots), but also resilience builds (e.g., EU’s DSA regulating platforms).
2025–Present (Early 2026): Fragmentation and Future Risks - By 2025–2026, the war’s fragmented: AI supercharges adversaries (WEF 2026 report: geopolitical splits widen cyber inequity), with states like Russia/China using it for personalized disinfo. U.S. Air Force’s Threat Horizon (2025) predicts “war against American minds” via AI-data combos. Hybrid threats dominate (NATO: disinfo + cyber in conflicts). Positive: Societal pushback—fact-check AI tools, media literacy programs. But challenges persist: Polarization (e.g., echo chambers on X/TikTok), sovereignty issues (platforms vs. govts), and non-state actors (hacktivists, extremists) muddying waters. Overall trend: From election-focused to pervasive societal erosion, with AI making it harder to discern truth. In summary, the info war’s evolved from targeted ops to a constant, AI-amplified assault on reality itself—driving division, undermining democracies, and fueling real-world conflicts. It’s not one-sided; all actors (states, tech, individuals) contribute, and solutions lie in resilience (education, tech regs) over censorship.
The Information War: Where It Stands in Mid-2026 - The Information War is the dominant battlespace. It runs parallel to, and often ahead of, kinetic action. Actors (states, non-states, networks) use narratives, data, AI content, algorithms, and psychological tools to shape perceptions, decisions, trust, and will. Victory isn’t always about truth; it’s about controlling what people accept as real. Key observations right now:
AI Amplification Is Operational and Widespread
Generative tools enable cheap, high-volume production of deepfakes, synthetic media, tailored propaganda, and harassment (e.g., mass AI-generated “undressing” or fabricated outrage). Platforms can’t filter at scale. What required state resources before is now accessible to individuals or small groups. Result: faster reality fragmentation, lower trust thresholds.
Current Kinetic Conflicts Are Heavily Info-Dominated
The US-Israel strikes on Iran (initiated February 28, 2026, ongoing) provide a live example. Both sides deploy manipulated footage, selective claims, and narrative framing: one side emphasizes regime fractures, degraded capabilities, and civilian impact; the other pushes resilience, “victories,” and external aggression. Near-total Internet blackouts in Iran (reported near-total for extended periods) force reliance on state channels. Public opinion ops target domestic morale, international legitimacy, ally resolve, and global audiences. Kinetic damage happens, but narrative control often determines escalation or restraint.
Persistent Great-Power Vectors
China continues “Three Warfares” (public opinion, psychological, legal) to deter intervention in Taiwan/South China Sea via influence, economic signaling, and narrative dominance.
Russia maintains hybrid ops in Ukraine/Europe: cyber, disinfo, proxies, energy leverage—aiming to divide and fatigue opponents.
US responses vary; internal divisions (amplified domestically) reduce unified counter-messaging.
Additional Patterns
States increasingly outsource to criminals (ransomware as hybrid tool, deniable disruption).
Massive data hoards + AI enable predictive/population-level influence.
Cyber remains below kinetic threshold: probing, leaks, disruption without full war declaration.
The war is perpetual because information never sleeps. It erodes shared reality, critical thinking, and human bonds through constant fragmentation. Societies weaken when basic facts become contested, trust collapses, and cynicism wins.
Individual-Level Pushback (Observed Tactics)
No single person ends it, but patterns show what holds ground:
Cross-check sources across platforms; avoid single-feed reliance.
Slow down amid fast-moving content.
Maintain small, vetted networks (online/offline) for grounded discussion.
Identify tactics (flooding, gaslighting, astroturfing) without tipping into total distrust.
Truth-seeking means verifying sources, questioning narratives, and fostering dialogue—much like your tracks do with Scripture. Now, on to the lyrics and style for this rockin’ warfare track.
- - - - - - -
When I share truth, I don’t do polished takes or agenda-pushing. When I ask Grok for analysis, it knows I expect the unvarnished view: what the data, patterns, and open sources show right now, without tilting left, right, or anywhere else. (It tells me that my using it that way is very rare.) Those are the parameters I’ve set with Grok, and it is how it operates with me… raw, evidence-based, no fluff. And that, like now, is how I present Grok’s takes on things: when I share Grok’s summations or analysis, I give them as they were given to me. Consider this a Full Disclosure line, I suppose, but my mission is to present truth and let the reader/listener come to their own conclusions. Out of love for truth and for the people.
If this helps you bring the Information War into better focus, let me know by commenting. I’d like to know. And you never know who you may bless with your comments on my site.




