Is Canada Prepared for Digital Interference in the 45th Election?
Despite overwhelming evidence that social media is being used to disrupt elections across the world, blindspots still leave Canadians vulnerable to misinformation and disinformation.
By Tess Corkery
Canada is weeks away from an election in an increasingly turbulent global environment. The scale and reach of foreign and domestic actors seeking to disrupt democratic institutions in the West have surged on social media platforms. In a new America, tech companies are stepping back from their moderation responsibilities, leaving the Canadian democratic process vulnerable. In her recent report on foreign interference, Justice Marie-Josée Hogue reiterates this, stating: “It is no exaggeration to say that at this juncture, information manipulation (whether foreign or not) poses the single biggest risk to our democracy. It is an existential threat.” Canada is insufficiently prepared for the scale of digital interference that will define the 45th election.
In 2024, over half of the world's population went to the polls, with elections held in some 70 countries. These elections were marked by misinformation (false information spread without intent to cause harm) and disinformation (false information deliberately intended to mislead). Some countries were able to protect their processes; others did not fare as well. In December, Romania became the first EU member state to annul an election over foreign interference and disinformation. In Moldova, a joint investigation by Canada, the U.S. and Britain found that Russia used disinformation to sway elections in favour of pro-Russian candidates. In both cases, and in elections across the world, social media was the most powerful tool foreign and domestic actors used to influence outcomes.
What happened in these elections is a warning for Canada and the rest of the world about the extraordinary effectiveness of social media in spreading mis- and disinformation, and the real impact it has on democratic processes and institutions. Platforms bear responsibility for enabling the spread of this content, yet they take little of it. Instead, tech companies continue to shrink their trust and safety teams and slash content moderation. Meta and X, with a combined 4.6 billion users, have both abandoned third-party fact-checking (Meta announcing in 2025 that it would follow X's lead), meaning outside guardrails will no longer separate fact from fiction. What used to be treated as a wicked problem has been largely removed from the agenda under the guise of free speech.
These fundamental changes are having an impact on Canada’s democratic process, with little ability for governments and civil society to go on the offensive. The Rapid Response Mechanism (RRM), run by Global Affairs Canada, works to monitor and respond to foreign state-sponsored disinformation. But without real-time transparency from platforms or enforceable regulatory power, efforts like the RRM are largely reactive. The Media Ecosystem Observatory at McGill University is currently reporting on information incidents that could mislead the public and disrupt democratic processes. Its recent report finds that AI-generated fake news content mimicking legitimate outlets like CBC and La Presse is being spread on Facebook and Instagram. There is evidence of widespread exposure: 25% of Canadians indicate they have encountered this type of content. These incidents leave the public with little support, as legitimate news has been blocked from these platforms in Canada.
When the RRM escalates such threats, it is the platforms that retain the authority to remove and moderate content as they see fit. Where platform owners wield extraordinary influence and act with impunity, the risks are undeniable. Elon Musk, for example, recently accused UK Prime Minister Keir Starmer of being “deeply complicit in mass rapes” and called for his imprisonment. The reality is that someone with Musk’s level of algorithmic control and political reach can amplify such claims with no accountability. If similar accusations were aimed at a Canadian political leader during the current election, the Government of Canada would have virtually no power to intervene.
To truly protect the integrity of Canada’s media ecosystem, researchers and policymakers need access to one critical thing: how platforms actually work. Currently, most researchers have no access to platform data, no insight into algorithmic decisions, and no reliable way to track the spread of harmful content in real time. It’s like trying to stop a virus without access to a microscope: you know it’s dangerous, but you can’t trace how it spreads or how to contain it. At Attention: Freedom, Interrupted, a recent conference hosted by the Media Ecosystem Observatory, policymakers, civil-society groups, and governments emphasised the need for researchers to be able to understand and analyse social media algorithms. Ethan Zuckerman, who directs the Initiative for Digital Public Infrastructure at UMass Amherst after leading MIT’s Center for Civic Media, delivered a call to action for governments to mandate researcher access to platform data, arguing that the public assumes experts know far more than they really do about the platforms we use every day. Without transparency from tech companies, even our most well-intentioned efforts remain incomplete and reactive.
Despite overwhelming evidence that social media is an effective tool for disrupting and invalidating elections across the world, researchers and governments still don’t really know where the gaps are, leaving Canadians vulnerable. We know from history that bad actors don’t need to tamper with ballots or hack servers; all they need is to flood our feeds. In the absence of platform accountability, the Canadian government and public are left scrambling. There is no question that mis- and disinformation will mark our 45th election and the media ecosystem in our country going forward. But this is not a call to censor. Free expression is foundational to democracy. The challenge is to protect freedom of expression while ensuring information integrity.
Foreign interference is not a bug in the democratic system. It is a feature, one that’s been around as long as democracy itself. But the scale, speed, and sophistication of today’s disinformation campaigns require new tools, new laws, and new vigilance. Canada has not yet imagined what real digital democratic resilience looks like.
Tess Corkery is a Deputy Editor for The Bell and a dedicated public servant with a strong ambition to shape what our governments can achieve through public policy. Originally from Ontario, Tess studied Professional Communications at Toronto Metropolitan University and Nanyang Technological University in Singapore. During the pandemic, Tess began her career in the Federal Public Service, where she has since played a key role in enhancing digital service delivery through her background in UX design and policy implementation. By joining the Max Bell School, Tess aims to deepen her understanding of policy development and contribute to transformative changes in our public services.