TikTok Algorithm May Be Favoring Mamdani Over Cuomo, New Research Suggests
The researchers fear that algorithmic influence could meaningfully affect voter behavior and public perception.

A new study suggests that TikTok’s algorithm may be giving preferential treatment to content supporting Zohran Mamdani in New York City’s mayoral race, while potentially suppressing videos that favor Andrew Cuomo.
The research, conducted by a team analyzing social media recommendation systems, indicates that the platform’s “For You” feed isn’t as neutral as it appears. Instead of simply showing users what they’re likely to enjoy, the algorithm may be actively shaping public opinion ahead of the election.
“What if the algorithm isn’t just serving what you like, but shaping what you will like?” the Chief Technology Officer of Spring AI, Yehonatan Dodeles, wrote in a piece published on Medium. “These systems don’t merely reflect your tastes, they manufacture them, fine-tuning your emotions and beliefs with every swipe.”
The team at Spring AI — which says on its website that it is “Defending Democracies From Weaponized AI” — found that political content on TikTok receives far more artificial promotion than other topics. While most categories of videos follow a baseline pattern, with about 17 percent receiving what researchers call “excessive publicity,” political videos saw this rate jump to 55 percent — more than triple the normal amount.
Within political content, videos about the New York mayoral race stood out even more, the researchers say. Content supporting Mr. Mamdani and opposing Mr. Cuomo consistently received higher levels of algorithmic promotion compared to content favoring Mr. Cuomo, Spring AI found.
“Pro-Mamdani content substantially exceeded this already-elevated baseline, while pro-Cuomo content fell below it — suggesting not just relative disadvantage but active suppression,” Mr. Dodeles wrote.
Rather than simply collecting all videos posted to TikTok, the team focused on what users actually see in their feeds. They created both real and fake user profiles to track how the algorithm delivers different content to different people.
Using artificial intelligence, the Spring AI team built a system that can predict how many views a video should naturally receive based on factors like when it was posted and how people initially responded to it. Videos that performed much better or worse than predicted were flagged as potentially receiving artificial promotion or suppression.
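The article does not publish Spring AI’s model, but the technique it describes, scoring videos by how far their actual reach deviates from a prediction based on posting time and early engagement, can be sketched in miniature. Everything below (the toy prediction function, its coefficients, the thresholds, and the field names) is an illustrative assumption, not the team’s actual system.

```python
# A minimal sketch of deviation-from-expected-reach scoring, as
# described in the article. The prediction model, coefficients,
# and thresholds are all hypothetical stand-ins.
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    hours_since_post: float
    early_engagement: float  # e.g. likes + shares in the first hour
    actual_views: int

def predict_expected_views(v: Video) -> float:
    # Stand-in for a learned model: expected organic reach grows
    # with early engagement and time online (toy coefficients).
    return 500 * v.early_engagement + 40 * v.hours_since_post

def promotion_score(v: Video) -> float:
    # Ratio of actual to predicted views; values well above 1
    # suggest amplification, well below 1 suggest suppression.
    return v.actual_views / max(predict_expected_views(v), 1.0)

def flag_anomalies(videos, boost=2.0, suppress=0.5):
    # Label videos whose reach deviates sharply from prediction.
    flags = {}
    for v in videos:
        s = promotion_score(v)
        if s >= boost:
            flags[v.video_id] = "possible amplification"
        elif s <= suppress:
            flags[v.video_id] = "possible suppression"
    return flags
```

In a real system the prediction step would be a trained model rather than a fixed formula, but the flagging logic, comparing observed reach to an organic baseline, is the core of the approach the article describes.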
The team says its intent was not to measure how much content exists for each candidate but rather to find out how much extra promotion certain videos receive beyond what their natural popularity would predict.
To test their method, the team looked at paid advertisements on TikTok — content they knew was being artificially promoted. Their system correctly identified more than 75 percent of these ads as receiving non-organic promotion.
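That validation step amounts to measuring a detection rate on content with a known ground truth. A hypothetical sketch, with made-up scores rather than Spring AI’s data:

```python
# Illustrative sanity check of the validation the article describes:
# score videos known to be paid ads (i.e. artificially promoted) and
# measure what share crosses the non-organic threshold. The scores
# and threshold below are invented for illustration.
def detection_rate(promotion_scores, threshold=2.0):
    """Fraction of known-promoted videos flagged as boosted."""
    flagged = sum(1 for s in promotion_scores if s >= threshold)
    return flagged / len(promotion_scores)

# Example: promotion scores for ten known paid ads.
ad_scores = [3.1, 2.5, 1.2, 4.0, 2.2, 2.9, 0.9, 3.6, 2.1, 5.0]
```

A rate comfortably above chance on known-promoted content, like the 75-percent-plus figure the team reports, is what gives the deviation method its credibility.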
“We’re measuring deviation from expected organic reach,” Mr. Dodeles wrote. “These videos received amplification beyond what their engagement patterns predicted, regardless of overall popularity or platform demographics.”
Although the research is still in its early stages, the team chose to release its preliminary findings now because of the upcoming election. Mr. Dodeles warns that algorithmic influence could meaningfully affect voter behavior and public perception.
“The algorithm that selects which videos reach which users isn’t just determining what goes viral. It’s shaping what millions of people understand to be true about the world,” he wrote.

