Google has suspended "free speech" social network Parler from its Play Store over its failure to remove "egregious content".
Parler styles itself as "unbiased" social media and has proved popular with people banned from Twitter.
But Google said the app had failed to remove posts inciting violence.
Apple has also warned Parler it will remove the app from its App Store if it does not comply with its content-moderation requirements.
Posting on Parler, the app's chief executive John Matze said: "We won't cave to politically motivated companies and those authoritarians who hate free speech!"
Launched in 2018, Parler has proved particularly popular among supporters of US President Donald Trump and right-wing conservatives. Such groups have frequently accused Twitter and Facebook of unfairly censoring their views.
While Trump himself is not a user, the platform already features several high-profile contributors following earlier bursts of growth in 2020.
Texas Senator Ted Cruz boasts 4.9 million followers on the platform, while Fox News host Sean Hannity has about 7 million.
It briefly became the most-downloaded app in the United States after the US election, following a clampdown on the spread of election misinformation by Twitter and Facebook.
However, both Apple and Google have said the app fails to comply with content-moderation requirements.
In a statement, Google confirmed it had suspended Parler from its Play Store, saying: "Our longstanding policies require that apps displaying user-generated content have moderation policies and enforcement that removes egregious content like posts that incite violence.
"In light of this ongoing and urgent public safety threat, we are suspending the app's listings from the Play Store until it addresses these issues."
In a letter published by BuzzFeed News, Apple warned Parler that the app would be removed from the App Store on Saturday.
It said it had seen "accusations that the Parler app was used to plan, coordinate, and facilitate" the attacks on the US Capitol on 6 January.
Matze said Parler had "no way to organise anything" and pointed out that Facebook groups and events had been used to organise action.
But Apple said: "Our investigation has found that Parler is not effectively moderating and removing content that encourages illegal activity and poses a serious risk to the health and safety of users in direct violation of your own terms of service."
"We won't distribute apps that present dangerous and harmful content."