Apple has removed Parler from its App Store, citing the social media platform’s failure to moderate content that incited violence following the US Capitol insurrection. Here’s why.
Apple has removed Parler, a social networking app popular with conservatives, from its App Store.
The move came after the January 6th attack on the US Capitol by pro-Trump rioters. Many participants used Parler to plan the riot, and the app has been criticized for not doing enough to moderate dangerous content.
In a statement, Apple said that Parler had not taken adequate measures to address the proliferation of threatening and violent content on its platform.
“We have always supported diverse points of view being represented on the App Store, but there is no place for threats of violence and illegal activity,” an Apple spokesperson said. “Parler has not taken adequate measures to address the proliferation of these threats to people’s safety.”
Google also removed Parler from its app store on Friday night, citing "repeated violations" of its policies.
What is Parler?
Parler is a social networking service that launched in August 2018. The app advertises itself as an "unbiased social media platform" that allows users to have open discussions about current events and news. It was popular among conservatives and right-wing users, who saw it as a haven for free speech.
In the weeks leading up to the U.S. Capitol insurrection on January 6th, 2021, Parler became an increasingly popular platform for people to share plans and coordinate the event. Following the insurrection, Apple removed Parler from its App Store, citing the platform's failure to moderate content that encouraged violence.
Why Did Apple Ban Parler?
On January 9th, 2021, Apple removed Parler from the App Store due to its failure to moderate content and enforce its own rules. The ban drew widespread attention to the platform, which had previously touted itself as a safe haven for free speech.
Parler’s failure to moderate content
Apple removed the Parler app from its App Store on Saturday, January 9th, 2021. This came after it was revealed that Parler had failed to moderate content related to the Capitol Hill insurrection on January 6th. Apple said in a statement that Parler had not taken "meaningful and constructive" steps toward moderation after being given 24 hours' notice.
Parler is a social networking service that bills itself as “unbiased” and “free speech-friendly”. It has been popular with right-wing users, including those who have been banned from other platforms such as Twitter. Parler CEO John Matze has said that the platform would not censor user content.
The removal of the Parler app from the App Store comes as tech companies face growing pressure to crack down on hate speech and misinformation.
Apple has a set of policies that all apps must follow in order to be allowed on the App Store. One of these policies is that apps must not promote violence or incite illegal activity.
After the deadly riot at the US Capitol on January 6th, it was discovered that many people who participated had been using the Parler app to plan and coordinate the event. Based on this, Apple concluded that Parler was not in compliance with their policies and banned the app from the App Store.
What Does This Mean for the Future of Social Media?
Since the attack on the U.S. Capitol last week, there has been increased pressure on social media companies to do more to stop the spread of false information and conspiracy theories. On Friday, that pressure led to a major decision by Apple, which removed Parler—a social network popular with conservatives—from its App Store.
The move is a significant blow to Parler, which has seen its user base grow exponentially in recent months. But it also raises larger questions about the role of social media companies in policing content, and what this means for the future of the internet.
So why did Apple ban Parler? And what does this mean for the future of social media?
There are two main reasons why Apple decided to remove Parler from its App Store.
First, Apple said that Parler had failed to moderate “egregious content” on its platform, specifically citing posts that “encouraged violence.” In a statement, Apple said it had given Parler 24 hours to submit a plan for addressing the issue, but that the company had not “taken adequate measures to address the proliferation of these threats to people’s safety.”
Second, Apple said that Parler had not put in place “sufficient moderation controls” to prevent illegal content from being shared on its platform. This included images and videos from last week’s attack on the U.S. Capitol, which were still being circulated on Parler even after other platforms had removed them.
These are both serious charges, and they underscore the growing concerns about the role of social media companies in regulating content. While platforms like Facebook and Twitter have long faced criticism for their handling of sensitive topics, the events of last week have intensified calls for them to do more to stop the spread of misinformation and conspiracy theories.
The pressure has been especially intense on Parler, which has become a go-to platform for conservatives who feel unwelcome on other social networks. In recent months, Parler has been repeatedly accused of being a safe haven for hate speech and extremism, and calls for it to be banned have grown louder.
So far, those calls have largely been ignored by tech companies—but Apple’s decision suggests that may be changing. It’s still not clear how long Parler will be off the App Store, or what steps it will need to take in order to be reinstated. But this is a major development, and it could have far-reaching implications for social media companies and users alike.
Apple has removed Parler from the App Store, citing the company’s failure to moderate content that incited violence during last week’s insurrection at the U.S. Capitol.
“We have always supported diverse points of view being represented on the App Store, but there is no place on our platform for threats of violence and illegal activity,” Apple said in a statement. “Parler has not taken adequate measures to protect user safety and to prevent the dissemination of harmful and illegal content, and we have suspended it from the App Store until they resolve these issues.”
This is not the first time Apple has kept an app off its platform over content moderation failures; it previously rejected the alt-right social network Gab's app after finding it hosted hateful speech. Parler has been hailed as a "free speech" alternative to Twitter and other social networks, but it has also been criticized for its lax moderation policies.