Cloud-based instant messaging app Telegram was removed from the App Store due to ‘inappropriate content’ distributed through the app.
An email shared by 9to5Mac contains a response from Phil Schiller: “The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps. After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children).”
In essence, the company has taken a firm stand and will not tolerate illegal content circulated through the App Store. Schiller continued: “I hope you appreciate the importance of our actions to not distribute apps on the App Store while they contain illegal content and to take swift action against anyone and any app involved in content that puts children at risk.”
This is an important message to all tech companies to step up their efforts to prevent vile content from being distributed through their services.
Telegram is a secure messaging service that uses end-to-end encryption to protect messages shared between users. In this case, the offending content was reportedly not shared between users but sourced from a third-party plugin used by the app.
However, the Telegram apps were back on the App Store, with fixes in place, within hours of their removal. “The App Store team worked with the developer to have them remove this illegal content from the apps and ban the users who posted this horrible content. Only after it was verified that the developer had taken these actions and put in place more controls to keep this illegal activity from happening again were these apps reinstated on the App Store,” the email said.