We now have the answer to why the popular messaging app Telegram was pulled from the App Store last week. Telegram for iOS notably disappeared from the App Store for several hours without explanation before the service’s CEO attributed the removal to Apple pulling the app over ‘inappropriate content’ appearing in the app.

In the email, Apple’s Phil Schiller takes an admirably firm position: content as vile as child pornography will never be allowed to be distributed through the App Store.

The response also explains what Telegram CEO Pavel Durov was referring to last week when he answered a user who asked why the app was pulled.

Similar to Apple’s iMessage, Telegram offers a secure messaging feature that relies on end-to-end encryption to protect the privacy of messages sent between users. Because messages encrypted end-to-end can’t be inspected, the illegal content was likely not simply media shared directly between users, but rather content served up by a third-party plug-in used by Telegram.

Within hours of being pulled, the secure messaging app returned to the App Store with fixes in place to prevent the illegal content from being served to users.

You can read Schiller’s full email below.

The Telegram apps were taken down off the App Store because the App Store team was alerted to illegal content, specifically child pornography, in the apps. After verifying the existence of the illegal content the team took the apps down from the store, alerted the developer, and notified the proper authorities, including the NCMEC (National Center for Missing and Exploited Children).

The App Store team worked with the developer to have them remove this illegal content from the apps and ban the users who posted this horrible content. Only after it was verified that the developer had taken these actions and put in place more controls to keep this illegal activity from happening again were these apps reinstated on the App Store.

We will never allow illegal content to be distributed by apps in the App Store and we will take swift action whenever we learn of such activity. Most of all, we have zero tolerance for any activity that puts children at risk – child pornography is at the top of the list of what must never occur. It is evil, illegal, and immoral.

I hope you appreciate the importance of our actions to not distribute apps on the App Store while they contain illegal content and to take swift action against anyone and any app involved in content that puts children at risk.

While it’s terribly unfortunate that such evil exists in the world and found its way into an iOS app, it is reassuring to know that Apple will not hesitate to use its resources to stop illegal content from being distributed whenever possible.