Telegram back in App Store after removal for 'inappropriate content'

Messaging app Telegram mysteriously vanished from the App Store on Thursday because "inappropriate content was made available to our users," according to the company's CEO Pavel Durov.

In response to a Twitter user, Durov tweeted that the company was "alerted by Apple" and as a result of the unspecified "inappropriate content," both the Telegram app and Telegram X (a faster, more reliable version) were removed from the App Store.

SEE ALSO:WhatsApp just fixed a huge problem with its Android app

Durov said both apps would return to the App Store after "we have protections in place." The apps returned to the App Store on Thursday afternoon.

Apps are temporarily removed from the App Store all the time, but Telegram's removal has everyone wondering why.

Mashable has contacted both Telegram and Apple for more specifics on what the "inappropriate content" could have been, and we'll update this story if we get more info.

But my colleague, Senior Editor Stan Schroeder, has a theory. He suggests the apps' removals could be related to rampant pornography.

"On certain channels with a very high population, spam can sometimes get out of control, and some of that spam includes nudes or porn."


Here's an example of random pornographic images posted to a Telegram channel called Medicalchain:

[Image: Screenshot. Credit: Stan Schroeder/Mashable]

Pornographic posts are especially abundant during an initial coin offering (ICO), although it's not clear why this posting behavior happens.

"People come to Telegram to get info about the ICO," Schroeder said. "Then something typically goes wrong. Then everyone starts spamming crap. And then people start spamming nude asians (for some reason)."

And though Schroeder tells me the moderators usually delete the onslaught of incoming pornographic spam, it's virtually impossible to police these kinds of posts when dozens of messages are coming in every second in channels that can sometimes have tens of thousands of users.

Whatever the reason is (even if it is porn), the question remains: how would inappropriate content posted to Telegram be any different from, say, content posted to Reddit?

Dumb as it sounds, it could be a matter of simple labeling and presentation. According to Apple's App Store guidelines, apps must include methods to filter "objectionable material from being posted to the app" and allow users to report offensive content:

[Image: Screenshot of Apple's App Store guidelines]

In the case of Reddit, offensive and inappropriate content posted is usually clearly marked under a NSFW tag or within an appropriately labeled subreddit. Users looking at such content on Reddit know what they could potentially encounter before looking at it.

The same can't be said for users going into a Telegram channel expecting to chat about an ICO, only to be hit with nudes. If this is the reason, then the app would certainly be in clear violation of the App Store guidelines.

Update:In an email obtained by 9to5Mac, Apple's senior vice president of worldwide marketing Phil Schiller reportedly told a concerned user the Telegram apps were briefly removed because of child pornography:

The App Store team worked with the developer to have them remove this illegal content from the apps and ban the users who posted this horrible content. Only after it was verified that the developer had taken these actions and put in place more controls to keep this illegal activity from happening again were these apps reinstated on the App Store.

We will never allow illegal content to be distributed by apps in the App Store and we will take swift action whenever we learn of such activity. Most of all, we have zero tolerance for any activity that puts children at risk – child pornography is at the top of the list of what must never occur. It is evil, illegal, and immoral.

I hope you appreciate the importance of our actions to not distribute apps on the App Store while they contain illegal content and to take swift action against anyone and any app involved in content that puts children at risk.
