Apple updates App Store rules for communication app safety
Apple has updated App Store moderation rules, tightening requirements for communication apps, especially anonymous chats and user-generated content, to enhance safety and compliance.
Apple has updated its App Store moderation rules, significantly tightening requirements for communication apps. The changes primarily affect anonymous chats and user-generated content services that, in the company's view, don't provide sufficient safety levels.
Under the revised rules, Apple reserves the right to remove apps without prior notice if they facilitate the spread of illegal content, misinformation, bullying, or other harmful user interactions. Special attention will be given to chats without mandatory registration, moderation tools, and reporting mechanisms.
The company emphasizes that developers must actively monitor user content, employ both automated and manual moderation systems, and respond promptly to complaints. Apps that can't demonstrate compliance with these requirements risk removal from the store.
These new rules could seriously impact popular anonymous communication platforms, especially those built around free message exchange without user identification. While Apple hasn't specified which apps might be removed first, developers are advised to bring their services in line with the updated policy without delay.