Child Safety
Zero tolerance. Cloud Portrait has an absolute zero-tolerance policy for child sexual abuse material (CSAM) and any content that sexually exploits or endangers minors. All confirmed violations are reported to the National Center for Missing & Exploited Children (NCMEC) and relevant law-enforcement authorities worldwide.
Cloud Portrait is a portrait-photography community built for adults and young people who love the art form. The safety of children is a non-negotiable priority, and we are committed to upholding the standards set out on this page.
These standards apply to all users, all content, and all features of Cloud Portrait across every country where the app is available.
Content and behaviour that sexually exploit or endanger minors are strictly prohibited and will result in immediate account termination and reporting to authorities.
No exceptions. These prohibitions apply regardless of claimed artistic, educational, or cultural context. There is no content or circumstance under which CSAM or child-exploitative content is permitted on Cloud Portrait.
All images uploaded to Cloud Portrait are scanned using industry-standard hash-matching technology (PhotoDNA or equivalent) that compares uploaded content against databases of known CSAM maintained by NCMEC and other authorised organisations. Matched content is immediately blocked and reported.
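For illustration, a minimal sketch of hash-based blocklist matching is shown below. The KNOWN_HASHES set, the function names, and the use of a SHA-256 digest are assumptions made for this example only; PhotoDNA and similar tools rely on proprietary perceptual hashing that tolerates resizing and re-encoding, which this sketch does not reproduce.

```python
import hashlib

# Hypothetical blocklist of known-CSAM digests. In practice this data is
# derived from hash lists maintained by NCMEC and other authorised
# organisations and never leaves the scanning service.
KNOWN_HASHES: set[str] = set()

def scan_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches a known hash and must be blocked.

    Illustration only: a plain SHA-256 digest matches byte-identical files,
    whereas perceptual hashes (e.g. PhotoDNA) survive re-encoding.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

def handle_upload(image_bytes: bytes) -> str:
    if scan_upload(image_bytes):
        # Matched content is blocked immediately and reported, as described above.
        return "blocked_and_reported"
    return "accepted"
```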
Reports submitted by users are reviewed by trained moderators. Content flagged as potentially involving child exploitation is escalated immediately, reviewed with priority, and actioned within 24 hours.
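As a rough sketch of how prioritised triage might work, the example below places child-safety reports at the front of a review queue and attaches the 24-hour action deadline. The class, the priority table, and the deadline used for other report types are illustrative assumptions, not a description of Cloud Portrait's actual moderation tooling.

```python
import heapq
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

# Illustrative priority levels: child-safety reports always sort first.
PRIORITY = {"child_safety": 0, "harassment": 1, "spam": 2}

@dataclass(order=True)
class Report:
    priority: int
    received_at: datetime = field(compare=False)
    reason: str = field(compare=False)
    content_id: str = field(compare=False)

    @property
    def action_deadline(self) -> datetime:
        # Child-safety reports carry the 24-hour deadline stated above;
        # the 72-hour figure for everything else is an assumption.
        hours = 24 if self.reason == "child_safety" else 72
        return self.received_at + timedelta(hours=hours)

queue: list[Report] = []

def submit(reason: str, content_id: str) -> None:
    report = Report(PRIORITY[reason], datetime.now(timezone.utc), reason, content_id)
    heapq.heappush(queue, report)  # child-safety reports surface first

def next_for_review() -> Report | None:
    return heapq.heappop(queue) if queue else None
```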
Upon detecting a violation, Cloud Portrait will block and remove the content, terminate the responsible account, and report the violation to NCMEC and relevant law-enforcement authorities.
If you encounter content on Cloud Portrait that you believe involves the sexual exploitation or abuse of a child, report it immediately.
Use the in-app report function on any post or profile, or contact us directly at the email below.
Report to Safety Team
On any photo, album, or user profile, tap the ⋯ menu → Report and select "Child safety". Your report is sent directly to our safety team and is treated as highest priority.
For urgent child safety matters, email alan.wm@hotmail.com. Include as much detail as possible — screenshots, usernames, and links help us act faster.
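For illustration only, the sketch below shows what an in-app "Child safety" report payload could look like when it is routed to the safety team. The helper, the field names, and the "child_safety" reason code are assumptions; Cloud Portrait's actual report format is not published.

```python
import json
from datetime import datetime, timezone

def build_child_safety_report(content_id: str, reporter_id: str,
                              screenshots: list[str], notes: str) -> str:
    """Assemble a hypothetical child-safety report payload as JSON."""
    payload = {
        "reason": "child_safety",        # routed as highest priority
        "content_id": content_id,        # photo, album, or profile being reported
        "reporter_id": reporter_id,
        "screenshots": screenshots,      # extra detail helps the team act faster
        "notes": notes,
        "submitted_at": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(payload)
```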
You are also encouraged to report CSAM directly to the relevant authority in your country; you do not need to wait for Cloud Portrait to act. In the United States, reports can be made to NCMEC's CyberTipline.
Cloud Portrait cooperates fully and without delay with law enforcement agencies investigating child exploitation, and we respond promptly to legitimate requests from those agencies.
We do not require payment to respond to legitimate law-enforcement requests related to child safety.
Cloud Portrait requires users to be at least 13 years old to create an account, in compliance with the US Children's Online Privacy Protection Act (COPPA). In jurisdictions where the age of digital consent is higher (up to 16 in EU/EEA member states under GDPR Article 8, and 18 in certain countries), the higher age requirement applies.
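A minimal sketch of applying this higher-age-wins rule at registration follows. The country-to-age table is an incomplete, illustrative assumption; real values differ by jurisdiction and would need to be maintained with legal review.

```python
from datetime import date

# Illustrative digital-consent ages only; 13 (COPPA) is treated as the floor.
CONSENT_AGE_BY_COUNTRY = {
    "US": 13,   # COPPA baseline
    "DE": 16,   # GDPR Art. 8 default retained
    "IE": 16,
    "FR": 15,
    # Jurisdictions that require 18 would also be listed here.
}

def minimum_age(country_code: str) -> int:
    """Higher of the COPPA baseline (13) and the local digital-consent age."""
    return max(13, CONSENT_AGE_BY_COUNTRY.get(country_code, 13))

def may_register(birth_date: date, country_code: str, today: date | None = None) -> bool:
    today = today or date.today()
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    return age >= minimum_age(country_code)
```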
We do not knowingly collect personal data from children below the applicable minimum age. If we become aware that an account has been created by an underage user, we will terminate the account and delete the personal data associated with it.
If you believe a child below the minimum age has created an account, please contact us at alan.wm@hotmail.com.
Prevention is as important as enforcement, and Cloud Portrait is committed to both.
Organisations such as NCMEC provide help, support, and additional information on child safety online.
For child safety matters, contact our dedicated safety team at alan.wm@hotmail.com.
We aim to acknowledge all child safety reports within 2 hours and take enforcement action within 24 hours of a confirmed violation.
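As a simple illustration of those targets, the sketch below derives the two deadlines from a report's timestamps; the constant and function names are hypothetical.

```python
from datetime import datetime, timedelta

ACK_SLA = timedelta(hours=2)      # acknowledge every child safety report
ACTION_SLA = timedelta(hours=24)  # act once a violation is confirmed

def sla_deadlines(report_received: datetime, violation_confirmed: datetime | None):
    """Return (acknowledgement deadline, enforcement deadline or None)."""
    ack_by = report_received + ACK_SLA
    act_by = violation_confirmed + ACTION_SLA if violation_confirmed else None
    return ack_by, act_by
```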
These standards are published externally in compliance with app store requirements including the Google Play Developer Policy on Child Safety and the Apple App Store Review Guidelines.