BackgroundErase is built for legitimate, lawful, and professional image processing workflows. Most normal product photography, ecommerce assets, portraits, inventory images, creative work, and other standard visual content are fine to upload. However, some categories of content are strictly prohibited because they are exploitative, sexually explicit, abusive, dangerous, or illegal.
Important: BackgroundErase uses automated safety filters to detect and block prohibited content in real time. These systems are designed to identify high-risk material without routine manual intervention, helping keep your data private and secure at the infrastructure level.
We want this policy to be clear. BackgroundErase is not a place to process abusive, exploitative, extremist, or pornographic material. If content falls into a prohibited category, it is not allowed on the platform.
Content that is not allowed
The following types of content are prohibited on BackgroundErase:
- Pornographic or Sexually Explicit Adult Content: You may not use BackgroundErase to process pornographic or sexually explicit adult content, including content primarily intended for sexual stimulation.
- CSAM (Child Sexual Abuse Material): We have a zero-tolerance policy for CSAM. Any such content is strictly prohibited. When applicable, we will report such material to the National Center for Missing & Exploited Children (NCMEC).
- Non-Consensual Intimate Imagery (NCII): You may not use BackgroundErase to process intimate imagery without the subject’s consent. This includes deepfakes or any imagery intended to harass, sexually exploit, or violate the privacy of another person.
- Violence and Gore: Extremely graphic or gratuitous violence is prohibited.
- Hate Speech & Harassment: Content that promotes violence or incites hatred against protected groups is prohibited.
Pornographic and sexually explicit content is not allowed
BackgroundErase is a professional image processing service, and it may not be used to process pornographic or sexually explicit adult content. This includes content primarily intended for sexual arousal or sexual gratification.
Non-explicit commercial, editorial, fashion, beauty, or artistic imagery is generally fine so long as it is not pornographic, sexually explicit, exploitative, or otherwise prohibited by this policy.
Our zero-tolerance policy on CSAM
CSAM is absolutely forbidden on BackgroundErase. There is no permitted use case, no exception, and no ambiguity here. We maintain a zero-tolerance policy for this material.
If such content is detected, it will be blocked, and when applicable it will be reported to the National Center for Missing & Exploited Children (NCMEC). This is one of the clearest and most important boundaries in our content policy.
Zero tolerance means: CSAM is never allowed on the platform under any circumstances.
Non-consensual intimate imagery is prohibited
BackgroundErase may not be used to process non-consensual intimate imagery. That includes content intended to humiliate, harass, or sexually exploit another person, as well as fake or manipulated intimate imagery presented without consent.
This prohibition includes so-called “deepfakes” when they are used to create or process intimate or exploitative imagery of a person without their consent. A workflow being technically possible does not make it permissible on the platform.
Extremely graphic violence is not allowed
BackgroundErase is not intended for extremely graphic or gratuitous violent material. We prohibit content that is centered on explicit gore or severe graphic violence.
This boundary exists because some kinds of violent imagery are not just unpleasant, but clearly outside the legitimate, professional, and product-oriented use cases the platform is built for.
Hate speech and harassment are prohibited
Content that promotes violence or incites hatred against protected groups is not allowed on BackgroundErase. The platform may not be used to assist hateful, extremist, or abusive content workflows.
We also prohibit content designed to harass, dehumanize, or target people in ways that cross into hateful or abusive conduct. BackgroundErase is built for legitimate visual workflows, not for enabling harassment or extremist abuse.
How enforcement works
BackgroundErase uses automated safety filters to detect and block prohibited content in real time. These systems are designed so that automated detection, rather than routine manual review, serves as the first line of defense.
This matters for two reasons. First, it helps prevent misuse of the platform. Second, it supports our privacy-first approach by reducing unnecessary manual handling of sensitive user content at the infrastructure level.
In practice: our systems are designed to block prohibited content in real time, helping keep the platform safer while preserving a more privacy-conscious operational model.
What kinds of content are normally fine?
Most ordinary and legitimate image-processing workflows are fine. For example, users generally upload product images, portraits, inventory photos, social media assets, catalog shots, marketing images, creative work, and similar content for everyday business or creative purposes.
The line is not “personal versus commercial.” Commercial use is allowed. The line is whether the content is pornographic, sexually explicit, abusive, exploitative, illegal, hateful, or otherwise prohibited. Content that is normally fine includes:
- Product and ecommerce photography
- Studio portraits and business headshots
- Car inventory and marketplace listing photos
- Client creative assets and agency deliverables
- Brand, catalog, and ad creative
- Normal consumer and professional image editing workflows
Why this policy exists
A platform like BackgroundErase should be useful for legitimate work without becoming a tool for abuse. That is why the content policy draws a clear line around pornography, sexually explicit content, severe exploitation, sexual abuse material, non-consensual intimate imagery, graphic violent material, and hateful or violent extremist content.
We also want enforcement to work in a way that aligns with our broader privacy-first posture. Automated safety filtering helps us block clearly prohibited material while maintaining a more privacy-conscious infrastructure model.
The simplest version of the policy
You may use BackgroundErase for legitimate image processing, but you may not upload pornographic or sexually explicit content, CSAM, non-consensual intimate imagery, extremely graphic or gratuitous violence, or content that promotes violence or incites hatred against protected groups. High-risk material is blocked by automated safety systems in real time.
If you are unsure
If your use case involves normal business, creative, or consumer image processing, it is usually fine. If the content is pornographic, sexually explicit, or exploitative, or if it involves sexual abuse material, non-consensual intimate content, extreme gore, or hate-driven violence, it is not allowed.
When in doubt, use the simplest rule: if the content is pornographic, abusive, exploitative, or intended to harm people, it does not belong on BackgroundErase.
