By default, no. Your uploads are not used to train or improve our models unless you explicitly opt in.
Our principle: We believe your proprietary data is yours alone. While many AI services treat your uploads as their training library, BackgroundErase only uses your data to improve our models if you explicitly give us the green light.
This is one of the most important parts of how BackgroundErase handles customer data. We do not assume that uploading an image gives us permission to use it as future training data. Model improvement based on customer uploads is opt-in, not opt-out.
If you do not explicitly turn that setting on, we do not retain your data for over 24 hours. That point is central to our privacy-first approach, and we want to make it unmistakably clear.
The short answer
Your uploads are not used to train our models by default. We only use your data to improve our models if you explicitly give us permission by enabling the opt-in feature at the top of your account page.
No opt-in means no training use and no retention over 24 hours.
Where this setting lives
This choice is controlled from backgrounderase.com/account. At the top of the account page, there is an opt-in feature that controls whether you allow your data to be used for model improvement.
If that setting is not turned on, your uploads are not part of our training workflow, and we do not retain the data for over 24 hours. We want the default behavior to be safe and privacy-conscious even for users who never touch the setting.
Why we made this opt-in instead of opt-out
Many AI services operate on the assumption that uploads can be turned into future training data unless the user finds a way to stop it. We disagree with that default. We believe proprietary and commercial image data should not quietly become part of a model-improvement pipeline without clear, affirmative permission from the customer.
That is why BackgroundErase uses an explicit opt-in model. If a customer wants to contribute data to help improve the models, they can choose to do that. If they do not make that choice, their uploads are simply not used that way.
Our view: silence should not be treated as permission to turn customer uploads into a training library.
What happens if I do not opt in?
If you do not enable the opt-in feature, your uploads are not used to improve our models. Just as importantly, we do not retain your data for over 24 hours.
This is a key part of our privacy-first position and one of the major reasons privacy-conscious users and companies choose BackgroundErase. It means the default workflow is designed around limited retention and explicit permission, not around silent reuse.
- No training use by default
- No retention over 24 hours without opt-in
- A default state designed to minimize data reuse
- Customer control over whether broader model-improvement use is allowed
What happens if I do opt in?
If you explicitly enable the opt-in control at the top of your account page, you are giving us permission to use your data to help improve our models. That is a user-directed choice, not a hidden default.
The important point is that this only happens after a clear affirmative action from you. We do not infer that permission from normal product usage alone.
Why this matters for businesses
Many customers use BackgroundErase in professional workflows: ecommerce, marketplaces, agencies, internal media pipelines, SaaS products, client work, and other commercial operations. In those environments, teams often care deeply about whether uploaded images are being retained or reused to train future models.
A privacy-first default matters because it reduces ambiguity. Teams do not have to guess whether every uploaded image is silently feeding a model-improvement system. Instead, the rule is simple and explainable: no opt-in means no training use and no retention over 24 hours.
- Stronger fit for proprietary product imagery
- Clearer for agencies handling client-owned media
- Better for SaaS companies processing customer uploads
- More comfortable for privacy-sensitive production workflows
This is one of our biggest trust commitments
We want to be very direct about this because it is one of the most important trust questions people ask AI companies. Many customers are not just looking for a model that works well. They are looking for a vendor whose data posture is clear, limited, and compatible with business use.
That is why we emphasize this so strongly: if you do not explicitly opt in, we do not use your uploads to improve our models, and we do not retain your data for over 24 hours.
Why it matters: this is not a buried opt-out training policy. It is a deliberate privacy-first default.
Training policy and ownership are separate
The model-training policy is separate from ownership. Your uploaded images remain your exclusive property. We do not claim intellectual property rights over them. Separately, we also do not use them for model improvement unless you explicitly opt in.
Together, those two points create a much clearer customer relationship: your data remains yours, and broader training use requires your explicit permission.
The simplest version of the policy
BackgroundErase does not use your uploads to train or improve our models by default. We only do that if you explicitly opt in. If you do not enable that setting, we do not retain your data for over 24 hours.
How to review your setting
If you want to confirm how your account is configured, go to backgrounderase.com/account and check the opt-in control at the top of the page. That setting determines whether you have explicitly allowed your data to be used for model improvement.
If it is not turned on, your uploads are not used to train our models, and they are not retained for over 24 hours.
