Credia AI Security and Privacy

How Credia AI handles your data in Re-Leased — covering data protection, privacy commitments, compliance with GDPR and privacy laws, and advanced user permissions. Your data is never used to train AI models.

Updated this week

Re-Leased is committed to being transparent about how Credia AI handles your data. We understand that your information is sensitive and confidential, and we have designed Credia AI to meet high standards of privacy and security from the ground up.


How We Protect Your Data

Credia AI is developed in-house by Re-Leased, giving us full control over the entire product lifecycle. Credia runs on Microsoft Azure cloud infrastructure, which provides enterprise-grade security. Your data is never used to train AI models for other Re-Leased customers or for third-party applications.


Our Commitment to Privacy

We do not use your data to refine or improve any models. Credia AI runs general-purpose Large Language Models (LLMs) on our managed cloud infrastructure, keeping your data completely separate from any model training process.


Compliance with Privacy Laws

Credia AI is built to the same privacy standards as all other Re-Leased products. By using Re-Leased, you benefit from our compliance with applicable privacy regulations, including GDPR.


Advanced User Permissions

For businesses that require an additional layer of access control, Re-Leased offers an advanced user permissions add-on. Learn more about configuring user permissions to control how Re-Leased works across your organisation.
