Kernl, AI and the Australian Privacy Principles
Kernl leverages OpenAI’s API platform, which is designed with robust security measures and privacy protocols to handle Personally Identifiable Information (PII) responsibly, in alignment with Australian data protection standards. OpenAI’s commitment to data privacy and adherence to international standards make it a reliable choice for Australian organisations concerned about data security and privacy.
The key measures are outlined below.
1. Data Usage and Model Training
By default, OpenAI does not use data submitted through its API to train or improve its models. This means that any inputs or outputs processed via Kernl are not utilised for model training.
2. Data Encryption and Security Measures
All data transmitted to and from OpenAI’s services is encrypted using industry-standard protocols: AES-256 encryption for data at rest and TLS 1.2 or higher for data in transit. These measures ensure that your data remains secure against unauthorised access.
3. Data Redaction
We take data security seriously by ensuring that personally identifiable information (PII) is not exposed to external large language models (LLMs). Before any message reaches an external LLM, we use a best-in-class machine learning tool to detect and redact PII. The AI then works with placeholder values, and as the response is streamed back to the user, we replace those placeholders with the original values in real time. This means the user experiences a seamless conversation, while the AI never sees any sensitive data. Redaction occurs both in the input (user message, chat history, and system context) and in the outputs of tool calls, all without compromising our real-time streaming capabilities.
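The redact-then-restore flow described above can be sketched in miniature. This is a simplified illustration only: the regex detectors stand in for the machine-learning PII detector, and the placeholder format is an assumption, not Kernl's actual implementation. The key ideas it demonstrates are (1) swapping PII for placeholders before the text leaves the system, and (2) restoring originals in streamed chunks, buffering when a placeholder is split across chunk boundaries.

```python
import re

# Simplified regex detectors as a stand-in for the ML-based PII tool
# (patterns and placeholder format are illustrative assumptions).
PII_PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\+?\d[\d \-]{7,}\d"),
}


def redact(text):
    """Replace detected PII with placeholders; return redacted text and mapping."""
    mapping = {}
    counter = 0
    for label, pattern in PII_PATTERNS.items():
        def substitute(match):
            nonlocal counter
            counter += 1
            key = f"<{label}_{counter}>"
            mapping[key] = match.group(0)  # remember the original value
            return key
        text = pattern.sub(substitute, text)
    return text, mapping


def restore_stream(chunks, mapping):
    """Re-insert original values into streamed chunks in real time.

    Buffers any trailing text that could be the start of a placeholder,
    so placeholders split across chunk boundaries are still restored.
    """
    buf = ""
    for chunk in chunks:
        buf += chunk
        for key, value in mapping.items():
            buf = buf.replace(key, value)
        cut = buf.rfind("<")
        if cut != -1 and ">" not in buf[cut:]:
            yield buf[:cut]   # emit the safe prefix
            buf = buf[cut:]   # hold a possible partial placeholder
        else:
            yield buf
            buf = ""
    if buf:
        yield buf
```

For example, redacting "Email jane@example.com please." yields "Email &lt;EMAIL_1&gt; please."; if the model's reply streams back in five-character chunks, `restore_stream` still reassembles and restores the email address even though the placeholder arrives split across chunks.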
4. Compliance with International Standards
OpenAI’s API services adhere to several international compliance standards, including SOC 2 Type 2, CCPA, and GDPR. These certifications confirm that OpenAI’s data handling practices meet rigorous security and privacy requirements.
5. Data Residency
We have chosen Singapore as our data residency region with OpenAI because it offers strong alignment with Australia’s privacy framework, particularly the Australian Privacy Principles (APPs). Singapore’s Personal Data Protection Act (PDPA) provides a comparable level of protection for personal information, with similar requirements around consent, access, and security. Additionally, Singapore’s close geographic proximity to Australia helps minimise latency and improves service performance, while also being a trusted and stable jurisdiction for data processing. This makes it a practical and legally sound choice for ensuring the security and compliance of client data.
6. Data Retention Policies
OpenAI retains API data for a maximum of 30 days to monitor for abuse and misuse. This data is not used for model training and is only accessible by authorised personnel.
7. Audio Retention
Kernl does not retain any audio data. Audio is processed solely for the purpose of generating transcripts, after which it is immediately discarded. No audio recordings are stored or accessible after transcription is complete.