Protecting sensitive text BEFORE external use is crucial for data privacy
AI services collect user inputs and data
Cloud-hosted privacy software is unacceptable, as it still exposes data
CamoText is an app that detects and hides sensitive text with a click
CamoText cannot communicate or transmit any data externally
No cloud storage, APIs, or internet needed
CamoText runs locally and is laptop-friendly
No cloud servers, no API calls, no external access
Save the output or paste into your LLM interface
De-identifies data using NLP and custom algorithms
Highlight-to-anonymize more text, and review findings by category
Designed for compliance with GDPR, CCPA, HIPAA, FISMA, and more, using advanced encryption
Zero data retention unless user-saved
Most such services' host companies still retain data in these modes for "safety" purposes, for periods ranging from 30 days to two years, or require periodic manual deletion. The level of privacy (set by mode default or user-saved preferences) can also change with a server-side update or company policy change, often without users' awareness. Finally, such companies can be compelled by legislation or judicial order to preserve data despite their stated policies.
One may use CamoText before exposing private data to these AI models for several reasons:
(1) to prevent organization-wide access due to security authorizations, conflicts, or ethical walls;
(2) organization-specific hosted AI logs and records are honeypots for malicious actors, exposing private data to considerably higher data breach risk; and
(3) private data deletion from the AI model, such as pursuant to a terminated client relationship or GDPR requirement, is extremely complex if not practically impossible.
It is worth noting that "on-prem" (on-premises), locally hosted AI models require specialized hardware and technical expertise for setup and maintenance (with accompanying orders-of-magnitude higher costs and subscription fees), and are generally less performant than third-party-hosted AI models, which leverage massive amounts of compute power.
Models that can run on individual computers are extremely limited in capability and performance (especially with CPU rather than GPU processing), and often still require specialized hardware add-ons. Open-source models are also typically distributed without intuitive user interfaces. However, at CamoText we are constantly evaluating new LLM releases and whether local use of a general model on a consumer-grade computer is feasible. Make sure to follow us for when that becomes a reality!
Once installed, CamoText does not require an internet connection, use cloud servers, or make any API calls. As a licensed desktop app, it also does not require a subscription. Unlike other popular de-identification and anonymization software, all processing happens locally on your device using advanced NLP (natural language processing) and pattern matching, ensuring maximum privacy: no data is transmitted externally unless the user chooses to share it.
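To make the idea concrete, here is a minimal, illustrative sketch of local de-identification via pattern matching. It is not CamoText's actual code (our detectors and categories are far broader); the patterns and function names below are assumptions chosen for the example, and everything runs offline with only the Python standard library.

    import re

    # Illustrative only: not CamoText's proprietary detectors. This sketch shows
    # the general technique of offline, regex-based pattern matching with
    # placeholder substitution; nothing here touches the network.
    PATTERNS = {
        "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
        "SSN":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
        "PHONE": re.compile(r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b"),
    }

    def detect(text):
        """Return (category, value) findings, computed entirely on the local machine."""
        return [(cat, m.group()) for cat, rx in PATTERNS.items() for m in rx.finditer(text)]

    def de_identify(text):
        """Replace each finding with its category placeholder."""
        for cat, rx in PATTERNS.items():
            text = rx.sub(f"[{cat}]", text)
        return text

    sample = "Email jane.doe@example.com or call 555-867-5309 about SSN 123-45-6789."
    print(detect(sample))       # findings listed by category
    print(de_identify(sample))  # "Email [EMAIL] or call [PHONE] about SSN [SSN]."

A real recognizer layers NLP-based named-entity detection for names, organizations, and locations on top of such patterns; the point here is only that every step can run offline.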
Other offline tools require familiarity with the command line, lack the same breadth of data-type recognition, and do not offer user-friendly interfaces.
CamoText was built by a law firm and is designed for flexible use in compliance with regulations like GDPR, CCPA, HIPAA, and FISMA. By processing data entirely offline, encrypting sensitive information locally before it interacts with external systems like AI, and neither retaining nor transmitting user data, it helps users meet stringent privacy requirements, while still allowing use at different compliance levels.
The user retains control over the output, whether de-identified, pseudonymized, or fully anonymized, including whether to save the anonymization key locally or let it be automatically deleted. User behavior and human-in-the-loop review, along with proper legal counsel for your particular circumstances, determine the level of compliance.
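As an illustration of what "saving the key" means in practice, here is a minimal sketch of reversible pseudonymization; the function names and file path are hypothetical, not CamoText's actual format. Each detected value is swapped for a numbered placeholder, and the reverse mapping either stays on the local disk or is simply never written.

    import json
    from pathlib import Path

    # Hypothetical sketch of reversible pseudonymization with a locally held key.
    def pseudonymize(text, findings):
        """Swap each detected value for a numbered placeholder; return text and key."""
        key = {}
        for i, (category, value) in enumerate(findings, start=1):
            placeholder = f"[{category}_{i}]"
            key[placeholder] = value
            text = text.replace(value, placeholder)
        return text, key

    def restore(text, key):
        """Re-identify text using the locally saved key (the reverse mapping)."""
        for placeholder, value in key.items():
            text = text.replace(placeholder, value)
        return text

    masked, key = pseudonymize("Invoice for jane.doe@example.com",
                               [("EMAIL", "jane.doe@example.com")])
    print(masked)                                               # Invoice for [EMAIL_1]
    Path("anonymization_key.json").write_text(json.dumps(key))  # user chooses to save the key locally...
    print(restore(masked, key))                                 # ...and can later re-identify the text

If the user instead lets the key be deleted, the placeholder mapping is unrecoverable and the output is effectively anonymized rather than pseudonymized.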
CamoText's software is designed for a human in the loop to inspect the output text for undetected items, false positives they wish to revert, or additional text they wish to obscure by simply highlighting it and clicking "Highlight to Anonymize". Sensitivity and confidentiality can be subjective to the user, and even the best NLP recognizers are not perfect, so a human's review is always necessary. Our software is intended to be an efficient, easy-to-use privacy tool for superpowered human productivity.
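Conceptually, a "Highlight to Anonymize" action reduces to masking whatever character range the user selected, whether or not the automatic recognizers flagged it. The sketch below uses hypothetical names and an invented sentence purely for illustration.

    # Hypothetical sketch of the manual, human-in-the-loop step: mask exactly
    # the span the user highlighted, regardless of automatic detection.
    def anonymize_selection(text, start, end, label="CUSTOM"):
        """Replace the highlighted character range with a placeholder."""
        return text[:start] + f"[{label}]" + text[end:]

    draft = "The settlement amount was 1.2 million dollars."
    selection = "1.2 million dollars"          # what the user highlighted
    start = draft.index(selection)
    print(anonymize_selection(draft, start, start + len(selection), label="AMOUNT"))
    # -> "The settlement amount was [AMOUNT]."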
Try a limited in-browser demo
Download the latest release
Tailored options and features
See how CamoText can help you use the most powerful AI models while maintaining privacy
Contact Us