FAQ

AI Models & Data Privacy

This page documents every way Char stores and processes your data. Nothing is hidden.

Local Data Storage

All core data is stored locally on your device by default. Nothing leaves your machine unless you explicitly enable a cloud feature.

Where Files Live

Char uses two base directories. See Data for the full directory layout.

Global base (shared across stable and nightly builds):

  • db.sqlite — SQLite database (sessions, contacts, calendar events, tags)
  • models/stt/ — downloaded speech-to-text model files (Whisper GGUF, Argmax tarballs)
  • store.json — app state (onboarding status, pinned tabs, recently opened sessions, dismissed toasts, analytics preference, auth tokens)
  • hyprnote.json — vault configuration (custom vault path if set)
  • search/ — Tantivy full-text search index

On macOS, this is typically ~/Library/Application Support/hyprnote/. On Linux, ~/.local/share/hyprnote/.

Vault base (customizable, defaults to the global base):

  • sessions/ — one subdirectory per session containing recorded audio, transcripts, notes, and attachments
  • humans/ — contact and participant data
  • organizations/ — organization data
  • chats/ — chat conversation data
  • prompts/ — custom prompt templates
  • settings.json — your app settings

Application logs are stored in the system app log directory as rotating files (app.log, app.log.1, etc.).

Is My Data Encrypted at Rest?

Char stores data as plain SQLite, JSON, and Markdown files on disk. Char does not add its own encryption layer. Your data is protected by your operating system's file permissions and any full-disk encryption you have enabled (such as FileVault on macOS or LUKS on Linux).
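Because the database is plain SQLite, you can audit it yourself with any SQLite client. A minimal Python sketch (the macOS path comes from the layout above; the exact table names depend on your build):

```python
import sqlite3
from pathlib import Path

# Global base on macOS (see "Where Files Live"); adjust for your platform.
db_path = Path.home() / "Library/Application Support/hyprnote/db.sqlite"

def list_tables(path):
    """Return the table names stored in a SQLite database file."""
    conn = sqlite3.connect(path)
    try:
        rows = conn.execute(
            "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")
        return [name for (name,) in rows]
    finally:
        conn.close()

if db_path.exists():
    print(list_tables(db_path))
```

No password or key is needed to open the file, which is exactly why OS-level full-disk encryption matters.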

What Data Leaves Your Device

The following sections document every case where Char sends data to an external server.

Analytics (Opt-Out Available)

Char collects anonymous usage analytics by default to help improve the product. You can disable this entirely in Settings.

What is sent:

  • Event names (e.g. "session_started", "transcription_completed")
  • App version, build identifier, git hash
  • A device fingerprint (a hashed machine identifier, not your name, email, or IP)
  • When signed in: your user ID and email (for account-linked analytics)

Where it goes:

How to disable: Go to Settings and turn off analytics. When disabled, no analytics events are sent.

Error Reporting (Sentry)

Char uses Sentry for crash reporting and error tracking in release builds.

What is sent:

  • Error messages, stack traces, and crash dumps
  • A device fingerprint (hashed machine ID)
  • App version and platform information
  • ERROR and WARN level log messages as events; INFO level logs as breadcrumbs for context

Where it goes:

  • Sentry — error monitoring and crash reporting service

Cloud Transcription (When Enabled)

When you use cloud-based transcription, your audio is sent to the selected provider for processing. When using local transcription (Whisper or Argmax models), your audio never leaves your device.

What is sent:

  • Your recorded audio (streamed in real time, or sent as a file for batch transcription)
  • Configured language preferences and keywords

Where it goes depends on your setup:

  • Pro curated models: Audio is proxied through pro.hyprnote.com (our server) and forwarded to a curated STT provider. The proxy does not store your audio.
  • BYOK (Bring Your Own Key): Audio is sent directly from your device to the provider you selected.

Supported cloud STT providers:

  • Deepgram — Privacy Policy
  • AssemblyAI — Privacy Policy
  • Soniox — Privacy Policy
  • Gladia — Privacy Policy
  • OpenAI — Privacy Policy
  • ElevenLabs — Privacy Policy
  • DashScope — Privacy Policy
  • Mistral — Privacy Policy
  • Fireworks AI — Privacy Policy

Cloud LLM (When Enabled)

When you use cloud-based AI features (summaries, enhanced notes, chat), your session content is sent to the selected LLM provider. When using local LLMs (LM Studio or Ollama), everything stays on your device.

What is sent:

  • Transcript text, raw notes, and prompt templates
  • The content you are asking the AI to process

Where it goes depends on your setup:

  • Pro curated models: Requests are proxied through pro.hyprnote.com and forwarded to a curated LLM provider. Nothing is stored by our proxy.
  • BYOK providers: Requests are sent directly to the provider you selected (OpenAI, Anthropic, Google, or Mistral).
  • Local LLMs: Everything stays on your device. See Local LLM Setup.
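With a local runtime, the request never leaves localhost. A hypothetical sketch against Ollama's default HTTP API (port 11434 is Ollama's default; the model name "gemma" is illustrative and must already be pulled):

```python
import json
import urllib.request

def summarize_locally(prompt, model="gemma", host="http://localhost:11434"):
    """Send a prompt to a local Ollama server; the request stays on this machine."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

LM Studio exposes a similar local HTTP endpoint; either way, transcript text is only ever posted to a loopback address.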

MCP Tools (Pro Only)

Pro users have access to MCP tools for web search and URL reading during AI-assisted note generation.

What is sent:

  • Search queries (for web search)
  • URLs (for content extraction)

Where it goes:

  • Proxied through pro.hyprnote.com
  • Exa — web search
  • Jina AI — URL content reading

Network Connectivity Check

Char periodically checks if your device is online.

What happens:

  • An HTTP HEAD request to https://www.google.com/generate_204 every 2 seconds
  • No user data, cookies, or identifiers are included
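The same check can be reproduced in a few lines. A sketch of the idea (the endpoint returns an empty 204 response, so no body is transferred):

```python
import urllib.request

def is_online(url="https://www.google.com/generate_204", timeout=3):
    """Return True if the connectivity endpoint answers with HTTP 204."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status == 204
    except OSError:
        return False
```

The request carries no cookies or identifiers beyond what any plain HTTP client sends.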

Model Downloads

When you download a local STT model, Char fetches the model file from a hosting server.

What happens:

  • Standard HTTP download requests to S3 (Whisper models) or Argmax hosting servers
  • No user data is sent
  • Downloaded models are verified with checksums before use
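A checksum check of this kind is straightforward to sketch. The following is illustrative, not Char's actual implementation, and the use of SHA-256 specifically is an assumption:

```python
import hashlib

def verify_checksum(path, expected_sha256):
    """Hash a downloaded model file in chunks and compare to the expected digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest() == expected_sha256.lower()
```

Hashing in fixed-size chunks keeps memory flat even for multi-gigabyte model files.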

App Updates

Char checks for updates using the Tauri updater system.

What is sent:

  • Your current app version and platform

Where it goes:

  • CrabNebula — release hosting and update distribution

Authentication (When Signed In)

When you sign in for Pro or cloud features, Char authenticates via Supabase.

What is stored locally:

  • Auth session tokens in the local Tauri store (store.json)
  • Account info: user ID, email, full name, avatar URL

What is sent:

  • Authentication requests to Supabase during sign-in
  • Auth tokens to pro.hyprnote.com when using Pro features
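Because store.json is plain JSON, you can see exactly what is persisted by opening it yourself. A small sketch (path per "Where Files Live" above; the key names are whatever your build writes):

```python
import json

def store_keys(path):
    """List the top-level keys saved in Char's local store.json."""
    with open(path, encoding="utf-8") as f:
        return sorted(json.load(f))
```

Run against the global base, the output should correspond to the app-state items listed earlier (onboarding status, auth tokens, and so on).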

Cloud Database Sync (Optional)

Char supports optional cloud database sync.

When enabled: Your session data can be synced to a remote database. This is only active if you explicitly configure a cloud database connection.

When not configured: All data stays in the local SQLite database.

What AI Models Does Char Use?

Char uses two types of AI models: speech-to-text (STT) for transcription and large language models (LLMs) for generating summaries and notes.

Speech-to-Text Models

Local models run entirely on your device:

  • Whisper models (QuantizedTiny, Base, Small, LargeTurbo) — downloaded as GGUF files
  • Argmax models (ParakeetV2, ParakeetV3, WhisperLargeV3) — downloaded as tarballs

For local model details and download instructions, see Local Models.

Cloud models require sending audio to a provider. See the cloud transcription section above.

Large Language Models

Pro Curated Models — Subscribe to Pro for curated cloud AI models that work out of the box.

BYOK (Bring Your Own Key) — Enter your own API key for OpenAI, Anthropic, Google, or Mistral.

Local Models — Run models locally using LM Studio or Ollama. See Local LLM Setup.

Recommended local models: Gemma (Google, good balance of quality and performance) and Qwen (Alibaba, strong multilingual support).

Does Char Train AI Models on My Data?

No. Char does not use your recordings, transcripts, or notes to train AI models. When using cloud providers, your data is processed according to their respective privacy policies, but Char itself does not collect or use your data for training.

What Char Does NOT Do

  • Does not train AI models on your data
  • Does not sell your data to third parties
  • Does not collect your audio, transcripts, or notes for any purpose other than the features you explicitly use
  • Does not send your meeting content in analytics — analytics only includes event names and app metadata

Can I Use Char Completely Offline?

Yes, with local models. You can record audio, transcribe with a local Whisper or Argmax model, and generate summaries with a local LLM (LM Studio or Ollama) — all without an internet connection.

The only background network requests in a fully local setup are the connectivity check (HEAD to google.com) and update checks to CrabNebula.

How to Maximize Privacy

  1. Use a local STT model for transcription — see Local Models
  2. Use a local LLM (LM Studio or Ollama) for AI features — see Local LLM Setup
  3. Disable analytics in Settings
  4. Do not sign in or enable any cloud features
  5. Enable full-disk encryption on your OS (FileVault, LUKS, BitLocker)

Open Source

Char is open source. You can verify everything documented here by reading the code: