Product · Dev / Infra
Before any AI-generated SQL touches your database, it should pass through a Safe Execution Layer — a deterministic pre-DB boundary for risk detection, dry-run impact estimation, and replayable auditing.
Works with PostgreSQL / SQLite · Designed for LLM agents, coding tools, and internal automation.
LLM-assisted development tools generate production-grade SQL — but they also hallucinate schemas, produce destructive writes, and behave nondeterministically. Most teams respond with manual review, backups, and shadow databases.
That doesn't scale when you have coding agents, internal assistants, and automated workflows generating SQL every minute.
"Any SQL generated by AI tools must be manually checked. We back up tables before running anything."
— Senior Analytics Engineer, data consulting (reported workflow pattern)
Our view
AI can propose SQL. A deterministic Safe Execution Layer should decide what actually runs.
What it is
DB Safe Execution Layer sits between any SQL-generating client (LLM agents, coding tools, internal scripts) and your database. It parses, simulates, classifies, and logs every statement before deciding whether it is allowed to execute.
It doesn't change your database engine or ORM. It is a thin, explicit execution boundary that makes AI-driven SQL safe, auditable, and replayable.
Works out-of-the-box with PostgreSQL / SQLite
AST-based parsing detects statement type, accessed tables, predicates, and destructive patterns before execution.
Rewrite DELETE / UPDATE into SELECT COUNT(*) to estimate affected rows without touching data.
Rule-based LOW / MEDIUM / HIGH / CRITICAL levels, with approval required for high-risk operations.
Record a database snapshot reference (e.g. txid / point-in-time marker) before executing approved writes.
Every step — precheck, dry-run, classification, snapshot, execution result — is captured in a structured log.
Treats SQL as input. Works with any LLM, coding agent, or automation framework that emits SQL.
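As a toy illustration of what such a static pre-check can extract, here is a minimal sketch. The function name and regex approach are ours, not the product's API; a production layer would parse to a real AST rather than pattern-match.

```python
import re

DESTRUCTIVE = {"DELETE", "UPDATE", "DROP", "TRUNCATE", "ALTER"}

def precheck(sql: str) -> dict:
    """Extract statement type, target table, and predicate presence.
    Regex-based toy; a real implementation walks a parsed AST."""
    stmt = sql.strip().rstrip(";")
    kind = stmt.split(None, 1)[0].upper()
    m = re.search(r"\b(?:FROM|UPDATE|INTO|TABLE)\s+([\w.]+)", stmt, re.I)
    return {
        "type": kind,
        "table": m.group(1) if m else None,
        "has_where": bool(re.search(r"\bWHERE\b", stmt, re.I)),
        "destructive": kind in DESTRUCTIVE,
    }

info = precheck("DELETE FROM visits WHERE visit_date < '2010-01-01';")
# info["type"] == "DELETE", info["table"] == "visits", info["destructive"] is True
```

The point is that everything downstream (dry-run, classification, gating) keys off this structured view of the statement, not the raw string.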
Demo flow
A developer or agent sends:
-- LLM-generated SQL
DELETE FROM visits WHERE visit_date < '2010-01-01';
Static pre-check
Parse SQL, detect DELETE, target table visits, and the filter predicate.
Dry-run rewrite
Generate SELECT COUNT(*) FROM visits WHERE visit_date < '2010-01-01' and execute safely.
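The rewrite step can be sketched as follows; the helper name is hypothetical and the regexes handle only simple DELETE/UPDATE shapes, but the demo runs against a throwaway SQLite database to show how impact is estimated without writing a row.

```python
import re
import sqlite3

def to_count_query(sql: str) -> str:
    """Rewrite DELETE/UPDATE into SELECT COUNT(*) over the same table
    and predicate, so impact is estimated without touching data."""
    stmt = sql.strip().rstrip(";")
    m = re.match(r"DELETE\s+FROM\s+([\w.]+)(?:\s+(WHERE\s+.*))?$", stmt, re.I | re.S)
    if m is None:
        m = re.match(r"UPDATE\s+([\w.]+)\s+SET\s+.*?(WHERE\s+.*)?$", stmt, re.I | re.S)
    if m is None:
        raise ValueError("only DELETE/UPDATE handled in this sketch")
    table, where = m.group(1), m.group(2) or ""
    return f"SELECT COUNT(*) FROM {table} {where}".strip()

# Demo against an in-memory SQLite database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE visits (id INTEGER, visit_date TEXT)")
conn.executemany("INSERT INTO visits VALUES (?, ?)",
                 [(1, "2009-05-01"), (2, "2012-03-02"), (3, "2008-11-11")])
dry = to_count_query("DELETE FROM visits WHERE visit_date < '2010-01-01';")
affected = conn.execute(dry).fetchone()[0]  # 2 rows would be deleted
```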
Risk classification
If affected_rows > threshold → classify as HIGH risk.
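A minimal version of the rule table might look like this (the thresholds and the function name are illustrative; real policies would be configurable per table and environment):

```python
def classify(stmt_type: str, affected_rows: int, threshold: int = 1000) -> str:
    """Map statement type + estimated impact to a risk level.
    Toy rules; production policies are configuration, not code."""
    if stmt_type in {"DROP", "TRUNCATE", "ALTER"}:
        return "CRITICAL"          # schema-destructive, regardless of rows
    if stmt_type in {"DELETE", "UPDATE"}:
        if affected_rows > threshold:
            return "HIGH"          # triggers the approval gate
        return "MEDIUM" if affected_rows > 0 else "LOW"
    return "LOW"                   # reads and no-op writes
```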
Snapshot reference
Record a snapshot reference (e.g. postgres:txid:7428812) before any potential write.
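A sketch of how such a reference could be captured, assuming a DB-API connection whose `execute()` returns a cursor (as `sqlite3`'s does; with psycopg you would go through a cursor). `txid_current()` is a built-in PostgreSQL function; for SQLite, which has no transaction IDs, `PRAGMA data_version` serves as a coarse change marker.

```python
def snapshot_ref(conn, dialect: str) -> str:
    """Record a snapshot *reference*, not a data copy: a marker that an
    external restore/PITR mechanism can resolve later."""
    if dialect == "postgresql":
        txid = conn.execute("SELECT txid_current()").fetchone()[0]
        return f"postgres:txid:{txid}"
    if dialect == "sqlite":
        ver = conn.execute("PRAGMA data_version").fetchone()[0]
        return f"sqlite:data_version:{ver}"
    raise ValueError(f"unsupported dialect: {dialect}")

import sqlite3
conn = sqlite3.connect(":memory:")
ref = snapshot_ref(conn, "sqlite")  # e.g. "sqlite:data_version:1"
```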
Gated execution
Show: 'This operation will delete 3214 rows from visits. Approve? (yes/no)'. Block by default.
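The gate itself is a few lines once the risk level is known. This sketch denies by default and takes the approver as a parameter so it can be an interactive prompt, a Slack callback, or a test stub (names are ours, not the product's API):

```python
def gate(risk: str, affected_rows: int, table: str, approver=input) -> bool:
    """Deny by default: HIGH requires an explicit 'yes'; CRITICAL never
    auto-executes in this sketch."""
    if risk in {"LOW", "MEDIUM"}:
        return True
    if risk == "CRITICAL":
        return False
    prompt = (f"This operation will delete {affected_rows} rows "
              f"from {table}. Approve? (yes/no) ")
    return approver(prompt).strip().lower() == "yes"
```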
Audit + replay
Write a JSON log with all steps and decisions. Replay later without touching the LLM or re-executing writes.
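One entry per statement is enough for replay; appended as JSON lines, a replay tool can walk every decision without re-running any writes. The field names below are illustrative, not the product's actual log schema:

```python
import json
import time

def audit_record(sql, precheck, estimated_rows, risk, snapshot_ref, executed):
    """One structured audit entry covering every step and decision."""
    return {
        "ts": time.time(),
        "sql": sql,
        "precheck": precheck,
        "estimated_rows": estimated_rows,
        "risk": risk,
        "snapshot_ref": snapshot_ref,
        "executed": executed,
    }

entry = audit_record(
    sql="DELETE FROM visits WHERE visit_date < '2010-01-01';",
    precheck={"type": "DELETE", "table": "visits"},
    estimated_rows=3214,
    risk="HIGH",
    snapshot_ref="postgres:txid:7428812",
    executed=False,
)
line = json.dumps(entry)  # append to an audit .jsonl file
```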
Dev edition
1. Clone the repository
git clone https://github.com/interact-space/database-safe-layer.git
2. Create virtual environment & install dependencies
cd database-safe-layer
python -m venv venv
source venv/bin/activate   # Windows: venv\Scripts\activate
pip install -r requirements.txt
3. Configure database connection (.env)
DB_URL=postgresql://user:password@localhost:5432/db_name
4. Run the demo
python -m db_safe_layer.app
Roadmap
Phase 0 · Now
DB Safe Execution Layer (Dev Edition)
Phase 1 · Next
Enterprise features with design partners
Phase 2
Deterministic Execution DAG for agents
Design Partner Program
If your team is exploring AI → SQL automation, coding agents, or internal developer tools, and you need guaranteed safety before touching databases, we'd like to co-build with you.
You get early access, direct collaboration with our engineering team, and influence over the product. We gain real workload insights to shape the enterprise edition.
Apply as Design Partner
About
DB Safe Layer was not born as a generic infra product.
We originally built it as an internal tool while working on OMOP-based analytics and AI-assisted SQL in medical R&D, where every query had to be both safe to run and fully auditable.
Most existing tools could guarantee neither, so we built a pre-DB Safe Execution Layer: dry-run, risk analysis, snapshots, and replayable audit, all before any SQL touched a real database.
Only later did we realize the same pattern is needed far beyond healthcare.