Free CSV to SQL Generator
Generate SQL INSERT statements from CSV data. Choose your database dialect, set the table name, configure batch inserts, handle nulls and type inference. Supports MySQL, PostgreSQL, and SQLite. Runs entirely in your browser.
How It Works
Paste Your CSV
Paste CSV with a header row — the column names become SQL column names in the INSERT statement. Click Sample CSV to load a product catalog with id, name, category, price, stock, and active columns — a good mix of integer, string, decimal, and boolean types that shows how type inference maps CSV values to SQL literals.
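For instance, assuming the sample catalog above, a single row maps to an INSERT like this (the values and the products table name are illustrative):

```sql
-- CSV input (the header row supplies the column names):
--   id,name,category,price,stock,active
--   1,Widget,Tools,19.99,42,true
-- Resulting INSERT (PostgreSQL dialect, type inference on):
INSERT INTO "products" ("id", "name", "category", "price", "stock", "active")
VALUES (1, 'Widget', 'Tools', 19.99, 42, TRUE);
```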
Configure SQL Options
Set the target table name and choose the SQL dialect: MySQL quotes identifiers with backticks, while PostgreSQL and SQLite use double quotes. Pick the batch size for multi-row INSERT statements, and toggle whether to include a CREATE TABLE statement with inferred column types at the top of the output. Enable type inference to emit numbers, booleans, and NULL as unquoted literals instead of quoted strings.
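For example, the dialect setting changes only the identifier quoting; the same row renders roughly like this (table name and values are illustrative):

```sql
-- MySQL: backtick-quoted identifiers
INSERT INTO `products` (`id`, `name`) VALUES (1, 'Widget');

-- PostgreSQL and SQLite: double-quoted identifiers
INSERT INTO "products" ("id", "name") VALUES (1, 'Widget');
```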
Copy or Download SQL
Click Generate SQL to build the complete SQL script. Copy to clipboard to paste directly into a database client like DBeaver, TablePlus, or pgAdmin, or to feed to the mysql CLI. Download as a .sql file to include in a migration script or run with a database import tool. The output includes a transaction wrapper and a row count comment for verification.
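The downloaded script has roughly this shape (contents illustrative; the exact comment format may differ):

```sql
BEGIN;
INSERT INTO "products" ("id", "name", "price")
VALUES
  (1, 'Widget', 19.99),
  (2, 'Gadget', 24.50);
COMMIT;
-- 2 rows
```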
CSV to SQL Features
Multi-dialect SQL generation with batch inserts and type inference
3 SQL Dialects
MySQL quotes identifiers with backticks (`column`) and escapes special characters inside single-quoted string literals with backslashes. PostgreSQL quotes identifiers with double quotes ("column") and uses single-quoted (or dollar-quoted) string literals. SQLite also quotes identifiers with double quotes and accepts the most permissive mix of quoting styles of the three. Pick the dialect that matches your database engine to avoid syntax errors in the generated SQL.
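As a sketch, here is how a value like O'Brien \ Sons comes out in each dialect (the users table is illustrative; see also the escaping rules under SQL Injection Safe below):

```sql
-- MySQL: single quotes doubled, backslashes doubled
INSERT INTO `users` (`name`) VALUES ('O''Brien \\ Sons');

-- PostgreSQL and SQLite: single quotes doubled, backslashes literal
INSERT INTO "users" ("name") VALUES ('O''Brien \ Sons');
```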
Batch INSERT Statements
Instead of one INSERT per row, batch mode groups multiple rows into a single INSERT statement: INSERT INTO t (a,b) VALUES (1,'x'),(2,'y'),(3,'z'). This is significantly faster for bulk loading — MySQL and PostgreSQL can insert 100-row batches an order of magnitude faster than individual inserts. Choose batch size based on your database's max packet size and available memory.
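Spelled out, the two shapes look like this:

```sql
-- Batch size 1: one statement (and one parse) per row
INSERT INTO t (a, b) VALUES (1, 'x');
INSERT INTO t (a, b) VALUES (2, 'y');
INSERT INTO t (a, b) VALUES (3, 'z');

-- Batch size 3: one statement carries all three rows
INSERT INTO t (a, b) VALUES (1, 'x'), (2, 'y'), (3, 'z');
```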
Type Inference
With type inference enabled, numeric CSV values are inserted as unquoted SQL numbers (42, 3.14), empty fields become NULL, and boolean strings (true/false) become SQL TRUE/FALSE. Without it, all values are quoted as strings. Type inference is essential for numeric and boolean columns to work correctly with database constraints and computations.
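For example, given an illustrative row 42,3.14,,true,hello:

```sql
-- Type inference ON: bare numeric, NULL, and boolean literals
INSERT INTO t (a, b, c, d, e) VALUES (42, 3.14, NULL, TRUE, 'hello');

-- Type inference OFF: every field becomes a quoted string
INSERT INTO t (a, b, c, d, e) VALUES ('42', '3.14', '', 'true', 'hello');
```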
CREATE TABLE Statement
When enabled, a CREATE TABLE statement is prepended to the SQL output with column names and inferred types. Columns where all values are integers get INTEGER, decimal values get DECIMAL(10,4), and everything else gets TEXT or VARCHAR(255) depending on the dialect. Review and adjust the column types before running in production.
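For the sample product catalog, the prepended statement would look roughly like this in PostgreSQL mode; the BOOLEAN mapping for the active column is an assumption, so verify against the actual output:

```sql
CREATE TABLE "products" (
  "id"       INTEGER,
  "name"     VARCHAR(255),
  "category" VARCHAR(255),
  "price"    DECIMAL(10,4),
  "stock"    INTEGER,
  "active"   BOOLEAN  -- assumed mapping; may come out as VARCHAR/TEXT instead
);
```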
SQL Injection Safe
All string values are escaped before insertion into SQL literals. Single quotes in string values are doubled ('O''Brien'), preventing SQL injection. Backslashes in MySQL mode are escaped with double backslash. This is not a substitute for parameterized queries in application code, but it ensures the generated SQL file is safe to run as a one-time data load script.
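For example, quote doubling keeps even a hostile-looking field inert:

```sql
-- CSV field value:  Robert'); DROP TABLE students;--
-- After escaping, the entire payload stays inside one string literal:
INSERT INTO "users" ("name") VALUES ('Robert''); DROP TABLE students;--');
```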
100% Private
All SQL generation runs in your browser — your CSV data never leaves your device. No server receives or logs any data. Safe for CSV exports from production databases containing customer records, financial transactions, employee payroll data, or any sensitive tabular data that must not be transmitted to third-party services for processing.
Free vs Pro
| Feature | Free | Pro |
|---|---|---|
| CSV input size | Unlimited | Unlimited |
| Batch INSERT statements | ✓ | ✓ |
| UPSERT / INSERT OR REPLACE | — | ✓ |
| UPDATE statements | — | ✓ |
| Column type overrides | — | ✓ |
| REST API access | — | ✓ |
Frequently Asked Questions
Which SQL dialects are supported?
MySQL (backtick identifier quoting and backslash string escaping), PostgreSQL (double-quote identifier quoting and standard SQL string escaping), and SQLite (double-quote identifier quoting, compatible with most standard SQL). The generated CREATE TABLE statement also adapts column type names to each dialect — for example, MySQL uses INT and PostgreSQL uses INTEGER for whole-number columns.
What batch size should I use?
For MySQL, the max_allowed_packet setting limits INSERT size — the default is 64MB, so batches of 100-500 rows are safe for typical datasets. For PostgreSQL, there is no hard limit, but batches of 100-1000 rows are optimal for performance. For SQLite, single-statement batches work well. If you get "packet too large" errors, reduce the batch size. Use 1 row per INSERT when debugging to isolate which row causes an error.
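On MySQL you can inspect and, with admin privileges, raise the packet limit before loading:

```sql
-- Inspect the current packet limit:
SHOW VARIABLES LIKE 'max_allowed_packet';

-- Raise it globally (takes effect for new connections; needs admin rights):
SET GLOBAL max_allowed_packet = 67108864;  -- 64 MB
```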
How are empty CSV fields handled?
With type inference enabled, empty fields are converted to SQL NULL — the correct representation of a missing value in SQL. Without type inference, empty fields are inserted as empty strings ''. Keep NULL for nullable columns (the default), and declare required columns NOT NULL in your table schema so that unexpected empty fields fail the load rather than slipping in silently.
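NULL and the empty string are distinct values, which you can verify after loading (table and column names illustrative):

```sql
SELECT count(*) FROM t WHERE c IS NULL;  -- rows loaded as NULL
SELECT count(*) FROM t WHERE c = '';     -- rows loaded as empty strings
```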
Is the generated SQL wrapped in a transaction?
Yes — the SQL output is wrapped in BEGIN; ... COMMIT; (or START TRANSACTION for MySQL). This ensures that if any INSERT fails due to a constraint violation or data error, the entire import can be rolled back cleanly without leaving partial data in the table. Always review the generated SQL and test on a non-production database before running on live data.
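A simple way to dry-run the script on a test database is to swap COMMIT for ROLLBACK (a sketch; table name illustrative):

```sql
BEGIN;
INSERT INTO "products" ("id", "name") VALUES (1, 'Widget');
-- ...rest of the generated INSERTs...
ROLLBACK;  -- nothing persists; rerun with COMMIT once everything succeeds
```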
Can it generate UPSERT or UPDATE statements?
The free tool generates INSERT statements only. Pro adds UPSERT (INSERT ... ON DUPLICATE KEY UPDATE for MySQL, INSERT ... ON CONFLICT DO UPDATE for PostgreSQL), INSERT OR REPLACE for SQLite, and UPDATE statements with a configurable primary key column. These are useful for data synchronization workflows where rows may already exist in the target table.
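These follow the standard dialect forms, roughly like so (the exact generated output may differ):

```sql
-- MySQL
INSERT INTO `products` (`id`, `name`) VALUES (1, 'Widget')
ON DUPLICATE KEY UPDATE `name` = VALUES(`name`);

-- PostgreSQL
INSERT INTO "products" ("id", "name") VALUES (1, 'Widget')
ON CONFLICT ("id") DO UPDATE SET "name" = EXCLUDED."name";

-- SQLite
INSERT OR REPLACE INTO "products" ("id", "name") VALUES (1, 'Widget');
```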
Is my CSV data uploaded to a server?
No — all SQL generation runs in your browser. Your CSV data is never transmitted to any server. This matters because CSV exports often contain the most sensitive data in a business — customer PII, transaction histories, employee payroll details, inventory costs, and other information that database exports routinely include. Your data stays on your device throughout the entire conversion process.