CSV to JSON Converter Free
Convert CSV files to JSON arrays of objects instantly. Paste CSV, auto-detect delimiter, use the first row as keys, handle quoted fields, and download clean JSON. Handles RFC 4180 quoting, empty fields, and type inference. Runs entirely in your browser.
CSV Input
JSON Output
Pro — file upload, batch convert, custom key mapping, nested JSON, API access
API access · Priority queue · Team workspace
How It Works
Paste Your CSV
Paste any CSV text into the input panel. Click Sample CSV to load a realistic employee dataset with name, department, salary, start date, and active status columns — a good test of type inference, date fields, and boolean handling. The converter supports any number of rows and columns and handles quoted fields containing commas and newlines correctly.
Configure Options
Choose the delimiter (or use Auto-detect to let the tool pick the most likely delimiter by scanning the first line). Toggle whether the first row contains headers — if disabled, columns are numbered col_1, col_2, etc. Enable type inference to convert numeric strings to numbers and "true"/"false" to booleans. Skip empty rows to clean up sparse CSVs.
Copy or Download JSON
Click Convert to parse the CSV and produce a JSON array of objects in the right panel. Each row becomes one object, with headers as keys. Copy to clipboard to paste into code, or download as a .json file. The stats bar shows the number of rows and columns parsed, helping you verify the output matches your expectations before using it.
CSV to JSON Features
RFC 4180 compliant parsing with type inference and auto-detection
Auto-Detect Delimiter
The Auto-detect option scans the first line of your CSV and counts occurrences of comma, semicolon, tab, and pipe to choose the most likely delimiter. This works correctly for comma-separated files exported from Excel, semicolon-separated files from European Excel locales, tab-separated files from database exports, and pipe-separated files from legacy systems — no manual configuration needed.
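The frequency-counting approach described above can be sketched in a few lines (a simplified illustration, not the tool's actual code — the real implementation may weight or validate candidates differently):

```javascript
// Pick the most likely delimiter by counting candidate characters
// in the first line of the CSV (sketch of the assumed heuristic).
function detectDelimiter(csvText) {
  const firstLine = csvText.split(/\r?\n/, 1)[0];
  const candidates = [",", ";", "\t", "|"];
  let best = ",";
  let bestCount = -1;
  for (const d of candidates) {
    const count = firstLine.split(d).length - 1; // occurrences of d
    if (count > bestCount) {
      best = d;
      bestCount = count;
    }
  }
  return best;
}
```

Counting only the first line keeps detection fast, but it assumes the header row uses the delimiter consistently, which holds for exports from Excel, databases, and most scripts.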
RFC 4180 Quoting
Correctly handles double-quoted fields that contain the delimiter character, embedded double-quotes (escaped by doubling them to ""), and fields containing newline characters. This is essential for CSV exported from Excel and Google Sheets, which frequently quote fields that contain commas or line breaks, such as address fields or multi-line notes columns.
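The quoting rules can be illustrated with a minimal single-record field splitter (a sketch only — it handles quoted delimiters and doubled quotes but not fields spanning multiple lines, which require a full multi-line state machine):

```javascript
// Split one CSV record into fields per RFC 4180 quoting rules.
// Hypothetical helper for illustration, not the tool's internals.
function splitRecord(line, delim = ",") {
  const fields = [];
  let field = "";
  let inQuotes = false;
  for (let i = 0; i < line.length; i++) {
    const ch = line[i];
    if (inQuotes) {
      if (ch === '"') {
        if (line[i + 1] === '"') { field += '"'; i++; } // "" -> literal "
        else inQuotes = false;                           // closing quote
      } else field += ch;
    } else if (ch === '"') inQuotes = true;              // opening quote
    else if (ch === delim) { fields.push(field); field = ""; }
    else field += ch;
  }
  fields.push(field);
  return fields;
}
```

For example, `splitRecord('"Smith, John",42')` yields the two fields `Smith, John` and `42` — the comma inside the quotes is not treated as a separator.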
Type Inference
When enabled, numeric strings are converted to JSON numbers ("42" → 42, "3.14" → 3.14), and boolean strings are converted to JSON booleans ("true"/"false", "yes"/"no"). Empty fields become null. This produces correctly typed JSON that works directly with APIs and databases without post-processing.
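The rules above could be implemented along these lines (`inferValue` is a hypothetical helper sketching the assumed behavior, not the tool's exact code):

```javascript
// Convert a raw CSV cell to its most appropriate JSON type.
function inferValue(raw) {
  const s = raw.trim();
  if (s === "") return null;                        // empty field -> null
  const lower = s.toLowerCase();
  if (lower === "true" || lower === "yes") return true;
  if (lower === "false" || lower === "no") return false;
  if (/^-?\d+(\.\d+)?$/.test(s)) return Number(s);  // numeric string -> number
  return raw;                                       // everything else stays a string
}
```

Note that a rule this permissive also converts `"01234"` to `1234`, which is exactly the leading-zero caveat covered in the FAQ — disable type inference when such fields must stay strings.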
Header Row Control
When "First row is header" is enabled, column names are taken from the first row and used as JSON object keys. When disabled, columns are automatically named col_1, col_2, etc. — useful for headerless data exports from databases or scripts. Headers are trimmed of leading and trailing whitespace to avoid keys with invisible characters.
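Generating the fallback names is a one-liner (assuming the `col_1`, `col_2` naming described above):

```javascript
// Fallback column names for headerless CSVs: col_1 .. col_n.
const columnNames = (n) => Array.from({ length: n }, (_, i) => `col_${i + 1}`);
```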
Skip Empty Rows
Many CSV exports contain trailing empty lines or blank separator rows. The "Skip empty rows" option filters out any row where all fields are empty strings. This keeps the JSON output clean without manually removing blank lines before pasting. Disable this option only if your data intentionally contains all-empty rows that need to be represented as null-valued objects.
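The filter amounts to keeping any row with at least one non-blank field (a sketch of the assumed behavior, operating on rows already split into field arrays):

```javascript
// Drop rows where every field is empty (or whitespace-only).
function skipEmptyRows(rows) {
  return rows.filter((fields) => fields.some((f) => f.trim() !== ""));
}
```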
100% Private
All CSV parsing and JSON generation runs in your browser using JavaScript. Your data is never uploaded to a server, never logged, and never stored. This makes it safe for CSV exports containing customer records, financial transactions, employee data, medical records, or any other sensitive structured data that cannot be transmitted to external services.
Free vs Pro
| Feature | Free | Pro |
|---|---|---|
| CSV input size | Unlimited | Unlimited |
| Type inference | ✓ | ✓ |
| File upload (.csv) | — | ✓ |
| Custom key mapping | — | ✓ |
| Nested JSON output | — | ✓ |
| REST API access | — | ✓ |
Frequently Asked Questions
What does the JSON output look like?
The output is a JSON array of objects, where each row in the CSV becomes one object and each column header becomes a key. For example, a CSV with three rows (plus header) produces a JSON array of three objects. This format is immediately consumable by most APIs, databases, JavaScript applications, and data processing tools without any transformation.
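A stripped-down version of the row-to-object mapping (illustration only: no quoted-field handling, comma delimiter, headers on, type inference off):

```javascript
// Map CSV rows to an array of objects keyed by the header row.
function csvToJson(csv) {
  const lines = csv.split(/\r?\n/).filter((l) => l.trim() !== "");
  const headers = lines[0].split(",").map((h) => h.trim());
  return lines.slice(1).map((line) => {
    const cells = line.split(",");
    const obj = {};
    headers.forEach((h, i) => { obj[h] = cells[i] ?? ""; });
    return obj;
  });
}
```

So `"name,age\nAda,36\nGrace,45"` becomes `[{"name":"Ada","age":"36"},{"name":"Grace","age":"45"}]`.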
How are commas inside fields handled?
Fields containing the delimiter character should be wrapped in double-quotes per RFC 4180, and this tool handles them correctly. For example, "Smith, John" as a quoted field is parsed as the single value Smith, John — the comma inside is not treated as a separator. Embedded double-quotes escaped as "" are also handled, producing a single " in the output value.
How are empty fields represented?
Empty fields (consecutive delimiters with nothing between them, or a trailing delimiter at the end of a row) are represented as null in the JSON when type inference is enabled, or as an empty string "" when disabled. Using null is more semantically accurate for missing values and works correctly with most JSON processing libraries that distinguish between a missing value and an empty string.
Does this work with CSV files exported from Excel?
Yes — Excel CSV exports use comma as delimiter in English locales and semicolon in European locales. The Auto-detect option handles both cases automatically. Excel also quotes fields that contain commas, newlines, or double-quotes, which this parser handles correctly per RFC 4180. If your Excel file uses UTF-8 encoding with BOM, paste the text content into the input panel rather than uploading the raw file bytes.
What does type inference do?
Type inference automatically converts string values to their most appropriate JSON type. Numbers like 42 and 3.14 become JSON numbers, not strings. Boolean literals true, false, yes, and no (case-insensitive) become JSON booleans. Empty fields become null. All other values stay as strings. Disable this if you need all values to remain as strings (for example, if a field contains leading-zero zip codes like 01234 that would lose their zero as a number).
Is my CSV data sent to a server?
No — all parsing runs in your browser using JavaScript. Your CSV is never uploaded or transmitted anywhere. You can disconnect from the internet after loading the page and the conversion will still work, confirming no network requests are made. This makes the tool completely safe for CSV files containing customer data, financial records, health information, and any other sensitive tabular data.