dev101.io

CSV to JSON Converter

Convert CSV or TSV into JSON in your browser — RFC 4180 parser with auto-delimiter detection and type coercion.


How to use CSV to JSON Converter

  1. Paste your CSV, TSV, or any delimited text into the input panel on the left.
  2. Leave "Delimiter" on Auto-detect for most inputs, or pick a specific character if you know the format.
  3. Toggle "Header row" off if the first row is data rather than column names — the output switches to an array of arrays.
  4. Flip on "Coerce numbers" or "Coerce booleans" to get typed JSON values; leave them off to keep every cell as a string.
  5. Use "Trim values" to strip whitespace on unquoted cells, and "Empty → null" to turn blank cells into JSON `null`.
  6. Press ⌘/Ctrl + Enter to copy the JSON output, or use the Copy JSON button.
  7. Click Share to generate a URL that restores your input and settings in any browser.

CSV to JSON Converter

A zero-dependency, browser-local CSV (and TSV) to JSON converter with a real RFC 4180 parser at its core. Paste a spreadsheet export, a database dump, an API CSV response, or anything with delimiters — get back clean, typed JSON without the round-trip to a server. Everything happens on your device.

Why another CSV to JSON converter?

The obvious implementation — `input.split("\n").map(l => l.split(","))` — breaks on the first quoted field that contains a comma, the first embedded newline, the first escaped quote, or the first CRLF line ending. Most online converters get one of those cases right and fail on the others, which is why pasting a real-world CSV into them so often produces silently wrong output. Worse, almost all of them do the work on a server, so the invoice export or customer list you just uploaded now lives in somebody else's request log.
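To see the failure concretely, here is an illustrative two-cell row that the naive split mangles:

```typescript
// Illustrative example: one row with a quoted field containing a comma.
const row = '"Smith, Jane",42';

// Naive split: breaks the quoted field at its embedded comma.
const naive = row.split(",");
// naive is ['"Smith', ' Jane"', '42'] — three cells instead of two,
// with stray quote characters left in the data.

// What an RFC 4180 parser must produce instead:
const expected = ["Smith, Jane", "42"];
```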

This tool runs a character-by-character state-machine parser locally. The tokeniser is around 200 lines of TypeScript, has zero runtime dependencies, and handles the full set of RFC 4180 edge cases — quoted fields containing delimiters, doubled-quote escapes, embedded CR / LF / CRLF inside quotes, mixed line endings across a document, leading UTF-8 BOM, and trailing blank rows. The type-coercion layer is separate from the tokeniser, so the parse itself never lies to you about what was in the file.
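For a flavour of what such a state machine looks like — a minimal sketch, not the tool's actual ~200-line source — the core loop tracks whether it is inside quotes, treats doubled quotes as escapes, and counts CRLF as a single row break:

```typescript
// Minimal RFC 4180-style tokeniser sketch (simplified; not the tool's source).
function parseCsv(input: string, delim = ","): string[][] {
  const rows: string[][] = [];
  let row: string[] = [];
  let field = "";
  let inQuotes = false;
  let i = 0;
  if (input.charCodeAt(0) === 0xfeff) i = 1; // strip a leading UTF-8 BOM
  const endField = () => { row.push(field); field = ""; };
  const endRow = () => { endField(); rows.push(row); row = []; };
  while (i < input.length) {
    const c = input[i];
    if (inQuotes) {
      if (c === '"') {
        if (input[i + 1] === '"') { field += '"'; i += 2; continue; } // "" escape
        inQuotes = false; i++; continue; // closing quote
      }
      field += c; i++; continue; // anything else, including CR/LF, is data
    }
    if (c === '"' && field === "") { inQuotes = true; i++; continue; }
    if (c === delim) { endField(); i++; continue; }
    if (c === "\r" || c === "\n") {
      endRow();
      i += c === "\r" && input[i + 1] === "\n" ? 2 : 1; // CRLF is one break
      continue;
    }
    field += c; i++;
  }
  if (field !== "" || row.length > 0) endRow(); // flush the final row;
  return rows;                                  // a trailing blank line adds nothing
}
```

Note how the quoted-field branch never inspects the delimiter or line endings at all — that single design choice is what makes embedded commas and newlines fall out for free.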

What it does

  • Parse CSV, TSV, or any delimited text — comma, semicolon, tab, pipe, or a custom single-character delimiter.
  • Auto-detect the delimiter by sampling the first ten non-blank logical lines and picking the character that yields the most consistent column layout.
  • Handle real-world edge cases: quoted fields containing commas, doubled-up escaped quotes (""), embedded newlines inside quoted fields, CR / LF / CRLF line endings, leading UTF-8 BOM, ragged rows where a trailing cell is missing, trailing blank lines.
  • Optional type coercion: convert numeric-looking cells to numbers (while preserving leading-zero identifiers), convert "true" / "false" (case-insensitive) to booleans, convert empty cells to JSON null.
  • Output control: pretty-print with 2-space indent or minify to a single line.
  • Header-row toggle: emit an array of objects keyed by the first row, or an array of arrays when the data has no column names.
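The coercion options above can be sketched as a single pass over cell strings (hypothetical names and option shape — the real layer runs after the tokeniser, but the rules are as described):

```typescript
type Cell = string | number | boolean | null;
interface CoerceOpts { numbers: boolean; booleans: boolean; emptyAsNull: boolean; }

// Sketch of the coercion pass: each rule is opt-in, and the number rule
// rejects leading-zero integers like "007" so identifiers survive.
function coerce(cell: string, opts: CoerceOpts): Cell {
  if (opts.emptyAsNull && cell === "") return null;
  if (opts.booleans && /^(true|false)$/i.test(cell)) {
    return cell.toLowerCase() === "true";
  }
  // Integer part must be "0" or start with 1-9; decimals, negatives,
  // and scientific notation are allowed.
  if (opts.numbers && /^-?(0|[1-9]\d*)(\.\d+)?([eE][+-]?\d+)?$/.test(cell)) {
    return Number(cell);
  }
  return cell;
}
```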

Keyboard shortcuts

  • ⌘/Ctrl + Enter — Copy JSON output to the clipboard
  • ⌘/Ctrl + K — Open the global command palette (jump to another tool)

Privacy promise

There are no analytics, no telemetry, and no third-party scripts that touch your input. The parser is a pure function of your CSV and your options — no network calls, no logging, no server round-trip. The Share button encodes a base64url snapshot of your input and settings into the URL fragment (#s=…), which browsers never transmit to any server: per the URI spec, the fragment stays on the client. Sharing a link hands state directly to the recipient's browser; nothing is uploaded.

What's not supported (yet)

  • JSON to CSV (the reverse direction): planned as a separate tool.
  • Multi-character delimiters: the parser operates on single characters by design, which covers CSV, TSV, SSV, and pipe-delimited formats. If you have a pipe-pipe or |~| delimited export, pre-process with find-and-replace.
  • Streaming: documents bigger than browser memory should be processed with a CLI tool like csvkit or a streaming parser like papaparse in a Node script.
  • Custom column types: the coercion layer has three knobs (numbers, booleans, empty-as-null). Per-column schema control is on the roadmap for the v1.1 release.

Frequently asked questions

Does the CSV I paste ever leave my browser?

No. The parser runs entirely on your device — there is no CSV-handling endpoint on dev101.io. We wrote the RFC 4180 tokeniser from scratch in a few hundred lines of TypeScript so the tool has zero network dependencies. That matters for CSV specifically because spreadsheets are a favourite way to move customer lists, invoice rows, transaction logs, and user exports around; pasting one into a server-side converter effectively hands that data to a third party. The optional Share button encodes state into the URL hash (the part after `#`), which browsers never send to the server — the fragment stays on your device.

How does auto-delimiter detection work?

When "Auto-detect" is selected, the parser samples the first ten non-blank logical lines of your input (honouring quoted embedded newlines) and scores each candidate delimiter — comma, semicolon, tab, pipe — by average column count and variance. The delimiter that produces the most columns with the most consistent row width wins. Ties break in favour of comma, then semicolon, tab, and pipe in that order. If your input has exactly one column, detection falls back to comma and you still get a valid result.
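The scoring can be sketched as follows (simplified and hypothetical: this version splits raw lines and ignores quoting, which the real detector honours, and relies on candidate order plus a strict comparison to get the documented tie-break):

```typescript
// Sketch of delimiter detection: score each candidate by mean column count
// minus variance, so "many columns, consistent width" wins.
function detectDelimiter(sampleLines: string[]): string {
  const candidates = [",", ";", "\t", "|"]; // also the tie-break order
  let best = ",";
  let bestScore = -Infinity;
  for (const d of candidates) {
    const counts = sampleLines.map(l => l.split(d).length);
    const mean = counts.reduce((a, b) => a + b, 0) / counts.length;
    const variance = counts.reduce((a, c) => a + (c - mean) ** 2, 0) / counts.length;
    const score = mean - variance;
    // Strict '>' keeps the earlier candidate on ties, so comma wins by default.
    if (score > bestScore) { best = d; bestScore = score; }
  }
  return best;
}
```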

What does "RFC 4180 compatible" mean in practice?

RFC 4180 is the informational RFC that most CSV tools treat as the de facto spec. In concrete terms, this parser handles quoted fields that contain the delimiter (`"hello, world"`), doubled-up escaped quotes (`"she said ""hi"""`), embedded CR / LF / CRLF inside quoted fields, and mixed line endings across the document. It also strips a leading UTF-8 BOM — Excel exports on Windows often carry one — and drops trailing blank lines. Unclosed quotes return a typed error with the line and column of the starting quote, instead of silently producing garbage output.
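The doubled-quote escape, for instance, reduces to "strip the outer quotes, then collapse each `""` to `"`" — illustrated here on the example above (a shortcut for a single already-isolated field, not how the full tokeniser works):

```typescript
// The raw CSV text of one quoted field vs. the value a conformant parser yields.
const raw = '"she said ""hi"""';
// Strip the outer quotes, then apply the RFC 4180 unescape rule: "" → "
const unescaped = raw.slice(1, -1).replace(/""/g, '"');
// unescaped === 'she said "hi"'
```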

Why does "007" stay a string even when "Coerce numbers" is on?

Leading-zero sequences are almost always identifiers — account numbers, phone numbers, ZIP codes, SKUs. `Number("007")` would silently turn them into `7`, losing the zeros forever, and that's exactly the kind of data corruption CSV-to-JSON tools are infamous for. Our number-coercion rule rejects any multi-digit integer that starts with zero; decimals, negatives, and scientific notation (`1e3` → `1000`) are all converted normally. If you genuinely want numeric conversion on zero-padded values, pre-process the column with your editor's find-and-replace before pasting.

What's the difference between header mode and no-header mode?

With "Header row" on (the default), the first row is treated as column names and every subsequent row becomes a JSON object keyed by those names — the most common shape when you're loading a CSV into a database or an API request body. With it off, each row becomes a JSON array of cell values; useful when the CSV has no meaningful column names or you're doing positional transforms. You can switch modes at any time; the output updates live.
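The two shapes can be sketched with one hypothetical helper; the `?? ""` fallback shown here is one possible way to pad ragged rows:

```typescript
// Sketch: same parsed rows, two output shapes depending on the header toggle.
function toJson(rows: string[][], headerRow: boolean): unknown[] {
  if (!headerRow) return rows; // array of arrays, positional access
  const [header, ...body] = rows;
  return body.map(cells =>
    // Array of objects keyed by the first row; missing trailing cells become "".
    Object.fromEntries(header.map((name, i) => [name, cells[i] ?? ""]))
  );
}
```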

Can this tool also convert TSV (tab-separated values)?

Yes. Pick "Tab (\t)" from the delimiter dropdown, or leave it on "Auto-detect" — the detector will spot the tab character without any configuration. TSV is the usual output format when you copy a range from Excel, Google Sheets, or Numbers, so "paste from spreadsheet → get JSON" works in two clicks.

Related tools