How to Convert Between JSON and CSV Online (Complete Guide)
This guide has a free tool → Open JSON to CSV Converter
JSON and CSV are the two most common data interchange formats in software development, but they serve fundamentally different audiences. Developers work with JSON. Business teams and analysts work with spreadsheets. At some point in almost every project, you need to bridge that gap.
This guide covers every aspect of JSON-to-CSV and CSV-to-JSON conversion: how the formats differ, how the conversion logic works, common edge cases that trip up every developer, programmatic examples in multiple languages, and how to handle the conversion instantly online.
---
Understanding the Two Formats
What JSON Is
JSON (JavaScript Object Notation) is a lightweight text-based data format designed for representing structured data. It supports nested objects, arrays, strings, numbers, booleans, and null values. JSON is the lingua franca of modern APIs and web applications.
```json
{
  "id": 1,
  "name": "Alice",
  "email": "alice@example.com",
  "role": "admin",
  "active": true,
  "scores": [95, 87, 92],
  "address": {
    "city": "Portland",
    "state": "OR",
    "zip": "97201"
  }
}
```

JSON is hierarchical. Data can be nested to arbitrary depth. This makes it expressive but also means it does not map trivially to a flat tabular format.
What CSV Is
CSV (Comma-Separated Values) is a plain-text tabular format. The first row contains column headers. Each subsequent row contains values separated by a delimiter (most commonly a comma). CSV has no native concept of nesting, types, or relationships.
```csv
id,name,email,role,active
1,Alice,alice@example.com,admin,true
2,Bob,bob@example.com,editor,false
3,Carol,carol@example.com,viewer,true
```

CSV is universally supported. Every spreadsheet application, database tool, and data analysis platform can open a CSV file. It is the lowest common denominator of data exchange.
Why You Need to Convert Between Them
JSON to CSV is needed when:
- Exporting API response data for non-technical stakeholders who work in Excel or Google Sheets
- Generating reports from application data
- Preparing datasets for data analysis in Python, R, or Tableau
- Importing data into CRM systems, marketing tools, or database management software
- Sending bulk data to vendors or partners who require spreadsheet formats
CSV to JSON is needed when:
- Importing spreadsheet data into a web application
- Converting legacy CSV exports to work with modern REST APIs
- Seeding a database from a manually maintained spreadsheet
- Transforming user-uploaded files into API payloads
- Migrating data between systems with different format requirements
---
JSON to CSV: How the Conversion Works
The Simple Case
A JSON array of objects maps naturally to CSV rows. The object keys become column headers, and each object becomes a row.
```json
[
  { "name": "Alice", "email": "alice@example.com", "role": "admin" },
  { "name": "Bob", "email": "bob@example.com", "role": "editor" },
  { "name": "Carol", "email": "carol@example.com", "role": "viewer" }
]
```

Result:

```csv
name,email,role
Alice,alice@example.com,admin
Bob,bob@example.com,editor
Carol,carol@example.com,viewer
```

The conversion algorithm is straightforward:
- Collect all unique keys across all objects in the array
- Output the keys as the header row
- For each object, output its values in the order of the headers
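The three steps above can be sketched in a few lines of Python. This is a deliberately minimal version (`simple_json_to_csv` is an illustrative name) that skips the quoting rules covered later in this guide:

```python
def simple_json_to_csv(items):
    """Naive JSON-array-to-CSV: collect keys, emit header, emit rows.

    Assumes flat objects whose values need no quoting; see the escaping
    rules later in this guide for a robust version.
    """
    # Step 1: collect all unique keys, preserving first-seen order
    headers = list(dict.fromkeys(k for item in items for k in item))
    # Step 2: the header row
    lines = [",".join(headers)]
    # Step 3: one row per object, with an empty cell for missing keys
    for item in items:
        lines.append(",".join(str(item.get(h, "")) for h in headers))
    return "\n".join(lines)

rows = simple_json_to_csv([
    {"name": "Alice", "role": "admin"},
    {"name": "Bob", "role": "editor"},
])
print(rows)
# name,role
# Alice,admin
# Bob,editor
```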
Handling Nested Objects
Real-world JSON is rarely flat. Most API responses contain nested structures. When you convert nested JSON to CSV, the tool must flatten the hierarchy into columns.
The standard approach is dot notation - nested keys are joined with a period:
```json
{
  "name": "Alice",
  "address": {
    "city": "Portland",
    "state": "OR",
    "zip": "97201"
  },
  "scores": {
    "math": 95,
    "science": 87
  }
}
```

Flattened result:

```csv
name,address.city,address.state,address.zip,scores.math,scores.science
Alice,Portland,OR,97201,95,87
```

Each nested key path becomes a column header using dot notation. This preserves all the data in a format that spreadsheet applications can work with.
Handling Arrays in JSON Values
Arrays inside JSON objects present a choice: join them into a single cell, or expand them into multiple columns.
Option 1: Join into a single delimited cell
```json
{ "name": "Alice", "tags": ["admin", "verified", "beta"] }
```

```csv
name,tags
Alice,"admin,verified,beta"
```

The array values are joined with a separator (typically a comma or pipe) and wrapped in quotes. This is the most common approach.
Option 2: Expand into indexed columns
```csv
name,tags.0,tags.1,tags.2
Alice,admin,verified,beta
```

This creates a column per array index. It works well for small fixed-size arrays but creates sparse columns when array lengths vary between rows.
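Both options can be sketched in Python. The helper names (`join_arrays`, `index_arrays`) are illustrative, not from any library:

```python
def join_arrays(obj, sep=","):
    """Option 1: join each list value into a single delimited string."""
    return {k: sep.join(map(str, v)) if isinstance(v, list) else v
            for k, v in obj.items()}

def index_arrays(obj):
    """Option 2: expand each list value into key.0, key.1, ... columns."""
    out = {}
    for k, v in obj.items():
        if isinstance(v, list):
            for i, item in enumerate(v):
                out[f"{k}.{i}"] = item
        else:
            out[k] = v
    return out

record = {"name": "Alice", "tags": ["admin", "verified", "beta"]}
print(join_arrays(record))
# {'name': 'Alice', 'tags': 'admin,verified,beta'}
print(index_arrays(record))
# {'name': 'Alice', 'tags.0': 'admin', 'tags.1': 'verified', 'tags.2': 'beta'}
```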
Handling Arrays of Nested Objects
When JSON contains arrays of objects (very common in API responses), expansion is required. Each nested object in the array can become multiple rows:
```json
[
  {
    "order_id": 101,
    "customer": "Alice",
    "items": [
      { "product": "Widget", "qty": 2 },
      { "product": "Gadget", "qty": 1 }
    ]
  }
]
```

Expanded to rows:

```csv
order_id,customer,items.product,items.qty
101,Alice,Widget,2
101,Alice,Gadget,1
```

The parent row data (order_id, customer) is repeated for each item. This "row explosion" is often what you want for downstream analysis.
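Row explosion can be sketched in Python as follows. `explode_rows` is a hypothetical helper, not a library function:

```python
def explode_rows(records, array_key):
    """Repeat parent fields once per element of the nested array.

    `array_key` names the field holding the list of child objects;
    child keys get the dot-notation prefix "<array_key>.".
    """
    rows = []
    for rec in records:
        parent = {k: v for k, v in rec.items() if k != array_key}
        # An empty or missing array still yields one row of parent data
        for child in (rec.get(array_key) or [{}]):
            row = dict(parent)
            for k, v in child.items():
                row[f"{array_key}.{k}"] = v
            rows.append(row)
    return rows

orders = [{
    "order_id": 101,
    "customer": "Alice",
    "items": [
        {"product": "Widget", "qty": 2},
        {"product": "Gadget", "qty": 1},
    ],
}]
exploded = explode_rows(orders, "items")
for row in exploded:
    print(row)
```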
---
CSV to JSON: How the Conversion Works
Basic Conversion
The reverse conversion turns rows into objects:
```csv
product,price,in_stock,category
Widget,9.99,true,hardware
Gadget,24.99,false,electronics
Doohickey,4.49,true,hardware
```

Result:

```json
[
  { "product": "Widget", "price": "9.99", "in_stock": "true", "category": "hardware" },
  { "product": "Gadget", "price": "24.99", "in_stock": "false", "category": "electronics" },
  { "product": "Doohickey", "price": "4.49", "in_stock": "true", "category": "hardware" }
]
```

The first row is treated as the key names. Each subsequent row becomes an object. The column order in the CSV maps to keys in the JSON objects.
The Type Problem
CSV has no type information. Every value is a string. A simple converter produces strings for numbers and booleans, which is often not what you want.
Without type detection:

```json
{ "price": "9.99", "in_stock": "true", "quantity": "5" }
```

With type detection enabled:

```json
{ "price": 9.99, "in_stock": true, "quantity": 5 }
```

A good converter applies these type inference rules:
- If a value matches a valid number format, parse it as a number
- If a value is exactly `true` or `false`, parse it as a boolean
- If a value is empty, output `null`
- Otherwise, keep it as a string
This is important when the resulting JSON will be used in code or sent to an API that expects typed values.
Choosing the Right JSON Structure
Most converters produce an array of objects, which is the most useful format. But sometimes a different structure is needed:
Array of arrays (no headers as keys):
```json
[
  ["Widget", "9.99", "true"],
  ["Gadget", "24.99", "false"]
]
```

Keyed by a specific column:

```json
{
  "Widget": { "price": "9.99", "in_stock": "true" },
  "Gadget": { "price": "24.99", "in_stock": "false" }
}
```

This is useful when the JSON will be used as a lookup dictionary indexed by a unique identifier.
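Producing the keyed structure takes one pass over the array. A minimal Python sketch (`key_by_column` is an illustrative name):

```python
def key_by_column(records, key_column):
    """Turn a list of dicts into a dict keyed by one column's value.

    Assumes the key column is unique; a duplicate key would silently
    overwrite the earlier row.
    """
    return {
        rec[key_column]: {k: v for k, v in rec.items() if k != key_column}
        for rec in records
    }

products = [
    {"product": "Widget", "price": "9.99", "in_stock": "true"},
    {"product": "Gadget", "price": "24.99", "in_stock": "false"},
]
lookup = key_by_column(products, "product")
print(lookup["Widget"]["price"])
# 9.99
```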
---
Common Edge Cases and How to Handle Them
Missing Values
When objects in a JSON array have different keys, the converter must handle the missing columns. Empty cells should be output for missing keys.
```json
[
  { "name": "Alice", "email": "alice@example.com" },
  { "name": "Bob", "phone": "555-1234" },
  { "name": "Carol", "email": "carol@example.com", "phone": "555-5678" }
]
```

All unique keys are collected first, then each row fills only the columns it has:

```csv
name,email,phone
Alice,alice@example.com,
Bob,,555-1234
Carol,carol@example.com,555-5678
```

Values Containing the Delimiter
If a CSV value contains the delimiter character (a comma, for instance), it must be quoted. RFC 4180 specifies that values containing commas, newlines, or double quotes must be enclosed in double quotes.
```csv
name,description
"Smith, John","Senior developer, team lead"
"O'Brien","Uses single quotes"
```

When quoting a value that contains a double quote, the double quote itself must be escaped by doubling it:

```csv
note
"He said ""hello"" to everyone"
```

Newlines Inside Values
JSON string values can contain literal newlines (\n). In CSV, this must be handled carefully because newlines normally signal a new row. RFC 4180 allows newlines inside quoted fields:
```csv
name,bio
Alice,"Software engineer.
Works remotely.
Loves open source."
```

The multi-line value is valid as long as the entire field is wrapped in quotes.
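Python's built-in csv module follows this rule and parses quoted newlines correctly, which is one reason to prefer it over splitting input on `\n` yourself:

```python
import csv
import io

# A quoted field containing a literal newline
raw = 'name,bio\nAlice,"Software engineer.\nWorks remotely."\n'

rows = list(csv.reader(io.StringIO(raw)))
print(rows)
# [['name', 'bio'], ['Alice', 'Software engineer.\nWorks remotely.']]
```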
Special Characters and Non-ASCII Text
JSON fully supports Unicode. CSV is supposed to as well, but encoding matters. If a CSV file is saved without explicit UTF-8 encoding, opening it in Excel can corrupt characters like accented letters, Chinese characters, or emoji.
Best practice:
- Save CSV files with UTF-8 BOM encoding when targeting Excel on Windows (Excel reads the BOM to detect UTF-8)
- Save CSV files with plain UTF-8 when targeting other tools
- Specify the encoding explicitly in code
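In Python, the `utf-8-sig` codec writes the BOM for you (the filename here is just illustrative):

```python
import csv

rows = [["name", "city"], ["José", "São Paulo"]]

# 'utf-8-sig' prepends a BOM so Excel on Windows detects UTF-8;
# use plain 'utf-8' when targeting other tools.
with open("export.csv", "w", newline="", encoding="utf-8-sig") as f:
    csv.writer(f).writerows(rows)
```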
Large JSON Arrays
Converting a very large JSON file (tens of thousands of objects) in a browser can be slow and may run out of memory. For large datasets:
- Process the file in streaming chunks on the server
- Use a command-line tool like `jq` or Python's `csv` module
- Split the input into smaller files first
---
Delimiter Options
CSV stands for "comma-separated values" but other delimiters are common. The right choice depends on your data and the tool that will consume it.
| Delimiter | Character | Common Use Case |
|---|---|---|
| Comma | `,` | Default, universally supported |
| Tab | `\t` | TSV format, works well when data contains commas |
| Semicolon | `;` | Default in European Excel (commas used as decimal separators) |
| Pipe | `\|` | Log files, avoids conflicts with both commas and quotes |
When your data contains lots of commas (addresses, descriptions, lists), switching to a tab or pipe delimiter avoids excessive quoting and makes the file more readable.
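Python's csv module takes the delimiter as a parameter and only quotes fields when needed, so you can see the effect of the choice directly (`write_with` is an illustrative helper):

```python
import csv
import io

rows = [["name", "address"], ["Alice", "12 Main St, Portland, OR"]]

def write_with(delimiter):
    """Serialize the sample rows with the given delimiter."""
    buf = io.StringIO()
    csv.writer(buf, delimiter=delimiter).writerows(rows)
    return buf.getvalue()

# Comma delimiter forces quoting of the comma-laden address;
# tab delimiter needs no quoting for this data.
print(write_with(","))
print(write_with("\t"))
```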
---
How to Convert With ToolBox
JSON to CSV
- Open the JSON to CSV Converter
- Paste your JSON array directly or upload a `.json` file
- The tool auto-detects the structure and generates a preview
- Choose your delimiter: comma, tab, semicolon, or pipe
- Toggle flattening options for nested objects
- Copy the result to clipboard or download as a `.csv` file
CSV to JSON
- Open the CSV to JSON Converter
- Paste your CSV data or upload a `.csv` file
- The first row is used as keys by default
- Enable type inference if you want numbers and booleans parsed correctly
- Copy the JSON output or download as a `.json` file
Both tools run entirely in your browser using JavaScript. Your data is never sent to a server. This matters when the data contains personal information, credentials, or confidential business records.
---
Programmatic Conversion: Code Examples
JavaScript / Node.js: JSON to CSV
```javascript
function jsonToCsv(items) {
  if (!items || items.length === 0) return '';

  // Collect all unique keys across all objects
  const headers = [...new Set(items.flatMap(item => Object.keys(item)))];

  // Escape a single CSV cell value
  const escapeCell = (value) => {
    if (value === null || value === undefined) return '';
    const str = String(value);
    if (str.includes(',') || str.includes('"') || str.includes('\n')) {
      return '"' + str.replace(/"/g, '""') + '"';
    }
    return str;
  };

  const headerRow = headers.map(escapeCell).join(',');
  const dataRows = items.map(item =>
    headers.map(header => escapeCell(item[header])).join(',')
  );
  return [headerRow, ...dataRows].join('\n');
}

// Usage
const data = [
  { name: "Alice", email: "alice@example.com", age: 30 },
  { name: "Bob", email: "bob@example.com", age: 25 }
];
console.log(jsonToCsv(data));
// name,email,age
// Alice,alice@example.com,30
// Bob,bob@example.com,25
```

JavaScript / Node.js: CSV to JSON
```javascript
function csvToJson(csv, options = {}) {
  const { delimiter = ',', inferTypes = true } = options;
  // Note: splitting on '\n' assumes no newlines inside quoted fields;
  // use a full CSV parser for data that may contain them.
  const lines = csv.trim().split('\n');
  if (lines.length < 2) return [];

  const headers = parseCsvRow(lines[0], delimiter);
  return lines.slice(1).map(line => {
    const values = parseCsvRow(line, delimiter);
    const obj = {};
    headers.forEach((header, index) => {
      const raw = values[index] ?? '';
      obj[header] = inferTypes ? parseValue(raw) : raw;
    });
    return obj;
  });
}

// Parse one CSV row, honoring quoted fields and doubled quotes
function parseCsvRow(row, delimiter) {
  const values = [];
  let current = '';
  let inQuotes = false;
  for (let i = 0; i < row.length; i++) {
    const char = row[i];
    if (char === '"') {
      if (inQuotes && row[i + 1] === '"') {
        current += '"';   // escaped double quote
        i++;
      } else {
        inQuotes = !inQuotes;
      }
    } else if (char === delimiter && !inQuotes) {
      values.push(current);
      current = '';
    } else {
      current += char;
    }
  }
  values.push(current);
  return values;
}

// Infer null, boolean, and number types from a raw string value
function parseValue(str) {
  if (str === '') return null;
  if (str === 'true') return true;
  if (str === 'false') return false;
  const num = Number(str);
  if (!isNaN(num) && str.trim() !== '') return num;
  return str;
}
```

Python: JSON to CSV
```python
import csv
import io

def json_to_csv(data, delimiter=','):
    """Convert a list of dicts to a CSV string."""
    if not data:
        return ''

    # Collect all unique keys, preserving first-seen order
    fieldnames = list(dict.fromkeys(
        key for row in data for key in row.keys()
    ))

    output = io.StringIO()
    writer = csv.DictWriter(
        output,
        fieldnames=fieldnames,
        delimiter=delimiter,
        extrasaction='ignore',
        restval=''
    )
    writer.writeheader()
    writer.writerows(data)
    return output.getvalue()

# Example
data = [
    {"name": "Alice", "email": "alice@example.com", "age": 30},
    {"name": "Bob", "email": "bob@example.com", "age": 25},
]
csv_output = json_to_csv(data)
print(csv_output)
```

Python: CSV to JSON with Type Detection
```python
import csv
import io
import json

def csv_to_json(csv_string, infer_types=True):
    """Convert a CSV string to a list of dicts with optional type inference."""
    reader = csv.DictReader(io.StringIO(csv_string))
    result = []
    for row in reader:
        if infer_types:
            typed_row = {}
            for key, value in row.items():
                typed_row[key] = parse_value(value)
            result.append(typed_row)
        else:
            result.append(dict(row))
    return result

def parse_value(value):
    """Infer null, boolean, int, and float types from a raw string."""
    if value == '':
        return None
    if value.lower() == 'true':
        return True
    if value.lower() == 'false':
        return False
    try:
        return int(value)
    except ValueError:
        pass
    try:
        return float(value)
    except ValueError:
        pass
    return value

# Example
csv_data = """name,age,active,score
Alice,30,true,9.5
Bob,25,false,7.2"""

result = csv_to_json(csv_data)
print(json.dumps(result, indent=2))
# [
#   { "name": "Alice", "age": 30, "active": true, "score": 9.5 },
#   { "name": "Bob", "age": 25, "active": false, "score": 7.2 }
# ]
```

Python: Flattening Nested JSON Before Converting
```python
def flatten_json(obj, prefix='', separator='.'):
    """Recursively flatten a nested dict using dot notation."""
    items = {}
    for key, value in obj.items():
        new_key = f"{prefix}{separator}{key}" if prefix else key
        if isinstance(value, dict):
            items.update(flatten_json(value, new_key, separator))
        elif isinstance(value, list):
            # Join list items as a comma-separated string
            items[new_key] = ','.join(str(v) for v in value)
        else:
            items[new_key] = value
    return items

# Example
nested = {
    "name": "Alice",
    "address": {
        "city": "Portland",
        "state": "OR"
    },
    "tags": ["admin", "verified"]
}

flat = flatten_json(nested)
print(flat)
# {
#   "name": "Alice",
#   "address.city": "Portland",
#   "address.state": "OR",
#   "tags": "admin,verified"
# }
```

Command Line with jq
For quick conversions without writing code, jq is the fastest option:
```bash
# JSON array to CSV (extract specific fields)
jq -r '["name","email","age"], (.[] | [.name, .email, .age]) | @csv' data.json

# JSON to TSV (tab-separated)
jq -r '.[] | [.name, .email, .age] | @tsv' data.json > data.tsv

# CSV to JSON (using csvkit)
csvjson data.csv > data.json

# JSON to CSV (using csvkit)
in2csv --format json data.json > data.csv
```

The `@csv` and `@tsv` built-in formats in jq handle quoting automatically.
---
Working With Real API Responses
Real API responses often include metadata wrappers, pagination, or deeply nested structures that need preprocessing before conversion.
Extracting the Data Array First
Most REST APIs return data wrapped in an envelope:
```json
{
  "status": "ok",
  "total": 3,
  "page": 1,
  "data": [
    { "id": 1, "name": "Alice" },
    { "id": 2, "name": "Bob" }
  ]
}
```

You need to extract the data array before converting:

```javascript
const response = await fetch('/api/users');
const json = await response.json();

// Extract the nested array first
const users = json.data;

// Now convert to CSV
const csv = jsonToCsv(users);
```

Handling Pagination
When working with paginated APIs, you may need to fetch all pages and combine them before converting:
```javascript
async function fetchAllPages(baseUrl) {
  let allData = [];
  let page = 1;
  let hasMore = true;

  while (hasMore) {
    const response = await fetch(`${baseUrl}?page=${page}&limit=100`);
    const json = await response.json();
    allData = allData.concat(json.data);
    // A short page signals the last one
    hasMore = json.data.length === 100;
    page++;
  }
  return allData;
}

const users = await fetchAllPages('/api/users');
const csv = jsonToCsv(users);
```

---
Format Comparison Reference
| Aspect | JSON | CSV |
|---|---|---|
| Human readability | Good for simple data, complex for deep nesting | Very readable for tabular data |
| Nesting support | Yes, arbitrary depth | No native nesting |
| Type information | Yes (string, number, boolean, null, array, object) | No, all values are strings |
| File size | Larger due to key repetition per row | Smaller for flat tabular data |
| Spreadsheet support | Requires conversion | Native, opens directly |
| API compatibility | Universal for REST/GraphQL | Supported by many APIs as import format |
| Database import | Supported by most modern databases | Universally supported |
| Streaming | Harder to stream large arrays | Easy to process line by line |
| Schema flexibility | Objects can have varying keys | All rows share the same columns |
---
When to Use Each Format
Choose JSON when:
- The data has nested or hierarchical structure
- The data will be consumed by code or an API directly
- Type information matters (numbers vs strings)
- The structure varies between records
Choose CSV when:
- The data is flat and tabular
- Non-technical users need to work with it
- You are importing into a database, BI tool, or spreadsheet
- File size is a concern and the data is simple
- The tool receiving the data only accepts CSV
---
Related Tools
Once you have converted your data, you often need to do more with it. Here are the tools that work well in combination with JSON-CSV conversion:
- JSON Formatter - Pretty-print and validate JSON before converting
- CSV to JSON - Convert CSV to a JSON array
- JSON to CSV - Convert a JSON array to CSV
- YAML to JSON Converter - Convert YAML config files to JSON first, then to CSV
- XML Formatter - Format XML data, which can sometimes be converted to JSON first
- Dummy Data Generator - Generate sample JSON datasets to test your conversion logic
- SQL Formatter - Format SQL that you might generate from CSV data
---
Try It Now
Convert between JSON and CSV instantly in your browser without sending your data anywhere:
- JSON to CSV Converter - Paste a JSON array, get a CSV file
- CSV to JSON Converter - Paste a CSV, get a JSON array
Both tools are free, private, and require no signup. Your data is processed entirely in your browser using JavaScript.