JSON Formatting Best Practices for APIs and Config Files
JSON (JavaScript Object Notation) has become the lingua franca of modern web development. From REST API responses to application configuration files, virtually every developer works with JSON daily. Yet many developers still make avoidable mistakes that lead to parsing errors, security vulnerabilities, and maintenance headaches. This guide covers everything you need to know about formatting JSON correctly.
RFC 8259: The JSON Specification You Should Actually Read
JSON is formally defined by RFC 8259, which superseded RFC 7159 in 2017. Understanding the specification prevents subtle bugs. Key rules include:
- Encoding: JSON MUST be encoded in UTF-8. RFC 8259 explicitly deprecated UTF-16 and UTF-32 for JSON transmitted over networks.
- Numbers: There is no distinction between integers and floats. RFC 8259 notes that implementations commonly use IEEE 754 doubles, so expect precision loss for integers beyond 2^53.
- Strings: Must use double quotes. Single-quoted strings are not valid JSON.
- Trailing commas: Strictly forbidden. {"a": 1,} is invalid JSON.
- Comments: Not allowed. This is a deliberate design choice by Douglas Crockford.
- Duplicate keys: Technically undefined behavior. Some parsers take the last value, others the first, others throw an error.
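These rules are easy to verify with the parser you already have. The snippet below is plain Node.js/browser JavaScript, no libraries, demonstrating each behavior:

```javascript
// Trailing commas are rejected outright:
let trailingCommaFails = false;
try {
  JSON.parse('{"a": 1,}');
} catch (e) {
  trailingCommaFails = true; // SyntaxError
}
console.log(trailingCommaFails); // true

// Single-quoted strings are rejected too:
let singleQuotesFail = false;
try {
  JSON.parse("{'a': 1}");
} catch (e) {
  singleQuotesFail = true; // SyntaxError
}
console.log(singleQuotesFail); // true

// Duplicate keys: JavaScript's parser silently keeps the last value.
const dup = JSON.parse('{"a": 1, "a": 2}');
console.log(dup.a); // 2
```

Note that the duplicate-key result above is specific to JavaScript's JSON.parse; other parsers may behave differently, which is exactly why duplicate keys should be avoided.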
Use our JSON Formatter to instantly validate and pretty-print any JSON string against these rules.
Why Pretty-Print? The Case for Human-Readable JSON
Minified JSON saves bandwidth but destroys readability. For production APIs serving millions of requests, minification can meaningfully reduce payload sizes. However, there are compelling reasons to keep JSON human-readable in many contexts:
- Config files: Always pretty-print. Nobody wants to read {"server":{"port":3000,"host":"localhost"}} on one line.
- Version control: Pretty-printed JSON diffs cleanly. Minified JSON produces useless diffs.
- Development APIs: Add a ?pretty=true query parameter or check the Accept header to serve formatted responses during development.
- Log files: Structured logging with pretty-printed JSON is far easier to read during incident response.
The bandwidth argument against pretty-printing is largely resolved by HTTP compression. Gzip typically reduces JSON by 60-80%, making the extra whitespace nearly free.
2-Space vs 4-Space: The Great Indentation Debate
This is less of a technical question and more of a cultural one. Here is the breakdown:
2-space indentation is preferred by:
- The JSON specification examples themselves
- The Node.js and npm ecosystem (package.json uses 2 spaces)
- Google's style guides
- Most modern JavaScript formatters (Prettier defaults to 2)
4-space indentation is preferred by:
- Python developers (PEP 8 influence)
- Deeply nested configs where 4 spaces aids visual separation
- Some enterprise Java/C# style guides
With JSON.stringify, you control this with the third argument:
// 2-space indentation
JSON.stringify(data, null, 2);
// 4-space indentation
JSON.stringify(data, null, 4);
// Tab indentation
JSON.stringify(data, null, '\t');
The pragmatic answer: use whatever your team or project already uses. Consistency matters more than the specific choice.
JSON5 and JSONC: Adding Comments to Configuration Files
Plain JSON's lack of comments is its biggest frustration for config files. Two popular supersets address this:
JSON5 (json5.org) adds:
- Single-line (//) and multi-line (/* */) comments
- Trailing commas in objects and arrays
- Single-quoted strings
- Unquoted object keys
- Hexadecimal numbers (0xFF)
- Multi-line strings
JSONC (JSON with Comments) is simpler: it adds only comments to standard JSON. VS Code uses it for settings.json, and TypeScript's tsconfig.json is parsed as JSONC.
// tsconfig.json (JSONC format)
{
  // Enable strict mode
  "compilerOptions": {
    "strict": true,
    "target": "ES2022", // Modern target
    "outDir": "./dist"
  }
}
Neither JSON5 nor JSONC can be parsed by the native JSON.parse(). You need dedicated parsers like the json5 npm package or VS Code's built-in JSONC parser.
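Where adding a dependency is not an option, JSONC's comments can be stripped before calling the native JSON.parse. The helper below is a minimal sketch that tracks string boundaries so URLs like "http://..." are not mistaken for comments; it skips edge cases a real parser handles, so prefer the json5 package for anything serious:

```javascript
// Minimal JSONC comment stripper: a small state machine that ignores
// comment markers while inside a double-quoted string.
function stripJsonComments(text) {
  let out = '';
  let inString = false;
  let inLineComment = false;
  let inBlockComment = false;
  for (let i = 0; i < text.length; i++) {
    const ch = text[i];
    const next = text[i + 1];
    if (inLineComment) {
      if (ch === '\n') { inLineComment = false; out += ch; }
      continue;
    }
    if (inBlockComment) {
      if (ch === '*' && next === '/') { inBlockComment = false; i++; }
      continue;
    }
    if (inString) {
      out += ch;
      if (ch === '\\') { out += next ?? ''; i++; } // keep escaped char
      else if (ch === '"') inString = false;
      continue;
    }
    if (ch === '"') { inString = true; out += ch; continue; }
    if (ch === '/' && next === '/') { inLineComment = true; i++; continue; }
    if (ch === '/' && next === '*') { inBlockComment = true; i++; continue; }
    out += ch;
  }
  return out;
}

const jsonc = `{
  // server settings
  "host": "http://localhost", /* inline */ "port": 3000
}`;
const parsed = JSON.parse(stripJsonComments(jsonc));
console.log(parsed.port); // 3000
```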
JSON Schema Validation
JSON Schema (draft 2020-12 is the latest) lets you define the structure your JSON must conform to. This is invaluable for API contracts and config file validation:
const schema = {
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "type": "object",
  "required": ["name", "version"],
  "properties": {
    "name": {
      "type": "string",
      "minLength": 1,
      "maxLength": 100
    },
    "version": {
      "type": "string",
      "pattern": "^\\d+\\.\\d+\\.\\d+$"
    },
    "port": {
      "type": "integer",
      "minimum": 1,
      "maximum": 65535
    }
  },
  "additionalProperties": false
};
Use the Ajv library in Node.js for high-performance JSON Schema validation. It compiles schemas to optimized JavaScript functions.
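To make those keywords concrete, here is a hand-rolled check of just the subset used above (type, required, string length, pattern, numeric range, additionalProperties). It illustrates the semantics only; Ajv covers the full specification and is far faster:

```javascript
// Validates a value against a tiny subset of JSON Schema keywords.
// Returns an array of error strings; an empty array means valid.
function validateSubset(schema, value, path = '$') {
  const errors = [];
  if (schema.type === 'object') {
    if (typeof value !== 'object' || value === null || Array.isArray(value)) {
      return [`${path}: expected object`];
    }
    for (const key of schema.required ?? []) {
      if (!(key in value)) errors.push(`${path}.${key}: required`);
    }
    for (const [key, sub] of Object.entries(schema.properties ?? {})) {
      if (key in value) errors.push(...validateSubset(sub, value[key], `${path}.${key}`));
    }
    if (schema.additionalProperties === false) {
      for (const key of Object.keys(value)) {
        if (!(key in (schema.properties ?? {}))) {
          errors.push(`${path}.${key}: additional property not allowed`);
        }
      }
    }
  } else if (schema.type === 'string') {
    if (typeof value !== 'string') return [`${path}: expected string`];
    if (schema.minLength !== undefined && value.length < schema.minLength)
      errors.push(`${path}: too short`);
    if (schema.maxLength !== undefined && value.length > schema.maxLength)
      errors.push(`${path}: too long`);
    if (schema.pattern && !new RegExp(schema.pattern).test(value))
      errors.push(`${path}: pattern mismatch`);
  } else if (schema.type === 'integer') {
    if (!Number.isInteger(value)) return [`${path}: expected integer`];
    if (schema.minimum !== undefined && value < schema.minimum)
      errors.push(`${path}: below minimum`);
    if (schema.maximum !== undefined && value > schema.maximum)
      errors.push(`${path}: above maximum`);
  }
  return errors;
}

const schema = {
  type: 'object',
  required: ['name', 'version'],
  properties: {
    name: { type: 'string', minLength: 1, maxLength: 100 },
    version: { type: 'string', pattern: '^\\d+\\.\\d+\\.\\d+$' },
    port: { type: 'integer', minimum: 1, maximum: 65535 }
  },
  additionalProperties: false
};

console.log(validateSubset(schema, { name: 'app', version: '1.2.3', port: 8080 })); // []
console.log(validateSubset(schema, { name: '', version: '1.2', extra: true })); // three errors
```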
Handling Dates in JSON: Always Use ISO 8601
JSON has no native date type. The two common approaches are Unix timestamps (integers) and ISO 8601 strings. ISO 8601 strings are strongly preferred for human-readable APIs:
// Good: ISO 8601 with timezone (always include Z or offset)
{ "createdAt": "2026-04-10T14:30:00Z" }
{ "createdAt": "2026-04-10T16:30:00+02:00" }
// Acceptable: Unix timestamp in milliseconds
{ "createdAt": 1744291800000 }
// Bad: Human-readable but unparseable format
{ "createdAt": "April 10, 2026 2:30 PM" }
The critical rule: always include timezone information. Timezone-naive dates cause bugs that appear only in production when servers are in different timezones.
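In JavaScript the round trip is straightforward, since JSON.stringify already calls Date.prototype.toJSON (which emits ISO 8601 in UTC), and a reviver can turn the strings back into Date objects:

```javascript
// Serializing: Date.prototype.toJSON produces an ISO 8601 UTC string.
const event = { name: 'deploy', createdAt: new Date(Date.UTC(2026, 3, 10, 14, 30)) };
const json = JSON.stringify(event);
console.log(json); // {"name":"deploy","createdAt":"2026-04-10T14:30:00.000Z"}

// Parsing: JSON.parse leaves dates as strings; revive them explicitly.
const revived = JSON.parse(json, (key, value) =>
  key === 'createdAt' ? new Date(value) : value
);
console.log(revived.createdAt instanceof Date); // true
console.log(revived.createdAt.getTime() === event.createdAt.getTime()); // true
```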
The BigInt Problem: JSON.stringify Loses Precision
JavaScript numbers are IEEE 754 double-precision floats. This means integers larger than Number.MAX_SAFE_INTEGER (2^53 - 1 = 9,007,199,254,740,991) cannot be represented exactly:
// A large ID from a database or distributed system
const response = '{"userId": 9007199254740993}';
const parsed = JSON.parse(response);
console.log(parsed.userId); // 9007199254740992 (wrong!)
// The safe solution: use strings for large IDs
const safeResponse = '{"userId": "9007199254740993"}';
const safeParsed = JSON.parse(safeResponse);
console.log(safeParsed.userId); // "9007199254740993" (correct string)
Twitter (now X) famously hit this bug when user IDs grew large enough to lose precision in JavaScript. Their API now returns both numeric and string versions of IDs. The lesson: use string type for any ID or value that might exceed 2^53.
For environments that need actual BigInt handling, the json-bigint npm package provides a drop-in replacement for JSON.parse that handles large integers correctly.
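Native BigInt values need the same care: JSON.stringify throws on them by default, but a replacer can emit them as strings without losing precision (the receiver must then know which fields to convert back):

```javascript
const record = { id: 9007199254740993n, name: 'alice' };

// Default behavior: TypeError (cannot serialize a BigInt)
let threw = false;
try { JSON.stringify(record); } catch (e) { threw = true; }
console.log(threw); // true

// Replacer: emit BigInts as decimal strings, preserving every digit.
const json = JSON.stringify(record, (key, value) =>
  typeof value === 'bigint' ? value.toString() : value
);
console.log(json); // {"id":"9007199254740993","name":"alice"}
```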
Circular Reference Errors
Attempting to JSON.stringify an object with circular references throws a TypeError:
const obj = { name: "parent" };
obj.self = obj; // circular reference
JSON.stringify(obj); // TypeError: Converting circular structure to JSON
// Solution 1: Use a replacer to swap circular refs for a placeholder
// (note: shared non-circular references are also replaced by this approach)
function stringifyWithoutCircular(obj) {
  const seen = new WeakSet();
  return JSON.stringify(obj, (key, value) => {
    if (typeof value === 'object' && value !== null) {
      if (seen.has(value)) return '[Circular]';
      seen.add(value);
    }
    return value;
  }, 2);
}
// Solution 2: Use the 'flatted' npm package for full circular support
The JSON.stringify Replacer Function
The second argument to JSON.stringify is powerful and underused. It can be an array (whitelist of keys) or a function (transform values):
const user = {
  id: 1,
  name: "Alice",
  password: "secret123", // Don't serialize this!
  createdAt: new Date(),
  balance: 1234567.89
};
// Array replacer: whitelist specific keys
JSON.stringify(user, ['id', 'name'], 2);
// Function replacer: transform values. Caveat: Date values reach the
// replacer already converted to strings by Date.prototype.toJSON, so
// check the original via `this[key]` with a regular function, not an arrow.
JSON.stringify(user, function (key, value) {
  if (key === 'password') return undefined; // exclude
  if (this[key] instanceof Date) return this[key].toISOString(); // serialize dates
  if (typeof value === 'number' && key === 'balance') {
    return parseFloat(value.toFixed(2)); // round currency
  }
  return value;
}, 2);
JSON vs YAML for Configuration Files
YAML 1.2 is essentially a superset of JSON and is widely used for configuration (Docker Compose, Kubernetes, GitHub Actions). When should you choose one over the other?
Choose JSON when:
- The config is consumed programmatically and rarely edited by humans
- You want strict parsing with no ambiguity
- The ecosystem expects JSON (package.json, browser extension manifests)
Choose YAML when:
- Humans edit the config frequently and comments are valuable
- Multi-line strings appear often (YAML's block scalars are cleaner)
- The tooling ecosystem expects YAML (Kubernetes, Ansible, GitHub Actions)
Convert between formats instantly with our JSON to YAML Converter.
jq for Command-Line JSON Formatting
jq is the Swiss Army knife of command-line JSON processing. Install it once and you will wonder how you lived without it:
# Pretty-print a JSON file
jq '.' data.json
# Pretty-print an API response
curl -s https://api.example.com/users | jq '.'
# Extract a specific field
curl -s https://api.example.com/user/1 | jq '.name'
# Filter an array
jq '.users[] | select(.active == true)' users.json
# Transform structure
jq '{id: .id, fullName: (.firstName + " " + .lastName)}' user.json
# Count items in an array
jq '.items | length' cart.json
# Sort by a field
jq 'sort_by(.createdAt) | reverse' posts.json
You can also use our browser-based JSON Formatter for quick inspection without installing anything, and our JSON Diff tool to compare two JSON structures and spot the differences.
Practical Fetch + JSON Pattern
async function fetchUser(id) {
  const response = await fetch(`/api/users/${id}`, {
    headers: { 'Accept': 'application/json' }
  });
  if (!response.ok) {
    throw new Error(`HTTP ${response.status}: ${response.statusText}`);
  }
  // response.json() handles parsing; check Content-Type first in strict code
  const contentType = response.headers.get('Content-Type') ?? '';
  if (!contentType.includes('application/json')) {
    throw new Error(`Unexpected content type: ${contentType}`);
  }
  return response.json();
}

async function updateUser(id, data) {
  const response = await fetch(`/api/users/${id}`, {
    method: 'PUT',
    headers: {
      'Content-Type': 'application/json',
      'Accept': 'application/json'
    },
    body: JSON.stringify(data) // minified for network transfer
  });
  return response.json();
}
Summary
Solid JSON practices are foundational to professional web development. Use ISO 8601 for dates, string types for large integers, replacer functions to control serialization, and JSON Schema for validation. For config files, consider JSONC or JSON5 when comments are needed. Use jq for command-line work and our JSON Formatter for browser-based formatting and validation.