JSONify Tools

Privacy-first JSON tools for developers

February 8, 2025
7 min read
Best Practices

5 JSON API Pitfalls That Will Bite You in Production

These bugs do not show up in development. They show up at 2 AM when a customer reports that their payment was $0.01 off, or that their account ID changed.

Tags: JSON · API · REST · Backend · Production

1. Not Validating Response Shapes

Your code expects response.data.users to be an array. The API returns it as an array for months. Then one day, the backend team deploys a change and it comes back as an object with a results key. Your frontend does not crash immediately. It just silently renders zero users, and nobody notices for hours.

The fix is runtime validation at the boundary where external data enters your application. Not just type assertions; actual validation.

Validate at the boundary with Zod:

import { z } from "zod";

const UserSchema = z.object({
  id: z.number(),
  name: z.string(),
  email: z.string().email()
});

const ApiResponseSchema = z.object({
  data: z.object({
    users: z.array(UserSchema)
  })
});

async function fetchUsers() {
  const res = await fetch("/api/users");
  const json = await res.json();

  // This throws with a clear error if the shape is wrong
  const parsed = ApiResponseSchema.parse(json);

  // TypeScript now knows the exact shape
  return parsed.data.users;
}

// When the API changes shape, you get:
// ZodError: Expected array, received object at "data.users"
// instead of silent wrong behavior

The 10 minutes you spend writing a Zod schema saves you from the 3-hour debugging session when the API silently changes. Put validation at every external boundary: API responses, webhook payloads, config files, localStorage reads.
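If pulling in a dependency for a one-off boundary feels heavy, a hand-rolled TypeScript type guard gives the same fail-fast behavior. A minimal sketch (the `isUser` guard and `loadCachedUser` helper are illustrative names, not library APIs):

```typescript
interface User {
  id: number;
  name: string;
  email: string;
}

// Narrow an unknown value to User, checking each field at runtime
function isUser(value: unknown): value is User {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  return (
    typeof v.id === "number" &&
    typeof v.name === "string" &&
    typeof v.email === "string"
  );
}

// Guard a localStorage read: stale or corrupted entries fail loudly
function loadCachedUser(raw: string | null): User | null {
  if (raw === null) return null;
  const parsed: unknown = JSON.parse(raw);
  if (!isUser(parsed)) {
    throw new Error("Cached user has unexpected shape");
  }
  return parsed;
}
```

The guard is more verbose than a schema and easier to let drift out of sync with the type, which is why a schema library wins once you have more than a couple of boundaries.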

2. Floating Point Numbers for Money

JSON has one number type. JavaScript represents all numbers as IEEE 754 double-precision floats. This means your financial calculations will eventually be wrong.

The classic problem:

// A real scenario: calculating order totals
const items = [
  { name: "Widget", price: 0.1 },
  { name: "Gadget", price: 0.2 }
];

const total = items.reduce((sum, item) => sum + item.price, 0);
console.log(total);
// 0.30000000000000004

// This fails:
console.log(total === 0.3);
// false

// Worse: rounding errors accumulate across thousands of transactions
// You end up with ledgers that don't balance

The fix: send money as integers (cents) in your JSON APIs. Every payment processor does this. Stripe sends "amount": 1999 for $19.99. Format for display only at the last possible moment.

The correct approach:

// API sends cents as integers
const lineItems = [
  { name: "Widget", price_cents: 1000 },
  { name: "Gadget", price_cents: 2000 }
];

// Integer arithmetic is exact
const totalCents = lineItems.reduce(
  (sum, item) => sum + item.price_cents, 0
);
// 3000 (exactly correct)

// Format only for display
const displayTotal = "$" + (totalCents / 100).toFixed(2);
// "$30.00"
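Division is the one place integer cents still need care: 100 cents can't split into three equal shares. A common approach (a sketch, not from any particular payment API) is to hand out the remainder one cent at a time so the shares always sum back to the total:

```typescript
// Split an amount in cents into n shares that sum exactly to the total.
// The first (totalCents % n) shares each get one extra cent.
function splitCents(totalCents: number, n: number): number[] {
  const base = Math.floor(totalCents / n);
  const remainder = totalCents % n;
  return Array.from({ length: n }, (_, i) =>
    i < remainder ? base + 1 : base
  );
}

splitCents(100, 3);
// [34, 33, 33] — shares differ by at most one cent, sum is exactly 100
```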

3. Date Serialization Chaos

JSON has no date type. So every API invents its own format. You will encounter ISO 8601 strings, Unix timestamps in seconds, Unix timestamps in milliseconds, and whatever format Java's SimpleDateFormat defaults to.

Date formats you will see in the wild:

// ISO 8601 (the good one)
{ "created": "2025-02-08T14:30:00Z" }

// ISO 8601 with offset (also fine)
{ "created": "2025-02-08T14:30:00+05:30" }

// Unix timestamp in SECONDS (common in APIs)
{ "created": 1738934400 }

// Unix timestamp in MILLISECONDS (what JS Date expects)
{ "created": 1738934400000 }

// The Java classic (ambiguous without timezone info)
{ "created": "Feb 8, 2025 2:30:00 PM" }

// Date only, no time (what timezone?)
{ "created": "2025-02-08" }

The dangerous one is the Unix timestamp ambiguity. If an API sends 1738934400 and you pass it directly to new Date(), JavaScript interprets it as milliseconds. You get a date in January 1970. No error, just silently wrong data.

Defensive date parsing:

function parseApiDate(value: string | number): Date {
  if (typeof value === "number") {
    // If the number is less than 1e12, it's probably seconds
    // (timestamps in seconds won't exceed 1e12 until year 33658)
    const ms = value < 1e12 ? value * 1000 : value;
    return new Date(ms);
  }
  // For strings, Date.parse handles ISO 8601
  const date = new Date(value);
  if (isNaN(date.getTime())) {
    throw new Error(`Unparseable date: ${value}`);
  }
  return date;
}

For your own APIs, pick ISO 8601 with a timezone offset and stick with it. Document the format. Future you will be grateful.
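On the serialization side, `Date.prototype.toISOString()` always emits ISO 8601 in UTC with a `Z` suffix, which makes it an easy convention to standardize on (a sketch; the `event` object is illustrative):

```typescript
// Serialize all outgoing dates as ISO 8601 UTC strings
const event = {
  name: "deploy",
  created: new Date(Date.UTC(2025, 1, 8, 14, 30, 0)).toISOString()
};

JSON.stringify(event);
// '{"name":"deploy","created":"2025-02-08T14:30:00.000Z"}'
```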

4. Large Number Precision Loss

JavaScript can safely represent integers up to 2^53 - 1 (that is 9,007,199,254,740,991). Any integer larger than that loses precision when parsed from JSON. This is not a theoretical problem. Twitter IDs, Snowflake IDs, and many database primary keys exceed this limit.

Precision loss in action:

// A real Twitter-style ID
const json = '{"id": 9007199254740993}';
const parsed = JSON.parse(json);

console.log(parsed.id);
// 9007199254740992  <-- WRONG! Last digit changed

console.log(parsed.id === 9007199254740993);
// false

// This means: if you use numeric IDs larger than 2^53,
// JSON.parse will silently corrupt them.
// Your app will fetch the wrong user, update the wrong record,
// or create duplicate entries.

The fix: APIs should send large IDs as strings. Twitter learned this the hard way and now includes both id (number) and id_str (string) in their responses. If you control the API, just use strings.

If you cannot change the API, rewrite the raw JSON text before parsing. A JSON.parse reviver is too late here: by the time it sees the value, the number has already been parsed and corrupted.

// Quote large numbers in known ID fields so they parse as strings
function safeParse(jsonStr: string, bigIntKeys: string[]) {
  // Replace large numbers in known fields with strings before parsing
  let processed = jsonStr;
  for (const key of bigIntKeys) {
    const pattern = new RegExp(
      `("${key}"\\s*:\\s*)(\\d{16,})`,
      "g"
    );
    processed = processed.replace(pattern, '$1"$2"');
  }
  return JSON.parse(processed);
}

const result = safeParse(
  '{"id": 9007199254740993, "name": "Alice"}',
  ["id"]
);
console.log(result.id);
// "9007199254740993" (string, no precision loss)
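Once the IDs arrive as strings, `BigInt` covers any arithmetic or comparison that still needs numeric semantics (a sketch; `BigInt` literals require ES2020 or later):

```typescript
// String IDs stay exact; convert to BigInt only when math is needed
const id = BigInt("9007199254740993");

id + 1n;
// 9007199254740994n — exact, no precision loss

id === 9007199254740993n;
// true — BigInt comparison is exact where Number comparison was not
```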

5. null vs Missing Keys

In JavaScript, undefined and null are different values. But JSON only has null. When you serialize a JavaScript object to JSON, undefined values disappear entirely. This creates subtle bugs when round-tripping data through an API.

The disappearing field:

const user = {
  name: "Alice",
  nickname: undefined,  // user hasn't set one
  bio: null             // user explicitly cleared it
};

const json = JSON.stringify(user);
console.log(json);
// {"name":"Alice","bio":null}
// "nickname" is GONE. Not null. Gone.

const parsed = JSON.parse(json);
console.log("nickname" in parsed);
// false

console.log(parsed.nickname);
// undefined (but for a different reason now)

// This matters when you do PATCH updates:
// "field is null" = clear this field
// "field is missing" = don't touch this field
// After round-tripping, you can't tell the difference

The fix: never use undefined for values that need to survive JSON serialization. Use null explicitly when a field has no value. If you need to distinguish between "no value" and "not specified," use a sentinel or a separate set of field names.

A replacer function that preserves undefined as null:

function stringifyPreservingUndefined(obj: unknown): string {
  return JSON.stringify(obj, (_key, value) => {
    return value === undefined ? null : value;
  });
}

const user = {
  name: "Alice",
  nickname: undefined,
  bio: null
};

console.log(stringifyPreservingUndefined(user));
// {"name":"Alice","nickname":null,"bio":null}
// Both fields are now preserved

Validate Your JSON APIs

Catch these pitfalls before they reach production. Validate JSON structure and schemas with our free tools.