You’ve been parsing JSON wrong your whole life



This issue is brought to you by:

Secure Your AI Future at DevSecCon 2025

Software development is undergoing a seismic shift
as AI transforms how we build, deploy, and secure applications.
Register for the 1st-ever Global Community Summit on AI Security,
covering critical strategies to empower AI innovation without compromising security.

Ever opened a massive “.log” file and realized it’s just one long, angry paragraph of nested JSON?
Brackets everywhere, quotes all over, zero hope of finding what you actually need.
I’ve been there.
One time I had to hunt down a single request ID inside thousands of lines of server logs.
Online parsers literally stopped working trying to process it.
My terminal felt like it was personally mocking me.
I spent hours grepping and scrolling like a caveman.
Here’s the kicker: your brain can only juggle about four chunks of information at once. Manually tracking nested structures so you can glue queries together and find what you need is like doing algebra while juggling (I can't do either, but that's a different story).
It’s not your fault... it’s the wrong approach.

Brute force and scripts don’t cut it

Most people default to two bad options: manually skimming the mess, or writing scripts that turn into a not-so-quick, feels-like-productive-automation mini project.
Manual scanning? Painful and error-prone.
Scripts? They break the moment the JSON shape shifts, and they don’t help you explore. They're also time consuming, and there's nothing worse than wasting time on automation that only runs once...
Neither scales when you’re dealing with real-world logs, API payloads, or CI output that’s constantly changing.

Becoming a JSON native speaker with JQ

Here’s the unlock: jq isn’t just a tool, it’s a tiny, powerful language for JSON. Think sed/grep/awk, but designed for structured data.
You start with the whole object (that’s . in jq), then drill down:

.team          # gets the array
.team[]        # iterates over members
.team[].name   # pulls just the names

# Need only the second person? (arrays are zero-indexed)
.team[1].name
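As a quick sanity check, the drill-down above can be run end to end on a small made-up team object (the names and salaries here are invented for illustration):

```shell
# Hypothetical sample data; in real life this would come from a file or curl.
echo '{"team":[{"name":"Ada","salary":120000},{"name":"Linus","salary":110000}]}' \
  | jq -r '.team[].name'
# prints:
# Ada
# Linus
```

The -r flag strips the surrounding quotes so you get plain text, which is handy for piping into other tools.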


# Want to reshape the output? Map it into a new object:
{message: .commit.message, author: .commit.committer.name}

# Want raw output without quotes? Add -r.
# Need compact, single-line JSON? -c.
# Fetching from curl? Pipe it straight in: jq will pretty-print,
# colorize, and let you explore instantly.
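Putting the reshape and the -c flag together, here's a sketch against a GitHub-style commit object (the shape is assumed from the example above, not fetched live):

```shell
# Reshape a commit-shaped object and emit compact, single-line JSON (-c).
echo '{"commit":{"message":"fix login bug","committer":{"name":"Ada"}}}' \
  | jq -c '{message: .commit.message, author: .commit.committer.name}'
# prints: {"message":"fix login bug","author":"Ada"}
```

Note that jq keeps the keys in the order you wrote them in the filter, so the output reads the way you designed it.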

Make the machine do the thinking

This is where jq goes from “handy” to “ridiculous.” You can:

  • Filter: .team[] | select(.salary > 115000)
  • Count: wrap results in an array then length: [ ... ] | length
  • Slice arrays: .team[1:3]
  • Explore shapes: keys and .some.list | keys
  • Delete noise: .team[] | del(.skills, .whateverElse)
  • Test structure: has("startup")
  • Regex, math, conditionals: built right in

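A few of the bullets above, chained together on the same invented team data (salary figures are made up):

```shell
DATA='{"team":[{"name":"Ada","salary":120000,"skills":["c"]},{"name":"Linus","salary":110000,"skills":["go"]}]}'

# Filter: who earns more than 115000?
echo "$DATA" | jq -r '.team[] | select(.salary > 115000) | .name'
# prints: Ada

# Count: wrap the filtered results in an array, then take its length.
echo "$DATA" | jq '[.team[] | select(.salary > 115000)] | length'
# prints: 1

# Delete noise: drop the skills field from a member.
echo "$DATA" | jq -c '.team[0] | del(.skills)'
# prints: {"name":"Ada","salary":120000}

# Test structure: does the top-level object have a "team" key?
echo "$DATA" | jq 'has("team")'
# prints: true
```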

Real world example: logs

Say you’ve got a huge JSON log stream. Start with jq '.' just to see shape and get colors. If it’s an array of entries, check keys on the first one: .[0] | keys. Dive deeper: .[].details.latency | select(. > 4422). Now count the objects: [ .[] | select(.details.latency > 4422) ] | length. Filter errors/warnings: .[] | select(.level == "error"). This feels like superpowers because it is—no more squinting, no more fragile scripts. You’re querying data the way it’s structured.
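The same walkthrough, condensed into a runnable sketch on a tiny fabricated log array (the field names mirror the example above; the data itself is invented):

```shell
LOGS='[{"level":"error","details":{"latency":5000}},{"level":"info","details":{"latency":120}}]'

# Step 1: what does one entry look like?
echo "$LOGS" | jq -c '.[0] | keys'
# prints: ["details","level"]   (keys comes back sorted alphabetically)

# Step 2: how many entries are slower than 4422 ms?
echo "$LOGS" | jq '[.[] | select(.details.latency > 4422)] | length'
# prints: 1

# Step 3: show only the errors, one compact object per line.
echo "$LOGS" | jq -c '.[] | select(.level == "error")'
# prints: {"level":"error","details":{"latency":5000}}
```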

Bonus round: cousins you should know

  • gojq: a Go implementation of jq with friendlier errors and native YAML support (add input/output flags for YAML). If jq’s error messages ever made you cry, gojq gives you tissues.
  • jaq: a Rust take on jq aiming for speed and correctness.
  • yq: jq’s spiritual cousin for YAML (and more). Same mental model, cross-format parsing.
  • jqp: a TUI that lets you iterate your jq query with live JSON on the left—zero re-running commands.
  • gron: flattens JSON into greppable dotted paths. Perfect for quick and dirty searches.
  • jqjq: yes, jq implemented in jq. Because of course 🤣

If you live in JSON, try Nushell

If you find yourself using jq daily, take a look at Nushell.
It treats data as tables by default: open a JSON file and you immediately get columns, nested paths suggested for you, and super simple filters: get team | where salary > 115000.
It converts between JSON, YAML, TOML, CSV, you name it.
It’s not a focused query language like jq, but for day-to-day data wrangling, it’s a joy.
I still use jq in scripts, CI, and one-liners.
Nushell is my daily driver when I want to explore fast (which is every day 😉)


Do this today

  • Install jq (or gojq if you want YAML niceties and better parser errors).
  • Pick a real JSON you struggle with: API response, app logs, CI output.
  • Explore shape: jq 'keys', jq '.team | keys', jq '.[0] | keys'
  • Extract just what you need: jq -r '.team[].name'
  • Filter and count: jq '[.logs[] | select(.level == "error")] | length'
  • Reshape for clarity by defining the object you want to see:
    jq '.commits[0] | {msg: .commit.message, by: .commit.committer.name}'


Stop wrestling text.
Start speaking JSON.
jq turns “needle in a haystack” into “one-liner and done.”
And if you want a smoother everyday experience, grab Nushell and let your shell speak data natively (with some caveats I mention here).
You’ll save hours, reduce errors, and honestly feel a little bit like you unlocked a cheat code.

Thank you for reading.

Feel free to reply directly with any question or feedback.

Have a great weekend!

ESPRESSO FRIDAYS

Every once in a while I send hand-picked things I've learned. Kind of like your filter for the tech internet. No spam, I promise!
