You’ve been parsing JSON wrong your whole life



This issue is brought to you by:

Secure Your AI Future at DevSecCon 2025

Software development is undergoing a seismic shift as AI transforms how we build, deploy, and secure applications. Register for the 1st-ever Global Community Summit on AI Security, covering critical strategies to empower AI innovation without compromising security.

Ever opened a massive “.log” file and realized it’s just one long, angry paragraph of nested JSON?
Brackets everywhere, quotes all over, zero hope of finding what you actually need.
I’ve been there.
One time I had to hunt down a single request ID inside thousands of lines of server logs.
Online parsers literally stopped working trying to process it.
My terminal felt like it was personally mocking me.
I spent hours grepping and scrolling like a caveman.
Here’s the kicker: your brain can only juggle about four chunks of info at once. Manually tracking nested structures while piecing together queries to find what you need is like doing algebra while juggling (I can't do either, but that's a different story).
It’s not your fault... it’s the wrong approach.

Brute force and scripts don’t cut it

Most people default to two bad options: manually skimming the mess, or writing a script that balloons into a not-so-quick mini project that only feels productive because it's "automation."
Manual scanning? Painful and error-prone.
Scripts? They break when the JSON shape shifts, and they don’t help you explore. They’re also time-consuming, and there’s nothing worse than wasting time on automation that only runs once...
Neither scales when you’re dealing with real-world logs, API payloads, or CI output that’s constantly changing.

Becoming a JSON native speaker with jq

Here’s the unlock: jq isn’t just a tool, it’s a tiny, powerful language for JSON. Think sed/grep/awk, but designed for structured data.
You start with the whole object (that’s . in jq), then drill down:

.team            # gets the array
.team[]          # iterates over the members
.team[].name     # pulls just the names

# Need only the second person?
.team[1].name
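
To make that concrete, here’s a tiny sketch against a hypothetical team.json (the file name and fields are made up for illustration):

{"team": [{"name": "Ada", "salary": 120000}, {"name": "Linus", "salary": 110000}]}

jq '.team[].name' team.json    # "Ada" then "Linus"
jq '.team[1].name' team.json   # "Linus"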


# Want to reshape output? Map it into a new object:
{message: .commit.message, author: .commit.committer.name}

# Want raw output without quotes? Add -r.
# Need compact, single-line JSON? Add -c.
# Fetching from curl? Pipe it straight in: jq will pretty-print, colorize,
# and let you explore instantly.
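
Putting the curl piece together, here’s a rough sketch against the public GitHub commits API (the repo path is just an example; the field names come from that endpoint):

curl -s https://api.github.com/repos/jqlang/jq/commits | jq '.[0] | {message: .commit.message, author: .commit.committer.name}'

curl -s https://api.github.com/repos/jqlang/jq/commits | jq -r '.[].commit.message'   # raw strings, no quotes

curl -s https://api.github.com/repos/jqlang/jq/commits | jq -c '.[] | {message: .commit.message}'   # one compact object per line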

Make the machine do the thinking

This is where jq goes from “handy” to “ridiculous.” You can:

  • Filter: .team[] | select(.salary > 115000)
  • Count: wrap results in an array, then take its length: [ ... ] | length
  • Slice arrays: .team[1:3]
  • Explore shapes: keys and .some.list | keys
  • Delete noise: .team[] | del(.skills, .whateverElse)
  • Test structure: has("startup")
  • Regex, math, conditionals: built right in
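
Here’s what a few of those look like strung together, reusing the hypothetical team.json from above (field names are made up):

jq '.team[] | select(.salary > 115000)' team.json              # filter
jq '[.team[] | select(.salary > 115000)] | length' team.json   # count the matches
jq '.team[1:3]' team.json                                      # slice: second and third members
jq '.team[0] | keys' team.json                                 # explore the shape of one member
jq '.team[] | del(.salary)' team.json                          # drop a field you don't care about
jq 'has("team")' team.json                                     # does the top level have a "team" key?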


Real world example: logs

Say you’ve got a huge JSON log stream. Start with jq '.' just to see shape and get colors. If it’s an array of entries, check keys on the first one: .[0] | keys. Dive deeper: .[].details.latency | select(. > 4422). Now count the objects: [ .[] | select(.details.latency > 4422) ] | length. Filter errors/warnings: .[] | select(.level == "error"). This feels like superpowers because it is—no more squinting, no more fragile scripts. You’re querying data the way it’s structured.
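
As a pasteable sequence, assuming a hypothetical app-logs.json that is an array of entries with level and details.latency fields, the walkthrough looks roughly like this:

jq '.' app-logs.json                                                    # pretty-print, see the shape
jq '.[0] | keys' app-logs.json                                          # field names of the first entry
jq '.[].details.latency | select(. > 4422)' app-logs.json               # just the slow latencies
jq '[ .[] | select(.details.latency > 4422) ] | length' app-logs.json   # how many are slow
jq '.[] | select(.level == "error")' app-logs.json                      # only the errors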

Bonus round: cousins you should know

  • gojq: a Go implementation of jq with friendlier errors and native YAML support (add input/output flags for YAML). If jq’s error messages ever made you cry, gojq gives you tissues.
  • jaq: a Rust take on jq aiming for speed and correctness.
  • yq: jq’s spiritual cousin for YAML (and more). Same mental model, cross-format parsing.
  • jqp: a TUI that lets you iterate your jq query with live JSON on the left—zero re-running commands.
  • gron: flattens JSON into greppable dotted paths. Perfect for quick and dirty searches; see the sketch right after this list.
  • jqjq: yes, jq implemented in jq. Because of course 🤣
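
To make the gron one concrete, here’s a rough sketch against the same hypothetical team.json (output trimmed):

gron team.json
# json = {};
# json.team = [];
# json.team[0] = {};
# json.team[0].name = "Ada";
# json.team[0].salary = 120000;
# ...

gron team.json | grep salary            # grep the flattened paths
gron team.json | grep name | gron -u    # turn the matches back into JSON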

If you live in JSON, try Nushell

If you find yourself using jq daily, take a look at Nushell.
It treats data as tables by default: open a JSON file and you immediately get columns, nested paths suggested for you, and super simple filters: get team | where salary > 115000.
It converts between JSON, YAML, TOML, CSV, you name it.
It’s not a full programming language like jq, but for day-to-day data wrangling, it’s a joy.
I still use jq in scripts, CI, and one-liners.
Nushell is my daily driver when I want to explore fast (which is every day 😉)
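
A rough Nushell sketch, again with the hypothetical team.json (these are Nushell’s standard data commands; exact rendering depends on your version):

open team.json | get team | where salary > 115000   # rows render as a table
open team.json | get team | select name salary      # keep just two columns
open team.json | to yaml                            # convert the whole file to YAML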


Do this today

  • Install jq (or gojq if you want YAML niceties and better parser errors).
  • Pick a real JSON you struggle with: API response, app logs, CI output.
  • Explore shape: jq 'keys', jq '.team | keys', jq '.[0] | keys'
  • Extract just what you need: jq -r '.team[].name'
  • Filter and count: jq '[.logs[] | select(.level == "error")] | length'
  • Reshape for clarity by defining the object you want to see:
    jq '.commits[0] | {msg: .commit.message, by: .commit.committer.name}'


Stop wrestling text.
Start speaking JSON.
jq turns “needle in a haystack” into “one-liner and done.”
And if you want a smoother everyday experience, grab Nushell and let your shell speak data natively (with some caveats I mention here).
You’ll save hours, reduce errors, and honestly feel a little bit like you unlocked a cheat code.

Thank you for reading.

Feel free to reply directly with any question or feedback.

Have a great weekend!

