Answer from steeldriver on Stack Exchange (top answer, 1 of 2):

You want to run a .context,.score filter on each element of v I think:

$ jq -r '.[] | [.c, .e, .score, (.v[] | .context,.score)] | @csv' file.json
"A","B",0.99,"asdf",0.98,"bcdfd",0.97

This is equivalent to using the builtin map function without assembling the results back into an array.
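To see what that stream splice does, here is a small Python sketch (an illustration, not from the answer) that rebuilds the same row from a record shaped like the one in the question:

```python
import csv
import io

# Hypothetical record shaped like the one in the question.
record = {
    "c": "A", "e": "B", "score": 0.99,
    "v": [
        {"context": "asdf", "score": 0.98, "url": "..."},
        {"context": "bcdfd", "score": 0.97, "url": "..."},
    ],
}

# Mirror `[.c, .e, .score, (.v[] | .context,.score)]`: the stream produced
# by `.v[] | .context,.score` is spliced into the surrounding array.
row = [record["c"], record["e"], record["score"]]
for item in record["v"]:
    row.extend([item["context"], item["score"]])

# Mirror `@csv`: strings are double-quoted, numbers are left bare.
buf = io.StringIO()
csv.writer(buf, quoting=csv.QUOTE_NONNUMERIC).writerow(row)
print(buf.getvalue().strip())
# "A","B",0.99,"asdf",0.98,"bcdfd",0.97
```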

Second answer (2 of 2):

The following creates a JSON-encoded CSV record for each top-level array element, and then extracts and decodes them. For each top-level element, the values of the sub-array are incorporated by "flattening" the array.

jq -r 'map([ .c,.e,.score, (.v|map([.context, .score])) ] | flatten | @csv)[]' file

Given a test document equivalent of the following:

[
   {
      "c": "A",
      "e": "B",
      "score": 0.99,
      "v": [
         { "context": "asdf", "score": 0.98, "url": "..." },
         { "context": "bcdfd", "score": 0.97, "url": "..." }
      ]
   },
   {
      "c": "A",
      "e": "B",
      "score": 0.99,
      "v": [
         { "context": "asdf", "score": 0.98, "url": "..." },
         { "context": "asdf", "score": 0.98, "url": "..." },
         { "context": "bcdfd", "score": 0.97, "url": "..." }
      ]
   },
   {
      "c": "A",
      "e": "B",
      "score": 0.99,
      "v": [
         { "context": "asdf", "score": 0.98, "url": "..." },
         { "context": "asdf", "score": 0.98, "url": "..." },
         { "context": "asdf", "score": 0.98, "url": "..." },
         { "context": "bcdfd", "score": 0.97, "url": "..." }
      ]
   }
]

... we get

"A","B",0.99,"asdf",0.98,"bcdfd",0.97
"A","B",0.99,"asdf",0.98,"asdf",0.98,"bcdfd",0.97
"A","B",0.99,"asdf",0.98,"asdf",0.98,"asdf",0.98,"bcdfd",0.97

One could also reorder the operations so that a single use of the @csv operator gets a set of arrays (rather than repeatedly using @csv on single arrays):

jq -r 'map([ .c,.e,.score, (.v|map([.context, .score])) ] | flatten)[]|@csv' file
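As a cross-check, the flatten step can be sketched in Python (a rough analogue, not part of the answer); the `flatten` helper here mirrors jq's builtin of the same name, and the document is abbreviated to the first two test elements:

```python
import csv
import io

def flatten(value):
    # Rough analogue of jq's `flatten`: splice nested lists in recursively.
    out = []
    for item in value:
        if isinstance(item, list):
            out.extend(flatten(item))
        else:
            out.append(item)
    return out

# First two elements of the test document above (urls omitted for brevity).
doc = [
    {"c": "A", "e": "B", "score": 0.99,
     "v": [{"context": "asdf", "score": 0.98},
           {"context": "bcdfd", "score": 0.97}]},
    {"c": "A", "e": "B", "score": 0.99,
     "v": [{"context": "asdf", "score": 0.98},
           {"context": "asdf", "score": 0.98},
           {"context": "bcdfd", "score": 0.97}]},
]

buf = io.StringIO()
writer = csv.writer(buf, quoting=csv.QUOTE_NONNUMERIC)
for elem in doc:
    # [ .c,.e,.score, (.v|map([.context, .score])) ] | flatten
    nested = [elem["c"], elem["e"], elem["score"],
              [[v["context"], v["score"]] for v in elem["v"]]]
    writer.writerow(flatten(nested))
print(buf.getvalue())
```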
Discussions

How can I use jq to filter only certain fields of the original JSON into a CSV? (r/linuxquestions on Reddit, February 17, 2023)
In Ruby, "because it was there." cat foo.json | ruby -e 'require "json"; JSON.parse(STDIN.read).each do |line|; output=[]; line.each do |x,y|; output.push(y) if (x!="env"); end; puts output.join(","); end' Output: foo,1.24 bar,1.21 boo,1.23 far,1.24

Using jq to extract values from column-oriented JSON and format in CSV (unix.stackexchange.com, October 23, 2014)
I assume you unpack this with a trivial command like cat input.json | jq '.data' > data.json, since it isn't really relevant to what's being asked (transforming to CSV). ... I traversed through to the third level of data objects using the empty [] index field form and .dot notation.

How to concat multiple fields to same line with jq (stackoverflow.com)
You might wish to consider using one of @csv, @tsv, or join/1; or using the -j command-line option. This is all pretty clearly explained in the standard jq documentation (see e.g. https://stackoverflow.com/tags/jq/info), as is the use of select ...

Using jq to parse and display multiple fields in a json serially (stackoverflow.com)
The next expression, [.first, .last], ... and @csv to print all input arrays as tab-separated and comma-separated values, respectively. Similarly, it is possible to construct JSON values again, which is interesting if you just want to keep a subset of the fields: $ cat file.json | jq -c '.users[] ...

jq and multi-field output to CSV (lekkimworld.com, September 7, 2018)
$ sfdx force:schema:sobject:describe -s Account -u myorg --json | jq -r ".result.fields[] | .label, .name" Age Age__pc PO Box PO_Box__pc Postal Code Before City Postal_Code_Before_City__pc Street No Before Street Street_No_Before_Street__pc Archived State Archived_State__pc. The output is almost what I wanted, but I really wanted not to have to edit the file manually to build the output. Some quick googling and it appears that jq supports both CSV and tabular output from arrays, so fixing the issue was as simple as follows:

Reshaping JSON with jq (programminghistorian.org, May 24, 2016)
Both of these commands are wrapped ... This filter created new JSON. To produce a CSV table from this, we just need to add an array construction and the @csv command at the end of this filter. ...

jq: Select multiple keys (markhneedham.com, May 19, 2021)
And then if we want to go one step further, we could even convert that all into a CSV file using the @csv operator: cat swagger-clean.json | jq -r '.paths | keys[] as $k | [ (.[$k] | keys[] as $k1 | [$k, $k1, .[$k1].operationId, .[$k1].summary] ) ] | .[] | @csv' Job done and jq to the rescue again!

How can I use jq to filter only certain fields of the original JSON into a CSV? (r/linuxquestions on Reddit, February 17, 2023)

Hello, I saw this Stack Overflow answer with a simple jq command to convert a JSON to a CSV file, but I need to improve it further.

Say I have the following JSON:

[
    {
        "name": "foo",
        "env": "dev",
        "version": "1.24"
    },
    {
        "name": "bar",
        "env": "staging",
        "version": "1.21"
    },
    {
        "name": "boo",
        "env": "prod",
        "version": "1.23"
    },
    {
        "name": "far",
        "env": "prod",
        "version": "1.24"
    }
]

How does one create the CSV with only the "name" and "version" fields?

My current command is:

jq -r '(map(keys) | add | unique) as $cols | map(.[] | {name, version} as $row | $cols | map($row[.])) as $rows | $cols, $rows[] | @csv'

This is not working. Can anyone provide some help?

Thanks!
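No fix appears in this excerpt of the thread, but the intended transformation is easy to pin down. Here is a hedged Python sketch (not from the thread) over the question's sample data; in jq itself, something of the shape `-r '.[] | [.name, .version] | @csv'` is the usual way to keep just those two fields:

```python
import csv
import io
import json

# Sample input from the question.
doc = json.loads("""
[
  {"name": "foo", "env": "dev",     "version": "1.24"},
  {"name": "bar", "env": "staging", "version": "1.21"},
  {"name": "boo", "env": "prod",    "version": "1.23"},
  {"name": "far", "env": "prod",    "version": "1.24"}
]
""")

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["name", "version"])  # header row
for elem in doc:
    # Keep only the two requested fields; "env" is dropped.
    writer.writerow([elem["name"], elem["version"]])
print(buf.getvalue())
```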

JSON object values into CSV with jq (qmacro.org, May 19, 2022)
I'm likely to want to use this approach again some time, so I'll store the core construct here as a function in my local ~/.jq file (see the modules section of the manual for more detail): ... Now I can use that function wherever I want; here's a great place, because it also simplifies the entire invocation: jq --raw-output ' .value[] | onlyvalues | @csv ' entities.json

JQ Select Explained: Selecting elements from JSON with Examples (earthly.dev, July 24, 2023)
You convert from JSON to CSV. You can define your own functions and even find primes with jq: # Denoting the input by $n, which is assumed to be a positive integer, # eratosthenes/0 produces an array of primes less than or equal to $n: def eratosthenes: (. + 1) as $n | (($n|sqrt) / 2) as $s | [null, null, range(2; $n)] | reduce (2, 1 + (2 * range(1; $s))) as $i (.; erase($i)) | map(select(.));

Printing with jq multiple values in CSV or TSV formats (bashoneliners.com, October 13, 2023)
curl -s 'https://api.github.com/orgs/github/repos' | jq -r '.[] | [.id, .name, .stargazers_count] | @csv' ... The curl calls a GitHub API endpoint to get repository infos of the github organization in JSON format. The -s (or --silent) flag makes curl silent, suppressing messages about progress or errors. The JSON content we want to filter using the subsequent jq looks something like this (showing here just the fields relevant for the example, but there are many more):

Efficiently Selecting Multiple Fields With Jquery: A Comprehensive Guide (nhanvietluanvan.com, July 7, 2023)
A: Yes, you can use logical operators like AND (&&) and OR (||) to perform complex field selections with multiple criteria. For example, jq 'select(.status == "active" && .age > 30)' will select fields where the "status" is ...

Mastering jq: Selecting Multiple Fields from Arrays (copyprogramming.com, November 21, 2025)
The -c flag produces compact JSONL (JSON Lines) output, ideal for streaming processing and log files. For CSV export, pipe selected fields into the @csv filter: .[] | [.id, .title, .user.login] | @csv converts array elements to comma-separated values. Raw output mode (-r) extracts values as ...

jq 1.8 Manual (jqlang.org)
@csv: The input must be an array, and it is rendered as CSV with double quotes for strings, and quotes escaped by repetition. @tsv: The input must be an array, and it is rendered as TSV (tab-separated values).
Top answer (1 of 1):

Quick jq lesson
===========

jq filters are applied like this:
jq -r '.name_of_json_field_0 <optional filter>, .name_of_json_field_1 <optional filter>'
and so on and so forth. A single dot is the simplest filter; it leaves the data field untouched.

jq -r '.name_of_field | .'

You may also leave the filter field untouched for the same effect. In your case: jq -r '.name, .description' will extract the values of both those fields.

.[] will unwrap an array so that the next piped filter is applied to each unwrapped value. Example:
jq -r '.attributes | .[]'
extracts each object in the attributes array.

You may sometimes want to repackage objects in an array by surrounding the filter in brackets:
jq -r '[.name, .description, .date]'

You may sometimes want to repackage data in an object by surrounding the filter in curly braces:
jq -r '{new_field_name: .name, super_new_field_name: .description}'

Playing around with these, I was able to get

jq -r '[.name, .description, .date, (.attributes | [.[] | .trait_type] | @csv | gsub(",";";") | gsub("\"";"")), (.attributes | [.[] | .value] | @csv | gsub(",";";") | gsub("\"";""))] | @csv'

to give us:
"My Collection","This is a great collection.",1639717379161,"Background;Skin;Mouth;Eyes","Sand;Dark Brown;Smile Basic;Confused"

Name, description, and date were left as is, so let's break down the weird parts, one step at a time.

.attributes | [.[] | .trait_type]
.[] extracts each element of the attributes array and pipes the result into the next filter, which simply extracts trait_type; the extracted values are then re-packaged in an array.

.attributes | [.[] | .trait_type] | @csv
turns the array into a CSV-parsable format.

(.attributes | [.[] | .trait_type] | @csv | gsub(",";";") | gsub("\"";""))
Parens separate this from the rest of the evaluations, obviously. The first gsub here replaces commas with semicolons so they don't get interpreted as a separate field, the second removes all extra double quotes.
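As a sanity check, the same semicolon trick is easy to reproduce outside jq. Here is a Python sketch with illustrative data matching the answer's output (in jq itself, `join(";")` on the sub-array would also avoid the @csv-then-gsub round-trip):

```python
import csv
import io

# Hypothetical record mirroring the answer's metadata example.
record = {
    "name": "My Collection",
    "description": "This is a great collection.",
    "date": 1639717379161,
    "attributes": [
        {"trait_type": "Background", "value": "Sand"},
        {"trait_type": "Skin", "value": "Dark Brown"},
        {"trait_type": "Mouth", "value": "Smile Basic"},
        {"trait_type": "Eyes", "value": "Confused"},
    ],
}

# Like `[.[] | .trait_type] | @csv | gsub(",";";") | gsub("\"";"")`:
# join the sub-values with ";" so they occupy a single CSV field.
traits = ";".join(a["trait_type"] for a in record["attributes"])
values = ";".join(a["value"] for a in record["attributes"])

buf = io.StringIO()
csv.writer(buf, quoting=csv.QUOTE_NONNUMERIC).writerow(
    [record["name"], record["description"], record["date"], traits, values]
)
print(buf.getvalue().strip())
# "My Collection","This is a great collection.",1639717379161,"Background;Skin;Mouth;Eyes","Sand;Dark Brown;Smile Basic;Confused"
```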