First, obtain an array containing all the different object property names in your object array input. Those will be the columns of your CSV:

(map(keys) | add | unique) as $cols
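On a small illustrative input (data of my own, not from the question), this step collects every key that appears anywhere in the array:

```shell
# Two objects with partially overlapping key sets (illustrative sample data).
echo '[{"name":"a","size":1},{"name":"b","mode":"w"}]' |
  jq -c 'map(keys) | add | unique'
# ["mode","name","size"]
```

Note that keys (and unique) sort alphabetically, so the column order may differ from the objects' own key order.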

Then, for each object in the object array input, map the column names you obtained to the corresponding properties in the object. Those will be the rows of your CSV.

map(. as $row | $cols | map($row[.])) as $rows
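A sketch of this step on the same kind of made-up input: each object is bound to a variable and indexed with every column name, so missing keys come back as null and every row has the same width.

```shell
echo '[{"name":"a","size":1},{"name":"b","mode":"w"}]' |
  jq -c '(map(keys) | add | unique) as $cols
         | map(. as $row | $cols | map($row[.]))'
# [[null,"a",1],["w","b",null]]
```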

Finally, put the column names before the rows, as a header for the CSV, and pass the resulting row stream to the @csv filter.

$cols, $rows[] | @csv

All together now. Remember to use the -r flag to get the result as a raw string:

jq -r '(map(keys) | add | unique) as $cols | map(. as $row | $cols | map($row[.])) as $rows | $cols, $rows[] | @csv'
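Run on a small made-up input, the assembled program produces a header plus one row per object, with empty fields where a key is missing:

```shell
echo '[{"name":"a","size":1},{"name":"b","mode":"w"}]' |
  jq -r '(map(keys) | add | unique) as $cols
         | map(. as $row | $cols | map($row[.])) as $rows
         | $cols, $rows[] | @csv'
# "mode","name","size"
# ,"a",1
# "w","b",
```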
Answer from user3899165 on Stack Overflow
Top answer
1 of 9

2 of 9

The Skinny

jq -r '(.[0] | keys_unsorted) as $keys | $keys, map([.[ $keys[] ]])[] | @csv'

or:

jq -r '(.[0] | keys_unsorted) as $keys | ([$keys] + map([.[ $keys[] ]])) [] | @csv'

The Details

Aside

Describing the details is tricky because jq is stream-oriented: it operates on a sequence of JSON data rather than on a single value. The input JSON stream gets converted to some internal type which is passed through the filters, then encoded into an output stream at the program's end. The internal type isn't modeled by JSON, and doesn't exist as a named type. It's most easily demonstrated by examining the output of a bare index (.[]) or the comma operator. (Examining it directly could be done with a debugger, but that would be in terms of jq's internal data types, rather than the conceptual data types behind JSON.)

$ jq -cn '"a", "b"'
"a"
"b"

Note that the output isn't an array (which would be ["a", "b"]). Compact output (the -c option) shows that each array element (or argument to the , filter) becomes a separate object in the output (each is on a separate line).

A stream is like a JSON-seq, but uses newlines rather than RS as an output separator when encoded. Consequently, this internal type is referred to by the generic term "sequence" in this answer, with "stream" being reserved for the encoded input and output.

Constructing the Filter

The first object's keys can be extracted with:

.[0] | keys_unsorted

Keys will generally be kept in their original order, but preserving the exact order isn't guaranteed. Consequently, they will need to be used to index the objects to get the values in the same order. This will also prevent values being in the wrong columns if some objects have a different key order.
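The difference between the two key builtins is easy to see on a toy object (data mine): keys_unsorted preserves the object's own key order, while keys sorts alphabetically.

```shell
echo '{"z":26,"a":1}' | jq -c 'keys_unsorted'
# ["z","a"]
echo '{"z":26,"a":1}' | jq -c 'keys'
# ["a","z"]
```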

To both output the keys as the first row and make them available for indexing, they're stored in a variable. The next stage of the pipeline then references this variable and uses the comma operator to prepend the header to the output stream.

(.[0] | keys_unsorted) as $keys | $keys, ...

The expression after the comma is a little involved. The index operator on an object can take a sequence of strings (e.g. "name", "value"), returning a sequence of property values for those strings. $keys is an array, not a sequence, so [] is applied to convert it to a sequence,

$keys[]

which can then be passed to .[]

.[ $keys[] ]

This, too, produces a sequence, so the array constructor is used to convert it to an array.

[.[ $keys[] ]]

This expression is to be applied to a single object. map() is used to apply it to all objects in the outer array:

map([.[ $keys[] ]])

Lastly for this stage, this is converted to a sequence so each item becomes a separate row in the output.

map([.[ $keys[] ]])[]
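On a small sample array (my own data), this stage emits one array per input object, each a candidate CSV row:

```shell
echo '[{"a":1,"b":2},{"a":3,"b":4}]' |
  jq -c '(.[0] | keys_unsorted) as $keys | map([.[ $keys[] ]])[]'
# [1,2]
# [3,4]
```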

Why bundle the sequence into an array within the map only to unbundle it outside? map produces an array; .[ $keys[] ] produces a sequence. Applying map to the sequence from .[ $keys[] ] would conceptually produce an array of sequences of values, but since sequences aren't a JSON type, you instead get a single flattened array containing all the values:

["NSW","AU","state","New South Wales","AB","CA","province","Alberta","ABD","GB","council area","Aberdeenshire","AK","US","state","Alaska"]

The values from each object need to be kept separate, so that they become separate rows in the final output.
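The contrast is easy to reproduce on a toy input (data mine):

```shell
input='[{"a":1,"b":2},{"a":3,"b":4}]'
# Without the inner array constructor, the per-object values flatten together:
echo "$input" | jq -c '(.[0] | keys_unsorted) as $keys | map(.[ $keys[] ])'
# [1,2,3,4]
# With it, each object's values stay grouped as one row:
echo "$input" | jq -c '(.[0] | keys_unsorted) as $keys | map([.[ $keys[] ]])'
# [[1,2],[3,4]]
```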

Finally, the sequence is passed through the @csv formatter.

Alternate

The items can be separated late, rather than early. Instead of using the comma operator to get a sequence (passing a sequence as the right operand), the header sequence ($keys) can be wrapped in an array, and + used to append the array of values. This still needs to be converted to a sequence before being passed to @csv.
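On the same kind of toy input (data mine), the alternate form produces identical CSV:

```shell
echo '[{"a":1,"b":2},{"a":3,"b":4}]' |
  jq -r '(.[0] | keys_unsorted) as $keys
         | ([$keys] + map([.[ $keys[] ]]))[] | @csv'
# "a","b"
# 1,2
# 3,4
```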

Qmacro
qmacro.org › blog › posts › 2022 › 05 › 19 › json-object-values-into-csv-with-jq
JSON object values into CSV with jq - DJ Adams
May 19, 2022 - So that we better understand where we're heading, I want to introduce the @csv format string, which is described as follows: The input must be an array, and it is rendered as CSV with double quotes for strings, and quotes escaped by repetition.
Discussions

Using jq to extract values from column-oriented JSON and format in CSV - Unix & Linux Stack Exchange
jq has a filter, @csv, for converting an array to a CSV string. This filter takes into account most of the complexities associated with the CSV format, beginning with commas embedded in fields. More on unix.stackexchange.com
October 23, 2014
text processing - Convert JSON of arrays to CSV with headers using JQ - Unix & Linux Stack Exchange
Question related to Export JSON to CSV with Headers using JQ . Looking for preferably a generic answer. ... The object name is extracted using to_entries|map(.key). The object content is put inside an array and transposed in order to get arrays with element of each object. More on unix.stackexchange.com
January 30, 2020
Convert json to csv with jq tool
I need to convert a json document entirely into a csv format. It should be imported later in Excel for further filtering I've read, that the following should be possible: jq --raw-output 'to_entries | map([ .key, .val… More on community.unix.com
March 26, 2022
Need help with using Array value as input for `jq`
echo '{json}' | jq Is that what you want to do ? Do you have to work with bash ? Python, node, ruby would handle structured data much better. More on reddit.com
r/bash
November 26, 2018
Top answer
1 of 2

You may pick the headers as an array of strings from the keys of the first element in the list, then extract all elements' values as separate arrays. Applying the @csv output operator to each element of the resulting list will CSV-quote the data (jq quotes all strings, but not booleans or numbers):

$ jq -r '[first|keys_unsorted] + map([.[]]) | .[] | @csv' file
"bytes","checked"
276697,false
276697,false

Or,

$ jq -r '(first|keys_unsorted), (.[]|[.[]]) | @csv' file
"bytes","checked"
276697,false
276697,false

Or,

$ jq -r '(first|keys_unsorted), map(map(.))[] | @csv' file
"bytes","checked"
276697,false
276697,false

Or any other way to extract the values into separate arrays.

Note that this relies on the keys occurring in the same order throughout the input data.

However, it's even easier with Miller (mlr):

$ mlr --j2c cat file
bytes,checked
276697,false
276697,false

This simply passes the data through Miller's cat command (which, when used like this, does not modify the data), while converting from JSON to CSV using the --j2c option (short for --ijson --ocsv). Note that since Miller is properly CSV-aware, it only quotes the fields that actually need quoting.

You may also get a nicely formatted table by choosing the pretty-printed output format together with --barred:

$ mlr --j2p --barred cat file
+--------+---------+
| bytes  | checked |
+--------+---------+
| 276697 | false   |
| 276697 | false   |
+--------+---------+

(--j2p is short for --ijson --opprint.)

Or without --barred:

$ mlr --j2p cat file
bytes  checked
276697 false
276697 false
2 of 2

I got it:

cat file.json | jq '(.[0] | keys), .[] | join(",")'

It looks like you can surround any part in parentheses to prevent it from "consuming" the rest of the stream (I don't know if that's what it's called, or even if I got it right here, because I couldn't find anything in jq's documentation and had to piece everything together from bits and bobs scattered throughout various blogs and Stack Overflow posts, so if there's a "proper" way to do it, please let me know).
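The grouping behavior is documented operator precedence: in jq, the pipe | binds less tightly than the comma, so a, b | c parses as (a, b) | c, and parentheses limit how far a pipe reaches. A contrived contrast (data mine):

```shell
# Without parentheses, the trailing filter applies to both comma alternatives:
echo '{"x":[1,2]}' | jq -c '.x, .x | length'
# 2
# 2
# With parentheses, length applies only inside the group:
echo '{"x":[1,2]}' | jq -c '(.x | length), .x'
# 2
# [1,2]
```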

By the way, if you have a shell for loop like I did, use the -s option to combine the separate JSON objects into a single array:

$ for i in {3,4,5,8} 
do rclone rc core/stats --rc-user USER --rc-pass PASS --rc-addr :557$i
done | jq -rs '(.[0] | keys), .[] | join(",")' | column -ts,

bytes      checks  deletedDirs  deletes  elapsedTime       errors  eta  fatalError  renames  retryError  speed               totalBytes  totalChecks  totalTransfers  transferTime  transfers
1660182    0       0            0        258038.009782457  0       0    false       0        false       1664.9322505627426  1660182     0            6               0             6
407752609  0       0            0        258038.054874325  0       0    false       0        false       10615.04533495996   407752609   0            86              0             86
7403585    0       0            0        258038.103563555  0       0    false       0        false       20892.381593377457  7403585     0            2               0             2
0          0       0            0        258038.156466825  0            false       0        false       0                   0           0            0               0             0
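The effect of -s (slurp) can be seen in isolation with toy objects of my own:

```shell
# Two separate JSON documents on stdin become one array with -s:
printf '%s\n' '{"a":1}' '{"a":2}' | jq -cs '.'
# [{"a":1},{"a":2}]
```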

Richrose
richrose.dev › posts › linux › jq › jq-json2csv
Convert JSON to CSV with JQ - richrose.dev
March 5, 2021 - Select the information that is to be displayed in the CSV. In our example, the fields id, item and description are used. ... NOTE: Here jq is being asked to read the top level element stock. We then use the pipe command (i.e. |) to pass the result as a parameter to the next command and perform another parse.
Vic Vijayakumar
vicvijayakumar.com › blog › convert-api-responses-from-json-to-csv-with-jq
Convert JSON API responses into CSV with jq • Vic Vijayakumar
November 28, 2020 - Unfortunately this has made this response hard to read, so we are going to ask jq to arrange the output as individual rows. jq has a built-in formatter called @csv which can operate on an input which is provided as an array of elements. Let’s convert our output to an array so we can pipe it to @csv.
Programming Historian
programminghistorian.org › en › lessons › json-and-jq
Reshaping JSON with jq | Programming Historian
May 24, 2016 - To format this as CSV, add the operator @csv on the end with another pipe and check the “Raw Output” box in the upper right. @csv properly joins the arrays with , and adds quotes where needed. “Raw Output” tells jq that we want to produce a text file, rather than a new JSON file.
JDriven
jdriven.com › blog › 2023 › 10 › jq-Joy-Converting-Json-To-CSV
jq Joy: Converting JSON to CSV - JDriven Blog
October 31, 2023 - We’d like to extract the title, ISBN, edition count, publish year and authors data and export it as CSV. To do this, we will flatten the structure of the JSON file by selecting the list of books and then specifying the fields from the selected items. cat java-books.json | jq '.works[] | [ .title, .availability.isbn, .edition_count, .first_publish_year, .authors[0].name, .authors[1].name, .authors[2].name]'
Find elsewhere
Sanity
sanity.io › blog › exporting-your-structured-content-as-csv-using-jq-in-the-command-line
Exporting your structured content as CSV using JQ in the command line | Sanity
$cols, $rows[] | @csv puts the column headers first in the array, and then each of the arrays that are transformed to lines by piping them to @csv , which formats the output as…
Phpfog
phpfog.com › home › blog › convert json to csv with jq
Convert JSON to CSV with jq - PHPFog.com
January 29, 2021 - echo '{"key1":"val1", "myarray":["abc", "def", "ghi"]}' | jq -r '[.key1, .myarray[0]] | @csv' "val1","abc" By default each CSV record as well as all its fields will be enclosed in quotes.
Hacker News
news.ycombinator.com › item
`jq` can transform CSV to JSON and vice-versa, especially for simple/naive data ... | Hacker News
June 3, 2021 - First attempt is to simply read each line in as raw and split on `,` - sort of does the job of, but it isn't the array of arrays that you might expect: · In the reverse direction there's a builtin `@csv` format string. This can be use with the second example above to say "turn each array into ...
xvdd
xvdd.github.io › posts › convert-a-json-array-to-csv-with-jq
Convert a JSON array to CSV with jq | xvdd
August 12, 2022 - jq can help automate the formatting, using the csv formatter (tsv is another option) that produce data easily imported in Excel. A first step may be required to produce a flat (without sublevel) array of JSON objects of the same structure.
Medium
medium.com › free-code-camp › how-to-transform-json-to-csv-using-jq-in-the-command-line-4fa7939558bf
How to transform JSON to CSV using jq in the command line | by Knut Melvær | We’ve moved to freeCodeCamp.org/news | Medium
September 14, 2018 - It appends the values to an array, which gives you an array of arrays with the values, which is what you want when you're transforming JSON into CSV.
Today I Learned
til.hashrocket.com › posts › fn98hbc5re-how-to-convert-json-to-csv-with-jq
How to convert JSON to CSV with jq - Today I Learned
July 3, 2021 - I had this json file that was an array of objects and some of those objects had different keys. I wanted to visualize that data in a spreadsheet (data science, AI, machine learning stuff) so I thought about having a CSV file where each JSON key would become a CSV column. // file.json [ { "type": "Event1", "time": 20 }, { "type": "Event2", "distance": 100 } ] jq to the rescue: cat file.json | jq -r '(map(keys) | add | unique) as $cols | map(.
Bomberbot
bomberbot.com › data-science › how-to-transform-json-to-csv-using-jq-in-the-command-line
How to Transform JSON to CSV Using jq in the Command Line: The Ultimate Guide - Bomberbot
November 3, 2024 - as $rows stores the array of CSV rows in a variable named $rows · Finally, we combine the $cols and $rows variables and pipe them to the @csv function to produce the final CSV output: cat data.json | jq -r ‘ (map(keys) | add | unique) as $cols | map(.
HugeDomains
padmanabha.hashnode.dev › converting-nested-json-array-to-csv-with-jq
Converting Nested JSON Array to CSV with JQ
January 7, 2023
Nem
blog.nem.ec › code-snippets › jq-convert-to-csv
Convert JSON to CSV with jq - nem.ec
The deconstruction into a row of data happens at this section: [.url, .method] This should match the order and size of the header row to get a proper CSV file. The final selection, @csv is a special jq command that converts the list of arrays into a CSV format.
Medium
staffeng.medium.com › jq-json-extract-arrays-4560c2dcf051
jq & JSON : extract arrays. Given this json, extract cpu array… | by Natarajan Santhosh | Medium
August 1, 2023 - jq & JSON : extract arrays Given this json, extract cpu array, duration and functions array as csv { "__criblEventType": "stats", "__ctrlFields": [], "__final": false, "__cloneCount": 0, …