Resolved this on GitHub:
.[0] as $keys |
.[1] as $values |
reduce range(0; $keys|length) as $i ({}; . + { ($keys[$i]): $values[$i] })
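To see the filter in action, here is a quick run on hypothetical data (the sample arrays are invented for illustration; any pair of equal-length key/value arrays works the same way):

```shell
# First array holds the keys, second the values.
echo '[["a","b"],[1,2]]' | jq -c '
  .[0] as $keys |
  .[1] as $values |
  reduce range(0; $keys|length) as $i ({}; . + { ($keys[$i]): $values[$i] })'
# -> {"a":1,"b":2}
```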
Answer from Abdullah Jibaly on Stack Overflow to "Create object from array of keys and values".
The current version of jq has a transpose filter that can be used to pair up the keys and values. You could use it to build out the result object rather easily.
transpose | reduce .[] as $pair ({}; .[$pair[0]] = $pair[1])
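A minimal run of the transpose variant, again on invented sample data:

```shell
# transpose turns [["a","b"],[1,2]] into [["a",1],["b",2]],
# then each pair becomes a key/value entry in the result object.
echo '[["a","b"],[1,2]]' |
jq -c 'transpose | reduce .[] as $pair ({}; .[$pair[0]] = $pair[1])'
# -> {"a":1,"b":2}
```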
This was actually relatively straightforward:
.things | .[] | {name: .name, category: .params | .[] | select(.key=="category") | .value }
Your params almost look like key/value entries, so you can turn them into an object by passing the array to from_entries. Combining everything, you merely need:
.things | map({name} + (.params | from_entries))
This yields:
[
{
"name": "foo",
"key1": "val1",
"category": "thefoocategory"
},
{
"name": "bar",
"key1": "val1",
"category": "thebarcategory"
}
]
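For reference, an input of the following shape produces the output above (the input is reconstructed from the result shown; the original question's exact field order may differ):

```shell
echo '{"things":[
  {"name":"foo","params":[{"key":"key1","value":"val1"},{"key":"category","value":"thefoocategory"}]},
  {"name":"bar","params":[{"key":"key1","value":"val1"},{"key":"category","value":"thebarcategory"}]}]}' |
jq -c '.things | map({name} + (.params | from_entries))'
# -> [{"name":"foo","key1":"val1","category":"thefoocategory"},
#     {"name":"bar","key1":"val1","category":"thebarcategory"}]
```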
The |= . + part of the filter appends a new element to the existing array. You can use jq with a filter like:
jq '.data.messages[3] |= . + {
"date": "2010-01-07T19:55:99.999Z",
"xml": "xml_samplesheet_2017_01_07_run_09.xml",
"status": "OKKK",
"message": "metadata loaded into iRODS successfullyyyyy"
}' inputJson
To avoid hardcoding the index 3 and instead append dynamically, use .data.messages | length, which returns the array's length; since indices are zero-based, that value is the next free index, i.e.,
jq '.data.messages[.data.messages| length] |= . + {
"date": "2010-01-07T19:55:99.999Z",
"xml": "xml_samplesheet_2017_01_07_run_09.xml",
"status": "OKKK",
"message": "metadata loaded into iRODS successfullyyyyy"
}' inputJson
Or, as per peak's suggestion in the comments, use the += operator alone:
jq '.data.messages += [{
"date": "2010-01-07T19:55:99.999Z",
"xml": "xml_samplesheet_2017_01_07_run_09.xml",
"status": "OKKK",
"message": "metadata loaded into iRODS successfullyyyyy"
}]'
which produces the output you need:
{
"report": "1.0",
"data": {
"date": "2010-01-07",
"messages": [
{
"date": "2010-01-07T19:58:42.949Z",
"xml": "xml_samplesheet_2017_01_07_run_09.xml",
"status": "OK",
"message": "metadata loaded into iRODS successfully"
},
{
"date": "2010-01-07T20:22:46.949Z",
"xml": "xml_samplesheet_2017_01_07_run_09.xml",
"status": "NOK",
"message": "metadata duplicated into iRODS"
},
{
"date": "2010-01-07T22:11:55.949Z",
"xml": "xml_samplesheet_2017_01_07_run_09.xml",
"status": "NOK",
"message": "metadata was not validated by XSD schema"
},
{
"date": "2010-01-07T19:55:99.999Z",
"xml": "xml_samplesheet_2017_01_07_run_09.xml",
"status": "OKKK",
"message": "metadata loaded into iRODS successfullyyyyy"
}
]
}
}
Use jqplay.org to dry-run your jq filter and optimize it any way you want.
Rather than using |=, consider using +=:
.data.messages += [{"date": "2010-01-07T19:55:99.999Z",
"xml": "xml_samplesheet_2017_01_07_run_09.xml",
"status": "OKKK", "message": "metadata loaded into iRODS successfullyyyyy"}]
Prepend
On the other hand, if (as @NicHuang asked) you want to add the JSON object to the beginning of the array, you could use the pattern:
.data.messages |= [ _ ] + .
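A sketch of the prepend pattern with the placeholder filled in (the messages array here is a minimal invented example, not the question's data):

```shell
# Prepend a new object: wrap it in an array and concatenate the old array after it.
echo '{"data":{"messages":[{"status":"OK"}]}}' |
jq -c '.data.messages |= [{"status":"NEW"}] + .'
# -> {"data":{"messages":[{"status":"NEW"},{"status":"OK"}]}}
```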
If your input is a stream of objects, then unless your jq has inputs, the objects must be "slurped", e.g. using the -s command-line option, in order to combine them.
Thus one way to combine objects in the input stream is to use:
jq -s add
For the second problem, creating an array:
jq -s .
There are of course other alternatives, but these are simple and do not require the most recent version of jq. With jq 1.5 and later, you can use 'inputs', e.g. jq -n '[inputs]'
Efficient solution
For the first problem (reduction), rather than slurping (whether via the -s option, or using [inputs]), it would be more efficient to use reduce with inputs and the -n command-line option. For example, to combine the stream of objects into a single object:
jq -n 'reduce inputs as $in (null; . + $in)'
Equivalently, without --null-input:
jq 'reduce inputs as $in (.; . + $in)'
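A concrete run of the reduce-with-inputs approach on an invented two-object stream:

```shell
# -n prevents jq from consuming the first input as "."; inputs then
# yields every object in the stream, which are merged one by one.
printf '%s\n' '{"a":1}' '{"b":2}' |
jq -nc 'reduce inputs as $in (null; . + $in)'
# -> {"a":1,"b":2}
```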
An alternative to slurping with the -s command-line option is to use the inputs filter, like so:
jq -n '[inputs] | add'
This will produce an object with all the input objects combined.
map( { (.name|tostring): . } ) | add
(The tostring is for safety/robustness.)
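For example, on an invented array of named objects this keys each element by its name:

```shell
echo '[{"name":"a","v":1},{"name":"b","v":2}]' |
jq -c 'map({ (.name|tostring): . }) | add'
# -> {"a":{"name":"a","v":1},"b":{"name":"b","v":2}}
```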
INDEX/1
If your jq has INDEX/1 (introduced in version 1.6), you can simply write:
INDEX(.name)
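On the same invented input as above, INDEX(.name) gives the identical result (requires jq 1.6 or later):

```shell
echo '[{"name":"a","v":1},{"name":"b","v":2}]' | jq -c 'INDEX(.name)'
# -> {"a":{"name":"a","v":1},"b":{"name":"b","v":2}}
```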
Just build up a new object going through the items in the array. Add the items to the object with the name as the key.
reduce .[] as $i ({}; .[$i.name] = $i)
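The reduce version behaves the same way; a quick check on invented data:

```shell
# Start from {} and set one key per array element, using .name as the key.
echo '[{"name":"a","v":1},{"name":"b","v":2}]' |
jq -c 'reduce .[] as $i ({}; .[$i.name] = $i)'
# -> {"a":{"name":"a","v":1},"b":{"name":"b","v":2}}
```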
jq can read multiple input values from stdin, so you can pipe the whole output of the loop to it:
for service in $services ; do
curl "$service/path"
done | jq -r '.[]|[.id,.startDate,.calls]|@csv'
Note that the csv transformation can be done by @csv
As @hek2mlg pointed out, it should be possible to invoke jq just once. If the input is sufficiently uniform (admittedly, maybe a big "if"), you could even avoid having to name the fields explicitly, e.g.:
$ for service in $services ; do
curl "$service/path"
done | jq -sr 'add[] | [.[]] | @csv'
Output:
"123","2016-12-09T00:00:00Z",4
"456","2016-12-09T00:00:00Z",22
"789","2016-12-09T00:00:00Z",8
"147","2016-12-09T00:00:00Z",10
Note that using -s allows you to perform arbitrary computations on all the inputs, e.g. counting them.
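For instance, counting a stream of objects (the three sample objects are invented):

```shell
# -s slurps the whole stream into one array, so length counts the inputs.
printf '%s\n' '{"a":1}' '{"b":2}' '{"c":3}' | jq -s 'length'
# -> 3
```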
This trick with the jq 1.5 inputs streaming filter seems to do it
... | jq -n '.items |= [inputs]'
Ex.
$ find ~/ -maxdepth 1 -name "D*" |
while read line; do
jq -n --arg name "$(basename "$line")" \
--arg path "$line" \
'{name: $name, path: $path}'
done | jq -n '.items |= [inputs]'
{
"items": [
{
"name": "Downloads",
"path": "/home/steeldriver/Downloads"
},
{
"name": "Desktop",
"path": "/home/steeldriver/Desktop"
},
{
"name": "Documents",
"path": "/home/steeldriver/Documents"
}
]
}
Calling jq directly from find, and then collecting the resulting data with jq to construct the final output, without any shell loops:
find ~ -maxdepth 1 -name '[[:upper:]]*' \
-exec jq -n --arg path {} '{ name: ($path|sub(".*/"; "")), path: $path }' \; |
jq -n -s '{ items: inputs }'
The jq that is being executed via -exec creates a JSON object per found pathname. It strips off everything in the pathname up to the last slash for the name value, and uses the pathname as is for the path value.
The final jq reads the data from find into an array with -s, and simply inserts it as the items array in a new JSON object. The final jq invocation could also be written jq -n '{ items: [inputs] }'.
Example result (note that I was using [[:upper:]]* in place of D* for the -name pattern with find):
{
"items": [
{
"name": "Documents",
"path": "/home/myself/Documents"
},
{
"name": "Mail",
"path": "/home/myself/Mail"
},
{
"name": "Work",
"path": "/home/myself/Work"
}
]
}
I've got a JSONL file:
{"mainobject1": "maindata1", "array1": [{"object1": "data1"}, {"object2": "data2"}]}
{"mainobject1": "maindata2", "array1": [{"object1": "data1"}, {"object3": "data3"}]}
What I'm doing:
jq -r '[.mainobject1, .array1[].object1, .array1[].object2, .array1[].object3] | @csv' file.json
What I want to get
maindata1,data1,data2,data3(null)
maindata2,data1,data2(null),data3
What I'm getting
maindata1,data1,null,null,data2,null,null
maindata2,data1,null,null,null,null,data3
Any ideas how to resolve this?