In jq 1.3 and up you can use the --arg VARIABLE VALUE command-line option:
jq -n --arg v "$VAR" '{"foo": $v}'
That is, --arg sets a jq variable to the given value, so you can use $varname inside your jq program instead of interpolating shell variables into it (which is fragile and unsafe).
EDIT: From jq 1.5 and up, you can use --argjson to pass in an array directly, e.g.
jq -n --argjson v '[1,2,3]' '{"foo": $v}'
(Answer from user2259432 on Stack Overflow.)
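A quick sketch of the difference between the two options (VAR is a placeholder shell variable; -c is added only to make the one-line output comments accurate):

```shell
# --arg always passes a string; --argjson parses the value as JSON.
VAR='hello world'
jq -nc --arg v "$VAR" '{"foo": $v}'        # prints {"foo":"hello world"}
jq -nc --argjson v '[1,2,3]' '{"foo": $v}' # prints {"foo":[1,2,3]}
```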
Once you have your variable loaded, you should use the split filter to split that string into an array.
$ jq -n --arg inarr "${ARR}" '{ arr: $inarr | split("\n") }'
The problem is that jq is still just outputting lines of text; you can't necessarily preserve each array element as a single unit. That said, as long as a newline is not a valid character in any object, you can still output each object on a separate line.
get_json_array | jq -c '.[]' | while read object; do
api_call "$object"
done
Under that assumption, you could use the readarray command in bash 4 to build an array:
readarray -t conversations < <(get_json_array | jq -c '.[]')
for conversation in "${conversations[@]}"; do
api_call "$conversation"
done
Here is a solution without loops (caution: it relies on word splitting, so it only works when the values contain no whitespace or glob characters):
json=$(jq -c '.my_key[]' ./my-file.json)
json_without_quotes=${json//\"/}   # strip the double quotes
declare -a all_apps_array=($(echo $json_without_quotes | tr "\n" " "))
Split your bash array into NUL-delimited items using printf '%s\0', then read the raw stream using -R or --raw-input and, within your jq filter, split it into an array on the delimiter "\u0000" (dropping the empty element left by the trailing NUL):
printf '%s\0' "${IDS[@]}" | jq -Rs '
split("\u0000")[:-1] | map({id: ., names: ["bob", "sally"]})
'
Alternatively, print one ID per line (assuming no ID contains a newline) and collect them with inputs:
for id in "${IDS[@]}" ; do
echo "$id"
done | jq -nR '[ {id: inputs, names: ["bob", "sally"]} ]'
or as a one-liner:
printf "%s\n" "${IDS[@]}" | jq -nR '[{id: inputs, names: ["bob", "sally"]}]'
I'm using this in a cURL to get the data from result[]:
foo=$(curl --request GET \
  --silent \
  --url https://example.com \
  --header 'Content-Type: application/json' | jq -r '.result[]')
When I print $foo, this is what I have:
[key]
default
firewall_custom
zone
34
[
{
"id": "example",
"version": "6",
"action": "block",
"expression": "lorem",
"description": "ipsum",
"last_updated": "2024-08-15T19:10:24.913784Z",
"ref": "example",
"enabled": true
},
{
"id": "example2",
"version": "7",
"action": "block",
"expression": "this",
"description": "that",
"last_updated": "2024-08-15T19:10:24.913784Z",
"ref": "example2",
"enabled": true
}
]
What I need from this is to create a loop where, in a series of additional cURLs, I can insert action, expression, and description.
I'm imagining that I would push these to 3 separate arrays (action, expression, and description), so that ${action[0]} would coincide with ${expression[0]} and ${description[0]}, and so on.
Something along the lines of:
# assuming that I have somehow created the following arrays:
# action=("block" "block")
# expression=("lorem" "this")
# description=("ipsum" "that")
for x in "${!action[@]}"; do
bar=$(curl --request GET \
--silent \
--url https://example.com \
--data "$(jq -n \
--arg action "${action[$x]}" \
--arg expression "${expression[$x]}" \
--arg description "${description[$x]}" \
'{action: $action, expression: $expression, description: $description}')" | jq '.success')
if [[ $bar == true ]]
then
printf '%s succeeded\n' "$x"
else
printf '%s failed\n' "$x"
fi
# reset bar
bar=''
done
The question is, how to create the action, expression, and description arrays from the results of $foo (that original cURL)?
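One way to build those three parallel arrays is to read each field NUL-delimited (a sketch; the inline $json stands in for the output of the original cURL, and the field names come from the question; mapfile -d '' requires bash 4.4+):

```shell
#!/usr/bin/env bash
# Inline stand-in for the original curl response.
json='[{"action":"block","expression":"lorem","description":"ipsum"},
       {"action":"block","expression":"this","description":"that"}]'

# jq -j emits each field followed by a NUL byte, so the values may
# safely contain spaces or newlines.
mapfile -t -d '' action      < <(jq -j '.[] | (.action, "\u0000")'      <<<"$json")
mapfile -t -d '' expression  < <(jq -j '.[] | (.expression, "\u0000")'  <<<"$json")
mapfile -t -d '' description < <(jq -j '.[] | (.description, "\u0000")' <<<"$json")

# The arrays stay index-aligned: action[0] pairs with expression[0], etc.
for i in "${!action[@]}"; do
  printf '%s / %s / %s\n' "${action[i]}" "${expression[i]}" "${description[i]}"
done
# block / lorem / ipsum
# block / this / that
```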
We can solve this problem in two ways.
Input file:
// test.json
{
"keys": ["key1","key2","key3"]
}
Approach 1:
1) Use jq -r (output raw strings, not JSON texts).
KEYS=$(jq -r '.keys' test.json)
echo $KEYS
# Output: [ "key1", "key2", "key3" ]
2) Use @sh, which converts the input array to a series of space-separated, shell-quoted strings. This removes the square brackets [] and commas from the output.
KEYS=$(<test.json jq -r '.keys | @sh')
echo $KEYS
# Output: 'key1' 'key2' 'key3'
3) Use tr to remove the single quotes from the output; the -d option deletes the specified characters.
KEYS=$(jq -r '.keys | @sh' test.json | tr -d \')
echo $KEYS
# Output: key1 key2 key3
4) We can convert the space-separated string to an array by placing the output in round brackets (). This is also called a compound assignment, where we declare the array with a list of values.
ARRAYNAME=(value1 value2 .... valueN)
#!/bin/bash
KEYS=($(jq -r '.keys | @sh' test.json | tr -d \'\"))
echo "Array size: " ${#KEYS[@]}
echo "Array elements: "${KEYS[@]}
# Output:
# Array size: 3
# Array elements: key1 key2 key3
Approach 2:
1) Use jq -r to get the string output, then use tr to delete the square brackets, double quotes, and commas.
#!/bin/bash
KEYS=$(jq -r '.keys' test.json | tr -d '[],"')
echo $KEYS
# Output: key1 key2 key3
2) Then we can convert the resulting whitespace-separated output to an array by placing it in round brackets ().
#!/bin/bash
KEYS=($(jq -r '.keys' test.json | tr -d '[]," '))
echo "Array size: " ${#KEYS[@]}
echo "Array elements: "${KEYS[@]}
# Output:
# Array size: 3
# Array elements: key1 key2 key3
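Note that both tr-based approaches mangle any key that contains spaces or one of the deleted characters. If the elements are known not to contain newlines, a safer sketch reads one element per line instead:

```shell
# Print one array element per line with jq -r '.keys[]' and let
# readarray -t collect them verbatim (spaces survive; newlines would not).
readarray -t KEYS < <(jq -r '.keys[]' <<<'{"keys": ["key 1", "key2", "key3"]}')
printf '<%s>\n' "${KEYS[@]}"
# <key 1>
# <key2>
# <key3>
```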
To correctly parse values that may have newlines (and any other arbitrary (non-NUL) characters) use jq's @sh filter to generate space-separated quoted strings, and Bash's declare -a to parse the quoted strings as array elements. (No pre-processing required)
foo.json:
{"data": ["$0", " \t\n", "*", "\"", ""]}
str=$(jq -r '.data | @sh' foo.json)
declare -a arr="($str)" # must be quoted like this
declare -p arr
# declare -a arr=([0]="\$0" [1]=$' \t\n' [2]="*" [3]="\"" [4]="")
Update: jq 1.7 (2023-09)
As of version 1.7, jq has a --raw-output0 option, enabling it to output null-terminated strings which can be read into an array as usual:
mapfile -d '' arr < <(jq --raw-output0 '.data[]' foo.json)
wait "$!" # use in bash-4.4+ to get exit status of the process substitution
Note on NUL characters in JSON strings
JSON strings may contain NUL characters while shell variables cannot. If your JSON input may contain NUL's, you may need to add some special handling.
When using the @sh filter, NUL characters in JSON strings will be silently replaced with the sequence \0. Note that this makes the JSON strings "\\0" and "\u0000" indistinguishable.
When using the --raw-output0 option, NUL characters will trigger an error and jq will terminate with an exit status of 5.
Reading multiple/nested arrays
The @sh filter can be combined with --raw-output0 to reliably read multiple arrays at once (or a single nested array) as it will produce a NUL-separated list of space-separated quoted strings.
json='[[1,2],[3,4]]' i=0
while read -r -d ''; do
declare -a "arr$((i++))=($REPLY)"
done < <(jq --raw-output0 '.[]|@sh' <<<$json)
for ((n=0; n<i; n++)); { declare -p "arr$n"; }
# declare -a arr0=([0]="1" [1]="2")
# declare -a arr1=([0]="3" [1]="4")
Using jq:
readarray -t arr < <(jq -r '.[].item2' json)
printf '%s\n' "${arr[@]}"
If you need a more hardened way that copes with newlines or other special characters in the values, and avoids word splitting, read NUL-delimited output instead:
readarray -td '' arr < <(jq -j '.[].item2 | (., "\u0000")' json)
Output:
value2
value2_2
Check:
Process substitution >(command ...) or <(...) is replaced by a filename (typically a /dev/fd path or a named pipe). Writing to or reading from that file pipes the bytes to or from the command inside. It is often combined with file redirection: cmd1 2> >(cmd2).
See http://mywiki.wooledge.org/ProcessSubstitution http://mywiki.wooledge.org/BashFAQ/024
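A minimal, self-contained sketch of that pattern (inline JSON stands in for the json file above):

```shell
# readarray runs in the current shell because jq's output arrives through
# process substitution; piping jq INTO readarray would fill the array in a
# subshell and lose it.
readarray -t arr < <(jq -r '.[].item2' <<<'[{"item2":"value2"},{"item2":"value2_2"}]')
printf '%s\n' "${arr[@]}"
# value2
# value2_2
```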
The following is actually buggy:
# BAD: Output line of * is replaced with list of local files; can't deal with whitespace
arr=( $( curl -k "$url" | jq -r '.[].item2' ) )
If you have bash 4.4 or newer, a best-of-all-worlds option is available:
# BEST: Supports bash 4.4+, with failure detection and newlines in data
{ readarray -t -d '' arr && wait "$!"; } < <(
set -o pipefail
curl --fail -k "$url" | jq -j '.[].item2 | (., "\u0000")'
)
...whereas with bash 4.0, you can have terseness at the cost of failure detection and literal newline support:
# OK (with bash 4.0), but can't detect failure and doesn't support values with newlines
readarray -t arr < <(curl -k "$url" | jq -r '.[].item2' )
...or bash 3.x compatibility and failure detection, but without newline support:
# OK: Supports bash 3.x; no support for newlines in values, but can detect failures
IFS=$'\n' read -r -d '' -a arr < <(
set -o pipefail
curl --fail -k "$url" | jq -r '.[].item2' && printf '\0'
)
...or bash 3.x compatibility and newline support, but without failure detection:
# OK: Supports bash 3.x and supports newlines in values; does not detect failures
arr=( )
while IFS= read -r -d '' item; do
arr+=( "$item" )
done < <(curl --fail -k "$url" | jq -j '.[] | (.item2, "\u0000")')
I am writing a bash script for an Alfred Workflow. In this script I get a list (separated with newlines) from a command that I want to convert into a specific JSON format.
I tried storing the output into an array and parsing that in jq like that:
Command output:
$ piactl get regions
auto
france
netherlands
Create array:
$ IFS=$'\n'
$ regions=($(piactl get regions))
$ echo "${regions[@]}"
auto france netherlands
Parse to jq:
$ jq -n --arg inarr "${regions}" '{ items: $inarr | split("\n") }'
{
"items": [
"auto"
]
}
jq only outputs one item of the array, and I don't know how to shape the JSON like the wanted output shown below.
Wanted output:
{"items": [
{
"title": "auto",
"arg": "auto",
"icon": {
"path": "icon.png"
}
},
{
"title": "france",
"arg": "france",
"icon": {
"path": "icon.png"
}
},
{
"title": "netherlands",
"arg": "netherlands",
"icon": {
"path": "icon.png"
}
}
]}
Can somebody help me craft the correct jq arguments for this task?
This trick with the jq 1.5 inputs streaming filter seems to do it
... | jq -n '.items |= [inputs]'
Ex.
$ find ~/ -maxdepth 1 -name "D*" |
while read line; do
jq -n --arg name "$(basename "$line")" \
--arg path "$line" \
'{name: $name, path: $path}'
done | jq -n '.items |= [inputs]'
{
"items": [
{
"name": "Downloads",
"path": "/home/steeldriver/Downloads"
},
{
"name": "Desktop",
"path": "/home/steeldriver/Desktop"
},
{
"name": "Documents",
"path": "/home/steeldriver/Documents"
}
]
}
Calling jq directly from find, and then collecting the resulting data with jq to construct the final output, without any shell loops:
find ~ -maxdepth 1 -name '[[:upper:]]*' \
-exec jq -n --arg path {} '{ name: ($path|sub(".*/"; "")), path: $path }' \; |
jq -n -s '{ items: inputs }'
The jq that is being executed via -exec creates a JSON object per found pathname. It strips off everything in the pathname up to the last slash for the name value, and uses the pathname as is for the path value.
The final jq reads the data from find into an array with -s, and simply inserts it as the items array in a new JSON object. The final jq invocation could also be written jq -n '{ items: [inputs] }'.
Example result (note that I was using [[:upper:]]* in place of D* for the -name pattern with find):
{
"items": [
{
"name": "Documents",
"path": "/home/myself/Documents"
},
{
"name": "Mail",
"path": "/home/myself/Mail"
},
{
"name": "Work",
"path": "/home/myself/Work"
}
]
}
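Applied to the original piactl question, no shell loop is needed at all; a single jq invocation with --raw-input can shape the whole Alfred document (a sketch; printf stands in for piactl get regions here):

```shell
# Simulated "piactl get regions" output; swap printf for the real command.
# -R reads raw lines, and inputs yields each line as a string.
printf '%s\n' auto france netherlands |
  jq -Rn '{items: [inputs | {title: ., arg: ., icon: {path: "icon.png"}}]}'
```

This produces the wanted {"items": [...]} structure with one object per region.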
Pass your strings using --arg, then you can create the JSON as expected:
#!/bin/bash
server1_name='server-1'
server1_url='http://server-1.net'
server2_name='server-2'
server2_url='http://server-2.net'
result=$(jq -n \
--arg name1 "$server1_name" \
--arg url1 "$server1_url" \
--arg name2 "$server2_name" \
--arg url2 "$server2_url" \
'[ { "name": $name1, "url": $url1 }, { "name": $name2, "url": $url2 } ]')
echo "$result"
Will produce:
[
{
"name": "server-1",
"url": "http://server-1.net"
},
{
"name": "server-2",
"url": "http://server-2.net"
}
]
You can construct two arrays of names and urls, then adapt this answer to "zip" the two arrays together into the desired array of objects.
jq -n \
--arg name1 "$server1_name" \
--arg url1 "$server1_url" \
--arg name2 "$server2_name" \
--arg url2 "$server2_url" \
'[$name1, $name2] as $names |
[$url1, $url2] as $urls |
[([$names, $urls] | transpose[]) as [$name, $url] |{$name, $url}]'
The benefit is that as the number of name/url pairs grows, you only need to modify the first two filters that define $names and $urls; the rest of the filter stays the same. You could even separate this into separate uses of jq, to facilitate the definition of a large list of servers.
names=$(jq -n --arg v1 "$server1_name" --arg v2 "$server2_name" '[$v1, $v2]')
urls=$(jq -n --arg v1 "$server1_url" --arg v2 "$server2_url" '[$v1, $v2]')
jq -n \
--argjson names "$names" \
--argjson urls "$urls" \
'[([$names, $urls] | transpose[]) as [$name, $url] | {$name, $url}]'
Reusing Glenn's test framework, but calling jq only once for the entire script:
list=( http://RESTURL1 http://RESTURL2 )
declare -A hypothetical_data=(
[http://RESTURL1]='{"foo":"Tiger Nixon","bar":"Edinburgh"}'
[http://RESTURL2]='{"foo":"Garrett Winters","bar":"Tokyo"}'
)
for url in "${list[@]}"; do
echo "${hypothetical_data[$url]}" # or curl "$url"
done | jq -n '{"data": [inputs | [.foo, .bar]]}'
#!/bin/bash
list=( http://RESTURL1 http://RESTURL2 )
declare -A hypothetical_data=(
[http://RESTURL1]='{"foo":"Tiger Nixon","bar":"Edinburgh"}'
[http://RESTURL2]='{"foo":"Garrett Winters","bar":"Tokyo"}'
)
# create the seed file
result="result.json"
echo '{"data":[]}' > "$result"
for url in "${list[@]}"; do
# fetch the data.
json=${hypothetical_data[$url]}
# would really do: json=$(curl "$url")
# extract the name ("foo") and location ("bar") values
name=$( jq -r '.foo' <<<"$json" )
location=$( jq -r '.bar' <<<"$json" )
jq --arg name "$name" \
--arg loc "$location" \
'.data += [[$name, $loc]]' "$result" | sponge "$result"
# "sponge" is in the "moreutils" package that you may have to install.
# You can also write that line as:
#
# tmp=$(mktemp)
# jq --arg name "$name" \
# --arg loc "$location" \
# '.data += [[$name, $loc]]' "$result" > "$tmp" && mv "$tmp" "$result"
# ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
done
End result:
$ cat result.json
{
"data": [
[
"Tiger Nixon",
"Edinburgh"
],
[
"Garrett Winters",
"Tokyo"
]
]
}