The problem is that jq is still just outputting lines of text; you can't necessarily preserve each array element as a single unit. That said, as long as a newline is not a valid character in any object, you can still output each object on a separate line.
get_json_array | jq -c '.[]' | while IFS= read -r object; do
    api_call "$object"
done
Under that assumption, you could use the readarray command in bash 4 to build an array:
readarray -t conversations < <(get_json_array | jq -c '.[]')
for conversation in "${conversations[@]}"; do
    api_call "$conversation"
done
(Answer from chepner on Stack Overflow.)
Here is a solution without loops; note that it relies on word splitting, so it only works when no element contains whitespace or glob characters:
json=$(jq -c ".my_key[]" ./my-file.json)
json_without_quotes=${json//\"/}
declare -a all_apps_array=($(echo "$json_without_quotes" | tr '\n' ' '))
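The quote-stripping above breaks as soon as a value contains a space. Under the same no-newlines-in-values assumption, here is a sketch of a loop-free variant that lets jq strip the quotes (`-r`) and splits only on newlines (the inline JSON is a stand-in for ./my-file.json):

```shell
# Stand-in for ./my-file.json, with a spaced value that quote-stripping mangles
json='{"my_key": ["app one", "app two"]}'

# -r outputs raw strings (no quotes); readarray -t splits on newlines only,
# so spaces inside values survive intact.
readarray -t all_apps_array < <(jq -r '.my_key[]' <<<"$json")

printf '%s\n' "${all_apps_array[@]}"
```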
Using jq:
readarray -t arr < <(jq -r '.[].item2' json)
printf '%s\n' "${arr[@]}"
For inputs that may contain newlines or other special characters, a more hardened variant is readarray -td '' arr with NUL-delimited input, which avoids word splitting.
Output:
value2
value2_2
Check:
Process Substitution >(command ...) or <(...) is replaced by a temporary filename. Writing or reading that file causes bytes to get piped to the command inside. Often used in combination with file redirection: cmd1 2> >(cmd2).
See http://mywiki.wooledge.org/ProcessSubstitution http://mywiki.wooledge.org/BashFAQ/024
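A minimal illustration of why the answers here pair readarray with < <(...) rather than a pipe: the process substitution keeps the array in the current shell, whereas a pipeline would populate it in a subshell and discard it when the pipeline ends.

```shell
# Populate an array from a command's output via process substitution;
# readarray runs in the current shell, so the array persists afterwards.
readarray -t lines < <(printf 'one\ntwo\n')
echo "${#lines[@]} lines read"

# By contrast, `printf 'one\ntwo\n' | readarray -t lines2` would run
# readarray in a subshell (by default), leaving lines2 empty here.
```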
The following is actually buggy:
# BAD: Output line of * is replaced with list of local files; can't deal with whitespace
arr=( $( curl -k "$url" | jq -r '.[].item2' ) )
If you have bash 4.4 or newer, a best-of-all-worlds option is available:
# BEST: Supports bash 4.4+, with failure detection and newlines in data
{ readarray -t -d '' arr && wait "$!"; } < <(
    set -o pipefail
    curl --fail -k "$url" | jq -j '.[].item2 | (., "\u0000")'
)
...whereas with bash 4.0, you can have terseness at the cost of failure detection and literal newline support:
# OK (with bash 4.0), but can't detect failure and doesn't support values with newlines
readarray -t arr < <(curl -k "$url" | jq -r '.[].item2' )
...or bash 3.x compatibility and failure detection, but without newline support:
# OK: Supports bash 3.x; no support for newlines in values, but can detect failures
IFS=$'\n' read -r -d '' -a arr < <(
    set -o pipefail
    curl --fail -k "$url" | jq -r '.[].item2' && printf '\0'
)
...or bash 3.x compatibility and newline support, but without failure detection:
# OK: Supports bash 3.x and supports newlines in values; does not detect failures
arr=( )
while IFS= read -r -d '' item; do
    arr+=( "$item" )
done < <(curl --fail -k "$url" | jq -j '.[] | (.item2, "\u0000")')
Hi guys,
I am a Linux noob and am trying to write a script to extract info from an mkv file using mkvmerge, but I am not able to convert the resulting JSON into a bash array. I have tried a number of solutions from Stack Overflow but with no success.
Here are some of my attempts:
dir="/mnt/Anime/Series/KonoSuba/Season 2/[Nep_Blanc] KonoSuba II 10 .mkv"
*********************************************************************************
ARRAY_SIZE=$(mkvmerge -J "$dir" | jq '.tracks | length')
count=0
arr=()
while [ $count -lt $ARRAY_SIZE ];
do
arr+=($(mkvmerge -J "$dir" | jq '.tracks'[$count]))
((count++))
done
*********************************************************************************
readarray -t test_array < <(mkvmerge -J "$dir" | jq '.tracks')
for element in "${test_array[@]}";
do
echo "$element"
done
*********************************************************************************
array=($(mkvmerge -J "$dir" | jq '.tracks' | sed -e 's/^\[/(/' -e 's/\]$/)/'))
but the echo prints out lines instead of the specific objects.
Though it is now helping me with my python, originally the project was to help me learn bash scripting. I would really like to have a bash implementation, so any help overcoming this roadblock would be appreciated.
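Not from the thread, but a common pattern for this problem: have jq print one compact JSON object per line with -c '.tracks[]', read those lines into an array, and query each element with further jq calls. A sketch, using an inline stand-in for the mkvmerge -J output:

```shell
# Stand-in for: mkvmerge -J "$dir" (only the fields needed for the sketch)
mkv_json='{"tracks": [{"id": 0, "type": "video"}, {"id": 1, "type": "audio"}]}'

# -c prints each track as one compact JSON object on its own line,
# so readarray -t yields exactly one array element per track.
readarray -t tracks < <(jq -c '.tracks[]' <<<"$mkv_json")

for track in "${tracks[@]}"; do
    # each element is itself valid JSON, so it can be queried further
    jq -r '"track \(.id): \(.type)"' <<<"$track"
done
```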
We can solve this problem in two ways:
Input string:
// test.json
{
"keys": ["key1","key2","key3"]
}
Approach 1:
1) Use jq -r (output raw strings, not JSON texts).
KEYS=$(jq -r '.keys' test.json)
echo $KEYS
# Output: [ "key1", "key2", "key3" ]
2) Use @sh (converts the input to a series of space-separated, shell-quoted strings). This removes the square brackets [] and commas (,) from the string.
KEYS=$(<test.json jq -r '.keys | @sh')
echo $KEYS
# Output: 'key1' 'key2' 'key3'
3) Use tr to remove the single quotes from the string output (the -d option in tr deletes the specified characters).
KEYS=$(jq -r '.keys | @sh' test.json | tr -d \')
echo $KEYS
# Output: key1 key2 key3
4) We can convert the space-separated string to an array by placing the string output in round brackets (). This is also called a compound assignment, where we declare the array with a list of values.
ARRAYNAME=(value1 value2 .... valueN)
#!/bin/bash
KEYS=($(jq -r '.keys | @sh' test.json | tr -d \'))
echo "Array size: ${#KEYS[@]}"
echo "Array elements: ${KEYS[@]}"
# Output:
# Array size: 3
# Array elements: key1 key2 key3
Approach 2:
1) Use jq -r to get the string output, then use tr to delete characters like square brackets, double quotes and comma.
#!/bin/bash
KEYS=$(jq -r '.keys' test.json | tr -d '[],"')
echo $KEYS
# Output: key1 key2 key3
2) Then we can convert the space-separated string to an array by placing the string output in round brackets ().
#!/bin/bash
KEYS=($(jq -r '.keys' test.json | tr -d '[]," '))
echo "Array size: ${#KEYS[@]}"
echo "Array elements: ${KEYS[@]}"
# Output:
# Array size: 3
# Array elements: key1 key2 key3
To correctly parse values that may have newlines (and any other arbitrary (non-NUL) characters) use jq's @sh filter to generate space-separated quoted strings, and Bash's declare -a to parse the quoted strings as array elements. (No pre-processing required)
foo.json:
{"data": ["$0", " \t\n", "*", "\"", ""]}
str=$(jq -r '.data | @sh' foo.json)
declare -a arr="($str)" # must be quoted like this
declare -p arr
# declare -a arr=([0]="\$0" [1]=$' \t\n' [2]="*" [3]="\"" [4]="")
Update: jq 1.7 (2023-09)
As of version 1.7, jq has a --raw-output0 option, enabling it to output null-terminated strings which can be read into an array as usual:
mapfile -d '' arr < <(jq --raw-output0 '.data[]' foo.json)
wait "$!" # use in bash-4.4+ to get exit status of the process substitution
Note on NUL characters in JSON strings
JSON strings may contain NUL characters while shell variables cannot. If your JSON input may contain NUL's, you may need to add some special handling.
When using the @sh filter, NUL characters in JSON strings will be silently replaced with the two-character sequence \0. Note that this makes the JSON strings "\\0" and "\u0000" indistinguishable.
When using the --raw-output0 option, NUL characters will trigger an error and jq will terminate with an exit status of 5.
Reading multiple/nested arrays
The @sh filter can be combined with --raw-output0 to reliably read multiple arrays at once (or a single nested array) as it will produce a NUL-separated list of space-separated quoted strings.
json='[[1,2],[3,4]]' i=0
while read -r -d ''; do
    declare -a "arr$((i++))=($REPLY)"
done < <(jq --raw-output0 '.[]|@sh' <<<"$json")
for ((n=0; n<i; n++)); { declare -p "arr$n"; }
# declare -a arr0=([0]="1" [1]="2")
# declare -a arr1=([0]="3" [1]="4")
Your original version isn't going to be evalable because the author name has spaces in it - it would be interpreted as running a command Doe with the environment variable AUTHOR set to John. There's also virtually never a need to pipe jq to itself - the internal piping & dataflow can connect different filters together.
All of this is only sensible if you completely trust the input data (e.g. it's generated by a tool you control). There are several possible problems otherwise detailed below, but let's assume the data itself is certain to be in the format you expect for the moment.
You can make a much simpler version of your jq program:
jq -r '.SITE_DATA | to_entries | .[] | .key + "=" + (.value | @sh)'
which outputs:
URL='example.com'
AUTHOR='John Doe'
CREATED='10/22/2017'
There's no need for a map: .[] deals with taking each object in the array through the rest of the pipeline as a separate item, so everything after the last | is applied to each one separately. At the end, we just assemble a valid shell assignment string with ordinary + concatenation, including appropriate quotes & escaping around the value with @sh.
All the pipes matter here - without them you get fairly unhelpful error messages, where parts of the program are evaluated in subtly different contexts.
This string is evalable if you completely trust the input data and has the effect you want:
eval "$(jq -r '.SITE_DATA | to_entries | .[] | .key + "=" + (.value | @sh)' < data.json)"
echo "$AUTHOR"
As ever when using eval, be careful that you trust the data you're getting, since if it's malicious or just in an unexpected format things could go very wrong. In particular, if the key contains shell metacharacters like $ or whitespace, this could create a running command. It could also overwrite, for example, the PATH environment variable unexpectedly.
If you don't trust the data, either don't do this at all or filter the object to contain just the keys you want first:
jq '.SITE_DATA | { AUTHOR, URL, CREATED } | ...'
You could also have a problem in the case that the value is an array, so .value | tostring | @sh will be better - but this list of caveats may be a good reason not to do any of this in the first place.
It's also possible to build up an associative array instead where both keys and values are quoted:
eval "declare -A data=($(jq -r '.SITE_DATA | to_entries | .[] | @sh "[\(.key)]=\(.value)"' < test.json))"
After this, ${data[CREATED]} contains the creation date, and so on, regardless of what the content of the keys or values are. This is the safest option, but doesn't result in top-level variables that could be exported. It may still produce a Bash syntax error when a value is an array, or a jq error if it is an object, but won't execute code or overwrite anything.
Building on @Michael Homer's answer, you can avoid a potentially-unsafe eval entirely by reading the data into an associative array.
For example, if your JSON data is in a file called file.json:
#!/bin/bash
typeset -A myarray
while IFS== read -r key value; do
    myarray["$key"]="$value"
done < <(jq -r '.SITE_DATA | to_entries | .[] | .key + "=" + .value' file.json)
# show the array definition
typeset -p myarray
# make use of the array variables
echo "URL = '${myarray[URL]}'"
echo "CREATED = '${myarray[CREATED]}'"
echo "AUTHOR = '${myarray[AUTHOR]}'"
Output:
$ ./read-into-array.sh
declare -A myarray=([CREATED]="10/22/2017" [AUTHOR]="John Doe" [URL]="example.com" )
URL = 'example.com'
CREATED = '10/22/2017'
AUTHOR = 'John Doe'
I am writing a bash script for an Alfred Workflow. In this script I get a list (separated with newlines) from a command that I want to convert into a specific JSON format.
I tried storing the output into an array and parsing that in jq like that:
Command output:
$ piactl get regions
auto
france
netherlands
Create array:
$ IFS=$'\n'
$ regions=($(piactl get regions))
$ echo "${regions[@]}"
auto france netherlands
Parse to jq:
$ jq -n --arg inarr "${regions}" '{ items: $inarr | split("\n") }'
{
"items": [
"auto"
]
}
jq only outputs one item of the array, and I don't know how to shape the JSON like the wanted output shown below.
Wanted output:
{"items": [
{
"title": "auto",
"arg": "auto",
"icon": {
"path": "icon.png"
}
},
{
"title": "france",
"arg": "france",
"icon": {
"path": "icon.png"
}
},
{
"title": "netherlands",
"arg": "netherlands",
"icon": {
"path": "icon.png"
}
}
]}
Can somebody help me craft the correct jq arguments for this task?
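One possible approach (a sketch, not necessarily how the workflow ended up): skip the bash array entirely and feed the raw lines straight to jq, using -R to read raw (non-JSON) input, -n with inputs to collect every line, and a map into the Alfred item shape:

```shell
# Stand-in for: piactl get regions
regions_output='auto
france
netherlands'

# -R: read raw lines (not JSON); -n + inputs: gather all of them;
# each line becomes one {title, arg, icon} item.
jq -Rn '{items: [inputs | {title: ., arg: ., icon: {path: "icon.png"}}]}' \
    <<<"$regions_output"
```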
I'm using this in a cURL to get the data from result[]:
foo=$(curl --request GET --silent --url https://example.com --header 'Content-Type: application/json' | jq -r '.result[]')
When I print $foo, this is what I have:
[key]
default
firewall_custom
zone
34
[
{
"id": "example",
"version": "6",
"action": "block",
"expression": "lorem",
"description": "ipsum",
"last_updated": "2024-08-15T19:10:24.913784Z",
"ref": "example",
"enabled": true
},
{
"id": "example2",
"version": "7",
"action": "block",
"expression": "this",
"description": "that",
"last_updated": "2024-08-15T19:10:24.913784Z",
"ref": "example2",
"enabled": true
}
]
What I need from this is to create a loop where, in a series of additional cURLs, I can insert action, expression, and description.
I'm imagining that I would push these to 3 separate arrays (action, expression, and description), so that ${action[0]} would coincide with ${expression[0]} and ${description[0]}, and so on.
Something along the lines of:
# assuming that I have somehow created the following arrays:
# action=("block" "block")
# expression=("lorem" "this")
# description=("ipsum" "that")
for x in ${action[@]}; do
bar=$(curl --request GET \
--silent \
--url https://example.com \
--data '{
"action": ${action[$x]},
"expression": ${expression[$x]},
"description": ${description[$x]}
}' | jq '.success')
if [[ $bar == true ]]
then
printf "$x succeeded\n"
else
printf "$x failed\n"
fi
# reset bar
bar=''
done
The question is: how do I create the action, expression, and description arrays from the results of $foo (that original cURL)?
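A sketch of one way to build the three parallel arrays (using inline sample data in place of the first cURL): a single jq pass emits the three fields tab-separated per rule, and a while read fills the arrays in lockstep. This assumes none of the values contain literal tabs or newlines.

```shell
# Inline stand-in for the .result[] array returned by the first cURL
rules='[
  {"action": "block", "expression": "lorem", "description": "ipsum"},
  {"action": "block", "expression": "this",  "description": "that"}
]'

action=() expression=() description=()

# One rule per line, fields tab-separated; indices stay aligned because
# every rule appends exactly once to all three arrays.
while IFS=$'\t' read -r a e d; do
    action+=("$a")
    expression+=("$e")
    description+=("$d")
done < <(jq -r '.[] | [.action, .expression, .description] | @tsv' <<<"$rules")

# Loop over indices (not values) so ${action[i]} matches ${expression[i]}
for i in "${!action[@]}"; do
    echo "rule $i: ${action[i]} / ${expression[i]} / ${description[i]}"
done
```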
I have two bash arrays.
COLORS=(blue yellow) FOODS=(cheese ham)
I would like to use jq to combine the two arrays into json that looks something like this.
{
"blue": "cheese",
"yellow": "ham"
}
Or even this:
{
"blue": "cheese"
}
{
"yellow": "ham"
}
If not possible, I could make it work this way as well, although this would be less preferred.
{
"key1": "blue",
"key2": "cheese"
}
{
"key1": "yellow",
"key2": "ham"
}
Any thoughts on how to do this? Thanks!
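A sketch covering both wanted shapes: jq -n with --arg builds one safely-quoted {color: food} object per index pair, and piping those objects through jq -s add merges them into the single combined object (drop that last stage to keep the one-object-per-pair form).

```shell
COLORS=(blue yellow) FOODS=(cheese ham)

# Emit one {"color": "food"} object per index pair; --arg handles all
# quoting/escaping, so arbitrary values are safe.
for i in "${!COLORS[@]}"; do
    jq -n --arg k "${COLORS[i]}" --arg v "${FOODS[i]}" '{($k): $v}'
done |
jq -s 'add'   # -s slurps the stream into an array; add merges the objects
```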
To convert your JSON to a bash array, with help of jq:
$ readarray -t arr < <(jq '.value' file)
$ printf '%s\n' "${arr[@]}"
"1"
"3"
"4"
To fix your expanded example (the exact command), just don't use object construction {value: .Value}, but instead only .Value:
$ readarray -t arr < <(aws ec2 describe-instances --region=us-east-1 --filters "Name=tag:NodeType,Values=worker" --query "Reservations[].Instances[].Tags[]" | jq -r '.[] | select(.Key == "NodeNumber") | .Value')
$ printf '%s\n' "${arr[@]}"
1
3
4
Notice the lack of double quotes, since the -r option now prints only raw string values, not raw JSON Objects.
After you get arr populated with values like this, you can easily iterate over it and perform tests, just as you described in your question.
First, Store the Data
Given your raw data stored as a string in a json variable, perhaps with a here-document:
json=$(
cat <<- EOF
{
"value": "1"
}
{
"value": "3"
}
{
"value": "4"
}
EOF
)
Bash itself will do a reasonable job of prettifying it:
$ echo $json
{ "value": "1" } { "value": "3" } { "value": "4" }
Parsing the Data Into a Bash Array
There's more than one way to do this. Two of the more obvious ways are to use jq or grep to extract the values into a Bash array with the shell's simple array notation:
values=( `echo $json | jq .value` )
echo "${values[@]}"
"1" "3" "4"
unset values
values=$(egrep -o '[[:digit:]]+' <<< "$json")
echo "${values[@]}"
1
3
4
There are certainly other ways to accomplish this task, but this seems like the low-hanging fruit. Your mileage may vary.
Caveat
The main thing to remember about Bash arrays is that they need to use expansions such as "${foo[@]}" when quoted, or ${bar[*]} when unquoted, if you want to use them for loops rather than indexing into them directly. Once they're in an array, though, you can access the entire array easily as long as you understand the different expansions and quoting rules.
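A quick demonstration of that quoting difference:

```shell
foo=('a b' 'c')

count_args() { echo $#; }   # prints how many words it received

count_args "${foo[@]}"   # 2 -- each element stays one word
count_args ${foo[*]}     # 3 -- unquoted expansion re-splits 'a b' on the space
```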
It's generally a good idea to avoid looping over the result of a command substitution. It's inelegant as the command in the substitution must finish executing before the loop can even start running, it's inefficient as the full result of the command in the substitution must be stored in memory, and it's prone to errors since the shell must be allowed to split the output of the command on whitespace and subject the resulting words to filename globbing.
It's better to use read in a while loop:
#!/bin/sh
curl -s 'localhost:14002/api/sno' |
jq -r '.satellites[].id' |
while IFS= read -r id; do
curl -s 'localhost:14002/api/sno/satellite/'"$id"
done |
jq -r \
--argjson auditScore 1 \
--argjson suspensionScore 1 \
--argjson onlineScore 0.9 \
'.audits as $a | $a.satelliteName as $name |
reduce ($ARGS.named|keys[]) as $key (
[];
if $a[$key] < $ARGS.named[$key] then (
. + ["\($key) below threshold: \($a[$key]) for \($name)"]
) else . end
) | .[]'
This script assumes that you can contact your REST endpoints on localhost:14002 (the Docker container might be made to expose that port, for example). If you need to use the docker exec command to access the API, then replace the plain calls to curl with, e.g.
docker exec -i curl -s 'localhost:14002/api/sno'
For the updated question, using the api/sno/satellites endpoint:
#!/bin/sh
curl -s 'localhost:14002/api/sno/satellites' |
jq -r \
--argjson auditScore 1 \
--argjson suspensionScore 1 \
--argjson onlineScore 0.9 \
'.audits[] as $a | $a.satelliteName as $name |
reduce ($ARGS.named|keys[]) as $key (
[];
if $a[$key] < $ARGS.named[$key] then (
. + ["\($key) below threshold: \($a[$key]) for \($name)"]
) else . end
) | .[]'
Apart from a minor adjustment to the jq expression, this is essentially the same code as above, but bypassing the first curl call to fetch all the IDs, and the loop.
I've found one possible answer on Stack Overflow:
for sat in `docker exec -i storagenode wget -qO - localhost:14002/api/sno | jq .satellites[].id -r`
do
docker exec -i storagenode wget -qO - localhost:14002/api/sno/satellite/$sat \
| jq --raw-output \
--argjson auditThreshold 1 \
--argjson suspensionThreshold 1 \
--argjson onlineThreshold 1 \
'.audits
| .satelliteName as $name
| (
[{auditScore}, $auditThreshold],
[{suspensionScore}, $suspensionThreshold],
[{onlineScore}, $onlineThreshold]
)
| select(.[0][] < .[1])
| "\(.[0] | keys[]) (\(.[0][])) below threshold (\(.[1])) for \($name)"
'
done
I have a json file which i'm using bash to extract.
sample.json
{"extract": { "data": [ {"name": "John Smith", "id": 8752, "address": "1 Anywhere Street", "tel": "1234567890", "email": "john.smith@gmail.com" }, { "name": "Jane Smith", "id": 4568, "address": "719 Anywhere Street", "tel": "0987654321", "email": "janesmith@hotmail.com" } ] } }
and store the value within an array:
id=($(cat sample.json | jq -r '.extract.data[] .name'))
so ${id[0]} will output John Smith and ${id[1]} will output Jane Smith.
I am intending to store the values in a database (this will be my first attempt) which will be structured similarly to the JSON; each object needs to stay relative to how it is in the JSON, so it might be better to go with:
data1=($(cat sample.json | jq -r '.extract.data[0] | .[]'))
Let's say I have 1000 names to save to my database along with their IDs. I'm after some advice on whether there is a more sensible (more effective) approach:
- Pull the data from the JSON? Will I need to write this 1000 times? e.g.
data1=($(cat sample.json | jq -r '.extract.data[0] | .[]')) data2=($(cat sample.json | jq -r '.extract.data[1] | .[]')) data3=($(cat sample.json | jq -r '.extract.data[2] | .[]')) .. data1000=($(cat sample.json | jq -r '.extract.data[1000] | .[]'))
- Put the data into the DB from the first array? Will the code need to reference the array as:
${data1[0]}
${data1[1]}
${data1[2]}
Would be grateful for a steer in the right direction - thanks.
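One possible direction (a sketch with the sample data inlined, tel values quoted so the JSON stays valid): instead of a data1 ... data1000 variable per record, stream every record once through @tsv and do the per-record work (e.g. the DB insert) inside the loop. Names containing spaces survive because splitting happens only on tabs.

```shell
# Inline copy of sample.json
json='{"extract": {"data": [
  {"name": "John Smith", "id": 8752, "address": "1 Anywhere Street", "tel": "1234567890", "email": "john.smith@gmail.com"},
  {"name": "Jane Smith", "id": 4568, "address": "719 Anywhere Street", "tel": "0987654321", "email": "janesmith@hotmail.com"}
]}}'

# One record per line, fields tab-separated; scales to 1000 records with a
# single jq invocation and no numbered variables.
while IFS=$'\t' read -r name id address tel email; do
    # replace this echo with the real DB insert for one record
    echo "insert: $name | $id | $address | $tel | $email"
done < <(jq -r '.extract.data[] | [.name, .id, .address, .tel, .email] | @tsv' <<<"$json")
```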