The problem is that jq is still just outputting lines of text; you can't necessarily preserve each array element as a single unit. That said, as long as a newline is not a valid character in any object, you can still output each object on a separate line.

get_json_array | jq -c '.[]' | while IFS= read -r object; do
    api_call "$object"
done

Under that assumption, you could use the readarray command in bash 4 to build an array:

readarray -t conversations < <(get_json_array | jq -c '.[]')
for conversation in "${conversations[@]}"; do
    api_call "$conversation"
done
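For concreteness, here is a self-contained run of the readarray approach, with get_json_array stubbed out as a function (the stub and the echo are illustrative, not part of the answer):

```shell
# Stub standing in for the real get_json_array command (assumption for demo)
get_json_array() { printf '%s' '[{"id": 1}, {"id": 2, "name": "a b"}]'; }

# -t strips the trailing newline from each element
readarray -t conversations < <(get_json_array | jq -c '.[]')
for conversation in "${conversations[@]}"; do
    echo "api_call would receive: $conversation"
done
```

Each element stays one complete JSON object, so quoting "$conversation" hands the API call exactly one argument per object.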
Answer from chepner on Stack Overflow
Ingernet (ingernet.github.io)
Converting a JSON array to a bash array
April 16, 2020 -

    # using gcloud output as a source because why not use the hardest shit possible
    bork=$(gcloud --project=<project-id> container images list-tags us.gcr.io/<project-id>/<image-name> --filter='tags:DEPLOYED' --format=json | jq '.[0].tags')
    echo $bork
    [ "260", "61a1d7aef75421f5c209c42304716ba44e86ab7a", "DEPLOYED.2019-11-12T17.04.37.772145800Z", "DEPLOYED.2019-11-13T00.00.29.525908800Z" ]
    # ^ output is obviously not a bash array

    # strip out all the things you don't want - square brackets and commas
    borkstring=$(echo $bork | sed -e 's/\[ //g' -e 's/\ ]//g' -e 's/\,//g')
    arr=( $borkstring )
    echo $arr
    ( "260" "61a1d7aef75421f5c209c42304716ba44e86ab7a" "DEPLOYED.2019-11-12T17.04.37.772145800Z" "DEPLOYED.2019-11-13T00.00.29.525908800Z" )
    # ^ now THAT is a bash array
Discussions

How to use jq to convert a bash array on the command line to a json array? - Unix & Linux Stack Exchange
$ jq -c -n -e '[$x, $y]' --argjson x '"a"' --argjson y '"b"' ["a","b"] I know that I could do something like the above. If I want to generate a json array f... More on unix.stackexchange.com
December 15, 2022
Convert JSON array to bash array
but the echo prints out lines instead of the specific objects. Instead of telling a vague story about what's happening, why don't you show us: the output of mkvmerge -J "$dir" | jq '.tracks' what you expected/need it to be More on reddit.com
r/bash
December 25, 2024
Assigning an Array Parsed With jq to Bash Script Array - Stack Overflow
I parsed a json file with jq like this : # cat test.json | jq '.logs' | jq '.[]' | jq '._id' | jq -s It returns an array like this : [34,235,436,546,.....] Using bash script i described an array... More on stackoverflow.com
JSON array to bash variables using jq - Unix & Linux Stack Exchange
I'm looking to iterate over this array using jq so I can set the key of each item as the variable name and the value as its value. More on unix.stackexchange.com
December 30, 2017
People also ask

How to convert a JSON array into a Bash array?
To convert a JSON array into a Bash array, you can use the jq command-line tool with command substitution, using the syntax bash_array=($(echo "$json_array" | jq -r '.[]')). This parses the JSON array stored in the variable json_array and assigns each element to bash_array.
LinuxSimply (linuxsimply.com)
How to Convert a JSON Array into Bash Array [5 Methods] - LinuxSimply
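Worth noting: the unquoted $(...) form above word-splits each element on whitespace. A mapfile variant (a sketch, reusing the same hypothetical json_array variable) keeps elements containing spaces intact:

```shell
json_array='["one two", "three"]'
# mapfile/readarray assigns one line per element, so "one two" survives whole
mapfile -t bash_array < <(jq -r '.[]' <<<"$json_array")
declare -p bash_array   # declare -a bash_array=([0]="one two" [1]="three")
```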
What does the jq command do in converting JSON arrays to Bash arrays?
When converting a JSON array into a Bash array, the jq command extracts each element of the array using the .[] filter, as in echo "$json_array" | jq -r '.[]', and outputs the elements as separate lines. The -r option outputs raw strings without quotes, making them suitable for Bash array assignment.
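The effect of -r can be seen side by side (a minimal check, using a two-element array):

```shell
json_array='["key1", "key2"]'
echo "$json_array" | jq '.[]'     # "key1" and "key2", with JSON quotes
echo "$json_array" | jq -r '.[]'  # key1 and key2, as raw lines
```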
Can I access specific elements of JSON array in Bash?
To access any specific element of a JSON array in Bash, you can use the "jq" command to extract the element you want to access using indexing by employing the syntax echo "$json_array" | jq -r '.[index]'. For example, to access the first element from json_array='["item1", "item2", "item3"]', use the syntax echo "$json_array" | jq -r '.[0]'.
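A quick check of the indexing syntax (jq also accepts negative indices, which count from the end):

```shell
json_array='["item1", "item2", "item3"]'
echo "$json_array" | jq -r '.[0]'    # item1
echo "$json_array" | jq -r '.[-1]'   # item3
```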
Top answer (1 of 6)

Using jq :

readarray -t arr < <(jq -r '.[].item2' json)
printf '%s\n' "${arr[@]}"

If you need a more hardened variant for inputs with newlines or other special characters, read NUL-delimited output instead, which avoids word splitting:

readarray -td '' arr < <(jq -j '.[].item2 | (., "\u0000")' json)

Output:

value2
value2_2

Check:

Process Substitution >(command ...) or <(...) is replaced by a temporary filename. Writing or reading that file causes bytes to get piped to the command inside. Often used in combination with file redirection: cmd1 2> >(cmd2). See http://mywiki.wooledge.org/ProcessSubstitution http://mywiki.wooledge.org/BashFAQ/024

2 of 6

The following is actually buggy:

# BAD: Output line of * is replaced with list of local files; can't deal with whitespace
arr=( $( curl -k "$url" | jq -r '.[].item2' ) )

If you have bash 4.4 or newer, a best-of-all-worlds option is available:

# BEST: Supports bash 4.4+, with failure detection and newlines in data
{ readarray -t -d '' arr && wait "$!"; } < <(
  set -o pipefail
  curl --fail -k "$url" | jq -j '.[].item2 | (., "\u0000")'
)

...whereas with bash 4.0, you can have terseness at the cost of failure detection and literal newline support:

# OK (with bash 4.0), but can't detect failure and doesn't support values with newlines
readarray -t arr < <(curl -k "$url" | jq -r '.[].item2' )

...or bash 3.x compatibility and failure detection, but without newline support:

# OK: Supports bash 3.x; no support for newlines in values, but can detect failures
IFS=$'\n' read -r -d '' -a arr < <(
  set -o pipefail
  curl --fail -k "$url" | jq -r '.[].item2' && printf '\0'
)

...or bash 3.x compatibility and newline support, but without failure detection:

# OK: Supports bash 3.x and supports newlines in values; does not detect failures
arr=( )
while IFS= read -r -d '' item; do
  arr+=( "$item" )
done < <(curl --fail -k "$url" | jq -j '.[] | (.item2, "\u0000")')
r/bash on Reddit: Convert JSON array to bash array
December 25, 2024 -

Hi guys,

I am a linux noob and am trying to write a script to extract info from a mkv file using mkvmerge but am not able to convert the target json script to a bash array. I have tried a number of solutions from stack overflow but with no success.

here are some of my attempts

dir="/mnt/Anime/Series/KonoSuba/Season 2/[Nep_Blanc] KonoSuba II 10 .mkv"
*********************************************************************************
ARRAY_SIZE=$(mkvmerge -J  "$dir" | jq '.tracks | length')
count=0
arr=()

while [ $count -lt $ARRAY_SIZE ];
    do
        arr+=($(mkvmerge -J  "$dir" | jq '.tracks'[$count]))
        ((count++))
done
*********************************************************************************
readarray -t test_array < <(mkvmerge -J  "$dir" | jq '.tracks')
for element in "${test_array[@]}";
    do
        echo "$element"
done

*********************************************************************************
array=($(mkvmerge -J  "$dir" | jq '.tracks' | sed -e 's/^\[/(/' -e 's/\]$/)/'))

but the echo prints out lines instead of the specific objects.

Though now it is helping me with my Python, originally the project was to help me learn bash scripting. I would really like to have a bash implementation, so any help overcoming this roadblock would be appreciated.

Top answer (1 of 5)

We can solve this problem in two ways:

Input string:

// test.json
{
    "keys": ["key1","key2","key3"]
}

Approach 1:

1) Use jq -r (output raw strings, not JSON texts) .

KEYS=$(jq -r '.keys' test.json)
echo $KEYS
# Output: [ "key1", "key2", "key3" ]

2) Use @sh (converts the input to a series of space-separated, shell-quoted strings). This removes the square brackets [] and commas (,) from the string.

KEYS=$(<test.json jq -r '.keys | @sh')
echo $KEYS
# Output: 'key1' 'key2' 'key3'

3) Using tr to remove single quotes from the string output. To delete specific characters use the -d option in tr.

KEYS=$(<test.json jq -r '.keys | @sh' | tr -d \')
echo $KEYS
# Output: key1 key2 key3

4) We can convert the space-separated string to an array by placing the string output inside round brackets ( ). This is also called compound assignment, where we declare the array with a set of values.

ARRAYNAME=(value1 value2  .... valueN)
#!/bin/bash
KEYS=($(<test.json jq -r '.keys | @sh' | tr -d \'\"))

echo "Array size: " ${#KEYS[@]}
echo "Array elements: "${KEYS[@]}

# Output: 
# Array size:  3
# Array elements: key1 key2 key3

Approach 2:

1) Use jq -r to get the string output, then use tr to delete characters like square brackets, double quotes and commas.

#!/bin/bash
KEYS=$(jq -r '.keys' test.json  | tr -d '[],"')
echo $KEYS

# Output: key1 key2 key3

2) Then we can convert the space-separated string to an array by placing the string output inside round brackets ( ).

#!/bin/bash
KEYS=($(jq -r '.keys' test.json  | tr -d '[]," '))

echo "Array size: " ${#KEYS[@]}
echo "Array elements: "${KEYS[@]}

# Output:
# Array size:  3
# Array elements: key1 key2 key3
2 of 5

To correctly parse values that may have newlines (and any other arbitrary (non-NUL) characters) use jq's @sh filter to generate space-separated quoted strings, and Bash's declare -a to parse the quoted strings as array elements. (No pre-processing required)

foo.json:

{"data": ["$0", " \t\n", "*", "\"", ""]}
str=$(jq -r '.data | @sh' foo.json)
declare -a arr="($str)"   # must be quoted like this
declare -p arr
# declare -a arr=([0]="\$0" [1]=$' \t\n' [2]="*" [3]="\"" [4]="")

Update: jq 1.7 (2023-09)

As of version 1.7, jq has a --raw-output0 option, enabling it to output null-terminated strings which can be read into an array as usual:

mapfile -d '' arr < <(jq --raw-output0 '.data[]' foo.json)
wait "$!"  # use in bash-4.4+ to get exit status of the process substitution

Note on NUL characters in JSON strings

JSON strings may contain NUL characters while shell variables cannot. If your JSON input may contain NUL's, you may need to add some special handling.

  • When using the @sh filter, NUL characters from JSON strings will be silently replaced with the sequence \0. Note that this makes the JSON strings "\\0" and "\u0000" indistinguishable.

  • When using the --raw-output0 option, NUL characters will trigger an error and jq will terminate with an exit status of 5.

Reading multiple/nested arrays

The @sh filter can be combined with --raw-output0 to reliably read multiple arrays at once (or a single nested array) as it will produce a NUL-separated list of space-separated quoted strings.

json='[[1,2],[3,4]]' i=0
while read -r -d ''; do
    declare -a "arr$((i++))=($REPLY)"
done < <(jq --raw-output0 '.[]|@sh' <<<$json)
for ((n=0; n<i; n++)); { declare -p "arr$n"; }
# declare -a arr0=([0]="1" [1]="2")
# declare -a arr1=([0]="3" [1]="4")
Eliatra (eliatra.com)
Transform JSON Data on Bash Using jq
November 23, 2022 - Since we grouped by Department, it does not matter from which object we take it for each array index; we just grab it from the first. For the number of employees in each Department, we just apply the length function. This example also shows how we can create an entirely new JSON structure as output, and reference values from the step before:

    $ cat staff.json | jq 'group_by(.Department)[] | {Department: .[0].Department, length: length}'
    { "Department": "IT", "length": 2 }
    { "Department": "Management", "length": 1 }
LinuxSimply (linuxsimply.com)
How to Convert a JSON Array into Bash Array [5 Methods] - LinuxSimply
March 17, 2024 - In the above script, the code echo "$json_array" print the JSON array and jq -r '.[]' command extracts each element using the .[] filter. Then the mapfile command reads lines from its standard input (the output of the jq command in this case) to the array variable called bash_array.
Top answer (1 of 4)

Your original version isn't going to be evalable because the author name has spaces in it - it would be interpreted as running a command Doe with the environment variable AUTHOR set to John. There's also virtually never a need to pipe jq to itself - the internal piping & dataflow can connect different filters together.

All of this is only sensible if you completely trust the input data (e.g. it's generated by a tool you control). There are several possible problems otherwise detailed below, but let's assume the data itself is certain to be in the format you expect for the moment.

You can make a much simpler version of your jq program:

jq -r '.SITE_DATA | to_entries | .[] | .key + "=" + (.value | @sh)'

which outputs:

URL='example.com'
AUTHOR='John Doe'
CREATED='10/22/2017'

There's no need for a map: .[] deals with taking each object in the array through the rest of the pipeline as a separate item, so everything after the last | is applied to each one separately. At the end, we just assemble a valid shell assignment string with ordinary + concatenation, including appropriate quotes & escaping around the value with @sh.

All the pipes matter here - without them you get fairly unhelpful error messages, where parts of the program are evaluated in subtly different contexts.

This string is evalable if you completely trust the input data and has the effect you want:

eval "$(jq -r '.SITE_DATA | to_entries | .[] | .key + "=" + (.value | @sh)' < data.json)"
echo "$AUTHOR"

As ever when using eval, be careful that you trust the data you're getting, since if it's malicious or just in an unexpected format things could go very wrong. In particular, if the key contains shell metacharacters like $ or whitespace, this could create a running command. It could also overwrite, for example, the PATH environment variable unexpectedly.

If you don't trust the data, either don't do this at all or filter the object to contain just the keys you want first:

jq '.SITE_DATA | { AUTHOR, URL, CREATED } | ...'

You could also have a problem in the case that the value is an array, so .value | tostring | @sh will be better - but this list of caveats may be a good reason not to do any of this in the first place.


It's also possible to build up an associative array instead where both keys and values are quoted:

eval "declare -A data=($(jq -r '.SITE_DATA | to_entries | .[] | @sh "[\(.key)]=\(.value)"' < test.json))"

After this, ${data[CREATED]} contains the creation date, and so on, regardless of what the content of the keys or values are. This is the safest option, but doesn't result in top-level variables that could be exported. It may still produce a Bash syntax error when a value is an array, or a jq error if it is an object, but won't execute code or overwrite anything.

2 of 4

Building on @Michael Homer's answer, you can avoid a potentially-unsafe eval entirely by reading the data into an associative array.

For example, if your JSON data is in a file called file.json:

#!/bin/bash

typeset -A myarray

while IFS== read -r key value; do
    myarray["$key"]="$value"
done < <(jq -r '.SITE_DATA | to_entries | .[] | .key + "=" + .value' file.json)

# show the array definition
typeset -p myarray

# make use of the array variables
echo "URL = '${myarray[URL]}'"
echo "CREATED = '${myarray[CREATED]}'"
echo "AUTHOR = '${myarray[AUTHOR]}'"

Output:

$ ./read-into-array.sh 
declare -A myarray=([CREATED]="10/22/2017" [AUTHOR]="John Doe" [URL]="example.com" )
URL = 'example.com'
CREATED = '10/22/2017'
AUTHOR = 'John Doe'
r/bash on Reddit: Convert bash array to JSON with jq
May 2, 2020 -

I am writing a bash script for an Alfred Workflow. In this script I get a list (separated with newlines) from a command that I want to convert into a specific JSON format.

I tried storing the output into an array and parsing that in jq like that:

Command output:

$ piactl get regions

auto
france
netherlands

Create array:

$ IFS=$'\n'
$ regions=($(piactl get regions)) 
$ echo "${regions[@]}"

auto france netherlands

Parse to jq

$ jq -n --arg inarr "${regions}" '{ items: $inarr | split("\n") }'

{
  "items": [
    "auto"
  ]
}

jq only outputs one item of the array and I don't know how to shape the JSON like shown in the wanted output below.

Wanted output:

{"items": [
    {
        "title": "auto",
        "arg": "auto",
        "icon": {
            "path": "icon.png"
        }
    },
    {
        "title": "france",
        "arg": "france",
        "icon": {
            "path": "icon.png"
        }
    },
    {
        "title": "netherlands",
        "arg": "netherlands",
        "icon": {
            "path": "icon.png"
        }
    },
]}

Can somebody help me craft the correct jq arguments for this task?
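The snippet ends at the question, but one way to produce the wanted shape (a sketch, not taken from the thread) is to skip the bash array entirely and feed the newline-separated command output to jq, building each item object inside jq:

```shell
regions=(auto france netherlands)   # stand-in for: piactl get regions
# -R reads raw lines; -n plus `inputs` collects every line (none is lost to `.`)
printf '%s\n' "${regions[@]}" |
    jq -R -n '{items: [inputs | {title: ., arg: ., icon: {path: "icon.png"}}]}'
```

This prints one object per region under "items", matching the wanted structure.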

GitHub Gist (gist.github.com)
This example shows you how to utilize jq to loop bash script through an array of JSON values. · GitHub
June 29, 2020 - echo '[{"name": "name #1", "value": "value #1"} , {"name": "name#2", "value": "value#2"}]' | jq --raw-output 'map("\(.name) = \(.value)")[]' ... Or if you really did want to drop down into the bash loop...
Alfred App Community (alfredforum.com)
Construct script filter JSON with jq from bash array - Workflow Help & Questions - Alfred App Community Forum
January 30, 2022 - Hi, I am creating a workflow and I want to create a script filter with strings from a bash array. Given I have an array called $regions that contains 3 strings. I want to create 3 list items from that using jq. echo $regions france netherlands denmark I tried this command but the output is obviou...
GitHub Gist (gist.github.com)
jq parse JSON to bash4 associative array! "jq is like sed for JSON data" https://stedolan.github.io/jq/ · GitHub
jq parse JSON to bash4 associative array! "jq is like sed for JSON data" https://stedolan.github.io/jq/ - jq-json-to-associative-array-command-line-example.txt
how.wtf
How to iterate through JSON arrays in Bash using jq | how.wtf
June 14, 2023 - To achieve this, you can use the ... do echo "$i" done · The command uses jq to extract each object in the "projects" array and then uses a while loop to iterate through the extracted objects....
r/bash on Reddit: More fun with jq, getting results into a usable array
August 17, 2024 -

I'm using this in a cURL to get the data from result[]:

foo=$(curl --request GET \
--silent \
--url https://example.com \
--header 'Content-Type: application/json' | jq -r '.result[]')

When I print $foo, this is what I have:

[key]
default

firewall_custom
zone
34
[
  {
    "id": "example",
    "version": "6",
    "action": "block",
    "expression": "lorem",
    "description": "ipsum",
    "last_updated": "2024-08-15T19:10:24.913784Z",
    "ref": "example",
    "enabled": true
  },
  {
    "id": "example2",
    "version": "7",
    "action": "block",
    "expression": "this",
    "description": "that",
    "last_updated": "2024-08-15T19:10:24.913784Z",
    "ref": "example2",
    "enabled": true
  }
]

What I need from this is to create a loop where, in a series of additional cURLs, I can insert action, expression, and description.

I'm imagining that I would push these to 3 separate arrays (action, expression, and description), so that ${action[0]} would coincide with ${expression[0]} and ${description[0]}, and so on.

Something along the lines of:

# assuming that I have somehow created the following arrays:
# action=("block" "block")
# expression=("lorem" "this")
# description=("ipsum" "that")

for x in ${action[@]}; do
  bar=$(curl --request GET \
    --silent \
    --url https://example.com \
    --data '{
      "action": ${action[$x]},
      "expression": ${expression[$x]},
      "description": ${description[$x]}
    }' | jq '.success')

  if [[ $bar == true ]]
    then
      printf "$x succeeded\n"

    else
      printf "$x failed\n"
  fi

  # reset bar
  bar=''
done

The question is, how to create action, expression, and description arrays from the results of $foo (that original cURL)?
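One way to answer the closing question (a sketch; it assumes the field values contain no tabs or newlines, and uses an inline copy of the response instead of the original cURL): emit one tab-separated record per object with jq's @tsv and append each field to its parallel array:

```shell
# Inline stand-in for the cURL response shown above
foo='[{"action":"block","expression":"lorem","description":"ipsum"},
      {"action":"block","expression":"this","description":"that"}]'

action=() expression=() description=()
while IFS=$'\t' read -r a e d; do
    action+=("$a"); expression+=("$e"); description+=("$d")
done < <(jq -r '.[] | [.action, .expression, .description] | @tsv' <<<"$foo")

declare -p action expression description
```

Index i of each array then lines up across the three, as the post imagines.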

r/bash on Reddit: Join 2 Bash Arrays in Json using jq
December 13, 2023 -

I have two bash arrays.

COLORS=(blue yellow)
FOODS=(cheese ham)

I would like to use jq to combine the two arrays into json that looks something like this.

{
    "blue": "cheese",
    "yellow": "ham",
}

Or even this

{
    "blue": "cheese"
}
{
    "yellow": "ham"
}

If not possible I could make it work this way as well, although this would be less preferred.

{
    "key1": "blue",
    "key2": "cheese",
}
{
    "key1": "yellow",
    "key2": "ham",
}

Any thoughts on how to do this? Thanks!
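No answer appears in this snippet, but the first wanted shape can be produced by passing both arrays as positional arguments with --args and pairing index i with index i + n inside jq (a sketch, assuming both arrays have equal length):

```shell
COLORS=(blue yellow)
FOODS=(cheese ham)

# $ARGS.positional is ["blue","yellow","cheese","ham"]; the first half
# supplies the keys, the second half the values.
jq -n --args '
    ($ARGS.positional | length / 2) as $n
    | [range($n) | {($ARGS.positional[.]): $ARGS.positional[. + $n]}]
    | add' "${COLORS[@]}" "${FOODS[@]}"
```

The add at the end merges the per-pair objects into one, giving the first wanted output.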

CodeGenes (codegenes.net)
How to Assign a jq-Parsed JSON Array to a Bash Script Array: Fixing Bracket and Format Issues — codegenes.net
Use jq to properly extract individual array elements (without brackets/quotes). Read these elements into a Bash array without splitting on spaces or special characters.
Top answer (1 of 4)

To convert your JSON to a bash array, with help of jq:

$ readarray -t arr < <(jq '.value' file)
$ printf '%s\n' "${arr[@]}"
"1"
"3"
"4"

To fix your expanded example (the exact command), just don't use object construction {value: .Value}, but instead only .Value:

$ readarray -t arr < <(aws ec2 describe-instances --region=us-east-1 --filters "Name=tag:NodeType,Values=worker" --query "Reservations[].Instances[].Tags[]" | jq -r '.[] | select(.Key == "NodeNumber") | .Value')
$ printf '%s\n' "${arr[@]}"
1
3
4

Notice the lack of double quotes, since the -r option now prints only raw string values, not raw JSON Objects.

After you get arr populated with values like this, you can easily iterate over it and perform tests, just as you described in your question.

2 of 4

First, Store the Data

Given your raw data stored as a string in a json variable, perhaps with a here-document:

json=$(
    cat <<- EOF
        {
          "value": "1"
        }
        {
          "value": "3"
        }
        {
          "value": "4"
        }
EOF
)

Echoed unquoted, Bash's word splitting will flatten it onto a single line:

$ echo $json
{ "value": "1" } { "value": "3" } { "value": "4" }

Parsing the Data Into a Bash Array

There's more than one way to do this. Two of the more obvious ways are to use jq or grep to extract the values into a Bash array with the shell's simple array notation:

values=( `echo $json | jq .value` )
echo "${values[@]}"
"1" "3" "4"

unset values
values=$(egrep -o '[[:digit:]]+' <<< "$json")
echo "${values[@]}"
1
3
4

There are certainly other ways to accomplish this task, but this seems like the low-hanging fruit. Your mileage may vary.

Caveat

The main thing to remember about Bash arrays is that they need to use expansions such as "${foo[@]}" when quoted, or ${bar[*]} when unquoted, if you want to use them for loops rather than indexing into them directly. Once they're in an array, though, you can access the entire array easily as long as you understand the different expansions and quoting rules.
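The quoting caveat is easy to see directly:

```shell
arr=("a b" "c")
for x in "${arr[@]}"; do echo "<$x>"; done   # two iterations: <a b> then <c>
for x in ${arr[*]};   do echo "<$x>"; done   # word-split: <a> <b> <c>
```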

Top answer (1 of 2)

It's generally a good idea to avoid looping over the result of a command substitution. It's inelegant as the command in the substitution must finish executing before the loop can even start running, it's inefficient as the full result of the command in the substitution must be stored in memory, and it's prone to errors since the shell must be allowed to split the output of the command on whitespace and subject the resulting words to filename globbing.

It's better to use read in a while loop:

#!/bin/sh

curl -s 'localhost:14002/api/sno' |
jq -r '.satellites[].id' |
while IFS= read -r id; do
        curl -s 'localhost:14002/api/sno/satellite/'"$id"
done |
jq -r \
        --argjson auditScore 1 \
        --argjson suspensionScore 1 \
        --argjson onlineScore 0.9 \
        '.audits as $a | $a.satelliteName as $name |
        reduce ($ARGS.named|keys[]) as $key (
                [];
                if $a[$key] < $ARGS.named[$key] then (
                        . + ["\($key) below threshold: \($a[$key]) for \($name)"]
                ) else . end
        ) | .[]'

This script assumes that you can contact your REST endpoints on localhost:14002 (the Docker container might be made to expose that port, for example). If you need to use the docker exec command to access the API, then replace the plain calls to curl with, e.g.

docker exec -i storagenode curl -s 'localhost:14002/api/sno'

For the updated question, using the api/sno/satellites endpoint:

#!/bin/sh

curl -s 'localhost:14002/api/sno/satellites' |
jq -r \
        --argjson auditScore 1 \
        --argjson suspensionScore 1 \
        --argjson onlineScore 0.9 \
        '.audits[] as $a | $a.satelliteName as $name |
        reduce ($ARGS.named|keys[]) as $key (
                [];
                if $a[$key] < $ARGS.named[$key] then (
                        . + ["\($key) below threshold: \($a[$key]) for \($name)"]
                ) else . end
        ) | .[]'

Apart from a minor adjustment to the jq expression, this is essentially the same code as above, but bypassing the first curl call to fetch all the IDs, and the loop.

2 of 2

I've found one possible answer on stackoverflow:


for sat in `docker exec -i storagenode wget -qO - localhost:14002/api/sno | jq .satellites[].id -r`
do
  docker exec -i storagenode wget -qO - localhost:14002/api/sno/satellite/$sat \
  | jq --raw-output \
    --argjson auditThreshold 1 \
    --argjson suspensionThreshold 1 \
    --argjson onlineThreshold 1 \
    '.audits
      | .satelliteName as $name
      | (
          [{auditScore}, $auditThreshold],
          [{suspensionScore}, $suspensionThreshold],
          [{onlineScore}, $onlineThreshold]
        )
      | select(.[0][] < .[1])
      | "\(.[0] | keys[]) (\(.[0][])) below threshold (\(.[1])) for \($name)"
    '
done
r/bash on Reddit: Writing bash code for array with multiple values
October 24, 2022 -

I have a json file which i'm using bash to extract.

sample.json

{"extract": { "data": [ {"name": "John Smith", "id": 8752, "address": "1 Anywhere Street", "tel": 1234567890, "email": "john.smith@gmail.com" }, { "name": "Jane Smith", "id": 4568, "address": "719 Anywhere Street", "tel": 0987654321, "email": "janesmith@hotmail.com" } ] } }

and store the value within an array

id=($(cat sample.json | jq -r '.extract.data[] .name'))

so in the case of ${id[0]} will output John Smith and ${id[1]} will output Jane Smith.

I am intending to store the values in a database (this will be my first attempt) which will be in a similar to that of the json, each object needs to be relative to how it is in the json so it might be better to go with:

data1=($(cat sample.json | jq -r '.extract.data[0] | .[]'))

Lets say i have 1000 names to save to my database along with their id's. I'm some advice whether if there a more sensible (more effective) approach on how:

- Pull the data from Json? will I need to write this 1000 times?e.g

data1=($(cat sample.json | jq -r '.extract.data[0] | .[]'))
data2=($(cat sample.json | jq -r '.extract.data[1] | .[]'))
data3=($(cat sample.json | jq -r '.extract.data[2] | .[]'))
..
data1000=($(cat sample.json | jq -r '.extract.data[1000] | .[]'))

- Put the data into the DB from the first array? Will the code need to reference the array as:

${data1[0]}
${data1[1]} 
${data1[2]}

Would be grateful for a steer in the right direction? - thanks.
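One possible steer (a sketch; it assumes sample.json as posted, with names and ids free of tabs and newlines): instead of one dataN array per person, stream one tab-separated record per array entry and process the fields as you read them, so 1000 records cost the same code as two:

```shell
# One pass over the file; no per-index dataN variables needed
while IFS=$'\t' read -r name id; do
    printf 'would insert: name=%s id=%s\n' "$name" "$id"
done < <(jq -r '.extract.data[] | [.name, .id] | @tsv' sample.json)
```

Each loop iteration sees one person's fields, which is where a real INSERT statement would go.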