Use jo, which makes it easy to generate JSON on the command line:

$ jo -p key1="$value1" key2="$value2"
{
   "key1": "foo",
   "key2": "bar"
}

or, depending on what you want the end result to be,

$ jo -a -p "$(jo key1="$value1")" "$(jo key2="$value2")"
[
   {
      "key1": "foo"
   },
   {
      "key2": "bar"
   }
]

Note that jo will also properly encode the values in the strings $value1 and $value2.
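For comparison, here is a sketch of how the same safe encoding can be done with jq's --arg, in case jo is not available (the key names and values here are illustrative):

```shell
# --arg passes raw shell strings into jq as variables, and jq
# performs all JSON escaping (quotes, backslashes, control chars).
value1='foo "quoted"'
value2='bar\baz'
jq -n --arg key1 "$value1" --arg key2 "$value2" '{$key1, $key2}'
```

The {$key1} shorthand expands to {"key1": $key1}, so the variable names double as object keys.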

Answer from Kusalananda on Stack Exchange
Top answer
1 of 4
6

One suggestion is to use --args with jq to create the two arrays and then collect these in the correct location in the main document. Note that --args is required to be the last option on the command line and that all the remaining command line arguments will become elements of the $ARGS.positional array.

{
    jq -n --arg key APP-Service1-Admin '{($key): $ARGS.positional}' --args a b
    jq -n --arg key APP-Service1-View  '{($key): $ARGS.positional}' --args c d
} |
jq -s --arg key 'AD Accounts' '{($key): add}' |
jq --arg Service service1-name --arg 'AWS account' service1-dev '$ARGS.named + .'

The first two jq invocations create a set of two JSON objects:

{
  "APP-Service1-Admin": [
    "a",
    "b"
  ]
}
{
  "APP-Service1-View": [
    "c",
    "d"
  ]
}

The third jq invocation uses -s to read that set into an array, which becomes a merged object when passed through add. The merged object is assigned to our top-level key:

{
  "AD Accounts": {
    "APP-Service1-Admin": [
      "a",
      "b"
    ],
    "APP-Service1-View": [
      "c",
      "d"
    ]
  }
}
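The -s/add step can be seen in isolation with two hardcoded objects (a minimal sketch, not tied to the data above):

```shell
# -s slurps the two objects into an array; add merges the
# array's objects into one.
printf '%s\n' '{"a":1}' '{"b":2}' | jq -cs 'add'
# → {"a":1,"b":2}
```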

The last jq adds the remaining top-level keys and their values to the object:

{
  "Service": "service1-name",
  "AWS account": "service1-dev",
  "AD Accounts": {
    "APP-Service1-Admin": [
      "a",
      "b"
    ],
    "APP-Service1-View": [
      "c",
      "d"
    ]
  }
}
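The $ARGS.named + . idiom can be demonstrated on its own with a hypothetical key (note that --arg values always arrive as strings):

```shell
# $ARGS.named collects all --arg NAME VALUE pairs into one object;
# "+ ." then merges the input document on top of it.
echo '{"x":1}' | jq -c --arg y 2 '$ARGS.named + .'
# → {"y":"2","x":1}
```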

With jo:

jo -d . \
    Service=service1-name \
    'AWS account'=service1-dev  \
    'AD Accounts.APP-Service1-Admin'="$(jo -a a b)" \
    'AD Accounts.APP-Service1-View'="$(jo -a c d)"

The "internal" object is created using .-notation (enabled with -d .), and a couple of command substitutions for creating the arrays.

Or you can drop the -d . and use a form of array notation:

jo  Service=service1-name \
    'AWS account'=service1-dev \
    'AD Accounts[APP-Service1-Admin]'="$(jo -a a b)" \
    'AD Accounts[APP-Service1-View]'="$(jo -a c d)"
2 of 4
4

I often use heredocs when creating complicated json objects in bash:

service=$(thing-what-gets-service)
account=$(thing-what-gets-account)
admin=$(jo -a $(thing-what-gets-admin))
view=$(jo -a $(thing-what-gets-view))

read -rd '' json <<EOF
[
    {
        "Service": "$service",
        "AWS Account": "$account",
        "AD Accounts": {
            "APP-Service1-Admin": $admin,
            "APP-Service1-View": $view
        }
    }
]
EOF

This uses jo to create the arrays as it's a pretty simple way to do it but it could be done differently if needed.
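A runnable sketch of the pattern, with hypothetical hardcoded values standing in for the thing-what-gets-* commands; piping the result through jq confirms it is valid JSON:

```shell
# Placeholder values; in the real script these would come from
# commands and jo invocations.
service='svc1' account='acct1'
admin='["a","b"]' view='["c","d"]'

# read -rd '' consumes the heredoc into $json; it exits nonzero
# at end of input, so guard it with || true under set -e.
read -rd '' json <<EOF || true
[{"Service": "$service", "AWS Account": "$account",
  "AD Accounts": {"APP-Service1-Admin": $admin, "APP-Service1-View": $view}}]
EOF

# Validate and extract a field to prove the result parses.
printf '%s' "$json" | jq -c '.[0]["AD Accounts"]["APP-Service1-Admin"]'
# → ["a","b"]
```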

Top answer
1 of 1
6

One way to do this is to provide a jq function that generates your repeated structure, given the specific inputs you want to modify. Consider the following:

# generate this however you want to -- hardcoded, built by a loop, whatever.
source_dest_pairs=(
  sourcebucket1:destinationbucket1
  sourcebucket2:destinationbucket2
  sourcebucket3:destinationbucket3
)

# -R accepts plain text, not JSON, as input; -n doesn't read any input automatically
# ...but instead lets "inputs" or "input" be used later in your jq code.
jq -Rn '
  def instructionsForPair($source; $dest): {
    "Name":"S3DistCp step",
    "HadoopJarStep": {
      "Args":[
        "s3-dist-cp",
        "--s3Endpoint=s3.amazonaws.com",
        "--src=\($source)",
        "--dest=\($dest)"
      ],
      "Jar":"command-runner.jar"
    }
  };

  [ inputs 
  | capture("^(?<source>[^:]+):(?<dest>.*)$"; "")
  | select(.)
  | instructionsForPair(.source; .dest) ]
' < <(printf '%s\n' "${source_dest_pairs[@]}")

...correctly emits as output:

[
  {
    "Name": "S3DistCp step",
    "HadoopJarStep": {
      "Args": [
        "s3-dist-cp",
        "--s3Endpoint=s3.amazonaws.com",
        "--src=sourcebucket1",
        "--dest=destinationbucket1"
      ],
      "Jar": "command-runner.jar"
    }
  },
  {
    "Name": "S3DistCp step",
    "HadoopJarStep": {
      "Args": [
        "s3-dist-cp",
        "--s3Endpoint=s3.amazonaws.com",
        "--src=sourcebucket2",
        "--dest=destinationbucket2"
      ],
      "Jar": "command-runner.jar"
    }
  },
  {
    "Name": "S3DistCp step",
    "HadoopJarStep": {
      "Args": [
        "s3-dist-cp",
        "--s3Endpoint=s3.amazonaws.com",
        "--src=sourcebucket3",
        "--dest=destinationbucket3"
      ],
      "Jar": "command-runner.jar"
    }
  }
]
Top answer
1 of 4
4

Simply use printf to format the output into JSON

First, you have a very blatant typo in this part of your code right here:

echo "${array[3]:$var-3:4}

Note that there is no closing straight quote ("). I fixed this in the rewrite below.

But more to the point, you can do something like the following (using printf), as suggested in this Stack Overflow answer. Tested and works on CentOS 7.

#!/bin/bash

readarray -t array <<< "$(df -h)";
var=$(echo "${array[1]}"| grep -aob '%' | grep -oE '[0-9]+');
df_output="${array[3]:$var-3:4}";

manufacturer=$(cat /sys/class/dmi/id/chassis_vendor);
product_name=$(cat /sys/class/dmi/id/product_name);
version=$(cat /sys/class/dmi/id/bios_version);
serial_number=$(cat /sys/class/dmi/id/product_serial);
hostname=$(hostname);
operating_system=$(hostnamectl | grep "Operating System" | cut -d ' ' -f5-);
architecture=$(arch);
processor_name=$(awk -F':' '/^model name/ {print $2}' /proc/cpuinfo | uniq | sed -e 's/^[ \t]*//');
memory=$(dmidecode -t 17 | grep "Size.*MB" | awk '{s+=$2} END {print s / 1024"GB"}');
hdd_model=$(cat /sys/block/sda/device/model);
system_main_ip=$(hostname -I);

printf '{"manufacturer":"%s","product_name":"%s","version":"%s","serial_number":"%s","hostname":"%s","operating_system":"%s","architecture":"%s","processor_name":"%s","memory":"%s","hdd_model":"%s","system_main_ip":"%s"}' "$manufacturer" "$product_name" "$version" "$serial_number" "$hostname" "$operating_system" "$architecture" "$processor_name" "$memory" "$hdd_model" "$system_main_ip"

The output I get is this:

{"manufacturer":"Oracle Corporation","product_name":"VirtualBox","version":"VirtualBox","serial_number":"","hostname":"sandbox-centos-7","operating_system":"CentOS Linux 7 (Core)","architecture":"x86_64","processor_name":"Intel(R) Core(TM) i5-1030NG7 CPU @ 1.10GHz","memory":"","hdd_model":"VBOX HARDDISK   ","system_main_ip":"10.0.2.15 192.168.56.20 "}

And if you have jq installed, you can pipe the output of the shell script to jq to “pretty print” the output into some human readable format. Like let’s say your script is named my_script.sh, just pipe it to jq like this:

./my_script.sh | jq

And the output would look like this:

{
  "manufacturer": "Oracle Corporation",
  "product_name": "VirtualBox",
  "version": "VirtualBox",
  "serial_number": "",
  "hostname": "sandbox-centos-7",
  "operating_system": "CentOS Linux 7 (Core)",
  "architecture": "x86_64",
  "processor_name": "Intel(R) Core(TM) i5-1030NG7 CPU @ 1.10GHz",
  "memory": "",
  "hdd_model": "VBOX HARDDISK   ",
  "system_main_ip": "10.0.2.15 192.168.56.20 "
}
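One caveat (my addition, not part of the answer above): printf does not JSON-escape the values, so a field containing a double quote or backslash would produce invalid JSON. If that is a concern, jq's --arg can build the same kind of object safely; the field names here are illustrative:

```shell
# jq escapes whatever the shell variables contain.
manufacturer='ACME "Quality" Inc.'
hostname='box-1'
jq -cn --arg manufacturer "$manufacturer" --arg hostname "$hostname" \
  '{$manufacturer, $hostname}'
# → {"manufacturer":"ACME \"Quality\" Inc.","hostname":"box-1"}
```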
2 of 4
1

The following programs can output json:

lshw:

lshw -json

smartmontools v7+:

smartctl --json --all /dev/sda

lsblk:

lsblk --json

lsipc:

lsipc --json

sfdisk

sfdisk --json
Top answer
1 of 3
2

One possible solution to this:

declare -A aliases
aliases[Index]=components/Index/Exports
aliases[Shared]=components/Shared/Exports
aliases[Icons]=components/Icons/Exports

jq -n --argjson n "${#aliases[@]}" '
        { compileroption: {
                baseurl: ".",
                paths:
                (
                        reduce range($n) as $i ({};
                                .[$ARGS.positional[$i]] = [$ARGS.positional[$i+$n]]
                        )
                )
        } }' --args "${!aliases[@]}" "${aliases[@]}"

This does not use jo; instead, it passes the keys and values of the associative array aliases into jq as positional parameters with --args at the end of the command (--args, if used, must always be the last option). The jq utility receives the keys and values as a single array, $ARGS.positional: the first half of the array contains the keys, and the second half contains the corresponding values.

The body of the jq expression creates the output object and uses a reduce operation over range($n) integers from zero up, where $n is the number of elements in the aliases array. The reduce operation builds the paths object by adding the positional arguments one by one, using the $i:th argument as the key and the $i+$n:th argument as the single element of the corresponding array value.
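A stripped-down version of the same reduce trick, with two hypothetical key/value pairs, shows the two halves lining up:

```shell
# positional = ["k1","k2","v1","v2"]: keys first, then values.
jq -cn --argjson n 2 '
  reduce range($n) as $i ({};
    .[$ARGS.positional[$i]] = [$ARGS.positional[$i+$n]])
' --args k1 k2 v1 v2
# → {"k1":["v1"],"k2":["v2"]}
```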


A slightly different approach using jo to create leaf objects of each key-value pair of the associative array:

declare -A aliases
aliases[Index]=components/Index/Exports
aliases[Shared]=components/Shared/Exports
aliases[Icons]=components/Icons/Exports

for key in "${!aliases[@]}"; do
        jo "$key[]=${aliases[$key]}"
done

This would output the three objects

{"Icons":["components/Icons/Exports"]}
{"Index":["components/Index/Exports"]}
{"Shared":["components/Shared/Exports"]}

Since we're using jo like this, we impose some obvious restrictions on the keys of the array (may not contain =, [] etc.)

We could use jq in place of jo like so:

for key in "${!aliases[@]}"; do
        jq -n --arg key "$key" --arg value "${aliases[$key]}" '.[$key] = [$value]'
done

We may then read these and add them in the correct place in the object we're creating in jq:

declare -A aliases
aliases[Index]=components/Index/Exports
aliases[Shared]=components/Shared/Exports
aliases[Icons]=components/Icons/Exports

for key in "${!aliases[@]}"; do
        jo "$key[]=${aliases[$key]}"
done |
jq -n '{ compileroptions: {
        baseURL: ".",
        paths: (reduce inputs as $item ({}; . += $item)) } }'

The main difference here is that we don't pass stuff into jq as command line options, but rather as a stream of JSON objects.
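The merge step can be exercised on its own with two hardcoded objects:

```shell
# Each input object is folded into the accumulator with +=.
printf '%s\n' '{"Index":["x"]}' '{"Shared":["y"]}' |
  jq -cn 'reduce inputs as $item ({}; . += $item)'
# → {"Index":["x"],"Shared":["y"]}
```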

2 of 3
1

Personally, I'd use perl or other proper programming language instead of a shell (especially bash!). Or at least switch to zsh with a better associative array support and use perl to do the JSONy stuff:

#! /usr/bin/perl
use JSON;

%my_aliases = (qw(
  Index   components/Index/Exports
  Shared  components/Shared/Exports
  Icons   components/Icons/Exports
));

$j->{compilerOptions}->{baseUrl} = "";
$j->{compilerOptions}->{paths}->{$_} = [$my_aliases{$_}] for keys%my_aliases;
print to_json($j, {"pretty" => 1});

Or:

#! /bin/zsh -

typeset -A my_aliases=(
  Index   components/Index/Exports
  Shared  components/Shared/Exports
  Icons   components/Icons/Exports
)

print -rNC1 -- "${(kv@)my_aliases}" |
  perl -MJSON -0e '
    chomp (@records = <>);
    %my_aliases = @records;
    $j->{compilerOptions}->{baseUrl} = "";
    $j->{compilerOptions}->{paths}->{$_} = [$my_aliases{$_}] for keys%my_aliases;
    print to_json($j, {"pretty" => 1})'
Top answer
1 of 2
1

There are two main issues in your data and code:

  1. You have an input file in DOS or Windows text file format.
  2. Your code creates multiple single-element arrays rather than a single array with multiple elements.

Your input file, lol, appears to be a text file in DOS/Windows format. This means that when a utility that expects a Unix text file as input reads the file, each line will have an additional carriage-return character (\r) at the end.

You should convert the file to Unix text file format. This can be done with e.g. dos2unix.

As for your code, you can avoid the shell loop and let jq read the whole file in one go. This allows you to create a single result array rather than a set of arrays, each with a single object, which your code does.

The following assumes that the only thing that varies between the elements of the top-level array in the result is the source value (there is nothing in the question that explains how the max and min values of the source and destination ports should be picked):

jq -n -R '
        [inputs] |
        map( {
                source: .,
                protocol: "17",
                isStateless: true,
                udpOptions: {
                        sourcePortRange: { min: 521, max: 65535 },
                        destinationPortRange: { min: 1, max: 65535 }
                }
        } )' cidr.txt

or in the same compact one-line form as in your question:

jq -n -R '[inputs]|map({source:.,protocol:"17",isStateless:true,udpOptions:{sourcePortRange:{min:521,max:65535},destinationPortRange:{min:1,max:65535}}})' cidr.txt

Using inputs, jq reads the remaining inputs. Together with -R, it will read each line of cidr.txt as a single string. Putting this in an array with [inputs] we create an array of strings. The map() call takes each string from this array and transforms it into the source value for a larger, otherwise static object.
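A minimal demonstration of the -R/-n/inputs combination, using two made-up CIDR lines:

```shell
# With -n and -R, inputs yields one JSON string per input line.
printf '%s\n' '10.0.0.0/8' '192.168.0.0/16' | jq -cnR '[inputs]'
# → ["10.0.0.0/8","192.168.0.0/16"]
```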

Add -c to the invocation of jq to get "compact" output.


If you don't want to, or are unable to, convert the input data from DOS to Unix text form, you can remove the carriage-return characters from within the jq expression instead.

To do this, replace the . after source: with (.|rtrimstr("\r")), including the outer parentheses. This trims the carriage-return from the end of each string read from the file.
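To see the trimming in isolation, here is a single made-up CRLF-terminated line:

```shell
# printf emits a line ending in \r\n; jq strips the \n when
# reading the line, and rtrimstr removes the remaining \r.
printf '10.0.0.0/8\r\n' | jq -cnR '[inputs | rtrimstr("\r")]'
# → ["10.0.0.0/8"]
```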

2 of 2
0


This should get you the exact syntax you require:

In the example, the file containing your CIDR values is named cidr.txt and contains only IP addresses with subnets, i.e. the other parameters remain constant. (If you actually need to vary these additional parameters, i.e. the port ranges you provided are not the same for every CIDR, let me know and I will update my answer with a fully fleshed-out template.)

Additionally, you will need jq, the ubiquitous tool for dealing with JSON from bash. It is likely already installed these days, but if not, sudo apt install jq as usual.

while read -r cidr ; do
   jq -n --arg CIDR "$cidr" '{"source":$CIDR,"protocol":"17","isStateless":true,"udpOptions": {"destinationPortRange":{"max": 65535,"min": 1},"sourcePortRange": {"min":521,"max": 65535}  }}'
done < cidr.txt | jq --slurp .

Using the four-line file sample you provided, the output of the above will give you the following in the terminal:

[
  {
    "source": "1.1.1.0/22",
    "protocol": "17",
    "isStateless": true,
    "udpOptions": {
      "destinationPortRange": {
        "max": 65535,
        "min": 1
      },
      "sourcePortRange": {
        "min": 521,
        "max": 65535
      }
    }
  },
  {
    "source": "2.2.2.0/24",
    "protocol": "17",
    "isStateless": true,
    "udpOptions": {
      "destinationPortRange": {
        "max": 65535,
        "min": 1
      },
      "sourcePortRange": {
        "min": 521,
        "max": 65535
      }
    }
  },
  {
    "source": "5.5.5.0/21",
    "protocol": "17",
    "isStateless": true,
    "udpOptions": {
      "destinationPortRange": {
        "max": 65535,
        "min": 1
      },
      "sourcePortRange": {
        "min": 521,
        "max": 65535
      }
    }
  },
  {
    "source": "6.6.0.0/16",
    "protocol": "17",
    "isStateless": true,
    "udpOptions": {
      "destinationPortRange": {
        "max": 65535,
        "min": 1
      },
      "sourcePortRange": {
        "min": 521,
        "max": 65535
      }
    }
  }
]


UPDATE

In order to fix the above output, you need to "repair" the line termination of your CIDR file. There are two ways of doing so:

Answer 1:

You can make the following changes to your script

#!/bin/bash

# There are four changes made to the script:
# 1. The addition of `tr` in order to eliminate '\r'.
# 2. Removal of '[' and ']' inside the `jq` command.
# 3. Addition of `jq --slurp` to enforce your specified JSON format. 
# 4. Addition of double-quotes around `$lel` to prevent splitting.

 
lel=$(while read -r cidr ; do
       cidr=$(echo "$cidr" | tr -d '\r' )
       jq -n --arg CIDR "$cidr" '{"source":$CIDR,"protocol":"17","isStateless":true,"udpOptions": {"destinationPortRange":{"max": 65535,"min": 1},"sourcePortRange": {"min":521,"max": 65535}  }}'
  done < lol | jq --slurp . )
echo "$lel"

Alternative answer

You can "repair" the file containing our list of CIDRs:

cp lol lol_old
cat lol_old | tr -d '\r' > lol

Then, you can use the earlier version of your script, albeit with the corrections explained in comments 2-4 of the script above.

Explanation

The \r found in your output comes from the formatting of the file containing your CIDRs, which follows the Windows, and not the Unix, line-termination convention.

The \r symbol you see in your output is actually present in your source file as well, where it is used along with \n to terminate each individual line. Both \r and \n are invisible characters. The combination of \r\n is known as CRLF - carriage return + line feed - which is a remnant from the age of typewriters, yet for some reason is still used by Windows systems. On the other hand, Unix uses only LF to terminate lines, where it is represented by \n in its escaped form.

To confirm this peculiar behavior, you can try executing the following:

head -n 1 lol | xxd -ps 
312e312e312e302f32320d0a

In the above output - the first line of your file converted to its hex form - the line ends with 0d0a. This hex combination represents CR+LF. On the other hand, if you execute the following directly in your Bash terminal:

echo "abcd"  | xxd -ps
616263640a

you will find that the output follows Unix standard, where the line termination uses simple 0a, i.e. the hex representation of LF.

Note: This line-termination issue is incredibly common and something one always needs to be on the lookout for when working under Unix with files that may have been generated on Windows.



Info regarding jq

The above example (the while read loop) sends its output to the terminal, but you can of course use redirection if you need to store it in a file, using the standard syntax:

while read cidr; do [...] ; done < cidr.txt > outcidr.json

This file will contain the pretty-printed JSON output, but if you need or prefer the output constrained to a single line, you can do:

cat outcidr.json | tr -d '\n' | tr -s ' '
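One caveat (my addition): tr -s ' ' also squeezes runs of spaces inside JSON string values. jq -c compacts the structure without touching string contents:

```shell
# The double space inside the "k" value survives jq -c; tr -s
# would have squeezed it.
echo '[ {"k": "a  b"} ]' | jq -c .
# → [{"k":"a  b"}]
```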

More importantly, if you ever in the future end up with a single-line, complex JSON output that looks impossible to decipher, jq can be used to reformat and pretty-print it:

echo '[{"source":"1.1.1.0/24","protocol":"17","isStateless":true,"udpOptions":{"destinationPortRange":{"max":55555,"min":10001},"sourcePortRange":{"min":521,"max":65535}}},{"source":"2.2.2.0/24","protocol":"17","isStateless":true,"udpOptions":{"destinationPortRange":{"max":55555,"min":10001},"sourcePortRange":{"min":521,"max":65535}}},{"source":"3.3.3.0/24","protocol":"17","isStateless":true,"udpOptions":{"destinationPortRange":{"max":55555,"min":10001},"sourcePortRange":{"min":521,"max":65535}}}]' > bad_output.json

cat bad_output.json | tr -d '\r' | jq '.'

[
  {
    "source": "1.1.1.0/24",
    "protocol": "17",
    "isStateless": true,
    "udpOptions": {
      "destinationPortRange": {
        "max": 55555,
        "min": 10001
      },
      "sourcePortRange": {
        "min": 521,
        "max": 65535
      }
    }
  },
  {
    "source": "2.2.2.0/24",
    "protocol": "17",
    "isStateless": true,
    "udpOptions": {
      "destinationPortRange": {
        "max": 55555,
        "min": 10001
      },
      "sourcePortRange": {
        "min": 521,
        "max": 65535
      }
    }
  },
  {
    "source": "3.3.3.0/24",
    "protocol": "17",
    "isStateless": true,
    "udpOptions": {
      "destinationPortRange": {
        "max": 55555,
        "min": 10001
      },
      "sourcePortRange": {
        "min": 521,
        "max": 65535
      }
    }
  }
]
# Getting first-order keys for each of the 3 objects 
 
jq '.[] | keys' bad_output.json 

[
  "isStateless",
  "protocol",
  "source",
  "udpOptions"
]
[
  "isStateless",
  "protocol",
  "source",
  "udpOptions"
]
[
  "isStateless",
  "protocol",
  "source",
  "udpOptions"
]

# Getting values corresponding to the selected key
jq '.[] | .source' outcidr.json
"1.1.1.0/22"
"2.2.2.0/24"
"5.5.5.0/21"
"6.6.0.0/16"
Top answer
1 of 4
22

If you really cannot use a proper JSON parser such as jq[1], try an awk-based solution:

Bash 4.x:

readarray -t values < <(awk -F\" 'NF>=3 {print $4}' myfile.json)

Bash 3.x:

IFS=$'\n' read -d '' -ra values < <(awk -F\" 'NF>=3 {print $4}' myfile.json)

This stores all property values in the Bash array ${values[@]}, which you can inspect with declare -p values.

These solutions have limitations:

  • each property must be on its own line,
  • all values must be double-quoted,
  • embedded escaped double quotes are not supported.

All these limitations reinforce the recommendation to use a proper JSON parser.


Note: The following alternative solutions use the Bash 4.x+ readarray -t values command, but they also work with the Bash 3.x alternative, IFS=$'\n' read -d '' -ra values.

grep + cut combination: A single grep command won't do (unless you use GNU grep - see below), but adding cut helps:

readarray -t values < <(grep '"' myfile.json | cut -d '"' -f4)

GNU grep: Using -P to support PCREs, which support \K to drop everything matched so far (a more flexible alternative to a look-behind assertion) as well as look-ahead assertions ((?=...)):

readarray -t values < <(grep -Po ':\s*"\K.+(?="\s*,?\s*$)' myfile.json)

Finally, here's a pure Bash (3.x+) solution:

What makes this a viable alternative in terms of performance is that no external utilities are called in each loop iteration; however, for larger input files, a solution based on external utilities will be much faster.

#!/usr/bin/env bash

declare -a values # declare the array                                                                                                                                                                  

# Read each line and use regex parsing (with Bash's `=~` operator)
# to extract the value.
while read -r line; do
  # Extract the value from between the double quotes
  # and add it to the array.
  [[ $line =~ :[[:blank:]]+\"(.*)\" ]] && values+=( "${BASH_REMATCH[1]}" )
done < myfile.json                                                                                                                                          

declare -p values # print the array

[1] Here's what a robust jq-based solution would look like (Bash 4.x):
readarray -t values < <(jq -r '.[]' myfile.json)

2 of 4
4

jq is good enough to solve this problem

paste -s <(jq '.files[].name' YourJsonString) <(jq '.files[].age' YourJsonString) <( jq '.files[].websiteurl' YourJsonString)

So that you get a table and you can grep any rows or awk print any columns you want