I have the following string in bash:
"3.8,3.9,3.10"
Is there a way, using the shell, to convert it into a JSON array, i.e.
["3.8", "3.9", "3.10"]
Answer from kev on Stack Overflow:
You can do this:
X=("hello world" "goodnight moon")
printf '%s\n' "${X[@]}" | jq -R . | jq -s .
Output:
[
"hello world",
"goodnight moon"
]
Since jq 1.6 you can do this:
jq --compact-output --null-input '$ARGS.positional' --args -- "${X[@]}"
giving:
["hello world","goodnight moon"]
This has the benefit that no escaping is required at all. It handles strings containing newlines, tabs, double quotes, backslashes and other control characters. (Well, it doesn't handle NUL characters but you can't have them in a bash array in the first place.)
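Applied to the original comma-separated string, a sketch of the same technique: split on commas with read -a first, then hand the pieces to jq:

```shell
s="3.8,3.9,3.10"
# Split the string on commas into a bash array.
IFS=',' read -ra parts <<< "$s"
# Let jq build the JSON array; no manual escaping needed.
jq --compact-output --null-input '$ARGS.positional' --args -- "${parts[@]}"
# ["3.8","3.9","3.10"]
```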
Simply use printf to format the output into JSON
First, you have a typo in this part of your code:
echo "${array[3]:$var-3:4}
Note there is no closing double quote: ". That is fixed in the rewrite below.
But more to the point, you can do something like this (using printf), as suggested in this Stack Overflow answer. Tested and working on CentOS 7.
#!/bin/bash
readarray -t array <<< "$(df -h)";
var=$(echo "${array[1]}"| grep -aob '%' | grep -oE '[0-9]+');
df_output="${array[3]:$var-3:4}";
manufacturer=$(cat /sys/class/dmi/id/chassis_vendor);
product_name=$(cat /sys/class/dmi/id/product_name);
version=$(cat /sys/class/dmi/id/bios_version);
serial_number=$(cat /sys/class/dmi/id/product_serial);
hostname=$(hostname);
operating_system=$(hostnamectl | grep "Operating System" | cut -d ' ' -f5-);
architecture=$(arch);
processor_name=$(awk -F':' '/^model name/ {print $2}' /proc/cpuinfo | uniq | sed -e 's/^[ \t]*//');
memory=$(dmidecode -t 17 | grep "Size.*MB" | awk '{s+=$2} END {print s / 1024"GB"}');
hdd_model=$(cat /sys/block/sda/device/model);
system_main_ip=$(hostname -I);
printf '{"manufacturer":"%s","product_name":"%s","version":"%s","serial_number":"%s","hostname":"%s","operating_system":"%s","architecture":"%s","processor_name":"%s","memory":"%s","hdd_model":"%s","system_main_ip":"%s"}' "$manufacturer" "$product_name" "$version" "$serial_number" "$hostname" "$operating_system" "$architecture" "$processor_name" "$memory" "$hdd_model" "$system_main_ip"
The output I get is this:
{"manufacturer":"Oracle Corporation","product_name":"VirtualBox","version":"VirtualBox","serial_number":"","hostname":"sandbox-centos-7","operating_system":"CentOS Linux 7 (Core)","architecture":"x86_64","processor_name":"Intel(R) Core(TM) i5-1030NG7 CPU @ 1.10GHz","memory":"","hdd_model":"VBOX HARDDISK ","system_main_ip":"10.0.2.15 192.168.56.20 "}
And if you have jq installed, you can pipe the output of the shell script to jq to pretty-print it into a human-readable format. For example, if your script is named my_script.sh, pipe it to jq like this:
./my_script.sh | jq .
And the output would look like this:
{
"manufacturer": "Oracle Corporation",
"product_name": "VirtualBox",
"version": "VirtualBox",
"serial_number": "",
"hostname": "sandbox-centos-7",
"operating_system": "CentOS Linux 7 (Core)",
"architecture": "x86_64",
"processor_name": "Intel(R) Core(TM) i5-1030NG7 CPU @ 1.10GHz",
"memory": "",
"hdd_model": "VBOX HARDDISK ",
"system_main_ip": "10.0.2.15 192.168.56.20 "
}
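One caveat with the printf approach: values containing double quotes or backslashes are interpolated verbatim and can break the JSON. If jq is available anyway, a sketch of a safer variant for a couple of the fields (same variable names as above):

```shell
manufacturer=$(cat /sys/class/dmi/id/chassis_vendor 2>/dev/null || echo unknown)
hostname=$(hostname)
# jq --arg escapes quotes, backslashes, and control characters for us.
jq -cn --arg manufacturer "$manufacturer" --arg hostname "$hostname" \
  '{manufacturer: $manufacturer, hostname: $hostname}'
```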
The following programs can output json:
lshw:
lshw -json
smartmontools v7+:
smartctl --json --all /dev/sda
lsblk:
lsblk --json
lsipc:
lsipc --json
sfdisk:
sfdisk --json
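These outputs combine naturally with jq. As a sketch, here is jq pulling device names out of the structure lsblk --json emits (using an inline abridged sample so it runs anywhere):

```shell
# Abridged shape of `lsblk --json` output.
json='{"blockdevices":[{"name":"sda","size":"50G"},{"name":"sdb","size":"10G"}]}'
# -r prints raw strings instead of JSON-quoted ones.
echo "$json" | jq -r '.blockdevices[].name'
# sda
# sdb
```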
I am writing a bash script for an Alfred Workflow. In this script I get a list (separated with newlines) from a command that I want to convert into a specific JSON format.
I tried storing the output into an array and parsing that in jq like that:
Command output:
$ piactl get regions auto france netherlands
Create array:
$ IFS=$'\n'
$ regions=($(piactl get regions))
$ echo "${regions[@]}"
auto france netherlands
Parse to jq:
$ jq -n --arg inarr "${regions}" '{ items: $inarr | split("\n") }'
{
"items": [
"auto"
]
}
jq only outputs one item of the array, and I don't know how to shape the JSON like the wanted output below.
Wanted output:
{"items": [
{
"title": "auto",
"arg": "auto",
"icon": {
"path": "icon.png"
}
},
{
"title": "france",
"arg": "france",
"icon": {
"path": "icon.png"
}
},
{
"title": "netherlands",
"arg": "netherlands",
"icon": {
"path": "icon.png"
}
},
]}
Can somebody help me craft the correct jq arguments for this task?
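For what it's worth, a sketch of one jq incantation that produces the wanted shape (a printf of sample lines stands in for piactl here):

```shell
# Each input line becomes a JSON string (-R), then -s collects them
# into one array, which map() reshapes into the Alfred item objects.
regions=$'auto\nfrance\nnetherlands'   # stand-in for: piactl get regions
printf '%s\n' "$regions" |
  jq -R . |
  jq -s '{items: map({title: ., arg: ., icon: {path: "icon.png"}})}'
```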
So I need to make a pure JSON string using a variable, and I'm bashing (no pun intended) my head against the wall trying to get the result I want:
The desired result looks like this:
'{"app":"app-version.tar","db":"db-version.tar"}'
The variable I have would contain the version info (say, $VERSION), and I CANNOT get proper JSON out of something like
'{"app":"app-'$VERSION'.tar","db":"db-'$VERSION'.tar"}'
Anyone able to help?
Edit: I also need to echo the EXACT string being generated at some earlier point as well, for making sure everything is correct.
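A sketch of one way to meet both requirements (valid JSON, plus an exact echo of the generated string) using jq's --arg, which handles the quoting:

```shell
VERSION="1.0.0"   # hypothetical example value
# \($v) is jq string interpolation; --arg passes the shell value in safely.
json=$(jq -cn --arg v "$VERSION" '{app: "app-\($v).tar", db: "db-\($v).tar"}')
echo "$json"   # echo the exact string being generated
# {"app":"app-1.0.0.tar","db":"db-1.0.0.tar"}
```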
I was also trying to convert a bunch of lines into a JSON array, and was at a standstill until I realized that -s was the only way I could handle more than one line at a time in the jq expression, even if that meant I'd have to parse the newlines manually.
jq -R -s -c 'split("\n")' < just_lines.txt
- -R to read raw input
- -s to read all input as a single string
- -c to not pretty-print the output
Easy peasy.
Edit: I'm on jq ≥ 1.4, which is apparently when the split built-in was introduced.
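One wrinkle worth knowing with this approach: if the input ends with a newline (as text files usually do), split("\n") yields a trailing empty string. A sketch of filtering it out:

```shell
printf 'a\nb\n' | jq -R -s -c 'split("\n")'
# ["a","b",""]
# Drop the empty trailing element with select():
printf 'a\nb\n' | jq -R -s -c 'split("\n") | map(select(length > 0))'
# ["a","b"]
```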
--raw-input, then --slurp
Just summarizing what the others have said in a hopefully quicker to understand form:
cat /etc/hosts | jq --raw-input . | jq --slurp .
will return you:
[
"fe00::0 ip6-localnet",
"ff00::0 ip6-mcastprefix",
"ff02::1 ip6-allnodes",
"ff02::2 ip6-allrouters"
]
Explanation
--raw-input/-R:
Don't parse the input as JSON. Instead, each line of text is passed
to the filter as a string. If combined with --slurp, then the
entire input is passed to the filter as a single long string.
--slurp/-s:
Instead of running the filter for each JSON object in the input,
read the entire input stream into a large array and run the filter
just once.
The problem is that jq still just outputs lines of text; you can't necessarily preserve each array element as a single unit. That said, as long as a newline is not a valid character in any object (with -c, newlines inside strings are escaped, so this holds), you can still output each object on a separate line.
get_json_array | jq -c '.[]' | while IFS= read -r object; do
api_call "$object"
done
Under that assumption, you could use the readarray command in bash 4 to build an array:
readarray -t conversations < <(get_json_array | jq -c '.[]')
for conversation in "${conversations[@]}"; do
api_call "$conversation"
done
Here is a solution without loops:
json=$(jq -c ".my_key[]" ./my-file.json)
json_without_quotes=${json//\"/}
declare -a all_apps_array=($(echo "$json_without_quotes" | tr "\n" " "))
I'd get jq to output the results line-wise. Then use the bash mapfile command to read the lines into an array
mapfile -t correct < <(jq -r '.results[] | .correct_answer' <<<"$quiz")
declare -p correct
declare -a correct=([0]="Amazon" [1]="60 lbs" [2]="True")
For the incorrect answers, since bash does not have multi-dimensional arrays, I'd get jq to output the array of incorrect answers as CSV:
mapfile -t incorrect < <(jq -r '.results[] | .incorrect_answers | @csv' <<<"$quiz")
declare -p incorrect
declare -a incorrect=([0]="\"Netflix\",\"BBC\",\"CCTV\"" [1]="\"30 lbs\",\"50 lbs\",\"70 lbs\"" [2]="\"False\"")
Then:
$ echo "q $i: ans=${correct[i]}; wrong=${incorrect[i]}"
q 1: ans=60 lbs; wrong="30 lbs","50 lbs","70 lbs"
Documentation:
- Process Substitution
- Here Strings
Assuming you're interacting with the user:
i=0
read -p "Answer to question $i: " answer
if [[ $answer == "${correct[i]}" ]]; then
echo Correct
elif [[ ${incorrect[i]} == *"\"$answer\""* ]]; then
echo Incorrect
else
echo Invalid answer
fi
Keep in mind that within [[...]], the == operator is not a string-equality operator; it is a pattern-matching operator.
- the first test is a straightforward string comparison
- the second test sees if the CSV string holding the incorrect answers contains the answer in double quotes
Extending @RomanPerekhrest's answer (go upvote it now):
mapfile -t answers < <(jq -r '.results[] | [.correct_answer] + .incorrect_answers | @sh' <<<"$quiz")
declare -p answers
declare -a answers=([0]="'Amazon' 'Netflix' 'BBC' 'CCTV'" [1]="'60 lbs' '30 lbs' '50 lbs' '70 lbs'" [2]="'True' 'False'")
Then, you can use something like this
for i in "${!answers[@]}"; do
declare -a "this_answers=( ${answers[i]} )"
echo $i
printf " > %s\n" "${this_answers[@]}"
done
0
> Amazon
> Netflix
> BBC
> CCTV
1
> 60 lbs
> 30 lbs
> 50 lbs
> 70 lbs
2
> True
> False
Hi guys,
I am a linux noob and am trying to write a script to extract info from a mkv file using mkvmerge but am not able to convert the target json script to a bash array. I have tried a number of solutions from stack overflow but with no success.
Here are some of my attempts:
dir="/mnt/Anime/Series/KonoSuba/Season 2/[Nep_Blanc] KonoSuba II 10 .mkv"
*********************************************************************************
ARRAY_SIZE=$(mkvmerge -J "$dir" | jq '.tracks | length')
count=0
arr=()
while [ $count -lt $ARRAY_SIZE ];
do
arr+=($(mkvmerge -J "$dir" | jq '.tracks'[$count]))
((count++))
done
*********************************************************************************
readarray -t test_array < <(mkvmerge -J "$dir" | jq '.tracks')
for element in "${test_array[@]}";
do
echo "$element"
done
*********************************************************************************
array=($(mkvmerge -J "$dir" | jq '.tracks' | sed -e 's/^\[/(/' -e 's/\]$/)/'))
but the echo prints out lines instead of the specific objects.
Though it is now helping me with my Python, the project was originally meant to help me learn bash scripting. I would really like a bash implementation, so any help overcoming this roadblock would be appreciated.
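The usual fix for this kind of roadblock is to have jq emit one compact object per line with -c '.tracks[]', then read those lines with readarray. A sketch using an inline sample in place of the mkvmerge -J output:

```shell
# Abridged shape of `mkvmerge -J` output.
json='{"tracks":[{"id":0,"type":"video"},{"id":1,"type":"audio"}]}'
# -c prints one compact JSON object per line; readarray keeps each line whole.
readarray -t tracks < <(echo "$json" | jq -c '.tracks[]')
echo "${tracks[1]}"
# {"id":1,"type":"audio"}
```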
If you really cannot use a proper JSON parser such as jq[1]
, try an awk-based solution:
Bash 4.x:
readarray -t values < <(awk -F\" 'NF>=3 {print $4}' myfile.json)
Bash 3.x:
IFS=$'\n' read -d '' -ra values < <(awk -F\" 'NF>=3 {print $4}' myfile.json)
This stores all property values in Bash array ${values[@]}, which you can inspect with
declare -p values.
These solutions have limitations:
- each property must be on its own line,
- all values must be double-quoted,
- embedded escaped double quotes are not supported.
All these limitations reinforce the recommendation to use a proper JSON parser.
Note: The following alternative solutions use the Bash 4.x+ readarray -t values command, but they also work with the Bash 3.x alternative, IFS=$'\n' read -d '' -ra values.
grep + cut combination: A single grep command won't do (unless you use GNU grep - see below), but adding cut helps:
readarray -t values < <(grep '"' myfile.json | cut -d '"' -f4)
GNU grep: Using -P to support PCREs, which support \K to drop everything matched so far (a more flexible alternative to a look-behind assertion) as well as look-ahead assertions ((?=...)):
readarray -t values < <(grep -Po ':\s*"\K.+(?="\s*,?\s*$)' myfile.json)
Finally, here's a pure Bash (3.x+) solution:
What makes this a viable alternative in terms of performance is that no external utilities are called in each loop iteration; however, for larger input files, a solution based on external utilities will be much faster.
#!/usr/bin/env bash
declare -a values # declare the array
# Read each line and use regex parsing (with Bash's `=~` operator)
# to extract the value.
while read -r line; do
# Extract the value from between the double quotes
# and add it to the array.
[[ $line =~ :[[:blank:]]+\"(.*)\" ]] && values+=( "${BASH_REMATCH[1]}" )
done < myfile.json
declare -p values # print the array
[1] Here's what a robust jq-based solution would look like (Bash 4.x):
readarray -t values < <(jq -r '.[]' myfile.json)
jq is good enough to solve this problem:
paste -s <(jq '.files[].name' YourJsonString) <(jq '.files[].age' YourJsonString) <(jq '.files[].websiteurl' YourJsonString)
This gives you a table, so you can grep the rows you want or print columns with awk.
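A sketch with an inline sample (note that jq -r strips the JSON quotes, and dropping -s gives one row per record rather than one row per field):

```shell
json='{"files":[{"name":"alice","age":30,"websiteurl":"a.example"},{"name":"bob","age":25,"websiteurl":"b.example"}]}'
# paste joins the parallel columns with tabs, one row per record.
paste <(echo "$json" | jq -r '.files[].name') \
      <(echo "$json" | jq -r '.files[].age') \
      <(echo "$json" | jq -r '.files[].websiteurl')
# alice	30	a.example
# bob	25	b.example
```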