Answer from Hans Z. on Stack Overflow

Use parentheses to evaluate $key early as in:
jq --null-input --arg key foobar '{($key): "value"}'
See also: Parentheses in JQ for .key
You can also use string interpolation in jq, which has the form "\(..)": inside a string, you can put an expression in parentheses after a backslash, and whatever the expression returns is interpolated into the string.
So you can do the following, where the contents of the variable $key are expanded and returned as a string by the interpolation sequence:
jq --null-input --arg key foobar '{ "\($key)": "value"}'
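To confirm that the two approaches behave identically, here is a minimal sketch (foobar is just a sample key name) comparing both invocations side by side:

```shell
# Both forms produce the same object; -c prints compact JSON for easy comparison.
a=$(jq --null-input --arg key foobar -c '{($key): "value"}')
b=$(jq --null-input --arg key foobar -c '{"\($key)": "value"}')
echo "$a"   # {"foobar":"value"}
echo "$b"   # {"foobar":"value"}
```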
How to access shell variable value as key name
Is it possible to do something like this:
jq -r --argjson i 0 --arg key "name" '.projects[$i].$key' input-file
I've tried enclosing $key in both parentheses and square brackets, unsuccessfully.
Fetching the value of a jq variable doesn't cause it to be executed as jq code.
Furthermore, jq lacks the facility to take a string, compile it as jq code, and evaluate the result. (This is commonly known as eval.)
So, short of writing a jq parser and evaluator in jq, you will need to impose limits and/or accept a different format.
For example,
keys='[ [ "key1", "childkey" ], [ "key2", "childkey2" ] ]' # JSON
jq --argjson keys "$keys" '.[] | [ getpath( $keys[] ) ]' file.json
or
keys='key1.childkey,key2.childkey2'
jq --arg keys "$keys" '
( ( $keys / "," ) | map( . / "." ) ) as $keys |
.[] | [ getpath( $keys[] ) ]
' file.json
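To see the second variant end to end, here is a sketch against hypothetical input data, shaped so that each comma-separated path actually resolves:

```shell
# Hypothetical input: one object whose nested paths match key1.childkey and key2.childkey2.
cat > /tmp/paths.json <<'EOF'
[{"key1": {"childkey": 1}, "key2": {"childkey2": 2}}]
EOF

keys='key1.childkey,key2.childkey2'
jq -c --arg keys "$keys" '
  ( ( $keys / "," ) | map( . / "." ) ) as $keys |
  .[] | [ getpath( $keys[] ) ]
' /tmp/paths.json   # prints [1,2]
```

Dividing a string by a string splits it in jq, so $keys / "," yields the path strings and . / "." turns each into the array form that getpath expects.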
Suppose you have:
cat file
[{
"key1":1,
"key2":2
}]
[{
"key1":1,
"key2":2
}]
You can use a jq command like so:
jq '.[] | [.key1,.key2]' file
[
1,
2
]
[
1,
2
]
You can use -f to execute a filter from a file and nothing keeps you from creating the file separately from the shell variables.
Example:
keys=".key1"
echo ".[] | [${keys}]" >jqf
jq -f jqf file
[
1
]
[
1
]
Or just build the string directly into jq:
# note double " causing string interpolation
jq ".[] | [${keys}]" file
You can use square-bracket indexing on all objects in jq, so [$name] works for what you're trying to do:
jq --arg key1 true --arg name "$name" '.Linux.script_executed[$name] = $key1 ...'
This use of square brackets is not very well documented in the manual, which makes it look like you can only use .[xyz], but ["x"] works anywhere .x would as long as it's not right at the start of an expression (that is, .a.x and .a["x"] are the same, but a bare ["x"] at the start of an expression is an array construction).
Note the use of single quotes above - that is so Bash won't try to interpret $name and $key1 as shell variables. You should keep the double quotes for --arg name "$name", because that really is a shell variable, and it should be quoted to make it safe to use.
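A self-contained sketch of this pattern (the object is built from scratch with -n; note that --arg always passes a string, so $key1 arrives as the string "true", not a boolean — use --argjson if you need a real boolean):

```shell
name="myhost"   # hypothetical value; in practice this comes from your script
# Assignment through a nonexistent path auto-creates the intermediate objects.
jq -nc --arg key1 true --arg name "$name" \
  '.Linux.script_executed[$name] = $key1'
# prints {"Linux":{"script_executed":{"myhost":"true"}}}
```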
Michael Homer has the correct response. I'll post what I was aiming for in relation to op's question.
I am trying to modify a permissions graph obtained through a REST API curl call using jq to output a json file which I will PUT to the server with updates. The following is the output of the curl API query:
{
"revision": 2,
"groups": {
"1": {
"1": {
"native": "none",
"schemas": "none"
},
"2": {
"native": "write",
"schemas": "all"
}
},
"2": {
"1": {
"native": "write",
"schemas": "all"
},
"2": {
"native": "write",
"schemas": "all"
}
}
}
}
As you can see it is nested. I am trying to modify this using bash variables every time a new permission set is created and when databases are created. For example, I'm trying to modify groups."1"."2"."native" and "schemas", and also groups."2"."2".native and "schemas" values. It's rough, but the keys are as follows: groups.groupPermissionID.DatabaseID.*
In order to modify this nested graph on the fly via a bash shell script, I used Michael Homer's solution of applying [$name]. In my case I had to do this twice in a row. i.e., [$gID][$dID]. In the following setup the variables are constant but in my model they are command arguments passed to the bash shell script.
dbID="2"
pgroupID="2"
curl -X GET -H "Content-Type: application/json" -H "X-Metabase-Session: XXXXX-XXXXX-XXXXX-XXXXXXXXX" "http://localhost:9000/api/permissions/graph" | jq --arg dID "$dbID" --arg gID "$pgroupID" -r '.groups."1"[$dID]."native" = "none" | .groups."1"[$dID]."schemas" = "none" | .groups[$gID][$dID]."native" = "none" | .groups[$gID][$dID]."schemas" = "none"' > permissiongraph.json
Which produced the following updated JSON graph for me to PUT to my server:
{
"revision": 2,
"groups": {
"1": {
"1": {
"native": "none",
"schemas": "none"
},
"2": {
"native": "none",
"schemas": "none"
}
},
"2": {
"1": {
"native": "write",
"schemas": "all"
},
"2": {
"native": "none",
"schemas": "none"
}
}
}
}
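The same transformation can be tried offline, without curl or a session token; this sketch assumes the original graph returned by the API has been saved to /tmp/graph.json:

```shell
# Compact copy of the original permissions graph from the API.
cat > /tmp/graph.json <<'EOF'
{"revision":2,"groups":{"1":{"1":{"native":"none","schemas":"none"},"2":{"native":"write","schemas":"all"}},"2":{"1":{"native":"write","schemas":"all"},"2":{"native":"write","schemas":"all"}}}}
EOF

dbID="2"
pgroupID="2"
jq --arg dID "$dbID" --arg gID "$pgroupID" '
  .groups."1"[$dID].native  = "none" |
  .groups."1"[$dID].schemas = "none" |
  .groups[$gID][$dID].native  = "none" |
  .groups[$gID][$dID].schemas = "none"
' /tmp/graph.json > /tmp/permissiongraph.json
```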
Michael is correct when he said this is sparsely documented. I couldn't find it anywhere in the manuals. I hope this helps someone.
Consider also passing in the shell variable (EMAILID) as a jq variable (here also EMAILID, for the sake of illustration):
projectID=$(jq -r --arg EMAILID "$EMAILID" '
.resource[]
| select(.username==$EMAILID)
| .id' file.json)
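A runnable sketch of that pattern, with a hypothetical file.json (the usernames and ids are made up for illustration):

```shell
# Hypothetical input data.
cat > /tmp/file.json <<'EOF'
{"resource": [{"username": "a@example.com", "id": 7},
              {"username": "b@example.com", "id": 9}]}
EOF

EMAILID="a@example.com"
projectID=$(jq -r --arg EMAILID "$EMAILID" '
  .resource[]
  | select(.username==$EMAILID)
  | .id' /tmp/file.json)
echo "$projectID"   # prints 7
```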
Postscript
For the record, another possibility would be to use jq's env function for accessing environment variables. For example, consider this sequence of bash commands:
[email protected] # not exported
EMAILID="$EMAILID" jq -n 'env.EMAILID'
The output is a JSON string:
"[email protected]"
shell arrays
Unfortunately, shell arrays are a different kettle of fish. Here are two SO resources regarding the ingestion of such arrays:
JQ - create JSON array using bash array with space
Convert bash array to json array and insert to file using jq
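One common pattern from those threads, sketched here: feed the array elements to jq one per line with -R (raw input) and slurp them into a JSON array with -s. This assumes the elements themselves contain no newlines.

```shell
arr=("alpha beta" "gamma")
# -R reads each line as a raw JSON string; -s collects the strings into one array.
printf '%s\n' "${arr[@]}" | jq -R . | jq -cs .
# prints ["alpha beta","gamma"]
```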
I resolved this issue by escaping the inner double quotes
projectID=$(cat file.json | jq -r ".resource[] | select(.username==\"$EMAILID\") | .id")
Congratulations on figuring out how to use to_entries.
One small suggestion is to avoid using shell interpolation to "construct" the jq program. A much better way to achieve the desired goal is to pass in the relevant values on the command-line. In your case, the following would be appropriate:
$ jq --arg username "$defaultusername" '
.orgs | to_entries[] | select(.value == $username ).key'
Another small point is to avoid using echo to send JSON to STDIN. There are several possibilities, including these patterns:
- if you are using bash: jq .... <<< "$JSON"
- use printf "%s" "$JSON" | jq ...
- jq -n --argjson JSON "$JSON" '$JSON | ...'
In your case, the last of these alternatives would look like this:
$ jq --arg username "$defaultusername" --argjson JSON "$aliasConfig" '
$JSON
| .orgs | to_entries[] | select(.value == $username ).key'
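Here is that last alternative as a runnable sketch, with a made-up aliasConfig for illustration:

```shell
# Hypothetical alias configuration and default username.
aliasConfig='{"orgs": {"dev": "dev@example.com", "prod": "prod@example.com"}}'
defaultusername="dev@example.com"

jq -rn --arg username "$defaultusername" --argjson JSON "$aliasConfig" '
  $JSON
  | .orgs | to_entries[] | select(.value == $username).key'
# prints dev
```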
I think I got it figured out here:
get_sfdx_defaultusername() {
  config="$(cat .sfdx/sfdx-config.json 2> /dev/null)"
  globalConfig="$(cat ~/.sfdx/sfdx-config.json)"
  aliasConfig="$(cat ~/.sfdx/alias.json)"
  defaultusername="$(echo "${config}" | jq -r .defaultusername)"
  # Quote the shell variable into the jq program (--arg would be safer here).
  defaultusernamealias="$(echo "${aliasConfig}" | jq -r '.orgs | to_entries[] | select(.value == "'"$defaultusername"'").key')"
  globaldefaultusername="$(echo "${globalConfig}" | jq -r .defaultusername)"
  # select emits nothing on no match, so also guard against an empty alias.
  if [ -n "$defaultusernamealias" ] && [ "$defaultusernamealias" != "null" ]
  then
    echoString=$echoString$defaultusernamealias"$txtylw (alias)"
  elif [ "$defaultusername" != "null" ]
  then
    echoString=$echoString$defaultusername"$txtylw (local)"
  else
    echoString=$echoString$globaldefaultusername"$txtylw (global)"
  fi
  echo -e "$echoString\n"
}
This allows me to show my current defaultusername org in my shell prompt.
In case anyone is interested in using this or contributing to it, I published a github repo here
Your original version isn't going to be evalable because the author name has spaces in it - it would be interpreted as running a command Doe with the environment variable AUTHOR set to John. There's also virtually never a need to pipe jq to itself - the internal piping & dataflow can connect different filters together.
All of this is only sensible if you completely trust the input data (e.g. it's generated by a tool you control). There are several possible problems otherwise detailed below, but let's assume the data itself is certain to be in the format you expect for the moment.
You can make a much simpler version of your jq program:
jq -r '.SITE_DATA | to_entries | .[] | .key + "=" + (.value | @sh)'
which outputs:
URL='example.com'
AUTHOR='John Doe'
CREATED='10/22/2017'
There's no need for a map: .[] deals with taking each object in the array through the rest of the pipeline as a separate item, so everything after the last | is applied to each one separately. At the end, we just assemble a valid shell assignment string with ordinary + concatenation, including appropriate quotes & escaping around the value with @sh.
All the pipes matter here - without them you get fairly unhelpful error messages, where parts of the program are evaluated in subtly different contexts.
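To reproduce the output above, this sketch assumes the JSON lives in data.json with the three SITE_DATA fields shown:

```shell
# Sample data matching the output shown above.
cat > /tmp/data.json <<'EOF'
{"SITE_DATA": {"URL": "example.com", "AUTHOR": "John Doe", "CREATED": "10/22/2017"}}
EOF

jq -r '.SITE_DATA | to_entries | .[] | .key + "=" + (.value | @sh)' /tmp/data.json
# prints:
# URL='example.com'
# AUTHOR='John Doe'
# CREATED='10/22/2017'
```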
This string is evalable if you completely trust the input data and has the effect you want:
eval "$(jq -r '.SITE_DATA | to_entries | .[] | .key + "=" + (.value | @sh)' < data.json)"
echo "$AUTHOR"
As ever when using eval, be careful that you trust the data you're getting, since if it's malicious or just in an unexpected format things could go very wrong. In particular, if the key contains shell metacharacters like $ or whitespace, this could create a running command. It could also overwrite, for example, the PATH environment variable unexpectedly.
If you don't trust the data, either don't do this at all or filter the object to contain just the keys you want first:
jq '.SITE_DATA | { AUTHOR, URL, CREATED } | ...'
You could also have a problem in the case that the value is an array, so .value | tostring | @sh will be better - but this list of caveats may be a good reason not to do any of this in the first place.
It's also possible to build up an associative array instead where both keys and values are quoted:
eval "declare -A data=($(jq -r '.SITE_DATA | to_entries | .[] | @sh "[\(.key)]=\(.value)"' < test.json))"
After this, ${data[CREATED]} contains the creation date, and so on, regardless of what the content of the keys or values are. This is the safest option, but doesn't result in top-level variables that could be exported. It may still produce a Bash syntax error when a value is an array, or a jq error if it is an object, but won't execute code or overwrite anything.
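An end-to-end sketch of the associative-array variant, again assuming the sample data is in data.json (requires bash, for declare -A):

```shell
cat > /tmp/data.json <<'EOF'
{"SITE_DATA": {"URL": "example.com", "AUTHOR": "John Doe", "CREATED": "10/22/2017"}}
EOF

# @sh quotes each interpolated piece, yielding ['URL']='example.com' etc.
eval "declare -A data=($(jq -r '.SITE_DATA | to_entries | .[] | @sh "[\(.key)]=\(.value)"' < /tmp/data.json))"
echo "${data[AUTHOR]}"   # prints John Doe
```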
Building on @Michael Homer's answer, you can avoid a potentially-unsafe eval entirely by reading the data into an associative array.
For example, if your JSON data is in a file called file.json:
#!/bin/bash
typeset -A myarray
while IFS== read -r key value; do
myarray["$key"]="$value"
done < <(jq -r '.SITE_DATA | to_entries | .[] | .key + "=" + .value ' file.json)
# show the array definition
typeset -p myarray
# make use of the array variables
echo "URL = '${myarray[URL]}'"
echo "CREATED = '${myarray[CREATED]}'"
echo "AUTHOR = '${myarray[AUTHOR]}'"
Output:
$ ./read-into-array.sh
declare -A myarray=([CREATED]="10/22/2017" [AUTHOR]="John Doe" [URL]="example.com" )
URL = 'example.com'
CREATED = '10/22/2017'
AUTHOR = 'John Doe'