Use parentheses to evaluate $key early as in:

 jq --null-input --arg key foobar '{($key): "value"}'

See also: Parentheses in JQ for .key

Answer from Hans Z. on Stack Overflow
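The same pattern extends to a dynamic value as well; a minimal sketch (the key/value names here are illustrative, not from the answer):

```shell
# Both the key and the value come from shell variables; the parentheses
# force $key to be evaluated as an expression in key position.
k="color" v="blue"
jq -cn --arg key "$k" --arg val "$v" '{($key): $val}'
# → {"color":"blue"}
```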
Discussions

How to dynamically find a value of a key-value in json in bash script - Unix & Linux Stack Exchange
jq how to pass json keys from a shell variable - Stack Overflow
Anyhow -- this is a case where what you claim you want is best implemented with jq -r ".[] | [$keys]" -- note double quotes instead of single quotes, so the shell is generating code for jq to execute. That's usually undesirable, but this is a special case.
bash - Passing variable to jq to edit a json file - Unix & Linux Stack Exchange
r/bash on Reddit: Is it possible to use dynamic key in jq
March 23, 2022

Is it possible to do something like this:

jq -r --argjson i 0 --arg key "name" '.projects[$i].$key' input-file

I've tried enclosing $key in both parenthesis and square brackets unsuccessfully.
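For what it's worth, the generic object index works where .$key does not; a sketch with hypothetical input matching the shape the question implies:

```shell
# .$key is not valid jq syntax; [$key] indexes an object by a variable key.
echo '{"projects": [{"name": "alpha"}, {"name": "beta"}]}' |
  jq -r --argjson i 0 --arg key "name" '.projects[$i][$key]'
# → alpha
```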

Variables in jq on Exercism
Like the identity expression ., the variable assignment's output is the same as its input. Setting the variable's value is a side-effect. Example, showing . after the assignment is the same as the initial input: $ jq -c -n '42 | (.
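The snippet above is truncated; a complete example in the same spirit (my own reconstruction, not Exercism's), showing that a binding passes its input through while making the variable available downstream:

```shell
# Binding . to $x leaves the pipeline input unchanged; $x can then be
# reused later in the same pipeline.
jq -n '42 | . as $x | ., ($x * 2)'
# → 42
# → 84
```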
How to access shell variable value as key name · Issue #1561 · jqlang/jq
December 21, 2017 - keyname="person2" value={this will hold the value} echo $(cat $Name.json | jq --arg foo bar '. + {$keyname: $value}') ... jq: error: syntax error, unexpected ':' (Unix shell quoting issues?) at , line 1: . + {: $foo} jq: 1 compile error · Can't we use the shell variable values as keyname ?
Author   prudhviravella
Top answer
1 of 1
2

The main problem seems to be in your JSON data, because you have:

{'a': 'apple', 'b', 'bananna'}

As far as I know, single quotes are not valid in JSON format (use double quotes).
The other problem I see is that you have a comma , between 'b' and 'bananna' (I assume you didn't notice that and typed , instead of :).

So your JSON data should look like this:

{"a": "apple", "b": "bananna"}

What is the proper syntax to get a key?

I usually see the syntax '.key' used most often, but use whichever works. For example, in your case you are using a variable $key to get a specific value from a JSON key, so you could try something like ".$key" or ."$key" or .$key (I'm not sure these are recommended):

Solution 1

FRUIT=$(echo "$ALIASES" | jq ".$key" -r)
#or
FRUIT=$(echo "$ALIASES" | jq ."$key" -r)
#or
FRUIT=$(echo "$ALIASES" | jq .$key -r)

If you use single quotes, the shell will not expand any variable inside the string you pass to the jq command. According to man bash:

Enclosing characters in single quotes (') preserves the literal value of each character within the quotes. A single quote may not occur between single quotes, even when preceded by a backslash.
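A quick illustration of that rule (my own, not part of the answer):

```shell
# Single quotes keep $key literal; double quotes let the shell expand it.
key="a"
echo '.$key'   # prints the literal text .$key
echo ".$key"   # prints .a
```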

Therefore, if you want to get a JSON value using single quotes, you have to spell out the key literally:

FRUIT=$(echo "$ALIASES" | jq '.somekey' -r)
#e.g.
FRUIT=$(echo "$ALIASES" | jq '.a' -r)
echo "$FRUIT"
#Output:
apple

You can check these answers for a better understanding of single and double quotes.

Solution 2 (using jq --arg)
I'm not sure why the code above (using double quotes) doesn't work for you. However, there is another way to pass variables to jq; you can try:

val="a"
echo "$ALIASES" | jq --arg key "$val" '.[$key]' -r
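A self-contained version of Solution 2, with the sample data from above standing in for $ALIASES:

```shell
# --arg binds the shell value to the jq variable $key; .[$key] then
# indexes the object by that (dynamic) key.
ALIASES='{"a": "apple", "b": "bananna"}'
val="a"
printf '%s' "$ALIASES" | jq -r --arg key "$val" '.[$key]'
# → apple
```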

Since you are using Python to get the JSON, you should use json.dumps(your_json) to emit valid JSON, for example:

python3 -c 'import json;aliases = json.load(open("file"));print(json.dumps(aliases))'
jq 1.8 Manual
If you run jq with --argjson foo 123, then $foo is available in the program and has the value 123. ... This option reads all the JSON texts in the named file and binds an array of the parsed JSON values to the given global variable.
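The practical difference between --arg and --argjson is the type of the bound value; a small sketch:

```shell
# --arg always binds a string; --argjson parses its value as JSON first,
# so $b is a number and can be used in arithmetic.
jq -cn --arg a 123 --argjson b 123 '{a: $a, b: $b, sum: ($b + 1)}'
# → {"a":"123","b":123,"sum":124}
```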
Guide to Passing Bash Variables to jq | Baeldung on Linux
March 18, 2024 - Further, let's assume that we want to override the value of the fruit field with a user-defined value that's available through the bash_fruit_var Bash variable: ... Next, let's go ahead and write a jq program using the --arg option to display the fruit field from the bash_fruit_var variable and all other field values from the fruits_template.json file using the .field operator:
Using Variables in JQ (Command Line JSON Parser) - PHPFog.com
January 29, 2021 - To use a simple variable in jq use the --arg option. Using --arg var value will set the variable $var to value.
JSON jq key/value mapper
October 3, 2023 - This log filter will parse JSON data in a rundeck job step and create key-value data as Rundeck variables in the data context. The filter uses the jq library to run jq queries against the returned data.
Spaces in Key's for JSON jq key/value Mapper
So it’s the Key’s I am having an issue with, not the values. It seems you can’t call the keys if they have spaces in them. ... And then try to call ${data.first name} you’ll see it won’t return a value (assume the variable ${data.xxxxx} can’t tolerate a space)
Top answer
1 of 2
7

You can use square bracket indexing on all objects in jq, so [$name] works for what you're trying:

jq --arg key1 true --arg name "$name" '.Linux.script_executed[$name] = $key1 ...' 

This use of square brackets is not very well documented in the manual, which makes it look like you can only use .[xyz], but ["x"] works anywhere that .x would have as long as it's not right at the start of an expression (that is, .a.x and .a["x"] are the same, but ["x"] is an array construction).

Note the use of single quotes above - that is so Bash won't try to interpret $name and $key1 as shell variables. You should keep the double quotes for --arg name "$name", because that really is a shell variable, and it should be quoted to make it safe to use.
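To make the equivalence concrete, a small sketch (sample data is mine) showing that .a.x and .a["x"] address the same field, and that the bracket form also reaches keys containing spaces:

```shell
# The first two expressions are equivalent; the third selects a key that
# dotted syntax could never express.
echo '{"a": {"x": 1}, "first name": "Ada"}' |
  jq -c '[.a.x, .a["x"], .["first name"]]'
# → [1,1,"Ada"]
```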

2 of 2
-1

Michael Homer has the correct response. I'll post what I was aiming for in relation to op's question.

I am trying to modify a permissions graph obtained through a REST API curl call using jq to output a json file which I will PUT to the server with updates. The following is the output of the curl API query:

{
  "revision": 2,
  "groups": {
    "1": {
      "1": {
        "native": "none",
        "schemas": "none"
      },
      "2": {
        "native": "write",
        "schemas": "all"
      }
    },
    "2": {
      "1": {
        "native": "write",
        "schemas": "all"
      },
      "2": {
        "native": "write",
        "schemas": "all"
      }
    }
  }
}

As you can see, it is nested. I am trying to modify this using bash variables every time a new permission set is created and when databases are created. For example, I'm trying to modify the groups."1"."2"."native" and "schemas" values, and also the groups."2"."2".native and "schemas" values. It's rough, but the keys are as follows: groups.groupPermissionID.DatabaseID.*

In order to modify this nested graph on the fly via a bash shell script, I used Michael Homer's solution of applying [$name]. In my case I had to do this twice in a row. i.e., [$gID][$dID]. In the following setup the variables are constant but in my model they are command arguments passed to the bash shell script.

dbID="2"
pgroupID="2"

curl -X GET -H "Content-Type: application/json" -H "X-Metabase-Session: XXXXX-XXXXX-XXXXX-XXXXXXXXX" "http://localhost:9000/api/permissions/graph" | jq --arg dID "$dbID" --arg gID "$pgroupID" -r '.groups."1"[$dID]."native" = "none" | .groups."1"[$dID]."schemas" = "none" | .groups[$gID][$dID]."native" = "none" | .groups[$gID][$dID]."schemas" = "none"' > permissiongraph.json

Which produced the following updated JSON graph for me to PUT to my server:

{
  "revision": 2,
  "groups": {
    "1": {
      "1": {
        "native": "none",
        "schemas": "none"
      },
      "2": {
        "native": "none",
        "schemas": "none"
      }
    },
    "2": {
      "1": {
        "native": "write",
        "schemas": "all"
      },
      "2": {
        "native": "none",
        "schemas": "none"
      }
    }
  }
}

Michael is correct when he said this is sparsely documented. I couldn't find it anywhere in the manuals. I hope this helps someone.

Top answer
1 of 2
2

Congratulations on figuring out how to use to_entries.

One small suggestion is to avoid using shell interpolation to "construct" the jq program. A much better way to achieve the desired goal is to pass in the relevant values on the command-line. In your case, the following would be appropriate:

$ jq --arg username "$defaultusername" '
  .orgs | to_entries[] | select(.value == $username ).key'

Another small point is to avoid using echo to send JSON to STDIN. There are several possibilities, including these patterns:

  • if you are using bash: jq .... <<< "$JSON"
  • use printf "%s" "$JSON" | jq ...
  • jq -n --argjson JSON "$JSON" '$JSON | ...'

In your case, the last of these alternatives would look like this:

$ jq -n --arg username "$defaultusername" --argjson JSON "$aliasConfig" '
    $JSON
    | .orgs | to_entries[] | select(.value == $username ).key'
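An end-to-end run of that last form, with hypothetical alias data shaped like sfdx's alias.json:

```shell
# $JSON is bound to the parsed alias data; to_entries/select looks up the
# alias (key) whose value matches the username.
aliasConfig='{"orgs": {"dev": "user@dev.example", "prod": "user@prod.example"}}'
defaultusername="user@prod.example"
jq -rn --arg username "$defaultusername" --argjson JSON "$aliasConfig" '
  $JSON | .orgs | to_entries[] | select(.value == $username).key'
# → prod
```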
2 of 2
0

I think I got it figured out here:

get_sfdx_defaultusername() {
    config="$(cat .sfdx/sfdx-config.json 2> /dev/null)";
    globalConfig="$(cat ~/.sfdx/sfdx-config.json)";
    aliasConfig="$(cat ~/.sfdx/alias.json)";

    defaultusername="$(echo "${config}" | jq -r .defaultusername)"
    defaultusernamealias="$(echo "${aliasConfig}" | jq -r '.orgs | to_entries[] | select(.value =="'$defaultusername'").key' )"
    globaldefaultusername="$(echo "${globalConfig}" | jq -r .defaultusername)"

    if [ "$defaultusernamealias" != "null" ]
    then
        echoString=$echoString$defaultusernamealias"$txtylw (alias)"
    elif [ "$defaultusername" != "null" ]
    then
        echoString=$echoString$defaultusername"$txtylw (local)"
    else
        echoString=$echoString$globaldefaultusername"$txtylw (global)"
    fi
    echo "$echoString\n"
}

This allows me to show my current defaultusername org.

In case anyone is interested in using this or contributing to it, I published a github repo here

Top answer
1 of 4
56

Your original version isn't going to be evalable because the author name has spaces in it - it would be interpreted as running a command Doe with the environment variable AUTHOR set to John. There's also virtually never a need to pipe jq to itself - the internal piping & dataflow can connect different filters together.

All of this is only sensible if you completely trust the input data (e.g. it's generated by a tool you control). There are several possible problems otherwise detailed below, but let's assume the data itself is certain to be in the format you expect for the moment.

You can make a much simpler version of your jq program:

jq -r '.SITE_DATA | to_entries | .[] | .key + "=" + (.value | @sh)'

which outputs:

URL='example.com'
AUTHOR='John Doe'
CREATED='10/22/2017'

There's no need for a map: .[] deals with taking each object in the array through the rest of the pipeline as a separate item, so everything after the last | is applied to each one separately. At the end, we just assemble a valid shell assignment string with ordinary + concatenation, including appropriate quotes & escaping around the value with @sh.

All the pipes matter here - without them you get fairly unhelpful error messages, where parts of the program are evaluated in subtly different contexts.

This string is evalable if you completely trust the input data and has the effect you want:

eval "$(jq -r '.SITE_DATA | to_entries | .[] | .key + "=" + (.value | @sh)' < data.json)"
echo "$AUTHOR"

As ever when using eval, be careful that you trust the data you're getting, since if it's malicious or just in an unexpected format things could go very wrong. In particular, if the key contains shell metacharacters like $ or whitespace, this could create a running command. It could also overwrite, for example, the PATH environment variable unexpectedly.

If you don't trust the data, either don't do this at all or filter the object to contain just the keys you want first:

jq '.SITE_DATA | { AUTHOR, URL, CREATED } | ...'

You could also have a problem in the case that the value is an array, so .value | tostring | @sh will be better - but this list of caveats may be a good reason not to do any of this in the first place.
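A sketch of the difference (example data is mine): @sh on an array emits separate quoted words, which would break a VAR=value assignment, while tostring first collapses the array into a single JSON string:

```shell
# First output: @sh applied to the array directly (space-separated words).
# Second output: the array serialized to one string, then shell-quoted.
echo '{"TAGS": ["a", "b"]}' | jq -r '.TAGS | @sh, (tostring | @sh)'
# → 'a' 'b'
# → '["a","b"]'
```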


It's also possible to build up an associative array instead where both keys and values are quoted:

eval "declare -A data=($(jq -r '.SITE_DATA | to_entries | .[] | @sh "[\(.key)]=\(.value)"' < test.json))"

After this, ${data[CREATED]} contains the creation date, and so on, regardless of what the content of the keys or values are. This is the safest option, but doesn't result in top-level variables that could be exported. It may still produce a Bash syntax error when a value is an array, or a jq error if it is an object, but won't execute code or overwrite anything.

2 of 4
20

Building on @Michael Homer's answer, you can avoid a potentially-unsafe eval entirely by reading the data into an associative array.

For example, if your JSON data is in a file called file.json:

#!/bin/bash

typeset -A myarray

while IFS== read -r key value; do
    myarray["$key"]="$value"
done < <(jq -r '.SITE_DATA | to_entries | .[] | .key + "=" + .value ' file.json)

# show the array definition
typeset -p myarray

# make use of the array variables
echo "URL = '${myarray[URL]}'"
echo "CREATED = '${myarray[CREATED]}'"
echo "AUTHOR = '${myarray[AUTHOR]}'"

Output:

$ ./read-into-array.sh 
declare -A myarray=([CREATED]="10/22/2017" [AUTHOR]="John Doe" [URL]="example.com" )
URL = 'example.com'
CREATED = '10/22/2017'
AUTHOR = 'John Doe'