As the commenters have pointed out, this isn't trivial.
But, I'm always up for some gcloud-bashing ;-)
Your code sample suggests you want the answer in PowerShell, which I don't have; I hope you don't mind pointers (what follows is incomplete) in bash:
Does:
- Project ID
- Account name
- Enabled|Disabled
- Keys
- Creation (timestamp)
Doesn't:
- Role assignments
- Last used (audit logs?)
PROJECTS=$(gcloud projects list --format="value(projectId)")
for PROJECT in ${PROJECTS}
do
echo "Project: ${PROJECT}"
# Extracts ACCOUNT_ID, EMAIL (==ACCOUNT_ID@...), DISABLED
ROBOTS=$(\
gcloud iam service-accounts list \
--project=${PROJECT} \
--format="csvno-heading,email,email.split(\"@\").slice(0),disabled)")
for ROBOT in ${ROBOTS}
do
# Parse results
IFS=, read ENCODED_NAME EMAIL ACCOUNT_ID DISABLED <<< ${ROBOT}
NAME=$(echo -e ${ENCODED_NAME} | base64 --decode)
echo " Service Account: ${NAME}"
echo " Disabled: ${DISABLED}"
echo " Email: ${EMAIL}"
# Keys
KEYS=$(\
gcloud iam service-accounts keys list \
--iam-account=${EMAIL} \
--project=${PROJECT} \
--format="value(name.scope(keys))")
for KEY in ${KEYS}
do
echo " Key: ${KEY}"
done
# Creation (Only searches back 30-days!)
FILTER=""\
"logName=\"projects/${PROJECT}/logs/cloudaudit.googleapis.com%2Factivity\" "\
"resource.type=\"service_account\" "\
"protoPayload.methodName=\"google.iam.admin.v1.CreateServiceAccount\" "\
"protoPayload.request.account_id=\"${ACCOUNT_ID}\" "
LOG=$(\
gcloud logging read "${FILTER}" \
--project=${PROJECT} \
--freshness=30d \
--format="value(timestamp)")
echo " Created: ${LOG}"
done
done
Notes
- Service Account creation times -- IIUC -- can only be obtained through audit logs (CreateServiceAccount[Key]). One challenge with this is having to search back through the project's (entire) history to find these.
- Only user-created Service Accounts (created during the project's lifetime) will be found. Google-managed accounts, e.g. App Engine's default account, will not be found.
- The script searches the logs for each account. This is inefficient and it would be better to search the logs once for all accounts and then merge the results.
- Role assignments are difficult because of inheritance. A naive solution would get the IAM policy for each Project, but this is insufficient: it doesn't cover Organization|Folder permissions, nor does it include resource-specific bindings.
- I actually don't know how to grep the logs for last-auth times; I assume this is available through audit logs.
- Apologies for the gnarly base64 encoding of the account's displayName. This is to avoid over-eager parsing of (the majority of) names that include spaces. There's likely a better approach.
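The base64 round-trip can be checked locally without any gcloud calls; a minimal sketch (the display name is made up, plain `base64` from coreutils stands in for gcloud's encode transform):

```shell
#!/bin/bash
# Simulate what displayName.encode("base64") produces, then decode it
# the way the script's inner loop does.
DISPLAY_NAME="My Service Account"   # hypothetical name containing spaces
# Encoding turns the name into a single token that survives word-splitting...
ENCODED_NAME=$(printf '%s' "${DISPLAY_NAME}" | base64)
# ...and decoding recovers it intact, spaces and all.
DECODED=$(echo "${ENCODED_NAME}" | base64 --decode)
echo "Decoded: ${DECODED}"   # → Decoded: My Service Account
```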
Thought I'd share some of my observations about service account keys. I did all this quick testing in GCP, not Qwiklabs.
A service account created using the web console looks like this. Note "No keys" (also the creation date is blank, uhmm.. bug?).
(Screenshot: IAM > Service Accounts.) However, when listed via the CLI, the key is there:
me@cloudshell:~ (tg-project1)$ gcloud iam service-accounts keys list --iam-account=sa1-693@tg-project1.iam.gserviceaccount.com
KEY_ID                                    CREATED_AT            EXPIRES_AT
4af846b08ffd89ca141804910a782cea1d99ff20  2020-02-08T07:42:44Z  2022-02-28T21:12:54Z
me@cloudshell:~ (tg-project1)$
Note the key seems to have a lifetime of ~2 years and 20 days.
Google says in their documentation that they rotate the keys every two weeks (link), which does seem to be the case, since a different service account I had created months ago has a near-week-old key:
me@cloudshell:~ (tg-project1)$ gcloud iam service-accounts keys list --iam-account=tg-project1@appspot.gserviceaccount.com
KEY_ID CREATED_AT EXPIRES_AT
edc07c4a4721131cdfae14687562def66637552f 2020-02-02T15:38:29Z 2022-02-11T13:32:41Z
me@cloudshell:~ (tg-project1)$
If you are prepping for the #ACE exam:
- Service accounts always have keys. GCP console does not always list them.
- Default (GCP-generated) service account keys have a lifetime of ~2yrs but they are rotated every 2 weeks.
It gets interesting when we create a new key for the service account:
me@cloudshell:~ (tg-project1)$ gcloud iam service-accounts keys create ~/sa1-693.json --iam-account=sa1-693@tg-project1.iam.gserviceaccount.com
created key [dd9b1d062013b394c5745aa6de52cb338ef62503] of type [json] as [/home/me/sa1-693.json] for [sa1-693@tg-project1.iam.gserviceaccount.com]
me@cloudshell:~ (tg-project1)$ gcloud iam service-accounts keys list --iam-account=sa1-693@tg-project1.iam.gserviceaccount.com
KEY_ID CREATED_AT EXPIRES_AT
dd9b1d062013b394c5745aa6de52cb338ef62503 2020-02-08T08:10:59Z 9999-12-31T23:59:59Z
4af846b08ffd89ca141804910a782cea1d99ff20 2020-02-08T07:42:44Z 2022-02-28T21:12:54Z
me@cloudshell:~ (tg-project1)$
- The old key still exists, at least after a couple of minutes.
- The new key has the expiration date of Dec 31st 9999 :-P
Tagging: #ACE #AssociateCloudEngineer
Taking @DazWilin's excellent script, with a few mods it generates a Service Account Register or CSV file of all the service accounts, including their descriptions and keys. Removing the log scrape speeds it up.
#! /bin/bash
# Requires permission to list projects, list service accounts, view keys
if [ $# -lt 1 ]
then
echo "usage: $0 csv_output_file"
exit
fi
OUTFILE=$1
# 'prefix' is a placeholder: set this to a filter expression selecting your projects
FILTER='prefix'
PROJECTS=$(gcloud projects list --format="value(projectId)" --sort-by=projectId --filter="${FILTER}")
echo "Project,ServiceAccountName,Account Name,Email,Description,key_id,key_created_at,key_expires_at" > $OUTFILE
for PROJECT in ${PROJECTS}
do
echo "Project: ${PROJECT}"
# Extracts ACCOUNT_ID, EMAIL (==ACCOUNT_ID@...), DISABLED, DESCRIPTION
ROBOTS=$(\
gcloud iam service-accounts list \
--project=${PROJECT} \
--format="csvno-heading,email,email.split(\"@\").slice(0),disabled,description.encode(\"base64\"))")
#echo $ROBOTS
for ROBOT in ${ROBOTS}
do
# Parse results
IFS=, read ENCODED_NAME EMAIL ACCOUNT_ID DISABLED ENCODED_DESCR <<< ${ROBOT}
NAME=$(echo -e ${ENCODED_NAME} | base64 --decode)
DESCR=$(echo -e ${ENCODED_DESCR} | base64 --decode)
echo " Service Account: ${NAME}"
echo " Disabled: ${DISABLED}"
echo " Email: ${EMAIL}"
echo " Descr: ${DESCR}"
RESPONSE=$(\
gcloud iam service-accounts keys list \
--iam-account=${EMAIL} \
--project=${PROJECT} \
--format="csvno-heading,validAfterTime,validBeforeTime)" \
)
IFS=$'\n' rows=($RESPONSE)
for row in "${rows[@]}"
do
echo "$PROJECT,$NAME, $ACCOUNT_ID,$EMAIL,$DESCR,$row" >> $OUTFILE
# To split each row into its fields instead:
# IFS=',' read -r keyname created expires <<< "${row}"
done
done
done
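Because the timestamps are ISO-8601 in UTC, the resulting register can be sorted by key creation time with plain sort, no date parsing needed. A sketch over made-up rows (column 7 is key_created_at per the CSV header above):

```shell
#!/bin/bash
# Sort register rows by the key_created_at column (7th comma-separated field).
# Sample rows are invented; ISO-8601 UTC timestamps sort correctly as strings.
cat > /tmp/register.csv <<'EOF'
proj-a,sa1,sa1,sa1@proj-a.iam.gserviceaccount.com,desc,keyB,2020-02-08T07:42:44Z,2022-02-28T21:12:54Z
proj-a,sa2,sa2,sa2@proj-a.iam.gserviceaccount.com,desc,keyA,2020-02-02T15:38:29Z,2022-02-11T13:32:41Z
EOF
sort -t, -k7,7 /tmp/register.csv   # oldest key (keyA) sorts first
```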
The underlying REST method is projects.serviceAccounts.keys.list and the result is a ServiceAccountKey, which contains valid[Before|After]Time; these are strings in protobuf Timestamp format.
So, I think this needs to be either a string comparison of dates (!) or a comparison of durations (but I'm unfamiliar with the duration format and how to compare them).
You can convert the validAfterTime to a duration, i.e. --filter=validAfterTime.duration() (see duration) and then compare (!), but, as Durations, these aren't straightforward to compare.
Or construct a date that's within your scope and compare as-is. The following is hacky, please proceed with caution:
PROJECT=...
ACCOUNT=...
PAST=$(date -d "-90 days" +%Y-%m-%dT%H:%M:%SZ)
EMAIL="${ACCOUNT}@${PROJECT}.iam.gserviceaccount.com"
gcloud iam service-accounts keys list \
--iam-account=${EMAIL} \
--project=${PROJECT} \
--filter="validAfterTime<${PAST}"
I think there's a better way to do this!
I've ended up using the above method with ISO dates generated in a script, and it seems to be working now. It feels like the kind of thing that should be handled nicely with the filters, but getting that working is taking more time than a bash date.
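The ISO-date trick works because RFC 3339 / ISO-8601 UTC timestamps are lexicographically ordered, so string comparison doubles as date comparison. A gcloud-free sketch (the key timestamp is made up; `date -d` is GNU date):

```shell
#!/bin/bash
# Compare an ISO-8601 UTC timestamp against a "90 days ago" cutoff
# using plain string comparison.
CUTOFF=$(date -u -d "-90 days" +%Y-%m-%dT%H:%M:%SZ)   # GNU date only
KEY_CREATED="2020-02-08T07:42:44Z"                     # made-up validAfterTime
if [[ "${KEY_CREATED}" < "${CUTOFF}" ]]; then
  echo "key is older than 90 days"
else
  echo "key is within 90 days"
fi
```

The same lexicographic property is what makes the `--filter="validAfterTime<${PAST}"` comparison above behave as a date comparison.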