UPDATE
I've documented my problem and solution in much more detail here »
I had been trying to deploy my script as a Cloud Run Service. I should have deployed it as a Cloud Run Job instead. The difference is that a Cloud Run Service requires your script to listen on a port; a Job does not.

Confusingly, you cannot deploy a Cloud Run Job directly from Artifact Registry. You have to start from the Cloud Run dashboard.
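Alternatively, a job built from an image already in Artifact Registry can be created from the gcloud CLI; the project, region, and image path below are placeholders:

```shell
# Placeholders throughout: substitute your own project, repo, image, and region
gcloud run jobs create my-job \
  --image=us-central1-docker.pkg.dev/MY_PROJECT/my-repo/my-image:latest \
  --region=us-central1

# Run the job once
gcloud run jobs execute my-job --region=us-central1
```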
(Answer from Ben on Stack Overflow)
Your Flask application should look something like this:
import os

from flask import Flask

app = Flask(__name__)

@app.route("/")
def hello_world():
    return "Hello World!"

if __name__ == "__main__":
    app.run(debug=True, host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
See the official documentation for step-by-step instructions: Deploy a Python service to Cloud Run
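The documentation linked above also covers deploying straight from source. As a sketch, assuming gcloud is installed and you are in the directory containing main.py and requirements.txt (the service name and region are placeholders):

```shell
# Builds the container with Cloud Buildpacks and deploys it as a service
gcloud run deploy my-service --source . --region=us-central1 --allow-unauthenticated
```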
There is a plugin called Cloud Code IDE plugin which makes testing and deployment easy. I use it with VS Code; once the initial setup and permissions are taken care of, a few clicks let you run locally, debug, and deploy Cloud Run services from your local machine.
If you just need to run your code in notebooks on GCP, there is a service for that: AI Platform Notebooks.
From your notebook instance you can work with and download the data you need; just be sure your instance has the resources (memory and CPU) you'll need, and do not forget to shut the instance down when you're not using it to avoid charges.
If a notebook is too much for you, you can always run everything in a Compute Engine instance (that's a virtual machine). Just launch it from the web console and SSH into it with the web application (no need to install an SSH client, etc.), then you can upload your code into the instance. Just remember to start with a small instance ;-)
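Launching that small instance from the CLI instead of the web console might look like this (the instance name, zone, and machine type are placeholders; e2-small keeps costs low):

```shell
# Create a small VM (names and zone are placeholders)
gcloud compute instances create my-script-vm \
  --zone=us-central1-a \
  --machine-type=e2-small

# SSH into it from your terminal (or use the browser SSH window instead)
gcloud compute ssh my-script-vm --zone=us-central1-a

# When you're done, stop the instance to avoid charges
gcloud compute instances stop my-script-vm --zone=us-central1-a
```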
So I followed Inigo's advice, plus a little code I found elsewhere (starting with Python on Google, plus a quick lab).
Create the VM as described here, SSH in, and run this code:
sudo apt-get update
sudo apt-get install git -y
sudo apt-get install python3-setuptools python3-dev build-essential python3-venv -y
curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py
# wget as suggested in the docs didn't work for me
sudo python3 get-pip.py
python3 --version
pip3 --version
python3 -m venv env
source env/bin/activate
Use the sprocket (gear) menu on the right of the SSH window to upload the files, then:
pip install -r requirements.txt
python3 filename.py
Thanks for the comments and tips, gang. This has got me moving.
I just published a tutorial series on how to automate a Python script in Google Cloud using Cloud Functions and/or Cloud Run. Feedback would be great. Thanks!
- Automating Python with Google Cloud
- Automating Python with Google Cloud Functions
- Automating Python with Google Cloud Run
I finally figured this out, so I'll post the same answer that worked for me on my own post here. I'm using Debian Stretch on my VM. I'm assuming you have already uploaded your file(s) to the VM and that you are in the same directory as your script.
Make your script executable:
chmod +x myscript.py
Run the nohup command to execute the script in the background; the & runs it in the background, and nohup keeps the process alive after you log out. I've added a shebang line to my Python script, so there's no need to call python here:
nohup /path/to/script/myscript.py &
Log out from the shell if you want:
logout
Done! Now your script is up and running. You can log back in and check that your process is still alive with:
ps -e | grep myscript.py
If anything went wrong, you can check out the nohup.out file to see the output of your script:
cat nohup.out
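A self-contained sketch of the steps above, using a tiny hypothetical stand-in for myscript.py; note the explicit redirect, since nohup only creates nohup.out on its own when stdout is a terminal:

```shell
# Create a tiny stand-in script (hypothetical; substitute your own myscript.py)
printf '#!/usr/bin/env python3\nprint("still running")\n' > myscript.py
chmod +x myscript.py

# Redirect explicitly: nohup only writes nohup.out itself when stdout is a terminal
nohup ./myscript.py > nohup.out 2>&1 &

wait            # wait for the background job to finish (demo only; skip for long jobs)
cat nohup.out   # contains: still running
```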
There is an even simpler approach to running code in the background on GCP, and in any Linux terminal: using GNU screen.
Create a new background terminal window:
screen -S WRITE_A_NAME_OF_YOUR_CHOICE_HERE
Now you are in a background window in the terminal. Run your code:
python3 mycode.py
Detach from the screen with the hotkey below, and the job will keep running in the background:
ctrl + A, then D
You can close all windows now. If you want to go back and see what's happening, log in to your terminal again and type the following:
screen -ls
This will give you the list of created "windows". Now find yours and type:
screen -r WRITE_NAME_OF_YOUR_WINDOW
And there you have it :D You can find more commands here.
Basically, I'm better at programming in Python than in JavaScript, so instead of using Apps Script for complicated things, I'm wondering what's the best way to:
0 - Run a Python script from a cloud server that will...
1 - ...read a Google Spreadsheet's data...
2 - ...and output a new CSV and doc file to my Drive
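A minimal sketch of the read-then-export part: the Sheets read is shown commented out (it assumes the gspread library and service-account credentials, neither confirmed here), while rows_to_csv and the sample rows are hypothetical but runnable:

```python
import csv
import io

# Step 1 (assumption: the gspread library with a service account; needs credentials):
# import gspread
# gc = gspread.service_account()
# rows = gc.open("my-sheet").sheet1.get_all_records()

def rows_to_csv(rows):
    """Turn a list of dicts (one per sheet row) into CSV text."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

# Hypothetical stand-in for the fetched sheet rows
rows = [{"name": "Ada", "score": 90}, {"name": "Grace", "score": 95}]
print(rows_to_csv(rows))
```

The resulting string could then be written to a file or uploaded to Drive via its API.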
This is just me messing about - but I'm wondering what the 'best' (or most common) approach to this might be? Basically I have a Python module that will generate some fake data, and I want to write it to a storage bucket on a scheduled cloud job. So every X minutes it'll generate some synthetic data into some bucket (assume <=1GB of data).
It seems that Cloud Run / Cloud Functions are the most obvious choices for this task, but which one, and why? (Or is there something else more suited to running Python code like this?)
Reading some of the documentation for Cloud Run https://cloud.google.com/run/docs/developing, it seems to be geared towards web services (most of the examples seem to use Flask...), and I'm not sure that's what I'm after here.
I figured it'd be a fairly easy question for people with GCP experience.
Thanks
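One shape the scheduled function could take, as a sketch: the data generator is hypothetical, and the bucket upload is commented out because it assumes the google-cloud-storage client library, a bucket you own, and GCP credentials:

```python
import random
import string

def generate_fake_data(n_rows):
    """Produce n_rows lines of synthetic id,value CSV data (hypothetical generator)."""
    lines = ["id,value"]
    for i in range(n_rows):
        value = "".join(random.choices(string.ascii_lowercase, k=8))
        lines.append(f"{i},{value}")
    return "\n".join(lines)

def main(request=None):
    """Entry-point shape for an HTTP-triggered function invoked by Cloud Scheduler."""
    data = generate_fake_data(100)
    # Assumption: google-cloud-storage and a bucket named "my-bucket" (placeholder):
    # from google.cloud import storage
    # storage.Client().bucket("my-bucket").blob("synthetic/data.csv").upload_from_string(data)
    return "ok"

print(main())  # -> ok
```

Cloud Scheduler would then hit this endpoint every X minutes; no Flask routing is needed beyond the entry point.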
First, you will need to install Cloud SDK: https://cloud.google.com/sdk/downloads#apt-get
Then, the simplest way is to run your script through your terminal (Mac, and I presume the instructions also work on Linux):
- Configure your project: gcloud config set project insert_your_project_name
- Set up SSH keys: gcloud compute config-ssh
- Connect to the VM: gcloud beta compute ssh vm_name --internal-ip
- Run script: python your_script.py
You can also connect PyCharm directly to GCP and run everything on your VM but you will need PyCharm Pro, otherwise the deployment option is not available. Let me know if this works.
Also, if you want the interactive version of setting up your project, then in the first step do this instead: gcloud init
The other option is to set up a Jupyter notebook on GCP. You can use the following command to run Jupyter Notebook in the background:
nohup jupyter notebook --ip=0.0.0.0 &
Now you can tunnel to it by SSHing into GCP:
ssh username@<public_ip> -L 8888:127.0.0.1:8888
Now you should be able to access the notebook from your local machine at the following URL in your browser:
127.0.0.1:8888