UPDATE
I've documented my problem and solution in much more detail here »
I had been trying to deploy my script as a Cloud Run Service, when I should have deployed it as a Cloud Run Job. The difference: Cloud Run services require your script to listen on a port; jobs do not.

Confusingly, you cannot deploy a Cloud Run job directly from Artifact Registry; you have to start from the Cloud Run dashboard.
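In case it helps, here is a minimal sketch of what a Cloud Run Job entrypoint can look like: no server, no port, it just does its work and exits. Cloud Run Jobs inject the CLOUD_RUN_TASK_INDEX and CLOUD_RUN_TASK_COUNT environment variables; the defaults below let the same script run locally. The function name is my own:

```python
import os

def run_task():
    # Cloud Run Jobs inject these variables; the defaults
    # make the script runnable on a local machine too
    index = int(os.environ.get("CLOUD_RUN_TASK_INDEX", 0))
    count = int(os.environ.get("CLOUD_RUN_TASK_COUNT", 1))
    # ... the real work would go here ...
    return f"task {index} of {count} finished"

if __name__ == "__main__":
    print(run_task())
```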
Your Flask application should be something like below:

import os

from flask import Flask

app = Flask(__name__)

@app.route("/")
def hello_world():
    return "Hello World!"

if __name__ == "__main__":
    app.run(debug=True, host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
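For what it's worth, the key detail above is reading the listening port from the PORT environment variable that Cloud Run injects. Here is a stdlib-only sketch of the same pattern, in case you want to try it without Flask (the helper and handler names are my own):

```python
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

def get_port(default=8080):
    # Cloud Run injects PORT; fall back to 8080 for local runs
    return int(os.environ.get("PORT", default))

class Hello(BaseHTTPRequestHandler):
    def do_GET(self):
        body = b"Hello World!"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

# To actually serve:
# HTTPServer(("0.0.0.0", get_port()), Hello).serve_forever()
```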
See the official documentation for step-by-step instructions: Deploy a Python service to Cloud Run
There is a plugin called Cloud Code which makes testing and deployment easy. I use it with VS Code; once the initial setup and permissions are taken care of, a few clicks let you run locally, debug, and deploy Cloud Run services from your own machine.
I just published a tutorial series on how to automate a Python script in Google Cloud using Cloud Functions and/or Cloud Run. Feedback would be great. Thanks!
- Automating Python with Google Cloud
- Automating Python with Google Cloud Functions
- Automating Python with Google Cloud Run
I finally figured this out, so I'll post the answer that worked for me on my own post here. I'm using Debian Stretch on my VM, and I'm assuming you already uploaded your file(s) to the VM and are in the same directory as your script.
Make your script executable:
chmod +x myscript.py
Run it with nohup so it keeps running after you log out; the & runs it in the background. I've added a shebang line to my Python script, so there's no need to call python here:
nohup /path/to/script/myscript.py &
Log out from the shell if you want:
logout
Done! Now your script is up and running. You can log back in and make sure your process is still alive by checking the output of this command:
ps -e | grep myscript.py
If anything went wrong, you can check out the nohup.out file to see the output of your script:
cat nohup.out
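For reference, the shebang mentioned above is the first line of the script itself. A hypothetical myscript.py might look like this (the function name is my own):

```python
#!/usr/bin/env python3
# myscript.py - launched via nohup; the shebang lets the shell run it directly
import datetime

def run_once():
    # placeholder for the real work; anything printed ends up in nohup.out
    return "ran at " + datetime.datetime.now().isoformat()

if __name__ == "__main__":
    print(run_once(), flush=True)
```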
There is an even simpler approach to running code in the background on GCP, or in any Linux terminal: GNU screen.
Create a new background terminal window:
screen -S WRITE_A_NAME_OF_YOUR_CHOICE_HERE
Now you are in a background window in the terminal. Run your code:
python3 mycode.py
Detach from screen with this hotkey sequence and the job will keep running in the background:
ctrl + A, then D
You can close all windows now. If you want to go back and see what's happening, log in to your terminal again and type the following:
screen -ls
This gives you the list of the created "windows" (sessions). Find yours and type:
screen -r WRITE_NAME_OF_YOUR_WINDOW
And there you have it :D You can find more commands here
If you "just" need to run notebooks in GCP, there is a service for that: AI Platform Notebooks.
From your notebook instance you can work and download data you need, just be sure your instance has the resources (memory and CPU) you'll need, and do not forget to shut the instance down when you're not using it to avoid charges.
If a notebook is too much for you, you can always run everything in a compute instance (that's a virtual machine). Just launch it from the web and ssh into it with the web application (no need to install ssh, etc.), then you can upload the code to the instance. Just remember to start with a small instance ;-)
So I followed Inigo's advice, plus a little code I found elsewhere (starting with "python on google" plus a quick lab).
Create the VM as per here, SSH in, and run this code:
sudo apt-get update
sudo apt-get install git -y
sudo apt-get install python3-setuptools python3-dev build-essential python3-venv -y
curl https://bootstrap.pypa.io/get-pip.py -o get-pip.py
# wget as suggested in the docs didn't work for me
sudo python3 get-pip.py
python3 --version
pip3 --version
python3 -m venv env
source env/bin/activate
Click the sprocket (gear) icon on the right to upload the files, then:
pip install -r requirements.txt
python3 filename.py
Thanks for the comments and tips gang. This has got me moving
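As a quick sanity check after source env/bin/activate, you can confirm Python is actually running inside the venv. The helper name here is my own; it just compares sys.prefix to the base interpreter's prefix:

```python
import sys

def in_virtualenv():
    # inside a venv, sys.prefix points at the env while base_prefix
    # still points at the system interpreter
    return sys.prefix != getattr(sys, "base_prefix", sys.prefix)

print("venv active:", in_virtualenv())
```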
I recently took on the challenge of automating a Python script, on a schedule with Docker and Google Cloud. While it sounds simple, this was quite tricky for me (a data scientist - not an engineer). I documented my steps in the referenced post for others who may find this useful.
Various posts in this sub helped me get this accomplished. (Particularly this).
Note that Google Cloud Run Services are different from Google Cloud Run Jobs. Services require your app to listen for HTTP requests; jobs don't. This subtle distinction tripped me up, and I don't think it's talked about enough.