I think your question lacks some technical context, but if your Dockerfile is just:
FROM python:3
RUN pip install requests
then it is missing the specification of the app files to be deployed. A more complete Dockerfile would look like:
FROM python:3
WORKDIR /app
COPY . /app
RUN pip install --no-cache-dir -r requirements.txt
CMD ["python", "main.py"]
(If requests is listed in requirements.txt, there is no need for a separate RUN pip install requests.)
If that's the case, I recommend reading up on how to build a proper Dockerfile.
Hope it helps. :D
— Answer from Ariel Carvalho on Stack Overflow
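The Dockerfile above can be built and tested locally before deploying anywhere (a sketch; the image name my-app is a placeholder):

```shell
# Build the image from the directory containing the Dockerfile.
docker build -t my-app .

# Run it locally; the container executes CMD ["python", "main.py"].
docker run --rm my-app
```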
Besides main.py, you need two more files in the local folder:
requirements.txt
where you define the libraries and their versions, as published on pypi.org:
requests==2.31.0
Dockerfile
FROM python:3.9
EXPOSE 8080
ENV PORT 8080
WORKDIR /home
COPY . /home
RUN pip install -r /home/requirements.txt
CMD python3 /home/main.py
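With those three files in place, one way to build and deploy is via the gcloud CLI (a sketch; PROJECT_ID, the region, and the service name are placeholders, and this assumes gcloud is installed and authenticated):

```shell
# Build the image with Cloud Build and push it to the registry.
gcloud builds submit --tag gcr.io/PROJECT_ID/my-service

# Deploy the image as a Cloud Run service.
gcloud run deploy my-service \
  --image gcr.io/PROJECT_ID/my-service \
  --region us-central1 \
  --allow-unauthenticated
```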
Related:
How to automate a Python script with Docker and Google Cloud Run
docker - How to run a "hello world" python script with Google Cloud Run - Stack Overflow
How to create a Google Cloud Job/Service/Run based on a Docker image - Stack Overflow
Python apps very slow on Google Cloud Run
I recently took on the challenge of automating a Python script, on a schedule with Docker and Google Cloud. While it sounds simple, this was quite tricky for me (a data scientist - not an engineer). I documented my steps in the referenced post for others who may find this useful.
Various posts in this sub helped me get this accomplished. (Particularly this).
Note that Google Cloud Run Services are different from Google Cloud Run Jobs. Services require your app to listen for HTTP requests; Jobs don't. This subtle distinction tripped me up, and I don't think it's talked about enough.
UPDATE
I've documented my problem and solution in much more detail here »
I had been trying to deploy my script as a Cloud Run Service. I should have deployed it as a Cloud Run Job. The difference is that Cloud Run Services require your script to listen on a port; Jobs do not.
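The distinction shows up directly in code: a Job is just a script that runs to completion, while a Service must bind to the port Cloud Run passes in the PORT environment variable. A minimal sketch using only the standard library (the function names are illustrative):

```python
import os
from http.server import BaseHTTPRequestHandler, HTTPServer


def job_main():
    # A Cloud Run *Job* can be a plain script: do the work, then exit.
    print("doing the batch work...")


class Handler(BaseHTTPRequestHandler):
    # A Cloud Run *Service* must answer HTTP requests.
    def do_GET(self):
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")


def service_main():
    # Cloud Run tells the container which port to listen on via PORT.
    port = int(os.environ.get("PORT", 8080))
    HTTPServer(("0.0.0.0", port), Handler).serve_forever()


if __name__ == "__main__":
    job_main()
```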

Confusingly, you cannot deploy a Cloud Run Job directly from Artifact Registry. You have to start from the Cloud Run dashboard.
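Alternatively, the gcloud CLI can create a Job straight from an image (a sketch; the job name, image path, and region are placeholders, and this assumes a recent gcloud release with Cloud Run Jobs support):

```shell
# Create a Cloud Run job from an image in Artifact Registry.
gcloud run jobs create my-job \
  --image us-central1-docker.pkg.dev/PROJECT_ID/my-repo/my-image \
  --region us-central1

# Run it once; pair with Cloud Scheduler for a recurring schedule.
gcloud run jobs execute my-job --region us-central1
```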
Your Flask application should look something like this:
import os

from flask import Flask

app = Flask(__name__)

@app.route("/")
def hello_world():
    return "Hello World!"

if __name__ == "__main__":
    app.run(debug=True, host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
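Before deploying, the route can be sanity-checked in-process with Flask's test client, without binding a port (a sketch; assumes Flask is installed locally):

```python
from flask import Flask

app = Flask(__name__)

@app.route("/")
def hello_world():
    return "Hello World!"

# Exercise the route in-process, without starting a server.
with app.test_client() as client:
    response = client.get("/")
    print(response.status_code, response.data)  # 200 b'Hello World!'
```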
See the official documentation for step-by-step instructions: Deploy a Python service to Cloud Run.
There is also a plugin called Cloud Code IDE plugin, which makes testing and deployment easy. I use it with VS Code; once the initial setup and permissions are taken care of, a few clicks let you run locally, debug, and deploy Cloud Run services from your own machine.