Packaging, deploying and reusing private libraries?
So, let's suppose I develop ProjectA in Python. I store its source code in my local Git repository. Then I develop ProjectB, and again, I store it in Git, in a separate folder.
Then I realise that both ProjectA and ProjectB share some common functions/classes, and it would make sense to extract them into a separate library pachura3_tools and develop it independently, with its own versioning etc. And then I would remove all traces of shared code from both ProjectA and ProjectB, add pachura3_tools to their dependencies, and pip install pachura3_tools to their virtual environments.
My first problem is that I have no idea how to package pachura3_tools so it would become a library that can be managed with pip (publishing, installing, upgrading). Second problem: I understand I need an artifact repository that would host all the different versions of pachura3_tools... much like e.g. Maven .m2 repository for Java... how do I set it up? If possible, I would like to host it locally and not in public/on the internet.
You need to use the proper git URL:
pip install git+https://github.com/jkbr/httpie.git#egg=httpie
Also see the VCS Support section of the pip documentation.
Don’t forget to include the #egg=<projectname> part to explicitly name the project; this way pip can track metadata for it without having to run the setup.py script.
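The same URL form also accepts a branch, tag, or commit after an @, which is useful for pinning a library to a version (the tag and the private-repo URL below are illustrative, not from the original answer):

```shell
# install a specific tag (a branch name or commit hash works the same way)
pip install "git+https://github.com/jkbr/httpie.git@0.9.8#egg=httpie"

# pin a branch of a private library over SSH
pip install "git+ssh://[email protected]/you/pachura3_tools.git@main#egg=pachura3_tools"
```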
To install a Python package from GitHub, you can also clone the repository:
git clone https://github.com/jkbr/httpie.git
Then run the setup.py script from that directory:
sudo python setup.py install
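A note on the last step: invoking `setup.py install` directly is deprecated in recent setuptools. A sketch of the equivalent with pip, which also records the installation so it can later be upgraded or uninstalled:

```shell
git clone https://github.com/jkbr/httpie.git
cd httpie
pip install .      # regular install from the checkout
pip install -e .   # or an editable install while developing
```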
I was facing the same issue. According to this GCP documentation, I didn't have keyring and keyrings.google-artifactregistry-auth installed.
- Try installing these two packages in the step where you install your Python packages.
- Make sure that your service account has the required permissions.
- Make sure you grant read and write permissions in your workflow file.
Example workflow:
name: Example Workflow
on:
  push:
    branches:
      - main
  pull_request:
    branches:
      - main
permissions:
  id-token: write
  contents: read
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
        with:
          fetch-depth: 2
      - name: Authenticate to Google Cloud
        id: gcloud_auth
        uses: google-github-actions/auth@v2
        with:
          token_format: 'access_token'
          workload_identity_provider: 'projects/<PROJECT_ID>/locations/global/workloadIdentityPools/github-pool/providers/example-github-provider'
          service_account: '[email protected]'
      - name: Install dependencies
        id: install_dependencies
        run: |
          python -m pip install keyrings.google-artifactregistry-auth
          pip install --extra-index-url https://europe-west9-python.pkg.dev/path-to/simple/ PACKAGE
I hope this helps!
As suggested in this GitHub link, can you try giving the packages_to_install argument explicitly, i.e. instead of
pip install --index-url https://europe-west9-python.pkg.dev/path-to/simple/ PACKAGE
use
pip install --index-url https://europe-west9-python.pkg.dev/path-to/simple/ packages_to_install=PACKAGE
Also, based on your logs:
Traceback (most recent call last): File [...] username, [...] _prompt_for_password [...] ask_input return input(message) EOFError: EOF when reading a line
the EOFError suggests pip fell back to prompting for credentials in a non-interactive environment. To install a pip package from Artifact Registry and make it work with the workflow, try the steps below:
- Create an Artifacts Registry Python repository
- Create a Composer environment (everything is default)
- Create a service account
- Grant permissions in Artifacts Registry for this account
- Download a JSON key for the service account and encode it in base64 using the command cat service-account.json | base64
- Upload pip.conf to /config/pip/pip.conf in the Composer GCS bucket
Note: the contents of this file are as described in the documentation, where KEY is the string generated in step #5.
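A sketch of what that pip.conf might contain, assuming Artifact Registry's basic-auth convention where the username _json_key_base64 is paired with the base64-encoded service-account key (the project, region and repository names here are placeholders):

```ini
# /config/pip/pip.conf in the Composer bucket (values illustrative)
[global]
extra-index-url = https://_json_key_base64:KEY@europe-west9-python.pkg.dev/MY_PROJECT/MY_REPO/simple/
```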
Please also have a look at this Stackoverflow Link.