🌐
Visual Studio Code
code.visualstudio.com › docs › containers › ssh
Connect to remote Docker over SSH
November 3, 2021 - We recommend using the Visual Studio Code Remote - SSH extension to connect to a remote machine running Docker engine. You can use the Remote - SSH and Dev Containers extensions together.
🌐
Programster
blog.programster.org › use-remote-docker-host-with-ssh
Use Remote Docker Host With SSH | Programster's Blog
August 25, 2022 - Now we need to add the details to our SSH configuration by editing our ~/.ssh/config file. Below is an example, but be sure to update the Host, HostName, Port, and IdentityFile as appropriate to you.

    Host docker1.mydomain.com
        HostName docker1.mydomain.com
        User my-remote-user
        Port 2222
        IdentityFile /path/to/private/key
        ControlMaster auto
        ControlPath ~/.ssh/control-%C
        ControlPersist yes
Discussions

Is it possible to configure Docker to use a remote host for everything?
The obvious and easy solution would be to simply SSH into the other host and do all your work there. docker context is also an option you can look at: https://docs.docker.com/engine/manage-resources/contexts/ But opening the Docker daemon for remote connections is a huge security risk. Follow the documentation to secure it with mTLS, or run it only through a VPN, for example. More on reddit.com
🌐 r/docker
December 12, 2024
Enable ssh into a live docker container
I am new to Docker. I am running a Pi-hole server in a Docker container (called Container Manager in Synology DSM). I was trying to run gravity vm to sync 2 Pi-holes, which needs SSH. But the Docker container does not have SSH enabled; how can I enable SSH in a running Docker container? More on forums.docker.com
🌐 forums.docker.com
June 6, 2024
containers - How to SSH into Docker? - Stack Overflow
I'd like to create the following infrastructure flow, where I have three Docker containers on a remote server and want admin and standard users to be able to use the same login for those resources. I expect the admin to SSH into a different IP than the standard user(s). More on stackoverflow.com
🌐 stackoverflow.com
VS Code: connect a docker container in a remote server - Stack Overflow
Remote-SSH: Connect to host. It works fine. I installed the Docker extension on the remote server. More on stackoverflow.com
🌐 stackoverflow.com
🌐
Docker Docs
docs.docker.com › manuals › docker engine › daemon › configure remote access for docker daemon
Configure remote access for Docker daemon | Docker Docs
Configuring remote access allows Docker to accept requests from remote hosts by configuring it to listen on an IP address and port as well as the Unix socket
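The docs' approach can be sketched as follows; the address and port here are examples only, and an unencrypted TCP socket should never be exposed beyond localhost or a VPN without TLS. Note that on systemd distros an -H flag in docker.service conflicts with "hosts" in daemon.json, so a service override may be needed instead:

```shell
# Sketch: keep the unix socket and add a TCP listener (example address/port)
sudo tee /etc/docker/daemon.json <<'EOF'
{
  "hosts": ["unix:///var/run/docker.sock", "tcp://127.0.0.1:2375"]
}
EOF
sudo systemctl restart docker

# Verify the new listener responds
docker -H tcp://127.0.0.1:2375 info
```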
🌐
Reddit
reddit.com › r/docker › is it possible to configure docker to use a remote host for everything?
r/docker on Reddit: Is it possible to configure Docker to use a remote host for everything?
December 12, 2024 -

Here is my scenario. I have a Windows 10 Professional deployment running as a guest under KVM. The performance of the Windows guest is sufficient. However, I need to use docker under Windows (work thing, no options here) and even though I can get it to work by configuring the KVM, the performance is no longer acceptable.

If I could somehow use the docker commands so that they would perform all the actions on a remote host, it would be great, because then I could use the KVM host to run docker, and use docker from within the Windows guest. I know it is possible to configure access to docker by exposing a TCP port etc., but what I don't know is whether stuff like port forwarding would work if I configured a remote docker host.

There's also the issue of mounting disk volumes. I can probably get away with using docker volumes to replace that, but that's not the same as just mounting a directory, which is what devcontainers do, for example.

I realise I am really pushing for a convoluted configuration here, so please take the question as more of an intellectual exercise than something I insist on doing.
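A sketch of the setup described above, using a Docker context over SSH (the user and hostnames are placeholders). One caveat relevant to the port-forwarding question: ports published with -p bind on the remote host's interfaces, so the guest reaches them via the host's address, not via localhost:

```shell
# On the Windows guest: point the docker CLI at the KVM host over SSH
docker context create kvm-host --docker "host=ssh://user@kvm-host.local"
docker context use kvm-host

# Containers now run on the KVM host; -p publishes on the host's interfaces
docker run -d -p 8080:80 --name web nginx

# Reachable from the guest via the host's address, not via 127.0.0.1
curl http://kvm-host.local:8080
```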

🌐
CircleCI
circleci.com › blog › ssh-into-docker-container
How to SSH into Docker containers - CircleCI
September 28, 2023 - The docker exec command creates a new shell session in the container. In this case you can use Bash, but recall that some distributions might have a different shell installed, such as Alpine’s default, Ash. You can also connect to a container by providing its ID instead of its name. You can now interact with the container as you normally would in a remote session.
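The exec workflow described here looks like this (the container name and ID are hypothetical):

```shell
# Open an interactive Bash session in a running container
docker exec -it my-container bash

# Alpine-based images ship ash rather than Bash; /bin/sh works almost everywhere
docker exec -it my-container sh

# A container ID works in place of the name
docker exec -it 4f66ad9067cd sh
```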
🌐
GitHub
github.com › lab-sun › VScode-Tutorial
GitHub - lab-sun/VScode-Tutorial: [Use VSCode with Docker over SSH] · GitHub
First connect to the remote machine. Refer to the Dockerfile of this repo, and create your own Dockerfile in a folder, for example /docker. The provided Dockerfile will configure the SSH service in docker container automatically.
🌐
DEV Community
dev.to › cod3mason › docker-remote-context-via-ssh-over-proxy-268l
Docker Remote Context via SSH over Proxy - DEV Community
December 30, 2025 - By configuring SSH to use a proxy and defining a Docker context that targets the remote host via SSH, Docker commands executed locally are transparently forwarded to the remote Docker daemon.
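A rough sketch of that setup, assuming a bastion host configured in ~/.ssh/config (all names and addresses are hypothetical):

```shell
# ~/.ssh/config (sketch): reach the docker host through a bastion
#
#   Host bastion
#       HostName bastion.example.com
#       User jump-user
#
#   Host docker-remote
#       HostName 10.0.0.5
#       User docker-user
#       ProxyJump bastion

# The context only references the ssh alias; the proxy hop is transparent to docker
docker context create behind-proxy --docker "host=ssh://docker-remote"
docker --context behind-proxy ps
```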
🌐
YouTube
youtube.com › awais mirza
How to setup SSH on Docker Container to access it remotely - YouTube
Subscribe to my other channel for updated videos on TECH and PROGRAMMING: https://www.youtube.com/@ProgrammingwithUmair321 · How to set up SSH on Docker Containe...
Published   September 14, 2021
🌐
Serverlab
serverlab.ca › home › how to run remote docker commands using ssh
How to Run Remote Docker Commands using SSH - Serverlab
September 1, 2020 - Learn how to securely run Docker commands over SSH to administer your containers, volumes, and images on a remote Docker host.
🌐
Qmacro
qmacro.org › blog › posts › 2024 › 08 › 24 › using-lazydocker-with-ssh-based-remote-contexts
Using lazydocker with SSH-based remote contexts - DJ Adams
August 24, 2024 - I manage these remote engines via Docker contexts. Running docker context ls here shows me this:

    NAME         DESCRIPTION                               DOCKER ENDPOINT
    default      Current DOCKER_HOST based configuration   unix:///var/run/docker.sock
    docker *     Docker Host on PVE LXC                    ssh://dj@docker
    homeops      Docker Host on homeops                    ssh://dj@homeops
    kkhw42xrfy   M2 Air                                    ssh://user@kkhw42xrfy
    synology     Docker Host on Synology NAS               ssh://dj@synology
🌐
Docker Community
forums.docker.com › general
Enable ssh into a live docker container - General - Docker Community Forums
June 6, 2024 - I am new to Docker. I am running a Pi-hole server in a Docker container (called Container Manager in Synology DSM). I was trying to run gravity vm to sync 2 Pi-holes, which needs SSH. But the Docker container does not have…
🌐
TechSparx
techsparx.com › software-development › docker › damp › remote-control.html
Using SSH to remotely control a Docker Engine or Docker Swarm in two easy steps
Starting in Docker 18.09 it became possible to create a Docker Context with an SSH URL. Using this, the docker command on your laptop can interact with the Docker API of a remote Docker instance, over SSH, without opening a public Docker TCP port.
Top answer
1 of 1
6

After a 2 week deep dive into networks in general, docker networking in particular and how docker interacts with iptables, I now feel confident to answer my own question:

First of all, this has nothing to do with ssh tunnels. Anything you bind to the loopback address 127.0.0.1 will only be accessible from the local machine itself and cannot be reached through the machine's external network interfaces. Docker counts as "outside" here because it has its own network (by default in the 172.16.0.0/12 range). This is a crucial part of docker's whole isolation concept: it keeps the containers separate from the host and from each other.
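The docker gateway address (172.17.0.1 by default) can be read off the default bridge network:

```shell
# Print the gateway of docker's default bridge network (typically 172.17.0.1)
docker network inspect bridge -f '{{range .IPAM.Config}}{{.Gateway}}{{end}}'
```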

So how to make this work? There are a few options:

  • Bind whatever service you need to the docker gateway ip:

    ssh -fnNT -R 172.17.0.1:8080:localhost:80 remote-host
    

    For this to work you need an additional iptables rule, something like:

    iptables -I INPUT -i docker0 -d 172.17.0.1 -p tcp --dport 8080 -j ACCEPT
    

    This allows traffic coming from the default docker0 network to the docker gateway ip on port 8080. Without this rule, these packets would be dropped.

  • A bit easier, just bind it to all interfaces:

    ssh -fnNT -R 0.0.0.0:8080:localhost:80 remote-host
    

    The security concerns here probably depend on your individual use case and network architecture. But if you have a simple VPS with a public IP and your iptables INPUT policy is DROP, this is not a problem. I was confused here because when you publish a port with docker like -p 8080:80, docker will create an iptables rule to ACCEPT requests from outside the host, because it is insecure by default. If you bind any other (non-docker) service to a port on 0.0.0.0, it will still be blocked until you open it manually.

  • Tunnel into the container, using the host as a jump proxy. This requires your container to have an openssh server running, which is a bit tricky in itself, and the port needs to be published on the host too, for example with -p 127.0.0.1:2222:22. If you're using docker compose and don't mind starting another service, you could use docker-openssh-server for this.

    Then it's just a matter of:

    ssh -fnNT -R 0.0.0.0:8080:localhost:80 remote-container
    

    When combined with a setup in .ssh/config, for example

    Host remote-host
        HostName remote-host.example.com
        User peter
        Port 22
    
    Host remote-container
        ProxyJump remote-host
        HostName localhost
        User containerpeter
        Port 2222
    

    this will first ssh into your remote host, then from there open the remote tunnel directly into the container. You could of course open the container ssh port to the public and tunnel into there directly, but it's so much more fun with a jump.

  • You could meddle with the iptables PREROUTING chain in the nat table as suggested here, routing requests with destination 172.17.0.1 to 127.0.0.1. I haven't tried it though, so I can't vouch for the result.

  • You could use a tool called socat as explained here, which can do TCP forwarding among many other things. I also have not tried this approach.
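The socat variant in the last bullet might look roughly like this (an untested sketch, using the same addresses as the examples above):

```shell
# Listen on the docker gateway address and relay to a service bound to
# 127.0.0.1 on the host, so containers can reach it at 172.17.0.1:8080
socat TCP-LISTEN:8080,bind=172.17.0.1,fork,reuseaddr TCP:127.0.0.1:8080
```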

If you're now excited to jump into solving a similar issue yourself, here are some handy tools.

See which processes listen on which port on which interface, with one of:

lsof -i -P -n | grep LISTEN
netstat -tulpn | grep LISTEN
ss -tulpn | grep LISTEN

Trace requests through the iptables chains:

# First insert a rule with target TRACE into the INPUT chain 
iptables -I INPUT -p tcp --dport 8080 -j TRACE

# Then look at the packets while they are going through the chains
xtables-monitor --trace

Happy networking.

🌐
LinkedIn
linkedin.com › pulse › configuring-ssh-docker-container-doing-remote-login-from-komal-suthar
Configuring SSH in Docker Container and doing remote login from another Container and Putty
December 25, 2020 - Now we had login to the docker container named ssh_server. ... Here we have to login with the base OS IP in the Host name block and give the docker external port 81 in the Port Block. Thus, we have connected remotely using SSH.
Top answer
1 of 7
93

Firstly, you need to install an SSH server in the images you wish to SSH into. You can use a base image for all your containers with the SSH server installed. Then you only have to run each container mapping the SSH port (default 22) to one of the host's ports (the Remote Server in your image), using -p <hostPort>:<containerPort>. I.e.:

docker run -p 52022:22 container1 
docker run -p 53022:22 container2

Then, if ports 52022 and 53022 of the host are accessible from outside, you can SSH directly to the containers using the IP of the host (Remote Server), specifying the port in ssh with -p <port>. I.e.:

ssh -p 52022 myuser@RemoteServer --> SSH to container1

ssh -p 53022 myuser@RemoteServer --> SSH to container2

2 of 7
51

Notice: this answer promotes a tool I've written.

The selected answer here suggests installing an SSH server into every image. Conceptually this is not the right approach (https://docs.docker.com/articles/dockerfile_best-practices/).

I've created a containerized SSH server that you can 'stick' to any running container. This way you can create compositions with every container. The only requirement is that the container has bash.

The following example would start an SSH server exposed on port 2222 of the local machine.

$ docker run -d -p 2222:22 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -e CONTAINER=my-container -e AUTH_MECHANISM=noAuth \
  jeroenpeeters/docker-ssh

$ ssh -p 2222 localhost

For more pointers and documentation see: https://github.com/jeroenpeeters/docker-ssh

Not only does this defeat the idea of one process per container, it is also a cumbersome approach when using images from the Docker Hub since they often don't (and shouldn't) contain an SSH server.

Top answer
1 of 11
28

I resolved this problem by switching to the remote server's Docker context on my local machine:

docker context create some-context-label --docker "host=ssh://user@remote_server_ip"

docker context use some-context-label

docker ps
# A list of remote containers on my local machine! It works!

After that:

  1. Connect via Remote-SSH to the container server
  2. Right-click the relevant container -> "Attach Visual Studio Code"

That works for me.

(Note: One would think that I should be able to just use my local VS Code (skip step 1) to connect to said remote container after switching my local context, but VS Code complains "Failed to connect. Is Docker running?" in the Docker control pane.)

2 of 11
8

This might sound very strange, but for me, I had to open a folder on the remote SSH server prior to using the Remote Containers extension in VS Code. If I didn't do that, then it would constantly try to find the docker service running locally, even though the terminal tab was connected to the remote SSH server.

This seems very weird, because if you're connected via SSH in VS Code, then the extension should assume you're trying to attach to the container on the remote server. You shouldn't have to open a remote folder first.

By "opening a folder" on the remote server, the Remote Containers extension was then able to attach VS Code to the container running on the remote SSH server. I didn't have to do any of the steps in any of those articles. Simply use Remote SSH to connect VS Code remotely via SSH, open a folder, and then use Remote Containers.

🌐
Kinsta®
kinsta.com › home › resource center › blog › docker › how to ssh into a docker container
How To SSH Into a Docker Container - Kinsta®
January 22, 2026 - Adding an SSH server to your Docker container helps you manage and troubleshoot your containerized applications. An SSH server allows you to remotely access and manage containers, execute commands, and inspect logs from anywhere.
🌐
GitHub
gist.github.com › dnaprawa › d3cfd6e444891c84846e099157fd51ef
Using Docker on remote Docker Host with docker context · GitHub
In order to use a remote Docker host, as a prerequisite you need SSH enabled (key-based SSH login is required).
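The key-based login prerequisite can be set up along these lines (the key path and host are placeholders):

```shell
# Generate a dedicated key and install it on the remote docker host
ssh-keygen -t ed25519 -f ~/.ssh/id_docker -N ''
ssh-copy-id -i ~/.ssh/id_docker.pub user@remote-docker-host

# Confirm passwordless login works before creating the context
ssh -i ~/.ssh/id_docker user@remote-docker-host docker version
```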
🌐
GitHub
github.com › codefresh-io › remote-docker
GitHub - codefresh-io/remote-docker: A Docker container to securely control a remote docker daemon CLI using ssh forwarding, no SSL setup needed. · GitHub
You can also execute any docker command directly, without opening a bash shell in the codefresh/remote-docker container. $ docker run -it --rm -v ${HOME}/.ssh/id_rdocker:/root/.ssh/id_rdocker codefresh/remote-docker rdocker user@webserver.com docker info
🌐
DigitalOcean
digitalocean.com › community › tutorials › how-to-use-a-remote-docker-server-to-speed-up-your-workflow
How to Use a Remote Docker Server to Speed Up Your Workflow | DigitalOcean
June 25, 2019 - This feature was introduced in Docker 18.09. It brings support for connecting to a Docker host remotely via SSH. It requires very little configuration on the client, and only needs a regular Docker server without any special config running on a remote machine.
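The client-side setup the tutorial describes can be as small as a single environment variable (the host is a placeholder):

```shell
# One-off: run a single command against the remote engine over SSH
DOCKER_HOST=ssh://user@remote-docker-host docker ps

# Or persist it for the shell session
export DOCKER_HOST=ssh://user@remote-docker-host
docker info
```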