Docker SSL error

I’m getting started with docker compose and have been working through the simple demo flask application. The thing is, I’m running this from inside of an organization that intercepts all communication in such a way that SSL errors are thrown right and left. They provide us with three root certificates we need to install, and I’ve generally got these working on my own machine, but I’m lost when it comes to getting these to work inside docker-compose deployments.

When I run docker-compose up, I get the following:

$ sudo docker-compose up 
Creating network "project_default" with the default driver
Building web
Step 1/5 : FROM python:3.4-alpine
3.4-alpine: Pulling from library/python
81033e7c1d6a: Pull complete
9b61101706a6: Pull complete
415e2a07c89b: Pull complete
f22df7a3f000: Pull complete
8c16bf19c1f9: Pull complete
Digest: sha256:fe436cb066394d81cf49448a04dec7c765082445a500bc44f1ae5e8a455793bd
Status: Downloaded newer image for python:3.4-alpine
 ---> 5c72717ec319
Step 2/5 : ADD . /code
 ---> a5790c0e3e94
Removing intermediate container 052c614e41d0
Step 3/5 : WORKDIR /code
 ---> a2ea9acb3005
Removing intermediate container 77f2375ca0a6
Step 4/5 : RUN pip install -r requirements.txt
 ---> Running in 5f4fe856776d
Collecting flask (from -r requirements.txt (line 1))
  Retrying (Retry(total=4, connect=None, read=None, redirect=None)) after connection broken by 'NewConnectionError('<pip._vendor.requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7fb0061f1d30>: Failed to establish a new connection: [Errno -3] Try again',)': /simple/flask/
  Retrying (Retry(total=3, connect=None, read=None, redirect=None)) after connection broken by 'NewConnectionError('<pip._vendor.requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7fb0061f19b0>: Failed to establish a new connection: [Errno -3] Try again',)': /simple/flask/
  Retrying (Retry(total=2, connect=None, read=None, redirect=None)) after connection broken by 'NewConnectionError('<pip._vendor.requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7fb0061f1828>: Failed to establish a new connection: [Errno -3] Try again',)': /simple/flask/
  Retrying (Retry(total=1, connect=None, read=None, redirect=None)) after connection broken by 'NewConnectionError('<pip._vendor.requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7fb0061f1588>: Failed to establish a new connection: [Errno -3] Try again',)': /simple/flask/
  Retrying (Retry(total=0, connect=None, read=None, redirect=None)) after connection broken by 'NewConnectionError('<pip._vendor.requests.packages.urllib3.connection.VerifiedHTTPSConnection object at 0x7fb0061f1390>: Failed to establish a new connection: [Errno -3] Try again',)': /simple/flask/
  Could not find a version that satisfies the requirement flask (from -r requirements.txt (line 1)) (from versions: )
No matching distribution found for flask (from -r requirements.txt (line 1))

Pip fails to install anything.

The docker-compose.yml file looks like this:

version: '3'
services:
  web:
    build: .
    ports:
     - "5000:5000"
  redis:
    image: "redis:alpine"

And the main Dockerfile looks like this:

FROM python:3.4-alpine
ADD . /code
WORKDIR /code
RUN pip install -r requirements.txt
CMD ["python", "app.py"]

Is there any way to be able to make this work in this particular case? Is there a general solution to this sort of problem that would allow me to pass to any container deployed the SSL certificates and have them be used?
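
To make the question concrete, this is roughly what I imagine: copy the organization's roots into the image, register them, and point pip at the resulting bundle. The certificate file names below are placeholders for the three roots we are given, and I don't know whether this is the right general approach:

FROM python:3.4-alpine
# Placeholders for the three org-provided root certificates
COPY org-root-1.crt org-root-2.crt org-root-3.crt /usr/local/share/ca-certificates/
RUN apk add --no-cache ca-certificates && update-ca-certificates
ADD . /code
WORKDIR /code
# /etc/ssl/certs/ca-certificates.crt is the merged system bundle on Alpine
RUN pip install --cert /etc/ssl/certs/ca-certificates.crt -r requirements.txt
CMD ["python", "app.py"]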

@jitekuma I’m not setting the env variables during the build/release — I just noted that I can repro the task behavior if I do so locally.

I’m still not sure if this is an issue with the docker task or the Azure cli that created the certificates in the first place. Bottom line: if I pass the certs in I get a successful connection — when I use env variables, I get failures. Here’s what happens when I run some docker commands from my machine:

Docker commands succeed when passing --tls:

colin@colinsurface31 ~> set vmName "cd-dockerhost"
colin@colinsurface31 ~> set location "westus"
colin@colinsurface31 ~> set dockerHost "tcp://$vmName.$location.cloudapp.azure.com:2376"
colin@colinsurface31 ~> docker -H $dockerHost --tls ps
CONTAINER ID        IMAGE                                             COMMAND             CREATED             STATUS              PORTS               NAMES
90497ba28b92        microsoft/vsts-agent:ubuntu-16.04-docker-1.11.2   "./start.sh"        4 days ago          Up 4 days                               jovial_nightingale

Docker commands fail when using env vars:

colin@colinsurface31 ~> set -x DOCKER_HOST $dockerHost
colin@colinsurface31 ~> set -x DOCKER_TLS_VERIFY 1
colin@colinsurface31 ~> docker ps
FATA[0000] An error occurred trying to connect: Get https://cd-dockerhost.westus.cloudapp.azure.com:2376/v1.18/containers/json: x509: certificate is valid for *, not cd-dockerhost.westus.cloudapp.azure.com

Docker-compose succeeds when passing in certs:

colin@colinsurface31 ~> docker-compose -H $dockerHost --tls --tlscacert ~/.docker/ca.pem --tlscert ~/.docker/cert.pem --tlskey ~/.docker/key.pem ps
/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/connectionpool.py:838: InsecureRequestWarning: Unverified HTTPS request is being made. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/latest/security.html
  InsecureRequestWarning)
/usr/local/lib/python2.7/dist-packages/requests/packages/urllib3/connectionpool.py:838: InsecureRequestWarning: Unverified HTTPS request is being made. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/latest/security.html
  InsecureRequestWarning)
Name   Command   State   Ports
------------------------------

Docker-compose fails when using env vars:

colin@colinsurface31 ~> set -x DOCKER_TLS_VERIFY 1
colin@colinsurface31 ~> set -x DOCKER_HOST $dockerHost
colin@colinsurface31 ~> docker-compose ps
ERROR: SSL error: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590)
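
One asymmetry I notice: the flag-based runs pass the cert locations explicitly, while the env-var runs never say where the certs live. The next thing I plan to try is also exporting DOCKER_CERT_PATH, which both clients are supposed to read:

colin@colinsurface31 ~> set -x DOCKER_CERT_PATH ~/.docker
colin@colinsurface31 ~> set -x DOCKER_TLS_VERIFY 1
colin@colinsurface31 ~> set -x DOCKER_HOST $dockerHost
colin@colinsurface31 ~> docker-compose ps

Whether that also explains the CN-mismatch error from the plain docker ps run, I'm not sure.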

x509: certificate signed by unknown authority

The above error is an SSL/TLS handshake failure caused by a mismatch of root certificates. I first hit it while running a kubeadm command, and later found that curl and docker pull are affected too.

Thing to note: I'm behind a proxy server (Zscaler), so its certificate must be imported into my Linux base OS and, from there, into Docker's certificates too.

Curl Issue.

x509: certificate signed by unknown authority.

Fix:

Retrieve your own proxy certificate (mine is Zscaler's), copy it into the trust anchors using the command below, then refresh the trust store.

Syntax Template

# cp zscaler_root.crt /etc/pki/ca-trust/source/anchors/ && update-ca-trust

Docker Pull Issue.

x509: certificate signed by unknown authority.

Fix:

Retrieve your own proxy certificate as above, copy it into Docker's certificate directory using the command below, then refresh the trust store.

Syntax Template

# cp zscaler_root.crt /etc/docker/certs.d/tls/ && update-ca-trust

Now restart Docker so the changes take effect.

Restart docker

# systemctl restart docker
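
The same trust also has to exist inside any images you build, because RUN steps (curl, pip, yum) see only the image's own trust store. A rough Dockerfile sketch of the same fix for a RHEL/CentOS-based image, assuming the same zscaler_root.crt file:

FROM centos:7
# Bake the proxy's root certificate into the image's trust store
COPY zscaler_root.crt /etc/pki/ca-trust/source/anchors/
RUN update-ca-trust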

I’m running into the following error message when I do easy_install pip:

root@ff45b7b74944:/# easy_install pip
Searching for pip
Reading https://pypi.python.org/simple/pip/
Download error on https://pypi.python.org/simple/pip/: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590) -- Some packages may not be found!
Couldn't find index page for 'pip' (maybe misspelled?)
Scanning index of all packages (this may take a while)
Reading [--https link here, like above--]
Download error on https://pypi.python.org/simple/: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed (_ssl.c:590) -- Some packages may not be found!
No local packages or download links found for pip
error: Could not find suitable distribution for Requirement.parse('pip')

This is run in a Docker container based on ubuntu:latest. I'm leaning towards it being an OpenSSL/certificate problem with HTTPS links, but I'm not completely certain. If anyone has a solution or any troubleshooting methods, I'd love to hear them.

Thanks.

asked Jul 14, 2016 at 21:57 by cid

Adding RUN apt-get install ca-certificates to my Dockerfile worked for me.
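
Roughly like this; on a fresh image you will likely need an apt-get update first, plus -y so the build does not prompt:

FROM ubuntu:latest
RUN apt-get update && apt-get install -y ca-certificates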

answered Mar 5, 2017 at 5:58 by Matt

Solved:

Added the following to the Dockerfile:

# Install Mozilla's CA bundle where this image's OpenSSL expects it
RUN mkdir -p /etc/pki/tls/certs
RUN apt-get update && apt-get install -y wget
RUN wget http://curl.haxx.se/ca/cacert.pem
RUN mv cacert.pem /etc/pki/tls/certs/ca-bundle.crt

answered Jul 14, 2016 at 22:57 by cid

A fresh installation of the packages below solved my problem:

apt-get -yqq install build-essential libssl-dev libffi-dev python3-pip python3-dev gnupg
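
In Dockerfile form that would look something like this (the base image is my assumption, not from the original build):

FROM ubuntu:18.04
RUN apt-get update && \
    apt-get -yqq install build-essential libssl-dev libffi-dev \
        python3-pip python3-dev gnupg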

answered Jan 4, 2019 at 6:18 by Shams

I got a wildcard SSL certificate for *.domain.no by generating a CSR, and I received a .pem file from the SSL provider. I now have the key files, including:

server.key

certificates.pem (contains the intermediate certificate and the SSL certificate)

I want to use this certificate on an nginx Docker container that serves several subdomains. My config file looks like this:

/etc/nginx/conf.d/default.conf

server {
   listen      443 ssl;
   server_name     test.domain.no;
   access_log  /var/log/nginx/nginx.access.log;
   error_log   /var/log/nginx/nginx.error.log;
   ssl    on;
   ssl_certificate    /etc/ssl/certificates.pem;
   ssl_certificate_key    /etc/ssl/server.key;
   ssl_protocols TLSv1 TLSv1.1 TLSv1.2;
   location /
   {
      proxy_pass         {dockerEndpoint};
      proxy_redirect     off;

    ##proxy_set_header   Host             $host;
      proxy_set_header   X-Real-IP        $remote_addr;
      proxy_set_header   X-Forwarded-For  $proxy_add_x_forwarded_for;

      client_max_body_size       10m;
      client_body_buffer_size    128k;

      proxy_connect_timeout      90;
      proxy_send_timeout         90;
      proxy_read_timeout         90;

      proxy_buffer_size          4k;
      proxy_buffers              4 32k;
      proxy_busy_buffers_size    64k;
      proxy_temp_file_write_size 64k;

     }
}

Nginx-Dockerfile:

FROM nginx
VOLUME /etc/nginx/conf.d
COPY default.conf /etc/nginx/conf.d/
COPY certificates.pem /etc/ssl
COPY server.csr /etc/ssl
COPY server.key /etc/ssl

HTTPS does not work, and the browser shows the following error:

This site can’t be reached
Try:
Checking the connection
Checking the proxy and the firewall

Because I got the following error in the Docker logs, I changed the Dockerfile:

Error:

nginx: [emerg] BIO_new_file("/etc/ssl/certificates.pem") failed (SSL: error:02001014:system library:fopen:Not a directory:fopen('/etc/ssl/certificates.pem','r') error:2006D002:BIO routines:BIO_new_file:system lib)

Modified Dockerfile:

FROM nginx

COPY default.conf /etc/nginx/conf.d/
#CMD ["nginx", "-g", "daemon off;"]
RUN mkdir /etc/nginx/ssl
RUN chown -R root:root /etc/nginx/ssl
RUN chmod -R 600 /etc/nginx/ssl
COPY certificates.pem /etc/nginx/ssl
COPY server.key /etc/nginx/ssl

Now it doesn't give an error in the Docker logs, but HTTPS still doesn't work. :(

I've tried checking error.log in /var/log/nginx by connecting to the nginx container and catting the file, but it is empty.

Any help would be appreciated.

Updated:

I have mapped the nginx container's port 443 (-p 443:443) and changed the permissions of /etc/nginx/ssl to 644. Now, when I open the URL over HTTPS, I get the following error:

There are issues with the site's certificate chain (net::ERR_CERT_COMMON_NAME_INVALID)

Although the browser says the certificate is issued by my SSL provider.
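
One more thing I notice on re-reading: the posted default.conf still points at /etc/ssl/certificates.pem, while the modified Dockerfile copies the files to /etc/nginx/ssl, so presumably the two need to match:

ssl_certificate        /etc/nginx/ssl/certificates.pem;
ssl_certificate_key    /etc/nginx/ssl/server.key;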

Hello people,

My Docker and Linux knowledge would be beginner level at best.

I'm running Docker on Windows 10 and using it to manage a Python Airflow instance that schedules Python scripts for me. I have a Python script that fetches data from the web. When I run this script on my host machine I get no issues, but when I run it in Docker it throws an SSL error: certificate verify failed.

I've done a bit of research over the past few days. I know I can ignore the warning with a --no-check-certificate kind of option, but I don't know how bad a MITM attack would be in a container (would the container protect the host somehow…? I do have a mounted volume where my script dumps data…).

Most of the solutions I've found so far are for Linux host machines. Some say to mount the host certificates into the container, but my Windows host doesn't have a directory with these certs (Windows keeps them in its certificate store rather than as files on disk). So I can't figure out how to make that option viable.

I’ve also tried going into the container and updating the certs as root, and the container says they’re all up to date, but my script still throws the error.
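
In case it helps anyone suggest a fix, this is the kind of export I believe is needed on the Windows side (untested by me; <thumbprint> is a placeholder):

# PowerShell: export a root certificate from the Windows store (DER format).
# List candidates first with: Get-ChildItem Cert:\LocalMachine\Root
Export-Certificate -Cert Cert:\LocalMachine\Root\<thumbprint> -FilePath corp-root.cer
# Re-encode the DER file as base64/PEM so Linux tools can read it
certutil -encode corp-root.cer corp-root.crt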

# my Dockerfile
# Base Image
FROM python:3.7.9-slim-buster

# Arguments that can be set with docker build
ARG AIRFLOW_VERSION=1.10.10
ARG AIRFLOW_HOME=/usr/local/airflow

# Export the environment variable AIRFLOW_HOME where airflow will be installed
ENV AIRFLOW_HOME=${AIRFLOW_HOME}

# Install dependencies and tools
RUN apt-get update -y && \
    apt-get upgrade -yqq && \
    apt-get install -yqq --no-install-recommends \
    python3-dev \
    wget \
    libczmq-dev \
    curl \
    libssl-dev \
    git \
    inetutils-telnet \
    bind9utils freetds-dev \
    libkrb5-dev \
    libsasl2-dev \
    libffi-dev libpq-dev \
    freetds-bin build-essential \
    default-libmysqlclient-dev \
    apt-utils \
    rsync \
    zip \
    unzip \
    gcc \
    vim \
    netcat \
    && apt-get autoremove -yqq --purge && apt-get clean


COPY ./requirements-python3.7.txt /requirements-python3.7.txt

# Upgrade pip
# Create airflow user
# Install apache airflow with subpackages
RUN pip install --upgrade pip && \
    useradd -ms /bin/bash -d ${AIRFLOW_HOME} airflow && \
    pip install apache-airflow[all]==${AIRFLOW_VERSION} --constraint /requirements-python3.7.txt

# Copy the airflow.cfg file (config)
#COPY ./config/airflow.cfg ${AIRFLOW_HOME}/airflow.cfg

# Set the owner of the files in AIRFLOW_HOME to the user airflow
RUN chown -R airflow: ${AIRFLOW_HOME}

# Copy the entrypoint.sh from host to container (at path AIRFLOW_HOME)
COPY ./start-airflow.sh ./start-airflow.sh

# Set the entrypoint.sh file to be executable
RUN chmod +x ./start-airflow.sh

# Set the username to use
USER airflow

# Create the folder dags inside $AIRFLOW_HOME
RUN mkdir -p ${AIRFLOW_HOME}/dags

# Expose ports (just to indicate that this container needs to map port)
EXPOSE 8080

# Execute start-airflow.sh
CMD [ "./start-airflow.sh" ]

I would appreciate any pointers/tips/solutions! Thanks very much! Sorry for the long post!
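
Edit: the direction I plan to try next, in case anyone can confirm it's sensible, is to bake the exported certificate into the image and point the requests library at the refreshed system bundle. corp-root.crt is a placeholder name, and the COPY/RUN lines would go before USER airflow:

# Trust the corporate root inside the image (before USER airflow)
COPY corp-root.crt /usr/local/share/ca-certificates/corp-root.crt
RUN update-ca-certificates
# The requests library honors this variable
ENV REQUESTS_CA_BUNDLE=/etc/ssl/certs/ca-certificates.crt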
