Forums

SSL internal error

I have been running a script that makes GET requests to an HTTPS server. It had been running without errors for several months, but today all requests started to fail.

I have been using urllib2 for the requests. This is the code that had been working, but it now raises the following error:

>>> import urllib2
>>> url = "https://api.kraken.com/0/public/Time"
>>> req = urllib2.Request(url, headers={'User-Agent' : "Magic Browser"})
>>> con = urllib2.urlopen(req)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python2.7/urllib2.py", line 127, in urlopen
    return _opener.open(url, data, timeout)
  File "/usr/lib/python2.7/urllib2.py", line 404, in open
    response = self._open(req, data)
  File "/usr/lib/python2.7/urllib2.py", line 422, in _open
    '_open', req)
  File "/usr/lib/python2.7/urllib2.py", line 382, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.7/urllib2.py", line 1222, in https_open
    return self.do_open(httplib.HTTPSConnection, req)
  File "/usr/lib/python2.7/urllib2.py", line 1184, in do_open
    raise URLError(err)
urllib2.URLError: <urlopen error [Errno 1] _ssl.c:510: error:14077438:SSL routines:SSL23_GET_SERVER_HELLO:tlsv1 alert internal error>

What does this mean? Has anything changed on PythonAnywhere since this last worked? How can I solve this issue?

I also tried using httplib to see whether it worked any better (I have not used it before, so I am not sure this is the right way to do it), but it failed with a similar error:

>>> import httplib
>>> conn = httplib.HTTPSConnection("api.kraken.com")
>>> conn.request("GET", "/0/public/Time")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python2.7/httplib.py", line 979, in request
    self._send_request(method, url, body, headers)
  File "/usr/lib/python2.7/httplib.py", line 1013, in _send_request
    self.endheaders(body)
  File "/usr/lib/python2.7/httplib.py", line 975, in endheaders
    self._send_output(message_body)
  File "/usr/lib/python2.7/httplib.py", line 835, in _send_output
    self.send(msg)
  File "/usr/lib/python2.7/httplib.py", line 797, in send
    self.connect()
  File "/usr/lib/python2.7/httplib.py", line 1182, in connect
    self.sock = ssl.wrap_socket(sock, self.key_file, self.cert_file)
  File "/usr/lib/python2.7/ssl.py", line 487, in wrap_socket
    ciphers=ciphers)
  File "/usr/lib/python2.7/ssl.py", line 243, in __init__
    self.do_handshake()
  File "/usr/lib/python2.7/ssl.py", line 405, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLError: [Errno 1] _ssl.c:510: error:14077438:SSL routines:SSL23_GET_SERVER_HELLO:tlsv1 alert internal error

The problem seems to be on my (or PythonAnywhere's) end: the address https://api.kraken.com/0/public/Time opens successfully in a browser or with curl.

I have exactly the same problem. Something definitely happened between 22:03 and 22:25 on January 18.

What has happened and how can I get around it?

Hm. I can't currently repro that problem with httplib... Are you still seeing failures? Does it happen with the requests library as well?

python -c'import httplib; print(httplib.HTTPSConnection("api.kraken.com"))'
<httplib.HTTPSConnection instance at 0x7fc5442547a0>

python3 -c'import requests; print(requests.get("https://api.kraken.com"))' 
<Response [404]>

Yes, I am still seeing the problem with both urllib2 and httplib. Regarding your example: I get that far as well...

>>> import httplib; print(httplib.HTTPSConnection("api.kraken.com"))
<httplib.HTTPSConnection instance at 0x7fe3fa6f00e0>

But it fails when sending a request, i.e.:

>>> import httplib
>>> conn = httplib.HTTPSConnection("api.kraken.com")
>>> print conn
<httplib.HTTPSConnection instance at 0x7fe3fa6f00e0>
>>> conn.request("GET", "/0/public/Time")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python2.7/httplib.py", line 979, in request
    self._send_request(method, url, body, headers)
  File "/usr/lib/python2.7/httplib.py", line 1013, in _send_request
    self.endheaders(body)
  File "/usr/lib/python2.7/httplib.py", line 975, in endheaders
    self._send_output(message_body)
  File "/usr/lib/python2.7/httplib.py", line 835, in _send_output
    self.send(msg)
  File "/usr/lib/python2.7/httplib.py", line 797, in send
    self.connect()
  File "/usr/lib/python2.7/httplib.py", line 1182, in connect
    self.sock = ssl.wrap_socket(sock, self.key_file, self.cert_file)
  File "/usr/lib/python2.7/ssl.py", line 487, in wrap_socket
    ciphers=ciphers)
  File "/usr/lib/python2.7/ssl.py", line 243, in __init__
    self.do_handshake()
  File "/usr/lib/python2.7/ssl.py", line 405, in do_handshake
    self._sslobj.do_handshake()
ssl.SSLError: [Errno 1] _ssl.c:510: error:14077438:SSL routines:SSL23_GET_SERVER_HELLO:tlsv1 alert internal error

However, the requests package does seem to work. Here it is from a 2.7.6 console:

>>> import requests
>>> print(requests.get("https://api.kraken.com/0/public/Time"))
<Response [200]>
>>> print(requests.get("https://api.kraken.com/0/public/Time").text)
{"error":[],"result":{"unixtime":1453297206,"rfc1123":"Wed, 20 Jan 16 13:40:06 +0000"}}

Thanks a lot for the hint! I will now try to switch all my requests over to the requests package and see if it works (some of my requests are more complex than this one, so I hope it goes as smoothly with them as with this one).
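For anyone hitting the same thing, here is roughly how the urllib2 snippet from the top of the thread maps onto requests. This is just a sketch: it builds the request without sending it, so the header handling can be checked offline; requests.get(url, headers=...) does the same thing in one call.

```python
import requests

# The same public Kraken endpoint and custom User-Agent as in the
# original urllib2 snippet.
url = "https://api.kraken.com/0/public/Time"

session = requests.Session()
req = requests.Request("GET", url, headers={"User-Agent": "Magic Browser"})

# prepare_request merges the session defaults with the per-request
# headers; the custom User-Agent overrides requests' default one.
prepared = session.prepare_request(req)

# session.send(prepared) would perform the actual GET.
```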

I got it to work with the requests package. Thank you again.

I am still confused why urllib2 stopped working as it has been running smoothly for so long.

But the new code is neater and seems to work fine.

My guess would be that the Kraken server changed its SSL certificate or its web server config to require newer SSL/TLS protocol features, and that then ran into an incompatibility with the OpenSSL library on our side. There's a similar issue reported here:

https://askubuntu.com/questions/649000/openssl-curl-error-ssl23-get-server-hellotlsv1-alert-internal-error

Not sure how requests magically avoids the issue, but requests is great anyway, so switching all your code to it is probably a win :)

I am having this problem now as well. However, I am using a third-party library and cannot easily switch to requests. Is there any workaround?

slavasav@glenn-liveconsole1:~$ python alpha_vantage_test.py
Traceback (most recent call last):
  File "alpha_vantage_test.py", line 4, in <module>
    data, meta_data = ts.get_intraday(symbol='SEB-A',interval='1min', outputsize='compact')
  File "/home/slavasav/.local/lib/python2.7/site-packages/alpha_vantage/alphavantage.py", line 139, in _format_wrapper
    self, *args, **kwargs)
  File "/home/slavasav/.local/lib/python2.7/site-packages/alpha_vantage/alphavantage.py", line 124, in _call_wrapper
    return self._handle_api_call(url), data_key, meta_data_key
  File "/home/slavasav/.local/lib/python2.7/site-packages/alpha_vantage/alphavantage.py", line 57, in _retry_wrapper
    return func(self, *args, **kwargs)
  File "/home/slavasav/.local/lib/python2.7/site-packages/alpha_vantage/alphavantage.py", line 208, in _handle_api_call
    response = urlopen(url)
  File "/usr/lib/python2.7/urllib2.py", line 127, in urlopen
    return _opener.open(url, data, timeout)
  File "/usr/lib/python2.7/urllib2.py", line 410, in open
    response = meth(req, response)
  File "/usr/lib/python2.7/urllib2.py", line 523, in http_response
    'http', request, response, code, msg, hdrs)
  File "/usr/lib/python2.7/urllib2.py", line 442, in error
    result = self._call_chain(*args)
  File "/usr/lib/python2.7/urllib2.py", line 382, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.7/urllib2.py", line 629, in http_error_302
    return self.parent.open(new, timeout=req.timeout)
  File "/usr/lib/python2.7/urllib2.py", line 404, in open
    response = self._open(req, data)
  File "/usr/lib/python2.7/urllib2.py", line 422, in _open
    '_open', req)
  File "/usr/lib/python2.7/urllib2.py", line 382, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.7/urllib2.py", line 1222, in https_open
    return self.do_open(httplib.HTTPSConnection, req)
  File "/usr/lib/python2.7/urllib2.py", line 1184, in do_open
    raise URLError(err)
urllib2.URLError: <urlopen error [Errno 1] _ssl.c:510: error:14077438:SSL routines:SSL23_GET_SERVER_HELLO:tlsv1 alert internal error>

Quick googling turns up suggestions to upgrade Python from 2.7.6 to at least 2.7.9: https://stackoverflow.com/questions/33972671/downloading-https-pages-with-urllib-error14077438ssl-routinesssl23-get-serve
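The usual explanation in those threads is SNI: Python builds older than 2.7.9 can't send the Server Name Indication extension during the TLS handshake, and servers that require it abort with exactly this "tlsv1 alert internal error". A quick way to check what the interpreter's ssl module supports (the HAS_SNI flag only exists from 2.7.9 / 3.2 onwards, hence the getattr fallback):

```python
import ssl

# Which OpenSSL build the interpreter is linked against.
print(ssl.OPENSSL_VERSION)

# True if the stdlib ssl module can send SNI; on Python 2.7.6 the
# attribute is missing entirely, so this prints False there.
print(getattr(ssl, "HAS_SNI", False))
```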

Unfortunately we're stuck on 2.7.6 for now because of compatibility issues... not sure what else to suggest. A switch to Python 3, maybe?
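If Python 3 is an option, the port of the snippet from the top of the thread is small: urllib2's Request and urlopen moved to urllib.request, and Python 3's ssl module sends SNI during the handshake, so the failure mode above should not occur. A sketch (the request is built but not sent, so it can be tried offline):

```python
from urllib.request import Request, urlopen

# Python 3 equivalent of the original urllib2 snippet.
url = "https://api.kraken.com/0/public/Time"
req = Request(url, headers={"User-Agent": "Magic Browser"})

# Uncomment to perform the actual request:
# with urlopen(req) as con:
#     print(con.read().decode())
```

Note that urllib.request normalizes stored header names (e.g. "User-agent"), so look them up the same way when inspecting a Request.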

I will try that, yes.