Forums

MySQL failing to write >10MB to JSONField

Hi All,

My app needs to store JSON objects in the DB, and I use Django's JSONField for that. It works fine for smaller objects, but for bigger ones (the threshold seems to be around 10-20MB) it fails with

MySQLdb._exceptions.OperationalError: (2006, 'MySQL server has gone away')

At first I thought it was a connection timeout, but even with nested write attempts using close_old_connections() and connection.close() it still happens. Here's my code:

try:
    db.close_old_connections()
    db.connections["default"].close()
    db.connection.close()
    Backtest.objects.filter(pk=backtest_pk).update(data=strategy_summary)
except Exception:
    logger.exception("Couldn't save task result in object 1/3")
    try:
        for conn in db.connections.all():
            conn.close()
        Backtest.objects.filter(pk=backtest_pk).update(data=strategy_summary)
    except Exception:
        logger.exception("Couldn't save task result in object 2/3")
        try:
            db.close_old_connections()
            db.connections["default"].close()
            db.connection.close()
            for conn in db.connections.all():
                conn.close()
            Backtest.objects.filter(pk=backtest_pk).update(data=strategy_summary)
        except Exception:
            logger.exception("Couldn't save task result in object 3/3. Giving up.")

Each of the three attempted updates results in the same error and the same traceback.

[ERROR] Couldn't save task result in object 3/3. Giving up.
Traceback (most recent call last):
File "/home/mastrodarko/.virtualenvs/mastrodarko.pythonanywhere.com/lib/python3.8/site-packages/django/db/backends/utils.py", line 84, in _execute
    return self.cursor.execute(sql, params)
File "/home/mastrodarko/.virtualenvs/mastrodarko.pythonanywhere.com/lib/python3.8/site-packages/django/db/backends/mysql/base.py", line 73, in execute
    return self.cursor.execute(query, args)
File "/home/mastrodarko/.virtualenvs/mastrodarko.pythonanywhere.com/lib/python3.8/site-packages/MySQLdb/cursors.py", line 206, in execute
    res = self._query(query)
File "/home/mastrodarko/.virtualenvs/mastrodarko.pythonanywhere.com/lib/python3.8/site-packages/MySQLdb/cursors.py", line 319, in _query
    db.query(q)
File "/home/mastrodarko/.virtualenvs/mastrodarko.pythonanywhere.com/lib/python3.8/site-packages/MySQLdb/connections.py", line 259, in query
    _mysql.connection.query(self, query)
MySQLdb._exceptions.OperationalError: (2006, 'MySQL server has gone away')

This makes me think MySQL is failing to complete the query and closing the connection as a result.
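One way to confirm that theory is to measure the serialized payload before the UPDATE runs. This is a minimal sketch (the helper name and the sample object are mine, not from the app): MySQL rejects any single statement larger than its max_allowed_packet setting, and the client side then reports "MySQL server has gone away".

```python
import json

def payload_size_mb(obj):
    """Return the size in MB of obj serialized as a JSON string."""
    return len(json.dumps(obj).encode("utf-8")) / (1024 * 1024)

# Illustrative stand-in for strategy_summary; the real object would come
# from the backtest itself.
strategy_summary = {"rows": [{"i": i, "v": "x" * 100} for i in range(10_000)]}
print(f"payload is {payload_size_mb(strategy_summary):.2f} MB")
```

Logging this number alongside the failing update would show whether the errors line up with payloads above a fixed size rather than with connection age.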

So my questions are:

  1. Is there any limitation/configuration I should be aware of, that may be causing such issues with bigger JSON objects?
  2. How can I access DB logs or see what failed in MySQL? All I see in my app logs is just "MySQL server has gone away" - how can I dig deeper?

Thanks a lot!

Just to confirm: is each field that you are adding 10MB, or is it all the fields together that total 10MB?

If it's each individual DB field, then you are probably running up against MySQL's max_allowed_packet limit, which we have set to 16MB (the MySQL default).
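You can check the server-side value yourself from a MySQL console (or any cursor), which makes it easy to compare against your payload size:

```sql
-- Reports the per-statement packet limit in bytes (16MB = 16777216)
SHOW VARIABLES LIKE 'max_allowed_packet';
```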

My Django model class has a single JSONField, and into this field I'm saving a JSON string that is >10MB, so it sounds like I'm indeed hitting the max_allowed_packet limit.

Is it possible to have it increased to, let's say, 64MB? That should be sufficient for my needs. Or do I need to break this JSONField down into a few smaller fields that will not exceed 16MB?

Hi there, that isn't something we can change for you right now, as it's a global setting that would affect all users.

Unfortunately, the easiest option may be to upgrade your account to use Postgres, which I believe does not have this restriction.