
Error with temporary files when loading Keras models

Hello,

I'm using MongoDB to store Keras LSTM models and then use them to make predictions whenever a user visits an endpoint.


The model loading process currently looks like this (a simplified sketch follows the list):

  1. Fetch the file (.h5) from Mongo using GridFS.
  2. Write the contents to disk in a temporary file. For now, the filenames look like f"{ObjectId()}{os.getpid()}".
  3. Call keras.models.load_model() and load the .h5 file into a custom object.
  4. Delete the file and move on to the next model.
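
Roughly, it looks like this (a simplified sketch; the database name, the temp directory, and the .h5 suffix on the temp filename are just how my setup happens to be arranged):

import os
import gridfs
from bson import ObjectId
from pymongo import MongoClient
from tensorflow import keras

client = MongoClient()                    # connection details omitted
fs = gridfs.GridFS(client["model_db"])    # "model_db" is a placeholder name

TEMP_DIR = "/home/CFOai/Phase1Library/temp_model_store/prediction"

def load_stored_model(model_ref_id):
    # 1. Fetch the .h5 contents from GridFS
    contents = fs.get(ObjectId(model_ref_id)).read()

    # 2. Write them to a uniquely named temporary file
    temp_path = os.path.join(TEMP_DIR, f"{ObjectId()}{os.getpid()}.h5")
    with open(temp_path, "wb") as file_obj:
        file_obj.write(contents)

    # 3. Load the model from the temp file
    model = keras.models.load_model(temp_path)

    # 4. Delete the file and move on to the next model
    os.remove(temp_path)
    return model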

As usual, it runs fine locally, but in the deployed version I'm getting a FileNotFoundError (even when using absolute paths). Specifically:

with open(TEMP_LOADING_PATH, "wb") as file_obj:
FileNotFoundError: [Errno 2] No such file or directory: '/home/CFOai/Phase1Library/temp_model_store/prediction/63444e6947466745d1e064a613-temp-store.h5'


Here is some more info that may be helpful:
CURRENT_WD = /home/CFOai/Phase1Library
TEMP_LOADING_PATH = '/home/CFOai/Phase1Library/temp_model_store/prediction/63444e6947466745d1e064a613-temp-store.h5'
os.getpid() = 13


My first guess was that there were file collisions arising from deleting the files after use, so I disabled the deletion, but it still can't find the files despite them being present on the filesystem.

Thanks in advance!

SOLVED

In my code I had a block like:

with open(TEMP_LOADING_PATH, "wb") as file_obj:
    # Write file locally
    file_obj.write(self.fs.get(ObjectId(model_ref_id)).read())
    model = keras.models.load_model(TEMP_LOADING_PATH)

I think the problem was trying to read the file while it was still open (the buffered contents hadn't been flushed to disk yet), so moving the keras.models.load_model() call outside the context manager was the key. The code becomes:

with open(TEMP_LOADING_PATH, "wb") as file_obj:
    # Write file locally
    file_obj.write(self.fs.get(ObjectId(model_ref_id)).read())

model = keras.models.load_model(TEMP_LOADING_PATH)
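
As a side note, a variant that should behave the same (I haven't tested it) is to let tempfile manage the filename and flush explicitly before loading; model_bytes here stands in for the GridFS payload (e.g. self.fs.get(ObjectId(model_ref_id)).read()):

import os
import tempfile
from tensorflow import keras

def load_model_from_bytes(model_bytes):
    # delete=False so the file survives past the with-block
    with tempfile.NamedTemporaryFile(suffix=".h5", delete=False) as file_obj:
        file_obj.write(model_bytes)
        file_obj.flush()                 # flush Python's buffer to the OS
        os.fsync(file_obj.fileno())      # make sure the bytes actually reach disk
        temp_path = file_obj.name
    model = keras.models.load_model(temp_path)
    os.remove(temp_path)
    return model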

Glad to hear that you made it work!