Forums

TensorFlow trained model gives no result in Flask app — no error, the server just becomes busy. The same code runs fine in the terminal

I have a Flask image classification app with a trained model. All dependencies are installed and the code looks fine; I checked it line by line. When I pass an image for prediction, the server becomes busy and, after a long time, asks me to refresh the page.

1. There are no errors in the error_log file.
2. The image URL/path is correct.

Flask code snippet:

import os
import numpy as np
from tensorflow.keras.preprocessing import image
from tensorflow.keras.models import load_model

model_path = os.getcwd() + '/next_clasifier/model.h5'
model = load_model(model_path)
model.make_predict_function()  # build the predict function once, up front

def predict_label(test_image):
    # load and preprocess the image to match the model's input shape
    test_image = image.load_img(test_image, target_size=(224, 224))
    test_image = image.img_to_array(test_image) / 255
    test_image = np.expand_dims(test_image, axis=0)
    result = model.predict(test_image)  # the app gets stuck here
    if result[0][0] >= 0.5:
        prediction = 'Heart'
    elif result[0][1] >= 0.5:
        prediction = 'Oblong'
    elif result[0][2] >= 0.5:
        prediction = 'Oval'
    elif result[0][3] >= 0.5:
        prediction = 'Round'
    else:
        prediction = 'Square'
    return prediction

Server error: ![after running a long time it gives an error][4]. The same code runs perfectly in the same env when tested from the terminal as test.py, with the command (env) 23:23 ~/next_clasifier $ python test.py.

Running the code in the terminal:

import os
from flask import Flask, render_template, request
from tensorflow.keras.preprocessing import image
import numpy as np
from tensorflow.keras.models import load_model

app = Flask(__name__)
dic = {0: 'Heart', 1: 'Oblong', 2: 'Oval', 3: 'Round', 4: 'Square'}

model_path = os.getcwd() + '/next_clasifier/model.h5'
model = load_model(model_path)
model.make_predict_function()

def predict_label(test_image):
    # load and preprocess the image to match the model's input shape
    test_image = image.load_img(test_image, target_size=(224, 224))
    test_image = image.img_to_array(test_image) / 255
    test_image = np.expand_dims(test_image, axis=0)
    result = model.predict(test_image)

    # indices 0..4 correspond to the five classes in dic
    if result[0][0] >= 0.5:
        prediction = 'Heart'
    elif result[0][1] >= 0.5:
        prediction = 'Oblong'
    elif result[0][2] >= 0.5:
        prediction = 'Oval'
    elif result[0][3] >= 0.5:
        prediction = 'Round'
    else:
        prediction = 'Square'
    return prediction

ig_path = 'image.jpg'

print(ig_path)
print("result:")

resulted = predict_label(ig_path)
print(resulted)

and its result:

The terminal run is successful: ![https://ibb.co/MSzBRzb][1]
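As an aside, since dic already maps class indices to labels, the five-way elif chain in both snippets could be reduced to a single argmax lookup (the behaviour differs slightly when no class reaches 0.5, where the chain falls through to 'Square'). A minimal sketch, assuming result comes from model.predict as above:

prediction = dic[int(np.argmax(result[0]))]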

TensorFlow would probably not work in the web app; see this help page. You may consider moving the ML code outside of the web app; see this.
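One rough sketch of that pattern (not PythonAnywhere-specific guidance; the directory path and file naming here are hypothetical) is to have the Flask view only enqueue the work, and run the actual prediction in a separate long-running worker process:

import glob
import json
import os
import time

TASK_DIR = '/home/yourusername/tasks'  # hypothetical shared directory

# web app side: write a task file instead of calling model.predict() directly
def enqueue_prediction(image_path, task_id):
    with open(os.path.join(TASK_DIR, task_id + '.task.json'), 'w') as f:
        json.dump({'image': image_path}, f)

# worker side (a separate always-on process): load the model once at startup,
# then poll for task files and write a result file for each one
def run_worker(predict_label):
    while True:
        for task_file in glob.glob(os.path.join(TASK_DIR, '*.task.json')):
            with open(task_file) as f:
                task = json.load(f)
            result = predict_label(task['image'])  # the slow TensorFlow call
            result_file = task_file.replace('.task.json', '.result.json')
            with open(result_file, 'w') as f:
                json.dump({'prediction': result}, f)
            os.remove(task_file)
        time.sleep(1)

The web page can then poll for the matching .result.json file instead of blocking inside the request while the model runs.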

I have been facing a similar issue too. The request times out after 5 minutes without any error.

While PythonAnywhere works on getting TensorFlow running, is there a way to at least make the tensorflow-cpu version available? That would help with deploying smaller deep learning models for prediction; we would at least have something to try out rather than nothing at all.

See https://help.pythonanywhere.com/pages/MachineLearningInWebsiteCode/

Made some progress: I converted the "model.h5" file to TensorFlow Lite format ("model.tflite"). Loading the model and predicting with it required code changes.

With those changes deployed on PythonAnywhere, it works as expected.
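For reference, a minimal sketch of the kind of changes involved, assuming a Keras .h5 model with a (1, 224, 224, 3) float input like the one above (file names are illustrative):

import numpy as np
import tensorflow as tf

# one-off conversion: model.h5 -> model.tflite (run where full TF is available)
keras_model = tf.keras.models.load_model('model.h5')
converter = tf.lite.TFLiteConverter.from_keras_model(keras_model)
with open('model.tflite', 'wb') as f:
    f.write(converter.convert())

# in the web app: run inference with the lightweight TF Lite interpreter
interpreter = tf.lite.Interpreter(model_path='model.tflite')
interpreter.allocate_tensors()
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

def predict(test_image):
    # test_image: float32 array of shape (1, 224, 224, 3), scaled to [0, 1]
    interpreter.set_tensor(input_details[0]['index'], test_image.astype(np.float32))
    interpreter.invoke()
    return interpreter.get_tensor(output_details[0]['index'])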

Prior to this, I tried to install tensorflow-cpu in a virtualenv. The package is only about 172 MB, but the installation failed; the reason could be that the installation process consumes up to 512 MB of disk quota before it fails.

Including a link to the GitHub repository in case someone is looking to convert a .h5 weights file to .tflite format. The conversion code can be found in Animal.ipynb: https://github.com/ra9hur/Animal-Image-Classification-WebApp

Hello, I am also facing the same issue: my chatbot is not responding to queries. If anyone knows how to make it work, please help me.

We need more details to help you. If you don't want to provide details on public forums, email us at support@pythonanywhere.com. Note that we can help you with PythonAnywhere-specific problems; for more general questions you need to ask on more general forums.