Hello, I am having an issue loading a saved TensorFlow model on PythonAnywhere. The end goal is to serve model inference through a Flask API to my React frontend. My error log looks like this:
File "/home/kamleshsahoo/mysite/flask_app.py", line 23, in sentiment_analyzer
model = tf.keras.models.load_model('/home/kamleshsahoo/mysite/saved_models/model2')
File "/usr/local/lib/python3.10/site-packages/keras/utils/traceback_utils.py", line 67, in error_handler
raise e.with_traceback(filtered_tb) from None
File "/usr/local/lib/python3.10/site-packages/keras/saving/saved_model/load.py", line 532, in _revive_layer_or_model_from_config
raise RuntimeError(
RuntimeError: Unable to restore object of class 'TextVectorization' likely due to name conflict with built-in Keras class '<class 'keras.layers.preprocessing.text_vectorization.TextVectorization'>'. To override the built-in Keras definition of the object, decorate your class with `@keras.utils.register_keras_serializable` and include that file in your program, or pass your class in a `keras.utils.CustomObjectScope` that wraps this load call.
Flask app:
from flask import Flask, request, jsonify
from flask_cors import CORS, cross_origin
import tensorflow as tf
import numpy as np

app = Flask(__name__)
cors = CORS(app)

@app.route('/', methods=['POST'])
@cross_origin()
def sentiment_analyzer():
    numeric_to_sentiment = {0: 'negative', 1: 'neutral', 2: 'positive'}
    data = request.get_json()
    news = data.get('news')
    ## fails at next line
    model = tf.keras.models.load_model('/home/kamleshsahoo/mysite/saved_models/model2')
    pred = model.predict([news])
    senti = numeric_to_sentiment[np.argmax(pred)]
    return jsonify(sentiment=senti)
I changed the backend to theano in the keras.json file, as suggested in other related posts on this forum. On the TensorFlow side, I found that the decorator fix is needed when the model has a custom layer; however, in my case I am using the built-in TextVectorization layer and am still getting the error. Please help.
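For reference, based on the error message's own suggestion, I understand the workaround would look roughly like the self-contained round-trip below: wrap the load call in `keras.utils.custom_object_scope` so Keras resolves TextVectorization explicitly. The toy model here is just a stand-in for illustration, not my actual saved model:

```python
import tempfile

import numpy as np
import tensorflow as tf

# Build a tiny stand-in model that includes a TextVectorization layer.
texts = np.array(['good news', 'bad news', 'some news'])
vectorize = tf.keras.layers.TextVectorization(max_tokens=20, output_sequence_length=4)
vectorize.adapt(texts)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,), dtype=tf.string),
    vectorize,
    tf.keras.layers.Embedding(20, 8),
    tf.keras.layers.GlobalAveragePooling1D(),
    tf.keras.layers.Dense(3, activation='softmax'),
])

# Save in the SavedModel (directory) format, like my model2 directory.
save_dir = tempfile.mkdtemp()
model.save(save_dir)

# The workaround from the error message: map the name 'TextVectorization'
# to the built-in class while loading.
with tf.keras.utils.custom_object_scope(
        {'TextVectorization': tf.keras.layers.TextVectorization}):
    reloaded = tf.keras.models.load_model(save_dir)

pred = reloaded.predict(np.array([['good news']]))
print(pred.shape)  # (1, 3)
```

Is this the right way to apply the `CustomObjectScope` fix on PythonAnywhere, or is a Keras version mismatch between the machine that saved the model and the server the more likely cause?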