improve LSTM accuracy
See original GitHub issue

I'm trying to build an LSTM architecture to predict a sickness rate. I'm stuck at 40% accuracy. I'm new to machine learning, and I've tried several tweaks, such as changing the optimizer, the number of nodes per layer, and the dropout value, without any improvement. Could you give me some advice?
The x array is composed of 10 columns (features); the y array is a single column, the sickness rate.
Here is my model:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dropout, Dense
from tensorflow.keras.constraints import NonNeg

def lstm_model():
    model = Sequential()
    model.add(LSTM(10, input_shape=(1, 10), return_sequences=True))
    model.add(Dropout(0.2))
    model.add(LSTM(100, return_sequences=True))
    model.add(LSTM(100, return_sequences=False))
    model.add(Dropout(0.2))
    model.add(Dense(50, kernel_constraint=NonNeg(),
                    kernel_initializer='normal', activation='relu'))
    model.add(Dense(1, activation='linear'))
    model.compile(optimizer='adam', loss='mean_squared_error',
                  metrics=['accuracy'])
    return model

lstm = lstm_model()
```
This is the output of `.evaluate()`:
```
1275/1275 [==============================] - 1s 526us/sample - loss: 0.0015 - acc: 0.3930
0.0014869439909029204 0.3930161
```
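As context for the answers below: with a single continuous output, the string `'accuracy'` ends up meaning binary accuracy in Keras (my understanding of the metric resolution in Keras versions of that era; an assumption, not verified against this exact setup), i.e. the fraction of samples where `round(y_pred) == y_true`. A small NumPy sketch shows why that number says nothing about a regression model's quality:

```python
import numpy as np

rng = np.random.default_rng(0)
y_true = rng.uniform(0.0, 1.0, size=1000)           # continuous sickness rates
y_pred = y_true + rng.normal(0.0, 0.02, size=1000)  # a quite accurate regressor

# binary_accuracy-style comparison: round predictions, test exact equality
acc = np.mean(np.round(y_pred) == y_true)
print(acc)   # essentially zero: exact equality almost never holds for floats

# a regression metric tells the real story
mae = np.mean(np.abs(y_pred - y_true))
print(mae)   # small: the predictions are actually close to the targets
```

So an "accuracy" of 0.39 here is an artifact of the metric, not a measure of the model.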
Thank you in advance.
Issue Analytics
- Created: 4 years ago
- Comments: 7 (2 by maintainers)
Adding to @RooieRakkert's answer: there are two methods you can use to check whether your model is performing well on a regression task:
1. Use a `root_mean_squared_error` metric, and make sure that training, validation, and testing error are low and close to each other in magnitude.
2. Use the R² score from the `sklearn` library. The best possible score is 1.0, and it can be negative (because the model can be arbitrarily worse). A constant model that always predicts the expected value of y, disregarding the input features, would get an R² score of 0.0. So if your R² score is close to 1, it's a good model.

In that case `linear` and `mean_squared_error` are both fine; `accuracy` is not a valid metric here (this is not a classification problem). Consider using `mean_squared_error` (the loss function) or `mean_absolute_error` as a metric.
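To make those two checks concrete, here is a minimal sketch in plain NumPy (so it runs without Keras; `sklearn.metrics.r2_score` and `mean_absolute_error` compute the same quantities). The sample arrays are made up for illustration:

```python
import numpy as np

def r2_score(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def mae(y_true, y_pred):
    """Mean absolute error."""
    return float(np.mean(np.abs(np.asarray(y_true) - np.asarray(y_pred))))

# hypothetical sickness-rate targets and model predictions
y_true = np.array([0.30, 0.45, 0.50, 0.40])
y_pred = np.array([0.32, 0.43, 0.49, 0.41])

print(r2_score(y_true, y_pred))  # close to 1 -> good fit
print(mae(y_true, y_pred))       # small -> predictions near targets

# a constant model that always predicts the mean of y scores exactly 0
print(r2_score(y_true, np.full_like(y_true, y_true.mean())))
```

In practice you would pass `metrics=['mean_absolute_error']` to `model.compile` and compute R² on the held-out predictions.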