LSTM history length vs prediction error
I use an LSTM to predict the next-step voltage value in a voltage time series. My question: why does training the LSTM on longer sequences (5 or 10 time steps) not improve prediction and reduce the prediction error? It actually degrades; see the figures, e.g. the results for sequence_length=5 are better than for sequence_length=10.

[Figures omitted: predicted signal in green, real signal in red. Plot titles:]

testplot('epochs: 10', 'ratio: 1', 'sequence_length: 10', 'mean error: ', '0.00116802704509')
testplot('epochs: 10', 'ratio: 1', 'sequence_length: 5', 'mean error: ', '0.000495359163296')

```python
import os
import time
import csv

import matplotlib.pyplot as plt
import numpy as np
from keras.layers.core import Dense, Activation, Dropout
from keras.layers.recurrent import LSTM
from keras.models import Sequential

np.random.seed(1234)

def data_power_consumption(path_to_dataset, sequence_length=50, ratio=1.0):
    max_values = ratio * 2049280
    ...
```
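For context on what `sequence_length` controls, here is a minimal sketch (a hypothetical helper, not taken from the original code) of how a 1-D series is typically windowed into fixed-length input sequences with the next value as the prediction target:

```python
import numpy as np

def make_windows(series, sequence_length):
    """Slice a 1-D series into overlapping windows of `sequence_length`
    inputs, each paired with the immediately following value as the target."""
    X, y = [], []
    for i in range(len(series) - sequence_length):
        X.append(series[i:i + sequence_length])
        y.append(series[i + sequence_length])
    return np.array(X), np.array(y)

# Toy example: a short ramp signal
signal = np.arange(10, dtype=float)
X, y = make_windows(signal, sequence_length=5)
print(X.shape, y.shape)  # (5, 5) (5,)
```

A longer `sequence_length` gives the LSTM more history per sample but also yields fewer training samples from the same series, which is one reason error comparisons between lengths are not apples-to-apples.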