So I'm trying to use Keras' `fit_generator` with a custom data generator to feed into an LSTM network. To illustrate the problem, I have created a toy example trying to predict the next number in a simple ascending sequence, and I use the Keras `TimeseriesGenerator` to create a `Sequence` instance:

```python
WINDOW_LENGTH = 4
data_gen = TimeseriesGenerator(data, data, length=WINDOW_LENGTH, ...)
```

I use a simple LSTM network:

```python
data_dim = 1
input1 = Input(shape=(WINDOW_LENGTH, data_dim))
lstm1 = LSTM(...)(input1)
hidden = Dense(20, activation='relu')(lstm1)
output = Dense(data_dim, activation='linear')(hidden)
model = Model(inputs=input1, outputs=output)
model.compile(loss='mse', optimizer='rmsprop', metrics=[...])
```

And train it using the `fit_generator` function:

```python
model.fit_generator(generator=data_gen, ...)
```

And this trains perfectly, and the model makes predictions as expected.

Now the problem is, in my non-toy situation I want to process the data coming out from the `TimeseriesGenerator` before feeding it into `fit_generator`. As a step towards this, I create a generator function which just wraps the `TimeseriesGenerator` used previously:

```python
def get_generator(data, targets, window_length=5, batch_size=32):
    data_gen = TimeseriesGenerator(data, targets, length=window_length, ...)

data_gen_custom = get_generator(data, data,
                                window_length=WINDOW_LENGTH, batch_size=1)
```

But now the strange thing is that when I train the model as before, but using this generator as the input:

```python
model.fit_generator(generator=data_gen_custom, ...)
```
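One thing worth checking in this situation: a `TimeseriesGenerator` is a `Sequence` (indexable, with a length), while a plain Python function that wraps it is usually turned into a generator, and `fit_generator` expects a plain generator to yield batches indefinitely and to be given `steps_per_epoch` explicitly. The following is a minimal sketch of such a wrapper, not the code from the post — the `transform` post-processing hook is a hypothetical stand-in for whatever processing is wanted:

```python
import numpy as np

def wrap_generator(seq, transform):
    # `seq` is any indexable batch source (e.g. a Keras Sequence such as
    # TimeseriesGenerator); `transform` is a hypothetical post-processing
    # function applied to each input batch before it is yielded.
    # A plain generator handed to fit_generator must loop forever,
    # so we restart from index 0 after exhausting the sequence.
    while True:
        for i in range(len(seq)):
            x, y = seq[i]
            yield transform(x), y
```

Used roughly like `model.fit_generator(wrap_generator(data_gen, my_transform), steps_per_epoch=len(data_gen), ...)`, where `my_transform` is whatever processing step is needed.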
You zip together two generators seeded with the same seed, and then `fit_generator` on them:

```python
# we create two instances with the same arguments
data_gen_args = dict(featurewise_center=True, ...)
image_datagen = ImageDataGenerator(**data_gen_args)
mask_datagen = ImageDataGenerator(**data_gen_args)

# Provide the same seed and keyword arguments to the fit and flow methods
seed = 1
image_datagen.fit(images, augment=True, seed=seed)
mask_datagen.fit(masks, augment=True, seed=seed)

image_generator = image_datagen.flow_from_directory(...)
mask_generator = mask_datagen.flow_from_directory(...)

# combine generators into one which yields image and masks
train_generator = zip(image_generator, mask_generator)
```

There is work on extending `ImageDataGenerator` to be more flexible for exactly these types of cases (see this issue on GitHub for examples).

Additionally, as mentioned by Mikael Rousson in the comments, you can easily create your own version of `ImageDataGenerator` yourself, while leveraging many of its built-in functions to make it easier. Here is example code I've used for an image denoising problem, where I use random crops + additive noise to generate clean and noisy image pairs on the fly. You could easily modify this to add other types of augmentations. After which, you can use `Model.fit_generator` to train using these methods:

```python
from keras.preprocessing.image import load_img, img_to_array, list_pictures

...
source = img + np.random.normal(0, sigma, img.shape) / 255.0
...
sources, targets = zip(*...)
batch = (np.stack(sources), np.stack(targets))
...

def load_dataset(directory, crop_size, sigma, batch_size):
    generator = image_generator(files, crop_size, scale=1/255.0, shift=0.5)
    generator = corrupted_training_pair(generator, sigma)
    generator = group_by_batch(generator, batch_size)
```

You can then use the above like so:

```python
train_set = load_dataset('images/train', (patch_height, patch_width), noise_sigma, batch_size)
val_set = load_dataset('images/val', (patch_height, patch_width), noise_sigma, batch_size)
```
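The pipeline above ends with `group_by_batch(generator, batch_size)`, whose job is to collect individual `(source, target)` pairs into stacked numpy batches. The following is a possible sketch of such a helper, not the author's original code — the use of `islice` and the choice to drop a final incomplete batch are my assumptions:

```python
import numpy as np
from itertools import islice

def group_by_batch(pairs, batch_size):
    # Pull batch_size (source, target) pairs at a time from the iterator
    # and stack each side into a single numpy array per batch.
    while True:
        chunk = list(islice(pairs, batch_size))
        if len(chunk) < batch_size:
            return  # drop any incomplete final batch (an assumption)
        sources, targets = zip(*chunk)
        yield np.stack(sources), np.stack(targets)
```

Each yielded batch then has the `(batch_size, ...)` leading dimension that `fit_generator` expects.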