This article covers the ways to train a network in TensorFlow. The content is quite practical, so it is shared here for reference; let's take a look together.
TensorFlow offers two ways to train a network: one is based on tensors (arrays), the other uses an iterator.
The difference between the two is:
The first loads all of the data into one tensor, then calls model.fit() with the batch_size argument so that the data is split into batches for training.
The second batches the data yourself into an iterator first, then loops over that iterator and trains on each batch separately, as shown in the minimal sketch below.
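Before the full examples, here is a minimal sketch of the two approaches. Note that build_model is a hypothetical helper that is not part of the original article, and the second loop uses train_on_batch() only to keep the sketch short (the article's own iterator example calls model.fit() on each batch instead):

import tensorflow as tf

def build_model():
    # Hypothetical helper: any compiled Keras classifier would do here.
    model = tf.keras.Sequential([
        tf.keras.layers.Flatten(input_shape=(28, 28)),
        tf.keras.layers.Dense(10)
    ])
    model.compile(optimizer='adam',
                  loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
                  metrics=['accuracy'])
    return model

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train / 255.0

# Approach 1: hand the whole arrays to model.fit() and let batch_size split them
model_a = build_model()
model_a.fit(x_train, y_train, batch_size=32, epochs=1)

# Approach 2: batch the data into a tf.data.Dataset iterator first, then loop over it
model_b = build_model()
train_loader = tf.data.Dataset.from_tensor_slices((x_train, y_train)).batch(32)
for x, y in train_loader:
    model_b.train_on_batch(x, y)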
Full example of the second (iterator) approach:

import tensorflow as tf
from tensorflow.keras import Input

IMAGE_SIZE = 1000

# step1: load the dataset
(train_images, train_labels), (val_images, val_labels) = tf.keras.datasets.mnist.load_data()
# step2: normalize the images
train_images, val_images = train_images / 255.0, val_images / 255.0
# step3: limit the size of the training and validation sets
train_images = train_images[:IMAGE_SIZE]
val_images = val_images[:IMAGE_SIZE]
train_labels = train_labels[:IMAGE_SIZE]
val_labels = val_labels[:IMAGE_SIZE]
# step4: expand the image dimensions to (IMAGE_SIZE, 28, 28, 1)
train_images = tf.expand_dims(train_images, axis=3)
val_images = tf.expand_dims(val_images, axis=3)
# step5: resize the images to (32, 32)
train_images = tf.image.resize(train_images, [32, 32])
val_images = tf.image.resize(val_images, [32, 32])
# step6: turn the data into iterators
train_loader = tf.data.Dataset.from_tensor_slices((train_images, train_labels)).batch(32)
val_loader = tf.data.Dataset.from_tensor_slices((val_images, val_labels)).batch(IMAGE_SIZE)
# step7: build the model (LeNet5 is a custom tf.keras.Model subclass; a sketch of it follows this example)
model = LeNet5()
# let the model know the shape of its input
model.build(input_shape=(1, 32, 32, 1))
# work around "Output Shape: multiple" in model.summary()
model.call(Input(shape=(32, 32, 1)))
# step8: compile the model
model.compile(optimizer='adam',
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=['accuracy'])
# path where the weights are saved
checkpoint_path = "./weight/cp.ckpt"
# callback used to save the weights
save_callback = tf.keras.callbacks.ModelCheckpoint(filepath=checkpoint_path,
                                                   save_best_only=True,
                                                   save_weights_only=True,
                                                   monitor='val_loss',
                                                   verbose=0)
EPOCHS = 11
for epoch in range(1, EPOCHS):
    # training loss for this epoch
    train_epoch_loss_avg = tf.keras.metrics.Mean()
    # training accuracy for this epoch
    train_epoch_accuracy = tf.keras.metrics.SparseCategoricalAccuracy()
    # validation loss for this epoch
    val_epoch_loss_avg = tf.keras.metrics.Mean()
    # validation accuracy for this epoch
    val_epoch_accuracy = tf.keras.metrics.SparseCategoricalAccuracy()
    for x, y in train_loader:
        history = model.fit(x, y,
                            validation_data=val_loader,
                            callbacks=[save_callback],
                            verbose=0)
        # update the loss, keeping the running state from previous batches
        train_epoch_loss_avg.update_state(history.history['loss'][0])
        # update the accuracy, keeping the running state from previous batches
        train_epoch_accuracy.update_state(y, model(x, training=True))
        val_epoch_loss_avg.update_state(history.history['val_loss'][0])
        val_epoch_accuracy.update_state(next(iter(val_loader))[1],
                                        model(next(iter(val_loader))[0], training=True))
    # use .result() to read the accumulated loss and accuracy for the epoch
    print("Epoch {:d}: trainLoss: {:.3f}, trainAccuracy: {:.3%} valLoss: {:.3f}, valAccuracy: {:.3%}".format(
        epoch,
        train_epoch_loss_avg.result(),
        train_epoch_accuracy.result(),
        val_epoch_loss_avg.result(),
        val_epoch_accuracy.result()))
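The LeNet5 class used above is not defined in the article. Below is a minimal sketch of what such a subclassed tf.keras.Model could look like, assuming a classic LeNet-5 layout for 32x32x1 inputs; the author's actual layer configuration may differ:

import tensorflow as tf

class LeNet5(tf.keras.Model):
    # Assumed LeNet-5 style architecture for 32x32x1 MNIST images; not taken from the article.
    def __init__(self):
        super().__init__()
        self.c1 = tf.keras.layers.Conv2D(6, kernel_size=5, activation='sigmoid')
        self.p1 = tf.keras.layers.AveragePooling2D(pool_size=2)
        self.c2 = tf.keras.layers.Conv2D(16, kernel_size=5, activation='sigmoid')
        self.p2 = tf.keras.layers.AveragePooling2D(pool_size=2)
        self.flatten = tf.keras.layers.Flatten()
        self.f1 = tf.keras.layers.Dense(120, activation='sigmoid')
        self.f2 = tf.keras.layers.Dense(84, activation='sigmoid')
        self.f3 = tf.keras.layers.Dense(10)  # logits, matching from_logits=True above

    def call(self, x, training=None):
        x = self.p1(self.c1(x))
        x = self.p2(self.c2(x))
        x = self.flatten(x)
        return self.f3(self.f2(self.f1(x)))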
Full example of the first (tensor/array) approach, where model.fit() does the batching:

import tensorflow as tf
from tensorflow.keras import Input
import model_sequential

# step1: load the dataset
(train_images, train_labels), (test_images, test_labels) = tf.keras.datasets.mnist.load_data()
# step2: normalize the images
train_images, test_images = train_images / 255.0, test_images / 255.0
# step3: expand the image dimensions to (60000, 28, 28, 1)
train_images = tf.expand_dims(train_images, axis=3)
test_images = tf.expand_dims(test_images, axis=3)
# step4: resize the images to (60000, 32, 32, 1)
train_images = tf.image.resize(train_images, [32, 32])
test_images = tf.image.resize(test_images, [32, 32])
# step5: build the model
# history = LeNet5()
history = model_sequential.LeNet()
# let the model know the shape of its input
history.build(input_shape=(1, 32, 32, 1))
# history(tf.zeros([1, 32, 32, 1]))
# work around "Output Shape: multiple" in model.summary()
history.call(Input(shape=(32, 32, 1)))
history.summary()
# step6: compile the model
history.compile(optimizer='adam',
                loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
                metrics=['accuracy'])
# path where the weights are saved
checkpoint_path = "./weight/cp.ckpt"
# callback used to save the weights
save_callback = tf.keras.callbacks.ModelCheckpoint(filepath=checkpoint_path,
                                                   save_best_only=True,
                                                   save_weights_only=True,
                                                   monitor='val_loss',
                                                   verbose=1)
# step7: train the model
history = history.fit(train_images, train_labels,
                      epochs=10,
                      batch_size=32,
                      validation_data=(test_images, test_labels),
                      callbacks=[save_callback])
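Once training finishes, the best weights saved by the ModelCheckpoint callback can be restored and evaluated. A short sketch, assuming the same checkpoint path and the model_sequential.LeNet definition referenced above:

# Rebuild the same architecture, then restore the best weights saved during training
model = model_sequential.LeNet()
model.build(input_shape=(1, 32, 32, 1))
model.compile(optimizer='adam',
              loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
              metrics=['accuracy'])
model.load_weights("./weight/cp.ckpt")

# Evaluate the restored model on the preprocessed test images from the example above
loss, acc = model.evaluate(test_images, test_labels, verbose=0)
print("Restored model accuracy: {:.3%}".format(acc))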
Thanks for reading! That is all for "What are the ways to train a network in TensorFlow". Hopefully the content above is of some help and lets you learn a bit more; if you found the article useful, feel free to share it so more people can see it.