Introduction to Artificial Intelligence
Image Classification with Multilayer Perceptron
Nankai University

Image classification assigns an image to a category based on its semantic content, and it is a fundamental problem in computer vision. This experiment uses a Multi-Layer Perceptron (MLP) model to recognize handwritten digit images.

1. Preparing the data set

1.1 Data set description

The MNIST data set contains 60,000 training examples and 10,000 test examples. Each example consists of a picture and a label: the picture is a 28*28 pixel matrix, and the label is one of the 10 digits from 0 to 9.

1.2 Download and prepare the MNIST dataset

import tensorflow as tf
import numpy as np

# Load the MNIST training and test sets
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.mnist.load_data()

# Normalization: rescale the pixel values of each image
x_train = tf.keras.utils.normalize(x_train, axis=1)
x_test = tf.keras.utils.normalize(x_test, axis=1)

2. Create the MLP

The following code defines a simple MLP with three layers: two hidden layers of size 100 and an output layer of size 10. Because the MNIST data set consists of grayscale images of the handwritten digits 0 to 9, the number of categories is 10, so the final output size is 10. The activation function of the output layer is softmax, so the output layer acts as a classifier. Including the input layer, the structure of the multi-layer perceptron is: input layer - hidden layer - hidden layer - output layer.

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(),                       # 28*28 image -> 784-dim vector
    tf.keras.layers.Dense(100, activation='relu'),   # first hidden layer
    tf.keras.layers.Dense(100, activation='relu'),   # second hidden layer
    tf.keras.layers.Dense(10, activation='softmax')  # output layer: one probability per digit
])
model.build((None, 28, 28))
model.summary()

3. Compile and train the model

model.compile(optimizer='adam',                        # Optimizer
              loss='sparse_categorical_crossentropy',  # Cross-entropy loss for integer labels
              metrics=['accuracy'])                    # Track classification accuracy

# Train the model, validating on the test set after each epoch
history = model.fit(x_train, y_train,
                    validation_data=(x_test, y_test),
                    epochs=10, verbose=1)

4. Evaluate the model

from matplotlib import pyplot as plt

def plot_graphs(history, string):
    plt.plot(history.history[string])
    plt.plot(history.history['val_' + string])
    plt.xlabel('Epochs')
    plt.ylabel(string)
    plt.legend([string, 'val_' + string])
    plt.show()

plot_graphs(history, 'accuracy')
plot_graphs(history, 'loss')

Our simple MLP model achieves a test accuracy of over 97%.
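To make the roles of the softmax output layer and the sparse categorical cross-entropy loss concrete, here is a small NumPy sketch that is not part of the experiment above: the function names and the example logits are my own, and Keras computes the same quantities internally in a more numerically robust way. Softmax turns the 10 raw output scores into probabilities, argmax picks the predicted digit, and the loss is the negative log-probability assigned to the true label.

```python
import numpy as np

def softmax(z):
    # Subtract the max score for numerical stability before exponentiating
    e = np.exp(z - np.max(z))
    return e / e.sum()

def sparse_cross_entropy(probs, true_label):
    # Negative log-probability of the correct class (hypothetical helper)
    return -np.log(probs[true_label])

# Hypothetical raw scores from the 10-unit output layer for one image
logits = np.array([0.1, 0.2, 3.0, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1])
probs = softmax(logits)

predicted_digit = int(np.argmax(probs))           # class with the highest probability
loss = sparse_cross_entropy(probs, true_label=2)  # small when the model is confident and correct

print(predicted_digit)  # 2: index of the largest score
```

The probabilities sum to 1, and the loss shrinks toward 0 as the probability of the true class approaches 1, which is what training with 'sparse_categorical_crossentropy' pushes the network toward.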