Basics of MLP (Multilayer Perceptron)

Ajay Mane
Apr 26, 2020 · 2 min read



  • Objective: build vanilla neural networks (Multilayer Perceptrons) for simple regression/classification tasks with Keras.

MLP Structures

  • Each MLP model consists of one input layer, one or more hidden layers, and one output layer.
  • The number of neurons in each layer is not limited. Two example structures are listed below (the second one is sketched in Keras right after this list).
  • Example structure 1:
  • Number of input neurons: 3
  • Number of hidden neurons: 4
  • Number of output neurons: 2
  • Example structure 2:
  • Number of input neurons: 3
  • Number of hidden neurons: (4, 4), i.e. two hidden layers with 4 neurons each
  • Number of output neurons: 1
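As a concrete illustration of the second structure (3 inputs, two hidden layers of 4 neurons, 1 output), a minimal Keras sketch might look like the following. The layer sizes come from the bullets above; the sigmoid activation is an assumption made here for illustration, not something specified by the structure itself.

from keras.models import Sequential
from keras.layers import Dense

example = Sequential()
example.add(Dense(4, activation='sigmoid', input_shape=(3,)))  # first hidden layer: 4 neurons, 3 input features
example.add(Dense(4, activation='sigmoid'))                    # second hidden layer: 4 neurons
example.add(Dense(1))                                          # output layer: 1 neuron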

MLP for regression tasks

  • Used when the target (y) is continuous (real-valued)
  • For the loss function and evaluation metric, mean squared error (MSE) is commonly used
from keras.datasets import boston_housing

(X_train, y_train), (X_test, y_test) = boston_housing.load_data()

Datasets from Keras

Doc: https://keras.io/datasets/

print(X_train.shape)
print(X_test.shape)
print(y_train.shape)
print(y_test.shape)
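For reference, the Boston housing loader splits 506 samples into training and test sets, so the prints above should show something like the following (assuming the loader's default test_split of 0.2):

# Expected output (default test_split = 0.2):
# (404, 13)   <- 404 training samples, 13 features each
# (102, 13)   <- 102 test samples
# (404,)      <- training targets (median house prices)
# (102,)      <- test targets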

1. Creating a Model

  • Created with the Sequential class
  • At the outset the model is empty; it is completed by adding layers and then compiling it
  • Doc: https://keras.io/models/sequential/
from keras.models import Sequential

model = Sequential()

1–1. Adding layers

from keras.layers import Activation, Dense

# Keras MLP: three Dense(10) layers with sigmoid activations, followed by a Dense(1) output
model.add(Dense(10, activation=None, input_shape=(13,)))  # first layer => input_shape must be explicitly designated
model.add(Activation('sigmoid'))
model.add(Dense(10))  # hidden layer => only the output dimension needs to be designated
model.add(Activation('sigmoid'))
model.add(Dense(10))  # hidden layer => only the output dimension needs to be designated
model.add(Activation('sigmoid'))
model.add(Dense(1))   # output layer => output dimension = 1 since this is a regression problem

# The following is equivalent to the block above (use one or the other, not both):
model.add(Dense(10, input_shape=(13,), activation='sigmoid'))
model.add(Dense(10, activation='sigmoid'))
model.add(Dense(10, activation='sigmoid'))
model.add(Dense(1))

1–2. Model compile

  • A Keras model should be “compiled” prior to training.
  • The loss function and the optimizer should be designated at compile time
*Doc (optimizers): https://keras.io/optimizers/

*Doc (losses): https://keras.io/losses/
from keras import optimizers

sgd = optimizers.SGD(lr=0.01)  # stochastic gradient descent optimizer
model.compile(optimizer=sgd, loss='mean_squared_error', metrics=['mse'])  # for regression problems, mean squared error (MSE) is often employed
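If you are on a recent TensorFlow release, the standalone keras package and the lr argument may be deprecated; a roughly equivalent compile step under that assumption (not part of the original walkthrough) would be:

from tensorflow import keras

sgd = keras.optimizers.SGD(learning_rate=0.01)  # 'learning_rate' replaces the older 'lr' argument
model.compile(optimizer=sgd, loss='mean_squared_error', metrics=['mse'])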

Summary of the model

model.summary()
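As a sanity check on the summary output, the parameter counts can be worked out by hand: each Dense layer has (inputs + 1) × units parameters, where the +1 accounts for the bias. Assuming only one of the two equivalent layer-adding blocks above was run, this gives:

# Dense(10) on 13 inputs : (13 + 1) * 10 = 140 parameters
# Dense(10) on 10 inputs : (10 + 1) * 10 = 110 parameters
# Dense(10) on 10 inputs : (10 + 1) * 10 = 110 parameters
# Dense(1)  on 10 inputs : (10 + 1) * 1  =  11 parameters
# Total                  : 140 + 110 + 110 + 11 = 371 parameters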

2. Training

  • Fit the model with the training data provided
model.fit(X_train, y_train, batch_size=50, epochs=100, verbose=1)
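fit() also returns a History object, and a held-out validation split can be requested directly. The sketch below is an optional addition to the original walkthrough; the 0.2 validation fraction is an arbitrary choice for illustration.

history = model.fit(X_train, y_train, batch_size=50, epochs=100, verbose=1, validation_split=0.2)
print(history.history['loss'][-1])      # final training loss
print(history.history['val_loss'][-1])  # final validation loss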

3. Evaluation

  • A Keras model can be evaluated with the evaluate() function
  • Evaluation results are returned as a list
  • Doc(metrics) : https://keras.io/metrics/
results = model.evaluate(X_test, y_test)

print(model.metrics_names)  # list of the metrics the model is employing
print(results)              # actual figures of the metrics computed
print('loss: ', results[0])
print('mse: ', results[1])
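Beyond aggregate metrics, you can also inspect individual predictions with predict(). This comparison is a minimal sketch added here for illustration, not part of the original post.

y_pred = model.predict(X_test)

# Compare the first few predicted prices against the actual targets
for predicted, actual in zip(y_pred[:5].flatten(), y_test[:5]):
    print('predicted: %.1f   actual: %.1f' % (predicted, actual))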

Full code on Google Colaboratory link.
