Modeling With Transformations
import tensorflow as tf
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
from sklearn.compose import make_column_transformer
from sklearn.preprocessing import MinMaxScaler, OneHotEncoder
from sklearn.model_selection import train_test_split

Import Some Data
dataUrl = "https://raw.githubusercontent.com/stedy/Machine-Learning-with-R-datasets/master/insurance.csv"
dataFromWeb = pd.read_csv(dataUrl)
dataFromWeb.head()

| | age | sex | bmi | children | smoker | region | charges |
|---|---|---|---|---|---|---|---|
| 0 | 19 | female | 27.900 | 0 | yes | southwest | 16884.92400 |
| 1 | 18 | male | 33.770 | 1 | no | southeast | 1725.55230 |
| 2 | 28 | male | 33.000 | 3 | no | southeast | 4449.46200 |
| 3 | 33 | male | 22.705 | 0 | no | northwest | 21984.47061 |
| 4 | 32 | male | 28.880 | 0 | no | northwest | 3866.85520 |
labelField = 'charges'
featureData = dataFromWeb.drop(labelField, axis=1)
labelData = dataFromWeb[labelField]

Split into Training & Testing Data
testDataPercentage = .2 # how much of our data should we use for "testing"
randomVal = 42
feature_training_data, feature_testing_data, label_training_data, label_testing_data = train_test_split(featureData,
labelData,
test_size=testDataPercentage,
random_state=randomVal) # set random state for reproducible splits

Transform
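As a side note on random_state: splitting twice with the same seed yields identical partitions, which is what makes results reproducible across runs. A minimal sketch using toy arrays (not the insurance data):

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Toy feature matrix and labels, just to illustrate reproducibility
X = np.arange(20).reshape(10, 2)
y = np.arange(10)

X_tr1, X_te1, _, _ = train_test_split(X, y, test_size=0.2, random_state=42)
X_tr2, X_te2, _, _ = train_test_split(X, y, test_size=0.2, random_state=42)

# Same seed -> same split, every time
print(np.array_equal(X_te1, X_te2))
```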
The make_column_transformer (docs) function takes transformers paired with the lists of columns to apply them to, and returns a transformer instance.
The transformer instance then gets fitted to the data with the fit method.
The fitted instance then gets applied to the data with the transform method.
Two transformers will be applied here:
dataTransformer = make_column_transformer(
# get all values between 0 and 1
(MinMaxScaler(), ["age", "bmi", "children"]),
(OneHotEncoder(handle_unknown="ignore"), ["sex", "smoker", "region"])
)
dataTransformer.fit(feature_training_data)

ColumnTransformer(transformers=[('minmaxscaler', MinMaxScaler(),
['age', 'bmi', 'children']),
('onehotencoder',
OneHotEncoder(handle_unknown='ignore'),
['sex', 'smoker', 'region'])])
Normalize
normailized_feature_training_data = dataTransformer.transform(feature_training_data)
normailized_feature_testing_data = dataTransformer.transform(feature_testing_data)

Compare normalized vs non-normalized
normailized_feature_training_data[0]

array([0.60869565, 0.10734463, 0.4       , 1.        , 0.        ,
       1.        , 0.        , 0.        , 1.        , 0.        ,
       0.        ])

feature_training_data.loc[0]

age                19
sex            female
bmi              27.9
children            0
smoker            yes
region      southwest
Name: 0, dtype: object
normailized_feature_training_data.shape

(1070, 11)

feature_training_data.shape

(1070, 6)
Build A Model
This will be based on the insurance_model_2 model that can be found in the modeling-and-wrangling notebook.
This model version, though, will use normalized data: one-hot-encoded column values and scaled column values.
tf.random.set_seed(42)
m = tf.keras.Sequential()
epochs = 100
# different & more layers
l1 = tf.keras.layers.Dense(100)
l2 = tf.keras.layers.Dense(10)
l3 = tf.keras.layers.Dense(1)
m.add(l1)
m.add(l2)
m.add(l3)
# Compile the model
m.compile(loss=tf.keras.losses.mae,
optimizer=tf.keras.optimizers.Adam(),
metrics=['mae'])
# Fit the model and save the history (we can plot this)
m_history = m.fit(normailized_feature_training_data, label_training_data, epochs=epochs, verbose=0)

Review The Model
m.summary()

Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 100) 1200
dense_1 (Dense) (None, 10) 1010
dense_2 (Dense) (None, 1) 11
=================================================================
Total params: 2221 (8.68 KB)
Trainable params: 2221 (8.68 KB)
Non-trainable params: 0 (0.00 Byte)
_________________________________________________________________
m.evaluate(normailized_feature_testing_data, label_testing_data)

9/9 [==============================] - 0s 5ms/step - loss: 3437.7786 - mae: 3437.7786
[3437.778564453125, 3437.778564453125]
print(f'Training Label Median: {label_training_data.median()}')
print(f'Training Label Mean: {label_training_data.mean()}')
print(f'm MAE: {m.get_metrics_result()["mae"].numpy()}')

Training Label Median: 9575.4421
Training Label Mean: 13346.089736364485
m MAE: 3437.778564453125
Compare this model's MAE to the insurance_model_2 (im2) model in modeling-and-wrangling:

- im2 had an MAE of ~4700
- the new model's MAE looks to be ~3400

Normalizing this model's data, with one-hot encoding and scaling, made the model perform better!
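One way to put an MAE of ~3400 in context is to compare it against the "no-model" baseline of always predicting the training mean. A quick sketch of that baseline check, using hypothetical label values (not the actual charges column):

```python
import numpy as np

# Hypothetical label values standing in for `charges`
labels = np.array([1700.0, 4400.0, 9500.0, 16800.0, 22000.0])

# MAE of always predicting the mean -- the constant-prediction baseline
baseline_pred = labels.mean()
baseline_mae = np.abs(labels - baseline_pred).mean()
print(baseline_mae)  # 6816.0
```

A trained model is only doing useful work if its MAE beats this kind of constant baseline.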
Experiment With The Model
Double The Epochs
m2 = tf.keras.Sequential()
m2epochs = 200
# Note: l1, l2, and l3 are the same layer instances used in m, so m2 starts
# from their already-trained weights rather than from a fresh initialization
m2.add(l1)
m2.add(l2)
m2.add(l3)
# Compile the model
m2.compile(loss=tf.keras.losses.mae,
optimizer=tf.keras.optimizers.Adam(),
metrics=['mae'])
# Fit the model and save the history (we can plot this)
m2_history = m2.fit(normailized_feature_training_data, label_training_data, epochs=m2epochs, verbose=0)

Review The Model
m2.summary()

Model: "sequential_1"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 100) 1200
dense_1 (Dense) (None, 10) 1010
dense_2 (Dense) (None, 1) 11
=================================================================
Total params: 2221 (8.68 KB)
Trainable params: 2221 (8.68 KB)
Non-trainable params: 0 (0.00 Byte)
_________________________________________________________________
m2.evaluate(normailized_feature_testing_data, label_testing_data)

9/9 [==============================] - 0s 5ms/step - loss: 3163.1138 - mae: 3163.1138
[3163.11376953125, 3163.11376953125]
print(f'Training Label Median: {label_training_data.median()}')
print(f'Training Label Mean: {label_training_data.mean()}')
print(f'm2 MAE: {m2.get_metrics_result()["mae"].numpy()}')
print(f'SHAPE: {normailized_feature_training_data.shape}')

Training Label Median: 9575.4421
Training Label Mean: 13346.089736364485
m2 MAE: 3163.11376953125
SHAPE: (1070, 11)
Increasing the epochs made a modest positive impact, reducing the MAE!
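The fit comments above mention that the saved history can be plotted. A sketch of how that could look with the matplotlib import from the top of the notebook; the loss lists below are made-up placeholders, and in the notebook you would pass m_history.history["loss"] and m2_history.history["loss"] instead:

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so this runs outside a notebook
import matplotlib.pyplot as plt

# Placeholder loss curves -- stand-ins for m_history.history["loss"] and
# m2_history.history["loss"], not real training output
loss_m = [5000 / (1 + 0.05 * i) for i in range(100)]
loss_m2 = [5000 / (1 + 0.05 * i) for i in range(200)]

fig, ax = plt.subplots()
ax.plot(loss_m, label="m (100 epochs)")
ax.plot(loss_m2, label="m2 (200 epochs)")
ax.set_xlabel("epoch")
ax.set_ylabel("loss (MAE)")
ax.legend()
fig.savefig("training_history.png")
```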
Add A Layer, Change Layer Values
m3 = tf.keras.Sequential()
l4 = tf.keras.layers.Dense(100)
m3.add(l1)
m3.add(l4)
m3.add(l2)
m3.add(l3)
# Compile the model
m3.compile(loss=tf.keras.losses.mae,
optimizer=tf.keras.optimizers.Adam(),
metrics=['mae'])
# Fit the model and save the history (we can plot this)
m3_history = m3.fit(normailized_feature_training_data, label_training_data, epochs=m2epochs)

Epoch 1/200 34/34 [==============================] - 3s 6ms/step - loss: 4890.2881 - mae: 4890.2881
...
Epoch 200/200 34/34 [==============================] - 0s 7ms/step - loss: 3495.7375 - mae: 3495.7375
Review The Model
m3.summary()

Model: "sequential_6"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 100) 1200
dense_7 (Dense) (None, 100) 10100
dense_1 (Dense) (None, 10) 1010
dense_2 (Dense) (None, 1) 11
=================================================================
Total params: 12321 (48.13 KB)
Trainable params: 12321 (48.13 KB)
Non-trainable params: 0 (0.00 Byte)
_________________________________________________________________
m3.evaluate(normailized_feature_testing_data, label_testing_data)

9/9 [==============================] - 0s 6ms/step - loss: 3223.4221 - mae: 3223.4221
[3223.422119140625, 3223.422119140625]
print(f'Training Label Median: {label_training_data.median()}')
print(f'Training Label Mean: {label_training_data.mean()}')
print(f'm3 MAE: {m3.get_metrics_result()["mae"].numpy()} vs m2 MAE: {m2.get_metrics_result()["mae"].numpy()}')

Training Label Median: 9575.4421
Training Label Mean: 13346.089736364485
m3 MAE: 3223.422119140625 vs m2 MAE: 3163.11376953125
Adding a layer left the MAE roughly the same.
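The Param # column in the summaries above can be reproduced by hand: a Dense layer has inputs × units weights plus one bias per unit. A quick check against m3's summary (11 input features after the transform):

```python
def dense_params(n_inputs, units):
    # weight matrix (n_inputs x units) plus one bias per unit
    return n_inputs * units + units

print(dense_params(11, 100))   # dense:   1200
print(dense_params(100, 100))  # dense_7: 10100
print(dense_params(100, 10))   # dense_1: 1010
print(dense_params(10, 1))     # dense_2: 11
```

Summing the four layers gives the 12,321 total parameters reported for m3.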
Change the Learning Rate, Use Fewer Epochs
m4 = tf.keras.Sequential()
# l4 = tf.keras.layers.Dense(100)
m4.add(l1)
# m4.add(l4)
m4.add(l2)
m4.add(l3)
# Compile the model
m4.compile(loss=tf.keras.losses.mae,
optimizer=tf.keras.optimizers.Adam(learning_rate=.008),
metrics=['mae'])
# Fit the model and save the history (we can plot this)
m4_history = m4.fit(normailized_feature_training_data, label_training_data, epochs=epochs)

Epoch 1/100 34/34 [==============================] - 3s 6ms/step - loss: 3510.7605 - mae: 3510.7605
...
Epoch 59/100 34/34 [==============================] -
0s 6ms/step - loss: 3486.4912 - mae: 3486.4912 Epoch 60/100 34/34 [==============================] - 0s 5ms/step - loss: 3488.1587 - mae: 3488.1587 Epoch 61/100 34/34 [==============================] - 0s 6ms/step - loss: 3495.7007 - mae: 3495.7007 Epoch 62/100 34/34 [==============================] - 0s 5ms/step - loss: 3490.6890 - mae: 3490.6890 Epoch 63/100 34/34 [==============================] - 0s 6ms/step - loss: 3484.7341 - mae: 3484.7341 Epoch 64/100 34/34 [==============================] - 0s 5ms/step - loss: 3489.3640 - mae: 3489.3640 Epoch 65/100 34/34 [==============================] - 0s 6ms/step - loss: 3481.2261 - mae: 3481.2261 Epoch 66/100 34/34 [==============================] - 0s 5ms/step - loss: 3496.1794 - mae: 3496.1794 Epoch 67/100 34/34 [==============================] - 0s 5ms/step - loss: 3484.1841 - mae: 3484.1841 Epoch 68/100 34/34 [==============================] - 0s 5ms/step - loss: 3485.4292 - mae: 3485.4292 Epoch 69/100 34/34 [==============================] - 0s 6ms/step - loss: 3480.9167 - mae: 3480.9167 Epoch 70/100 34/34 [==============================] - 0s 5ms/step - loss: 3489.8347 - mae: 3489.8347 Epoch 71/100 34/34 [==============================] - 0s 6ms/step - loss: 3487.2981 - mae: 3487.2981 Epoch 72/100 34/34 [==============================] - 0s 6ms/step - loss: 3490.7551 - mae: 3490.7551 Epoch 73/100 34/34 [==============================] - 0s 5ms/step - loss: 3499.1602 - mae: 3499.1602 Epoch 74/100 34/34 [==============================] - 0s 6ms/step - loss: 3491.1614 - mae: 3491.1614 Epoch 75/100 34/34 [==============================] - 0s 6ms/step - loss: 3481.2334 - mae: 3481.2334 Epoch 76/100 34/34 [==============================] - 0s 7ms/step - loss: 3486.4187 - mae: 3486.4187 Epoch 77/100 34/34 [==============================] - 0s 7ms/step - loss: 3481.8494 - mae: 3481.8494 Epoch 78/100 34/34 [==============================] - 0s 7ms/step - loss: 3488.6729 - mae: 3488.6729 Epoch 79/100 34/34 
[==============================] - 0s 6ms/step - loss: 3487.7092 - mae: 3487.7092 Epoch 80/100 34/34 [==============================] - 0s 6ms/step - loss: 3493.4900 - mae: 3493.4900 Epoch 81/100 34/34 [==============================] - 0s 6ms/step - loss: 3491.5762 - mae: 3491.5762 Epoch 82/100 34/34 [==============================] - 0s 6ms/step - loss: 3486.0894 - mae: 3486.0894 Epoch 83/100 34/34 [==============================] - 0s 6ms/step - loss: 3488.0815 - mae: 3488.0815 Epoch 84/100 34/34 [==============================] - 0s 6ms/step - loss: 3490.3613 - mae: 3490.3613 Epoch 85/100 34/34 [==============================] - 0s 6ms/step - loss: 3481.9177 - mae: 3481.9177 Epoch 86/100 34/34 [==============================] - 0s 6ms/step - loss: 3481.2058 - mae: 3481.2058 Epoch 87/100 34/34 [==============================] - 0s 6ms/step - loss: 3488.7839 - mae: 3488.7839 Epoch 88/100 34/34 [==============================] - 0s 6ms/step - loss: 3487.9512 - mae: 3487.9512 Epoch 89/100 34/34 [==============================] - 0s 5ms/step - loss: 3486.7156 - mae: 3486.7156 Epoch 90/100 34/34 [==============================] - 0s 7ms/step - loss: 3489.4817 - mae: 3489.4817 Epoch 91/100 34/34 [==============================] - 0s 6ms/step - loss: 3485.9375 - mae: 3485.9375 Epoch 92/100 34/34 [==============================] - 0s 5ms/step - loss: 3494.9062 - mae: 3494.9062 Epoch 93/100 34/34 [==============================] - 0s 6ms/step - loss: 3487.6230 - mae: 3487.6230 Epoch 94/100 34/34 [==============================] - 0s 5ms/step - loss: 3485.3987 - mae: 3485.3987 Epoch 95/100 34/34 [==============================] - 0s 5ms/step - loss: 3486.7446 - mae: 3486.7446 Epoch 96/100 34/34 [==============================] - 0s 6ms/step - loss: 3487.8154 - mae: 3487.8154 Epoch 97/100 34/34 [==============================] - 0s 5ms/step - loss: 3501.4971 - mae: 3501.4971 Epoch 98/100 34/34 [==============================] - 0s 5ms/step - loss: 3481.6580 - mae: 
3481.6580 Epoch 99/100 34/34 [==============================] - 0s 5ms/step - loss: 3502.9363 - mae: 3502.9363 Epoch 100/100 34/34 [==============================] - 0s 5ms/step - loss: 3494.2498 - mae: 3494.2498
Review The Model
m4.summary()Model: "sequential_6"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 100) 1200
dense_7 (Dense) (None, 100) 10100
dense_1 (Dense) (None, 10) 1010
dense_2 (Dense) (None, 1) 11
=================================================================
Total params: 12321 (48.13 KB)
Trainable params: 12321 (48.13 KB)
Non-trainable params: 0 (0.00 Byte)
_________________________________________________________________
m4.evaluate(normailized_feature_testing_data,label_testing_data)9/9 [==============================] - 0s 6ms/step - loss: 3223.4221 - mae: 3223.4221
[3223.422119140625, 3223.422119140625]
print(f'Training Label Median: {label_training_data.median()}')
print(f'Training Label Mean: {label_training_data.mean()}')
print(f'm4 MAE: {m4.get_metrics_result()["mae"].numpy()} vs m2 MAE: {m2.get_metrics_result()["mae"].numpy()}')Training Label Median: 9575.4421 Training Label Mean: 13346.089736364485 m4 MAE: 3494.249755859375 vs m2 MAE: 3163.11376953125
Setting the learning rate to 0.008, compared to the default 0.001, made the outcome slightly worse here (m4's final MAE of about 3494 vs m2's 3163).
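To put these MAE values in context, it helps to compute the MAE of a naive baseline that always predicts the training-set mean. This sketch uses a small hypothetical array in place of `label_training_data` (in the notebook you would pass `label_training_data.to_numpy()` instead):

```python
import numpy as np

# Hypothetical charge values standing in for label_training_data.
labels = np.array([1700.0, 4400.0, 9500.0, 16900.0, 22000.0])

# A model that always predicts the training mean earns this MAE
# (the mean absolute deviation of the labels).
baseline_mae = np.mean(np.abs(labels - labels.mean()))
```

If the baseline MAE comes out well above the model's, the network is at least beating a constant predictor; that is a quick sanity check before comparing learning rates against each other.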
Change the Learning Rate Again
m5 = tf.keras.Sequential()
# note: this re-adds the layer instances l1, l2, and l3 from the earlier
# models, so m5 starts from their already-trained weights
m5.add(l1)
m5.add(l2)
m5.add(l3)
# Compile the model with a larger learning rate
m5.compile(loss=tf.keras.losses.mae,
optimizer=tf.keras.optimizers.Adam(learning_rate=.01),
metrics=['mae'])
# Fit the model and save the history (we can plot this)
m5_history = m5.fit(normailized_feature_training_data, label_training_data, epochs=epochs)Epoch 1/100 34/34 [==============================] - 3s 6ms/step - loss: 3540.1484 - mae: 3540.1484 Epoch 2/100 34/34 [==============================] - 0s 5ms/step - loss: 3499.3572 - mae: 3499.3572 ... Epoch 99/100 34/34 [==============================] - 0s 6ms/step - loss: 3513.8071 - mae: 3513.8071 Epoch 100/100 34/34 [==============================] - 0s 6ms/step - loss: 3509.5012 - mae: 3509.5012
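One thing to note: because m5 re-added the layer instances l1, l2, and l3, it inherited weights those earlier models had already trained, which muddies a clean learning-rate comparison. A sketch of building the same architecture from fresh layer instances, assuming the transformed input has 11 columns (3 min-max-scaled numerics plus 8 one-hot columns, consistent with the 1,200 parameters the first Dense layer reports):

```python
import tensorflow as tf

# Fresh layer instances so the experiment starts from a new random
# initialization. The 11-feature input shape is an assumption based on
# the column transformer's output.
m5_fresh = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(11,)),
    tf.keras.layers.Dense(100),
    tf.keras.layers.Dense(10),
    tf.keras.layers.Dense(1),
])
m5_fresh.compile(loss=tf.keras.losses.mae,
                 optimizer=tf.keras.optimizers.Adam(learning_rate=0.01),
                 metrics=['mae'])
```

With fresh layers, any change in the final MAE can be attributed to the learning rate rather than to weights carried over from a previous fit.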
Review The Model
m5.summary()Model: "sequential_9"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
dense (Dense) (None, 100) 1200
dense_1 (Dense) (None, 10) 1010
dense_2 (Dense) (None, 1) 11
=================================================================
Total params: 2221 (8.68 KB)
Trainable params: 2221 (8.68 KB)
Non-trainable params: 0 (0.00 Byte)
_________________________________________________________________
m5.evaluate(normailized_feature_testing_data,label_testing_data)9/9 [==============================] - 0s 6ms/step - loss: 3223.4221 - mae: 3223.4221
[3223.422119140625, 3223.422119140625]
print(f'Training Label Median: {label_training_data.median()}')
print(f'Training Label Mean: {label_training_data.mean()}')
print(f'm5 MAE: {m5.get_metrics_result()["mae"].numpy()} vs m2 MAE: {m2.get_metrics_result()["mae"].numpy()}')Training Label Median: 9575.4421 Training Label Mean: 13346.089736364485 m5 MAE: 3509.501220703125 vs m2 MAE: 3163.11376953125
Setting the learning rate to 0.01, compared to the default 0.001, also made the outcome slightly worse here (m5's final MAE of about 3510 vs m2's 3163).
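Rather than changing the learning rate one run at a time, a small sweep can be scripted. This sketch trains the same architecture at each candidate rate on tiny synthetic data (a stand-in for the transformed insurance features; the shapes and values are assumptions for illustration only):

```python
import numpy as np
import tensorflow as tf

# Synthetic regression data: 64 rows of 11 features, target = row sum.
rng = np.random.default_rng(42)
X = rng.random((64, 11)).astype("float32")
y = X.sum(axis=1, keepdims=True)

results = {}
for lr in [0.001, 0.008, 0.01]:
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(11,)),
        tf.keras.layers.Dense(100),
        tf.keras.layers.Dense(10),
        tf.keras.layers.Dense(1),
    ])
    model.compile(loss=tf.keras.losses.mae,
                  optimizer=tf.keras.optimizers.Adam(learning_rate=lr),
                  metrics=['mae'])
    history = model.fit(X, y, epochs=5, verbose=0)
    # record the final training MAE for this learning rate
    results[lr] = float(history.history['mae'][-1])
```

Plotting `results` against the learning rate (or overlaying the `m4_history` and `m5_history` curves) makes it easier to see whether a given rate is too large for training to settle.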