AI for Mechanical Engineering: Heat Transfer


By Jongmok Lee
http://iailab.kaist.ac.kr/
Industrial AI Lab at KAIST

1. Regression Model for Heat Fins

1.1 Recap: Heat Fins

  • Fins are widely used to increase the rate of heat transfer from a wall
    • The selection of a suitable fin geometry requires a compromise among cost, weight, and other factors
    • The heat transfer effectiveness and efficiency are determined by the fin geometry
      • ex) $\eta_f = \frac{\tanh \sqrt{\bar{h}PL^2/kA}}{\sqrt{\bar{h}PL^2/kA}}$ for a fin of rectangular cross section (length $L$ and thickness $t$)
    • However, for complex geometries, an FDM calculation is required to obtain the heat transfer efficiency and effectiveness
      • Instead, build a regression model that predicts efficiency and effectiveness from the fin geometry
  • Train a regression model to predict heat transfer performance from fin parameters
  • Inputs: fin parameters:
    • Outer diameter of tube $(D_o)$
    • Fin spacing $(\delta)$
    • Fin thickness $(t)$
    • Thermal conductivity of the material $(k)$
    • Convective heat transfer coefficient $(h)$
  • Outputs: heat fin performances:
    • Efficiency $(\eta)$
    • Effectiveness $(\epsilon)$
  • In order to dimensionally balance the equation, the input variables are converted into non-dimensional forms, which also reduces the number of input variables
    • $\delta^* = \frac{\delta}{D_o}$
    • $t^* = \frac{t}{D_o}$
    • $M = \frac{h}{8k}L_{exp}$
  • $\eta = f_1(\delta^*, t^*, M)$
  • $\epsilon = f_2(\delta^*, t^*, M)$
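The conversion above can be sketched in a few lines of Python. The numerical values here are illustrative stand-ins (roughly matching the first dataset row), not prescribed by the text:

```python
# Hypothetical fin parameters (illustrative values, not from the dataset)
D_o = 0.03        # outer tube diameter [m]
delta = 0.003     # fin spacing [m]
t = 0.001         # fin thickness [m]
k = 16.7          # thermal conductivity [W/(m K)], e.g. stainless steel
h = 50.0          # convective heat transfer coefficient [W/(m^2 K)]
L_exp = 0.0752    # exposed fin length [m]

# Non-dimensional inputs used by the regression models
delta_star = delta / D_o      # delta* = delta / D_o
t_star = t / D_o              # t* = t / D_o
M = h * L_exp / (8 * k)       # M = (h / 8k) * L_exp

print(delta_star, t_star, M)
```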

We build the regression model with two methods:

  1. K-Nearest Neighbor (KNN) Regression
  2. MLP Regression Model

1.2 K-Nearest Neighbor (KNN) Regression

KNN is a non-parametric method: it makes no assumption about the functional form of $f$ and predicts directly from nearby training samples.

We write our model as


$$y = f(x) + \varepsilon$$


where $\varepsilon$ captures measurement errors and other discrepancies.

Then, with a good $f$, we can make predictions of $y$ at new points $x_{\text{new}}$. One possible approach, the so-called "nearest neighbor method", is:


$$\hat y = \text{avg} \left(y \mid x \in \mathcal{N}(x_{\text{new}}) \right)$$


where $\mathcal{N}(x)$ is some neighborhood of $x$.
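The averaging rule can be written directly in NumPy. This is a minimal sketch on synthetic 1-D data (not the fin dataset), with a brute-force neighbor search:

```python
import numpy as np

def knn_predict(X_train, y_train, x_new, k=3):
    """Predict y at x_new as the average of the k nearest training targets."""
    dists = np.linalg.norm(X_train - x_new, axis=1)  # Euclidean distance to every sample
    nearest = np.argsort(dists)[:k]                  # indices of the k nearest neighbors
    return y_train[nearest].mean()                   # average their targets

rng = np.random.default_rng(0)
X = rng.uniform(0, 2 * np.pi, size=(100, 1))
y = np.sin(X[:, 0]) + rng.normal(0, 0.1, size=100)   # y = f(x) + noise

print(knn_predict(X, y, np.array([np.pi / 2]), k=5))  # should be near sin(pi/2) = 1
```

`sklearn.neighbors.KNeighborsRegressor`, used below, implements the same idea with faster neighbor search structures.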

In [ ]:
import numpy as np
import matplotlib.pyplot as plt
from sklearn import neighbors
%matplotlib inline

Load Dataset

In [ ]:
import pandas as pd

df = pd.read_csv('./data/kNN_2.csv')
df.head()
Out[ ]:
   t*=t/do  del*=del/do  Lexposed    h  qcu/qo (efficiency copper)  qss/qo (efficiency steel)  Ecu (effectiveness copper)  Ess (effectiveness steel)
0   0.0333          0.1      75.2    5                    0.999750                   0.992662                    3.988393                   3.960116
1   0.0333          0.1      75.2   20                    0.999004                   0.971097                    3.985414                   3.874083
2   0.0333          0.1      75.2   50                    0.997511                   0.929920                    3.979461                   3.709814
3   0.0333          0.1      75.2  100                    0.995031                   0.866704                    3.969567                   3.457621
4   0.0333          0.1      75.2  200                    0.990097                   0.758150                    3.949881                   3.024554
In [ ]:
K_ss = 16.7

t_n = np.array(df['t*=t/do']).reshape(-1, 1)
del_n = np.array(df['del*=del/do']).reshape(-1, 1)
Mss = ((np.array(df['h']) * np.array(df['Lexposed'])) / (8 * K_ss)).reshape(-1, 1)

Xss = np.concatenate((t_n, del_n, Mss), 1)
In [ ]:
qss = np.array(df['qss/qo (efficiency steel)']).reshape(-1, 1)
Ess = np.array(df['Ess (effectiveness steel)']).reshape(-1, 1)

Yss = np.concatenate((qss, Ess), 1)
In [ ]:
Qss1 = neighbors.KNeighborsRegressor(n_neighbors=1)
Qss2 = neighbors.KNeighborsRegressor(n_neighbors=2)
Qss3 = neighbors.KNeighborsRegressor(n_neighbors=3)

indices = np.arange(Xss.shape[0])
np.random.seed(41)
np.random.shuffle(indices)

shuffled_Xss = Xss[indices]
shuffled_Yss = Yss[indices]

N_train = int(0.8*len(shuffled_Xss))
Qss1.fit(shuffled_Xss[:N_train], shuffled_Yss[:N_train, 0:1])
Qss2.fit(shuffled_Xss[:N_train], shuffled_Yss[:N_train, 0:1])
Qss3.fit(shuffled_Xss[:N_train], shuffled_Yss[:N_train, 0:1])

yp1 = Qss1.predict(shuffled_Xss)
yp2 = Qss2.predict(shuffled_Xss)
yp3 = Qss3.predict(shuffled_Xss)

t_ran = np.array(np.random.uniform(np.min(t_n), np.max(t_n))).reshape(-1, 1)
del_ran = np.array(np.random.uniform(np.min(del_n), np.max(del_n))).reshape(-1, 1)
M_ran = np.array(np.random.uniform(np.min(Mss), np.max(Mss))).reshape(-1, 1)
TEST_X = np.concatenate((t_ran, del_ran, M_ran), 1)
TEST_Y1 = Qss1.predict(TEST_X)
TEST_Y2 = Qss2.predict(TEST_X)
TEST_Y3 = Qss3.predict(TEST_X)

plt.figure(figsize=(15, 4), dpi=100)

plt.subplot(1, 3, 1)
plt.title('k-Nearest Neighbor Regression N=1')
plt.plot([0, 1.2], [0, 1.2], color='k', linestyle='--', alpha=0.7, zorder=0)
plt.scatter(yp1[:N_train], shuffled_Yss[:N_train, 0:1], color='gray', edgecolors='k', linewidths=0.5, alpha=0.7, label='Train')
plt.scatter(yp1[N_train:], shuffled_Yss[N_train:, 0:1], color='r', edgecolors='k', linewidths=0.5, alpha=0.7, label='Test')
plt.axis('square')
plt.xlim(0, 1.2)
plt.ylim(0, 1.2)
plt.xlabel('Efficiency $\\eta_{pred}$')
plt.ylabel('Efficiency $\\eta_{GT}$')
plt.legend()

plt.subplot(1, 3, 2)
plt.title('k-Nearest Neighbor Regression N=2')
plt.plot([0, 1.2], [0, 1.2], color='k', linestyle='--', alpha=0.7, zorder=0)
plt.scatter(yp2[:N_train], shuffled_Yss[:N_train, 0:1], color='gray', edgecolors='k', linewidths=0.5, alpha=0.7, label='Train')
plt.scatter(yp2[N_train:], shuffled_Yss[N_train:, 0:1], color='r', edgecolors='k', linewidths=0.5, alpha=0.7, label='Test')
plt.axis('square')
plt.xlim(0, 1.2)
plt.ylim(0, 1.2)
plt.xlabel('Efficiency $\\eta_{pred}$')
plt.ylabel('Efficiency $\\eta_{GT}$')
plt.legend()

plt.subplot(1, 3, 3)
plt.title('k-Nearest Neighbor Regression N=3')
plt.plot([0, 1.2], [0, 1.2], color='k', linestyle='--', alpha=0.7, zorder=0)
plt.scatter(yp3[:N_train], shuffled_Yss[:N_train, 0:1], color='gray', edgecolors='k', linewidths=0.5, alpha=0.7, label='Train')
plt.scatter(yp3[N_train:], shuffled_Yss[N_train:, 0:1], color='r', edgecolors='k', linewidths=0.5, alpha=0.7, label='Test')
plt.axis('square')
plt.xlim(0, 1.2)
plt.ylim(0, 1.2)
plt.xlabel('Efficiency $\\eta_{pred}$')
plt.ylabel('Efficiency $\\eta_{GT}$')
plt.legend()
plt.show()
[Figure: parity plots of predicted vs. ground-truth efficiency for k = 1, 2, 3]
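The parity plots can be complemented with numeric scores. A self-contained sketch using scikit-learn's metrics, with synthetic data standing in for `Xss`/`Yss` (in the notebook, the shuffled fin arrays would be used instead):

```python
import numpy as np
from sklearn import neighbors
from sklearn.metrics import mean_absolute_error, r2_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the 3-input fin dataset
rng = np.random.default_rng(41)
X = rng.uniform(0, 1, size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(0, 0.05, size=200)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=41)

# Score each neighborhood size on the held-out split
for k in (1, 2, 3):
    model = neighbors.KNeighborsRegressor(n_neighbors=k).fit(X_tr, y_tr)
    y_pred = model.predict(X_te)
    print(f"k={k}: MAE={mean_absolute_error(y_te, y_pred):.3f}, "
          f"R^2={r2_score(y_te, y_pred):.3f}")
```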
In [ ]:
Ess1 = neighbors.KNeighborsRegressor(n_neighbors=1)
Ess2 = neighbors.KNeighborsRegressor(n_neighbors=2)
Ess3 = neighbors.KNeighborsRegressor(n_neighbors=3)

Ess1.fit(shuffled_Xss[:N_train], shuffled_Yss[:N_train, 1:2])
Ess2.fit(shuffled_Xss[:N_train], shuffled_Yss[:N_train, 1:2])
Ess3.fit(shuffled_Xss[:N_train], shuffled_Yss[:N_train, 1:2])

yp1 = Ess1.predict(shuffled_Xss)
yp2 = Ess2.predict(shuffled_Xss)
yp3 = Ess3.predict(shuffled_Xss)

t_ran = np.array(np.random.uniform(np.min(t_n), np.max(t_n))).reshape(-1, 1)
del_ran = np.array(np.random.uniform(np.min(del_n), np.max(del_n))).reshape(-1, 1)
M_ran = np.array(np.random.uniform(np.min(Mss), np.max(Mss))).reshape(-1, 1)
TEST_X = np.concatenate((t_ran, del_ran, M_ran), 1)
TEST_Y1 = Ess1.predict(TEST_X)
TEST_Y2 = Ess2.predict(TEST_X)
TEST_Y3 = Ess3.predict(TEST_X)

plt.figure(figsize=(15, 4), dpi=100)

plt.subplot(1, 3, 1)
plt.title('k-Nearest Neighbor Regression N=1')
plt.plot([0, 110], [0, 110], color='k', linestyle='--', alpha=0.7, zorder=0)
plt.scatter(yp1[:N_train], shuffled_Yss[:N_train, 1:2], color='gray', edgecolors='k', linewidths=0.5, alpha=0.7, label='Train')
plt.scatter(yp1[N_train:], shuffled_Yss[N_train:, 1:2], color='r', edgecolors='k', linewidths=0.5, alpha=0.7, label='Test')
plt.axis('square')
plt.xlim(0, 110)
plt.ylim(0, 110)
plt.xlabel('Effectiveness $\\epsilon_{pred}$')
plt.ylabel('Effectiveness $\\epsilon_{GT}$')
plt.legend()

plt.subplot(1, 3, 2)
plt.title('k-Nearest Neighbor Regression N=2')
plt.plot([0, 110], [0, 110], color='k', linestyle='--', alpha=0.7, zorder=0)
plt.scatter(yp2[:N_train], shuffled_Yss[:N_train, 1:2], color='gray', edgecolors='k', linewidths=0.5, alpha=0.7, label='Train')
plt.scatter(yp2[N_train:], shuffled_Yss[N_train:, 1:2], color='r', edgecolors='k', linewidths=0.5, alpha=0.7, label='Test')
plt.axis('square')
plt.xlim(0, 110)
plt.ylim(0, 110)
plt.xlabel('Effectiveness $\\epsilon_{pred}$')
plt.ylabel('Effectiveness $\\epsilon_{GT}$')
plt.legend()

plt.subplot(1, 3, 3)
plt.title('k-Nearest Neighbor Regression N=3')
plt.plot([0, 110], [0, 110], color='k', linestyle='--', alpha=0.7, zorder=0)
plt.scatter(yp3[:N_train], shuffled_Yss[:N_train, 1:2], color='gray', edgecolors='k', linewidths=0.5, alpha=0.7, label='Train')
plt.scatter(yp3[N_train:], shuffled_Yss[N_train:, 1:2], color='r', edgecolors='k', linewidths=0.5, alpha=0.7, label='Test')
plt.axis('square')
plt.xlim(0, 110)
plt.ylim(0, 110)
plt.xlabel('Effectiveness $\\epsilon_{pred}$')
plt.ylabel('Effectiveness $\\epsilon_{GT}$')
plt.legend()
plt.show()
[Figure: parity plots of predicted vs. ground-truth effectiveness for k = 1, 2, 3]

1.3 MLP Regression Model

In [ ]:
import tensorflow as tf

model1 = tf.keras.models.Sequential([
    tf.keras.layers.Dense(128, input_dim=3, activation='relu'),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(32, activation='relu'),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1),
])

model1.summary()
Model: "sequential"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 dense (Dense)               (None, 128)               512       
                                                                 
 batch_normalization (Batch  (None, 128)               512       
 Normalization)                                                  
                                                                 
 dropout (Dropout)           (None, 128)               0         
                                                                 
 dense_1 (Dense)             (None, 64)                8256      
                                                                 
 batch_normalization_1 (Bat  (None, 64)                256       
 chNormalization)                                                
                                                                 
 dropout_1 (Dropout)         (None, 64)                0         
                                                                 
 dense_2 (Dense)             (None, 32)                2080      
                                                                 
 batch_normalization_2 (Bat  (None, 32)                128       
 chNormalization)                                                
                                                                 
 dropout_2 (Dropout)         (None, 32)                0         
                                                                 
 dense_3 (Dense)             (None, 1)                 33        
                                                                 
=================================================================
Total params: 11777 (46.00 KB)
Trainable params: 11329 (44.25 KB)
Non-trainable params: 448 (1.75 KB)
_________________________________________________________________
In [ ]:
from sklearn.preprocessing import StandardScaler

scaler = StandardScaler()
X_train = scaler.fit_transform(shuffled_Xss[:N_train])
X_test = scaler.transform(shuffled_Xss[N_train:])
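Standardization matters for the distance-based KNN model above as well: $M$ is orders of magnitude larger than $t^*$ and $\delta^*$, so it dominates the Euclidean distance unless the features are rescaled. One way to bundle the two steps (a sketch on synthetic data, not part of the original notebook) is scikit-learn's Pipeline:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)
# Three features on very different scales, as with (t*, delta*, M)
X = np.column_stack([rng.uniform(0, 0.05, 100),
                     rng.uniform(0, 0.5, 100),
                     rng.uniform(0, 500, 100)])
y = 20 * X[:, 0] + 2 * X[:, 1] + X[:, 2] / 500

# The pipeline standardizes each feature before the distance computation,
# so no single feature dominates the neighbor search
knn = make_pipeline(StandardScaler(), KNeighborsRegressor(n_neighbors=3))
knn.fit(X, y)
print(knn.predict(X[:1]))
```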
In [ ]:
optimizer1 = tf.keras.optimizers.Adam(learning_rate=0.001)

model1.compile(optimizer=optimizer1,
              loss='mse',
              metrics=['mae'])
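Training for a fixed 500 epochs with no validation split risks overfitting. One option, a sketch using Keras' built-in callback rather than anything in the original notebook, is to stop once the loss plateaus:

```python
import tensorflow as tf

# Stop once the monitored loss has not improved for `patience` epochs,
# and roll back to the best weights seen during training
early_stop = tf.keras.callbacks.EarlyStopping(
    monitor='loss', patience=50, restore_best_weights=True)

# Usage with the model and data defined above (hypothetical):
# model1.fit(X_train, y_train, epochs=500, batch_size=32, callbacks=[early_stop])
```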
In [ ]:
model1.fit(X_train, shuffled_Yss[:N_train, 0:1], epochs=500, batch_size=32)
Epoch 1/500
4/4 [==============================] - 1s 6ms/step - loss: 1.7499 - mae: 1.0454
Epoch 2/500
4/4 [==============================] - 0s 5ms/step - loss: 1.8406 - mae: 1.0837
Epoch 3/500
4/4 [==============================] - 0s 5ms/step - loss: 1.8541 - mae: 1.0965
Epoch 4/500
4/4 [==============================] - 0s 5ms/step - loss: 1.4948 - mae: 0.9703
Epoch 5/500
4/4 [==============================] - 0s 6ms/step - loss: 1.4409 - mae: 0.9853
Epoch 6/500
4/4 [==============================] - 0s 6ms/step - loss: 1.3315 - mae: 0.8993
Epoch 7/500
4/4 [==============================] - 0s 6ms/step - loss: 1.2816 - mae: 0.8567
Epoch 8/500
4/4 [==============================] - 0s 6ms/step - loss: 1.4477 - mae: 0.8908
Epoch 9/500
4/4 [==============================] - 0s 6ms/step - loss: 1.0502 - mae: 0.7742
Epoch 10/500
4/4 [==============================] - 0s 6ms/step - loss: 0.9241 - mae: 0.7550
Epoch 11/500
4/4 [==============================] - 0s 6ms/step - loss: 1.1188 - mae: 0.8611
Epoch 12/500
4/4 [==============================] - 0s 6ms/step - loss: 0.9871 - mae: 0.7499
Epoch 13/500
4/4 [==============================] - 0s 6ms/step - loss: 0.7646 - mae: 0.7017
Epoch 14/500
4/4 [==============================] - 0s 6ms/step - loss: 0.9530 - mae: 0.7080
Epoch 15/500
4/4 [==============================] - 0s 6ms/step - loss: 0.7312 - mae: 0.6505
Epoch 16/500
4/4 [==============================] - 0s 6ms/step - loss: 0.6340 - mae: 0.6080
Epoch 17/500
4/4 [==============================] - 0s 6ms/step - loss: 0.7442 - mae: 0.6568
Epoch 18/500
4/4 [==============================] - 0s 6ms/step - loss: 0.6573 - mae: 0.6443
Epoch 19/500
4/4 [==============================] - 0s 6ms/step - loss: 0.6033 - mae: 0.5980
Epoch 20/500
4/4 [==============================] - 0s 6ms/step - loss: 0.5230 - mae: 0.5658
Epoch 21/500
4/4 [==============================] - 0s 6ms/step - loss: 0.6955 - mae: 0.6110
Epoch 22/500
4/4 [==============================] - 0s 6ms/step - loss: 0.6590 - mae: 0.6258
Epoch 23/500
4/4 [==============================] - 0s 6ms/step - loss: 0.5574 - mae: 0.5452
Epoch 24/500
4/4 [==============================] - 0s 6ms/step - loss: 0.5941 - mae: 0.5740
Epoch 25/500
4/4 [==============================] - 0s 6ms/step - loss: 0.5938 - mae: 0.5945
Epoch 26/500
4/4 [==============================] - 0s 6ms/step - loss: 0.4162 - mae: 0.5139
Epoch 27/500
4/4 [==============================] - 0s 6ms/step - loss: 0.6953 - mae: 0.6507
Epoch 28/500
4/4 [==============================] - 0s 6ms/step - loss: 0.5848 - mae: 0.5766
Epoch 29/500
4/4 [==============================] - 0s 6ms/step - loss: 0.4057 - mae: 0.5012
Epoch 30/500
4/4 [==============================] - 0s 6ms/step - loss: 0.5417 - mae: 0.5844
Epoch 31/500
4/4 [==============================] - 0s 6ms/step - loss: 0.5298 - mae: 0.5593
Epoch 32/500
4/4 [==============================] - 0s 6ms/step - loss: 0.3116 - mae: 0.4452
Epoch 33/500
4/4 [==============================] - 0s 6ms/step - loss: 0.5978 - mae: 0.6233
Epoch 34/500
4/4 [==============================] - 0s 6ms/step - loss: 0.4640 - mae: 0.5254
Epoch 35/500
4/4 [==============================] - 0s 6ms/step - loss: 0.3557 - mae: 0.4450
Epoch 36/500
4/4 [==============================] - 0s 6ms/step - loss: 0.4505 - mae: 0.4903
Epoch 37/500
4/4 [==============================] - 0s 6ms/step - loss: 0.3465 - mae: 0.4461
Epoch 38/500
4/4 [==============================] - 0s 6ms/step - loss: 0.3853 - mae: 0.5030
Epoch 39/500
4/4 [==============================] - 0s 6ms/step - loss: 0.4600 - mae: 0.5047
Epoch 40/500
4/4 [==============================] - 0s 6ms/step - loss: 0.4523 - mae: 0.5285
Epoch 41/500
4/4 [==============================] - 0s 6ms/step - loss: 0.4394 - mae: 0.4896
Epoch 42/500
4/4 [==============================] - 0s 6ms/step - loss: 0.3817 - mae: 0.4568
Epoch 43/500
4/4 [==============================] - 0s 6ms/step - loss: 0.3255 - mae: 0.4473
Epoch 44/500
4/4 [==============================] - 0s 6ms/step - loss: 0.4544 - mae: 0.5434
Epoch 45/500
4/4 [==============================] - 0s 6ms/step - loss: 0.3422 - mae: 0.4489
Epoch 46/500
4/4 [==============================] - 0s 6ms/step - loss: 0.3419 - mae: 0.4610
Epoch 47/500
4/4 [==============================] - 0s 6ms/step - loss: 0.4303 - mae: 0.4894
Epoch 48/500
4/4 [==============================] - 0s 6ms/step - loss: 0.4512 - mae: 0.5041
Epoch 49/500
4/4 [==============================] - 0s 5ms/step - loss: 0.3867 - mae: 0.4721
Epoch 50/500
4/4 [==============================] - 0s 6ms/step - loss: 0.3264 - mae: 0.4556
Epoch 51/500
4/4 [==============================] - 0s 6ms/step - loss: 0.3564 - mae: 0.4434
Epoch 52/500
4/4 [==============================] - 0s 6ms/step - loss: 0.3072 - mae: 0.4291
Epoch 53/500
4/4 [==============================] - 0s 6ms/step - loss: 0.3560 - mae: 0.4575
Epoch 54/500
4/4 [==============================] - 0s 6ms/step - loss: 0.4356 - mae: 0.4473
Epoch 55/500
4/4 [==============================] - 0s 6ms/step - loss: 0.3662 - mae: 0.4820
Epoch 56/500
4/4 [==============================] - 0s 6ms/step - loss: 0.3443 - mae: 0.4591
Epoch 57/500
4/4 [==============================] - 0s 6ms/step - loss: 0.3622 - mae: 0.4485
Epoch 58/500
4/4 [==============================] - 0s 6ms/step - loss: 0.3572 - mae: 0.4490
Epoch 59/500
4/4 [==============================] - 0s 6ms/step - loss: 0.3672 - mae: 0.4630
Epoch 60/500
4/4 [==============================] - 0s 5ms/step - loss: 0.3317 - mae: 0.4140
Epoch 61/500
4/4 [==============================] - 0s 6ms/step - loss: 0.3334 - mae: 0.4471
Epoch 62/500
4/4 [==============================] - 0s 6ms/step - loss: 0.2571 - mae: 0.3959
Epoch 63/500
4/4 [==============================] - 0s 6ms/step - loss: 0.2180 - mae: 0.3726
Epoch 64/500
4/4 [==============================] - 0s 6ms/step - loss: 0.2408 - mae: 0.3963
Epoch 65/500
4/4 [==============================] - 0s 6ms/step - loss: 0.2774 - mae: 0.3975
Epoch 66/500
4/4 [==============================] - 0s 6ms/step - loss: 0.2313 - mae: 0.3532
Epoch 67/500
4/4 [==============================] - 0s 6ms/step - loss: 0.2788 - mae: 0.4046
Epoch 68/500
4/4 [==============================] - 0s 6ms/step - loss: 0.2677 - mae: 0.3879
Epoch 69/500
4/4 [==============================] - 0s 6ms/step - loss: 0.2872 - mae: 0.4129
Epoch 70/500
4/4 [==============================] - 0s 6ms/step - loss: 0.3290 - mae: 0.4257
Epoch 71/500
4/4 [==============================] - 0s 6ms/step - loss: 0.2388 - mae: 0.3817
Epoch 72/500
4/4 [==============================] - 0s 6ms/step - loss: 0.2763 - mae: 0.4056
Epoch 73/500
4/4 [==============================] - 0s 6ms/step - loss: 0.2579 - mae: 0.4038
Epoch 74/500
4/4 [==============================] - 0s 6ms/step - loss: 0.2729 - mae: 0.4030
Epoch 75/500
4/4 [==============================] - 0s 6ms/step - loss: 0.2706 - mae: 0.3782
Epoch 76/500
4/4 [==============================] - 0s 5ms/step - loss: 0.2653 - mae: 0.3915
Epoch 77/500
4/4 [==============================] - 0s 6ms/step - loss: 0.2968 - mae: 0.4066
Epoch 78/500
4/4 [==============================] - 0s 6ms/step - loss: 0.2633 - mae: 0.3930
Epoch 79/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1897 - mae: 0.3129
Epoch 80/500
4/4 [==============================] - 0s 6ms/step - loss: 0.2485 - mae: 0.3683
Epoch 81/500
4/4 [==============================] - 0s 6ms/step - loss: 0.2787 - mae: 0.3952
Epoch 82/500
4/4 [==============================] - 0s 5ms/step - loss: 0.2625 - mae: 0.3981
Epoch 83/500
4/4 [==============================] - 0s 6ms/step - loss: 0.2218 - mae: 0.3421
Epoch 84/500
4/4 [==============================] - 0s 5ms/step - loss: 0.2337 - mae: 0.3634
Epoch 85/500
4/4 [==============================] - 0s 6ms/step - loss: 0.3023 - mae: 0.4052
Epoch 86/500
4/4 [==============================] - 0s 6ms/step - loss: 0.2750 - mae: 0.3973
Epoch 87/500
4/4 [==============================] - 0s 6ms/step - loss: 0.2398 - mae: 0.3709
Epoch 88/500
4/4 [==============================] - 0s 6ms/step - loss: 0.2280 - mae: 0.3625
Epoch 89/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1953 - mae: 0.3464
Epoch 90/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1814 - mae: 0.3275
Epoch 91/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1937 - mae: 0.3358
Epoch 92/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1897 - mae: 0.3172
Epoch 93/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1761 - mae: 0.3336
Epoch 94/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1678 - mae: 0.3000
Epoch 95/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1894 - mae: 0.3391
Epoch 96/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1425 - mae: 0.2973
Epoch 97/500
4/4 [==============================] - 0s 5ms/step - loss: 0.2189 - mae: 0.3578
Epoch 98/500
4/4 [==============================] - 0s 5ms/step - loss: 0.2151 - mae: 0.3409
Epoch 99/500
4/4 [==============================] - 0s 7ms/step - loss: 0.2672 - mae: 0.3339
Epoch 100/500
4/4 [==============================] - 0s 5ms/step - loss: 0.2076 - mae: 0.3510
Epoch 101/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1791 - mae: 0.3283
Epoch 102/500
4/4 [==============================] - 0s 6ms/step - loss: 0.2056 - mae: 0.3447
Epoch 103/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1465 - mae: 0.2847
Epoch 104/500
4/4 [==============================] - 0s 5ms/step - loss: 0.1196 - mae: 0.2553
Epoch 105/500
4/4 [==============================] - 0s 5ms/step - loss: 0.1509 - mae: 0.2931
Epoch 106/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1947 - mae: 0.3397
Epoch 107/500
4/4 [==============================] - 0s 6ms/step - loss: 0.2063 - mae: 0.3535
Epoch 108/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1778 - mae: 0.3198
Epoch 109/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1749 - mae: 0.3135
Epoch 110/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1513 - mae: 0.3055
Epoch 111/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1283 - mae: 0.2694
Epoch 112/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1312 - mae: 0.2768
Epoch 113/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1545 - mae: 0.3016
Epoch 114/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1579 - mae: 0.3134
Epoch 115/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1265 - mae: 0.2743
Epoch 116/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1768 - mae: 0.3081
Epoch 117/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1555 - mae: 0.2859
Epoch 118/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1391 - mae: 0.2968
Epoch 119/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1556 - mae: 0.2851
Epoch 120/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1209 - mae: 0.2685
Epoch 121/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1889 - mae: 0.3222
Epoch 122/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1621 - mae: 0.2981
Epoch 123/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1222 - mae: 0.2522
Epoch 124/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1163 - mae: 0.2678
Epoch 125/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0972 - mae: 0.2580
Epoch 126/500
4/4 [==============================] - 0s 5ms/step - loss: 0.1215 - mae: 0.2711
Epoch 127/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1597 - mae: 0.3080
Epoch 128/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1306 - mae: 0.2555
Epoch 129/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1241 - mae: 0.2620
Epoch 130/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1672 - mae: 0.2931
Epoch 131/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1577 - mae: 0.3060
Epoch 132/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1254 - mae: 0.2611
Epoch 133/500
4/4 [==============================] - 0s 5ms/step - loss: 0.1185 - mae: 0.2425
Epoch 134/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1187 - mae: 0.2616
Epoch 135/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1279 - mae: 0.2596
Epoch 136/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1399 - mae: 0.2920
Epoch 137/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1209 - mae: 0.2716
Epoch 138/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1285 - mae: 0.2716
Epoch 139/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0912 - mae: 0.2271
Epoch 140/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1033 - mae: 0.2581
Epoch 141/500
4/4 [==============================] - 0s 5ms/step - loss: 0.1109 - mae: 0.2495
Epoch 142/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0995 - mae: 0.2453
Epoch 143/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1242 - mae: 0.2830
Epoch 144/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1356 - mae: 0.2723
Epoch 145/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1167 - mae: 0.2460
Epoch 146/500
4/4 [==============================] - 0s 5ms/step - loss: 0.1120 - mae: 0.2435
Epoch 147/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1391 - mae: 0.2759
Epoch 148/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1038 - mae: 0.2480
Epoch 149/500
4/4 [==============================] - 0s 5ms/step - loss: 0.0720 - mae: 0.2118
Epoch 150/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0866 - mae: 0.2348
Epoch 151/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1300 - mae: 0.2748
Epoch 152/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1447 - mae: 0.2941
Epoch 153/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0995 - mae: 0.2488
Epoch 154/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0841 - mae: 0.2295
Epoch 155/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1144 - mae: 0.2563
Epoch 156/500
4/4 [==============================] - 0s 5ms/step - loss: 0.1263 - mae: 0.2544
Epoch 157/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0730 - mae: 0.2133
Epoch 158/500
4/4 [==============================] - 0s 5ms/step - loss: 0.1129 - mae: 0.2577
Epoch 159/500
4/4 [==============================] - 0s 5ms/step - loss: 0.1013 - mae: 0.2365
Epoch 160/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1269 - mae: 0.2559
Epoch 161/500
4/4 [==============================] - 0s 5ms/step - loss: 0.0797 - mae: 0.2217
Epoch 162/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1099 - mae: 0.2592
Epoch 163/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0998 - mae: 0.2469
Epoch 164/500
4/4 [==============================] - 0s 6ms/step - loss: 0.1165 - mae: 0.2473
Epoch 165/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0781 - mae: 0.2233
Epoch 166/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0989 - mae: 0.2410
Epoch 167/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0944 - mae: 0.2079
Epoch 168/500
4/4 [==============================] - 0s 5ms/step - loss: 0.1058 - mae: 0.2443
Epoch 169/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0904 - mae: 0.2145
Epoch 170/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0947 - mae: 0.2412
Epoch 171/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0758 - mae: 0.2255
Epoch 172/500
4/4 [==============================] - 0s 5ms/step - loss: 0.0771 - mae: 0.2051
Epoch 173/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0927 - mae: 0.2076
Epoch 174/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0637 - mae: 0.1934
Epoch 175/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0813 - mae: 0.2177
Epoch 176/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0685 - mae: 0.2048
Epoch 177/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0954 - mae: 0.2250
Epoch 178/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0885 - mae: 0.2264
Epoch 179/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0843 - mae: 0.2180
Epoch 180/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0677 - mae: 0.2011
Epoch 181/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0889 - mae: 0.2317
Epoch 182/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0665 - mae: 0.2014
Epoch 183/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0926 - mae: 0.2401
Epoch 184/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0959 - mae: 0.2204
Epoch 185/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0658 - mae: 0.2037
Epoch 186/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0988 - mae: 0.2198
Epoch 187/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0661 - mae: 0.2022
...
Epoch 498/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0111 - mae: 0.0827
Epoch 499/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0123 - mae: 0.0855
Epoch 500/500
4/4 [==============================] - 0s 6ms/step - loss: 0.0139 - mae: 0.0981
Out[ ]:
<keras.src.callbacks.History at 0x7f644e700f40>
In [ ]:
loss, mae = model1.evaluate(X_test, shuffled_Yss[N_train:, 0:1])
print(f'Test Loss: {loss}')
print(f'Test MAE: {mae}')
1/1 [==============================] - 0s 124ms/step - loss: 0.0045 - mae: 0.0458
Test Loss: 0.004540332593023777
Test MAE: 0.045848116278648376
In [ ]:
import tensorflow as tf

model2 = tf.keras.models.Sequential([
    tf.keras.layers.Dense(128, input_dim=3, activation='relu'),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(32, activation='relu'),
    tf.keras.layers.BatchNormalization(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1),
])

model2.summary()
Model: "sequential_1"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 dense_4 (Dense)             (None, 128)               512       
                                                                 
 batch_normalization_3 (Bat  (None, 128)               512       
 chNormalization)                                                
                                                                 
 dropout_3 (Dropout)         (None, 128)               0         
                                                                 
 dense_5 (Dense)             (None, 64)                8256      
                                                                 
 batch_normalization_4 (Bat  (None, 64)                256       
 chNormalization)                                                
                                                                 
 dropout_4 (Dropout)         (None, 64)                0         
                                                                 
 dense_6 (Dense)             (None, 32)                2080      
                                                                 
 batch_normalization_5 (Bat  (None, 32)                128       
 chNormalization)                                                
                                                                 
 dropout_5 (Dropout)         (None, 32)                0         
                                                                 
 dense_7 (Dense)             (None, 1)                 33        
                                                                 
=================================================================
Total params: 11777 (46.00 KB)
Trainable params: 11329 (44.25 KB)
Non-trainable params: 448 (1.75 KB)
_________________________________________________________________
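The parameter counts in the summary above can be checked by hand: a `Dense` layer with $n_{in}$ inputs and $n_{out}$ units has $(n_{in}+1)\,n_{out}$ parameters (weights plus biases), and a `BatchNormalization` layer has 4 per feature (gamma and beta are trainable; the moving mean and variance are not). A quick sketch of that bookkeeping:

```python
def dense_params(n_in, n_out):
    # weight matrix plus one bias per output unit
    return (n_in + 1) * n_out

def batchnorm_params(n_features):
    # gamma, beta (trainable) + moving mean, moving variance (non-trainable)
    return 4 * n_features

total = (dense_params(3, 128) + batchnorm_params(128)
         + dense_params(128, 64) + batchnorm_params(64)
         + dense_params(64, 32) + batchnorm_params(32)
         + dense_params(32, 1))          # matches Total params: 11777
non_trainable = 2 * (128 + 64 + 32)      # moving statistics: 448
trainable = total - non_trainable        # 11329
```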
In [ ]:
optimizer2 = tf.keras.optimizers.Adam(learning_rate=0.001)

model2.compile(optimizer=optimizer2,
              loss='mse',
              metrics=['mae'])
In [ ]:
model2.fit(X_train, shuffled_Yss[:N_train, 1:2], epochs=500, batch_size=32)
Epoch 1/500
4/4 [==============================] - 1s 4ms/step - loss: 539.6943 - mae: 16.0383
Epoch 2/500
4/4 [==============================] - 0s 4ms/step - loss: 528.8967 - mae: 16.1070
Epoch 3/500
4/4 [==============================] - 0s 4ms/step - loss: 503.4987 - mae: 15.9074
Epoch 4/500
4/4 [==============================] - 0s 5ms/step - loss: 490.6570 - mae: 15.8174
...
Epoch 33/500
4/4 [==============================] - 0s 6ms/step - loss: 325.2192 - mae: 14.5766
Epoch 34/500
4/4 [==============================] - 0s 6ms/step - loss: 320.4267 - mae: 14.4999
Epoch 35/500
4/4 [==============================] - 0s 6ms/step - loss: 322.8243 - mae: 14.5559
Epoch 36/500
4/4 [==============================] - 0s 6ms/step - loss: 291.6515 - mae: 14.2358
Epoch 37/500
4/4 [==============================] - 0s 6ms/step - loss: 301.5997 - mae: 14.2350
Epoch 38/500
4/4 [==============================] - 0s 6ms/step - loss: 307.9030 - mae: 14.2587
Epoch 39/500
4/4 [==============================] - 0s 6ms/step - loss: 305.5736 - mae: 14.0851
Epoch 40/500
4/4 [==============================] - 0s 5ms/step - loss: 296.9793 - mae: 14.2041
Epoch 41/500
4/4 [==============================] - 0s 6ms/step - loss: 290.2323 - mae: 13.9274
Epoch 42/500
4/4 [==============================] - 0s 6ms/step - loss: 296.6378 - mae: 14.0950
Epoch 43/500
4/4 [==============================] - 0s 6ms/step - loss: 276.2781 - mae: 14.0122
Epoch 44/500
4/4 [==============================] - 0s 6ms/step - loss: 289.0485 - mae: 13.7929
Epoch 45/500
4/4 [==============================] - 0s 6ms/step - loss: 269.4658 - mae: 13.5446
Epoch 46/500
4/4 [==============================] - 0s 5ms/step - loss: 268.0823 - mae: 13.5073
Epoch 47/500
4/4 [==============================] - 0s 6ms/step - loss: 255.0426 - mae: 13.5125
Epoch 48/500
4/4 [==============================] - 0s 6ms/step - loss: 269.8663 - mae: 13.4780
Epoch 49/500
4/4 [==============================] - 0s 6ms/step - loss: 250.0198 - mae: 13.4433
Epoch 50/500
4/4 [==============================] - 0s 6ms/step - loss: 257.9628 - mae: 13.2721
Epoch 51/500
4/4 [==============================] - 0s 6ms/step - loss: 237.7803 - mae: 13.0484
Epoch 52/500
4/4 [==============================] - 0s 6ms/step - loss: 247.1532 - mae: 13.1567
Epoch 53/500
4/4 [==============================] - 0s 6ms/step - loss: 240.0630 - mae: 13.0234
Epoch 54/500
4/4 [==============================] - 0s 6ms/step - loss: 215.0988 - mae: 12.7731
Epoch 55/500
4/4 [==============================] - 0s 6ms/step - loss: 247.6716 - mae: 12.8722
Epoch 56/500
4/4 [==============================] - 0s 5ms/step - loss: 241.5380 - mae: 12.8177
Epoch 57/500
4/4 [==============================] - 0s 6ms/step - loss: 230.5543 - mae: 12.7552
Epoch 58/500
4/4 [==============================] - 0s 6ms/step - loss: 240.0356 - mae: 13.0957
Epoch 59/500
4/4 [==============================] - 0s 6ms/step - loss: 222.3376 - mae: 12.5311
Epoch 60/500
4/4 [==============================] - 0s 6ms/step - loss: 223.9448 - mae: 12.4422
Epoch 61/500
4/4 [==============================] - 0s 6ms/step - loss: 236.9307 - mae: 12.8020
Epoch 62/500
4/4 [==============================] - 0s 7ms/step - loss: 236.0645 - mae: 12.4721
Epoch 63/500
4/4 [==============================] - 0s 6ms/step - loss: 184.7414 - mae: 12.0934
Epoch 64/500
4/4 [==============================] - 0s 6ms/step - loss: 185.4185 - mae: 11.8434
Epoch 65/500
4/4 [==============================] - 0s 5ms/step - loss: 207.1114 - mae: 12.2243
Epoch 66/500
4/4 [==============================] - 0s 6ms/step - loss: 181.8259 - mae: 11.8552
Epoch 67/500
4/4 [==============================] - 0s 6ms/step - loss: 173.0158 - mae: 11.4929
Epoch 68/500
4/4 [==============================] - 0s 6ms/step - loss: 197.1354 - mae: 12.0688
Epoch 69/500
4/4 [==============================] - 0s 6ms/step - loss: 175.0496 - mae: 11.5595
Epoch 70/500
4/4 [==============================] - 0s 6ms/step - loss: 210.5212 - mae: 12.0148
Epoch 71/500
4/4 [==============================] - 0s 6ms/step - loss: 198.9063 - mae: 11.4540
Epoch 72/500
4/4 [==============================] - 0s 6ms/step - loss: 161.5691 - mae: 11.3226
Epoch 73/500
4/4 [==============================] - 0s 6ms/step - loss: 176.4852 - mae: 11.4098
Epoch 74/500
4/4 [==============================] - 0s 6ms/step - loss: 161.3790 - mae: 11.0477
Epoch 75/500
4/4 [==============================] - 0s 6ms/step - loss: 150.8101 - mae: 10.9053
Epoch 76/500
4/4 [==============================] - 0s 6ms/step - loss: 161.4291 - mae: 11.0695
Epoch 77/500
4/4 [==============================] - 0s 6ms/step - loss: 165.8595 - mae: 11.0027
Epoch 78/500
4/4 [==============================] - 0s 6ms/step - loss: 160.0551 - mae: 11.0754
Epoch 79/500
4/4 [==============================] - 0s 6ms/step - loss: 157.7857 - mae: 10.5412
Epoch 80/500
4/4 [==============================] - 0s 6ms/step - loss: 162.2207 - mae: 10.6805
Epoch 81/500
4/4 [==============================] - 0s 6ms/step - loss: 174.7155 - mae: 11.2674
Epoch 82/500
4/4 [==============================] - 0s 5ms/step - loss: 144.3675 - mae: 10.6136
Epoch 83/500
4/4 [==============================] - 0s 6ms/step - loss: 163.2703 - mae: 11.0506
Epoch 84/500
4/4 [==============================] - 0s 6ms/step - loss: 160.0020 - mae: 10.7598
Epoch 85/500
4/4 [==============================] - 0s 6ms/step - loss: 144.3890 - mae: 10.4391
Epoch 86/500
4/4 [==============================] - 0s 6ms/step - loss: 137.8372 - mae: 10.2918
Epoch 87/500
4/4 [==============================] - 0s 6ms/step - loss: 144.5735 - mae: 10.4414
Epoch 88/500
4/4 [==============================] - 0s 6ms/step - loss: 150.8476 - mae: 10.1132
Epoch 89/500
4/4 [==============================] - 0s 6ms/step - loss: 123.1898 - mae: 9.7458
Epoch 90/500
4/4 [==============================] - 0s 6ms/step - loss: 142.4670 - mae: 10.1846
Epoch 91/500
4/4 [==============================] - 0s 6ms/step - loss: 151.7268 - mae: 10.2246
Epoch 92/500
4/4 [==============================] - 0s 6ms/step - loss: 128.8014 - mae: 9.7484
Epoch 93/500
4/4 [==============================] - 0s 6ms/step - loss: 127.2529 - mae: 10.0452
Epoch 94/500
4/4 [==============================] - 0s 6ms/step - loss: 137.5780 - mae: 10.1813
Epoch 95/500
4/4 [==============================] - 0s 6ms/step - loss: 126.1237 - mae: 9.7128
Epoch 96/500
4/4 [==============================] - 0s 6ms/step - loss: 137.1892 - mae: 9.5595
Epoch 97/500
4/4 [==============================] - 0s 6ms/step - loss: 113.0862 - mae: 9.2431
Epoch 98/500
4/4 [==============================] - 0s 6ms/step - loss: 111.8663 - mae: 9.2047
Epoch 99/500
4/4 [==============================] - 0s 6ms/step - loss: 102.1416 - mae: 8.6517
Epoch 100/500
4/4 [==============================] - 0s 6ms/step - loss: 139.9275 - mae: 9.9257
Epoch 101/500
4/4 [==============================] - 0s 6ms/step - loss: 115.1056 - mae: 8.8039
Epoch 102/500
4/4 [==============================] - 0s 5ms/step - loss: 104.2575 - mae: 8.9541
Epoch 103/500
4/4 [==============================] - 0s 5ms/step - loss: 112.3389 - mae: 8.8492
Epoch 104/500
4/4 [==============================] - 0s 6ms/step - loss: 105.9338 - mae: 8.6843
Epoch 105/500
4/4 [==============================] - 0s 6ms/step - loss: 96.2359 - mae: 8.3368
Epoch 106/500
4/4 [==============================] - 0s 6ms/step - loss: 86.4007 - mae: 8.0611
Epoch 107/500
4/4 [==============================] - 0s 6ms/step - loss: 128.2188 - mae: 8.8977
Epoch 108/500
4/4 [==============================] - 0s 6ms/step - loss: 86.1560 - mae: 7.8823
Epoch 109/500
4/4 [==============================] - 0s 5ms/step - loss: 107.4825 - mae: 8.7203
Epoch 110/500
4/4 [==============================] - 0s 6ms/step - loss: 92.6106 - mae: 8.3219
Epoch 111/500
4/4 [==============================] - 0s 6ms/step - loss: 103.4611 - mae: 8.2173
Epoch 112/500
4/4 [==============================] - 0s 6ms/step - loss: 104.5134 - mae: 8.0683
Epoch 113/500
4/4 [==============================] - 0s 6ms/step - loss: 98.6641 - mae: 7.8928
Epoch 114/500
4/4 [==============================] - 0s 6ms/step - loss: 111.8448 - mae: 8.5122
Epoch 115/500
4/4 [==============================] - 0s 6ms/step - loss: 97.2150 - mae: 8.3037
Epoch 116/500
4/4 [==============================] - 0s 5ms/step - loss: 78.9092 - mae: 7.4329
Epoch 117/500
4/4 [==============================] - 0s 6ms/step - loss: 64.5845 - mae: 7.0362
Epoch 118/500
4/4 [==============================] - 0s 6ms/step - loss: 89.3062 - mae: 7.9021
Epoch 119/500
4/4 [==============================] - 0s 6ms/step - loss: 89.5765 - mae: 7.4467
Epoch 120/500
4/4 [==============================] - 0s 6ms/step - loss: 101.0507 - mae: 8.0798
Epoch 121/500
4/4 [==============================] - 0s 6ms/step - loss: 100.4480 - mae: 7.9583
Epoch 122/500
4/4 [==============================] - 0s 6ms/step - loss: 82.7756 - mae: 7.3165
Epoch 123/500
4/4 [==============================] - 0s 6ms/step - loss: 69.6360 - mae: 6.8388
Epoch 124/500
4/4 [==============================] - 0s 6ms/step - loss: 58.9438 - mae: 6.4197
Epoch 125/500
4/4 [==============================] - 0s 6ms/step - loss: 117.1808 - mae: 8.0119
Epoch 126/500
4/4 [==============================] - 0s 6ms/step - loss: 68.4792 - mae: 6.8660
Epoch 127/500
4/4 [==============================] - 0s 6ms/step - loss: 54.8743 - mae: 6.1721
Epoch 128/500
4/4 [==============================] - 0s 6ms/step - loss: 55.8024 - mae: 6.2500
Epoch 129/500
4/4 [==============================] - 0s 6ms/step - loss: 76.0872 - mae: 6.5585
Epoch 130/500
4/4 [==============================] - 0s 6ms/step - loss: 68.6124 - mae: 6.8090
Epoch 131/500
4/4 [==============================] - 0s 6ms/step - loss: 61.1254 - mae: 6.1609
Epoch 132/500
4/4 [==============================] - 0s 6ms/step - loss: 70.1069 - mae: 6.7899
Epoch 133/500
4/4 [==============================] - 0s 6ms/step - loss: 52.1290 - mae: 6.1850
Epoch 134/500
4/4 [==============================] - 0s 6ms/step - loss: 54.7681 - mae: 5.9452
Epoch 135/500
4/4 [==============================] - 0s 6ms/step - loss: 74.9229 - mae: 6.2890
Epoch 136/500
4/4 [==============================] - 0s 6ms/step - loss: 74.6575 - mae: 6.4252
Epoch 137/500
4/4 [==============================] - 0s 5ms/step - loss: 73.3184 - mae: 6.1034
Epoch 138/500
4/4 [==============================] - 0s 6ms/step - loss: 71.5111 - mae: 6.1531
Epoch 139/500
4/4 [==============================] - 0s 6ms/step - loss: 59.1074 - mae: 5.8910
Epoch 140/500
4/4 [==============================] - 0s 6ms/step - loss: 46.2861 - mae: 5.3587
Epoch 141/500
4/4 [==============================] - 0s 6ms/step - loss: 57.7486 - mae: 5.9396
Epoch 142/500
4/4 [==============================] - 0s 6ms/step - loss: 44.1350 - mae: 5.2030
Epoch 143/500
4/4 [==============================] - 0s 6ms/step - loss: 39.6904 - mae: 5.0673
Epoch 144/500
4/4 [==============================] - 0s 6ms/step - loss: 38.0117 - mae: 5.1024
Epoch 145/500
4/4 [==============================] - 0s 6ms/step - loss: 50.0175 - mae: 5.6402
Epoch 146/500
4/4 [==============================] - 0s 6ms/step - loss: 51.2651 - mae: 5.7327
Epoch 147/500
4/4 [==============================] - 0s 6ms/step - loss: 49.0812 - mae: 5.2552
Epoch 148/500
4/4 [==============================] - 0s 6ms/step - loss: 45.9573 - mae: 5.4933
Epoch 149/500
4/4 [==============================] - 0s 6ms/step - loss: 57.0839 - mae: 5.5387
Epoch 150/500
4/4 [==============================] - 0s 6ms/step - loss: 43.0226 - mae: 4.9800
Epoch 151/500
4/4 [==============================] - 0s 6ms/step - loss: 59.5302 - mae: 5.3399
Epoch 152/500
4/4 [==============================] - 0s 6ms/step - loss: 62.4378 - mae: 5.8678
Epoch 153/500
4/4 [==============================] - 0s 6ms/step - loss: 40.2835 - mae: 5.0604
Epoch 154/500
4/4 [==============================] - 0s 6ms/step - loss: 33.4532 - mae: 4.9346
Epoch 155/500
4/4 [==============================] - 0s 6ms/step - loss: 36.3671 - mae: 4.6780
Epoch 156/500
4/4 [==============================] - 0s 5ms/step - loss: 75.8468 - mae: 6.0706
Epoch 157/500
4/4 [==============================] - 0s 6ms/step - loss: 34.8454 - mae: 4.5484
Epoch 158/500
4/4 [==============================] - 0s 6ms/step - loss: 49.9295 - mae: 4.7368
Epoch 159/500
4/4 [==============================] - 0s 6ms/step - loss: 60.0172 - mae: 5.6285
Epoch 160/500
4/4 [==============================] - 0s 6ms/step - loss: 64.4597 - mae: 5.6794
Epoch 161/500
4/4 [==============================] - 0s 6ms/step - loss: 44.2167 - mae: 4.9881
Epoch 162/500
4/4 [==============================] - 0s 6ms/step - loss: 45.1132 - mae: 4.8173
Epoch 163/500
4/4 [==============================] - 0s 6ms/step - loss: 46.4372 - mae: 5.3138
Epoch 164/500
4/4 [==============================] - 0s 6ms/step - loss: 27.3319 - mae: 4.0454
Epoch 165/500
4/4 [==============================] - 0s 6ms/step - loss: 34.8908 - mae: 4.2882
Epoch 166/500
4/4 [==============================] - 0s 6ms/step - loss: 37.3453 - mae: 4.5141
Epoch 167/500
4/4 [==============================] - 0s 6ms/step - loss: 50.2704 - mae: 5.1061
Epoch 168/500
4/4 [==============================] - 0s 6ms/step - loss: 48.6756 - mae: 4.9842
Epoch 169/500
4/4 [==============================] - 0s 6ms/step - loss: 31.3732 - mae: 4.3242
Epoch 170/500
4/4 [==============================] - 0s 6ms/step - loss: 41.6431 - mae: 4.2086
Epoch 171/500
4/4 [==============================] - 0s 6ms/step - loss: 59.9776 - mae: 5.3185
Epoch 172/500
4/4 [==============================] - 0s 6ms/step - loss: 44.9705 - mae: 4.5656
Epoch 173/500
4/4 [==============================] - 0s 6ms/step - loss: 36.9476 - mae: 4.3845
Epoch 174/500
4/4 [==============================] - 0s 6ms/step - loss: 45.6798 - mae: 4.6432
Epoch 175/500
4/4 [==============================] - 0s 5ms/step - loss: 48.8051 - mae: 4.9512
Epoch 176/500
4/4 [==============================] - 0s 6ms/step - loss: 33.1819 - mae: 4.2962
Epoch 177/500
4/4 [==============================] - 0s 6ms/step - loss: 24.1226 - mae: 3.7425
Epoch 178/500
4/4 [==============================] - 0s 6ms/step - loss: 32.9355 - mae: 4.3205
Epoch 179/500
4/4 [==============================] - 0s 6ms/step - loss: 36.8326 - mae: 4.3470
Epoch 180/500
4/4 [==============================] - 0s 6ms/step - loss: 27.8709 - mae: 4.0927
Epoch 181/500
4/4 [==============================] - 0s 6ms/step - loss: 36.5884 - mae: 4.5089
Epoch 182/500
4/4 [==============================] - 0s 5ms/step - loss: 36.0983 - mae: 4.3890
Epoch 183/500
4/4 [==============================] - 0s 6ms/step - loss: 34.0209 - mae: 4.0347
Epoch 184/500
4/4 [==============================] - 0s 6ms/step - loss: 32.0474 - mae: 3.9129
Epoch 185/500
4/4 [==============================] - 0s 6ms/step - loss: 44.7703 - mae: 4.8097
Epoch 186/500
4/4 [==============================] - 0s 6ms/step - loss: 48.8862 - mae: 4.5499
Epoch 187/500
4/4 [==============================] - 0s 6ms/step - loss: 40.9628 - mae: 4.3437
Epoch 188/500
4/4 [==============================] - 0s 6ms/step - loss: 21.3806 - mae: 3.7047
Epoch 189/500
4/4 [==============================] - 0s 6ms/step - loss: 22.0993 - mae: 3.6566
Epoch 190/500
4/4 [==============================] - 0s 6ms/step - loss: 46.4138 - mae: 4.3153
Epoch 191/500
4/4 [==============================] - 0s 6ms/step - loss: 21.3502 - mae: 3.4795
Epoch 192/500
4/4 [==============================] - 0s 6ms/step - loss: 30.6049 - mae: 4.1673
Epoch 193/500
4/4 [==============================] - 0s 6ms/step - loss: 41.6931 - mae: 4.5299
Epoch 194/500
4/4 [==============================] - 0s 7ms/step - loss: 25.1721 - mae: 3.6618
Epoch 195/500
4/4 [==============================] - 0s 6ms/step - loss: 20.0080 - mae: 3.5897
Epoch 196/500
4/4 [==============================] - 0s 5ms/step - loss: 43.0305 - mae: 4.5230
Epoch 197/500
4/4 [==============================] - 0s 6ms/step - loss: 21.9804 - mae: 3.5568
Epoch 198/500
4/4 [==============================] - 0s 6ms/step - loss: 38.7173 - mae: 4.1253
Epoch 199/500
4/4 [==============================] - 0s 6ms/step - loss: 38.6067 - mae: 4.1331
Epoch 200/500
4/4 [==============================] - 0s 6ms/step - loss: 26.9375 - mae: 3.8453
Epoch 201/500
4/4 [==============================] - 0s 5ms/step - loss: 46.8090 - mae: 4.7537
Epoch 202/500
4/4 [==============================] - 0s 6ms/step - loss: 31.0786 - mae: 3.7383
Epoch 203/500
4/4 [==============================] - 0s 6ms/step - loss: 30.0444 - mae: 3.7334
Epoch 204/500
4/4 [==============================] - 0s 6ms/step - loss: 24.8411 - mae: 3.5964
Epoch 205/500
4/4 [==============================] - 0s 6ms/step - loss: 49.3446 - mae: 4.8696
Epoch 206/500
4/4 [==============================] - 0s 6ms/step - loss: 39.4470 - mae: 4.2574
Epoch 207/500
4/4 [==============================] - 0s 6ms/step - loss: 32.9775 - mae: 4.3596
Epoch 208/500
4/4 [==============================] - 0s 6ms/step - loss: 20.2213 - mae: 3.4915
Epoch 209/500
4/4 [==============================] - 0s 6ms/step - loss: 26.8458 - mae: 3.6862
Epoch 210/500
4/4 [==============================] - 0s 6ms/step - loss: 33.2493 - mae: 3.9150
Epoch 211/500
4/4 [==============================] - 0s 6ms/step - loss: 23.7726 - mae: 3.3629
Epoch 212/500
4/4 [==============================] - 0s 6ms/step - loss: 47.8056 - mae: 4.6450
Epoch 213/500
4/4 [==============================] - 0s 6ms/step - loss: 26.0019 - mae: 3.8220
Epoch 214/500
4/4 [==============================] - 0s 6ms/step - loss: 38.5157 - mae: 4.0815
Epoch 215/500
4/4 [==============================] - 0s 6ms/step - loss: 41.1888 - mae: 4.3716
Epoch 216/500
4/4 [==============================] - 0s 6ms/step - loss: 38.4723 - mae: 4.5208
Epoch 217/500
4/4 [==============================] - 0s 6ms/step - loss: 36.2635 - mae: 4.4635
Epoch 218/500
4/4 [==============================] - 0s 6ms/step - loss: 25.0274 - mae: 3.8361
Epoch 219/500
4/4 [==============================] - 0s 6ms/step - loss: 47.5033 - mae: 4.6010
Epoch 220/500
4/4 [==============================] - 0s 6ms/step - loss: 31.1422 - mae: 4.0102
Epoch 221/500
4/4 [==============================] - 0s 6ms/step - loss: 28.6972 - mae: 3.7894
Epoch 222/500
4/4 [==============================] - 0s 6ms/step - loss: 39.9855 - mae: 4.1769
Epoch 223/500
4/4 [==============================] - 0s 5ms/step - loss: 34.3042 - mae: 4.0040
Epoch 224/500
4/4 [==============================] - 0s 6ms/step - loss: 37.5507 - mae: 4.2692
Epoch 225/500
4/4 [==============================] - 0s 6ms/step - loss: 30.8925 - mae: 3.7349
Epoch 226/500
4/4 [==============================] - 0s 6ms/step - loss: 18.0249 - mae: 3.2132
Epoch 227/500
4/4 [==============================] - 0s 6ms/step - loss: 40.4844 - mae: 3.7968
Epoch 228/500
4/4 [==============================] - 0s 6ms/step - loss: 40.3771 - mae: 4.5072
Epoch 229/500
4/4 [==============================] - 0s 6ms/step - loss: 30.0968 - mae: 3.8431
Epoch 230/500
4/4 [==============================] - 0s 6ms/step - loss: 42.3233 - mae: 4.7353
Epoch 231/500
4/4 [==============================] - 0s 6ms/step - loss: 28.4360 - mae: 3.8519
Epoch 232/500
4/4 [==============================] - 0s 6ms/step - loss: 34.3875 - mae: 4.0888
Epoch 233/500
4/4 [==============================] - 0s 6ms/step - loss: 37.7024 - mae: 4.0950
Epoch 234/500
4/4 [==============================] - 0s 6ms/step - loss: 22.9286 - mae: 3.4326
Epoch 235/500
4/4 [==============================] - 0s 6ms/step - loss: 26.7449 - mae: 3.4633
Epoch 236/500
4/4 [==============================] - 0s 6ms/step - loss: 31.1048 - mae: 4.0539
Epoch 237/500
4/4 [==============================] - 0s 6ms/step - loss: 33.0777 - mae: 3.4634
Epoch 238/500
4/4 [==============================] - 0s 6ms/step - loss: 30.5168 - mae: 3.9464
Epoch 239/500
4/4 [==============================] - 0s 6ms/step - loss: 21.4810 - mae: 3.0972
Epoch 240/500
4/4 [==============================] - 0s 6ms/step - loss: 15.0609 - mae: 2.9593
Epoch 241/500
4/4 [==============================] - 0s 6ms/step - loss: 31.3856 - mae: 3.7829
Epoch 242/500
4/4 [==============================] - 0s 6ms/step - loss: 21.3775 - mae: 3.5590
Epoch 243/500
4/4 [==============================] - 0s 6ms/step - loss: 25.6744 - mae: 3.4303
Epoch 244/500
4/4 [==============================] - 0s 6ms/step - loss: 30.5030 - mae: 3.8190
Epoch 245/500
4/4 [==============================] - 0s 6ms/step - loss: 29.9013 - mae: 3.5312
Epoch 246/500
4/4 [==============================] - 0s 6ms/step - loss: 20.6063 - mae: 3.3660
Epoch 247/500
4/4 [==============================] - 0s 6ms/step - loss: 22.5190 - mae: 3.5678
Epoch 248/500
4/4 [==============================] - 0s 6ms/step - loss: 31.3560 - mae: 3.6144
Epoch 249/500
4/4 [==============================] - 0s 5ms/step - loss: 31.6825 - mae: 4.0291
Epoch 250/500
4/4 [==============================] - 0s 6ms/step - loss: 23.7424 - mae: 3.4188
Epoch 251/500
4/4 [==============================] - 0s 6ms/step - loss: 19.6214 - mae: 3.0059
Epoch 252/500
4/4 [==============================] - 0s 6ms/step - loss: 19.8888 - mae: 3.2944
Epoch 253/500
4/4 [==============================] - 0s 6ms/step - loss: 37.3220 - mae: 4.1602
Epoch 254/500
4/4 [==============================] - 0s 6ms/step - loss: 32.3233 - mae: 4.0243
Epoch 255/500
4/4 [==============================] - 0s 6ms/step - loss: 20.8347 - mae: 3.5466
Epoch 256/500
4/4 [==============================] - 0s 6ms/step - loss: 30.6868 - mae: 3.6567
Epoch 257/500
4/4 [==============================] - 0s 6ms/step - loss: 31.9644 - mae: 4.0783
Epoch 258/500
4/4 [==============================] - 0s 6ms/step - loss: 20.9694 - mae: 3.4249
Epoch 259/500
4/4 [==============================] - 0s 7ms/step - loss: 19.4017 - mae: 3.1249
Epoch 260/500
4/4 [==============================] - 0s 6ms/step - loss: 31.8503 - mae: 4.1086
Epoch 261/500
4/4 [==============================] - 0s 6ms/step - loss: 19.5529 - mae: 3.1000
Epoch 262/500
4/4 [==============================] - 0s 6ms/step - loss: 26.6096 - mae: 3.4337
Epoch 263/500
4/4 [==============================] - 0s 6ms/step - loss: 18.6491 - mae: 3.1304
Epoch 264/500
4/4 [==============================] - 0s 6ms/step - loss: 22.2213 - mae: 3.4135
Epoch 265/500
4/4 [==============================] - 0s 6ms/step - loss: 33.0001 - mae: 3.9198
Epoch 266/500
4/4 [==============================] - 0s 6ms/step - loss: 31.8908 - mae: 3.8457
Epoch 267/500
4/4 [==============================] - 0s 6ms/step - loss: 27.8484 - mae: 3.6859
Epoch 268/500
4/4 [==============================] - 0s 5ms/step - loss: 43.1163 - mae: 4.5514
Epoch 269/500
4/4 [==============================] - 0s 6ms/step - loss: 30.7570 - mae: 3.9377
Epoch 270/500
4/4 [==============================] - 0s 6ms/step - loss: 32.1290 - mae: 3.8682
Epoch 271/500
4/4 [==============================] - 0s 6ms/step - loss: 26.7361 - mae: 3.3599
Epoch 272/500
4/4 [==============================] - 0s 6ms/step - loss: 32.1024 - mae: 4.1389
Epoch 273/500
4/4 [==============================] - 0s 6ms/step - loss: 35.8682 - mae: 3.7434
Epoch 274/500
4/4 [==============================] - 0s 6ms/step - loss: 30.0251 - mae: 3.8348
Epoch 275/500
4/4 [==============================] - 0s 6ms/step - loss: 34.5154 - mae: 4.1358
Epoch 276/500
4/4 [==============================] - 0s 6ms/step - loss: 20.6709 - mae: 3.4136
Epoch 277/500
4/4 [==============================] - 0s 6ms/step - loss: 39.8952 - mae: 4.4389
Epoch 278/500
4/4 [==============================] - 0s 6ms/step - loss: 27.7413 - mae: 3.5123
Epoch 279/500
4/4 [==============================] - 0s 6ms/step - loss: 33.4760 - mae: 4.2259
Epoch 280/500
4/4 [==============================] - 0s 6ms/step - loss: 51.0723 - mae: 5.1822
Epoch 281/500
4/4 [==============================] - 0s 6ms/step - loss: 33.1154 - mae: 3.5329
Epoch 282/500
4/4 [==============================] - 0s 6ms/step - loss: 36.9030 - mae: 4.3162
Epoch 283/500
4/4 [==============================] - 0s 6ms/step - loss: 22.0994 - mae: 3.6235
Epoch 284/500
4/4 [==============================] - 0s 6ms/step - loss: 21.3047 - mae: 3.1437
Epoch 285/500
4/4 [==============================] - 0s 6ms/step - loss: 25.4834 - mae: 3.7000
Epoch 286/500
4/4 [==============================] - 0s 6ms/step - loss: 23.1195 - mae: 3.3684
Epoch 287/500
4/4 [==============================] - 0s 6ms/step - loss: 24.1213 - mae: 3.6636
Epoch 288/500
4/4 [==============================] - 0s 5ms/step - loss: 40.2743 - mae: 4.2514
Epoch 289/500
4/4 [==============================] - 0s 6ms/step - loss: 38.5831 - mae: 4.3343
Epoch 290/500
4/4 [==============================] - 0s 6ms/step - loss: 53.5001 - mae: 4.9764
Epoch 291/500
4/4 [==============================] - 0s 6ms/step - loss: 41.6643 - mae: 4.8738
Epoch 292/500
4/4 [==============================] - 0s 6ms/step - loss: 11.5749 - mae: 2.5377
Epoch 293/500
4/4 [==============================] - 0s 6ms/step - loss: 31.5873 - mae: 4.0921
Epoch 294/500
4/4 [==============================] - 0s 6ms/step - loss: 53.1486 - mae: 5.2949
Epoch 295/500
4/4 [==============================] - 0s 6ms/step - loss: 25.8080 - mae: 3.8625
Epoch 296/500
4/4 [==============================] - 0s 6ms/step - loss: 24.5844 - mae: 3.6971
Epoch 297/500
4/4 [==============================] - 0s 6ms/step - loss: 25.5502 - mae: 3.3397
Epoch 298/500
4/4 [==============================] - 0s 6ms/step - loss: 23.3092 - mae: 3.3602
Epoch 299/500
4/4 [==============================] - 0s 6ms/step - loss: 40.0240 - mae: 3.9332
Epoch 300/500
4/4 [==============================] - 0s 6ms/step - loss: 24.1683 - mae: 3.6591
Epoch 301/500
4/4 [==============================] - 0s 6ms/step - loss: 21.3809 - mae: 3.3504
Epoch 302/500
4/4 [==============================] - 0s 6ms/step - loss: 18.3009 - mae: 2.9268
Epoch 303/500
4/4 [==============================] - 0s 6ms/step - loss: 21.7172 - mae: 3.1025
Epoch 304/500
4/4 [==============================] - 0s 6ms/step - loss: 22.5371 - mae: 3.5091
Epoch 305/500
4/4 [==============================] - 0s 6ms/step - loss: 25.9466 - mae: 3.9259
Epoch 306/500
4/4 [==============================] - 0s 5ms/step - loss: 31.4336 - mae: 3.7362
Epoch 307/500
4/4 [==============================] - 0s 6ms/step - loss: 24.1014 - mae: 3.7011
Epoch 308/500
4/4 [==============================] - 0s 6ms/step - loss: 21.4741 - mae: 3.2216
Epoch 309/500
4/4 [==============================] - 0s 6ms/step - loss: 25.9157 - mae: 3.4701
Epoch 310/500
4/4 [==============================] - 0s 6ms/step - loss: 26.5361 - mae: 3.9041
Epoch 311/500
4/4 [==============================] - 0s 6ms/step - loss: 22.5634 - mae: 3.3039
Epoch 312/500
4/4 [==============================] - 0s 6ms/step - loss: 22.5094 - mae: 3.5465
Epoch 313/500
4/4 [==============================] - 0s 6ms/step - loss: 33.3635 - mae: 4.0793
Epoch 314/500
4/4 [==============================] - 0s 6ms/step - loss: 20.1830 - mae: 3.0155
Epoch 315/500
4/4 [==============================] - 0s 6ms/step - loss: 21.1471 - mae: 3.4220
Epoch 316/500
4/4 [==============================] - 0s 5ms/step - loss: 35.1138 - mae: 4.2397
Epoch 317/500
4/4 [==============================] - 0s 6ms/step - loss: 41.4604 - mae: 4.1157
Epoch 318/500
4/4 [==============================] - 0s 6ms/step - loss: 26.9419 - mae: 3.9916
Epoch 319/500
4/4 [==============================] - 0s 5ms/step - loss: 25.3514 - mae: 3.8893
Epoch 320/500
4/4 [==============================] - 0s 6ms/step - loss: 32.8368 - mae: 3.7165
Epoch 321/500
4/4 [==============================] - 0s 6ms/step - loss: 41.5192 - mae: 4.3010
Epoch 322/500
4/4 [==============================] - 0s 6ms/step - loss: 31.1606 - mae: 4.1746
Epoch 323/500
4/4 [==============================] - 0s 6ms/step - loss: 22.3682 - mae: 3.7417
Epoch 324/500
4/4 [==============================] - 0s 6ms/step - loss: 20.5166 - mae: 3.6054
Epoch 325/500
4/4 [==============================] - 0s 6ms/step - loss: 32.6095 - mae: 3.5561
Epoch 326/500
4/4 [==============================] - 0s 6ms/step - loss: 31.7152 - mae: 4.1899
Epoch 327/500
4/4 [==============================] - 0s 6ms/step - loss: 36.9512 - mae: 3.9106
Epoch 328/500
4/4 [==============================] - 0s 6ms/step - loss: 50.1711 - mae: 4.5521
...
Epoch 500/500
4/4 [==============================] - 0s 6ms/step - loss: 16.1016 - mae: 3.1496
Out[ ]:
<keras.src.callbacks.History at 0x7f64331ddbb0>
In [ ]:
loss2, mae2 = model2.evaluate(X_test, shuffled_Yss[N_train:, 1:2])
print(f'Test Loss: {loss2}')
print(f'Test MAE: {mae2}')
1/1 [==============================] - 0s 104ms/step - loss: 22.3948 - mae: 2.6088
Test Loss: 22.39481544494629
Test MAE: 2.6087722778320312
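For reference, the `mae` metric reported by `evaluate` is the plain mean absolute error over the test set, in the same units as the target. A minimal NumPy version (illustrative only, not part of the notebook's pipeline):

```python
import numpy as np

def mean_absolute_error(y_true, y_pred):
    """MAE: average of |y_true - y_pred|, in the units of y."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.mean(np.abs(y_true - y_pred)))

print(mean_absolute_error([1.0, 2.0, 3.0], [1.5, 2.0, 2.0]))  # → 0.5
```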
In [ ]:
yp1 = model1.predict(np.concatenate((X_train, X_test), 0))
yp2 = model2.predict(np.concatenate((X_train, X_test), 0))

plt.figure(figsize=(10, 4), dpi=100)

plt.subplot(1, 2, 1)
plt.title('MLP Regression of Efficiency')
plt.plot([0, 1.2], [0, 1.2], color='k', linestyle='--', alpha=0.7, zorder=0)
plt.scatter(yp1[:N_train, 0:1], shuffled_Yss[:N_train, 0:1], color='gray', edgecolors='k', linewidths=0.5, alpha=0.7, label='Train')
plt.scatter(yp1[N_train:, 0:1], shuffled_Yss[N_train:, 0:1], color='r', edgecolors='k', linewidths=0.5, alpha=0.7, label='Test')
plt.text(0.6, 0.05, 'Test MAE: {:.4f}'.format(mae), fontsize=12)
plt.axis('square')
plt.xlim(0, 1.2)
plt.ylim(0, 1.2)
plt.xlabel('Efficiency $\\eta_{pred}$')
plt.ylabel('Efficiency $\\eta_{GT}$')
plt.legend()

plt.subplot(1, 2, 2)
plt.title('MLP Regression of Effectiveness')
plt.plot([0, 120], [0, 120], color='k', linestyle='--', alpha=0.7, zorder=0)
plt.scatter(yp2[:N_train, 0:1], shuffled_Yss[:N_train, 1:2], color='gray', edgecolors='k', linewidths=0.5, alpha=0.7, label='Train')
plt.scatter(yp2[N_train:, 0:1], shuffled_Yss[N_train:, 1:2], color='r', edgecolors='k', linewidths=0.5, alpha=0.7, label='Test')
plt.text(60, 5, 'Test MAE: {:.4f}'.format(mae2), fontsize=12)
plt.axis('square')
plt.xlim(0, 120)
plt.ylim(0, 120)
plt.xlabel('Effectiveness $\\epsilon_{pred}$')
plt.ylabel('Effectiveness $\\epsilon_{GT}$')
plt.legend()
plt.show()
5/5 [==============================] - 0s 3ms/step
5/5 [==============================] - 0s 2ms/step
(Figure: parity plots of predicted vs. ground-truth efficiency and effectiveness, with train/test points and the test MAE annotated.)

2. CNN-based Regression Model

2.1 Recap: Effective Thermal Conductivity

  • Effective thermal conductivity $(k_{eff})$ characterizes heat transfer in heterogeneous materials such as composites or porous media
  • These materials are widely used as wall insulation
  • In thermodynamics class, we learned how to predict the effective thermal conductivity of materials as a function of temperature
  • In porous materials, $k_{eff}$ varies with the distribution and size of the pores
  • However, running a simulation every time the structure of the porous material changes is very time-consuming
  • Therefore, in this problem, we implement a CNN-based regression model that predicts the effective thermal conductivity from the structure of the porous material
  • This model immediately computes the effective thermal conductivity of materials with arbitrary structures
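As a toy illustration of the kind of input such a model consumes, a binary porous-structure image can be generated with NumPy. This is a hypothetical sketch (the actual dataset is loaded from `.npy` files below); the function name and pore-placement scheme are our own:

```python
import numpy as np

def random_pore_map(size=128, n_pores=40, max_radius=10, seed=0):
    """Generate a binary image: 1 = solid matrix, 0 = pore (circular pores)."""
    rng = np.random.default_rng(seed)
    img = np.ones((size, size), dtype=np.float32)
    yy, xx = np.mgrid[0:size, 0:size]
    for _ in range(n_pores):
        cy, cx = rng.integers(0, size, 2)       # pore center
        r = rng.integers(2, max_radius)          # pore radius
        img[(yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2] = 0.0
    return img

img = random_pore_map()
porosity = 1.0 - img.mean()   # fraction of pore pixels
```

A higher porosity generally lowers $k_{eff}$, but the same porosity can yield different $k_{eff}$ depending on how the pores are arranged, which is exactly why an image-based (CNN) model is used rather than a scalar correlation.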
In [ ]:
import tensorflow as tf
import keras
from tensorflow.keras import layers, models
import numpy as np
import matplotlib.pyplot as plt
from PIL import Image
import os
import csv

Load Dataset

In [ ]:
train_x = np.load('./data/Eff_train_x.npy')
train_y = np.load('./data/Eff_train_y.npy')
test_x = np.load('./data/Eff_test_x.npy')
test_y = np.load('./data/Eff_test_y.npy')

N_train = len(train_x)
In [ ]:
plt.figure(figsize=(4, 4))
plt.imshow(train_x[3], cmap='gray')
plt.show()
(Figure: a sample 128×128 porous-structure image from the training set, shown in grayscale.)

Build Model

In [ ]:
model = tf.keras.models.Sequential([
    tf.keras.layers.Conv2D(filters=8, kernel_size=(3, 3), activation='relu',
                           input_shape=(128, 128, 1)),
    tf.keras.layers.MaxPooling2D((2, 2)),

    tf.keras.layers.Conv2D(filters=16, kernel_size=(3, 3), activation='relu'),
    tf.keras.layers.MaxPooling2D((2, 2)),

    tf.keras.layers.Conv2D(filters=32, kernel_size=(3, 3), activation='relu'),
    tf.keras.layers.MaxPooling2D((2, 2)),

    tf.keras.layers.Conv2D(filters=64, kernel_size=(3, 3), activation='relu'),
    tf.keras.layers.MaxPooling2D((2, 2)),

    tf.keras.layers.Conv2D(filters=64, kernel_size=(3, 3), activation='relu'),
    tf.keras.layers.MaxPooling2D((2, 2)),

    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(1024, activation='relu'),
    tf.keras.layers.Dense(64, activation='relu'),
    tf.keras.layers.Dense(1)
])

model.summary()
Model: "sequential_3"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 conv2d_5 (Conv2D)           (None, 126, 126, 8)       80        
                                                                 
 max_pooling2d_5 (MaxPoolin  (None, 63, 63, 8)         0         
 g2D)                                                            
                                                                 
 conv2d_6 (Conv2D)           (None, 61, 61, 16)        1168      
                                                                 
 max_pooling2d_6 (MaxPoolin  (None, 30, 30, 16)        0         
 g2D)                                                            
                                                                 
 conv2d_7 (Conv2D)           (None, 28, 28, 32)        4640      
                                                                 
 max_pooling2d_7 (MaxPoolin  (None, 14, 14, 32)        0         
 g2D)                                                            
                                                                 
 conv2d_8 (Conv2D)           (None, 12, 12, 64)        18496     
                                                                 
 max_pooling2d_8 (MaxPoolin  (None, 6, 6, 64)          0         
 g2D)                                                            
                                                                 
 conv2d_9 (Conv2D)           (None, 4, 4, 64)          36928     
                                                                 
 max_pooling2d_9 (MaxPoolin  (None, 2, 2, 64)          0         
 g2D)                                                            
                                                                 
 flatten_1 (Flatten)         (None, 256)               0         
                                                                 
 dense_11 (Dense)            (None, 1024)              263168    
                                                                 
 dense_12 (Dense)            (None, 64)                65600     
                                                                 
 dense_13 (Dense)            (None, 1)                 65        
                                                                 
=================================================================
Total params: 390145 (1.49 MB)
Trainable params: 390145 (1.49 MB)
Non-trainable params: 0 (0.00 Byte)
_________________________________________________________________
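The output shapes in the summary follow from a valid (unpadded) 3×3 convolution, which shrinks each spatial dimension by 2, followed by 2×2 max-pooling, which floor-divides it by 2. A quick arithmetic check of the five blocks:

```python
def conv_pool_sizes(size, n_blocks=5):
    """Spatial size after each (valid 3x3 conv, 2x2 max-pool) block."""
    sizes = []
    for _ in range(n_blocks):
        size = size - 2      # 3x3 conv, no padding
        size = size // 2     # 2x2 max-pooling
        sizes.append(size)
    return sizes

print(conv_pool_sizes(128))  # → [63, 30, 14, 6, 2]
print(2 * 2 * 64)            # flattened features before the Dense layers → 256
```

These match the `(None, 63, 63, 8)` through `(None, 2, 2, 64)` shapes and the `(None, 256)` flatten output above.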
In [ ]:
from tensorflow.keras import optimizers

optimizer = optimizers.Adam(learning_rate=0.001)

model.compile(optimizer=optimizer,
              loss='mean_absolute_percentage_error',
              metrics=['mean_absolute_percentage_error'])
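Unlike the plain MAE used for the fin models, the mean absolute percentage error is scale-free: it reports the average relative error in percent, $100 \cdot \text{mean}(|y - \hat y| / |y|)$, which suits $k_{eff}$ values that span a range of magnitudes. A hand computation consistent with that definition (the small `eps` guarding against division by zero is our own addition):

```python
import numpy as np

def mape(y_true, y_pred, eps=1e-7):
    """Mean absolute percentage error, in percent."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return 100.0 * float(np.mean(np.abs(y_true - y_pred)
                                 / np.maximum(np.abs(y_true), eps)))

# |2-1|/2 = 0.5 and |4-5|/4 = 0.25, mean 0.375
print(mape([2.0, 4.0], [1.0, 5.0]))  # → 37.5
```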
In [ ]:
model.fit(train_x, train_y, batch_size=50, epochs=200)
Epoch 1/200
160/160 [==============================] - 4s 17ms/step - loss: 65.7674 - mean_absolute_percentage_error: 65.7674
Epoch 2/200
160/160 [==============================] - 3s 17ms/step - loss: 55.0613 - mean_absolute_percentage_error: 55.0613
...
Epoch 91/200
160/160 [==============================] - 3s 17ms/step - loss: 3.6733 - mean_absolute_percentage_error: 3.6733
Epoch 92/200
160/160 [==============================] - 3s 17ms/step - loss: 3.9559 - mean_absolute_percentage_error: 3.9559
Epoch 93/200
160/160 [==============================] - 3s 17ms/step - loss: 3.8102 - mean_absolute_percentage_error: 3.8102
Epoch 94/200
160/160 [==============================] - 3s 17ms/step - loss: 3.6528 - mean_absolute_percentage_error: 3.6528
Epoch 95/200
160/160 [==============================] - 3s 17ms/step - loss: 3.6576 - mean_absolute_percentage_error: 3.6576
Epoch 96/200
160/160 [==============================] - 3s 17ms/step - loss: 3.4927 - mean_absolute_percentage_error: 3.4927
Epoch 97/200
160/160 [==============================] - 3s 17ms/step - loss: 3.4784 - mean_absolute_percentage_error: 3.4784
Epoch 98/200
160/160 [==============================] - 3s 17ms/step - loss: 3.4220 - mean_absolute_percentage_error: 3.4220
Epoch 99/200
160/160 [==============================] - 3s 17ms/step - loss: 3.6074 - mean_absolute_percentage_error: 3.6074
Epoch 100/200
160/160 [==============================] - 3s 17ms/step - loss: 3.4375 - mean_absolute_percentage_error: 3.4375
Epoch 101/200
160/160 [==============================] - 3s 17ms/step - loss: 3.4718 - mean_absolute_percentage_error: 3.4718
Epoch 102/200
160/160 [==============================] - 3s 17ms/step - loss: 3.3194 - mean_absolute_percentage_error: 3.3194
Epoch 103/200
160/160 [==============================] - 3s 17ms/step - loss: 3.3167 - mean_absolute_percentage_error: 3.3167
Epoch 104/200
160/160 [==============================] - 3s 17ms/step - loss: 3.4278 - mean_absolute_percentage_error: 3.4278
Epoch 105/200
160/160 [==============================] - 3s 17ms/step - loss: 3.5191 - mean_absolute_percentage_error: 3.5191
Epoch 106/200
160/160 [==============================] - 3s 17ms/step - loss: 3.5997 - mean_absolute_percentage_error: 3.5997
Epoch 107/200
160/160 [==============================] - 3s 17ms/step - loss: 3.4740 - mean_absolute_percentage_error: 3.4740
Epoch 108/200
160/160 [==============================] - 3s 17ms/step - loss: 3.6039 - mean_absolute_percentage_error: 3.6039
Epoch 109/200
160/160 [==============================] - 3s 17ms/step - loss: 3.4731 - mean_absolute_percentage_error: 3.4731
Epoch 110/200
160/160 [==============================] - 3s 17ms/step - loss: 3.4594 - mean_absolute_percentage_error: 3.4594
Epoch 111/200
160/160 [==============================] - 3s 17ms/step - loss: 3.2639 - mean_absolute_percentage_error: 3.2639
Epoch 112/200
160/160 [==============================] - 3s 17ms/step - loss: 3.2684 - mean_absolute_percentage_error: 3.2684
Epoch 113/200
160/160 [==============================] - 3s 17ms/step - loss: 3.0331 - mean_absolute_percentage_error: 3.0331
Epoch 114/200
160/160 [==============================] - 3s 17ms/step - loss: 3.0858 - mean_absolute_percentage_error: 3.0858
Epoch 115/200
160/160 [==============================] - 3s 17ms/step - loss: 3.3886 - mean_absolute_percentage_error: 3.3886
Epoch 116/200
160/160 [==============================] - 3s 17ms/step - loss: 3.2561 - mean_absolute_percentage_error: 3.2561
Epoch 117/200
160/160 [==============================] - 3s 17ms/step - loss: 3.1349 - mean_absolute_percentage_error: 3.1349
Epoch 118/200
160/160 [==============================] - 3s 17ms/step - loss: 3.1439 - mean_absolute_percentage_error: 3.1439
Epoch 119/200
160/160 [==============================] - 3s 17ms/step - loss: 3.1508 - mean_absolute_percentage_error: 3.1508
Epoch 120/200
160/160 [==============================] - 3s 17ms/step - loss: 3.0333 - mean_absolute_percentage_error: 3.0333
Epoch 121/200
160/160 [==============================] - 3s 17ms/step - loss: 3.1872 - mean_absolute_percentage_error: 3.1872
Epoch 122/200
160/160 [==============================] - 3s 17ms/step - loss: 3.1948 - mean_absolute_percentage_error: 3.1948
Epoch 123/200
160/160 [==============================] - 3s 17ms/step - loss: 3.1305 - mean_absolute_percentage_error: 3.1305
Epoch 124/200
160/160 [==============================] - 3s 17ms/step - loss: 3.0879 - mean_absolute_percentage_error: 3.0879
Epoch 125/200
160/160 [==============================] - 3s 17ms/step - loss: 3.2698 - mean_absolute_percentage_error: 3.2698
Epoch 126/200
160/160 [==============================] - 3s 17ms/step - loss: 3.1628 - mean_absolute_percentage_error: 3.1628
Epoch 127/200
160/160 [==============================] - 3s 17ms/step - loss: 3.0192 - mean_absolute_percentage_error: 3.0192
Epoch 128/200
160/160 [==============================] - 3s 17ms/step - loss: 2.9778 - mean_absolute_percentage_error: 2.9778
Epoch 129/200
160/160 [==============================] - 3s 17ms/step - loss: 2.9792 - mean_absolute_percentage_error: 2.9792
Epoch 130/200
160/160 [==============================] - 3s 17ms/step - loss: 3.0539 - mean_absolute_percentage_error: 3.0539
Epoch 131/200
160/160 [==============================] - 3s 17ms/step - loss: 2.8295 - mean_absolute_percentage_error: 2.8295
Epoch 132/200
160/160 [==============================] - 3s 17ms/step - loss: 3.0851 - mean_absolute_percentage_error: 3.0851
Epoch 133/200
160/160 [==============================] - 3s 17ms/step - loss: 2.9507 - mean_absolute_percentage_error: 2.9507
Epoch 134/200
160/160 [==============================] - 3s 17ms/step - loss: 3.1773 - mean_absolute_percentage_error: 3.1773
Epoch 135/200
160/160 [==============================] - 3s 17ms/step - loss: 3.1595 - mean_absolute_percentage_error: 3.1595
Epoch 136/200
160/160 [==============================] - 3s 17ms/step - loss: 3.1207 - mean_absolute_percentage_error: 3.1207
Epoch 137/200
160/160 [==============================] - 3s 17ms/step - loss: 2.9870 - mean_absolute_percentage_error: 2.9870
Epoch 138/200
160/160 [==============================] - 3s 17ms/step - loss: 2.8698 - mean_absolute_percentage_error: 2.8698
Epoch 139/200
160/160 [==============================] - 3s 17ms/step - loss: 3.0120 - mean_absolute_percentage_error: 3.0120
Epoch 140/200
160/160 [==============================] - 3s 17ms/step - loss: 2.9440 - mean_absolute_percentage_error: 2.9440
Epoch 141/200
160/160 [==============================] - 3s 17ms/step - loss: 2.8079 - mean_absolute_percentage_error: 2.8079
Epoch 142/200
160/160 [==============================] - 3s 17ms/step - loss: 2.9128 - mean_absolute_percentage_error: 2.9128
Epoch 143/200
160/160 [==============================] - 3s 17ms/step - loss: 3.0506 - mean_absolute_percentage_error: 3.0506
Epoch 144/200
160/160 [==============================] - 3s 17ms/step - loss: 2.8171 - mean_absolute_percentage_error: 2.8171
Epoch 145/200
160/160 [==============================] - 3s 17ms/step - loss: 2.8074 - mean_absolute_percentage_error: 2.8074
Epoch 146/200
160/160 [==============================] - 3s 17ms/step - loss: 2.7563 - mean_absolute_percentage_error: 2.7563
Epoch 147/200
160/160 [==============================] - 3s 17ms/step - loss: 2.6980 - mean_absolute_percentage_error: 2.6980
Epoch 148/200
160/160 [==============================] - 3s 17ms/step - loss: 2.8555 - mean_absolute_percentage_error: 2.8555
Epoch 149/200
160/160 [==============================] - 3s 17ms/step - loss: 2.8338 - mean_absolute_percentage_error: 2.8338
Epoch 150/200
160/160 [==============================] - 3s 17ms/step - loss: 2.7947 - mean_absolute_percentage_error: 2.7947
Epoch 151/200
160/160 [==============================] - 3s 17ms/step - loss: 2.5825 - mean_absolute_percentage_error: 2.5825
Epoch 152/200
160/160 [==============================] - 3s 17ms/step - loss: 2.6011 - mean_absolute_percentage_error: 2.6011
Epoch 153/200
160/160 [==============================] - 3s 17ms/step - loss: 3.0274 - mean_absolute_percentage_error: 3.0274
Epoch 154/200
160/160 [==============================] - 3s 17ms/step - loss: 2.8632 - mean_absolute_percentage_error: 2.8632
Epoch 155/200
160/160 [==============================] - 3s 17ms/step - loss: 2.9638 - mean_absolute_percentage_error: 2.9638
Epoch 156/200
160/160 [==============================] - 3s 17ms/step - loss: 2.8506 - mean_absolute_percentage_error: 2.8506
Epoch 157/200
160/160 [==============================] - 3s 17ms/step - loss: 2.8752 - mean_absolute_percentage_error: 2.8752
Epoch 158/200
160/160 [==============================] - 3s 17ms/step - loss: 3.0242 - mean_absolute_percentage_error: 3.0242
Epoch 159/200
160/160 [==============================] - 3s 17ms/step - loss: 2.8739 - mean_absolute_percentage_error: 2.8739
Epoch 160/200
160/160 [==============================] - 3s 17ms/step - loss: 2.8667 - mean_absolute_percentage_error: 2.8667
Epoch 161/200
160/160 [==============================] - 3s 17ms/step - loss: 2.7414 - mean_absolute_percentage_error: 2.7414
Epoch 162/200
160/160 [==============================] - 3s 17ms/step - loss: 2.5166 - mean_absolute_percentage_error: 2.5166
Epoch 163/200
160/160 [==============================] - 3s 17ms/step - loss: 2.6019 - mean_absolute_percentage_error: 2.6019
Epoch 164/200
160/160 [==============================] - 3s 17ms/step - loss: 2.5993 - mean_absolute_percentage_error: 2.5993
Epoch 165/200
160/160 [==============================] - 3s 17ms/step - loss: 2.7449 - mean_absolute_percentage_error: 2.7449
Epoch 166/200
160/160 [==============================] - 3s 17ms/step - loss: 2.6823 - mean_absolute_percentage_error: 2.6823
Epoch 167/200
160/160 [==============================] - 3s 17ms/step - loss: 2.6750 - mean_absolute_percentage_error: 2.6750
Epoch 168/200
160/160 [==============================] - 3s 17ms/step - loss: 2.7655 - mean_absolute_percentage_error: 2.7655
Epoch 169/200
160/160 [==============================] - 3s 17ms/step - loss: 2.7438 - mean_absolute_percentage_error: 2.7438
Epoch 170/200
160/160 [==============================] - 3s 17ms/step - loss: 2.6603 - mean_absolute_percentage_error: 2.6603
Epoch 171/200
160/160 [==============================] - 3s 17ms/step - loss: 2.5392 - mean_absolute_percentage_error: 2.5392
Epoch 172/200
160/160 [==============================] - 3s 17ms/step - loss: 2.5557 - mean_absolute_percentage_error: 2.5557
Epoch 173/200
160/160 [==============================] - 3s 17ms/step - loss: 2.5426 - mean_absolute_percentage_error: 2.5426
Epoch 174/200
160/160 [==============================] - 3s 17ms/step - loss: 2.4769 - mean_absolute_percentage_error: 2.4769
Epoch 175/200
160/160 [==============================] - 3s 17ms/step - loss: 2.5404 - mean_absolute_percentage_error: 2.5404
Epoch 176/200
160/160 [==============================] - 3s 17ms/step - loss: 2.4470 - mean_absolute_percentage_error: 2.4470
Epoch 177/200
160/160 [==============================] - 3s 17ms/step - loss: 2.4986 - mean_absolute_percentage_error: 2.4986
Epoch 178/200
160/160 [==============================] - 3s 17ms/step - loss: 2.6750 - mean_absolute_percentage_error: 2.6750
Epoch 179/200
160/160 [==============================] - 3s 17ms/step - loss: 2.6751 - mean_absolute_percentage_error: 2.6751
Epoch 180/200
160/160 [==============================] - 3s 17ms/step - loss: 2.8756 - mean_absolute_percentage_error: 2.8756
Epoch 181/200
160/160 [==============================] - 3s 17ms/step - loss: 2.7387 - mean_absolute_percentage_error: 2.7387
Epoch 182/200
160/160 [==============================] - 3s 17ms/step - loss: 2.6272 - mean_absolute_percentage_error: 2.6272
Epoch 183/200
160/160 [==============================] - 3s 17ms/step - loss: 2.6384 - mean_absolute_percentage_error: 2.6384
Epoch 184/200
160/160 [==============================] - 3s 17ms/step - loss: 2.5430 - mean_absolute_percentage_error: 2.5430
Epoch 185/200
160/160 [==============================] - 3s 17ms/step - loss: 2.4738 - mean_absolute_percentage_error: 2.4738
Epoch 186/200
160/160 [==============================] - 3s 17ms/step - loss: 2.5254 - mean_absolute_percentage_error: 2.5254
Epoch 187/200
160/160 [==============================] - 3s 17ms/step - loss: 2.4463 - mean_absolute_percentage_error: 2.4463
Epoch 188/200
160/160 [==============================] - 3s 17ms/step - loss: 2.4207 - mean_absolute_percentage_error: 2.4207
Epoch 189/200
160/160 [==============================] - 3s 17ms/step - loss: 2.5306 - mean_absolute_percentage_error: 2.5306
Epoch 190/200
160/160 [==============================] - 3s 17ms/step - loss: 2.5286 - mean_absolute_percentage_error: 2.5286
Epoch 191/200
160/160 [==============================] - 3s 17ms/step - loss: 2.4654 - mean_absolute_percentage_error: 2.4654
Epoch 192/200
160/160 [==============================] - 3s 17ms/step - loss: 2.6012 - mean_absolute_percentage_error: 2.6012
Epoch 193/200
160/160 [==============================] - 3s 17ms/step - loss: 2.7919 - mean_absolute_percentage_error: 2.7919
Epoch 194/200
160/160 [==============================] - 3s 17ms/step - loss: 2.6460 - mean_absolute_percentage_error: 2.6460
Epoch 195/200
160/160 [==============================] - 3s 17ms/step - loss: 2.8089 - mean_absolute_percentage_error: 2.8089
Epoch 196/200
160/160 [==============================] - 3s 17ms/step - loss: 2.3011 - mean_absolute_percentage_error: 2.3011
Epoch 197/200
160/160 [==============================] - 3s 17ms/step - loss: 2.3335 - mean_absolute_percentage_error: 2.3335
Epoch 198/200
160/160 [==============================] - 3s 17ms/step - loss: 2.4590 - mean_absolute_percentage_error: 2.4590
Epoch 199/200
160/160 [==============================] - 3s 17ms/step - loss: 2.4384 - mean_absolute_percentage_error: 2.4384
Epoch 200/200
160/160 [==============================] - 3s 17ms/step - loss: 2.5089 - mean_absolute_percentage_error: 2.5089
Out[ ]:
<keras.src.callbacks.History at 0x7f6395491f10>
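The returned `History` object stores the per-epoch loss, which gives a quick convergence check. A minimal sketch (the loss values here are hard-coded stand-ins, since the `fit` call above was not assigned to a variable; in practice use `history = model.fit(...)` and `history.history['loss']`):

```python
import matplotlib
matplotlib.use('Agg')            # headless backend, safe for scripts
import matplotlib.pyplot as plt

# Stand-in values; in practice: losses = history.history['loss']
losses = [4.82, 4.37, 3.96, 3.44, 3.03, 2.83, 2.60, 2.51]

plt.figure(dpi=100)
plt.plot(losses)
plt.xlabel('epoch')
plt.ylabel('MAPE loss')
plt.show()
```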
In [ ]:
pred = model.predict(test_x)

plt.figure(dpi=100)
plt.scatter(pred, test_y, color='forestgreen', edgecolor='k', linewidths=0.5, alpha=0.5)
plt.plot([0, 360], [0, 360], color='k', linestyle='--', alpha=0.7)
plt.axis('square')
plt.xlabel('k$_{eff}$ Pred')
plt.ylabel('k$_{eff}$ GT')
plt.xlim(0, 360)
plt.ylim(0, 360)
plt.show()
63/63 [==============================] - 1s 8ms/step
[Figure: parity plot of predicted vs. ground-truth $k_{eff}$]
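The loss reported during training is Keras's `mean_absolute_percentage_error`; the same metric is easy to compute directly for a sanity check. The arrays below are hypothetical stand-ins for `test_y` and `pred`:

```python
import numpy as np

# Hypothetical ground truth and predictions, standing in for test_y and pred
gt   = np.array([110.0, 200.0, 300.0])
pred = np.array([100.0, 210.0, 305.0])

# MAPE: mean of |relative error|, expressed as a percentage
mape = np.mean(np.abs((gt - pred) / gt)) * 100
print(round(mape, 2))  # 5.25
```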
In [ ]:
plt.figure(figsize=(8, 8), dpi=100)
for i, idx in enumerate([4, 5, 6, 7]):

    pred = model.predict(test_x[[idx]])

    plt.subplot(2, 2, i+1)
    plt.title('GT: {:.2f}, Pred: {:.2f}'.format(test_y[idx], float(pred[0])), fontsize=14)
    plt.imshow(test_x[idx], cmap='gray')  # show the test sample that was predicted
plt.show()
1/1 [==============================] - 0s 23ms/step
1/1 [==============================] - 0s 16ms/step
1/1 [==============================] - 0s 15ms/step
1/1 [==============================] - 0s 15ms/step
[Figure: four test images with ground-truth and predicted $k_{eff}$ values]

3. Two-Dimensional Heat Conduction

3.1 Recap: Numerical analysis of heat conduction

  • The conventional method iteratively predicts the temperature using the overall heat balance on the control volume

  • Rate of conduction into the control volume = $-k \left. \frac{\partial T}{\partial x}\right|_{left} \Delta y - k \left. \frac{\partial T}{\partial y}\right|_{bottom} \Delta x$

  • Rate of conduction out of the control volume = $-k \left. \frac{\partial T}{\partial x}\right|_{right} \Delta y - k \left. \frac{\partial T}{\partial y}\right|_{top} \Delta x$

  • $\frac{T_{i+1,j}-2T_{i,j}+T_{i-1,j}}{\Delta x^2} + \frac{T_{i,j+1}-2T_{i,j}+T_{i,j-1}}{\Delta y^2} = 0$

  • Solving for $T_{i,j}$ when $\Delta x = \Delta y$, we have $T_{i,j}=\frac{1}{4}\left(T_{i+1,j}+T_{i-1,j}+T_{i,j+1}+T_{i,j-1} \right)$

For example, the temperatures at the internal nodes of a $4\times 4$ grid can be calculated from these equations

  • $T_{2,2}=\frac{1}{4}\left(50+0+T_{2,3}+T_{3,2} \right)$
  • $T_{2,3}=\frac{1}{4}\left(50+0+T_{2,2}+T_{3,3} \right)$
  • $T_{3,2}=\frac{1}{4}\left(100+0+T_{2,2}+T_{3,3} \right)$
  • $T_{3,3}=\frac{1}{4}\left(100+0+T_{2,3}+T_{3,2} \right)$
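As a quick check, these four nodal equations can be solved by simple Jacobi iteration. A minimal sketch, with the wall temperatures 50, 100, 0, 0 from the example:

```python
# Jacobi iteration for the four interior nodes of the 4x4 grid example
# (left wall = 50, right wall = 100, top and bottom walls = 0)
T22 = T23 = T32 = T33 = 0.0
for _ in range(100):
    # update all nodes simultaneously from the previous iterate
    T22, T23, T32, T33 = (
        (50 + 0 + T23 + T32) / 4,
        (50 + 0 + T22 + T33) / 4,
        (100 + 0 + T22 + T33) / 4,
        (100 + 0 + T23 + T32) / 4,
    )

print(T22, T23, T32, T33)
```

The iteration converges to $T_{2,2}=T_{2,3}=31.25$ and $T_{3,2}=T_{3,3}=43.75$, which satisfy all four nodal equations exactly.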

However, this method 1) cannot evaluate the temperature between nodes and 2) is difficult to apply to complex domains.

Therefore, we apply a physics-informed neural network (PINN) to solve 2D heat conduction problems

In [ ]:
# !pip install deepxde
In [ ]:
import deepxde as dde
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import time
Using backend: pytorch
Other supported backends: tensorflow.compat.v1, tensorflow, jax, paddle.
paddle supports more examples now and is recommended.

Numerical analysis

In [ ]:
n = 100
l = 1.
r = 2*l/(n+1)
T = np.zeros([n*n, n*n])

bc = {
    "x=-l": 50.,
    "x=+l": 100.,
    "y=-l": 0.,
    "y=+l": 0.
}
computation_time = {}
In [ ]:
B = np.zeros([n, n])
k = 0
for i in range(n):
    x = i * r
    for j in range(n):
        y = j * r
        M = np.zeros([n, n])
        M[i, j] = -4
        if i != 0:
            M[i-1, j] = 1
        else:
            B[i, j] += -bc["y=-l"]   # boundary condition at y = -l
        if i != n-1:
            M[i+1, j] = 1
        else:
            B[i, j] += -bc["y=+l"]   # boundary condition at y = +l
        if j != 0:
            M[i, j-1] = 1
        else:
            B[i, j] += -bc["x=-l"]   # boundary condition at x = -l
        if j != n-1:
            M[i, j+1] = 1
        else:
            B[i, j] += -bc["x=+l"]   # boundary condition at x = +l
        #B[i, j] += -r**2 * q(x, y) * K(x, y)
        m = np.reshape(M, (1, n**2))
        T[k, :] = m
        k += 1

b = np.reshape(B, (n**2, 1))
start = time.time()
T = np.matmul(np.linalg.inv(T), b)
T = T.reshape([n, n])
Temperature = T
end = time.time()
computation_time["fdm"] = end - start
print(f"\ncomputation time: {end-start:.3f}\n")
computation time: 7.055
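A side note on the solve step: `np.linalg.solve` is generally faster and more numerically stable than forming the explicit inverse with `np.linalg.inv`, so the line `T = np.matmul(np.linalg.inv(T), b)` above could equivalently be written `T = np.linalg.solve(T, b)`. A minimal illustration on a small system (the matrix here is just a 2x2 toy example):

```python
import numpy as np

A = np.array([[4.0, -1.0],
              [-1.0, 4.0]])
b = np.array([[50.0],
              [100.0]])

x_inv   = np.matmul(np.linalg.inv(A), b)  # explicit inverse, as in the cell above
x_solve = np.linalg.solve(A, b)           # LU factorization, no explicit inverse

assert np.allclose(x_inv, x_solve)        # both give the same solution
```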

In [ ]:
plt.figure()
x = np.linspace(-1, 1, 100)
y = np.linspace(-1, 1, 100)
x, y = np.meshgrid(x, y)

plt.pcolormesh(x, y, T, cmap='magma')
plt.colorbar()
plt.title('FDM')
plt.xlabel('x')
plt.ylabel('y')
plt.xlim(-1, 1)
plt.ylim(-1, 1)
plt.tight_layout()
plt.axis("square")
plt.show()
[Figure: FDM temperature field on the square domain]

3.2 Physics-informed Neural Network for Square Domain

In [ ]:
xmin, xmax = -1, 1
ymin, ymax = -1, 1
k = 1
In [ ]:
def pde(x, y):
    dT_xx = dde.grad.hessian(y, x, i = 0, j = 0, component = 0)
    dT_yy = dde.grad.hessian(y, x, i = 1, j = 1, component = 0)
    return k*(dT_xx + dT_yy)

def boundary_left(x, on_boundary):
    return on_boundary and np.isclose(x[0], xmin)

def boundary_right(x, on_boundary):
    return on_boundary and np.isclose(x[0], xmax)

def boundary_bottom(x, on_boundary):
    return on_boundary and np.isclose(x[1], ymin)

def boundary_top(x, on_boundary):
    return on_boundary and np.isclose(x[1], ymax)
In [ ]:
geom = dde.geometry.Rectangle([xmin, ymin], [xmax, ymax])

bc_l = dde.DirichletBC(geom, lambda x: 50/100, boundary_left)   # T = 50, normalized by 100
bc_r = dde.DirichletBC(geom, lambda x: 1, boundary_right)       # T = 100, normalized by 100
bc_b = dde.DirichletBC(geom, lambda x: 0, boundary_bottom)
bc_t = dde.DirichletBC(geom, lambda x: 0, boundary_top)
In [ ]:
data = dde.data.PDE(geom,
                    pde,
                    [bc_l, bc_r, bc_b, bc_t],
                    num_domain=10000,
                    num_boundary=100,
                    num_test=1000)
Warning: 1000 points required, but 1024 points sampled.
In [ ]:
plt.figure()
plt.scatter(data.train_x_all[:,0], data.train_x_all[:,1], s = 0.5)
plt.xlabel('x-direction length (m)')
plt.ylabel('y-direction length (m)')
plt.axis('square')
plt.show()
[Figure: PDE training points sampled in the square domain]
In [ ]:
layer_size = [2] + [64] * 5 + [1]
activation = "tanh"
initializer = "Glorot uniform"

net = dde.maps.FNN(layer_size, activation, initializer)

model = dde.Model(data, net)
model.compile("adam", lr=1e-3)
Compiling model...
'compile' took 0.000437 s

In [ ]:
losshistory, train_state = model.train(epochs = 10000)
dde.saveplot(losshistory, train_state, issave = False, isplot = True)
Warning: epochs is deprecated and will be removed in a future version. Use iterations instead.
Training model...

Step      Train loss                                            Test loss                                             Test metric
0         [1.26e-02, 5.53e-01, 5.72e-01, 1.94e-02, 2.10e-02]    [1.24e-02, 5.53e-01, 5.72e-01, 1.94e-02, 2.10e-02]    []  
1000      [6.79e-03, 8.10e-03, 1.76e-02, 6.72e-03, 1.71e-02]    [5.38e-03, 8.09e-03, 1.76e-02, 6.75e-03, 1.71e-02]    []  
2000      [2.69e-03, 2.67e-03, 1.35e-02, 2.91e-03, 1.17e-02]    [1.70e-03, 2.67e-03, 1.35e-02, 2.91e-03, 1.17e-02]    []  
3000      [2.05e-03, 1.54e-03, 1.28e-02, 1.95e-03, 1.04e-02]    [1.38e-03, 1.54e-03, 1.28e-02, 1.95e-03, 1.04e-02]    []  
4000      [3.12e-03, 1.28e-03, 1.15e-02, 1.48e-03, 1.09e-02]    [1.66e-03, 1.28e-03, 1.15e-02, 1.49e-03, 1.09e-02]    []  
5000      [1.55e-03, 1.07e-03, 1.17e-02, 1.10e-03, 9.90e-03]    [1.02e-03, 1.07e-03, 1.17e-02, 1.11e-03, 9.92e-03]    []  
6000      [2.19e-03, 9.69e-04, 1.14e-02, 1.03e-03, 1.01e-02]    [1.48e-03, 9.65e-04, 1.13e-02, 1.03e-03, 1.01e-02]    []  
7000      [2.02e-03, 7.46e-04, 1.18e-02, 7.07e-04, 9.37e-03]    [1.36e-03, 7.38e-04, 1.18e-02, 7.11e-04, 9.40e-03]    []  
8000      [2.49e-03, 5.95e-04, 1.23e-02, 6.15e-04, 8.65e-03]    [1.67e-03, 5.94e-04, 1.23e-02, 6.19e-04, 8.66e-03]    []  
9000      [2.04e-03, 4.96e-04, 1.05e-02, 6.17e-04, 1.02e-02]    [1.41e-03, 4.91e-04, 1.05e-02, 6.18e-04, 1.02e-02]    []  
10000     [1.63e-03, 4.87e-04, 1.22e-02, 5.59e-04, 8.65e-03]    [9.52e-04, 4.83e-04, 1.22e-02, 5.63e-04, 8.65e-03]    []  

Best model at step 10000:
  train loss: 2.35e-02
  test loss: 2.28e-02
  test metric: []

'train' took 204.724486 s

[Figure: training/test loss history and predicted solution from dde.saveplot]
In [ ]:
x = np.linspace(-1, 1, 100)
y = np.linspace(-1, 1, 100)
x, y = np.meshgrid(x, y)
samples = np.hstack([x.reshape(-1, 1), y.reshape(-1, 1)])

result = model.predict(samples)*100

plt.figure(figsize=(12, 3))

plt.subplot(1, 3, 1)
plt.pcolormesh(x, y, T, cmap='magma')
plt.title('FDM')
plt.colorbar()
plt.xlabel('x')
plt.ylabel('y')
plt.xlim(-1, 1)
plt.ylim(-1, 1)
# plt.tight_layout()
plt.axis("square")

plt.subplot(1, 3, 2)
plt.pcolormesh(x, y, result.reshape(100, 100), cmap='magma')
plt.title('PINN')
plt.colorbar()
plt.xlabel('x')
plt.ylabel('y')
plt.xlim(-1, 1)
plt.ylim(-1, 1)
# plt.tight_layout()
plt.axis("square")

plt.subplot(1, 3, 3)
plt.pcolormesh(x, y, np.abs(T - result.reshape(100, 100)), cmap='magma')
plt.title('Error')
plt.colorbar()
plt.clim(0, 10)
plt.xlabel('x')
plt.ylabel('y')
plt.xlim(-1, 1)
plt.ylim(-1, 1)
plt.axis("square")

plt.tight_layout()
plt.show()
[Figure: FDM field, PINN field, and absolute error on the square domain]
In [ ]:
x = np.linspace(-1, 1, 100).reshape(-1, 1)
y = -0.2 * np.ones((100, 1))
line = np.concatenate((x, y), 1)

plt.figure(figsize=(3, 3))
result = model.predict(line)*100
plt.title('y = -0.2', fontsize=11)
plt.plot(x, T[40, :], color='b', linewidth=2, label='FDM')
plt.plot(x, result, linestyle='--', color='r', linewidth=2, label='PINN')
plt.xlabel('x')
plt.ylabel('T')
plt.xlim(-1, 1)
plt.ylim(0, 100)
plt.legend()
plt.show()
[Figure: FDM vs. PINN temperature profile along y = -0.2]

3.3 Physics-informed Neural Network for Disk Domain

  • Cylindrical coordinates are widely used in various industrial systems
  • ex) pipes, wires, heat exchanger shells, reactors, etc.
  • However, numerical analysis of such systems is complex
  • Complex energy balance equation for the control volume
  • Complex treatment of irregular boundary conditions
In [ ]:
r_in = 0.25
r_out = 1.0
k = 1
In [ ]:
def pde(x, y):
    dT_xx = dde.grad.hessian(y, x, i = 0, j = 0, component = 0)
    dT_yy = dde.grad.hessian(y, x, i = 1, j = 1, component = 0)
    return k*(dT_xx + dT_yy)

def boundary_inner(x, on_boundary):
    return on_boundary and np.isclose(x[0]**2 + x[1]**2, r_in**2)

def boundary_outer(x, on_boundary):
    return on_boundary and np.isclose(x[0]**2 + x[1]**2, r_out**2)
In [ ]:
geom = dde.geometry.Disk([0, 0], 1) - dde.geometry.Disk([0, 0], 0.25)

bc_i = dde.DirichletBC(geom, lambda x: 50/50, boundary_inner)   # T = 50, normalized by 50
bc_o = dde.DirichletBC(geom, lambda x: 0, boundary_outer)

data = dde.data.PDE(geom,
                    pde,
                    [bc_i, bc_o],
                    num_domain=7000,
                    num_boundary=1000,
                    num_test=10000)

plt.figure(dpi=100)
plt.scatter(data.train_x_all[:,0], data.train_x_all[:,1], s = 0.5)
plt.xlabel('x-direction length (m)')
plt.ylabel('y-direction length (m)')
plt.axis('square')
plt.show()
Warning: CSGDifference.uniform_points not implemented. Use random_points instead.
[Figure: training points sampled in the annular domain]
In [ ]:
layer_size = [2] + [64] * 5 + [1]
activation = "tanh"
initializer = "Glorot uniform"

net = dde.maps.FNN(layer_size, activation, initializer)

model = dde.Model(data, net)
model.compile("adam", lr=1e-3)
Compiling model...
'compile' took 0.000246 s

In [ ]:
losshistory, train_state = model.train(epochs = 3000)
dde.saveplot(losshistory, train_state, issave = False, isplot = True)
Warning: epochs is deprecated and will be removed in a future version. Use iterations instead.
Training model...

Step      Train loss                        Test loss                         Test metric
0         [4.72e-02, 1.02e+00, 1.63e-01]    [4.63e-02, 1.02e+00, 1.63e-01]    []  
1000      [1.89e-03, 5.80e-05, 5.26e-05]    [1.71e-03, 5.80e-05, 5.26e-05]    []  
2000      [2.36e-03, 3.12e-05, 5.89e-05]    [1.79e-03, 3.12e-05, 5.89e-05]    []  
3000      [7.77e-04, 2.01e-05, 8.13e-06]    [6.92e-04, 2.01e-05, 8.13e-06]    []  

Best model at step 3000:
  train loss: 8.05e-04
  test loss: 7.21e-04
  test metric: []

'train' took 57.766042 s

[Figure: training/test loss history and predicted solution from dde.saveplot]
In [ ]:
test = data.test_points()

result = model.predict(test)*50

plt.figure(dpi=100)
plt.scatter(test[:, 0], test[:, 1], c=result, cmap='magma')
plt.title('PINN')
plt.colorbar()
plt.xlabel('x')
plt.ylabel('y')
plt.xlim(-1, 1)
plt.ylim(-1, 1)
plt.axis("square")
# plt.tight_layout()
plt.show()
Warning: CSGDifference.uniform_points not implemented. Use random_points instead.
[Figure: PINN temperature field on the annulus]
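For this annulus, steady conduction with Dirichlet boundary temperatures has the closed-form solution $T(r) = T_{in}\,\frac{\ln(r_{out}/r)}{\ln(r_{out}/r_{in})}$, which gives an independent check on the PINN prediction. A sketch with the normalized inner temperature $T_{in}=1$ used above:

```python
import numpy as np

r_in, r_out, T_in = 0.25, 1.0, 1.0  # geometry and normalized inner temperature

def T_analytic(r):
    """Steady radial conduction in an annulus with T(r_in) = T_in, T(r_out) = 0."""
    return T_in * np.log(r_out / r) / np.log(r_out / r_in)

# the boundary values are recovered exactly
print(T_analytic(r_in), T_analytic(r_out))  # 1.0 0.0
```

Comparing `T_analytic(np.sqrt(test[:, 0]**2 + test[:, 1]**2))` (scaled by 50) against `result` quantifies the PINN error pointwise.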

4. Thermal Fluid Dynamics

  • Fluid flow and heat transfer are often interconnected phenomena in engineering and scientific applications
  • ex) effective cooling systems, power plants, heat exchangers, etc.
  • It is important to solve the fluid flow (Navier-Stokes equations) and heat transfer (convection equation) simultaneously
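The residuals returned by the `pde` function in this section correspond to the steady, incompressible Navier-Stokes equations coupled with the convection-diffusion (energy) equation, with thermal diffusivity $\alpha = k/(\rho c_p)$:


$$u\frac{\partial u}{\partial x} + v\frac{\partial u}{\partial y} = -\frac{1}{\rho}\frac{\partial p}{\partial x} + \frac{\mu}{\rho}\left(\frac{\partial^2 u}{\partial x^2} + \frac{\partial^2 u}{\partial y^2}\right)$$


$$u\frac{\partial v}{\partial x} + v\frac{\partial v}{\partial y} = -\frac{1}{\rho}\frac{\partial p}{\partial y} + \frac{\mu}{\rho}\left(\frac{\partial^2 v}{\partial x^2} + \frac{\partial^2 v}{\partial y^2}\right)$$


$$\frac{\partial u}{\partial x} + \frac{\partial v}{\partial y} = 0$$


$$u\frac{\partial T}{\partial x} + v\frac{\partial T}{\partial y} = \alpha\left(\frac{\partial^2 T}{\partial x^2} + \frac{\partial^2 T}{\partial y^2}\right)$$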

Load Dataset

In [ ]:
import deepxde as dde
import numpy as np
import matplotlib.pyplot as plt
Using backend: pytorch
Other supported backends: tensorflow.compat.v1, tensorflow, jax, paddle.
paddle supports more examples now and is recommended.
In [ ]:
cor_fluid = np.loadtxt('./data/Convection/xy_cor.txt')
u_fluid = np.loadtxt('./data/Convection/u.txt')
v_fluid = np.loadtxt('./data/Convection/v.txt')
p_fluid = np.loadtxt('./data/Convection/p.txt')
T_fluid = np.loadtxt('./data/Convection/T.txt') - 273.15
CFD_results = [u_fluid, v_fluid, p_fluid, T_fluid]

u_fluid_max, u_fluid_min = np.max(u_fluid), np.min(u_fluid)
v_fluid_max, v_fluid_min = np.max(v_fluid), np.min(v_fluid)
p_fluid_max, p_fluid_min = np.max(p_fluid), np.min(p_fluid)
T_fluid_max, T_fluid_min = np.max(T_fluid), np.min(T_fluid)
In [ ]:
# Properties
rho = 1
mu = 0.01
k_fluid = 0.1 # thermal conductivity
cp = 1 # specific heat capacity
thermal_diffusivity = k_fluid / (rho * cp)
u_in = 1
T = 1
D = 1
L = 2
In [ ]:
def boundary_wall(X, on_boundary):
    on_wall = np.logical_and(np.logical_or(np.isclose(X[1], -D/2),
                                           np.isclose(X[1], D/2)), on_boundary)
    return on_wall

def boundary_inlet(X, on_boundary):
    return on_boundary and np.isclose(X[0], -L/2)

def boundary_outlet(X, on_boundary):
    return on_boundary and np.isclose(X[0], L/2)
In [ ]:
def pde(X, Y):
    du_x = dde.grad.jacobian(Y, X, i = 0, j = 0)
    du_y = dde.grad.jacobian(Y, X, i = 0, j = 1)
    dv_x = dde.grad.jacobian(Y, X, i = 1, j = 0)
    dv_y = dde.grad.jacobian(Y, X, i = 1, j = 1)
    dp_x = dde.grad.jacobian(Y, X, i = 2, j = 0)
    dp_y = dde.grad.jacobian(Y, X, i = 2, j = 1)
    dT_x = dde.grad.jacobian(Y, X, i = 3, j = 0)
    dT_y = dde.grad.jacobian(Y, X, i = 3, j = 1)

    du_xx = dde.grad.hessian(Y, X, i = 0, j = 0, component = 0)
    du_yy = dde.grad.hessian(Y, X, i = 1, j = 1, component = 0)
    dv_xx = dde.grad.hessian(Y, X, i = 0, j = 0, component = 1)
    dv_yy = dde.grad.hessian(Y, X, i = 1, j = 1, component = 1)
    dT_xx = dde.grad.jacobian(dT_x, X, i = 0, j = 0)
    dT_yy = dde.grad.jacobian(dT_y, X, i = 0, j = 1)

    pde_u = Y[:,0:1]*du_x + Y[:,1:2]*du_y + 1/rho * dp_x - (mu/rho)*(du_xx + du_yy)
    pde_v = Y[:,0:1]*dv_x + Y[:,1:2]*dv_y + 1/rho * dp_y - (mu/rho)*(dv_xx + dv_yy)
    pde_cont = du_x + dv_y
    pde_T = (Y[:,0:1]*dT_x + Y[:,1:2]*dT_y) - thermal_diffusivity * (dT_xx + dT_yy)

    return [pde_u, pde_v, pde_cont, pde_T]
In [ ]:
geom = dde.geometry.Rectangle(xmin=[-L/2, -D/2], xmax=[L/2, D/2])

bc_wall_u = dde.DirichletBC(geom, lambda X: 0., boundary_wall, component = 0)
bc_wall_v = dde.DirichletBC(geom, lambda X: 0., boundary_wall, component = 1)
bc_wall_T = dde.DirichletBC(geom, lambda X: 0., boundary_wall, component = 3)

bc_inlet_u = dde.DirichletBC(geom, lambda X: u_in, boundary_inlet, component = 0)
bc_inlet_v = dde.DirichletBC(geom, lambda X: 0., boundary_inlet, component = 1)
bc_inlet_T = dde.DirichletBC(geom, lambda X: T, boundary_inlet, component = 3)

bc_outlet_p = dde.DirichletBC(geom, lambda X: 0., boundary_outlet, component = 2)
bc_outlet_v = dde.DirichletBC(geom, lambda X: 0., boundary_outlet, component = 1)
bc_outlet_T = dde.NeumannBC(geom, lambda X: 0., boundary_outlet, component = 3)
In [ ]:
data = dde.data.PDE(geom,
                    pde,
                    [bc_wall_u, bc_wall_v, bc_wall_T, bc_inlet_u, bc_inlet_v, bc_inlet_T, bc_outlet_p, bc_outlet_v],
                    num_domain = 2000,
                    num_boundary = 200,
                    num_test = 100)
Warning: 100 points required, but 120 points sampled.
In [ ]:
plt.figure(figsize = (6, 4))
plt.scatter(data.train_x_all[:,0], data.train_x_all[:,1], s = 0.5)
plt.xlabel('x-direction length (m)')
plt.ylabel('Distance from the middle of plates (m)')
plt.show()
[Figure: scatter plot of the sampled training points over the channel domain]
In [ ]:
layer_size = [2] + [64] * 5 + [4]
activation = "tanh"
initializer = "Glorot uniform"

net = dde.maps.FNN(layer_size, activation, initializer)

model = dde.Model(data, net)
model.compile("adam", lr = 1e-3)
Compiling model...
'compile' took 0.000204 s
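For reference, the network size implied by `layer_size = [2] + [64] * 5 + [4]` can be counted directly (a small sketch; each dense layer with fan-in `a` and fan-out `b` contributes `a*b` weights and `b` biases):

```python
# Count trainable parameters of the fully connected network [2] + [64]*5 + [4].
layer_size = [2] + [64] * 5 + [4]
n_params = sum(a * b + b for a, b in zip(layer_size[:-1], layer_size[1:]))
print(n_params)  # 17092
```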

In [ ]:
losshistory, train_state = model.train(iterations = 10000)
dde.saveplot(losshistory, train_state, issave = False, isplot = True)
Training model...

Step      Train loss                                                                                                                  Test loss                                                                                                                   Test metric
0         [1.65e-01, 1.53e-03, 6.02e-04, 6.79e-04, 1.68e-02, 9.03e-03, 2.98e-02, 1.08e+00, 2.38e-02, 1.15e+00, 1.65e-01, 2.35e-02]    [1.72e-01, 1.43e-03, 5.06e-04, 5.59e-04, 1.68e-02, 9.03e-03, 2.98e-02, 1.08e+00, 2.38e-02, 1.15e+00, 1.65e-01, 2.35e-02]    []  
1000      [1.88e-03, 3.46e-04, 5.49e-04, 1.83e-03, 4.65e-03, 6.65e-04, 8.84e-03, 7.31e-03, 3.86e-04, 6.51e-03, 1.10e-05, 3.24e-05]    [8.55e-04, 2.82e-04, 3.41e-04, 8.85e-04, 4.65e-03, 6.65e-04, 8.84e-03, 7.31e-03, 3.86e-04, 6.51e-03, 1.10e-05, 3.24e-05]    []  
2000      [3.59e-03, 2.04e-04, 3.99e-04, 3.14e-03, 2.37e-03, 2.69e-04, 3.75e-03, 2.98e-03, 3.70e-04, 3.23e-03, 1.05e-05, 1.16e-04]    [1.01e-03, 8.69e-05, 2.31e-04, 1.13e-03, 2.37e-03, 2.69e-04, 3.75e-03, 2.98e-03, 3.70e-04, 3.23e-03, 1.05e-05, 1.16e-04]    []  
3000      [1.09e-03, 3.75e-04, 2.32e-04, 8.73e-04, 1.73e-03, 2.02e-04, 2.80e-03, 1.78e-03, 2.93e-04, 1.77e-03, 8.38e-05, 2.63e-05]    [3.62e-04, 3.44e-04, 1.10e-04, 1.75e-04, 1.73e-03, 2.02e-04, 2.80e-03, 1.78e-03, 2.93e-04, 1.77e-03, 8.38e-05, 2.63e-05]    []  
4000      [1.90e-03, 2.07e-04, 4.09e-04, 2.69e-03, 3.34e-03, 1.96e-04, 3.19e-03, 8.35e-04, 1.80e-04, 1.04e-03, 8.61e-06, 7.32e-05]    [8.34e-04, 1.15e-04, 8.37e-05, 5.40e-04, 3.34e-03, 1.96e-04, 3.19e-03, 8.35e-04, 1.80e-04, 1.04e-03, 8.61e-06, 7.32e-05]    []  
5000      [2.06e-04, 1.49e-04, 1.22e-04, 2.12e-04, 1.00e-03, 2.15e-04, 1.96e-03, 1.06e-03, 1.72e-04, 1.14e-03, 3.69e-06, 1.98e-05]    [1.29e-04, 7.68e-05, 4.86e-05, 8.93e-05, 1.00e-03, 2.15e-04, 1.96e-03, 1.06e-03, 1.72e-04, 1.14e-03, 3.69e-06, 1.98e-05]    []  
6000      [2.27e-04, 1.57e-04, 1.21e-04, 1.58e-03, 8.40e-04, 1.78e-04, 1.75e-03, 9.03e-04, 1.30e-04, 1.01e-03, 6.13e-06, 2.11e-05]    [1.06e-04, 8.52e-05, 5.75e-05, 1.86e-04, 8.40e-04, 1.78e-04, 1.75e-03, 9.03e-04, 1.30e-04, 1.01e-03, 6.13e-06, 2.11e-05]    []  
7000      [4.26e-04, 1.89e-04, 1.73e-04, 1.09e-03, 1.02e-03, 1.60e-04, 1.67e-03, 7.05e-04, 1.05e-04, 9.41e-04, 1.21e-05, 3.60e-05]    [1.64e-04, 7.60e-05, 9.27e-05, 2.29e-04, 1.02e-03, 1.60e-04, 1.67e-03, 7.05e-04, 1.05e-04, 9.41e-04, 1.21e-05, 3.60e-05]    []  
8000      [1.85e-03, 1.83e-04, 9.62e-05, 2.17e-03, 9.80e-04, 1.75e-04, 1.55e-03, 5.93e-04, 1.02e-04, 8.87e-04, 1.06e-05, 5.71e-05]    [3.26e-04, 7.12e-05, 3.47e-05, 1.96e-04, 9.80e-04, 1.75e-04, 1.55e-03, 5.93e-04, 1.02e-04, 8.87e-04, 1.06e-05, 5.71e-05]    []  
9000      [3.42e-04, 3.94e-04, 1.94e-04, 1.75e-03, 1.22e-03, 9.73e-05, 1.64e-03, 5.02e-04, 7.21e-05, 8.80e-04, 5.93e-05, 1.43e-05]    [9.24e-05, 2.44e-04, 9.86e-05, 1.95e-04, 1.22e-03, 9.73e-05, 1.64e-03, 5.02e-04, 7.21e-05, 8.80e-04, 5.93e-05, 1.43e-05]    []  
10000     [3.45e-04, 1.26e-04, 1.02e-04, 4.79e-04, 5.17e-04, 1.14e-04, 1.19e-03, 5.41e-04, 8.23e-05, 1.11e-03, 7.97e-06, 3.25e-05]    [7.68e-05, 6.10e-05, 5.73e-05, 9.15e-05, 5.17e-04, 1.14e-04, 1.19e-03, 5.41e-04, 8.23e-05, 1.11e-03, 7.97e-06, 3.25e-05]    []  

Best model at step 10000:
  train loss: 4.64e-03
  test loss: 3.88e-03
  test metric: []

'train' took 339.910986 s

[Figures: training/test loss history and best-state plots from dde.saveplot]
In [ ]:
color_legend = [[u_fluid_min, u_fluid_max], [v_fluid_min, v_fluid_max],
                [p_fluid_min, 1.5], [T_fluid_min, T_fluid_max]]
titles = ['U-velocity', 'V-velocity', 'Pressure', 'Temperature']

result = model.predict(cor_fluid)

plt.figure(figsize=(9, 4), dpi=300)
for idx in range(4):
    plt.subplot(2, 2, idx+1)
    plt.title(titles[idx], fontsize=13)
    plt.scatter(cor_fluid[:, 0],
                cor_fluid[:, 1],
                c = result[:, idx],
                cmap = 'jet',
                s = 0.3)
    plt.colorbar()
    plt.clim(color_legend[idx])
    plt.axis('scaled')
    plt.xlim((-L/2, L/2))
    plt.ylim((-D/2, D/2))

plt.tight_layout()
plt.show()
[Figure: PINN-predicted U-velocity, V-velocity, Pressure, and Temperature fields]
In [ ]:
# samples = geom.random_points(500000)

result = model.predict(cor_fluid)

plt.figure(figsize=(12, 7), dpi=300)
for idx in range(4):
    plt.subplot(4, 3, 3*idx+1)
    plt.title('CFD', fontsize=13)
    plt.scatter(cor_fluid[:, 0],
                cor_fluid[:, 1],
                c = CFD_results[idx],
                cmap = 'jet',
                s=0.3)
    plt.colorbar()
    plt.clim(color_legend[idx])
    plt.axis('scaled')
    plt.xlim((-L/2, L/2))
    plt.ylim((-D/2, D/2))
    plt.ylabel(titles[idx])

    plt.subplot(4, 3, 3*idx+2)
    plt.title('PINN', fontsize=13)
    plt.scatter(cor_fluid[:, 0],
                cor_fluid[:, 1],
                c = result[:, idx],
                cmap = 'jet',
                s = 0.3)
    plt.colorbar()
    plt.clim(color_legend[idx])
    plt.axis('scaled')
    plt.xlim((-L/2, L/2))
    plt.ylim((-D/2, D/2))

    plt.subplot(4, 3, 3*idx+3)
    plt.title('Error', fontsize=13)
    plt.scatter(cor_fluid[:, 0],
                cor_fluid[:, 1],
                c = np.abs(result[:, idx] - CFD_results[idx]),
                cmap = 'jet',
                s = 0.3)
    plt.colorbar()
    plt.clim(0, 0.2 * color_legend[idx][1])
    plt.axis('scaled')
    plt.xlim((-L/2, L/2))
    plt.ylim((-D/2, D/2))

plt.tight_layout()
plt.show()
No description has been provided for this image
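Beyond the pointwise error maps, a single scalar such as the relative L2 error summarizes the agreement per field. A sketch (synthetic arrays stand in for `result[:, idx]` and `CFD_results[idx]`):

```python
import numpy as np

# Relative L2 error between a predicted field and a reference field.
def rel_l2_error(pred, ref):
    return np.linalg.norm(pred - ref) / np.linalg.norm(ref)

ref = np.linspace(0.0, 1.0, 1000)      # stand-in for a CFD field
pred = ref + 0.01 * np.sin(10 * ref)   # stand-in for the PINN prediction
print(f"relative L2 error: {rel_l2_error(pred, ref):.3e}")
```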

References

  • Krishnayatra, Gaurav, Sulekh Tokas, and Rajesh Kumar. "Numerical heat transfer analysis & predicting thermal performance of fins for a novel heat exchanger using machine learning." Case Studies in Thermal Engineering 21 (2020): 100706.
  • Adam, Andre, Huazhen Fang, and Xianglin Li. "Effective thermal conductivity estimation using a convolutional neural network and its application in topology optimization." Energy and AI 15 (2024): 100310.
  • Edalatifar, Mohammad, et al. "Using deep learning to learn physics of conduction heat transfer." Journal of Thermal Analysis and Calorimetry 146 (2021): 1435-1452.