PINN with Data


Fluid Mechanics Example


By Prof. Seungchul Lee
http://iai.postech.ac.kr/
Industrial AI Lab at POSTECH

Table of Contents

1. Data-driven Approach with Big Data

1.1. Load and Sample Data

Fluid_bigdata Download

In [2]:
import deepxde as dde
import numpy as np
import matplotlib.pyplot as plt
import pandas as pd
Using backend: pytorch

In [3]:
fluid_bigdata = np.load('./data_files/fluid_bigdata.npy')

observe_x = fluid_bigdata[:, :2]
observe_y = fluid_bigdata[:, 2:]
In [4]:
observe_u = dde.icbc.PointSetBC(observe_x, observe_y[:, 0].reshape(-1, 1), component=0)
observe_v = dde.icbc.PointSetBC(observe_x, observe_y[:, 1].reshape(-1, 1), component=1)
observe_p = dde.icbc.PointSetBC(observe_x, observe_y[:, 2].reshape(-1, 1), component=2)
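Each PointSetBC ties one output component of the network (component 0 for u, 1 for v, 2 for p) to the measured values at the observation coordinates, adding one mean-squared-error term per component to the training loss. A minimal sanity check, assuming the .npy file stores its columns in the order x, y, u, v, p:

# Sanity check (assumption: fluid_bigdata columns are x, y, u, v, p)
print(observe_x.shape, observe_y.shape)   # (N, 2) coordinates and (N, 3) measured fields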

1.2. Define Parameters

In [5]:
# Properties
rho = 1
mu = 1
u_in = 1
D = 1
L = 2
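For reference, these properties give a Reynolds number Re = rho * u_in * D / mu = 1. If the data correspond to fully developed laminar flow between parallel plates (an assumption about the dataset, not something stated in it), the analytic profile is u(y) = 1.5 * u_in * (1 - (2y/D)^2), which is consistent with the velocity color scale of [0, 1.5] used in the plots later on. A short sketch under that assumption:

# Reference profile under the fully developed plane Poiseuille assumption
y_ref = np.linspace(-D/2, D/2, 5)
u_ref = 1.5 * u_in * (1 - (2*y_ref/D)**2)   # 0 at the walls, 1.5*u_in at the centerline
print(u_ref)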

1.3. Define Geometry

In [6]:
geom = dde.geometry.Rectangle(xmin = [-L/2, -D/2], xmax = [L/2, D/2])
data = dde.data.PDE(geom,
                    None,
                    [observe_u, observe_v, observe_p], 
                    num_domain = 0, 
                    num_boundary = 0, 
                    num_test = 100)
Warning: 100 points required, but 120 points sampled.
In [7]:
plt.figure(figsize = (20,4))
plt.scatter(data.train_x_all[:,0], data.train_x_all[:,1], s = 0.5)
plt.scatter(observe_x[:, 0], observe_x[:, 1], c = observe_y[:, 0], s = 6.5, cmap = 'jet')
plt.scatter(observe_x[:, 0], observe_x[:, 1], s = 0.5, color='k', alpha = 0.5)
plt.xlim((0-L/2, L-L/2))
plt.ylim((0-D/2, D-D/2))
plt.xlabel('x-direction length (m)')
plt.ylabel('Distance from middle of plates (m)')
plt.title('Velocity (u)')
plt.show()
[Figure: 'Velocity (u)', observation points plotted over the channel geometry]

1.4. Define Network and Hyper-parameters

In [8]:
layer_size = [2] + [64] * 5 + [3]
activation = "tanh"
initializer = "Glorot uniform"

net = dde.maps.FNN(layer_size, activation, initializer)

model = dde.Model(data, net)
model.compile("adam", lr = 1e-3)
Compiling model...
'compile' took 0.000181 s

1.5. Train (Adam Optimizer)

In [9]:
losshistory, train_state = model.train(epochs = 10000)
dde.saveplot(losshistory, train_state, issave = False, isplot = False)
Training model...

Step      Train loss                        Test loss                         Test metric
0         [1.20e+00, 3.17e-02, 2.00e+02]    [1.20e+00, 3.17e-02, 2.00e+02]    []  
1000      [1.83e-01, 5.27e-03, 6.67e-01]    [1.83e-01, 5.27e-03, 6.67e-01]    []  
2000      [6.29e-03, 4.43e-03, 6.54e-02]    [6.29e-03, 4.43e-03, 6.54e-02]    []  
3000      [1.79e-03, 1.70e-03, 2.05e-02]    [1.79e-03, 1.70e-03, 2.05e-02]    []  
4000      [4.26e-04, 5.19e-04, 8.69e-03]    [4.26e-04, 5.19e-04, 8.69e-03]    []  
5000      [2.89e-04, 2.47e-04, 1.84e-03]    [2.89e-04, 2.47e-04, 1.84e-03]    []  
6000      [1.51e-04, 1.03e-04, 3.99e-04]    [1.51e-04, 1.03e-04, 3.99e-04]    []  
7000      [8.78e-05, 5.29e-05, 2.20e-03]    [8.78e-05, 5.29e-05, 2.20e-03]    []  
8000      [5.93e-05, 4.02e-05, 1.15e-03]    [5.93e-05, 4.02e-05, 1.15e-03]    []  
9000      [4.22e-05, 2.70e-05, 1.23e-04]    [4.22e-05, 2.70e-05, 1.23e-04]    []  
10000     [3.40e-05, 2.28e-05, 1.09e-04]    [3.40e-05, 2.28e-05, 1.09e-04]    []  

Best model at step 10000:
  train loss: 1.65e-04
  test loss: 1.65e-04
  test metric: []

'train' took 75.887699 s

1.6. Train More (L-BFGS Optimizer)

In [10]:
dde.optimizers.config.set_LBFGS_options()
model.compile("L-BFGS")
losshistory, train_state = model.train()
dde.saveplot(losshistory, train_state, issave = False, isplot = True)
Compiling model...
'compile' took 0.000401 s

Training model...

Step      Train loss                        Test loss                         Test metric
10000     [3.40e-05, 2.28e-05, 1.09e-04]    [3.40e-05, 2.28e-05, 1.09e-04]    []  
11000     [3.84e-06, 3.34e-06, 4.07e-05]    [3.84e-06, 3.34e-06, 4.07e-05]    []  
12000     [3.61e-06, 1.97e-06, 2.23e-05]    [3.61e-06, 1.97e-06, 2.23e-05]    []  
13000     [3.14e-06, 9.65e-07, 1.29e-05]    [3.14e-06, 9.65e-07, 1.29e-05]    []  
14000     [1.86e-06, 1.24e-06, 5.42e-06]    [1.86e-06, 1.24e-06, 5.42e-06]    []  
15000     [1.34e-06, 4.84e-07, 2.77e-06]    [1.34e-06, 4.84e-07, 2.77e-06]    []  
16000     [9.03e-07, 3.72e-07, 1.46e-06]    [9.03e-07, 3.72e-07, 1.46e-06]    []  
17000     [8.04e-07, 3.14e-07, 1.21e-06]    [8.04e-07, 3.14e-07, 1.21e-06]    []  
18000     [8.02e-07, 3.12e-07, 1.20e-06]    [8.02e-07, 3.12e-07, 1.20e-06]    []  
19000     [8.00e-07, 3.11e-07, 1.20e-06]    [8.00e-07, 3.11e-07, 1.20e-06]    []  
20000     [7.99e-07, 3.09e-07, 1.20e-06]    [7.99e-07, 3.09e-07, 1.20e-06]    []  
21000     [7.98e-07, 3.08e-07, 1.19e-06]    [7.98e-07, 3.08e-07, 1.19e-06]    []  
22000     [7.97e-07, 3.08e-07, 1.19e-06]    [7.97e-07, 3.08e-07, 1.19e-06]    []  
23000     [7.97e-07, 3.07e-07, 1.19e-06]    [7.97e-07, 3.07e-07, 1.19e-06]    []  
24000     [7.96e-07, 3.06e-07, 1.19e-06]    [7.96e-07, 3.06e-07, 1.19e-06]    []  
25000     [7.95e-07, 3.05e-07, 1.18e-06]    [7.95e-07, 3.05e-07, 1.18e-06]    []  

Best model at step 25000:
  train loss: 2.28e-06
  test loss: 2.28e-06
  test metric: []

'train' took 219.993909 s

[Figures: loss history and train/test state plots produced by dde.saveplot]

1.7. Plot Results (Adam + L-BFGS)

In [11]:
samples = geom.random_points(500000)
result = model.predict(samples)
color_legend = [[0, 1.5], [-0.3, 0.3], [0, 35]]

for idx in range(3):
    plt.figure(figsize = (20, 4))
    plt.scatter(samples[:, 0],
                samples[:, 1],
                c = result[:, idx],
                s = 2,
                cmap = 'jet')
    plt.colorbar()
    plt.clim(color_legend[idx])
    plt.xlim((0-L/2, L-L/2))
    plt.ylim((0-D/2, D-D/2))
plt.tight_layout()
plt.show()
[Figures: predicted u, v, and p fields over the channel]

2. Data-driven Approach with Small Data

2.1. Load and Sample Data

Fluid_smalldata Download

In [12]:
fluid_smalldata = np.load('./data_files/fluid_smalldata.npy')

observe_x = fluid_smalldata[:, :2]
observe_y = fluid_smalldata[:, 2:]
In [13]:
observe_u = dde.icbc.PointSetBC(observe_x, observe_y[:, 0].reshape(-1, 1), component=0)
observe_v = dde.icbc.PointSetBC(observe_x, observe_y[:, 1].reshape(-1, 1), component=1)
observe_p = dde.icbc.PointSetBC(observe_x, observe_y[:, 2].reshape(-1, 1), component=2)

2.2. Define Geometry

In [14]:
geom = dde.geometry.Rectangle(xmin = [-L/2, -D/2], xmax = [L/2, D/2])
data = dde.data.PDE(geom,
                    None,
                    [observe_u, observe_v, observe_p], 
                    num_domain = 0, 
                    num_boundary = 0, 
                    num_test = 120)
Warning: 120 points required, but 128 points sampled.
In [15]:
plt.figure(figsize = (20,4))
plt.scatter(data.train_x_all[:,0], data.train_x_all[:,1], s = 0.5)
plt.scatter(observe_x[:, 0], observe_x[:, 1], c = observe_y[:, 0], s = 6.5, cmap = 'jet')
plt.scatter(observe_x[:, 0], observe_x[:, 1], s = 0.5, color='k', alpha = 0.5)
plt.xlim((0-L/2, L-L/2))
plt.ylim((0-D/2, D-D/2))
plt.xlabel('x-direction length (m)')
plt.ylabel('Distance from middle of plates (m)')
plt.title('Velocity (u)')
plt.show()
[Figure: 'Velocity (u)', observation points plotted over the channel geometry]

2.3. Define Network and Hyper-parameters

In [16]:
layer_size = [2] + [64] * 5 + [3]
activation = "tanh"
initializer = "Glorot uniform"

net = dde.maps.FNN(layer_size, activation, initializer)

model = dde.Model(data, net)
model.compile("adam", lr = 1e-3)
Compiling model...
'compile' took 0.000184 s

2.4. Train (Adam Optimizer)

In [17]:
losshistory, train_state = model.train(epochs = 10000)
dde.saveplot(losshistory, train_state, issave = False, isplot = False)
Training model...

Step      Train loss                        Test loss                         Test metric
0         [1.18e+00, 5.41e-03, 1.97e+02]    [1.18e+00, 5.41e-03, 1.97e+02]    []  
1000      [1.82e-01, 5.47e-03, 1.57e-01]    [1.82e-01, 5.47e-03, 1.57e-01]    []  
2000      [2.45e-02, 5.42e-03, 1.61e-01]    [2.45e-02, 5.42e-03, 1.61e-01]    []  
3000      [1.08e-03, 1.14e-03, 1.32e-02]    [1.08e-03, 1.14e-03, 1.32e-02]    []  
4000      [2.55e-04, 1.68e-04, 9.06e-04]    [2.55e-04, 1.68e-04, 9.06e-04]    []  
5000      [1.32e-04, 8.80e-05, 5.40e-04]    [1.32e-04, 8.80e-05, 5.40e-04]    []  
6000      [7.04e-05, 5.30e-05, 6.66e-05]    [7.04e-05, 5.30e-05, 6.66e-05]    []  
7000      [4.40e-05, 3.16e-05, 4.42e-05]    [4.40e-05, 3.16e-05, 4.42e-05]    []  
8000      [3.01e-05, 1.77e-05, 4.26e-05]    [3.01e-05, 1.77e-05, 4.26e-05]    []  
9000      [2.18e-05, 1.02e-05, 3.13e-05]    [2.18e-05, 1.02e-05, 3.13e-05]    []  
10000     [2.02e-05, 7.31e-06, 5.91e-05]    [2.02e-05, 7.31e-06, 5.91e-05]    []  

Best model at step 9000:
  train loss: 6.33e-05
  test loss: 6.33e-05
  test metric: []

'train' took 74.360621 s

2.5. Train More (L-BFGS Optimizer)

In [18]:
dde.optimizers.config.set_LBFGS_options()
model.compile("L-BFGS")
losshistory, train_state = model.train()
dde.saveplot(losshistory, train_state, issave = False, isplot = True)
Compiling model...
'compile' took 0.000330 s

Training model...

Step      Train loss                        Test loss                         Test metric
10000     [2.02e-05, 7.31e-06, 5.91e-05]    [2.02e-05, 7.31e-06, 5.91e-05]    []  
11000     [7.79e-07, 6.43e-07, 1.84e-05]    [7.79e-07, 6.43e-07, 1.84e-05]    []  
12000     [5.59e-07, 3.91e-07, 1.48e-05]    [5.59e-07, 3.91e-07, 1.48e-05]    []  
13000     [5.31e-07, 7.57e-07, 1.01e-05]    [5.31e-07, 7.57e-07, 1.01e-05]    []  
14000     [6.30e-07, 6.24e-07, 6.61e-06]    [6.30e-07, 6.24e-07, 6.61e-06]    []  
15000     [2.74e-07, 3.76e-07, 4.33e-06]    [2.74e-07, 3.76e-07, 4.33e-06]    []  
16000     [2.72e-07, 3.70e-07, 4.32e-06]    [2.72e-07, 3.70e-07, 4.32e-06]    []  
17000     [2.71e-07, 3.66e-07, 4.31e-06]    [2.71e-07, 3.66e-07, 4.31e-06]    []  
18000     [2.70e-07, 3.63e-07, 4.30e-06]    [2.70e-07, 3.63e-07, 4.30e-06]    []  
19000     [2.70e-07, 3.61e-07, 4.30e-06]    [2.70e-07, 3.61e-07, 4.30e-06]    []  
20000     [2.70e-07, 3.59e-07, 4.30e-06]    [2.70e-07, 3.59e-07, 4.30e-06]    []  
21000     [2.70e-07, 3.58e-07, 4.29e-06]    [2.70e-07, 3.58e-07, 4.29e-06]    []  
22000     [2.71e-07, 3.56e-07, 4.29e-06]    [2.71e-07, 3.56e-07, 4.29e-06]    []  
23000     [2.71e-07, 3.55e-07, 4.29e-06]    [2.71e-07, 3.55e-07, 4.29e-06]    []  
24000     [2.71e-07, 3.54e-07, 4.29e-06]    [2.71e-07, 3.54e-07, 4.29e-06]    []  
25000     [2.71e-07, 3.53e-07, 4.29e-06]    [2.71e-07, 3.53e-07, 4.29e-06]    []  

Best model at step 25000:
  train loss: 4.91e-06
  test loss: 4.91e-06
  test metric: []

'train' took 219.921242 s

[Figures: loss history and train/test state plots produced by dde.saveplot]

2.6. Plot Results (Adam + L-BFGS)

In [19]:
samples = geom.random_points(500000)
result = model.predict(samples)
color_legend = [[0, 1.5], [-0.3, 0.3], [0, 35]]

for idx in range(3):
    plt.figure(figsize = (20, 4))
    plt.scatter(samples[:, 0],
                samples[:, 1],
                c = result[:, idx],
                s = 2,
                cmap = 'jet')
    plt.colorbar()
    plt.clim(color_legend[idx])
    plt.xlim((0-L/2, L-L/2))
    plt.ylim((0-D/2, D-D/2))
plt.tight_layout()
plt.show()
[Figures: predicted u, v, and p fields over the channel]

3. PINN with Small Data

3.1. Define PDE and Boundary Conditions

In [20]:
def boundary_wall(X, on_boundary):
    on_wall = np.logical_and(np.logical_or(np.isclose(X[1], -D/2), np.isclose(X[1], D/2)), on_boundary)
    return on_wall

def boundary_inlet(X, on_boundary):
    return on_boundary and np.isclose(X[0], -L/2)

def boundary_outlet(X, on_boundary):
    return on_boundary and np.isclose(X[0], L/2)
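DeepXDE evaluates these predicates one point at a time: X is a single (x, y) coordinate, so X[0] is x and X[1] is y, and on_boundary indicates whether the point lies on the boundary of the geometry. A small illustration with a hypothetical wall point:

# Hypothetical point on the lower wall (y = -D/2), assumed to be on the boundary
X_wall = np.array([0.3, -D/2])
print(boundary_wall(X_wall, True))    # True: the point sits on a wall
print(boundary_inlet(X_wall, True))   # False: x is not -L/2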
In [21]:
def pde(X, Y):
    du_x = dde.grad.jacobian(Y, X, i = 0, j = 0)
    du_y = dde.grad.jacobian(Y, X, i = 0, j = 1)
    dv_x = dde.grad.jacobian(Y, X, i = 1, j = 0)
    dv_y = dde.grad.jacobian(Y, X, i = 1, j = 1)
    dp_x = dde.grad.jacobian(Y, X, i = 2, j = 0)
    dp_y = dde.grad.jacobian(Y, X, i = 2, j = 1)
    du_xx = dde.grad.hessian(Y, X, i = 0, j = 0, component = 0)
    du_yy = dde.grad.hessian(Y, X, i = 1, j = 1, component = 0)
    dv_xx = dde.grad.hessian(Y, X, i = 0, j = 0, component = 1)
    dv_yy = dde.grad.hessian(Y, X, i = 1, j = 1, component = 1)
    
    pde_u = Y[:,0:1] * du_x + Y[:,1:2] * du_y + 1/rho * dp_x - (mu/rho) * (du_xx + du_yy)
    pde_v = Y[:,0:1] * dv_x + Y[:,1:2] * dv_y + 1/rho * dp_y - (mu/rho) * (dv_xx + dv_yy)
    pde_cont = du_x + dv_y

    return [pde_u, pde_v, pde_cont]
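For reference, the three residuals above are the steady, incompressible Navier-Stokes momentum equations (body forces neglected) together with the continuity equation:

$$
\begin{aligned}
u\,\frac{\partial u}{\partial x} + v\,\frac{\partial u}{\partial y} &= -\frac{1}{\rho}\frac{\partial p}{\partial x} + \frac{\mu}{\rho}\left(\frac{\partial^2 u}{\partial x^2} + \frac{\partial^2 u}{\partial y^2}\right) \\
u\,\frac{\partial v}{\partial x} + v\,\frac{\partial v}{\partial y} &= -\frac{1}{\rho}\frac{\partial p}{\partial y} + \frac{\mu}{\rho}\left(\frac{\partial^2 v}{\partial x^2} + \frac{\partial^2 v}{\partial y^2}\right) \\
\frac{\partial u}{\partial x} + \frac{\partial v}{\partial y} &= 0
\end{aligned}
$$

pde_u and pde_v are the momentum residuals with all terms moved to one side, and pde_cont is the continuity residual; training drives all three toward zero at the collocation points.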

3.2. Define Geometry and Implement Boundary Conditions

In [22]:
geom = dde.geometry.Rectangle(xmin=[-L/2, -D/2], xmax=[L/2, D/2])

bc_wall_u = dde.DirichletBC(geom, lambda X: 0., boundary_wall, component = 0)
bc_wall_v = dde.DirichletBC(geom, lambda X: 0., boundary_wall, component = 1)

bc_inlet_u = dde.DirichletBC(geom, lambda X: u_in, boundary_inlet, component = 0)
bc_inlet_v = dde.DirichletBC(geom, lambda X: 0., boundary_inlet, component = 1)

bc_outlet_p = dde.DirichletBC(geom, lambda X: 0., boundary_outlet, component = 2)
bc_outlet_v = dde.DirichletBC(geom, lambda X: 0., boundary_outlet, component = 1)
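In words: no-slip on both walls (u = v = 0 at y = ±D/2), a uniform inlet velocity (u = u_in, v = 0 at x = -L/2), and a zero reference pressure with no transverse velocity at the outlet (p = 0, v = 0 at x = L/2).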
In [23]:
data = dde.data.PDE(geom,
                    pde,
                    [bc_wall_u, bc_wall_v, bc_inlet_u, bc_inlet_v, bc_outlet_p, bc_outlet_v, observe_u, observe_v, observe_p], 
                    num_domain = 1000, 
                    num_boundary = 500, 
                    num_test = 1000,
                    train_distribution = 'LHS')
Warning: 1000 points required, but 1035 points sampled.
In [24]:
plt.figure(figsize = (20,4))
plt.scatter(data.train_x_all[:,0], data.train_x_all[:,1], s = 0.5)
plt.scatter(observe_x[:, 0], observe_x[:, 1], c = observe_y[:, 0], s = 6.5, cmap = 'jet')
plt.scatter(observe_x[:, 0], observe_x[:, 1], s = 0.5, color='k', alpha = 0.5)
plt.xlim((0-L/2, L-L/2))
plt.ylim((0-D/2, D-D/2))
plt.xlabel('x-direction length (m)')
plt.ylabel('Distance from middle of plates (m)')
plt.title('Velocity (u)')
plt.show()
[Figure: 'Velocity (u)', collocation points and observation points plotted over the channel geometry]

3.3. Define Network and Hyper-parameters

In [25]:
layer_size = [2] + [64] * 5 + [3]
activation = "tanh"
initializer = "Glorot uniform"

net = dde.maps.FNN(layer_size, activation, initializer)

model = dde.Model(data, net)
model.compile("adam", lr = 1e-3, loss_weights = [1, 1, 1, 1, 1, 1, 1, 1, 1, 9, 9, 9])
Compiling model...
'compile' took 0.000162 s
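loss_weights must have one entry per loss term. DeepXDE lists the PDE residual losses first, followed by the boundary conditions and observation sets in the order they were passed to dde.data.PDE, so the twelve weights above are assumed to map as annotated below; the trailing weights of 9 put extra emphasis on fitting the measured u, v, p data relative to the physics and boundary terms.

# Assumed index -> loss-term mapping for the 12 loss_weights
#  0 - 2  : pde_u, pde_v, pde_cont residuals
#  3 - 8  : bc_wall_u, bc_wall_v, bc_inlet_u, bc_inlet_v, bc_outlet_p, bc_outlet_v
#  9 - 11 : observe_u, observe_v, observe_p  (each weighted 9x)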

3.4. Train (Adam Optimizer)

In [26]:
losshistory, train_state = model.train(epochs = 10000)
dde.saveplot(losshistory, train_state, issave = False, isplot = False)
Training model...

Step      Train loss                                                                                                                  Test loss                                                                                                                   Test metric
0         [2.43e-01, 1.50e-02, 1.52e-01, 2.09e-01, 9.14e-03, 1.55e+00, 6.70e-03, 4.31e-02, 6.78e-03, 1.13e+01, 1.10e-01, 1.79e+03]    [1.98e-01, 1.38e-02, 1.65e-01, 2.09e-01, 9.14e-03, 1.55e+00, 6.70e-03, 4.31e-02, 6.78e-03, 1.13e+01, 1.10e-01, 1.79e+03]    []  
1000      [9.14e-02, 2.51e-02, 9.81e-02, 4.17e-02, 2.22e-03, 6.53e-02, 1.52e-03, 3.73e-03, 1.25e-03, 3.89e-02, 1.65e-02, 4.81e-01]    [1.05e-01, 2.73e-02, 1.03e-01, 4.17e-02, 2.22e-03, 6.53e-02, 1.52e-03, 3.73e-03, 1.25e-03, 3.89e-02, 1.65e-02, 4.81e-01]    []  
2000      [2.37e-02, 1.06e-02, 7.20e-02, 2.15e-02, 9.96e-04, 6.18e-02, 1.79e-03, 9.55e-04, 1.66e-04, 2.57e-02, 8.35e-03, 1.91e-01]    [2.56e-02, 1.24e-02, 6.20e-02, 2.15e-02, 9.96e-04, 6.18e-02, 1.79e-03, 9.55e-04, 1.66e-04, 2.57e-02, 8.35e-03, 1.91e-01]    []  
3000      [8.56e-03, 5.70e-03, 7.83e-02, 1.80e-02, 2.13e-04, 5.83e-02, 2.01e-03, 1.21e-04, 1.61e-04, 2.31e-02, 1.03e-02, 1.15e-01]    [1.14e-02, 7.17e-03, 5.74e-02, 1.80e-02, 2.13e-04, 5.83e-02, 2.01e-03, 1.21e-04, 1.61e-04, 2.31e-02, 1.03e-02, 1.15e-01]    []  
4000      [1.78e-02, 5.30e-03, 8.74e-02, 1.63e-02, 1.83e-04, 5.93e-02, 3.21e-03, 1.21e-02, 1.06e-04, 2.20e-02, 1.09e-02, 9.68e-02]    [1.33e-02, 7.04e-03, 6.01e-02, 1.63e-02, 1.83e-04, 5.93e-02, 3.21e-03, 1.21e-02, 1.06e-04, 2.20e-02, 1.09e-02, 9.68e-02]    []  
5000      [5.38e-03, 4.03e-03, 8.87e-02, 1.58e-02, 2.13e-04, 5.87e-02, 4.16e-03, 4.27e-04, 6.11e-05, 2.10e-02, 1.13e-02, 5.30e-02]    [6.94e-03, 6.24e-03, 5.99e-02, 1.58e-02, 2.13e-04, 5.87e-02, 4.16e-03, 4.27e-04, 6.11e-05, 2.10e-02, 1.13e-02, 5.30e-02]    []  
6000      [4.73e-03, 3.95e-03, 8.72e-02, 1.52e-02, 2.61e-04, 5.83e-02, 5.15e-03, 1.19e-04, 1.72e-05, 2.04e-02, 1.15e-02, 3.97e-02]    [6.58e-03, 6.78e-03, 6.11e-02, 1.52e-02, 2.61e-04, 5.83e-02, 5.15e-03, 1.19e-04, 1.72e-05, 2.04e-02, 1.15e-02, 3.97e-02]    []  
7000      [5.06e-03, 3.88e-03, 8.24e-02, 1.51e-02, 2.81e-04, 5.71e-02, 6.31e-03, 1.31e-04, 9.77e-06, 1.98e-02, 1.12e-02, 3.16e-02]    [6.78e-03, 7.08e-03, 6.13e-02, 1.51e-02, 2.81e-04, 5.71e-02, 6.31e-03, 1.31e-04, 9.77e-06, 1.98e-02, 1.12e-02, 3.16e-02]    []  
8000      [5.30e-03, 3.68e-03, 7.69e-02, 1.48e-02, 2.89e-04, 5.55e-02, 7.73e-03, 1.28e-04, 3.16e-06, 1.94e-02, 1.08e-02, 2.52e-02]    [7.06e-03, 7.11e-03, 6.07e-02, 1.48e-02, 2.89e-04, 5.55e-02, 7.73e-03, 1.28e-04, 3.16e-06, 1.94e-02, 1.08e-02, 2.52e-02]    []  
9000      [9.36e-03, 4.41e-03, 7.13e-02, 1.50e-02, 3.50e-04, 5.23e-02, 9.24e-03, 2.23e-04, 1.04e-05, 1.89e-02, 1.08e-02, 2.07e-02]    [1.03e-02, 7.82e-03, 5.99e-02, 1.50e-02, 3.50e-04, 5.23e-02, 9.24e-03, 2.23e-04, 1.04e-05, 1.89e-02, 1.08e-02, 2.07e-02]    []  
10000     [1.19e-02, 5.13e-03, 6.69e-02, 1.52e-02, 4.70e-04, 4.86e-02, 1.04e-02, 5.92e-03, 1.71e-05, 1.87e-02, 1.10e-02, 3.13e-02]    [1.28e-02, 8.45e-03, 5.93e-02, 1.52e-02, 4.70e-04, 4.86e-02, 1.04e-02, 5.92e-03, 1.71e-05, 1.87e-02, 1.10e-02, 3.13e-02]    []  

Best model at step 9000:
  train loss: 2.13e-01
  test loss: 2.06e-01
  test metric: []

'train' took 363.535719 s

3.5. Train More (L-BFGS Optimizer)

In [27]:
dde.optimizers.config.set_LBFGS_options()
model.compile("L-BFGS", loss_weights = [1, 1, 1, 1, 1, 1, 1, 1, 1, 9, 9, 9])
losshistory, train_state = model.train()
dde.saveplot(losshistory, train_state, issave = False, isplot = True)
Compiling model...
'compile' took 0.000942 s

Training model...

Step      Train loss                                                                                                                  Test loss                                                                                                                   Test metric
10000     [1.19e-02, 5.13e-03, 6.69e-02, 1.52e-02, 4.70e-04, 4.86e-02, 1.04e-02, 5.92e-03, 1.71e-05, 1.87e-02, 1.10e-02, 3.13e-02]    [1.28e-02, 8.45e-03, 5.93e-02, 1.52e-02, 4.70e-04, 4.86e-02, 1.04e-02, 5.92e-03, 1.71e-05, 1.87e-02, 1.10e-02, 3.13e-02]    []  
11000     [4.52e-03, 3.82e-03, 2.92e-02, 1.17e-02, 2.16e-03, 1.27e-02, 1.84e-02, 6.36e-05, 3.78e-06, 6.70e-03, 3.04e-03, 4.22e-03]    [8.56e-03, 6.00e-03, 3.03e-02, 1.17e-02, 2.16e-03, 1.27e-02, 1.84e-02, 6.36e-05, 3.78e-06, 6.70e-03, 3.04e-03, 4.22e-03]    []  
12000     [3.19e-03, 2.64e-03, 8.26e-03, 8.32e-03, 4.53e-04, 1.22e-02, 7.11e-03, 6.87e-05, 2.83e-05, 1.99e-03, 8.69e-04, 1.30e-03]    [5.36e-03, 2.95e-03, 8.17e-03, 8.32e-03, 4.53e-04, 1.22e-02, 7.11e-03, 6.87e-05, 2.83e-05, 1.99e-03, 8.69e-04, 1.30e-03]    []  
13000     [2.02e-03, 1.20e-03, 3.65e-03, 8.22e-03, 2.81e-04, 9.10e-03, 2.92e-03, 5.14e-06, 7.42e-06, 4.49e-04, 3.09e-04, 5.99e-04]    [4.05e-03, 2.48e-03, 2.98e-03, 8.22e-03, 2.81e-04, 9.10e-03, 2.92e-03, 5.14e-06, 7.42e-06, 4.49e-04, 3.09e-04, 5.99e-04]    []  
14000     [1.21e-03, 9.97e-04, 2.45e-03, 8.27e-03, 2.89e-04, 6.63e-03, 2.10e-03, 2.84e-05, 9.26e-06, 2.27e-04, 2.57e-04, 3.44e-04]    [3.03e-03, 2.26e-03, 1.71e-03, 8.27e-03, 2.89e-04, 6.63e-03, 2.10e-03, 2.84e-05, 9.26e-06, 2.27e-04, 2.57e-04, 3.44e-04]    []  
15000     [8.52e-04, 6.03e-04, 1.72e-03, 8.88e-03, 2.99e-04, 4.90e-03, 2.21e-03, 2.12e-05, 6.36e-06, 2.37e-04, 2.24e-04, 3.72e-04]    [2.08e-03, 1.12e-03, 1.21e-03, 8.88e-03, 2.99e-04, 4.90e-03, 2.21e-03, 2.12e-05, 6.36e-06, 2.37e-04, 2.24e-04, 3.72e-04]    []  
16000     [5.31e-04, 5.08e-04, 1.15e-03, 8.24e-03, 4.11e-04, 4.42e-03, 2.46e-03, 1.01e-05, 1.57e-05, 1.82e-04, 2.47e-04, 4.10e-04]    [1.34e-03, 7.70e-04, 8.60e-04, 8.24e-03, 4.11e-04, 4.42e-03, 2.46e-03, 1.01e-05, 1.57e-05, 1.82e-04, 2.47e-04, 4.10e-04]    []  
17000     [3.94e-04, 3.61e-04, 8.44e-04, 7.75e-03, 4.39e-04, 4.27e-03, 2.59e-03, 1.66e-05, 2.42e-05, 1.57e-04, 1.94e-04, 3.95e-04]    [1.06e-03, 6.19e-04, 6.84e-04, 7.75e-03, 4.39e-04, 4.27e-03, 2.59e-03, 1.66e-05, 2.42e-05, 1.57e-04, 1.94e-04, 3.95e-04]    []  
18000     [3.63e-04, 3.17e-04, 7.21e-04, 7.28e-03, 4.40e-04, 4.36e-03, 2.50e-03, 9.66e-06, 3.48e-06, 1.14e-04, 1.61e-04, 4.09e-04]    [1.19e-03, 5.56e-04, 7.29e-04, 7.28e-03, 4.40e-04, 4.36e-03, 2.50e-03, 9.66e-06, 3.48e-06, 1.14e-04, 1.61e-04, 4.09e-04]    []  
19000     [3.25e-04, 2.85e-04, 6.56e-04, 7.20e-03, 4.50e-04, 4.07e-03, 2.46e-03, 5.23e-06, 1.04e-05, 1.02e-04, 1.51e-04, 4.04e-04]    [9.51e-04, 5.72e-04, 6.73e-04, 7.20e-03, 4.50e-04, 4.07e-03, 2.46e-03, 5.23e-06, 1.04e-05, 1.02e-04, 1.51e-04, 4.04e-04]    []  
20000     [3.02e-04, 3.51e-04, 6.30e-04, 6.81e-03, 4.96e-04, 3.89e-03, 2.40e-03, 7.56e-06, 6.01e-06, 7.78e-05, 1.66e-04, 4.13e-04]    [7.78e-04, 9.39e-04, 6.01e-04, 6.81e-03, 4.96e-04, 3.89e-03, 2.40e-03, 7.56e-06, 6.01e-06, 7.78e-05, 1.66e-04, 4.13e-04]    []  
21000     [3.59e-04, 2.79e-04, 6.48e-04, 6.60e-03, 4.71e-04, 3.59e-03, 2.30e-03, 2.78e-06, 1.67e-06, 9.32e-05, 1.39e-04, 4.37e-04]    [8.23e-04, 1.53e-03, 6.13e-04, 6.60e-03, 4.71e-04, 3.59e-03, 2.30e-03, 2.78e-06, 1.67e-06, 9.32e-05, 1.39e-04, 4.37e-04]    []  
22000     [3.36e-04, 3.37e-04, 6.59e-04, 6.44e-03, 4.92e-04, 3.44e-03, 2.17e-03, 6.91e-06, 2.39e-06, 8.05e-05, 1.51e-04, 4.01e-04]    [9.43e-04, 1.95e-03, 6.72e-04, 6.44e-03, 4.92e-04, 3.44e-03, 2.17e-03, 6.91e-06, 2.39e-06, 8.05e-05, 1.51e-04, 4.01e-04]    []  
23000     [3.47e-04, 3.10e-04, 6.53e-04, 6.53e-03, 4.91e-04, 3.13e-03, 2.14e-03, 4.15e-06, 1.50e-06, 7.24e-05, 1.37e-04, 3.86e-04]    [8.31e-04, 1.88e-03, 5.82e-04, 6.53e-03, 4.91e-04, 3.13e-03, 2.14e-03, 4.15e-06, 1.50e-06, 7.24e-05, 1.37e-04, 3.86e-04]    []  
24000     [3.37e-04, 2.83e-04, 6.65e-04, 6.19e-03, 5.35e-04, 3.11e-03, 2.14e-03, 1.83e-06, 2.46e-06, 6.44e-05, 1.41e-04, 3.61e-04]    [9.49e-04, 1.25e-03, 6.36e-04, 6.19e-03, 5.35e-04, 3.11e-03, 2.14e-03, 1.83e-06, 2.46e-06, 6.44e-05, 1.41e-04, 3.61e-04]    []  
25000     [3.76e-04, 2.93e-04, 7.04e-04, 5.88e-03, 5.56e-04, 3.06e-03, 2.14e-03, 3.03e-06, 1.31e-06, 5.38e-05, 1.40e-04, 3.40e-04]    [9.93e-04, 1.26e-03, 6.75e-04, 5.88e-03, 5.56e-04, 3.06e-03, 2.14e-03, 3.03e-06, 1.31e-06, 5.38e-05, 1.40e-04, 3.40e-04]    []  

Best model at step 25000:
  train loss: 1.35e-02
  test loss: 1.51e-02
  test metric: []

'train' took 762.536904 s

[Figures: loss history and train/test state plots produced by dde.saveplot]

3.6. Plot Results (Adam + L-BFGS)

In [28]:
samples = geom.random_points(500000)
result = model.predict(samples)
color_legend = [[0, 1.5], [-0.3, 0.3], [0, 35]]

for idx in range(3):
    plt.figure(figsize = (20, 4))
    plt.scatter(samples[:, 0],
                samples[:, 1],
                c = result[:, idx],
                s = 2,
                cmap = 'jet')
    plt.colorbar()
    plt.clim(color_legend[idx])
    plt.xlim((0-L/2, L-L/2))
    plt.ylim((0-D/2, D-D/2))
plt.tight_layout()
plt.show()
[Figures: predicted u, v, and p fields over the channel]
In [29]:
%%javascript
$.getScript('https://kmahelona.github.io/ipython_notebook_goodies/ipython_notebook_toc.js')