Commit 8031977

Separate DiscreteSINDy as a new class (#654)
* CLN: Separate DiscreteSINDy as its own class
* CLN: ensure consistency of shapes and types of x_next and u, and compatibility with multiple trajectories for DiscreteSINDy
* TST: update tests for DiscreteSINDy
* DOC: Update documentation for DiscreteSINDy
* CLN: use predict from BaseSINDy for SINDy and DiscreteSINDy models
* CLN: Remove SINDy.differentiate()
* FIX: process trajectories only when x_dot is None
* CLN: lint the code
* FIX: fix _tensordot_to_einsum error
* CLN: remove discrete_time from scikit-time API
* CLN: remove SINDy.differentiate() and update DiscreteSINDy for feature overview notebook
* FIX: remove SINDyPI conditionals from DiscreteSINDy.print()
* TST: add test for validate_control_variables function
* DOC: update documentation and add logistic map example in DiscreteSINDy documentation
* CLN: separate equations function for DiscreteSINDy and code cleanup
* DOC: remove discrete SINDy examples from feature overview
* CLN: remove unnecessary import from core
* DOC: fix logistic map plot in DiscreteSINDy documentation
* FIX: show plot source code for discrete sindy documentation
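The headline change splits discrete-time modeling (maps of the form x_{n+1} = f(x_n), previously reached via SINDy(discrete_time=True)) into a dedicated DiscreteSINDy class. The underlying regression needs no derivative estimate: it fits the next state directly against library features of the current state. A minimal NumPy sketch of that idea on the logistic map, using plain least squares with one STLSQ-style thresholding pass rather than the pysindy API:

```python
import numpy as np

# Simulate the logistic map x_{n+1} = 3.6 * x_n * (1 - x_n)
n_steps = 1000
x = np.empty(n_steps)
x[0] = 0.5
for i in range(1, n_steps):
    x[i] = 3.6 * x[i - 1] * (1 - x[i - 1])

# Discrete-time SINDy regresses x_{n+1} on library features of x_n
# (no derivative estimation needed, unlike the continuous-time case).
x_n, x_next = x[:-1], x[1:]
theta = np.column_stack([np.ones_like(x_n), x_n, x_n**2])  # library [1, x, x^2]
xi, *_ = np.linalg.lstsq(theta, x_next, rcond=None)
xi[np.abs(xi) < 0.1] = 0.0  # one STLSQ-style hard-thresholding pass
```

On this noiseless trajectory the thresholded fit recovers coefficients close to [0, 3.6, -3.6], i.e. x_{n+1} = 3.6 x_n - 3.6 x_n^2, matching the map.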
1 parent daac208 commit 8031977

File tree: 13 files changed, +1455 −1170 lines

docs/conf.py (4 additions, 0 deletions)

@@ -31,6 +31,7 @@
     "sphinx.ext.mathjax",
     "sphinx.ext.intersphinx",
     "IPython.sphinxext.ipython_console_highlighting",
+    "matplotlib.sphinxext.plot_directive",
 ]

 nb_execution_mode = "off"
@@ -57,6 +58,9 @@
 html_show_sphinx = False
 html_show_copyright = True

+plot_html_show_source_link = False
+plot_html_show_formats = False
+
 default_role = "any"
 html_sourcelink_suffix = ""

examples/1_feature_overview/example.ipynb (881 additions, 759 deletions; large diff not rendered by default)

examples/1_feature_overview/example.py (3 additions, 83 deletions)

@@ -114,7 +114,7 @@ def ignore_specific_warnings():
 x_dot_test_predicted = model.predict(x_test)

 # Compute derivatives with a finite difference method, for comparison
-x_dot_test_computed = model.differentiate(x_test, t=dt)
+x_dot_test_computed = model.differentiation_method(x_test, t=dt)

 fig, axs = plt.subplots(x_test.shape[1], 1, sharex=True, figsize=(7, 9))
 for i in range(x_test.shape[1]):
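This hunk replaces the removed SINDy.differentiate() convenience with a direct call to the model's differentiation method. With the default FiniteDifference scheme that amounts to centered differences in the interior; a plain-NumPy sketch of the idea, using np.gradient as a stand-in rather than the pysindy implementation:

```python
import numpy as np

dt = 0.01
t = np.arange(0.0, 2.0, dt)
x = np.sin(t)

# Second-order centered differences in the interior, one-sided at the
# boundaries, approximating x_dot = cos(t) from samples of x = sin(t).
x_dot = np.gradient(x, dt)

# Interior error is O(dt^2); exclude the lower-order boundary points.
max_err = np.max(np.abs(x_dot[1:-1] - np.cos(t[1:-1])))
```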
@@ -149,30 +149,6 @@ def ignore_specific_warnings():

 fig.show()

-# %% [markdown]
-# ## Discrete time dynamical system (map)
-
-# %%
-
-
-def f(x):
-    return 3.6 * x * (1 - x)
-
-
-if __name__ != "testing":
-    n_steps = 1000
-else:
-    n_steps = 10
-eps = 0.001  # Noise level
-x_train_map = np.zeros((n_steps))
-x_train_map[0] = 0.5
-for i in range(1, n_steps):
-    x_train_map[i] = f(x_train_map[i - 1]) + eps * np.random.randn()
-model = ps.SINDy(discrete_time=True)
-model.fit(x_train_map, t=1)
-
-model.print()
-
 # %% [markdown]
 # ## Optimization options
 # In this section we provide examples of different parameters accepted by the built-in sparse regression optimizers `STLSQ`, `SR3`, `ConstrainedSR3`, `MIOSR`, `SSR`, and `FROLS`. The `Trapping` optimizer is not straightforward to use; please check out Example 8 for some examples. We also show how to use a scikit-learn sparse regressor with PySINDy.
@@ -782,7 +758,7 @@ def f(x):
 x_dot_test_predicted = model.predict(x_test)

 # Compute derivatives with a finite difference method, for comparison
-x_dot_test_computed = model.differentiate(x_test, t=dt)
+x_dot_test_computed = model.differentiation_method(x_test, t=dt)

 fig, axs = plt.subplots(x_test.shape[1], 1, sharex=True, figsize=(7, 9))
 for i in range(x_test.shape[1]):
@@ -905,7 +881,7 @@ def u_fun(t):
 x_dot_test_predicted = model.predict(x_test, u=u_test)

 # Compute derivatives with a finite difference method, for comparison
-x_dot_test_computed = model.differentiate(x_test, t=dt)
+x_dot_test_computed = model.differentiation_method(x_test, t=dt)

 fig, axs = plt.subplots(x_test.shape[1], 1, sharex=True, figsize=(7, 9))
 for i in range(x_test.shape[1]):
@@ -1053,62 +1029,6 @@ def u_fun(t):
 model.fit(x_train, t=t)
 model.print()

-# %% [markdown]
-# ## SINDy with control parameters (SINDyCP)
-# The control input in PySINDy can be used to discover equations parameterized by control parameters in conjunction with the `ParameterizedLibrary`. We demonstrate on the logistic map
-# $$ x_{n+1} = r x_n(1-x_n)$$
-# which depends on a single parameter $r$.
-
-# %%
-# Iterate the map and drop the initial 500-step transient. The behavior is chaotic for r>3.6.
-if __name__ != "testing":
-    num = 1000
-    N = 1000
-    N_drop = 500
-else:
-    num = 20
-    N = 20
-    N_drop = 10
-r0 = 3.5
-rs = r0 + np.arange(num) / num * (4 - r0)
-xss = []
-for r in rs:
-    xs = []
-    x = 0.5
-    for n in range(N + N_drop):
-        if n >= N_drop:
-            xs = xs + [x]
-        x = r * x * (1 - x)
-    xss = xss + [xs]
-
-plt.figure(figsize=(4, 4), dpi=100)
-for ind in range(num):
-    plt.plot(np.ones(N) * rs[ind], xss[ind], ",", alpha=0.1, c="black", rasterized=True)
-plt.xlabel("$r$")
-plt.ylabel("$x_n$")
-plt.show()
-
-# %% [markdown]
-# We construct a `parameter_library` and a `feature_library` to act on the input data `x` and the control input `u` independently. The `ParameterizedLibrary` is composed of products of the two libraries' output features. This enables fine control over the library features, which is especially useful in the case of PDEs like those arising in pattern formation modeling. See this [notebook](https://github.com/dynamicslab/pysindy/blob/master/examples/17_parameterized_pattern_formation/17_parameterized_pattern_formation.ipynb) for examples.
-
-# %%
-# use four parameter values as training data
-rs_train = [3.6, 3.7, 3.8, 3.9]
-xs_train = [np.array(xss[np.where(np.array(rs) == r)[0][0]]) for r in rs_train]
-
-feature_lib = ps.PolynomialLibrary(degree=3, include_bias=True)
-parameter_lib = ps.PolynomialLibrary(degree=1, include_bias=True)
-lib = ps.ParameterizedLibrary(
-    feature_library=feature_lib,
-    parameter_library=parameter_lib,
-    num_features=1,
-    num_parameters=1,
-)
-opt = ps.STLSQ(threshold=1e-1, normalize_columns=False)
-model = ps.SINDy(feature_library=lib, optimizer=opt, discrete_time=True)
-model.fit(xs_train, u=rs_train, t=1, feature_names=["x", "r"])
-model.print()
-
 # %% [markdown]
 # ## PDEFIND Feature Overview
 # PySINDy now supports SINDy for PDE identification (PDE-FIND) (Rudy, Samuel H., Steven L. Brunton, Joshua L. Proctor, and J. Nathan Kutz. "Data-driven discovery of partial differential equations." Science Advances 3, no. 4 (2017): e1602614.). We illustrate a basic example on Burgers' equation:
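The SINDyCP cell removed from this file describes `ParameterizedLibrary` as being composed of products of the feature library's and parameter library's output features. That tensor-product construction can be sketched in plain NumPy (the helper below is a toy stand-in, not the pysindy class; the commit's `_tensordot_to_einsum` fix concerns contractions of this kind inside the real library):

```python
import numpy as np


def product_features(x, r):
    """Toy stand-in for ParameterizedLibrary: all pairwise products of
    feature-library terms [1, x, x^2, x^3] and parameter-library terms [1, r]."""
    feats = np.column_stack([np.ones_like(x), x, x**2, x**3])  # degree 3 in x
    params = np.column_stack([np.ones_like(r), r])             # degree 1 in r
    # einsum forms every product feats[:, i] * params[:, j], then flattens
    return np.einsum("ni,nj->nij", feats, params).reshape(len(x), -1)


x = np.array([0.5, 0.25])   # state samples
r = np.array([3.6, 3.8])    # control-parameter samples
theta = product_features(x, r)
# 4 feature terms x 2 parameter terms = 8 library columns per sample
```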

pyproject.toml (1 addition, 0 deletions)

@@ -60,6 +60,7 @@ docs = [
     "sphinx==8.2.3",
     "pyyaml",
     "sphinxcontrib-apidoc",
+    "matplotlib"
 ]
 miosr = [
     "gurobipy>=9.5.1,!=10.0.0"

pysindy/__init__.py (3 additions, 0 deletions)

@@ -11,7 +11,9 @@
 from . import optimizers
 from . import deeptime
 from . import utils
+
 from ._core import SINDy
+from ._core import DiscreteSINDy
 from ._core import AxesArray
 from .differentiation import BaseDifferentiation
 from .differentiation import FiniteDifference
@@ -65,6 +67,7 @@

 __all__ = [
     "SINDy",
+    "DiscreteSINDy",
     "differentiation",
     "feature_library",
     "optimizers",
0 commit comments