import escape as esc
import numpy as np
esc.require("0.9.8")
Loading material database from C:\dev\escape-core\python\src\escape\scattering\..\data\mdb\materials.db
Numerical derivatives¶
Derivatives appear everywhere in fitting and sensitivity analysis: optimizers need to know how a model changes when a parameter moves; you may want $\partial I/\partial x$ along a profile, or want to check a Jacobian numerically. ESCAPE builds models from variables (e.g. momentum transfer $Q$) and parameters (thickness, roughness, …). The functions esc.derivative, esc.derivative2, and esc.derivative3 approximate derivatives of those expressions by finite differences.
API (first order): esc.derivative(fun, x, calculate_error=False, name="", notes="") — differentiate fun with respect to x; a maxerr threshold can also be passed (see the error-reporting note below). The same optional arguments apply to derivative2 and derivative3.
Supported pairings (this is the important part):
| fun | x | Typical result type |
|---|---|---|
| real functor or variable | variable | functor_obj |
| complex functor | variable | cplx_functor_obj |
| parameter expression | parameter | parameter_obj (dependent parameter) |
| real functor or variable | parameter | functor_obj (still a function of your variables) |
| complex functor | parameter | cplx_functor_obj (still a function of your variables) |
Second and third order: use esc.derivative2(fun, x) and esc.derivative3(fun, x). Prefer derivative2 over derivative(derivative(...)) — nesting applies finite differences twice and is usually less accurate than the built-in second-order stencil.
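To see why the direct stencil wins, here is a plain NumPy sketch, independent of ESCAPE, comparing two nested central first differences against the standard second-order stencil $(f(x+h)-2f(x)+f(x-h))/h^2$ for $f=\sin$. The function names and step size are illustrative, not what ESCAPE uses internally.

```python
import numpy as np

def d1(f, x, h):
    """Central first difference."""
    return (f(x + h) - f(x - h)) / (2 * h)

def d2_nested(f, x, h):
    """Second derivative by nesting two first differences.
    Algebraically this equals (f(x+2h) - 2 f(x) + f(x-2h)) / (4 h^2),
    i.e. the standard stencil with an effective step of 2h."""
    return (d1(f, x + h, h) - d1(f, x - h, h)) / (2 * h)

def d2_direct(f, x, h):
    """Standard second-order central stencil with step h."""
    return (f(x + h) - 2 * f(x) + f(x - h)) / h**2

x0, h = 0.7, 1e-3
truth = -np.sin(x0)               # (sin)'' = -sin
err_nested = abs(d2_nested(np.sin, x0, h) - truth)
err_direct = abs(d2_direct(np.sin, x0, h) - truth)
print(err_nested, err_direct)     # nested truncation error is ~4x larger
```

Because nesting effectively doubles the step, its truncation error is about four times that of the direct stencil, which is the behavior the overlay plot below makes visible.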
Error reporting: with calculate_error=True and maxerr > 0, the estimated absolute error is checked each time you evaluate the returned object (e.g. call it like a functor or read .value); a RuntimeError is raised if abserr > maxerr.
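ESCAPE's exact error estimate is internal, but the general idea behind such checks can be sketched in plain Python: evaluate the difference quotient at two step sizes and treat their discrepancy as an error estimate. The function name and defaults here are hypothetical, not ESCAPE's actual algorithm.

```python
import math

def central_diff_with_error(f, x, h=1e-3, maxerr=0.0):
    """Central first difference with a crude step-halving error estimate.
    Illustrative only -- not ESCAPE's internal implementation."""
    d_h = (f(x + h) - f(x - h)) / (2 * h)
    d_h2 = (f(x + h / 2) - f(x - h / 2)) / h   # same quotient, half the step
    abserr = abs(d_h2 - d_h)                   # discrepancy as error estimate
    if maxerr > 0 and abserr > maxerr:
        raise RuntimeError(f"derivative error {abserr:g} > {maxerr:g}")
    return d_h2

print(central_diff_with_error(math.sin, 0.5))  # close to cos(0.5)
```

Setting maxerr to an unrealistically tight value makes the check trip on the first evaluation, which mirrors the RuntimeError shown at the end of this notebook.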
The sections below walk through the same cases in order, similar in spirit to how integration.ipynb walks through integrals and averaging.
Real functor with respect to a variable¶
Here fun is a real-valued functor (or a bare variable, which is promoted to a functor), and x is the independent variable you differentiate with respect to. The result is a new functor_obj you can call like the original.
Example: if $f(x)=x^2$, then $f'(x)=2x$. We print the result type and compare to the analytical value at one point.
x = esc.var("x")
f = x * x
df = esc.derivative(f, x)
print(df.variables)
print(type(df).__name__)
print("df(2.0) ≈", df(2.0), " (analytical: 4.0)")
[variable(name='x')]
functor_obj
df(2.0) ≈ 3.999999999657255  (analytical: 4.0)
Compare numerical and analytical curves for $f(x)=\sin(x)$, where $f'(x)=\cos(x)$, on a grid (same idea as overlaying numerical and analytical integrals in integration.ipynb).
X = esc.var("X")
f = esc.sin(X)
df = esc.derivative(f, X)
truth = esc.cos(X)
coords = np.linspace(-np.pi, np.pi, 300)
print(type(df).__name__)
print("df(2.0) ≈", df(2.0), f" (analytical: {np.cos(2.0)})")
esc.overlay(df, truth, coordinates=coords).config(
    labels=("numerical d/dx sin(X)", "cos(X)"),
    xlabel="X",
    ylabel="value",
    title="First derivative: functor w.r.t. variable",
)
functor_obj
df(2.0) ≈ -0.41614683620210224  (analytical: -0.4161468365471424)
Second and third derivatives; nested derivative vs derivative2¶
For $f(x)=\sin(x)$, $f''(x)=-\sin(x)$. Calling derivative(derivative(f, x), x) uses two finite-difference passes; derivative2(f, x) uses a single stencil aimed at the second derivative. The direct method tracks the analytical curve much more closely.
f = esc.sin(X)
d2_nested = esc.derivative(esc.derivative(f, X), X)
d2_direct = esc.derivative2(f, X)
truth2 = -esc.sin(X)
coords = np.linspace(-np.pi, np.pi, 300)
esc.overlay(d2_nested, d2_direct, truth2, coordinates=coords).config(
    labels=("nested derivative(derivative(...))", "derivative2", "-sin(X)"),
    xlabel="X",
    ylabel="value",
    title="Second derivative: prefer derivative2",
)
For $g(x)=x^3$, the third derivative is the constant $6$. esc.derivative3 is the appropriate tool.
g = X * X * X
d3g = esc.derivative3(g, X)
print("d3g(0.7) ≈", d3g(0.7), " (analytical: 6.0)")
d3g(0.7) ≈ 6.000000145849212 (analytical: 6.0)
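For reference, a third derivative can be formed from a five-point central stencil. This NumPy sketch (again outside ESCAPE, with an illustrative step size) reproduces the constant 6 for $g(x)=x^3$:

```python
import numpy as np

def d3(f, x, h):
    """Central third-derivative stencil:
    f'''(x) ≈ (f(x+2h) - 2 f(x+h) + 2 f(x-h) - f(x-2h)) / (2 h^3)."""
    return (f(x + 2*h) - 2*f(x + h) + 2*f(x - h) - f(x - 2*h)) / (2 * h**3)

g = lambda x: x**3
print(d3(g, 0.7, 1e-2))   # ≈ 6; the stencil is exact for a cubic up to round-off
```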
Real functor with respect to a parameter¶
If the model depends on a parameter $p$ and you differentiate with respect to $p$, the result is still a functor of the remaining variables. For example, $f(x;p)=x\,p$ has $\partial f/\partial p = x$.
p = esc.par("p", 1.0)
f = X * p
dfdp = esc.derivative(f, p)
print(type(dfdp).__name__)
print("∂(X*p)/∂p at X=2.5:", dfdp(2.5), " (analytical: 2.5)")
functor_obj
∂(X*p)/∂p at X=2.5: 2.499999997923139  (analytical: 2.5)
Parameter expression with respect to a parameter¶
When both fun and x are parameters (typically x is an independent parameter you optimize), the derivative is another parameter object. Example: $g(p)=p^2$ gives $g'(p)=2p$.
p = esc.par("p", 2.0)
g = p * p
dg = esc.derivative(g, p)
print(type(dg).__name__)
print("dg.value ≈", dg.value, " (analytical: 4.0)")
parameter_obj
dg.value ≈ 3.999999999657255  (analytical: 4.0)
Complex functor with respect to a variable¶
If you build a complex expression (for example with a Python 1j literal or a complex esc.cfunc), you get a cplx_functor_obj. Differentiating with respect to a real variable returns another complex functor. For $h(x)=e^{ix}$, $\frac{d}{dx}h = i\,e^{ix}$.
h = esc.exp(1j * X)
dh = esc.derivative(h, X)
print(type(h).__name__, "->", type(dh).__name__)
x0 = 0.5
num = dh(x0)
ref = 1j * np.exp(1j * x0)
print("numerical:", num)
print("reference:", ref)
print("abs error:", abs(num - ref))
cplx_functor_obj -> cplx_functor_obj
numerical: (-0.479425532472317+0.8775825597813808j)
reference: (-0.479425538604203+0.8775825618903728j)
abs error: 6.484433134854919e-09
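The same check can be done directly in NumPy: a complex-valued function of a real variable differentiates componentwise, so a central difference on $e^{ix}$ lands near $i\,e^{ix}$ (step size illustrative):

```python
import numpy as np

h = lambda x: np.exp(1j * x)                 # complex-valued function of real x
x0, dx = 0.5, 1e-6
num = (h(x0 + dx) - h(x0 - dx)) / (2 * dx)   # central difference, done in C
ref = 1j * np.exp(1j * x0)                   # analytical derivative
print(abs(num - ref))                        # real and imaginary parts both converge
```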
Multi-variable functors¶
You differentiate with respect to one variable at a time. The result functor still depends on every variable that appeared in the expression, including the one you differentiated with respect to (because the derivative itself is a function of $x$, e.g. $\partial\sin(px)/\partial x = p\cos(px)$ still depends on $x$).
Below: $F(x,y)=\sin(p x)\cos(y)$ and $\partial F/\partial x = p\cos(px)\cos(y)$. We check a single point.
Y = esc.var("Y")
p = esc.par("p", 1.0)
F = esc.sin(p * X) * esc.cos(Y)
dF_dx = esc.derivative(F, X)
print("variables:", dF_dx.variables)
xv, yv = 0.5, 0.3
analytical = p.value * np.cos(p.value * xv) * np.cos(yv)
print("numerical:", dF_dx(xv, yv))
print("analytical:", analytical)
variables: [variable(name='X'), variable(name='Y')]
numerical: 0.8383866380746513
analytical: 0.8383866435942036
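A plain NumPy cross-check of the same partial derivative, differencing in $x$ while holding $y$ fixed (step size illustrative):

```python
import numpy as np

p = 1.0
F = lambda x, y: np.sin(p * x) * np.cos(y)        # same model as above
xv, yv, h = 0.5, 0.3, 1e-6
num = (F(xv + h, yv) - F(xv - h, yv)) / (2 * h)   # partial in x, y held fixed
ana = p * np.cos(p * xv) * np.cos(yv)
print(num, ana)
```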
Optional name, notes, and maxerr¶
name and notes are passed through to the underlying object for documentation in models or logs.
With calculate_error=True, the library records an estimate of the finite-difference error at each evaluation. If the estimated error exceeds maxerr, a RuntimeError is raised.
x = esc.var("x")
f = x * x
df = esc.derivative(f, x, calculate_error=True, maxerr=1e-12, name="d_x2_dx")
v = df(1.2)
---------------------------------------------------------------------------
RuntimeError                              Traceback (most recent call last)
Cell In[10], line 5
      2 f = x * x
      4 df = esc.derivative(f, x, calculate_error=True, maxerr=1e-12, name="d_x2_dx")
----> 5 v = df(1.2)

File src/escape/core/entities.pyx:2912, in escape.core.entities.functor_obj.__call__()

RuntimeError: Numerical derivative error is greater than maximum allowed error: 0.000000 > 0.000000
Unsupported combinations¶
Not every pairing is valid, even when the theoretical result is trivial. For example, you cannot differentiate a parameter-only expression with respect to a variable, or differentiate a functor with respect to a variable or parameter it does not depend on. Allowing such cases could silently produce misinterpreted results in complicated expressions, so they raise a TypeError instead.
x = esc.var("x")
p = esc.par("p", 1.0)
try:
esc.derivative(p * p, x)
except TypeError as e:
print("TypeError (expected):", e)
See also¶
- integration.ipynb in this folder — numerical integrals and distribution-weighted averages.
- functors.ipynb — building expressions from variables and parameters.
- parameters.ipynb — independent vs dependent parameters in fits.