
Commit

merge fixes
TorkelE committed Feb 2, 2024
1 parent 7e613d1 commit 7e2108c
Showing 2 changed files with 6 additions and 5 deletions.
7 changes: 4 additions & 3 deletions docs/pages.jl
@@ -17,8 +17,9 @@ pages = Any["Home" => "index.md",
"catalyst_applications/nonlinear_solve.md",
"catalyst_applications/bifurcation_diagrams.md"],
"Inverse Problems" => Any["inverse_problems/optimization_ode_param_fitting.md",
"inverse_problems/petab_ode_param_fitting.md",
"inverse_problems/structural_identifiability.md",
"Inverse problem examples" => Any["inverse_problems/examples/ode_fitting_oscillation.md"]],
"inverse_problems/petab_ode_param_fitting.md",
"inverse_problems/behaviour_optimisation.md",
"inverse_problems/structural_identifiability.md",
"Inverse problem examples" => Any["inverse_problems/examples/ode_fitting_oscillation.md"]],
"FAQs" => "faqs.md",
"API" => "api/catalyst_api.md"]
4 changes: 2 additions & 2 deletions (second changed file: the behaviour optimisation tutorial page, listed in pages.jl above as inverse_problems/behaviour_optimisation.md)
@@ -1,4 +1,4 @@
-# Optimization for non-data fitting purposes
+# [Optimization for non-data fitting purposes](@id behaviour_optimisation)
In previous tutorials we have described how to use [PEtab.jl](@ref petab_parameter_fitting) and [Optimization.jl](@ref optimization_parameter_fitting) for parameter fitting. This involves solving an optimisation problem (to find the parameter set yielding the best model-to-data fit). There are, however, other situations that require solving optimisation problems. Typically, these involve the creation of a custom cost function, whose optimum can then be found using Optimization.jl. In this tutorial we will describe this process, demonstrating how parameter space can be searched to find values that achieve a desired system behaviour. A more thorough description of how to solve these problems is provided by [Optimization.jl's documentation](https://docs.sciml.ai/Optimization/stable/) and the literature [^1].
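To make this workflow concrete, the following is a minimal, self-contained sketch of minimising a custom cost function with Optimization.jl; the toy quadratic cost, the variable names, and the derivative-free `NelderMead` optimiser are illustrative assumptions, not the tutorial's actual code:
```julia
using Optimization, OptimizationOptimJL

# A stand-in cost function; in the tutorial this would instead simulate the model
# and score how close its behaviour is to the desired one.
toy_cost(x, p) = (x[1] - 1.0)^2 + (x[2] + 2.0)^2

initial_guess = [0.0, 0.0]
opt_prob = OptimizationProblem(toy_cost, initial_guess)
opt_sol = solve(opt_prob, NelderMead())  # derivative-free optimiser from Optim.jl
opt_sol.u                                # minimiser, here ≈ [1.0, -2.0]
```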

## Maximising the pulse amplitude of an incoherent feed forward loop.
@@ -71,7 +71,7 @@ For this model, it turns out that $Z$'s maximum pulse amplitude is equal to twic
There are several possible modifications to our problem under which it would actually have parameters. E.g. our model might have additional parameters (e.g. a degradation rate) which we would like to keep fixed throughout the optimisation process. If we then wanted to run the optimisation process for several different values of these fixed parameters, we could make them parameters of our `OptimizationProblem` (with their values provided as a third argument, after `initial_guess`).
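As a hedged illustration of this point, fixed values can be passed as the third argument of the `OptimizationProblem` constructor and are then available as `p` inside the cost function; the toy cost and names below are assumptions made for the example:
```julia
using Optimization, OptimizationOptimJL

# `p[1]` plays the role of a fixed parameter (e.g. a degradation rate) that is not optimised over.
toy_cost(x, p) = (x[1] - p[1])^2

initial_guess = [0.0]
fixed_params = [2.5]                                   # held constant during the optimisation
opt_prob = OptimizationProblem(toy_cost, initial_guess, fixed_params)
solve(opt_prob, NelderMead()).u                        # ≈ [2.5]
```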

## Utilising automatic differentiation
-Optimisation methods can be divided into differentiation-free and differentiation-based optimisation methods. E.g. consider finding the minimum of the function $f(x) = x^2$, given some initial guess of $x$. Here, we can simply compute the differential and descend along it until we find $x=0$ (admittedly, for this simple problem the minimum can be computed directly). This principle forms the basis of optimisation methods such as gradient descent, which utilises information of a function's differential to minimise it. When attempting to find a global minimum, to avoid getting stuck in local minimums, these methods are often augmented by additional routines. While the differention of most algebraic functions is trivial, it turns out that even complicated functions (such as the one we used above) can be differentiated computationally through the use of [*automatic differentiation* (AD)](https://en.wikipedia.org/wiki/Automatic_differentiation).
+Optimisation methods can be divided into differentiation-free and differentiation-based optimisation methods. E.g. consider finding the minimum of the function $f(x) = x^2$, given some initial guess of $x$. Here, we can simply compute the differential and descend along it until we find $x=0$ (admittedly, for this simple problem the minimum can be computed directly). This principle forms the basis of optimisation methods such as gradient descent, which utilises information of a function's differential to minimise it. When attempting to find a global minimum, to avoid getting stuck in local minimums, these methods are often augmented by additional routines. While the differentiation of most algebraic functions is trivial, it turns out that even complicated functions (such as the one we used above) can be differentiated computationally through the use of [*automatic differentiation* (AD)](https://en.wikipedia.org/wiki/Automatic_differentiation).
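To spell out the gradient-descent idea described above, here is a small illustrative sketch (not part of the commit) that minimises $f(x) = x^2$ by repeatedly stepping against its derivative $f'(x) = 2x$; the step size and iteration count are arbitrary choices:
```julia
# Plain gradient descent on f(x) = x^2, whose derivative is f'(x) = 2x.
function gradient_descent(x0; step_size = 0.1, steps = 100)
    x = x0
    for _ in 1:steps
        x -= step_size * 2x   # move against the gradient
    end
    return x
end

gradient_descent(5.0)   # ≈ 0.0, the minimiser of f
```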

Through packages such as [ForwardDiff.jl](/~https://github.com/JuliaDiff/ForwardDiff.jl), [ReverseDiff.jl](/~https://github.com/JuliaDiff/ReverseDiff.jl), and [Zygote.jl](/~https://github.com/FluxML/Zygote.jl), Julia supports AD for most code. Specifically for code involving simulation of differential equations, differentiation is supported by [SciMLSensitivity.jl](/~https://github.com/SciML/SciMLSensitivity.jl). Generally, AD can be used without specific knowledge from the user; however, it requires an additional step in the construction of our `OptimizationProblem`. Here, we create a [specialised `OptimizationFunction` from our cost function](https://docs.sciml.ai/Optimization/stable/API/optimization_function/#optfunction), to which we also provide our choice of AD method. There are [several alternatives](https://docs.sciml.ai/Optimization/stable/API/optimization_function/#Automatic-Differentiation-Construction-Choice-Recommendations), and in our case we will use `AutoForwardDiff()` (a good choice for small optimisation problems). We can then create a new `OptimizationProblem` using our updated cost function:
```@example behaviour_optimization
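# The actual contents of this example block are collapsed in the diff view above.
# The lines below are only an illustrative, self-contained sketch of how an
# AD-enabled OptimizationProblem is typically assembled; the toy cost function,
# the variable names, and the BFGS optimiser are assumptions, not the tutorial's code.
using Optimization, OptimizationOptimJL, ForwardDiff

toy_cost(x, p) = (x[1] - 1.0)^2 + x[2]^2                        # stand-in for the tutorial's cost function
opt_func = OptimizationFunction(toy_cost, Optimization.AutoForwardDiff())
opt_prob = OptimizationProblem(opt_func, [5.0, 5.0])
opt_sol = solve(opt_prob, BFGS())                               # gradient-based optimiser using the AD-provided gradients
```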
