       - run: julia -e 'using Pkg; Pkg.add("PlutoSliderServer"); Pkg.add("FMIFlux")'
       - run: julia -e 'using PlutoSliderServer; PlutoSliderServer.export_directory("examples/pluto-src")'
-
+
       - name: Archive examples artifacts (success)
         if: success()
         uses: actions/upload-artifact@v3
         with:
           name: pluto-examples
           path: examples/pluto-src/*

-  auto-commit:
+  filecheck:
     needs: [jypiter, pluto]
+    runs-on: ubuntu-latest
+    steps:
+      - name: Download jupyter examples
+        uses: actions/download-artifact@v3
+        with:
+          name: jupyter-examples
+          path: examples/jupyter-src/
+
+      - name: Download pluto examples
+        uses: actions/download-artifact@v3
+        with:
+          name: pluto-examples
+          path: examples/pluto-src/
+
+      - name: Check if the example files generated are valid (if jupyter-examples failed, svgs are missing; jupyter command does not fail even if examples fail)
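The shell command behind the final "Check if the example files generated are valid" step is cut off in this excerpt. A minimal sketch of what such a check could look like, assuming (based only on the step name) that a successful jupyter export must leave at least one `.svg` behind because the jupyter command itself exits 0 even when examples fail:

```shell
#!/bin/sh
# Hypothetical sketch of the file-validity check; the actual command used
# by the workflow is not shown in this diff. Criterion (an assumption):
# a directory of generated examples must contain at least one SVG file.
set -eu

# Return 0 if $1 contains at least one SVG, 1 otherwise, and report the count.
check_svgs() {
    n=$(find "$1" -name '*.svg' 2>/dev/null | wc -l)
    echo "$1: $n SVG file(s)"
    [ "$n" -gt 0 ]
}

# Demo on a throwaway directory standing in for examples/jupyter-src:
demo=$(mktemp -d)
printf '<svg/>' > "$demo/plot.svg"
check_svgs "$demo"   # succeeds: one SVG present
```

In the real workflow this logic would sit in a `run:` step of the `filecheck` job, executed after both `download-artifact` steps so the generated files are present.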
README.md (+18-9)
@@ -38,15 +38,19 @@ You can evaluate FMUs inside of your loss function.
 ## What is currently supported in FMIFlux.jl?
 - building and training ME-NeuralFMUs (NeuralODEs) with support for event-handling (*DiffEqCallbacks.jl*) and discontinuous sensitivity analysis (*SciMLSensitivity.jl*)
 - building and training CS-NeuralFMUs
-- building and training NeuralFMUs consisiting of multiple FMUs
+- building and training NeuralFMUs consisting of multiple FMUs
 - building and training FMUINNs (PINNs)
 - different AD-frameworks: ForwardDiff.jl (CI-tested), ReverseDiff.jl (CI-tested, default setting), FiniteDiff.jl (not CI-tested) and Zygote.jl (not CI-tested)
-- use `Flux.jl`optimisers as well as the ones from `Optim.jl`
-- using the entire *DifferentialEquations.jl* solver suite (`autodiff=false` for implicit solvers)
+- use `Flux.jl` optimizers as well as the ones from `Optim.jl`
+- using the entire *DifferentialEquations.jl* solver suite (`autodiff=false` for implicit solvers; not all are tested, see the following section)
 - ...

 ## (Current) Limitations

+- Not all implicit solvers work for challenging, hybrid models (stiff FMUs with events); currently tested: `Rosenbrock23(autodiff=false)`.
+
+- Implicit solvers using `autodiff=true` are not supported (for now), but you can use implicit solvers with `autodiff=false`.
+
 - Sensitivity information over state change by event $\partial x^{+} / \partial x^{-}$ can't be accessed in FMI.
 These sensitivities are simplified on basis of one of the following assumptions (defined by user):
 (1) the state after event depends on nothing, so sensitivities are zero or
@@ -55,13 +59,11 @@ The second is often correct for e.g. mechanical contacts, but may lead to wrong 
 However even if the gradient might not be 100% correct in any case, gradients are often usable for optimization tasks.
 This issue is also part of the [*OpenScaling*](https://itea4.org/project/openscaling.html) research project.

-- Discontinuous systems with implicite solvers use continuous adjoints instead of automatic differentiation through the ODE solver.
-This might lead to issues, because FMUs are by design not simulatable backward in time.
-On the other hand, many FMUs are capabale of doing so.
+- Discontinuous systems with implicit solvers use continuous adjoints instead of automatic differentiation through the ODE solver.
+This might lead to issues, because FMUs are by design not capable of being simulated backwards in time.
+On the other hand, many FMUs are capable of doing so.
 This issue is also part of the [*OpenScaling*](https://itea4.org/project/openscaling.html) research project.

-- Implicit solvers using `autodiff=true` is not supported (now), but you can use implicit solvers with `autodiff=false`.
-
 - For now, only FMI version 2.0 is supported, but FMI 3.0 support is coming with the [*OpenScaling*](https://itea4.org/project/openscaling.html) research project.

 ## What is under development in FMIFlux.jl?
@@ -82,12 +84,19 @@ To keep dependencies nice and clean, the original package [*FMI.jl*](https://git
 -[*FMI.jl*](https://github.com/ThummeTo/FMI.jl): High level loading, manipulating, saving or building entire FMUs from scratch
 -[*FMIImport.jl*](https://github.com/ThummeTo/FMIImport.jl): Importing FMUs into Julia
 -[*FMIExport.jl*](https://github.com/ThummeTo/FMIExport.jl): Exporting stand-alone FMUs from Julia Code
+-[*FMIBase.jl*](https://github.com/ThummeTo/FMIBase.jl): Common concepts for import and export of FMUs
 -[*FMICore.jl*](https://github.com/ThummeTo/FMICore.jl): C-code wrapper for the FMI-standard
 -[*FMISensitivity.jl*](https://github.com/ThummeTo/FMISensitivity.jl): Static and dynamic sensitivities over FMUs
 -[*FMIBuild.jl*](https://github.com/ThummeTo/FMIBuild.jl): Compiler/Compilation dependencies for FMIExport.jl
--[*FMIFlux.jl*](https://github.com/ThummeTo/FMIFlux.jl): Machine Learning with FMUs (differentiation over FMUs)
+-[*FMIFlux.jl*](https://github.com/ThummeTo/FMIFlux.jl): Machine Learning with FMUs
 -[*FMIZoo.jl*](https://github.com/ThummeTo/FMIZoo.jl): A collection of testing and example FMUs

+## Video-Workshops
+### JuliaCon 2024 (Eindhoven University of Technology, Netherlands)
+[](https://www.youtube.com/watch?v=sQ2MXSswrSo)
+### JuliaCon 2023 (Massachusetts Institute of Technology, United States)
+[](https://www.youtube.com/watch?v=X_u0KlZizD4)
+
 ## How to cite?
 Tobias Thummerer, Johannes Stoljar and Lars Mikelsons. 2022. **NeuralFMU: presenting a workflow for integrating hybrid NeuralODEs into real-world applications.** Electronics 11, 19, 3202. [DOI: 10.3390/electronics11193202](https://doi.org/10.3390/electronics11193202)
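The event-sensitivity simplification discussed in the README changes above can be written out as formulas. With $x^{-}$ the state immediately before an event and $x^{+}$ the state immediately after, assumption (1) is stated in the diff; the wording of assumption (2) is cut off at the hunk boundary, so the identity Jacobian below is a plausible reconstruction on my part, not text taken from the source:

```latex
% The true event Jacobian \partial x^{+} / \partial x^{-} is not
% accessible via FMI, so one of two simplifications is chosen by the user.

% Assumption (1): the post-event state depends on nothing,
% so the event sensitivity vanishes:
\frac{\partial x^{+}}{\partial x^{-}} \approx 0

% Assumption (2) (reconstructed; truncated in this excerpt):
% the event passes the pre-event state through unchanged,
% giving the identity Jacobian:
\frac{\partial x^{+}}{\partial x^{-}} \approx I
```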
docs/src/examples/overview.md (+5-4)
@@ -10,15 +10,16 @@ The examples show how to combine FMUs with machine learning ("NeuralFMU") and il
 ## Examples
 -[__Simple CS-NeuralFMU__](https://thummeto.github.io/FMIFlux.jl/dev/examples/simple_hybrid_CS/): Showing how to train a NeuralFMU in Co-Simulation-Mode.
 -[__Simple ME-NeuralFMU__](https://thummeto.github.io/FMIFlux.jl/dev/examples/simple_hybrid_ME/): Showing how to train a NeuralFMU in Model-Exchange-Mode.
--[__Growing Horizon ME-NeuralFMU__](https://thummeto.github.io/FMIFlux.jl/dev/examples/growing_horizon_ME/): Growing horizon training technique for a ME-NeuralFMU.

 ## Advanced examples: Demo applications
 -[__JuliaCon 2023: Using NeuralODEs in real life applications__](https://thummeto.github.io/FMIFlux.jl/dev/examples/juliacon_2023/): An example for a NeuralODE in a real world engineering scenario.
 -[__Modelica Conference 2021: NeuralFMUs__](https://thummeto.github.io/FMIFlux.jl/dev/examples/modelica_conference_2021/): Showing basics on how to train a NeuralFMU (Contribution for the *Modelica Conference 2021*).

+## Workshops
+[Pluto](https://plutojl.org/) based notebooks that can easily be executed on your own Pluto setup.
+-[__Scientific Machine Learning using Functional Mock-up Units__](../pluto-src/SciMLUsingFMUs/SciMLUsingFMUs.html): Workshop at JuliaCon 2024 (Eindhoven University, Netherlands)
+
 ## Archived
 -[__MDPI 2022: Physics-enhanced NeuralODEs in real-world applications__](https://thummeto.github.io/FMIFlux.jl/dev/examples/mdpi_2022/): An example for a NeuralODE in a real world modeling scenario (Contribution in *MDPI Electronics 2022*).
-
-## Workshops
-[Pluto](https://plutojl.org/) based notebooks, that can easyly be executed on your own Pluto-Setup.
+-[__Growing Horizon ME-NeuralFMU__](https://thummeto.github.io/FMIFlux.jl/dev/examples/growing_horizon_ME/): Growing horizon training technique for a ME-NeuralFMU.
 -[__HybridModelingUsingFMI__](../pluto-src/HybridModelingUsingFMI/HybridModelingUsingFMI.html): Workshop at MODPROD 2024 (Linköping University, Sweden)