
Commit f1db228

V0.13.0 (#133)
* fixed new train method layout in examples * further adjustments * allow for DiffEq default solver heurisitc * fix action * revert change * Patch example action (#128) * changes for debugging * test non escape for awk * changed escapes for debugging * cleanup * doc fix for FMIFlux.train! (#127) * changes fro FMIBase * adaptions for FMIBase * adjustment for Julia 1.9+ extension system * modified examples * revert tutorial/workshop * Added checks to Example action (#144) * Update Example.yml Added check for success of jupyter examples, fail action if example-building fails, prevents autocommit to examples branch * relaxed compats Update Project.toml --------- Co-authored-by: ThummeTo <83663542+ThummeTo@users.noreply.github.com> * minor adaptions * updated codecov action v4 * testing tests * modified state change sampling * test * switched to linux FMU * make assert a warning * fixed some tests * fixed tests, examples, workshop, typos * updated tutorials --------- Co-authored-by: Simon Exner <43469235+0815Creeper@users.noreply.github.com>
1 parent dbadbed commit f1db228

39 files changed: +1250 −1289 lines

.github/workflows/Example.yml (+35 −13)

@@ -20,24 +20,24 @@ jobs:
       fail-fast: false
       matrix:
         os: [windows-latest] # , ubuntu-latest]
-        file-name: [growing_horizon_ME, modelica_conference_2021, simple_hybrid_CS, simple_hybrid_ME, mdpi_2022, juliacon_2023]
+        file-name: [simple_hybrid_CS, simple_hybrid_ME, juliacon_2023]
         julia-version: ['1.8']
         julia-arch: [x64]
         experimental: [false]
-
+
     steps:
       - name: "Check out repository"
         uses: actions/checkout@v3
-
+
       - name: "Set up Julia"
         uses: julia-actions/setup-julia@v1
         with:
           version: ${{ matrix.julia-version }}
           arch: ${{ matrix.julia-arch }}
-
+
       - name: "Install dependencies"
         run: julia --project=examples/ -e 'using Pkg; Pkg.develop(PackageSpec(path=pwd())); Pkg.instantiate()'
-
+
       - name: "Install packages"
         run: pip install jupyter nbconvert

@@ -48,7 +48,7 @@ jobs:
           jupyter nbconvert --ExecutePreprocessor.kernel_name="julia-1.8" --to notebook --inplace --execute ${{ env.FILE }}
           jupyter nbconvert --to script ${{ env.FILE }}
           jupyter nbconvert --to markdown ${{ env.FILE }}
-
+
       - name: "Fix GIFs"
         run: |
           echo "starting gif fixing"
@@ -57,7 +57,7 @@ jobs:
           awk '{if($0~/<img src="data:image\/gif;base64,[[:alpha:],[:digit:],\/,+,=]*" \/>/) {sub(/<img src="data:image\/gif;base64,[[:alpha:],[:digit:],\/,+,=]*" \/>/,"![gif](${{ matrix.file-name }}_files\/gif_"++i".gif)")}}1' examples/jupyter-src/${{ matrix.file-name }}.md > examples/jupyter-src/tmp_${{ matrix.file-name }}.md
           mv -Force examples/jupyter-src/tmp_${{ matrix.file-name }}.md examples/jupyter-src/${{ matrix.file-name }}.md
           echo "gifs should be fixed"
-
+
       - name: Archive examples artifacts (success)
         if: success() && matrix.os == 'windows-latest'
         uses: actions/upload-artifact@v3
@@ -70,30 +70,52 @@ jobs:
     steps:
       - name: "Check out repository"
        uses: actions/checkout@v3
-
+
      - name: "Set up Julia"
        uses: julia-actions/setup-julia@v1
        with:
          version: '1.10'
-
+
     - run: julia -e 'using Pkg; Pkg.add("PlutoSliderServer"); Pkg.add("FMIFlux")'
     - run: julia -e 'using PlutoSliderServer; PlutoSliderServer.export_directory("examples/pluto-src")'
-
+
     - name: Archive examples artifacts (success)
       if: success()
       uses: actions/upload-artifact@v3
       with:
         name: pluto-examples
        path: examples/pluto-src/*

-  auto-commit:
+  filecheck:
     needs: [jypiter, pluto]
+    runs-on: ubuntu-latest
+    steps:
+      - name: Download jupyter examples
+        uses: actions/download-artifact@v3
+        with:
+          name: jupyter-examples
+          path: examples/jupyter-src/
+
+      - name: Download pluto examples
+        uses: actions/download-artifact@v3
+        with:
+          name: pluto-examples
+          path: examples/pluto-src/
+
+      - name: Check if the example files generated are valid (if jupyter-examples failed, svgs are missing; jupyter command does not fail even if examples fail)
+        uses: andstor/file-existence-action@v3
+        with:
+          files: "examples/jupyter-src/*/*.svg"
+          fail: true
+
+  auto-commit:
+    needs: [jypiter, pluto, filecheck]
     if: github.event_name != 'pull_request'
     runs-on: ubuntu-latest
     steps:
       - name: Check out repository
         uses: actions/checkout@v3
-
+
       - name: Download jupyter examples
         uses: actions/download-artifact@v3
         with:
@@ -130,7 +152,7 @@ jobs:
           git add ${{ env.EXAMPLES_PATH }}
           git commit -m "${{ env.CI_COMMIT_MESSAGE }}"
           git push origin examples
-
+
   call-docu:
     needs: auto-commit
     if: github.event_name != 'pull_request'
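The "Fix GIFs" step above is the subtlest part of this workflow: `nbconvert` embeds animated GIFs in the exported markdown as base64 `<img>` tags, and the awk one-liner swaps each for a numbered file reference so the examples branch stays small. A minimal Python sketch of that same substitution (file names and markdown content here are illustrative, not taken from the repository):

```python
import re

# nbconvert output embeds GIFs as base64 <img> tags (payloads shortened here)
markdown = (
    "# Example\n"
    '<img src="data:image/gif;base64,R0lGOD/+==" />\n'
    "some text\n"
    '<img src="data:image/gif;base64,Qk0abc123==" />\n'
)

counter = 0

def gif_ref(match: re.Match) -> str:
    # replace the inline GIF with a reference to an extracted gif file,
    # numbered in order of appearance (as the awk "++i" does)
    global counter
    counter += 1
    return f"![gif](simple_hybrid_ME_files/gif_{counter}.gif)"

# same character class idea as the awk bracket expression: base64 alphabet
pattern = r'<img src="data:image/gif;base64,[A-Za-z0-9/+=]*" />'
fixed = re.sub(pattern, gif_ref, markdown)
print(fixed)
```

The workflow then writes `fixed` back over the original markdown file; the CI version additionally parameterizes the target directory with `${{ matrix.file-name }}`.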

.github/workflows/TestLTS.yml (+1 −11)

@@ -54,14 +54,4 @@ jobs:

       # Run the tests
       - name: "Run tests"
-        uses: julia-actions/julia-runtest@v1
-
-      # Preprocess Coverage
-      - name: "Preprocess Coverage"
-        uses: julia-actions/julia-processcoverage@v1
-
-      # Run codecov
-      - name: "Run CodeCov"
-        uses: codecov/codecov-action@v3
-        with:
-          file: lcov.info
+        uses: julia-actions/julia-runtest@v1

.github/workflows/TestLatest.yml (+4 −2)

@@ -62,6 +62,8 @@ jobs:

       # Run codecov
       - name: "Run CodeCov"
-        uses: codecov/codecov-action@v3
+        uses: codecov/codecov-action@v4
+        env:
+          CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }}
         with:
-          file: lcov.info
+          file: lcov.info

Project.toml (+16 −13)

@@ -1,30 +1,33 @@
 name = "FMIFlux"
 uuid = "fabad875-0d53-4e47-9446-963b74cae21f"
-version = "0.12.2"
+version = "0.13.0"

 [deps]
 Colors = "5ae59095-9a9b-59fe-a467-6f913c188581"
 DifferentiableEigen = "73a20539-4e65-4dcb-a56d-dc20f210a01b"
-DifferentialEquations = "0c46a032-eb83-5123-abaf-570d42b7fbaa"
 FMIImport = "9fcbc62e-52a0-44e9-a616-1359a0008194"
 FMISensitivity = "3e748fe5-cd7f-4615-8419-3159287187d2"
 Flux = "587475ba-b771-5e3f-ad9e-33799f191a9c"
 Optim = "429524aa-4258-5aef-a3af-852621145aeb"
+OrdinaryDiffEq = "1dea7af3-3e70-54e6-95c3-0bf5283fa5ed"
 Printf = "de0858da-6303-5e67-8744-51eddeeeb8d7"
-ProgressMeter = "92933f4c-e287-5a05-a399-4b506db050ca"
-Requires = "ae029012-a4dd-5104-9daa-d747884805df"
 Statistics = "10745b16-79ce-11e8-11f9-7d13ad32a3b2"
 ThreadPools = "b189fb0b-2eb5-4ed4-bc0c-d34c51242431"

+[weakdeps]
+JLD2 = "033835bb-8acc-5ee8-8aae-3f567f8a3819"
+
+[extensions]
+JLD2Ext = ["JLD2"]
+
 [compat]
-Colors = "0.12.8"
+Colors = "0.12"
 DifferentiableEigen = "0.2.0"
-DifferentialEquations = "7.7.0 - 7.12"
-FMIImport = "0.16.4"
-FMISensitivity = "0.1.4"
-Flux = "0.13.0 - 0.14"
-Optim = "1.7.0"
-ProgressMeter = "1.7.0 - 1.9"
-Requires = "1.3.0"
-ThreadPools = "2.1.1"
+FMIImport = "1.0.0"
+FMISensitivity = "0.2.0"
+Flux = "0.9 - 0.14"
+Optim = "1.6"
+OrdinaryDiffEq = "6.0"
+Statistics = "1"
+ThreadPools = "2.1"
 julia = "1.6"

README.md (+18 −9)

@@ -38,15 +38,19 @@ You can evaluate FMUs inside of your loss function.
 ## What is currently supported in FMIFlux.jl?
 - building and training ME-NeuralFMUs (NeuralODEs) with support for event-handling (*DiffEqCallbacks.jl*) and discontinuous sensitivity analysis (*SciMLSensitivity.jl*)
 - building and training CS-NeuralFMUs
-- building and training NeuralFMUs consisiting of multiple FMUs
+- building and training NeuralFMUs consisting of multiple FMUs
 - building and training FMUINNs (PINNs)
 - different AD-frameworks: ForwardDiff.jl (CI-tested), ReverseDiff.jl (CI-tested, default setting), FiniteDiff.jl (not CI-tested) and Zygote.jl (not CI-tested)
-- use `Flux.jl` optimisers as well as the ones from `Optim.jl`
-- using the entire *DifferentialEquations.jl* solver suite (`autodiff=false` for implicit solvers)
+- use `Flux.jl` optimizers as well as the ones from `Optim.jl`
+- using the entire *DifferentialEquations.jl* solver suite (`autodiff=false` for implicit solvers, not all are tested, see following section)
 - ...

 ## (Current) Limitations

+- Not all implicit solvers work for challenging, hybrid models (stiff FMUs with events), currently tested are: `Rosenbrock23(autodiff=false)`.
+
+- Implicit solvers using `autodiff=true` is not supported (now), but you can use implicit solvers with `autodiff=false`.
+
 - Sensitivity information over state change by event $\partial x^{+} / \partial x^{-}$ can't be accessed in FMI.
 These sensitivities are simplified on basis of one of the following assumptions (defined by user):
 (1) the state after event depends on nothing, so sensitivities are zero or
@@ -55,13 +59,11 @@ The second is often correct for e.g. mechanical contacts, but may lead to wrong
 However even if the gradient might not be 100% correct in any case, gradients are often usable for optimization tasks.
 This issue is also part of the [*OpenScaling*](https://itea4.org/project/openscaling.html) research project.

-- Discontinuous systems with implicite solvers use continuous adjoints instead of automatic differentiation through the ODE solver.
-This might lead to issues, because FMUs are by design not simulatable backward in time.
-On the other hand, many FMUs are capabale of doing so.
+- Discontinuous systems with implicit solvers use continuous adjoints instead of automatic differentiation through the ODE solver.
+This might lead to issues, because FMUs are by design not capable of being simulated backwards in time.
+On the other hand, many FMUs are capable of doing so.
 This issue is also part of the [*OpenScaling*](https://itea4.org/project/openscaling.html) research project.

-- Implicit solvers using `autodiff=true` is not supported (now), but you can use implicit solvers with `autodiff=false`.
-
 - For now, only FMI version 2.0 is supported, but FMI 3.0 support is coming with the [*OpenScaling*](https://itea4.org/project/openscaling.html) research project.

 ## What is under development in FMIFlux.jl?
@@ -82,12 +84,19 @@ To keep dependencies nice and clean, the original package [*FMI.jl*](https://git
 - [*FMI.jl*](https://github.com/ThummeTo/FMI.jl): High level loading, manipulating, saving or building entire FMUs from scratch
 - [*FMIImport.jl*](https://github.com/ThummeTo/FMIImport.jl): Importing FMUs into Julia
 - [*FMIExport.jl*](https://github.com/ThummeTo/FMIExport.jl): Exporting stand-alone FMUs from Julia Code
+- [*FMIBase.jl*](https://github.com/ThummeTo/FMIBase.jl): Common concepts for import and export of FMUs
 - [*FMICore.jl*](https://github.com/ThummeTo/FMICore.jl): C-code wrapper for the FMI-standard
 - [*FMISensitivity.jl*](https://github.com/ThummeTo/FMISensitivity.jl): Static and dynamic sensitivities over FMUs
 - [*FMIBuild.jl*](https://github.com/ThummeTo/FMIBuild.jl): Compiler/Compilation dependencies for FMIExport.jl
-- [*FMIFlux.jl*](https://github.com/ThummeTo/FMIFlux.jl): Machine Learning with FMUs (differentiation over FMUs)
+- [*FMIFlux.jl*](https://github.com/ThummeTo/FMIFlux.jl): Machine Learning with FMUs
 - [*FMIZoo.jl*](https://github.com/ThummeTo/FMIZoo.jl): A collection of testing and example FMUs

+## Video-Workshops
+### JuliaCon 2024 (Eindhoven University of Technology, Netherlands)
+[![YouTube Video of Workshop](https://img.youtube.com/vi/sQ2MXSswrSo/0.jpg)](https://www.youtube.com/watch?v=sQ2MXSswrSo)
+### JuliaCon 2023 (Massachusetts Institute of Technology, United States)
+[![YouTube Video of Workshop](https://img.youtube.com/vi/X_u0KlZizD4/0.jpg)](https://www.youtube.com/watch?v=X_u0KlZizD4)
+
 ## How to cite?
 Tobias Thummerer, Johannes Stoljar and Lars Mikelsons. 2022. **NeuralFMU: presenting a workflow for integrating hybrid NeuralODEs into real-world applications.** Electronics 11, 19, 3202. [DOI: 10.3390/electronics11193202](https://doi.org/10.3390/electronics11193202)
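The README's limitations section describes two user-selectable simplifications of the event sensitivity $\partial x^{+} / \partial x^{-}$, which FMI does not expose. As a sketch, they can be written as (the identity form for assumption 2 is inferred from the "mechanical contact" remark, not stated explicitly in the source):

$$\text{(1)}\quad \frac{\partial x^{+}}{\partial x^{-}} = 0 \qquad\qquad \text{(2)}\quad \frac{\partial x^{+}}{\partial x^{-}} = I$$

Under (1) no gradient information flows through the event; under (2) the event is treated as if it left the state sensitivity unchanged, which is why (2) is often adequate for contacts but can bias gradients for other event types.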

docs/src/examples/overview.md (+5 −4)

@@ -10,15 +10,16 @@ The examples show how to combine FMUs with machine learning ("NeuralFMU") and il
 ## Examples
 - [__Simple CS-NeuralFMU__](https://thummeto.github.io/FMIFlux.jl/dev/examples/simple_hybrid_CS/): Showing how to train a NeuralFMU in Co-Simulation-Mode.
 - [__Simple ME-NeuralFMU__](https://thummeto.github.io/FMIFlux.jl/dev/examples/simple_hybrid_ME/): Showing how to train a NeuralFMU in Model-Exchange-Mode.
-- [__Growing Horizon ME-NeuralFMU__](https://thummeto.github.io/FMIFlux.jl/dev/examples/growing_horizon_ME/): Growing horizon training technique for a ME-NeuralFMU.

 ## Advanced examples: Demo applications
 - [__JuliaCon 2023: Using NeuralODEs in real life applications__](https://thummeto.github.io/FMIFlux.jl/dev/examples/juliacon_2023/): An example for a NeuralODE in a real world engineering scenario.
 - [__Modelica Conference 2021: NeuralFMUs__](https://thummeto.github.io/FMIFlux.jl/dev/examples/modelica_conference_2021/): Showing basics on how to train a NeuralFMU (Contribution for the *Modelica Conference 2021*).

+## Workshops
+[Pluto](https://plutojl.org/) based notebooks, that can easily be executed on your own Pluto-Setup.
+- [__Scientific Machine Learning using Functional Mock-up Units__](../pluto-src/SciMLUsingFMUs/SciMLUsingFMUs.html): Workshop at JuliaCon 2024 (Eindhoven University, Netherlands)
+
 ## Archived
 - [__MDPI 2022: Physics-enhanced NeuralODEs in real-world applications__](https://thummeto.github.io/FMIFlux.jl/dev/examples/mdpi_2022/): An example for a NeuralODE in a real world modeling scenario (Contribution in *MDPI Electronics 2022*).
-
-## Workshops
-[Pluto](https://plutojl.org/) based notebooks, that can easyly be executed on your own Pluto-Setup.
+- [__Growing Horizon ME-NeuralFMU__](https://thummeto.github.io/FMIFlux.jl/dev/examples/growing_horizon_ME/): Growing horizon training technique for a ME-NeuralFMU.
 - [__HybridModelingUsingFMI__](../pluto-src/HybridModelingUsingFMI/HybridModelingUsingFMI.html): Workshop at MODPROD 2024 (Linköping University, Sweden)

examples/jupyter-src/.gitignore (+2 −1)

@@ -1 +1,2 @@
-params/
+params/
+*.png

examples/jupyter-src/growing_horizon_ME.ipynb (+6 −0)

@@ -8,6 +8,12 @@
     "# ME-NeuralFMUs using Growing Horizon\n",
     "Tutorial by Johannes Stoljar, Tobias Thummerer\n",
     "\n",
+    "----------\n",
+    "\n",
+    "📚📚📚 This tutorial is archieved (so keeping it runnable is low priority) 📚📚📚\n",
+    "\n",
+    "----------\n",
+    "\n",
     "*Last edit: 08.11.2023*\n",
     "\n",
     "## LICENSE\n"
