This is a simple set of code that automatically finds airfoils, optimizing for the lift-to-drag ratio. `blockMesh` turned out to be very painful, so the meshing is handled by curiosityFluids' excellent mesher (blog post).
SciPy's differential evolution is used as the optimization algorithm. It's far slower than other methods, but a global optimizer seems like the better choice here, since I want to explore the full space and see whether there are multiple viable solutions. Other derivative-free algorithms like Nelder-Mead also found reasonable airfoils and were much faster, however.1
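A minimal sketch of how such an optimization loop can be wired up with SciPy. The bounds, iteration counts, and the cheap analytic stand-in objective are all placeholders; the real objective runs a full OpenFOAM case:

```python
import numpy as np
from scipy.optimize import differential_evolution

# Six shape parameters with placeholder bounds (assumption).
bounds = [(0.0, 1.0)] * 6

def objective(x):
    # Stand-in for "run the CFD case and return -lift/drag";
    # SciPy minimizes, so the ratio we want to maximize is negated.
    return float(np.sum((x - 0.3) ** 2))

# workers=-1 would evaluate each generation's population in parallel.
result = differential_evolution(objective, bounds, seed=0,
                                maxiter=50, polish=False)
```

With `polish=False`, no gradient-based refinement is attempted afterwards, which matters here because the real objective is noisy and non-differentiable.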
The simulation is then run. If any issues are encountered with the meshing, `blockMesh`, `simpleFoam`, or convergence, the code returns a penalty value.
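The failure handling can be sketched as a wrapper that maps any mesh, solver, or convergence problem to one flat penalty score. Here `run_case`, the penalty value, and the dummy numbers are all hypothetical stand-ins for the real OpenFOAM calls:

```python
FAILURE_PENALTY = 1e6  # any large positive value works for a minimizer

def run_case(params):
    """Hypothetical stand-in for one full evaluation: the real code runs
    blockMesh + simpleFoam and parses the force coefficients."""
    if min(params) < 0:        # pretend such shapes fail to mesh
        raise RuntimeError("meshing failed")
    return 1.5, 0.5            # dummy (lift, drag)

def evaluate(params):
    """Any failure returns the same flat penalty, so the optimizer
    simply steers away from that region of parameter space."""
    try:
        lift, drag = run_case(params)
    except RuntimeError:
        return FAILURE_PENALTY
    if drag == 0:              # avoid a division blow-up
        return FAILURE_PENALTY
    return -lift / drag        # negated: the optimizer minimizes
```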
The result is a CSV containing airfoil parameters and their performance. These can be further post-processed with ParaView.
This was a fun Christmas holiday project, and a nice foray into coupling non-trivial simulation problems with (surrogate-based) optimization. If anyone has suggestions on how to improve or adjust things further, I am very much open to them! Overall, I'm surprised at how smoothly this project went. The existing repos helped a lot, especially with meshing. I'm still impressed at how effective differential evolution was: with a previous meshing template, it found and exploited flaws in the setup with ease, and I had to adjust the goal function many times because of it. I am also very impressed with how well random forests represented these complex simulations in the surrogate-model part!
This assumes you already have OpenFOAM installed; I am using the OpenFOAM.com/ESI version. Clone the repo, then source the OpenFOAM functions first. For the ESI version, this can be done with

```
source /usr/lib/openfoam/openfoam2406/etc/bashrc
```

This exposes functions like `blockMesh` and `simpleFoam`. Dependencies were kept to a minimum: `pandas`, `numpy`, `matplotlib`, `scipy`, and optionally `sklearn`. Once these are installed, run `python main.py`.
The code is quite minimal, partially on purpose. If you are using OpenFOAM, you are used to editing code files, and it's also quite difficult to create a sufficiently flexible way of working with these things without oversimplifying. The intended way of using the code is to look through everything and try to understand it; everything should be quite straightforward and readable.
The code uses an OpenFOAM template folder, which is repeatedly copied into several directories (depending on the number of workers specified), using UUIDs as names. Relevant parameters are then entered into these templates (such as the adjusted `blockMesh` dictionary and `U`). As soon as a case completes (either because of errors, or because it correctly finished), the folder is deleted. Since OpenFOAM requires its folders in a particular structure, each worker gets its own copy; this makes differential evolution easy to parallelize across cases, and because each individual case runs fast, that seemed like a better choice than decomposing the domain across multiple cores. With the current method, there should be almost no overhead from parallelization.
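The copy, fill, run, delete cycle per worker can be sketched as follows. The parameter-substitution step and the return value are assumptions, and the `commands` argument exists only so the sketch can be exercised without OpenFOAM installed:

```python
import shutil
import subprocess
import uuid
from pathlib import Path

def run_one_case(template, params=None, commands=("blockMesh", "simpleFoam")):
    """Copy the template case into a uniquely named folder, fill in the
    parameters, run the solvers there, and delete the folder afterwards."""
    case = Path(f"run_{uuid.uuid4().hex}")   # UUID avoids worker collisions
    shutil.copytree(template, case)
    try:
        # ...write the adjusted blockMesh dictionary / U files from params...
        for tool in commands:
            subprocess.run([tool], cwd=case, check=True, capture_output=True)
        # ...parse force coefficients from the solver output here...
        return True
    finally:
        shutil.rmtree(case, ignore_errors=True)  # clean up either way
```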
The code can post-process the top-n highest-scoring airfoils so far, simulating each of them. The results can subsequently be rendered with a ParaView Python macro. First, run

```
python main.py --custom
```

This places the top-n runs under `custom_runs`. Follow this with

```
python src/post_processing/post_process.py
```

If all goes well, this places all the results under `results/renders`.
The latter isn't entirely reliable: ParaView includes its own Python distribution, based on Python 3.10. If you encounter errors here, it's best to either look at the code, or open each `.foam` file under `custom_runs` individually.
The repo includes an analysis notebook. This tracks performance over time, and allows for selection of the best-performing airfoils.
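The performance-over-time view boils down to a cumulative maximum over the results CSV. A sketch with made-up numbers (in the real notebook this would start from `pd.read_csv` on the results file, and the `score` column name is a guess):

```python
import pandas as pd

# Stand-in for the results CSV: one row per evaluated airfoil.
df = pd.DataFrame({"score": [3.1, 12.4, 8.0, 25.9, 25.9, 31.2]})

# Best lift-to-drag ratio seen so far, in evaluation order:
df["best_so_far"] = df["score"].cummax()

# The n best-performing airfoils, for closer inspection:
top = df.nlargest(3, "score")
```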
I started off with the default parameters from a case I found. It turns out the value that case used is far from the usual recommendations, so I tested several alternatives for one of the airfoils that previously messed up, and it appears there is almost no difference between the smaller values.
Case | Lift | Drag | Lift-to-drag ratio
---|---|---|---
Original | 0.516667 | -0.000555 | -931.013514
OpenFOAM default | 1.13645 | -0.0742927 | -15.295453
Lower NASA-bound | 1.1369 | -0.0743574 | -15.287780
Far lower value | 1.13665 | -0.0743269 | -15.290234
Given the table above, I will continue to use the default value. Another interesting issue: some airfoils never converged with SIMPLE, instead oscillating between different states.
With that, we get an interesting population. Three of the top four are very different: the best performer is a fairly standard airfoil, albeit a bit thick; the next-best is almost bird-like; and the third has high camber instead. It's surprising to see such variation even after a fairly long run.
I added code to evaluate the lift-drag ratio as a function of the angle-of-attack (AoA). The fact that the airfoil performs best at 5° isn't very surprising; it's optimized for that point. For higher angles, performance rapidly decreases. I considered multi-objective optimization to create an airfoil that performs well over a wider range, but that would be very slow to run.
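One way to implement such a sweep is to keep the airfoil fixed and tilt the inlet velocity instead, then rotate the computed forces back into the flow frame. A small sketch; the inlet speed and the sign conventions are assumptions, not necessarily what the repo does:

```python
import math

U_MAG = 10.0  # inlet speed (assumed value; the real case sets this in U)

def inlet_velocity(aoa_deg, magnitude=U_MAG):
    """Tilt the inlet flow instead of rotating the airfoil:
    velocity components for a given angle of attack."""
    a = math.radians(aoa_deg)
    return magnitude * math.cos(a), magnitude * math.sin(a)

def lift_drag(fx, fy, aoa_deg):
    """Rotate body-frame force components back into the flow frame,
    so lift is perpendicular and drag parallel to the oncoming flow."""
    a = math.radians(aoa_deg)
    lift = -fx * math.sin(a) + fy * math.cos(a)
    drag = fx * math.cos(a) + fy * math.sin(a)
    return lift, drag

# Sweep: each angle would get its own OpenFOAM case with this inlet.
sweep = {aoa: inlet_velocity(aoa) for aoa in range(0, 46, 5)}
```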
One interesting aspect here: I ran this up to 45°, but did not obtain sufficiently converged solutions there. Examining the forces for the highest AoA where `simpleFoam` did not simply crash, we observe an oscillatory solution. I think this is basically the air detaching from the airfoil, resulting in a kind of Kármán vortex street, with SIMPLE unable to converge to a steady-state solution. Examining the case in ParaView, we indeed see oscillations in the flow field.
This behavior previously caused a lot of issues with very small or negative drag values, which blow up the lift-to-drag ratio.
I am curious about potential model reduction: by predicting performance from the six inputs, a lot of time could be saved. If a rough prediction of which airfoils perform best is accurate enough, a simple machine learning model like a random forest could be used for an initial optimization stage. I doubt a simple model like this would be sufficient, but it's an interesting avenue to explore.
After some attempts, it seems surprisingly good. I get MAEs of 1.5–5 with a small training set, even for the best-performing airfoils, where there is correspondingly less data available. This is technically a surrogate-model approach.
After running it a bit longer, it gets better and better; I'm very surprised. We do have a fair amount of data, but this is spread out in six dimensions; the curse of dimensionality should be kicking in here, yet somehow, even with quite sparse data, it's doing well. However, I am not using a randomly sampled set; the data is all from an optimizer, so it's likely to be clustered around certain regions, effectively reducing dimensionality.
I added a grid search with 5-fold cross-validation to optimize a classification and a regression model, then optimized over those with the same differential evolution code. This is a two-step process: we first predict whether we will get any result at all (i.e. no failures from overlapping airfoils, `blockMesh`, `simpleFoam`, or convergence issues), and if the random forest predicts there aren't any, we regress on our input vector to obtain the predicted lift-to-drag ratio.
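The two-step surrogate can be sketched with scikit-learn on synthetic data. The data, the failure rule, and the score formula below are all made up for illustration, and the hyperparameter grid search is omitted:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, RandomForestRegressor

rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 6))   # six shape parameters (synthetic)
ok = X[:, 0] > 0.2               # made-up "simulation succeeded" rule
y = 30 * X[:, 1] + 10            # made-up lift-to-drag score

# Stage 1: will this airfoil produce a result at all?
clf = RandomForestClassifier(random_state=0).fit(X, ok)
# Stage 2: regression, trained only on the successful cases.
reg = RandomForestRegressor(random_state=0).fit(X[ok], y[ok])

def surrogate(x):
    """Predicted score, with predicted failures mapped to a flat penalty."""
    x = np.asarray(x).reshape(1, -1)
    if not clf.predict(x)[0]:
        return -1e6
    return float(reg.predict(x)[0])
```

The surrogate is then cheap enough to hand straight to the same differential evolution loop.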
Oddly, this gets stuck below the best performers we previously found using the regular optimization method. Overall, though, it's very similar in shape and design to the optimal version, and it's close: the best airfoil I have found thus far reached 59.68698, and this one is at 59.31823. And it only took a few minutes to run, compared with 72 hours for the full model.2 After some thought, the reason is obvious: a random forest partitions space into regions with constant values, and cannot extrapolate. A switch to an SVM for regression gives better results.
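Swapping the regression stage to an SVM looks roughly like this, again on synthetic stand-in data; note the scaler, since RBF kernels are sensitive to feature scales:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)
X = rng.uniform(size=(300, 6))   # six shape parameters (synthetic)
y = 30 * X[:, 1] + 10            # made-up smooth score

# Unlike a random forest's piecewise-constant fit, an RBF-kernel SVR
# gives a smooth response surface for the optimizer to climb.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
model.fit(X, y)
```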
Footnotes
1. We can't use derivative-based optimization methods very easily, because if we have a crash in `blockMesh`, we have no measure of how 'badly' things messed up, so steering gradients away from there is difficult. ↩
2. The figure name has 58.57 in there, but I messed up the naming; that was for a previous, worse airfoil. ↩