
Interpolate restart file when it does not match the mesh #1277

Merged (12 commits, May 6, 2021)

Conversation

pcarruscag (Member)

Method described below.
No config options needed.
Takes about a minute to interpolate an 8M node mesh to a 70M mesh on 1000 cores.
May or may not add OpenMP to this, which would improve scalability.
Restarts are read twice for RANS problems, which is not great for this type of feature...

Comment on lines 3130 to 3138
/* Challenges:
* - Do not use too much memory by gathering the restart data in all ranks.
* - Do not repeat too many computations in all ranks.
* Solution?:
* - Build a local ADT for the domain points (not the restart points).
* - Find the closest target point for each donor, which does not match all targets.
* - "Diffuse" the data to neighbor points.
* Complexity is approx. Nlt + (Nlt + Nd) log(Nlt) where Nlt is the LOCAL number
* of target points, Nd the TOTAL number of donors. */
pcarruscag (Member, Author):
method
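The plan laid out in the code comment above (nearest target per donor via an ADT, then "diffusing" values outward) can be sketched roughly as below. This is a simplified, serial, hypothetical illustration — the names `NearestTarget` and `InterpolateRestart` are invented for the sketch, brute-force search stands in for SU2's ADT, and there is no MPI distribution of donors.

```cpp
#include <cstddef>
#include <limits>
#include <vector>

struct Point { double x, y; };

// Brute-force nearest-target search. In the PR this role is played by an ADT
// built over the local target (domain) points; brute force keeps the sketch short.
static std::size_t NearestTarget(const std::vector<Point>& targets, const Point& p) {
  std::size_t best = 0;
  double bestDist = std::numeric_limits<double>::max();
  for (std::size_t i = 0; i < targets.size(); ++i) {
    const double dx = targets[i].x - p.x, dy = targets[i].y - p.y;
    const double d = dx * dx + dy * dy;
    if (d < bestDist) { bestDist = d; best = i; }
  }
  return best;
}

// Step 1: each donor deposits its value on its closest target point
// (averaging when several donors hit the same target). Step 2: targets that
// received nothing take the average of already-set neighbors, sweeping until
// every target is set (the "diffusion" step).
std::vector<double> InterpolateRestart(
    const std::vector<Point>& donors, const std::vector<double>& donorVals,
    const std::vector<Point>& targets,
    const std::vector<std::vector<std::size_t>>& neighbors) {
  std::vector<double> vals(targets.size(), 0.0);
  std::vector<int> hits(targets.size(), 0);
  for (std::size_t d = 0; d < donors.size(); ++d) {
    const std::size_t t = NearestTarget(targets, donors[d]);
    vals[t] += donorVals[d];
    ++hits[t];
  }
  for (std::size_t t = 0; t < targets.size(); ++t)
    if (hits[t] > 0) vals[t] /= hits[t];
  bool changed = true;
  while (changed) {  // diffuse to the targets no donor matched
    changed = false;
    for (std::size_t t = 0; t < targets.size(); ++t) {
      if (hits[t] != 0) continue;
      double sum = 0.0;
      int n = 0;
      for (std::size_t nb : neighbors[t])
        if (hits[nb] != 0) { sum += vals[nb]; ++n; }
      if (n > 0) { vals[t] = sum / n; hits[t] = -1; changed = true; }
    }
  }
  return vals;
}
```

With a kd-tree or ADT replacing the brute-force search, each donor lookup costs O(log Nlt), which is where the quoted (Nlt + Nd) log(Nlt) term comes from.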

@@ -2882,16 +2884,18 @@ void CSolver::Read_SU2_Restart_ASCII(CGeometry *geometry, const CConfig *config,
}
}

if (iPoint_Global != geometry->GetGlobal_nPointDomain())
SU2_MPI::Error("The solution file does not match the mesh.",CURRENT_FUNCTION);
pcarruscag (Member, Author):

does not work for ASCII

Contributor:

Will you support it in this PR, or is this for later?

pcarruscag (Member, Author):

It is a bit messy to do; besides, if someone is restarting from ASCII, the problem is probably not very large.
(Every MPI rank needs to read the entire ASCII file because lines are delimited by the \n character.)
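The point about ASCII can be illustrated with a toy offset calculation (hypothetical functions, not SU2's actual I/O code): with fixed-size binary records a rank computes a byte offset and seeks directly to its slice, while with newline-delimited text the only way to find record k is to scan from the start of the file.

```cpp
#include <cstddef>

// Binary restart: every record has the same byte size, so a rank can seek
// straight to its first point without reading anything else.
std::size_t BinaryOffset(std::size_t headerBytes, std::size_t nFields,
                         std::size_t firstPointOfRank) {
  return headerBytes + firstPointOfRank * nFields * sizeof(double);
}

// ASCII restart: a record ends wherever the next '\n' happens to fall, so
// locating point k means counting newlines from byte 0 -- every rank ends up
// scanning the whole file.
std::size_t AsciiOffset(const char* file, std::size_t len, std::size_t k) {
  std::size_t lines = 0, i = 0;
  while (i < len && lines < k) {
    if (file[i] == '\n') ++lines;
    ++i;
  }
  return i;  // byte offset where line k begins
}
```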

@bigfooted (Contributor)

> Method described below.
> No config options needed.
> Takes about a minute to interpolate an 8M node mesh to a 70M mesh on 1000 cores.
> May or may not add OpenMP to this which would improve scalability.
> Restarts are read twice for RANS problems, which is not great for this type of feature...

This is a great addition. I am currently doing this outside of SU2 with some Python code, so having it native in SU2 is really great! I haven't looked at the details yet, but I guess the assumption is that the geometry stays the same and we only have a finer mesh? Also, is the 'restarts are read twice for RANS' a general issue when having additional solvers (like heat, scalar, NEMO)?
Last thing (for now): Does it work when using an inlet.dat boundary file?

@pcarruscag (Member, Author)

There is no assumption, but if the geometry is very different you'll end up with free-stream values in the boundary layer and vice versa.
The restarts are read once per solver, so, if you have Incomp RANS, with weakly coupled heat, a scalar, and mesh deformation... 5 times :)
I expect it to work with everything (to the extent that a matching restart does). Except maybe for problems that update the grid coordinates based on the contents of the restart.

@bigfooted (Contributor)

> The restarts are read once per solver, so, if you have Incomp RANS, with weakly coupled heat, a scalar, and mesh deformation... 5 times :)
Well, it seems fast enough, so it does not pose an immediate problem. I guess the solution is to move the reading of the data 'higher up': first gather all the solvers and variables, then read the file once and populate the solver variables, something like that?
Anyway, I think the way it is done now looks good.
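The "read once, populate many solvers" idea suggested above might look something like the following hypothetical layout (not SU2's current design; `RestartTable`, `GetField`, and the field names are invented for illustration): one pass loads all restart columns into a shared table, and each solver then pulls only the fields it owns.

```cpp
#include <map>
#include <string>
#include <vector>

// Hypothetical shared restart table: one read of the file fills `columns`,
// mapping each field name to its per-point values.
struct RestartTable {
  std::map<std::string, std::vector<double>> columns;
};

// Each solver asks the table for its fields instead of re-reading the file;
// an unknown field returns an empty vector.
std::vector<double> GetField(const RestartTable& table, const std::string& name) {
  const auto it = table.columns.find(name);
  return it == table.columns.end() ? std::vector<double>{} : it->second;
}
```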

@bigfooted (Contributor) left a review:
looks good to me.

@TobiKattmann (Contributor) left a review:

Thanks 💐 this is surely a neat feature, especially for grid-independence studies.
Do you plan to add/ adapt a Testcase?

Resolved review threads:
Common/src/CConfig.cpp
SU2_CFD/src/solvers/CSolver.cpp (four threads, one outdated)
lgtm-com bot commented May 5, 2021

This pull request fixes 1 alert when merging a366672 into 3cc0208 - view on LGTM.com

fixed alerts:

  • 1 for Comparison of narrow type with wide type in loop condition
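The class of warning LGTM flagged can be illustrated as below (invented functions, not the actual SU2 loop; assumes the typical 16-bit `unsigned short`): when a narrow loop counter is compared against a wider bound, the counter wraps before ever reaching a bound above its maximum, so the loop never terminates on its own.

```cpp
#include <cstddef>

// Narrow counter vs wide bound: for n > 65535 the counter wraps to 0 and
// never satisfies i >= n. The guard exists only so this sketch terminates
// and can expose the wrong count.
unsigned CountNarrow(std::size_t n) {
  unsigned count = 0;
  for (unsigned short i = 0; i < n; ++i) {
    ++count;
    if (count > 70000u) return count;  // safety valve for the sketch
  }
  return count;
}

// Fix: give the counter the same (wide) type as the bound.
unsigned CountWide(std::size_t n) {
  unsigned count = 0;
  for (std::size_t i = 0; i < n; ++i) ++count;
  return count;
}
```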

@su2code deleted three comments from the lgtm-com bot (May 6, 2021)
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% %
% SU2 configuration file %
% Case description: Test interp. restart, and auto time-step for dual-time %
Contributor:
👍

@pcarruscag pcarruscag merged commit 72a11f4 into develop May 6, 2021
@pcarruscag pcarruscag deleted the feature_interpolate_restart branch May 6, 2021 09:03
lgtm-com bot commented May 6, 2021

This pull request fixes 1 alert when merging 06c818a into 3cc0208 - view on LGTM.com

fixed alerts:

  • 1 for Comparison of narrow type with wide type in loop condition

3 participants