# BasicSolver Variants
## Introduction
This folder contains variants of the NuSiF basic solver. The basic solver does
not allow obstacles within the domain.
All basic solver variants include two test cases for validation:

* dcavity - Lid-driven cavity
* canal - Channel flow
## 2D solver variants
### Sequential solver (2D-seq)
This is the basic sequential version. Gnuplot result visualization.
### MPI parallel solver (2D-mpi)
The 2D MPI parallel solver comes in several variants of increasing
sophistication.

The simplest possible MPI parallelization, with domain decomposition in one
direction and communication based on plain send and recv calls.
Gnuplot result visualization.
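
A minimal sketch of what such an exchange can look like, assuming a row-major
array with one ghost row on each side and a decomposition along the second
dimension (the function and variable names are illustrative, not the solver's
actual API):

```c
/* Exchange ghost rows with the upper and lower neighbor using plain
 * MPI_Send/MPI_Recv. Ranks at the domain boundary talk to MPI_PROC_NULL,
 * which turns the corresponding calls into no-ops. Note that plain MPI_Send
 * can deadlock for unfavorable patterns depending on message buffering;
 * MPI_Sendrecv is the usual robust alternative. */
#include <mpi.h>

void exchange(double* p, int imax, int jmaxLocal, int rank, int size)
{
    int top    = (rank + 1 < size) ? rank + 1 : MPI_PROC_NULL;
    int bottom = (rank > 0) ? rank - 1 : MPI_PROC_NULL;
    int cnt    = imax + 2; /* doubles per row, including boundary cells */

    /* send the topmost inner row up, receive the lower ghost row from below */
    MPI_Send(&p[jmaxLocal * cnt], cnt, MPI_DOUBLE, top, 0, MPI_COMM_WORLD);
    MPI_Recv(&p[0], cnt, MPI_DOUBLE, bottom, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);

    /* send the lowest inner row down, receive the upper ghost row from above */
    MPI_Send(&p[cnt], cnt, MPI_DOUBLE, bottom, 1, MPI_COMM_WORLD);
    MPI_Recv(&p[(jmaxLocal + 1) * cnt], cnt, MPI_DOUBLE, top, 1, MPI_COMM_WORLD,
             MPI_STATUS_IGNORE);
}
```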
An MPI parallelization with a two-dimensional domain decomposition using MPI
virtual topologies.
Gnuplot result visualization.
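
A minimal sketch of how such a virtual topology can be set up with
MPI_Dims_create, MPI_Cart_create and MPI_Cart_shift (the function name and the
choice of a non-periodic grid are assumptions):

```c
/* Create a 2D Cartesian communicator and query the four neighbor ranks.
 * At the physical domain boundaries MPI_Cart_shift returns MPI_PROC_NULL. */
#include <mpi.h>

void setupCommunicator(MPI_Comm* comm, int* left, int* right, int* bottom, int* top)
{
    int dims[2]    = { 0, 0 }; /* let MPI factorize the process count */
    int periods[2] = { 0, 0 }; /* non-periodic in both directions */
    int size;

    MPI_Comm_size(MPI_COMM_WORLD, &size);
    MPI_Dims_create(size, 2, dims);
    MPI_Cart_create(MPI_COMM_WORLD, 2, dims, periods, 0, comm);

    /* neighbors along dimension 0 (x) and dimension 1 (y) */
    MPI_Cart_shift(*comm, 0, 1, left, right);
    MPI_Cart_shift(*comm, 1, 1, bottom, top);
}
```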
An MPI parallelization with a two-dimensional domain decomposition using a
neighborhood collective call instead of send and recv calls.
Gnuplot result visualization.
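
A minimal sketch of a halo exchange done with one neighborhood collective on a
Cartesian communicator, using derived datatypes for rows and columns (array
layout, index macro and names are assumptions, not the solver's code); a real
implementation would build the datatypes and displacement tables once during
setup rather than in every call:

```c
#include <mpi.h>

#define IDX(i, j) ((j) * (imax + 2) + (i)) /* row-major with one ghost layer */

void exchange(MPI_Comm comm, double* p, int imax, int jmax)
{
    MPI_Datatype rowType, colType;
    MPI_Type_contiguous(imax, MPI_DOUBLE, &rowType);          /* one inner row    */
    MPI_Type_vector(jmax, 1, imax + 2, MPI_DOUBLE, &colType); /* one inner column */
    MPI_Type_commit(&rowType);
    MPI_Type_commit(&colType);

    /* neighbor order on a Cartesian communicator is {-x, +x, -y, +y},
     * assuming dimension 0 is x and dimension 1 is y */
    int counts[4]         = { 1, 1, 1, 1 };
    MPI_Datatype types[4] = { colType, colType, rowType, rowType };

    /* send the first/last inner column and row, receive into the ghost layers;
     * displacements are byte offsets into p and select disjoint regions */
    MPI_Aint s[4] = { IDX(1, 1), IDX(imax, 1),     IDX(1, 1), IDX(1, jmax)     };
    MPI_Aint r[4] = { IDX(0, 1), IDX(imax + 1, 1), IDX(1, 0), IDX(1, jmax + 1) };
    for (int n = 0; n < 4; n++) {
        s[n] *= sizeof(double);
        r[n] *= sizeof(double);
    }

    MPI_Neighbor_alltoallw(p, counts, s, types, p, counts, r, types, comm);

    MPI_Type_free(&rowType);
    MPI_Type_free(&colType);
}
```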
The final version of the 2D MPI parallel solver. All MPI calls are contained in
a single communication module; the rest of the code does not depend on MPI.
This version also compiles and runs without MPI.
VTK result visualization.
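
One common way to achieve this is to confine all MPI usage to the communication
module and guard it with the preprocessor; a sketch of the idea (the _MPI flag,
type and function names are assumptions, not the solver's actual interface):

```c
#ifdef _MPI
#include <mpi.h>
#endif

typedef struct {
#ifdef _MPI
    MPI_Comm comm;
#endif
    int rank; /* 0 in a build without MPI */
    int size; /* 1 in a build without MPI */
} Comm;

/* The rest of the solver only calls functions like this one; in a build
 * without MPI the exchange degenerates to a no-op. */
void commExchange(Comm* c, double* field, int len)
{
#ifdef _MPI
    /* halo exchange with MPI calls on c->comm, for example the
     * neighborhood collective sketched above */
    (void)c; (void)field; (void)len;
#else
    (void)c; (void)field; (void)len; /* single process: nothing to exchange */
#endif
}
```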
## 3D solver variants
### Sequential solver (3D-seq)
This is the basic sequential version.
VTK result visualization.
### MPI parallel solver (3D-mpi)
An MPI parallel solver with a 3D domain decomposition using MPI virtual
topologies and neighborhood collectives. All MPI calls are contained in a
single communication module; the rest of the code does not depend on MPI.
This version also compiles and runs without MPI.
VTK result visualization.

A second 3D MPI variant is identical to this one but uses MPI-IO for the VTK
result file output.
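
A minimal sketch of the MPI-IO idea, independent of the actual VTK file layout
(function name and offset handling are assumptions): every rank computes its
own offset into one shared file and writes its local block with a collective
call.

```c
#include <mpi.h>

void writeField(MPI_Comm comm, const char* filename, const double* local,
                int localCount, MPI_Offset headerBytes)
{
    MPI_File fh;
    long long myCount = localCount, end = 0;

    /* inclusive prefix sum of the local counts; this rank's block starts
     * (end - myCount) doubles into the global data section */
    MPI_Scan(&myCount, &end, 1, MPI_LONG_LONG, MPI_SUM, comm);
    MPI_Offset offset = headerBytes + (MPI_Offset)(end - myCount) * sizeof(double);

    MPI_File_open(comm, filename, MPI_MODE_CREATE | MPI_MODE_WRONLY,
                  MPI_INFO_NULL, &fh);
    /* collective write: the MPI-IO layer can merge requests from all ranks */
    MPI_File_write_at_all(fh, offset, local, localCount, MPI_DOUBLE,
                          MPI_STATUS_IGNORE);
    MPI_File_close(&fh);
}
```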