
Introduction

This folder contains variants of the NuSiF basic solver. The basic solver does not allow obstacles within the domain.

All basic solver variants include two test cases for validation:

  • dcavity - Lid-driven cavity
  • canal - Channel flow

2D solver variants

Sequential solver (2D-seq)

This is the basic sequential version. Gnuplot result visualization.

Sequential solver with particle tracing (2D-seq-pt)

This version adds particle tracing and streak lines to the sequential basic solver. Gnuplot result visualization.
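
As a rough illustration of how a tracer can work, the sketch below advances one particle with an explicit Euler step using bilinearly interpolated velocities. All names (Particle, Grid, bilinear, advanceParticle) and the cell-centered layout are illustrative assumptions, not the code of this variant.

```c
typedef struct { double x, y; } Particle;

typedef struct {
    int imax, jmax;      /* number of cells                     */
    double dx, dy;       /* mesh spacing                        */
    double *u, *v;       /* velocity components, size imax*jmax */
} Grid;

/* bilinear interpolation of a cell-centered field at (x, y) */
static double bilinear(const double *f, const Grid *g, double x, double y)
{
    int i = (int)(x / g->dx);
    int j = (int)(y / g->dy);
    if (i < 0) i = 0;
    if (j < 0) j = 0;
    if (i > g->imax - 2) i = g->imax - 2;
    if (j > g->jmax - 2) j = g->jmax - 2;

    double tx = x / g->dx - i;
    double ty = y / g->dy - j;
    double f00 = f[j * g->imax + i],       f10 = f[j * g->imax + i + 1];
    double f01 = f[(j + 1) * g->imax + i], f11 = f[(j + 1) * g->imax + i + 1];

    return (1 - tx) * (1 - ty) * f00 + tx * (1 - ty) * f10 +
           (1 - tx) * ty * f01 + tx * ty * f11;
}

/* advance one particle by dt with the current velocity field */
void advanceParticle(Particle *p, const Grid *g, double dt)
{
    p->x += dt * bilinear(g->u, g, p->x, p->y);
    p->y += dt * bilinear(g->v, g, p->x, p->y);
}
```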

Simple MPI parallel solver (2D-mpi-v1)

The simplest possible MPI parallelization: domain decomposition in one direction and communication based on plain send and recv calls. Gnuplot result visualization.
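
A minimal sketch of this kind of ghost-layer exchange, with assumed names and array layout (a local array of size (imaxLocal+2) x (jmaxLocal+2) including ghost rows), not the actual code of this variant. MPI_Sendrecv is used to pair each matching send and recv; neighbors that do not exist are MPI_PROC_NULL, which turns the calls into no-ops.

```c
#include <mpi.h>

/* Exchange one ghost row with the lower and upper neighbor
 * after a 1D (strip-wise) domain decomposition. */
void exchange(double *p, int imaxLocal, int jmaxLocal,
              int rankBottom, int rankTop, MPI_Comm comm)
{
    int stride = imaxLocal + 2;   /* row length including ghost cells */

    /* send top interior row up, receive bottom ghost row from below */
    MPI_Sendrecv(&p[jmaxLocal * stride], stride, MPI_DOUBLE, rankTop, 0,
                 &p[0],                  stride, MPI_DOUBLE, rankBottom, 0,
                 comm, MPI_STATUS_IGNORE);

    /* send bottom interior row down, receive top ghost row from above */
    MPI_Sendrecv(&p[1 * stride],               stride, MPI_DOUBLE, rankBottom, 1,
                 &p[(jmaxLocal + 1) * stride], stride, MPI_DOUBLE, rankTop, 1,
                 comm, MPI_STATUS_IGNORE);
}
```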

MPI parallel solver with 2D domain decomposition (2D-mpi-v2)

An MPI parallelization with two-dimensional domain decomposition using MPI virtual topologies. Gnuplot result visualization.
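
A minimal sketch of the topology setup (function and variable names are assumptions): a 2D process grid is created with MPI_Cart_create and the four neighbor ranks are obtained with MPI_Cart_shift.

```c
#include <mpi.h>

void setupCartesian2D(MPI_Comm *cartComm, int *rankLeft, int *rankRight,
                      int *rankBottom, int *rankTop)
{
    int size;
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int dims[2] = {0, 0};          /* let MPI pick a balanced factorization */
    MPI_Dims_create(size, 2, dims);

    int periods[2] = {0, 0};       /* no periodic boundaries */
    MPI_Cart_create(MPI_COMM_WORLD, 2, dims, periods, 1, cartComm);

    /* neighbors along each dimension; missing ones become MPI_PROC_NULL */
    MPI_Cart_shift(*cartComm, 0, 1, rankLeft, rankRight);
    MPI_Cart_shift(*cartComm, 1, 1, rankBottom, rankTop);
}
```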

MPI parallel solver using MPI-3 neighborhood collectives (2D-mpi-v3)

An MPI parallelization with two-dimensional domain decomposition using MPI-3 neighborhood collective calls instead of individual send and recv calls. Gnuplot result visualization.
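
A minimal sketch of the idea (buffer handling and names are assumptions): on a Cartesian communicator, one MPI_Neighbor_alltoallv call exchanges data with all four neighbors at once. Boundary values are packed into sendbuf before the call and ghost values unpacked from recvbuf afterwards; those loops are omitted here. The neighbor order is fixed by the standard: for each dimension, first the negative then the positive direction.

```c
#include <mpi.h>

void exchangeNeighborhood(double *sendbuf, double *recvbuf,
                          int imaxLocal, int jmaxLocal, MPI_Comm cartComm)
{
    /* counts per neighbor: columns carry jmaxLocal values, rows imaxLocal */
    int counts[4] = { jmaxLocal, jmaxLocal, imaxLocal, imaxLocal };
    int displs[4] = { 0,
                      jmaxLocal,
                      2 * jmaxLocal,
                      2 * jmaxLocal + imaxLocal };

    MPI_Neighbor_alltoallv(sendbuf, counts, displs, MPI_DOUBLE,
                           recvbuf, counts, displs, MPI_DOUBLE, cartComm);
}
```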

Refactored MPI parallel solver (2D-mpi)

The final version of the 2D MPI parallel solver. All MPI calls are contained in a single communication module; the rest of the code does not depend on MPI. This version can also be compiled and run without MPI. VTK result visualization.
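
A minimal sketch of how such a communication module can hide MPI behind one interface (the _MPI macro, the Comm type, and the function names are assumptions): with MPI disabled at compile time, the same functions reduce to a single-rank build.

```c
#ifdef _MPI
#include <mpi.h>
#endif

typedef struct {
    int rank;
    int size;
#ifdef _MPI
    MPI_Comm comm;
#endif
} Comm;

void commInit(Comm *c, int argc, char **argv)
{
#ifdef _MPI
    MPI_Init(&argc, &argv);
    c->comm = MPI_COMM_WORLD;
    MPI_Comm_rank(c->comm, &c->rank);
    MPI_Comm_size(c->comm, &c->size);
#else
    (void)argc; (void)argv;
    c->rank = 0;              /* serial build: exactly one "rank" */
    c->size = 1;
#endif
}

void commFinalize(Comm *c)
{
#ifdef _MPI
    MPI_Finalize();
#endif
    (void)c;
}
```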

3D solver variants

Sequential solver (3D-seq)

This is the basic sequential version. VTK result visualization.

MPI parallel solver (3D-mpi)

An MPI parallel solver with 3D domain decomposition using MPI virtual topologies and neighborhood collectives. All MPI calls are contained in a single communication module; the rest of the code does not depend on MPI. This version can also be compiled and run without MPI. VTK result visualization.
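
A minimal sketch of the 3D topology setup (names are assumptions): the same virtual-topology approach as in 2D, now with a 3D process grid and six face neighbors.

```c
#include <mpi.h>

void setupCartesian3D(MPI_Comm *cartComm, int rankPrev[3], int rankNext[3])
{
    int size;
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int dims[3] = {0, 0, 0};       /* let MPI factorize the process count */
    MPI_Dims_create(size, 3, dims);

    int periods[3] = {0, 0, 0};    /* no periodic boundaries */
    MPI_Cart_create(MPI_COMM_WORLD, 3, dims, periods, 1, cartComm);

    /* six face neighbors, one pair per dimension */
    for (int d = 0; d < 3; d++)
        MPI_Cart_shift(*cartComm, d, 1, &rankPrev[d], &rankNext[d]);
}
```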

MPI parallel solver with MPI-IO (3D-mpi-io)

Identical to the 3D-mpi variant, but using MPI-IO for VTK result file output.
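
A minimal sketch of the collective file output (illustrative only): each rank writes its block of a field into a shared file at a computed offset. A real VTK writer additionally emits the header and uses the proper VTK binary layout; only the per-rank collective data write is shown here.

```c
#include <mpi.h>

void writeFieldMPIIO(const char *filename, const double *localData,
                     int localCount, MPI_Offset headerBytes, MPI_Comm comm)
{
    int rank;
    MPI_Comm_rank(comm, &rank);

    /* number of values written by all preceding ranks */
    long long myCount = localCount, before = 0;
    MPI_Exscan(&myCount, &before, 1, MPI_LONG_LONG, MPI_SUM, comm);
    if (rank == 0) before = 0;    /* MPI_Exscan leaves rank 0 undefined */

    MPI_File fh;
    MPI_File_open(comm, filename, MPI_MODE_CREATE | MPI_MODE_WRONLY,
                  MPI_INFO_NULL, &fh);

    /* collective write at this rank's offset behind the header */
    MPI_Offset offset = headerBytes + (MPI_Offset)before * sizeof(double);
    MPI_File_write_at_all(fh, offset, localData, localCount, MPI_DOUBLE,
                          MPI_STATUS_IGNORE);
    MPI_File_close(&fh);
}
```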