NuSiF-Solver/BasicSolver

Introduction

This folder contains variants of the NuSiF basic solver. The basic solver does not allow obstacles within the domain.

All basic solver variants include two test cases for validation:

  • dcavity - Lid-driven cavity
  • canal - Channel flow

2D solver variants

Sequential solver (2D-seq)

This is the basic sequential version.

Sequential solver with particle tracing (2D-seq-pt)

This version adds particle tracing and streak lines to the sequential basic solver.

Simple MPI parallel solver (2D-mpi-v1)

The simplest possible MPI parallelization: the domain is decomposed in one direction only, and the ghost layer exchange is based on plain MPI send and receive calls.
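
A minimal sketch of such a 1D ghost layer exchange is shown below. The function name, the row-major grid layout with one ghost row at each end of the split direction, and the use of MPI_Sendrecv to keep the simple send/receive pattern deadlock-free are assumptions for illustration, not taken from the actual source.

```c
// Illustrative sketch, not the actual NuSiF code: the grid is split
// row-wise across the ranks, and each rank swaps one ghost row with its
// two neighbors using plain point-to-point calls.
#include <mpi.h>

static void exchangeGhostRows(double* grid, int localRows, int cols)
{
    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    // neighbors along the split direction; boundary ranks talk to MPI_PROC_NULL
    int prev = (rank > 0) ? rank - 1 : MPI_PROC_NULL;
    int next = (rank < size - 1) ? rank + 1 : MPI_PROC_NULL;

    // first interior row (row 1) goes to prev,
    // ghost row at the far end (row localRows-1) is filled from next
    MPI_Sendrecv(&grid[1 * cols], cols, MPI_DOUBLE, prev, 0,
                 &grid[(localRows - 1) * cols], cols, MPI_DOUBLE, next, 0,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);

    // last interior row (row localRows-2) goes to next,
    // ghost row 0 is filled from prev
    MPI_Sendrecv(&grid[(localRows - 2) * cols], cols, MPI_DOUBLE, next, 1,
                 &grid[0], cols, MPI_DOUBLE, prev, 1,
                 MPI_COMM_WORLD, MPI_STATUS_IGNORE);
}
```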

MPI parallel solver with 2D domain decomposition (2D-mpi-v2)

An MPI parallelization with a two-dimensional domain decomposition using MPI virtual topologies.
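
The sketch below shows how such a process grid can be set up with MPI virtual topologies; function and variable names are illustrative and may differ from the actual solver. MPI_Dims_create factorizes the rank count, MPI_Cart_create builds the Cartesian communicator, and MPI_Cart_shift yields the four neighbor ranks.

```c
// Illustrative sketch of a 2D Cartesian process grid setup.
#include <mpi.h>

void setupCartesianGrid(MPI_Comm* cartComm,
                        int* left, int* right, int* bottom, int* top)
{
    int size;
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int dims[2]    = { 0, 0 };   // let MPI choose a balanced factorization
    int periods[2] = { 0, 0 };   // non-periodic domain boundaries
    MPI_Dims_create(size, 2, dims);
    MPI_Cart_create(MPI_COMM_WORLD, 2, dims, periods, 0, cartComm);

    // neighbor ranks in the column direction (dim 1) and row direction (dim 0)
    MPI_Cart_shift(*cartComm, 1, 1, left, right);
    MPI_Cart_shift(*cartComm, 0, 1, bottom, top);
}
```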

MPI parallel solver using MPI-3 neighborhood collectives (2D-mpi-v3)

An MPI parallelization with a two-dimensional domain decomposition that uses MPI-3 neighborhood collective calls instead of individual send and receive calls.
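
The following sketch illustrates the idea, assuming a 2D Cartesian communicator and a row-major array with one ghost layer on each side; names, layout, and the assignment of dimension 0 to rows are assumptions for illustration, not the solver's actual code. A single MPI_Neighbor_alltoallw call with derived datatypes replaces the four send/receive pairs.

```c
// Illustrative halo exchange with an MPI-3 neighborhood collective.
// p is rows x cols (incl. one ghost layer on each side), row-major;
// dimension 0 of the Cartesian grid is assumed to run over rows.
#include <mpi.h>
#include <stddef.h>

void exchangeHalo(MPI_Comm cartComm, double* p, int rows, int cols)
{
    MPI_Datatype rowType, colType;
    MPI_Type_contiguous(cols - 2, MPI_DOUBLE, &rowType);
    MPI_Type_vector(rows - 2, 1, cols, MPI_DOUBLE, &colType);
    MPI_Type_commit(&rowType);
    MPI_Type_commit(&colType);

    // Neighbor order on a 2D Cartesian communicator:
    // dim 0 minus, dim 0 plus, dim 1 minus, dim 1 plus
    int counts[4] = { 1, 1, 1, 1 };
    MPI_Datatype types[4] = { rowType, rowType, colType, colType };

    // byte offsets of the interior data sent to each neighbor
    MPI_Aint sdispls[4] = {
        (MPI_Aint)((1 * cols + 1) * sizeof(double)),           // first interior row
        (MPI_Aint)(((rows - 2) * cols + 1) * sizeof(double)),  // last interior row
        (MPI_Aint)((1 * cols + 1) * sizeof(double)),           // first interior column
        (MPI_Aint)((1 * cols + (cols - 2)) * sizeof(double))   // last interior column
    };
    // byte offsets of the ghost cells received from each neighbor
    MPI_Aint rdispls[4] = {
        (MPI_Aint)((0 * cols + 1) * sizeof(double)),           // lower ghost row
        (MPI_Aint)(((rows - 1) * cols + 1) * sizeof(double)),  // upper ghost row
        (MPI_Aint)((1 * cols + 0) * sizeof(double)),           // left ghost column
        (MPI_Aint)((1 * cols + (cols - 1)) * sizeof(double))   // right ghost column
    };

    MPI_Neighbor_alltoallw(p, counts, sdispls, types,
                           p, counts, rdispls, types, cartComm);

    MPI_Type_free(&rowType);
    MPI_Type_free(&colType);
}
```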

Refactored MPI parallel solver (2D-mpi)

The final version of the 2D MPI parallel solver. All MPI calls are contained in a single communication module; the rest of the code does not depend on MPI. This version also compiles and runs without MPI.
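
A minimal sketch of how such a communication module can hide MPI behind a small interface is shown below; the struct and function names and the _MPI guard macro are assumptions for illustration and may differ from the actual module. Without the MPI build flag, the functions degenerate to a single-rank setup.

```c
// Illustrative sketch of an MPI-optional communication module.
#ifdef _MPI
#include <mpi.h>
#endif

typedef struct {
    int rank;   // this process
    int size;   // number of processes
#ifdef _MPI
    MPI_Comm comm;  // communicator used by the solver
#endif
} Comm;

void commInit(Comm* c, int* argc, char*** argv)
{
#ifdef _MPI
    MPI_Init(argc, argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &c->rank);
    MPI_Comm_size(MPI_COMM_WORLD, &c->size);
    c->comm = MPI_COMM_WORLD;
#else
    (void)argc; (void)argv;
    c->rank = 0;   // serial build: exactly one "rank"
    c->size = 1;
#endif
}

void commFinalize(Comm* c)
{
    (void)c;
#ifdef _MPI
    MPI_Finalize();
#endif
}
```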

3D solver variants

Sequential solver (3D-seq)

This is the basic sequential version.

MPI parallel solver (3D-mpi)

An MPI parallel solver with a 3D domain decomposition using MPI virtual topologies and neighborhood collectives. All MPI calls are contained in a single communication module; the rest of the code does not depend on MPI. This version also compiles and runs without MPI.
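
The sketch below shows the corresponding 3D setup under the same illustrative assumptions as above: a 3D Cartesian communicator provides the six face neighbors in the order the MPI-3 neighborhood collectives expect, so a single neighborhood collective call can exchange all six ghost faces.

```c
// Illustrative sketch of a 3D Cartesian process grid setup.
#include <mpi.h>

void setupCartesian3D(MPI_Comm* cartComm, int neighbors[6])
{
    int size;
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    int dims[3]    = { 0, 0, 0 };   // let MPI factorize the rank count
    int periods[3] = { 0, 0, 0 };   // non-periodic domain boundaries
    MPI_Dims_create(size, 3, dims);
    MPI_Cart_create(MPI_COMM_WORLD, 3, dims, periods, 0, cartComm);

    // neighbor order used by neighborhood collectives:
    // for every dimension first the negative, then the positive direction
    for (int d = 0; d < 3; d++) {
        MPI_Cart_shift(*cartComm, d, 1,
                       &neighbors[2 * d], &neighbors[2 * d + 1]);
    }
}
```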