Simulation of sedimentation problems on a massively parallel machine

Thursday 10 May 2012 at 2 pm, Petit Amphithéâtre de Mathématiques, UFR Math-Info, 7 rue Descartes, Université de Strasbourg.
Seminar organised by Jan Dusek, head of the Mécanique et Environnement team of the Institut de Mécanique des Fluides et des Solides (IMFS), Université de Strasbourg.

Inauguration of the Campus Numérique des Systèmes Complexes

with Paul Bourgine, Honorary Director of the Réseau National des Systèmes Complexes.

Seminar

  • Prof. Markus Uhlmann (http://www-turbul.ifh.uni-karlsruhe.de/uhlmann/home/report_1.html), Massively parallel simulation of sedimentation problems involving many spherical particles and wake effects
    Institut für Hydromechanik, Karlsruhe Institute of Technology

Click on this link (http://audiovideocours.u-strasbg.fr/avc/courseaccess?id=7837&type=flash) to follow Prof. Uhlmann's presentation at the Strasbourg CSDC (please skip the first 3 minutes 10 seconds to get directly to the beginning of the presentation).

The interaction between turbulent flow and suspended solid particles is of relevance in a considerable number of technical applications (e.g. civil and chemical engineering, combustion) as well as natural processes (meteorology, blood flow, ...). Reliable flow data is, however, still scarce due in part to measurement difficulties in these multi-phase flow systems.
Traditionally, suspensions involving large numbers of particles have been described computationally (at best) by a point-particle approximation. However, when the size of the particles is comparable to or larger than the smallest flow scales, this ansatz loses its validity. The same is true when the Reynolds number of the flow around individual particles is not negligibly small. We are performing numerical simulations for finite particle sizes, where the computational particles are larger than the grid spacing and their interface is resolved.
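As a small illustration of the validity argument above (a sketch with assumed numbers, not material from the talk), the two conditions can be written down directly: the point-particle ansatz becomes questionable when the particle diameter is comparable to the smallest flow scale, or when the particle Reynolds number Re_p = |u_rel| d_p / nu is no longer small.

/* Illustrative check of the two conditions mentioned above; the O(1)
   thresholds and the example numbers are assumptions, not from the talk. */
#include <math.h>
#include <stdio.h>

static int point_particle_questionable(double d_p,   /* particle diameter      */
                                       double eta,   /* smallest flow scale    */
                                       double u_rel, /* particle slip velocity */
                                       double nu)    /* kinematic viscosity    */
{
    double re_p = fabs(u_rel) * d_p / nu;  /* particle Reynolds number */
    return d_p >= eta || re_p >= 1.0;      /* assumed O(1) thresholds  */
}

int main(void)
{
    /* assumed example: a 0.5 mm sphere slipping at 5 cm/s in water */
    printf("%d\n", point_particle_questionable(5e-4, 1e-4, 0.05, 1e-6));
    return 0;
}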
For this purpose we resort to an immersed-boundary technique which allows for an efficient representation of submerged solid bodies in arbitrary motion across a fixed computational mesh. Parallelism is achieved through 3D Cartesian domain decomposition, with mostly nearest-neighbor communication of 'ghost-cell' data and dedicated protocols for particle-related exchange.
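As a rough illustration of the parallelisation strategy just described, the sketch below sets up a 3D Cartesian process grid with MPI and exchanges one ghost-cell face per direction. It is an assumption of what such a scheme can look like, not Prof. Uhlmann's code; the local grid size NX, the number of ghost layers NG and the single scalar field are illustrative choices.

/* Minimal sketch (assumed layout, not the speaker's code): 3D Cartesian
   domain decomposition with nearest-neighbour ghost-cell exchange via MPI. */
#include <mpi.h>
#include <stdlib.h>

#define NX 64   /* local interior cells per direction (assumed) */
#define NG 2    /* ghost layers per side (assumed)              */

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int nprocs, dims[3] = {0, 0, 0}, periods[3] = {1, 1, 1};
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);
    MPI_Dims_create(nprocs, 3, dims);            /* split the ranks into a 3D grid */

    MPI_Comm cart;
    MPI_Cart_create(MPI_COMM_WORLD, 3, dims, periods, 1, &cart);

    /* one scalar field stored with ghost layers on every side */
    const int n = NX + 2 * NG;
    double *u = calloc((size_t)n * n * n, sizeof *u);

    for (int dir = 0; dir < 3; ++dir) {
        int lo, hi;                              /* neighbour ranks in this direction */
        MPI_Cart_shift(cart, dir, 1, &lo, &hi);

        /* A real code packs the outgoing face of u into a contiguous buffer
           (or uses derived MPI datatypes); edge/corner ghosts are ignored here. */
        const int count = NX * NX * NG;
        double *sendbuf = calloc((size_t)count, sizeof *sendbuf);
        double *recvbuf = calloc((size_t)count, sizeof *recvbuf);

        /* send the upper face to the 'hi' neighbour, receive the lower ghost
           layer from the 'lo' neighbour; the opposite sweep is symmetric */
        MPI_Sendrecv(sendbuf, count, MPI_DOUBLE, hi, 0,
                     recvbuf, count, MPI_DOUBLE, lo, 0,
                     cart, MPI_STATUS_IGNORE);

        /* ... unpack recvbuf into the ghost layer of u ... */

        free(sendbuf);
        free(recvbuf);
    }

    free(u);
    MPI_Finalize();
    return 0;
}

The point of such a decomposition is that each rank only ever talks to its six face neighbours, so the communication volume per rank stays roughly constant as the problem size and the core count grow together.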
In this talk the need for rigorous validation and benchmarking will be stressed, and the computational requirements for high-fidelity simulations will be discussed. Targeting systems with O(10^5) particles at dilute solid volume fractions, while allowing for sufficient small-scale resolution as well as large-scale sampling (box size), requires meshes with O(10^10) points. Typical runs are performed on O(10^4) cores of the IBM BlueGene system at the Jülich Supercomputing Center, where reasonable weak scaling has been demonstrated in tests on up to O(10^5) processor cores.
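As a back-of-the-envelope reading of these figures (a sketch; only the total point count and core count come from the abstract, the per-point memory is my own assumption), distributing O(10^10) grid points over O(10^4) cores leaves roughly 10^6 points per core:

/* Per-core load implied by the quoted figures. */
#include <stdio.h>

int main(void)
{
    const double points   = 1e10;    /* total grid points (from the abstract)      */
    const double cores    = 1e4;     /* processor cores   (from the abstract)      */
    const double bytes_pt = 10 * 8;  /* assume ~10 double-precision values / point */

    double points_per_core = points / cores;                       /* about 1e6 */
    double mib_per_core    = points_per_core * bytes_pt / (1024.0 * 1024.0);

    printf("points per core: %.1e, approx. field memory per core: %.0f MiB\n",
           points_per_core, mib_per_core);
    return 0;
}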