
TOPIC: MPI problem in parallelization of Lagrangian transport module

MPI problem in parallelization of Lagrangian transport module 7 years 9 months ago #25084

  • Phelype
  • Senior Boarder
  • Posts: 140
  • Thank you received: 64
Hello,
I wrote a very simple Lagrangian transport module directly within the subroutine telemac3d.f (I intend to move it to a separate subroutine later).

The code works perfectly in scalar (serial) mode.

To make the code work in parallel I used the function p_dsum.f (from the parallel folder), which calls the MPI_ALLREDUCE function.

The reason I am using this function is that if the Lagrangian particle is within a sub-domain, its properties are calculated there; otherwise, the processor core responsible for that sub-domain sets the properties to zero. Thus, when the series of zeros from the other cores is summed with the actual value computed by the owning core, every core should, in principle, end up with the actual value of said property.
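Schematically, the pattern looks like this (a simplified sketch, not my actual code; OWNS_PARTICLE and LOCAL_PROPERTY are hypothetical placeholders, and P_DSUM returns the global sum of its double precision argument over all cores):

    ! Each core contributes the computed property if the particle
    ! lies in its sub-domain, and zero otherwise; P_DSUM (a wrapper
    ! around MPI_ALLREDUCE with MPI_SUM) then gives every core the
    ! actual value.
    DOUBLE PRECISION :: PROP, LOCAL_PROPERTY
    LOGICAL :: OWNS_PARTICLE
    IF (OWNS_PARTICLE) THEN
      PROP = LOCAL_PROPERTY   ! computed in this sub-domain
    ELSE
      PROP = 0.D0             ! particle belongs to another core
    ENDIF
    ! Every core must reach this call, unconditionally
    PROP = P_DSUM(PROP)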

My problem is: somewhere in the code, when the program calls p_dsum, it neither advances nor stops, as if it had entered an infinite loop in MPI_ALLREDUCE (I know it is in this function because I added a few PRINT* statements to trace where the execution is).
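For illustration, here is a minimal standalone program (hypothetical, not my module) that shows the same symptom. MPI_ALLREDUCE is a collective operation, so if one core enters it a different number of times than the others (for example, because a loop or an IF depends on local data), the remaining cores block in it forever:

    PROGRAM REDUCE_HANG
      USE MPI
      IMPLICIT NONE
      INTEGER :: IERR, RANK, I, NCALLS
      DOUBLE PRECISION :: LOCVAL, GLOVAL
      CALL MPI_INIT(IERR)
      CALL MPI_COMM_RANK(MPI_COMM_WORLD, RANK, IERR)
      ! A collective must be entered by every rank the same number
      ! of times; here the count depends on the rank, as it could
      ! in a particle loop whose bounds depend on local data.
      NCALLS = 1
      IF (RANK == 0) NCALLS = 2
      DO I = 1, NCALLS
        LOCVAL = DBLE(RANK)
        CALL MPI_ALLREDUCE(LOCVAL, GLOVAL, 1, MPI_DOUBLE_PRECISION, &
                           MPI_SUM, MPI_COMM_WORLD, IERR)
      END DO   ! rank 0 never returns from its second call
      CALL MPI_FINALIZE(IERR)
    END PROGRAM REDUCE_HANG

Compiled with mpif90 and run on two or more cores, rank 0 hangs in its second MPI_ALLREDUCE, which looks exactly like what I observe in my module.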

My questions are:
1 - Why is this happening?
2 - Is it correct to do what I am doing? If not, what is the correct way to parallelize my code?

I am attaching a working example of my code.

(In the .zip file there are two .m files, fheadr.m and fstepr.m. They work exactly like the telheadr.m and telstepr.m functions, but are used to read the .f3d file that contains the results of the Lagrangian transport module.)

If I have missed any information that may help solve my problem, please ask.

Regards

Phelype

MPI problem in parallelization of Lagrangian transport module 7 years 9 months ago #25087

  • c.coulet
  • Moderator
  • Posts: 3722
  • Thank you received: 1031
Hi

Why are you adding something which already exists in TELEMAC?

As far as I know, such functions are available in both 2D and 3D.

regards
Christophe

MPI problem in parallelization of Lagrangian transport module 7 years 9 months ago #25090

  • Phelype
  • Senior Boarder
  • Posts: 140
  • Thank you received: 64
Hi,

I didn't know there were such functions.

What are the subroutines that handle this implementation?

Are they available in v6p3?

Regards

MPI problem in parallelization of Lagrangian transport module 7 years 9 months ago #25091

  • c.coulet
  • Moderator
  • Posts: 3722
  • Thank you received: 1031
Hi

First of all, you should update to the latest version.
Assistance is only given for the latest version, which is now v7p2.

Look for the Drogues and/or Oil Spill/Algae modules in the documentation.
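For example, in TELEMAC-2D the drogues are driven from the steering file with keywords like the following (a sketch only; check the exact keyword names in the dictionary of your release, and note that the initial positions are defined in the user subroutine flot.f):

    / excerpt of a TELEMAC-2D steering file (keyword names to be
    / checked against the dictionary of your release)
    NUMBER OF DROGUES            = 10
    PRINTOUT PERIOD FOR DROGUES  = 100
    DROGUES FILE                 = 'drogues.dat'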

Regards
Christophe

MPI problem in parallelization of Lagrangian transport module 7 years 9 months ago #25092

  • Phelype
  • Senior Boarder
  • Posts: 140
  • Thank you received: 64
Hi,

I am still using the old version because of some implementations we already have and, mainly, because I have not yet managed to install v7p2 on Ubuntu. But this will change soon.

I will look for those modules.

Thank you very much.

Regards
