
TOPIC: MPI mode issues at the border of each two segments

MPI mode issues at the border of each two segments 3 years 9 months ago #37790

  • amanj2013
  • Expert Boarder
  • Posts: 211
  • Thank you received: 24
Hello,

I am facing an issue when using TELEMAC-3D in MPI mode. I have a flume test with flow overtopping a weir: on the upstream side, where the flow is subcritical, everything is fine; however, on the downstream side, where the flow accelerates and becomes supercritical, I notice wiggles at the border between two partitioned segments. The attached image shows the free surface and the wiggles at each segment edge.


Any idea to resolve this issue greatly appreciated!

Amanj
Attachments:

MPI mode issues at the border of each two segments 3 years 9 months ago #37801

  • pham
  • Administrator
  • Posts: 1559
  • Thank you received: 602
Hello Amanj,

Which release of TELEMAC-3D do you use?
Do you get the same behaviour when you change the number of processors, and therefore the boundaries between subdomains?
I assume you have tried in serial mode and do not see this kind of wiggle, is that right?
On the figure you sent there are also 2 other wiggles on the right; are they also located between 2 partitioned subdomains?

Can you upload the steering file? Did you add any FORTRAN FILE?
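
For reference, the number of MPI processes is normally set either with the PARALLEL PROCESSORS keyword of the steering file or on the command line of the Python launchers; the lines below are only an illustration (the case file name is a placeholder):

    / steering file excerpt (cas file); 4 is just an example value
    PARALLEL PROCESSORS : 4

    # equivalent on the command line, assuming the v8 Python scripts
    telemac3d.py t3d_flume.cas --ncsize=4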

Chi-Tuan

MPI mode issues at the border of each two segments 3 years 9 months ago #37804

  • amanj2013
  • Expert Boarder
  • Posts: 211
  • Thank you received: 24
Hello Chi-Tuan,

Thank you for your response to my post. The version is V8P1R1. I first tried with 40 processors and am now trying with 20 instead, and it seems the wiggles have gone away. The model is coupled with Sisyphe, and the wiggles appeared on the steep slope only. The FORTRAN file is just for defining the rigid (non-erodable) bed.
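
For context, the coupling and the user FORTRAN file are declared in the TELEMAC-3D steering file along these lines (the file names below are placeholders, not the ones actually used here):

    COUPLING WITH         : 'SISYPHE'
    SISYPHE STEERING FILE : 'sis_flume.cas'
    FORTRAN FILE          : 'user_fortran'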

Thank you,
Amanj
Attachments:

MPI mode issues at the border of each two segments 3 years 9 months ago #37806

  • pham
  • Administrator
  • Posts: 1559
  • Thank you received: 602
How many elements are there in your mesh?
You should not have too few elements in each subdomain, otherwise you may have issues. I would advise against having 2,000 elements or fewer per subdomain after partitioning.

Chi-Tuan

MPI mode issues at the border of each two segments 3 years 9 months ago #37807

  • amanj2013
  • Expert Boarder
  • Posts: 211
  • Thank you received: 24
The total number of elements is 16,864, so with 40 cores each subdomain has approximately 420 elements.
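
A quick check of those numbers against the guideline above (a minimal sketch in Python; the 2,000-element threshold is the one suggested in the previous reply):

    # elements per subdomain for different processor counts, and the largest
    # number of subdomains allowed by the ~2,000 elements-per-subdomain rule of thumb
    n_elements = 16864
    for n_procs in (40, 20, 8):
        print(n_procs, "procs ->", n_elements // n_procs, "elements per subdomain")
    print("max subdomains by the rule of thumb:", n_elements // 2000)  # -> 8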

Is there any logic behind this?

amanj

MPI mode issues at the border of each two segments 3 years 9 months ago #37808

  • pham
  • Administrator
  • Posts: 1559
  • Thank you received: 602
There is a balance to find between the computation time on each subdomain and the communication time between subdomains. If the number of elements per subdomain is too low, you will spend more time communicating than computing, and it may also lead to errors.

Moreover, I am not sure that, for your computation, using 40 or 20 processors leads to faster run times than using fewer than 10 (I would first try with 1, 2 and 4 processors).
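
If it helps, here is a minimal sketch (in Python) of how such runs could be compared once their wall-clock times are measured; the times below are placeholders, not results from this case:

    # strong-scaling check: speedup and parallel efficiency relative to the serial run
    times = {1: 3600.0, 2: 1900.0, 4: 1050.0, 8: 700.0}  # seconds, placeholder values
    t_serial = times[1]
    for n_procs in sorted(times):
        speedup = t_serial / times[n_procs]
        efficiency = speedup / n_procs
        print(f"{n_procs:2d} procs: speedup {speedup:4.1f}, efficiency {efficiency:.0%}")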

Chi-Tuan

MPI mode issues at the border of each two segments 3 years 9 months ago #37809

  • amanj2013
  • Expert Boarder
  • Posts: 211
  • Thank you received: 24
I agree, I should try with a smaller number of cores first and then gradually increase it.

Thank you Chi-Tuan,

Amanj