
TOPIC: parallelism problem BC free surface

parallelism problem BC free surface 2 years 5 months ago #40541

  • JuliAlzate
Hello everyone,

I am using TELEMAC-3D to simulate thermal stratification in a large tropical reservoir, and I want to impose a time-varying boundary condition at the free surface.

First I used limi3d.f to impose a constant temperature at the free surface, and it works:

!     FREE SURFACE
!     ============
!
!     DEFAULT: NEUMANN BC'S
!
!     PART MODIFIED BY JULY
      DO IPOIN2 = 1,NPOIN2
        LITABS%ADR(ITRAC)%P%I(IPOIN2) = KENT
        TABORS%ADR(ITRAC)%P%R(IPOIN2) = 30.86D0
      ENDDO

Then I tried to impose a time-varying temperature, T(t) = 30.86 + 2*sin(2*pi*AT/600), through USER_TR3.f:

      DO IPOIN2 = 1,NPOIN2
        LITABS%ADR(ITRAC)%P%I(IPOIN2) = KENT ! LGC : prescribed value
        TABORS%ADR(ITRAC)%P%R(IPOIN2) = 30.86D0
     &    + 2.0D0*SIN(2.0D0*PI*AT/600.0D0)
      ENDDO
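For completeness, here is a minimal, self-contained sketch of how I understand that surface loop. I am assuming PI is not already provided by the declaration modules, so it is declared locally; all other names (NPOIN2, ITRAC, KENT, AT, LITABS, TABORS) are the usual TELEMAC-3D variables used in the snippets above:

!     Minimal sketch of the free-surface tracer BC (assumptions: PI is
!     declared locally; the other variables come from the TELEMAC-3D
!     declaration modules as in the snippet above)
      DOUBLE PRECISION, PARAMETER :: PI = 3.141592653589793D0
      INTEGER IPOIN2
!
      DO IPOIN2 = 1,NPOIN2
!       Prescribed (Dirichlet) value for the tracer at the free surface
        LITABS%ADR(ITRAC)%P%I(IPOIN2) = KENT
!       Temperature oscillating around 30.86 deg C, amplitude 2 deg C,
!       period 600 s, AT being the current simulation time in seconds
        TABORS%ADR(ITRAC)%P%R(IPOIN2) = 30.86D0
     &    + 2.0D0*SIN(2.0D0*PI*AT/600.0D0)
      ENDDO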

The simulation runs with 2 processors, but it crashes with more than 2 processors, with the following error message:
================================================================================
ITERATION 0 TIME 0 D 0 H 0 MN 0.0000 S ( 0.0000 S)
================================================================================
MASS BALANCE
INITIAL MASS OF WATER IN THE DOMAIN : 20.000000000000000
USING STREAMLINE VERSION 7.3 FOR CHARACTERISTICS
@STREAMLINE::SCARACT: THE NUMBER OF TRACEBACK INTERFACE CROSSINGS IGEN > 99

PLANTE: PROGRAM STOPPED AFTER AN ERROR
RETURNING EXIT CODE: 2
application called MPI_Abort(MPI_COMM_WORLD, 2) - process 0
application called MPI_Abort(MPI_COMM_WORLD, 2) - process 1
application called MPI_Abort(MPI_COMM_WORLD, 2) - process 2
Traceback (most recent call last):
File "/PRODCOM/Ubuntu20.04/Gcc-9.3.0-Mpich-3.3.2/TELEMAC/v8p2r1-gcc-9.3.0-mpich-3.3.2-python-3.8.3/scripts/python3/runcode.py", line 287, in <module>
main(None)
File "/PRODCOM/Ubuntu20.04/Gcc-9.3.0-Mpich-3.3.2/TELEMAC/v8p2r1-gcc-9.3.0-mpich-3.3.2-python-3.8.3/scripts/python3/runcode.py", line 271, in main
run_study(cas_file, code_name, options)
File "/PRODCOM/Ubuntu20.04/Gcc-9.3.0-Mpich-3.3.2/TELEMAC/v8p2r1-gcc-9.3.0-mpich-3.3.2-python-3.8.3/scripts/python3/execution/run_cas.py", line 157, in run_study
run_local_cas(my_study, options)
File "/PRODCOM/Ubuntu20.04/Gcc-9.3.0-Mpich-3.3.2/TELEMAC/v8p2r1-gcc-9.3.0-mpich-3.3.2-python-3.8.3/scripts/python3/execution/run_cas.py", line 65, in run_local_cas
my_study.run(options)
File "/PRODCOM/Ubuntu20.04/Gcc-9.3.0-Mpich-3.3.2/TELEMAC/v8p2r1-gcc-9.3.0-mpich-3.3.2-python-3.8.3/scripts/python3/execution/study.py", line 612, in run
self.run_local()
File "/PRODCOM/Ubuntu20.04/Gcc-9.3.0-Mpich-3.3.2/TELEMAC/v8p2r1-gcc-9.3.0-mpich-3.3.2-python-3.8.3/scripts/python3/execution/study.py", line 440, in run_local
run_code(self.run_cmd, self.sortie_file)
File "/PRODCOM/Ubuntu20.04/Gcc-9.3.0-Mpich-3.3.2/TELEMAC/v8p2r1-gcc-9.3.0-mpich-3.3.2-python-3.8.3/scripts/python3/execution/run.py", line 182, in run_code
raise TelemacException('Fail to run\n'+exe)
utils.exceptions.TelemacException: Fail to run
mpiexec -wdir /work/jalzate/Test_case_sin11/t3d_stratif2.cas_2022-06-09-20h57min40s -n 3 /work/jalzate/Test_case_sin11/t3d_stratif2.cas_2022-06-09-20h57min40s/out_user_fortran

After some further checks, what makes the simulation crash is the boundary condition type:
LITABS%ADR(ITRAC)%P%I(IPOIN2) = KENT

With another boundary condition type the simulation does not crash, regardless of the number of processors, e.g.:
LITABS%ADR(ITRAC)%P%I(IPOIN2) = KLOG

I was wondering whether there might be a parallelization problem with the KENT condition type?
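As a side check, here is a small debug sketch that could be added after the loop to print, on each processor, how many free-surface points are actually flagged as KENT (my assumptions: IPID is the processor rank and LU the listing unit available from the TELEMAC declaration modules; NKENT is a local counter introduced only for this test):

!     Debug sketch (assumptions: IPID = processor rank and LU = listing
!     unit from the TELEMAC declaration modules; NKENT is local)
      INTEGER NKENT
      NKENT = 0
      DO IPOIN2 = 1,NPOIN2
        IF(LITABS%ADR(ITRAC)%P%I(IPOIN2).EQ.KENT) NKENT = NKENT + 1
      ENDDO
      WRITE(LU,*) 'RANK ',IPID,': ',NKENT,' SURFACE POINTS SET TO KENT'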

I am currently using version v8p2r1 and I attach the test case with which I identified the problem.

File Attachment:

File Name: Test_userTR3.tar
File Size: 260 KB



Thank you very much