
TOPIC: Fail to run a telemac 2d simulation

Fail to run a telemac 2d simulation 7 years 7 months ago #26120

Hi all,
I have successfully compiled the TELEMAC-2D code under Ubuntu and managed to run several of the examples. However, when I run my own case, I get an error from mpiexec about MPI_INITIALIZED:
Can anybody help? I am a new user of TELEMAC. Thanks.

MPI_INITIALIZED F
[lairuixun-ThinkPad-X220:2341] *** An error occurred in MPI_Comm_rank
[lairuixun-ThinkPad-X220:2341] *** reported by process [1197998081,0]
[lairuixun-ThinkPad-X220:2341] *** on communicator MPI_COMM_WORLD
[lairuixun-ThinkPad-X220:2341] *** MPI_ERR_COMM: invalid communicator
[lairuixun-ThinkPad-X220:2341] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[lairuixun-ThinkPad-X220:2341] *** and potentially your MPI job)
Primary job terminated normally, but 1 process returned
a non-zero exit code.. Per user-direction, the job has been aborted.

mpiexec detected that one or more processes exited with non-zero status, thus causing
the job to be terminated. The first process to do so was:

Process name: [[18280,1],0]
Exit code: 5
_____________
runcode::main:
:
|runCode: Fail to run
|/usr/bin/mpiexec -wdir /home/lairuixun/opentelemac/v7p2/scripts/python27/2d_xiaobeiganliu/xbgl_steering.cas_2017-04-19-06h52min32s -n 1 /home/lairuixun/opentelemac/v7p2/scripts/python27/2d_xiaobeiganliu/xbgl_steering.cas_2017-04-19-06h52min32s/out_telemac2d
The administrator has disabled public write access.

Fail to run a telemac 2d simulation 7 years 7 months ago #26141

  • Phelype
Hello,

I tried to run your simulation and it stops because no law of bottom friction is defined (the error message is: NO FRICTION LAW IS PRESCRIBED!).

To make it work, add the following two lines to your .cas file:
LAW OF BOTTOM FRICTION = 2
FRICTION COEFFICIENT = 50

These settings select Chézy's formula as the law of bottom friction, with a friction coefficient of 50.

Change these values depending on what your simulation needs.
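
For reference, these are the possible values of LAW OF BOTTOM FRICTION as far as I remember them; double-check against the telemac2d dictionary file before relying on this list:

```
/ Friction settings in the .cas steering file ("/" starts a comment line).
/ Values of LAW OF BOTTOM FRICTION (from memory, check telemac2d.dico):
/   0 : no friction        1 : Haaland
/   2 : Chezy              3 : Strickler
/   4 : Manning            5 : Nikuradse
LAW OF BOTTOM FRICTION = 2
FRICTION COEFFICIENT   = 50
```

The meaning of FRICTION COEFFICIENT then depends on the chosen law (a Chézy coefficient for 2, a Strickler coefficient for 3, and so on).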

Best Regards,

Phelype
The following user(s) said Thank You: lairuixun

Fail to run a telemac 2d simulation 7 years 7 months ago #26144

Thank you! It works now that I have added the keyword LAW OF BOTTOM FRICTION.

Fail to run a telemac 2d simulation 7 years 7 months ago #26142

I have solved my problem.

Fail to run a telemac 2d simulation 7 years 5 months ago #26804

  • victor
Did you get a solution to this issue? I have the same problem and no idea how to solve it.

Cheers

Victor

Fail to run a telemac 2d simulation 7 years 5 months ago #26809

  • jose2kk
Hello Victor,

Could you please attach your files?
It would be easier to solve the problem by looking at your files.

Cheers,
José Andrés.

Fail to run a telemac 2d simulation 7 years 5 months ago #26815

  • Phelype
Hello Victor,

If you have the exact same problem, then the solution is as I stated above.

However, the error message lairuixun posted here is a generic MPI message that is thrown whenever MPI is not correctly initialized. The actual error message appears before what he posted, and I only figured out what it was because I ran the simulation on my computer. So the error you are experiencing might not be the same; only the MPI message is.

If you attach the listing or the simulation files we can help.

Best regards,

Phelype

Fail to run a telemac 2d simulation 7 years 5 months ago #26816

  • victor
Thanks for the reply

I have installed TELEMAC many times on Ubuntu computers. This time I followed the step-by-step guide to install v7p1 on Ubuntu 16.

When it finished, I ran the TELEMAC-2D Malpasset small case, which I know very well.

The error message is this:



Loading Options and Configurations
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

[v7p1 ASCII art banner]


... parsing configuration file: /home/victor/telemac/v7p1/configs/systel.ubuntu.cfg


Running your CAS file for:
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

+> configuration: ubugfmpich
+> parallel mode, using mpiexec directly (of the MPICH2 package).
| The only difference with the scalar versions (optimised) is the presence
| of the key mpi_cmdexec and the -DHAVE_MPI compilation directive.
| Of course, you also need the key par_cmdexec.
| Finally, note that this configuration also works whether
| processor is 0 or 1.
+> root: /home/victor/telemac/v7p1


~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~


... reading the main module dictionary

... processing the main CAS file(s)
+> simulation en Francais

... handling temporary directories

... checking coupling between codes

... checking parallelisation

... first pass at copying all input files
copying: geo_malpasset-small.slf /home/victor/telemac/v7p1/examples/telemac2d/malpasset/t2d_malpasset-small.cas_2017-06-15-22h30min51s/T2DGEO
copying: t2d_malpasset-small.f /home/victor/telemac/v7p1/examples/telemac2d/malpasset/t2d_malpasset-small.cas_2017-06-15-22h30min51s/t2dfort.f
copying: geo_malpasset-small.cli /home/victor/telemac/v7p1/examples/telemac2d/malpasset/t2d_malpasset-small.cas_2017-06-15-22h30min51s/T2DCLI
re-copying: /home/victor/telemac/v7p1/examples/telemac2d/malpasset/t2d_malpasset-small.cas_2017-06-15-22h30min51s/T2DCAS
copying: telemac2d.dico /home/victor/telemac/v7p1/examples/telemac2d/malpasset/t2d_malpasset-small.cas_2017-06-15-22h30min51s/T2DDICO

... checking the executable
re-copying: t2d_malpasset-small /home/victor/telemac/v7p1/examples/telemac2d/malpasset/t2d_malpasset-small.cas_2017-06-15-22h30min51s/out_t2d_malpasset-small

... modifying run command to MPI instruction

... modifying run command to PARTEL instruction

... partitioning base files (geo, conlim, sections and zones)
+> /home/victor/telemac/v7p1/builds/ubugfmpich/bin/partel < PARTEL.PAR >> partel_T2DGEO.log

... splitting / copying other input files

... handling sortie file(s)


Running your simulation(s) :
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~



/usr/bin/mpiexec -wdir /home/victor/telemac/v7p1/examples/telemac2d/malpasset/t2d_malpasset-small.cas_2017-06-15-22h30min51s -n 4 /home/victor/telemac/v7p1/examples/telemac2d/malpasset/t2d_malpasset-small.cas_2017-06-15-22h30min51s/out_t2d_malpasset-small


[telemac2d v7p1 ASCII art banner]
MPI_INITIALIZED F
MPI_INITIALIZED F
MPI_INITIALIZED F
MPI_INITIALIZED F
[victor-Latitude-E6410:3047] *** An error occurred in MPI_Comm_rank
[victor-Latitude-E6410:3047] *** reported by process [2073952257,0]
[victor-Latitude-E6410:3047] *** on communicator MPI_COMM_WORLD
[victor-Latitude-E6410:3047] *** MPI_ERR_COMM: invalid communicator
[victor-Latitude-E6410:3047] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[victor-Latitude-E6410:3047] *** and potentially your MPI job)
_____________
runcode::main:
:
|runCode: Fail to run
|/usr/bin/mpiexec -wdir /home/victor/telemac/v7p1/examples/telemac2d/malpasset/t2d_malpasset-small.cas_2017-06-15-22h30min51s -n 4 /home/victor/telemac/v7p1/examples/telemac2d/malpasset/t2d_malpasset-small.cas_2017-06-15-22h30min51s/out_t2d_malpasset-small
|~~~~~~~~~~~~~~~~~~
|[victor-Latitude-E6410:03045] 3 more processes have sent help message help-mpi-errors.txt / mpi_errors_are_fatal
|[victor-Latitude-E6410:03045] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
|~~~~~~~~~~~~~~~~~~

I checked the libraries; all apparently fine.

Maybe the problem is that I have both OpenMPI and MPICH installed; that's what I'm trying to check.

Any suggestions welcome.
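
In case it helps anyone, this is a quick sketch of how I am checking which MPI stack each command resolves to (the tool names are the usual Ubuntu ones; adapt as needed):

```shell
#!/bin/sh
# List where each MPI wrapper/launcher resolves, to spot a mixed
# OpenMPI/MPICH installation (tool names are the common Ubuntu ones).
for tool in mpiexec mpirun mpif90 mpicc; do
  path=$(command -v "$tool" 2>/dev/null)
  if [ -n "$path" ]; then
    echo "$tool -> $path"
  else
    echo "$tool: not found"
  fi
done
```

Following each reported path with `ls -l` then shows whether it points into the OpenMPI or the MPICH installation.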

Tlazocamati (thanks in Nahuatl)

Victor

Fail to run a telemac 2d simulation 7 years 5 months ago #26818

  • Phelype
Hello Victor,

Indeed the error you are receiving is not the same as above.

The error is occurring during the initialization of MPI. The only thing I can think of, since you said you have both OpenMPI and MPICH, is that your config file is using the compiler from one and some library from the other. Of course this is just a guess.
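
One quick check (just a sketch, not TELEMAC-specific): look at which MPI runtime the built executable is actually linked against. The default binary path below is only a placeholder; pass the out_* executable from your temporary working directory instead:

```shell
#!/bin/sh
# Show the MPI shared libraries a binary is linked against.
# Pass the TELEMAC executable as the first argument, e.g. the
# out_t2d_malpasset-small file from the run's working directory.
BIN=${1:-/bin/ls}   # /bin/ls is only a placeholder default
ldd "$BIN" | grep -i -E 'mpi|mpich|open-pal' || echo "no MPI libraries linked"
```

If the libraries listed belong to a different stack than the mpiexec used to launch the run, that mismatch would explain the crash at initialization.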

Hope this helps.

Best Regards,

Phelype

Fail to run a telemac 2d simulation 7 years 5 months ago #26833

  • victor
I finally solved the problem.

The correct configuration is systel.ubuntu.cfg with OpenMPI:

# _____ _______________________________
# ____/ TELEMAC Project Definitions /______________________________/
#
[Configurations]
configs: ubugfortrans ubugfopenmpi
# ubugfortransdbg ubugfmpich2 ubugfmpich2dbg
#
#
# _____ ___________________________________________________
# ____/ GENERAL /__________________________________________________/
[general]
modules: system -dredgesim
#
cmd_lib: ar cru <libname> <objs>
cmd_exe: /usr/bin/mpif90 -fconvert=big-endian -frecord-marker=4 -v -lm -o <exename> <objs> <libs>
#
mods_all: -I <config>
#
sfx_zip: .gztar
sfx_lib: .a
sfx_obj: .o
sfx_mod: .mod
sfx_exe:
#
val_root: <root>/examples
val_rank: all
# also possible val_rank: <3 >7 6
# _____ ____________________________________
# ____/ Ubuntu gfortran scalar /___________________________________/
[ubugfortrans]
#
brief: scalar mode, Fortran optimisation 3.
TELEMAC will work whether processor is 0 or 1
#
cmd_obj: gfortran -c -O3 -fconvert=big-endian -frecord-marker=4 <mods> <incs> <f95name>
cmd_exe: gfortran -fconvert=big-endian -frecord-marker=4 -v -o <exename> <objs> <libs>
# _____ ____________________________________
# ____/ Ubuntu gfortran scalar debug/___________________________________/
[ubugfortransdbg]
#
brief: scalar mode, Fortran debug mode.
TELEMAC will work whether processor is 0 or 1
#
cmd_obj: gfortran -c -g -fbounds-check -Wall -fbacktrace -finit-real=nan -fconvert=big-endian -frecord-marker=4 <mods> <incs> <f95name>
cmd_exe: gfortran -fconvert=big-endian -frecord-marker=4 -v -o <exename> <objs> <libs>
#
# _____ ____________________________________
# ____/ Ubuntu gfortran openmpi /___________________________________/
[ubugfopenmpi]
#
brief: parallel mode, using mpiexec directly (of the MPICH2 package).
The only difference with the scalar versions (optimised) is the presence
of the key mpi_cmdexec and the -DHAVE_MPI compilation directive.
Of course, you also need the key par_cmdexec.
Finally, note that this configuration also works whether
processor is 0 or 1.
#
mpi_cmdexec: /usr/bin/mpirun -wdir <wdir> -n <ncsize> <exename>
#
cmd_obj: mpif90 -c -O3 -DHAVE_MPI -fconvert=big-endian -frecord-marker=4 <mods> <incs> <f95name>
incs_special: -I /usr/lib/openmpi/include/
incs_parallel: -I /usr/lib/openmpi/include/
incs_bief: -I /usr/lib/openmpi/include/
libs_partel: ~/telemac/v7p1/optionals/metis-5.1.0/build/lib/libmetis.a
libs_all: /usr/lib/libmpi.so
#
# _____ ____________________________________
# ____/ Ubuntu gfortran mpich2 debug /___________________________________/
[ubugfmpich2dbg]
#
brief: parallel mode, using mpiexec directly (of the MPICH2 package).
The only difference with the scalar versions (debugged) is the presence
of the key mpi_cmdexec and the -DHAVE_MPI compilation directive.
Of course, you also need the key par_cmdexec.
Finally, note that this configuration also works whether
processor is 0 or 1.
#
mpi_cmdexec: /usr/bin/mpiexec -wdir <wdir> -n <ncsize> <exename>
#
cmd_obj: gfortran -c -g -fbounds-check -Wall -fbacktrace -finit-real=nan -DHAVE_MPI -DHAVE_MUMPS -fconvert=big-endian -frecord-marker=4 <mods> <incs> <f95name>
#
incs_special: -I /usr/lib/mpich2/include/ -I /home/telemac/mumps/MUMPS_5.0.0/include/
incs_parallel: -I /usr/lib/mpich2/include/ -I /home/telemac/mumps/MUMPS_5.0.0/include/
incs_bief: -I /usr/lib/mpich2/include/ -I /home/telemac/mumps/MUMPS_5.0.0/include/
libs_partel: /home/telemac/metis-5.0.2/libmetis.a
libs_all: /usr/lib/mpich2/lib/libmpich.so -L /home/telemac/mumps/MUMPS_5.0.0/lib -ldmumps -lmumps_common -lpord /home/telemac/mumps/SCALAPACK/libscalapack.a -L /home/telemac/mumps/BLAS-3.5.0 /home/telemac/mumps/BLAS-3.5.0/blas_LINUX.a /home/telemac/mumps/BLACS/LIB/blacs_MPI-LINUX-0.a /home/telemac/mumps/BLACS/LIB/blacsF77init_MPI-LINUX-0.a /home/telemac/mumps/BLACS/LIB/blacs_MPI-LINUX-0.a

I should mention that maybe the problem was that I had installed OpenFOAM 4 previously, so it had configured OpenMPI by default.
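
On Debian/Ubuntu both stacks can be installed side by side and the default is picked through update-alternatives; assuming the usual "mpi" and "mpirun" alternative groups exist on your machine, this sketch lists the registered candidates:

```shell
#!/bin/sh
# List which MPI implementations are registered as alternatives for the
# default mpi/mpirun commands (group names are the usual Debian ones).
for group in mpi mpirun; do
  update-alternatives --list "$group" 2>/dev/null \
    || echo "$group: no alternatives registered"
done
# Switching the default interactively would then be:
#   sudo update-alternatives --config mpi
```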

Cheers

Victor