Hello,
I have been trying to package TELEMAC directly from the sources for Linux and macOS, in order to match @nicogodet's implementation for Windows.
For reference, here is the forum thread on the TELEMAC conda package for Windows:
www.opentelemac.org/index.php/kunena/2-w...elemac-conda-package
I have now finally managed to find the right MUMPS conda libraries to use, which are:
and
Reference: github.com/conda-forge/mumps-feedstock
(The stock mumps conda package only provides "libdmumps.a", which is static, so compilation does not work for the dynamic API.)
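In case it helps anyone reproducing this, a quick way to check whether a given conda MUMPS build ships static archives or shared objects (paths are just an example, adapt to your environment):
# a static-only build shows .a archives, a usable one shows .so shared objects
ls -l $CONDA_PREFIX/lib/libdmumps*
# 'file' distinguishes "current ar archive" from "ELF shared object" explicitly
file $CONDA_PREFIX/lib/libdmumps*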
However, I still can't run the creocean example test case. Steps to reproduce:
Install the telemac-mascaret package for Linux: mamba create -n telemac -c tomsail telemac-mascaret
Activate the environment (I forgot to include an activation script in the package).
Make sure the MUMPS libraries are present in the $HOMETEL/lib folder: ll $HOMETEL/lib | grep mumps
-rwxrwxr-x 2 saillth users 2309584 Mar 11 2022 libcmumps-5.2.1.so*
-rwxrwxr-x 2 saillth users 2270664 Mar 11 2022 libcmumps_seq-5.2.1.so*
lrwxrwxrwx 1 saillth users 22 Oct 14 08:06 libcmumps_seq.so -> libcmumps_seq-5.2.1.so*
lrwxrwxrwx 1 saillth users 18 Oct 14 08:06 libcmumps.so -> libcmumps-5.2.1.so*
-rwxrwxr-x 2 saillth users 2268328 Mar 11 2022 libdmumps-5.2.1.so*
-rwxrwxr-x 2 saillth users 2229408 Mar 11 2022 libdmumps_seq-5.2.1.so*
lrwxrwxrwx 1 saillth users 22 Oct 14 08:06 libdmumps_seq.so -> libdmumps_seq-5.2.1.so*
lrwxrwxrwx 1 saillth users 18 Oct 14 08:06 libdmumps.so -> libdmumps-5.2.1.so*
-rwxrwxr-x 2 saillth users 25504 Mar 9 2022 libesmumps-6.so*
-rwxrwxr-x 2 saillth users 25504 Mar 9 2022 libesmumps.so*
-rwxrwxr-x 2 saillth users 382640 Mar 11 2022 libmumps_common-5.2.1.so*
-rwxrwxr-x 2 saillth users 423504 Mar 11 2022 libmumps_common_seq-5.2.1.so*
lrwxrwxrwx 1 saillth users 28 Oct 14 08:06 libmumps_common_seq.so -> libmumps_common_seq-5.2.1.so*
lrwxrwxrwx 1 saillth users 24 Oct 14 08:06 libmumps_common.so -> libmumps_common-5.2.1.so*
-rwxrwxr-x 2 saillth users 25504 Mar 9 2022 libptesmumps-6.so*
-rwxrwxr-x 2 saillth users 25504 Mar 9 2022 libptesmumps.so*
-rwxrwxr-x 2 saillth users 2264280 Mar 11 2022 libsmumps-5.2.1.so*
-rwxrwxr-x 2 saillth users 2229456 Mar 11 2022 libsmumps_seq-5.2.1.so*
lrwxrwxrwx 1 saillth users 22 Oct 14 08:06 libsmumps_seq.so -> libsmumps_seq-5.2.1.so*
lrwxrwxrwx 1 saillth users 18 Oct 14 08:06 libsmumps.so -> libsmumps-5.2.1.so*
-rwxrwxr-x 2 saillth users 2280864 Mar 11 2022 libzmumps-5.2.1.so*
-rwxrwxr-x 2 saillth users 2250136 Mar 11 2022 libzmumps_seq-5.2.1.so*
lrwxrwxrwx 1 saillth users 22 Oct 14 08:06 libzmumps_seq.so -> libzmumps_seq-5.2.1.so*
lrwxrwxrwx 1 saillth users 18 Oct 14 08:06 libzmumps.so -> libzmumps-5.2.1.so*
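As an extra sanity check (not part of the package, just something I run by hand), I also verify that the shared MUMPS libraries resolve their own runtime dependencies from inside the environment:
# every dependency should resolve to a path inside the conda env; any "not found"
# entry here would already explain a failure when the API tries to load libdmumps
ldd $HOMETEL/lib/libdmumps.so | grep -E 'scalapack|blas|mpi|not found'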
Make sure the library paths are correctly set in systel.cfg: cat $HOMETEL/configs/systel.cfg
...
incs_mumps: -I$MUMPSHOME/include
flags_mumps: -DHAVE_MUMPS
libs_mumps: $MUMPSHOME/lib/libdmumps-5.2.1.so
$MUMPSHOME/lib/libmumps_common-5.2.1.so
$MUMPSHOME/lib/libpord-5.2.1.so
$SCALAPACKHOME/lib/libscalapack.so
-L$BLACSHOME -lblas
libs_so_mumps: -L$MUMPSHOME/lib -ldmumps -lmumps_common -lpord
-L$SCALAPACKHOME/lib -lscalapack
-lblas
libs_so_mumps_mkl: -L$MUMPSHOME/lib -ldmumps -lmumps_common -lpord
-lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64
-lmkl_intel_lp64 -lmkl_intel_thread -lmkl_core -lmkl_def
-lpthread -lm -ldl
...
#
# GFortran
#
[S10.gfortran.dyn]
brief: Scibian 10 dynamic build using GFortran 8.3.0 and Open MPI.
options: api
#
f2py_name: f2py3
pyd_fcompiler: gnu95
sfx_lib: .so
#
obj_flags: -O2 -fPIC [fflags_gfo] [flags_mpi] [flags_mumps]
lib_flags: -fPIC -shared [fflags_gfo]
#
exe_flags: -fPIC [fflags_gfo]
#
cmd_lib: [fc] [lib_flags] -o <libname> <objs>
incs_all: [incs_mumps]
# libs_all: [libs_so]
# libs_all: [libs_so_mumps] [libs_med] [libs_metis] [libs_gotm]
libs_all: [libs_mumps] [libs_metis]
libs_so: [libs_so_mumps] [libs_metis]
cflags: -fPIC
#
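Since these cfg entries rely on $MUMPSHOME, $SCALAPACKHOME and $BLACSHOME being exported, I also double-check before compiling that they are set and point inside the environment (a simple sanity check; the variable names are the ones from my cfg above):
# all three should typically resolve to $CONDA_PREFIX (or $HOMETEL for a local install)
echo "MUMPSHOME=$MUMPSHOME"
echo "SCALAPACKHOME=$SCALAPACKHOME"
echo "BLACSHOME=$BLACSHOME"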
Compile TELEMAC: the compilation runs fine.
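For completeness, the compile step itself is nothing exotic; it is along these lines (assuming the standard TELEMAC scripts are on the PATH and the config name matches the cfg excerpt above; if SYSTELCFG and USETELCFG are already exported, a plain compile_telemac.py should be enough):
# build everything, API included, with the S10.gfortran.dyn configuration
compile_telemac.py -f $HOMETEL/configs/systel.cfg -c S10.gfortran.dyn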
Run the test case: git clone https://gitlab.pam-retd.fr/otm/telemac-mascaret.git
cd telemac-mascaret/examples/artemis/creocean
artemis.py art_creocean_2.cas
The run fails with the following error:
WITH NO PARALLELISM,
NO DIRECT SYSTEM SOLVER 9 USE 8
PLANTE: PROGRAM STOPPED AFTER AN ERROR
RETURNING EXIT CODE: 2
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 2.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
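From the message, my understanding is that the direct solver (option 9, i.e. MUMPS) is not available in the build that actually runs, so my suspicion is that the solver libraries were built or loaded without MUMPS support. A quick check I use (the path follows the usual $HOMETEL/builds/<config>/lib layout, adapt the config name if yours differs):
# every built library that is supposed to use MUMPS should list libdmumps here;
# if grep prints nothing, -DHAVE_MUMPS or the MUMPS link flags did not reach the build
for so in $HOMETEL/builds/S10.gfortran.dyn/lib/*.so; do
    echo "== $so"; ldd "$so" | grep -i mumps
done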