Hello,
I have TELEMAC v6p3r2 working with the following setup:
GCC/4.8.1, OpenMPI/1.6.5, METIS/5.1.0, Python/2.7.5
However, switching to the following configuration on the same system:
GCC/4.8.2, OpenMPI/1.6.5, METIS/5.1.0, Python/2.7.6
results in

    Program received signal SIGSEGV: Segmentation fault - invalid memory reference.
    Backtrace for this error:

but no backtrace is actually printed. I am using the openmpi-dbg configuration from systel.cis-fedora.cfg, in which the backtrace option is enabled. The example job uses 40 cores on 2 nodes. With the former configuration, MPI communication between the nodes is set up and the job runs; with the latter, the job crashes.
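For reference, the compile command in that configuration boils down to something like the following (paraphrased from memory rather than copied verbatim from the cfg file; -g and -fbacktrace are the gfortran options that should make the backtrace appear):

    cmd_obj: mpif90 -c -O0 -g -fbacktrace <mods> <incs> <f95name>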
Do you have any idea how to make the backtrace show up?
Has anyone encountered similar problems with the above setup?
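As a fallback, I suppose I could force a core dump and read the backtrace from it with gdb, along these lines (the executable name below is just a placeholder for my case):

    ulimit -c unlimited
    mpirun -np 40 ./telemac_exe      # placeholder for the actual executable
    gdb ./telemac_exe core           # core file may be named core.<pid>
    (gdb) bt

But collecting core files from both nodes seems awkward, so a working -fbacktrace would be much nicer.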
Regards,
Stefan