Hi:
Now I can compile the MPI example "fpi.f" (an example from MPICH2) without problems and run it with the DOS command "C:\MPICH2\Bin\mpiexec.exe -n 2 -localonly C:\MPICH2\examples\x64\Debug\fpi.exe". The parallel calculation of this example works well.
Then I tried to do the same with Telemac-2D (V6P2). The compilation and linking of the V6P2 source code (parallel) are OK. I used partel_win64.exe to split the files "T2DGEO", "T2DCli", "T2DREF" and "T2DRES" into two parts (2 processors). The following files are generated by partel_win64.exe:
t2dgeo00001-00000
t2dgeo00001-00001
t2dcli00001-00000
t2dcli00001-00001
t2dref00001-00000
t2dref00001-00001
t2dres00001-00000
t2dres00001-00001
Then I ran the DOS command with MPI: "C:\MPICH2\Bin\mpiexec.exe -n 2 -localonly D:\sourcen\Telemac2D_V6.2\Telemac2D_V6.2_Parallel\x64\Debug\Telemac2D_V6.2_Parallel.exe". The project data could be read with no error, but the program stopped at one point. I debugged the program and found out that the function call:
NPOIN_GLOB=P_IMAX(NPOIN_GLOB)
in subroutine almesh.f could not be executed, because the program stopped at
CALL MPI_ALLREDUCE(MYPART,P_IMAX,1,MPI_INTEGER,MPI_MAX,
& MPI_COMM_WORLD,IER)
in subroutine P_IMAX.
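For reference, if I understand it correctly, this call should simply take each rank's local value and return the maximum over all MPI ranks to every rank. A plain-Python sketch of that semantics (no real MPI involved, names purely illustrative):

```python
# Conceptual model of MPI_ALLREDUCE with the MPI_MAX operation:
# every rank contributes one integer, and every rank receives the
# global maximum back. This is what P_IMAX(NPOIN_GLOB) should do.
def allreduce_max(values_per_rank):
    """values_per_rank[i] is what rank i passes in (its local value);
    the returned list is what each rank gets back after the reduction."""
    global_max = max(values_per_rank)
    return [global_max] * len(values_per_rank)

# e.g. two subdomains contributing different local values:
print(allreduce_max([1200, 1350]))  # every rank sees [1350, 1350] -> 1350
```

So with 2 processors the call should just block until both ranks reach it and then hand both of them the same maximum; if one rank never enters the call, the other hangs there, which is what I seem to observe.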
I copied this line into my example fpi.f, and there it works well, so I have no idea why it fails here. Can you tell me whether running in parallel with mpiexec.exe in this way is correct? Why does it work with my example, but not with Telemac-2D?
PS: I do not use the Python configuration (also not for ParaVoid-Telemac2D). I just copied the project files T2dgeo, T2dcas, T2dcli ... into the program folder and ran the exe file there. This also works well for ParaVoid-Telemac2D.
thanks
kutti