TOPIC: My installation on macOS using Homebrew

My installation on macOS using Homebrew 1 year 11 months ago #41792

  • Renault
  • Senior Boarder
  • Posts: 120
  • Thank you received: 33
Hi all,
I thought I would share my installation process for the OTM suite on macOS. Since macOS is Unix-like, and others on the forum have reported success with it, I figured the Linux procedures would work with a bit of adaptation.

The main change is that macOS does not ship with a command-line package manager, so I had to pick and install one. I went with Homebrew, but you are of course free to choose another and adapt the instructions below accordingly.

Also, I am installing at the system level, in /opt/telemac; you may prefer to install locally instead, e.g. in ~/opt/telemac. In that case, use $HOME/opt (or wherever you like) instead of /opt throughout.

Finally, I use gcc for compilation and MPICH for parallelism, but you are free to substitute another compiler or MPI implementation, such as Intel Fortran (ifort) or Open MPI.

To begin, if you do not already have a package manager, install Homebrew as per instructions on their website. Then, run the following commands in the terminal to install the dependencies:
brew install python gcc git git-lfs mpich metis
pip install numpy scipy matplotlib
If you have other packages you want, install them using Homebrew or from source; you're on your own for this step as I have never needed, let alone successfully installed/compiled, any others.
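(Depending on how Homebrew set up Python, you may need pip3 instead of pip for the line above.) Before going further, a quick sanity check that the essentials are in place does not hurt; the exact version numbers are not important, only that the commands are found:
gfortran --version
mpirun --version
python3 -c "import numpy, scipy, matplotlib"
brew --prefix metis
# the last line prints where Homebrew put METIS; that path comes back later in the config files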

With these installed, run the following commands to download OTM, replacing <latest-version> with v8p4r0 or whatever is newest (note, there might be another way to do this and access updates more readily, but I'm not aware of it):
# sudo is needed because /opt is owned by root; skip it for a local install under $HOME
sudo mkdir -p /opt/telemac
sudo chown <your-username> /opt/telemac
cd /opt/telemac
git clone https://gitlab.pam-retd.fr/otm/telemac-mascaret.git <latest-version>
cd <latest-version>
git checkout tags/<latest-version>
cd configs
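On the parenthetical above: since this is a plain git clone, you should be able to fetch newer release tags into the same checkout later and switch to them, roughly like this (a sketch I have not exercised much; note that the directory name will then no longer match the tag, and you will need to re-source the environment and recompile afterwards):
cd /opt/telemac/<latest-version>
git fetch --tags
git tag
# pick the newest tag from the list, then:
git checkout tags/<newer-version>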

I am providing my config files below. Note that paths to libraries like MPICH and METIS may differ slightly. Name this one pysource.macos.sh:
###
### TELEMAC settings -----------------------------------------------------------
###
# TELEMAC version number
export TELVER=v8p4r0
# Path to telemac root dir
export HOMETEL=/opt/telemac/$TELVER
# Adding python scripts to PATH
export PATH=$HOMETEL/scripts/python3:.:$PATH
# Configuration file
export SYSTELCFG=$HOMETEL/configs/systel.macos.cfg
# Name of the configuration to use
export USETELCFG=gfort-mpich
# Path to this file
export SOURCEFILE=$HOMETEL/configs/pysource.macos.sh
### Python
# To force python to flush its output
export PYTHONUNBUFFERED='true'
### API
export PYTHONPATH=$HOMETEL/scripts/python3:$PYTHONPATH
export LD_LIBRARY_PATH=$HOMETEL/builds/$USETELCFG/lib:$LD_LIBRARY_PATH
export PYTHONPATH=$HOMETEL/builds/$USETELCFG/lib:$PYTHONPATH
###
### MPI -----------------------------------------------------------
export MPIHOME=/usr/local/opt/mpich
export PATH=$MPIHOME/bin:$PATH
export LD_LIBRARY_PATH=$MPIHOME/lib:$LD_LIBRARY_PATH
###
### EXTERNAL LIBRARIES -----------------------------------------------------------
###
### METIS -------------------------------------------------------------
export METISHOME=/usr/local/opt/metis
export LD_LIBRARY_PATH=$METISHOME/lib:$LD_LIBRARY_PATH
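One caveat on the MPI and METIS paths above: /usr/local/opt/... is where Homebrew links its packages on Intel Macs, whereas on Apple Silicon the prefix is /opt/homebrew. If you would rather not hard-code either, you can let Homebrew fill the paths in for you (brew --prefix simply prints the install prefix of a formula on whichever machine you are on):
export MPIHOME=$(brew --prefix mpich)
export METISHOME=$(brew --prefix metis)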

This is the systel configuration file; name it systel.macos.cfg (the name referenced by SYSTELCFG above). It again assumes the paths set up via Homebrew. If you are feeling brave, you can replace -O3 with -Ofast (honestly, I don't know how much of a difference it makes!). With credit to this post:
# _____                              _______________________________
# ____/ TELEMAC Project Definitions /______________________________/
#
# based on systel.cis-ubuntu.cfg from v7p2r0
[Configurations]
configs:   gfort gfort-mpich
# _____          ____________________________________
# ____/ General /___________________________________/
# Global declaration that are true for each configuration
[general]
language:   2
modules:     system
#
cmd_lib:    ar cru <libname> <objs>
#
mods_all:   -I <config>
#
sfx_zip:    .zip
sfx_lib:    .so
sfx_obj:    .o
sfx_mod:    .mod
sfx_exe:
#
val_root:   <root>/examples
val_rank:   all
cmd_obj_c: gcc -c <srcName> -o <objName>
#
# _____                           ____________________________________
# ____/ Mac OS X gfortran scalar /___________________________________/
[gfort]
#
cmd_obj:    gfortran -cpp -c -O3 -fconvert=big-endian -frecord-marker=4 <mods> <incs> <f95name>
cmd_exe:    gfortran -fconvert=big-endian -frecord-marker=4 -v -o <exename> <objs> <libs>

#
# _____                          ____________________________________
# ____/ Mac OS X gfortran mpich /___________________________________/
[gfort-mpich]
#
brief: parallel mode, using mpiexec directly (of the MPICH2 package).
       The only difference with the scalar versions (debugged) is the presence
       of the key mpi_cmdexec and the -DHAVE_MPI compilation directive.
       Of course, you also need the key par_cmdexec.
       Finally, note that this configuration also works whether
       processor is 0 or 1.
#
par_cmdexec:   <config>/partel < <partel.par> >> <partel.log>
mpi_cmdexec:  mpiexec -wdir <wdir> -n <ncsize> <exename>
#
cmd_obj:    gfortran -cpp -c -O3 -fbounds-check -fbacktrace -finit-real=nan -DHAVE_MPI -fconvert=big-endian -frecord-marker=4 <mods> <incs> <f95name>
cmd_exe:    mpifort -fconvert=big-endian -frecord-marker=4 -lpthread -v -lm -o <exename> <objs> <libs>
#
#these lines may have to begin with incs_all and libs_all instead of incs and libs, but I'm not 100% sure
incs: -I $MPIHOME/include -I $METISHOME/include
libs: $METISHOME/lib/libmetis.dylib -L $MPIHOME/lib 
#

With these in place, you can run the following commands, and hopefully all goes well:
source pysource.macos.sh
config.py
compile_telemac.py
If not, check the errors; they are usually somewhat helpful, even if a bit cryptic.
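If something fails, the first thing worth verifying is that the environment from pysource.macos.sh really is loaded and that the expected tools are being picked up, for example:
echo $HOMETEL
echo $SYSTELCFG $USETELCFG
which gfortran mpifort mpiexec
# these should point at the /opt/telemac tree and the Homebrew installs set up above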

If compilation is successful, you can try running one of the example cases with however many cores/threads you have or want (4 in this example). Note that you first have to actually download the large Selafin files via Git LFS before running:
git lfs install
git lfs pull
cd /opt/telemac/<latest-version>/examples/telemac2d/gouttedo
# or cd ../examples/telemac2d/gouttedo if you are still in configs
telemac2d.py --ncsize=4 t2d_gouttedo.cas
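If you want to rule out MPI while debugging, you can also build and run with the scalar gfort configuration defined above; a sketch of what that looks like (each configuration gets its own directory under builds/):
# in pysource.macos.sh, set: export USETELCFG=gfort, then:
source /opt/telemac/<latest-version>/configs/pysource.macos.sh
compile_telemac.py
telemac2d.py t2d_gouttedo.cas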

Finally, to be able to load the OTM environment, you can create a symlink/alias to the pysource file in your home folder, and then load it on demand using source:
ln -s /opt/telemac/<latest-version>/configs/pysource.macos.sh $HOME/telemac.sh
source ~/telemac.sh

If you want, you can add the source command to your .zshrc or .bashrc so the commands are loaded at terminal login, or add an alias to load them at any time:
# contents of your .zshrc ...
# To load commands at login
source /opt/telemac/<latest-version>/configs/pysource.macos.sh
# To add an alias (change otm to another name if you want)
alias otm='source /opt/telemac/<latest-version>/configs/pysource.macos.sh'
# further contents of your .zshrc ...
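Once that alias is in place, a fresh terminal session looks something like this:
otm
config.py
# config.py prints the active configuration, which is a quick way to confirm the environment loaded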

I hope this helps! If you run into any issues and manage to fix them, please let me know and I will incorporate these changes. I can't guarantee I will be able to help you with issues, though, as I'm just sharing what worked for me and am not affiliated with the OTM consortium.
The following user(s) said Thank You: borisb, yanncv

My installation on macOS using Homebrew 1 year 11 months ago #41794

  • sebourban
  • Administrator
  • Principal Scientist
  • Posts: 814
  • Thank you received: 219
Thank you very much Renault - this is really appreciated.

Can I ask whether you have an Intel or ARM processor?

Best regards,
Sébastien.

My installation on macOS using Homebrew 1 year 11 months ago #41797

  • Renault
  • Senior Boarder
  • Posts: 120
  • Thank you received: 33
Hi Sébastien,

I use an Intel Mac, but I have gotten OTM to work on an ARM Mac (not mine). It wasn't very fast, though, and there are probably several reasons for that. The procedure should be the same on ARM; it just may not be as well optimized, since compilers and libraries for the newer architecture are still maturing.
I doubt much of the OTM userbase uses Mac (let alone ARM) for OTM, but I'd be curious to see if anyone gets it working well.
(As an aside, on a Linux cluster, I have found Intel Fortran/MPI to work faster than GNU Fortran and OpenMPI, but haven't been able to get an Intel compiler working on my computer.)

My installation on macOS using Homebrew 1 year 11 months ago #41803

  • borisb
  • Admin
  • Posts: 128
  • Thank you received: 64
Hello,

Thanks for your macOS configuration. While they are few, we do have some macOS users, so I think I should add a dedicated systel.cfg file for this particular system.

Regarding performance issues on ARM, I think some work needs to be done to get the Telemac algorithms to take full advantage of this architecture. We hope that part of this work can be carried out next year during a hackathon organised by TERATEC and ARM.

My installation on macOS using Homebrew 8 months 1 week ago #44408

  • yanncv
  • Fresh Boarder
  • Posts: 2
Hello,

Thank you very much Renault for your work.

I am currently trying to install Telemac on my 2020 MacBook Air, which has an M1 processor (ARM architecture).

I've gone through the different steps, but I am struggling at the compile_telemac.py step, as I get error messages such as: "cputype (16777223) does not match previous archive members cputype (16777228)" and "ld: warning: ignoring file adstack.o, building for macOS-arm64 but attempting to link with file built for unknown-x86_64".

This seems to be an issue related to the fact that the compile files are built for an Intel architecture, not ARM. Or maybe it has to do with gcc.
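(For reference, the file command shows which architecture a given object or library was built for, which might help pin down where the x86_64 pieces come from:)
file adstack.o
file $(brew --prefix metis)/lib/libmetis.dylib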
Would you have any tips on how to get around this problem?

Thank you!

My installation on macOS using Homebrew 8 months 4 days ago #44448

  • yanncv
  • Fresh Boarder
  • Posts: 2
EDIT: I actually made it work by installing Rosetta 2 and opening a terminal with the emulator activated. Everything then builds and runs as x86_64, which allowed my computer to use the files it couldn't read previously.
Now hoping that OTM runs smoothly on this configuration :)
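For anyone else on Apple Silicon, the command-line version of that is roughly the following (alternatively, tick "Open using Rosetta" in the Terminal app's Get Info panel):
softwareupdate --install-rosetta --agree-to-license
arch -x86_64 /bin/zsh
# the new shell, and everything launched from it, then runs as x86_64 under Rosetta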

My installation on macOS using Homebrew 8 months 4 days ago #44449

  • Renault
  • Senior Boarder
  • Posts: 120
  • Thank you received: 33
Hi yanncv,

Very cool that you got it to work! I've also had compilation issues on x86 recently; I can't recall their exact nature, but they came down to some change in Xcode.

Do keep us updated!

André Renault