Unverified commit dd432f7c, authored by Brent Huisman, committed by GitHub

tiny themefix, tutorialfix (#1390)

parent 0d336d3d
@@ -127,7 +127,7 @@
 <div class="header-title-wrap">
 <a class="header-title" href="{{ pathto(master_doc) }}">{{ project }}</a>
 </div>
-<a class="logo-link" target="_blank" href="{{ pathto(master_doc) }}"><img src="{{ pathto('_static/' + logo, 1) }}" class="logo" alt="Divio"/></a>
+<a class="logo-link" target="_blank" href="{{ pathto(master_doc) }}"><img src="{{ pathto('_static/' + logo, 1) }}" class="logo" alt="{{ project }}"/></a>
 </div>
 {% include "breadcrumbs.html" %}
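The ``project``, ``logo`` and ``master_doc`` template variables in the hunk above are supplied by the Sphinx configuration, which is why the ``alt`` text now follows the project name instead of the hard-coded "Divio". A rough illustration of the relevant ``conf.py`` settings (the values shown are assumptions, not necessarily what Arbor's docs use):

.. code-block:: python

   # conf.py (Sphinx) -- source of the template variables used above.
   project = "Arbor"               # rendered by {{ project }}; value assumed for illustration
   master_doc = "index"            # target of pathto(master_doc); assumed default
   html_logo = "_static/logo.svg"  # exposed to templates as {{ logo }}; path assumed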
@@ -58,14 +58,13 @@ it as an argument to the ``python`` command, you need to use ``srun`` or ``mpirun
 distribution) to execute a number of jobs in parallel. You can still execute the script using ``python``, but then
 MPI will not execute on more than one node.

-From the commandline, we can run the script using ``mpirun`` or ``srun`` and specify the number of ranks (``NRANKS``)
+From the commandline, we can run the script using ``mpirun`` (``srun`` on clusters operated with SLURM) and specify the number of ranks (``NRANKS``)
 or nodes. Arbor will spread the cells evenly over the ranks, so with ``NRANKS`` set to 5, we'd be spreading the 500
 cells over 5 nodes, simulating 100 cells each.

 .. code-block::

-   srun -n NRANKS python network_ring_mpi.py
-   mpirun -n NRANKS python network_ring_mpi.py
+   mpirun -n NRANKS python mpi.py

 The results
 ***********
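The tutorial script itself is not part of this diff. As a rough sketch only (not the tutorial's actual code), an MPI-enabled Arbor script typically hands the MPI communicator to the execution context so that the load balancer can spread the cells over the ranks; ``ring_recipe`` below is a hypothetical placeholder for the tutorial's recipe:

.. code-block:: python

   # Rough sketch of an MPI-enabled Arbor script; ring_recipe is a
   # placeholder for the tutorial's actual network recipe.
   import arbor

   arbor.mpi_init()                        # initialise MPI (program launched via mpirun/srun)
   comm = arbor.mpi_comm()                 # wrap MPI_COMM_WORLD
   context = arbor.context(mpi=comm)       # execution context spanning all ranks

   recipe = ring_recipe(500)               # placeholder: 500-cell ring network
   decomp = arbor.partition_load_balance(recipe, context)  # cells spread evenly over ranks
   sim = arbor.simulation(recipe, decomp, context)
   sim.run(100)                            # each rank simulates only its share of the cells

   arbor.mpi_finalize()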