Update PyNN to 0.12.2, add libNeuroML
Activity
Hi @adavison,
some small changes were needed to the package names :) I fixed them, but it seems I can't trigger the pipeline on your fork (probably because I only have the Reporter role). Could you either start the pipeline (it can be triggered manually here by clicking Run pipeline) or change my role to Developer?
@adavison, import pyNN.arbor complains that it can't find cmake:

Error message:

    ==> [2023-11-14-08:30:15.072258] '/mnt/spack_v0.20.0/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-10.3.0/python-3.8.11-sb7klygirl4v2whqdxjtgg5f2tlglxit/bin/python3.8' '-c' 'import pyNN.arbor'
    Building catalogue 'PyNN' from mechanisms in /builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-10.3.0/py-pynn-0.12.1-m2yqbsawdzwysw4ynlwhsjogw5xt5xtv/lib/python3.8/site-packages/pyNN/arbor/nmodl
    * NMODL
    * na
    * leak
    * pas
    * pas2
    * kdr
    * expsyn
    Build log:
    Traceback (most recent call last):
      File "/mnt/spack_v0.20.0/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-10.3.0/arbor-0.9.0-vs5koousolfyk23apdpqy4pqwskq42lo/bin/arbor-build-catalogue", line 317, in
        sp.run(cmake_cmd, shell=True, check=True, capture_output = True, text = True)
      File "/mnt/spack_v0.20.0/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-10.3.0/python-3.8.11-sb7klygirl4v2whqdxjtgg5f2tlglxit/lib/python3.8/subprocess.py", line 516, in run
        raise CalledProcessError(retcode, process.args,
    subprocess.CalledProcessError: Command 'cmake ..' returned non-zero exit status 127.
    Error: /bin/sh: 1: cmake: not found
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-10.3.0/py-pynn-0.12.1-m2yqbsawdzwysw4ynlwhsjogw5xt5xtv/lib/python3.8/site-packages/pyNN/arbor/__init__.py", line 19, in <module>
        from .standardmodels import * # noqa: F403, F401
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-10.3.0/py-pynn-0.12.1-m2yqbsawdzwysw4ynlwhsjogw5xt5xtv/lib/python3.8/site-packages/pyNN/arbor/standardmodels.py", line 17, in <module>
        from .simulator import state
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-10.3.0/py-pynn-0.12.1-m2yqbsawdzwysw4ynlwhsjogw5xt5xtv/lib/python3.8/site-packages/pyNN/arbor/simulator.py", line 233, in <module>
        build_mechanisms()
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-10.3.0/py-pynn-0.12.1-m2yqbsawdzwysw4ynlwhsjogw5xt5xtv/lib/python3.8/site-packages/pyNN/arbor/simulator.py", line 31, in build_mechanisms
        err_msg = "\n ".join(proc.stdout)
    TypeError: can only join an iterable
    FAILED: PyPynn::test_python3.8_c_importpyNN.arbor: Command exited with status 1: '/mnt/spack_v0.20.0/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-10.3.0/python-3.8.11-sb7klygirl4v2whqdxjtgg5f2tlglxit/bin/python3.8' '-c' 'import pyNN.arbor'
    Building catalogue 'PyNN' from mechanisms in /builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-10.3.0/py-pynn-0.12.1-m2yqbsawdzwysw4ynlwhsjogw5xt5xtv/lib/python3.8/site-packages/pyNN/arbor/nmodl
    * NMODL
    * na
    * leak
    * pas
    * pas2
    * kdr
    * expsyn
    Build log:
    Traceback (most recent call last):
      File "/mnt/spack_v0.20.0/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-10.3.0/arbor-0.9.0-vs5koousolfyk23apdpqy4pqwskq42lo/bin/arbor-build-catalogue", line 317, in
        sp.run(cmake_cmd, shell=True, check=True, capture_output = True, text = True)
      File "/mnt/spack_v0.20.0/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-10.3.0/python-3.8.11-sb7klygirl4v2whqdxjtgg5f2tlglxit/lib/python3.8/subprocess.py", line 516, in run
        raise CalledProcessError(retcode, process.args,
    subprocess.CalledProcessError: Command 'cmake ..' returned non-zero exit status 127.
    Error: /bin/sh: 1: cmake: not found
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-10.3.0/py-pynn-0.12.1-m2yqbsawdzwysw4ynlwhsjogw5xt5xtv/lib/python3.8/site-packages/pyNN/arbor/__init__.py", line 19, in <module>
        from .standardmodels import * # noqa: F403, F401
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-10.3.0/py-pynn-0.12.1-m2yqbsawdzwysw4ynlwhsjogw5xt5xtv/lib/python3.8/site-packages/pyNN/arbor/standardmodels.py", line 17, in <module>
        from .simulator import state
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-10.3.0/py-pynn-0.12.1-m2yqbsawdzwysw4ynlwhsjogw5xt5xtv/lib/python3.8/site-packages/pyNN/arbor/simulator.py", line 233, in <module>
        build_mechanisms()
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-10.3.0/py-pynn-0.12.1-m2yqbsawdzwysw4ynlwhsjogw5xt5xtv/lib/python3.8/site-packages/pyNN/arbor/simulator.py", line 31, in build_mechanisms
        err_msg = "\n ".join(proc.stdout)
    TypeError: can only join an iterable
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/bin/spack", line 54, in <module>
        sys.exit(main())
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/lib/spack/spack_installable/main.py", line 37, in main
        sys.exit(spack.main.main(argv))
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/lib/spack/spack/main.py", line 1018, in main
        return _main(argv)
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/lib/spack/spack/main.py", line 973, in _main
        return finish_parse_and_run(parser, cmd_name, env_format_error)
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/lib/spack/spack/main.py", line 1001, in finish_parse_and_run
        return _invoke_command(command, parser, args, unknown)
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/lib/spack/spack/main.py", line 650, in _invoke_command
        return_val = command(parser, args)
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/lib/spack/spack/cmd/install.py", line 366, in install
        install_with_active_env(env, args, install_kwargs, reporter_factory)
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/lib/spack/spack/cmd/install.py", line 435, in install_with_active_env
        env.install_specs(specs_to_install, **install_kwargs)
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/lib/spack/spack/environment/environment.py", line 1915, in install_specs
        builder.install()
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/lib/spack/spack/installer.py", line 1752, in install
        self._install_task(task)
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/lib/spack/spack/installer.py", line 1285, in _install_task
        spack.package_base.PackageBase._verbose = spack.build_environment.start_build_process(
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/lib/spack/spack/build_environment.py", line 1185, in start_build_process
        p.start()
      File "/usr/lib/python3.8/multiprocessing/process.py", line 121, in start
        self._popen = self._Popen(self)
      File "/usr/lib/python3.8/multiprocessing/context.py", line 224, in _Popen
        return _default_context.get_context().Process._Popen(process_obj)
      File "/usr/lib/python3.8/multiprocessing/context.py", line 277, in _Popen
        return Popen(process_obj)
      File "/usr/lib/python3.8/multiprocessing/popen_fork.py", line 19, in __init__
        self._launch(process_obj)
      File "/usr/lib/python3.8/multiprocessing/popen_fork.py", line 75, in _launch
        code = process_obj._bootstrap(parent_sentinel=child_r)
      File "/usr/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
        self.run()
      File "/usr/lib/python3.8/multiprocessing/process.py", line 108, in run
        self._target(*self._args, **self._kwargs)
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/lib/spack/spack/build_environment.py", line 1051, in _setup_pkg_and_run
        return_value = function(pkg, kwargs)
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/lib/spack/spack/installer.py", line 2089, in build_process
        return installer.run()
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/lib/spack/spack/installer.py", line 1954, in run
        self._real_install()
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/lib/spack/spack/installer.py", line 2049, in _real_install
        phase_fn.execute()
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/lib/spack/spack/builder.py", line 436, in execute
        callback(self.builder)
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/lib/spack/spack/build_systems/_checks.py", line 127, in execute_install_time_tests
        builder.pkg.tester.phase_tests(builder, "install", builder.install_time_test_callbacks)
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/lib/spack/spack/install_test.py", line 381, in phase_tests
        fn()
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/lib/spack/spack/build_systems/python.py", line 176, in test
        self.run_test(
It looks like an issue with the arbor Spack package (for now cmake is only a build dependency), but do you happen to know if it should be a run dependency?
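For reference, the build-vs-run distinction is expressed through the `type` argument of `depends_on` in the package recipe. A minimal sketch of what promoting cmake to a run dependency of the arbor package could look like (the class body and version constraint are illustrative, not the actual package.py):

```python
# Illustrative fragment of a Spack recipe, not the real arbor package.py.
from spack.package import *


class Arbor(CMakePackage):
    """Stub recipe showing build-only vs. build+run dependency types."""

    # type="build": cmake is on PATH while Spack builds arbor,
    # but not in the environment of the installed package or its dependents.
    depends_on("cmake@3.19:", type="build")

    # type=("build", "run"): cmake is also available at run time, which
    # arbor-build-catalogue needs when compiling mechanism catalogues on
    # the fly (e.g. triggered by `import pyNN.arbor`).
    # depends_on("cmake@3.19:", type=("build", "run"))
```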
Edited by Eleni Mathioulaki

mentioned in merge request !470 (merged)
Yes, I think so. Arbor allows users to compile extensions during runtime (see https://docs.arbor-sim.org/en/latest/concepts/mechanisms.html#mechanisms)
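For context, this is roughly the runtime workflow that `import pyNN.arbor` triggers according to the log above: the NMODL mechanisms shipped with PyNN are compiled into a catalogue with arbor-build-catalogue (which shells out to cmake and a C++ compiler), and the resulting shared library is then loaded into Arbor. A rough sketch, with an illustrative mechanism directory and catalogue name:

```python
# Rough sketch of Arbor's documented runtime catalogue workflow; the
# mechanism directory and catalogue name are illustrative.
import subprocess

import arbor

nmodl_dir = "pyNN/arbor/nmodl"  # directory containing the .mod files

# arbor-build-catalogue generates C++ from the NMODL sources and compiles
# them by running cmake in a scratch build directory -- this is why cmake
# has to be available at run time, not only when arbor itself is built.
subprocess.run(["arbor-build-catalogue", "PyNN", nmodl_dir], check=True)

# The command leaves PyNN-catalogue.so in the current directory,
# which can then be loaded and used in simulations.
catalogue = arbor.load_catalogue("PyNN-catalogue.so")
```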
added 9 commits
- 10cafbf4...bb722ca6 - 8 commits from branch technical-coordination/project-internal/devops/platform:master
- d1a093b5 - Merge branch 'master' into cnrs-neuroinformatics/update-PyNN
added 11 commits
- d1a093b5...ff3f7542 - 10 commits from branch technical-coordination/project-internal/devops/platform:master
- e89adb34 - Merge branch 'master' into cnrs-neuroinformatics/update-PyNN
@adavison ok thanks, I added cmake (and mpi) to the run dependencies of Arbor and the previous error is gone, but the import pyNN.arbor test still fails:

    ==> [2023-11-15-12:14:28.524940] test: test_python3.8_c_importpyNN.arbor: checking import of pyNN.arbor
    ==> [2023-11-15-12:14:28.525525] Expecting return code in [0]
    ==> [2023-11-15-12:14:28.525983] '/mnt/spack_v0.20.0/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-10.3.0/python-3.8.11-sb7klygirl4v2whqdxjtgg5f2tlglxit/bin/python3.8' '-c' 'import pyNN.arbor'
    Building catalogue 'PyNN' from mechanisms in /builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-10.3.0/py-pynn-0.12.1-akffqcpub4aymqr243ufowlqkgqrlgr3/lib/python3.8/site-packages/pyNN/arbor/nmodl
    * NMODL
    * kdr
    * pas
    * pas2
    * expsyn
    * na
    * leak
    Catalogue has been built and copied to /builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-10.3.0/py-pynn-0.12.1-akffqcpub4aymqr243ufowlqkgqrlgr3/lib/python3.8/site-packages/pyNN/arbor/nmodl/PyNN-catalogue.so
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-10.3.0/py-pynn-0.12.1-akffqcpub4aymqr243ufowlqkgqrlgr3/lib/python3.8/site-packages/pyNN/arbor/__init__.py", line 19, in <module>
        from .standardmodels import * # noqa: F403, F401
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-10.3.0/py-pynn-0.12.1-akffqcpub4aymqr243ufowlqkgqrlgr3/lib/python3.8/site-packages/pyNN/arbor/standardmodels.py", line 17, in <module>
        from .simulator import state
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-10.3.0/py-pynn-0.12.1-akffqcpub4aymqr243ufowlqkgqrlgr3/lib/python3.8/site-packages/pyNN/arbor/simulator.py", line 234, in <module>
        state = State()
      File "/builds/cnrs-neuroinformatics/ebrains-spack-builds/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-10.3.0/py-pynn-0.12.1-akffqcpub4aymqr243ufowlqkgqrlgr3/lib/python3.8/site-packages/pyNN/arbor/simulator.py", line 189, in __init__
        self.arbor_context = arbor.context(alloc, comm)
    TypeError: __init__(): incompatible constructor arguments. The following argument types are supported:
        1. arbor._arbor.context()
        2. arbor._arbor.context(*, threads: int = 1, gpu_id: object = None, mpi: object = None, inter: object = None, bind_procs: bool = False, bind_threads: bool = False)
        3. arbor._arbor.context(alloc: arbor._arbor.proc_allocation, *, mpi: object = None, inter: object = None)
    Invoked with: <arbor.proc_allocation: threads 1, gpu_id None, bind_threads 0, bind_procs 0>, <arbor.mpi_comm: MPI_COMM_WORLD>
    FAILED: PyPynn::test_python3.8_c_importpyNN.arbor: Command exited with status 1: '/mnt/spack_v0.20.0/spack/opt/spack/linux-ubuntu20.04-x86_64/gcc-10.3.0/python-3.8.11-sb7klygirl4v2whqdxjtgg5f2tlglxit/bin/python3.8' '-c' 'import pyNN.arbor'
I don't think this has to do with the Spack package, because I tried locally and I got the same error. Should we just skip the arbor tests for now?
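For what it's worth, the traceback points at an API mismatch between PyNN and Arbor rather than at packaging: in arbor 0.9.0 the mpi argument of arbor.context is keyword-only (overload 3 in the error message), while pyNN 0.12.1 passes the communicator positionally. A hedged sketch of the difference, assuming an MPI-enabled Arbor build:

```python
# Sketch of the constructor mismatch reported above; assumes Arbor was built
# with MPI support and that MPI has already been initialised (e.g. via
# arbor.mpi_init() or by importing mpi4py.MPI).
import arbor

alloc = arbor.proc_allocation(threads=1, gpu_id=None)
comm = arbor.mpi_comm()  # wraps MPI_COMM_WORLD by default

# What pyNN/arbor/simulator.py (line 189) effectively calls; rejected by
# arbor 0.9.0 because mpi is now a keyword-only argument:
#   context = arbor.context(alloc, comm)

# Call form matching the third supported overload from the error message:
context = arbor.context(alloc, mpi=comm)
```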
mentioned in commit b3d0f60a