## Experimental release: Process to create a new Spack installation root when we want to update the Spack release
**Note: this process contains many manual steps.**
1. In the Gitlab repository go to CI/CD > Variables > Expand and set the appropriate {site}_INSTALLATION_ROOT_{env} variable to the **new name**
2. Start a new OKD job with [$OP == "manualFixing"] from the CLI with `oc create -f simplejob.yml`
3. Connect to the running pod to get access to the NFS partition
4. Execute `mkdir -p {site}_INSTALLATION_ROOT_{env}`
5. Execute `cd {site}_INSTALLATION_ROOT_{env}`
6. Execute all commands of [$OP == "create"] from "ebrains-spack-build-env>bin/deploy-build-env.sh":
   - Right after the `git clone` command, check out the desired commit of the Spack instance.
   - Right after the previous action, go to "spack/etc/spack" and create "packages.yaml" with the content:
     ```
     packages:
       all:
         target: [x86_64]
     ```
   - Right after the `spack compiler find` command, also install the appropriate Python with `spack install python@3.8.11 %gcc@10.3.0`
7. Create the appropriate LAB_KERNEL_PATH if it does not exist
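The manual pod steps 4-6 above can be consolidated into a rough shell sketch. This is a dry run that only prints each command (drop the `echo` wrapper to execute); `new-spack-root` and `<spack-commit>` are placeholders for the value of {site}_INSTALLATION_ROOT_{env} and the desired Spack commit:

```shell
# Dry-run sketch of manual steps 4-6; each command is printed, not executed.
run() { echo "+ $*"; }   # replace 'echo "+ $*"' with '"$@"' to actually run

INSTALLATION_ROOT="new-spack-root"   # placeholder for {site}_INSTALLATION_ROOT_{env}
run mkdir -p "$INSTALLATION_ROOT"
run git clone https://github.com/spack/spack.git "$INSTALLATION_ROOT/spack"
run git -C "$INSTALLATION_ROOT/spack" checkout "<spack-commit>"
# ...then create spack/etc/spack/packages.yaml (packages: all: target: [x86_64])
run spack compiler find
run spack install python@3.8.11 %gcc@10.3.0
```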
## Regular release of EBRAINS tools: Process to create a new Spack installation root when we want to update the Spack release without breaking the existing packages
1. In the Gitlab repository go to CI/CD > Variables > Expand and set the appropriate {site}_INSTALLATION_ROOT_{env} variable to the **new name**
2. In the Gitlab repository go to CI/CD > Variables > Expand and set the appropriate {site}_OPERATION_{env} variable to **"create"**
3. In the Gitlab repository go to CI/CD > Variables > Expand and set the appropriate {site}_LAB_KERNEL_PATH_{env} variable to the **desired path**
4. In the Gitlab repository go to CI/CD > Variables > Expand and set the appropriate {site}_SPACKIFIED_ENV_{env} variable to the **appropriate name**
5. Run the build pipeline
6. In the Gitlab repository go to CI/CD > Variables > Expand, set the appropriate {site}_OPERATION_{env} variable to **"manualFixing"**, and run the build pipeline. This creates an OpenShift job that performs no actual operation and just loops forever, which gives us time to connect to the running pod and manually pin the Spack instance to the commit we want:
   - Check out the appropriate commit in the Spack instance
   - Right after the previous action, go to "spack/etc/spack" and create "packages.yaml" with the content:
     ```
     packages:
       all:
         target: [x86_64]
     ```
   - Right after the previous action, also install the appropriate Python with `spack install python@3.8.11 %gcc@10.3.0`
   - Then disconnect and kill the OpenShift job.
7. In the Gitlab repository go to CI/CD > Variables > Expand and set the appropriate {site}_OPERATION_{env} variable to **"update"**
8. Run the build pipeline
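The repeated "CI/CD > Variables" edits in steps 1-4 and 6-7 can also be scripted against the GitLab project variables API instead of the UI. The sketch below is a dry run that only prints the API call; the host, `PROJECT_ID`, `GITLAB_TOKEN`, and the variable name are placeholders for the real per-site values:

```shell
# Print (dry run) the GitLab API call that updates one CI/CD variable.
# PROJECT_ID, GITLAB_TOKEN and the host are placeholders; drop 'echo' to execute.
set_ci_var() {
  local key="$1" value="$2"
  echo curl --request PUT \
    --header "PRIVATE-TOKEN: \$GITLAB_TOKEN" \
    --form "value=$value" \
    "https://gitlab.example.org/api/v4/projects/\$PROJECT_ID/variables/$key"
}

set_ci_var CSCS_OPERATION_prod create   # hypothetical {site}_OPERATION_{env} name
```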
1. Change in CI/CD variables
INSTALLATION_ROOT="main-spack-instance-2205"
2. Start an OpenShift build job from a terminal
oc create -f simplejob.yml
3. Execute in the running pod:
mkdir -p /opt/app-root/src
cd /srv
mkdir -p $INSTALLATION_ROOT
cd $INSTALLATION_ROOT
git clone https://github.com/spack/spack.git
cd spack
git checkout -b release_v0_2_spack_commit a8d440d3ababcdec20d665ad938ab880cd9b9d17
cd ..
cat <<EOF > /srv/$INSTALLATION_ROOT/spack/etc/spack/packages.yaml
packages:
  all:
    target: [x86_64]
EOF
source /srv/$INSTALLATION_ROOT/spack/share/spack/setup-env.sh
cat <<EOF > $SPACK_ROOT/etc/spack/defaults/mirrors.yaml
mirrors:
  public_mirror: https://spack-llnl-mirror.s3-us-west-2.amazonaws.com/
  spack-public: https://mirror.spack.io
EOF
spack compiler find
spack install gcc@10.3.0
spack load gcc@10.3.0
spack compiler find
spack install python@3.8.11 %gcc@10.3.0
cp -r ~/.spack $SPACK_ROOT
mkdir -p /srv/jupyterlab_kernels/prod/experimental
Steps to update the current experimental release with no major upgrades (i.e. install it in the same Spack instance as before; the respective CI/CD env variable is {site}_INSTALLATION_ROOT_{env})
1) Back up the Lab's experimental kernel configuration in both CSCS and JSC prod
cp -r /srv/jupyterlab_kernels/prod/experimental /srv/backup/experimental_{date}
2) Merge the master branch of the ebrains-spack-builds repo into the experimental_rel branch.
Make sure that, in the master branch's spack.yaml file, only the packages that have been built successfully are uncommented.
3) Start the scheduled pipeline with the description "deploy experimental release in the dev CSCS cluster"
4) Check that the pipeline executed successfully
5) On a Friday morning (not every Friday; typically when there are important or numerous changes), run the scheduled pipeline with the description "deploy experimental release in prod environments".
6) Only one job will continue (due to a credentials mismatch in the gitlab runner)
7) Check the result of the job. If it is successful, retry the other job that failed previously. If it is not, restore the experimental kernel configuration backed up in step 1 to its original location.
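The backup in step 1 and the restore in step 7 can be sketched as below. This is a dry run that only prints the commands; the date-suffix format of the backup directory is an assumption:

```shell
# Print (dry run) the backup and restore commands for the experimental kernels.
KERNELS=/srv/jupyterlab_kernels/prod/experimental
BACKUP="/srv/backup/experimental_$(date +%Y-%m-%d)"   # date format is an assumption

echo cp -r "$KERNELS" "$BACKUP"     # step 1: back up before touching the release
echo cp -r "$BACKUP/." "$KERNELS"   # step 7 (on failure): restore the backup
```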
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *


class Acpype(PythonPackage):
    """A tool based in Python to use Antechamber to generate topologies for chemical
    compounds and to interface with others python applications like CCPN and ARIA"""

    # Homepage and download url
    homepage = "https://github.com/alanwilter/acpype"
    git = 'https://github.com/alanwilter/acpype.git'
    url = 'https://github.com/alanwilter/acpype/archive/refs/tags/2022.7.21.tar.gz'

    # Set the gitlab accounts of this package maintainers
    maintainers = ['dbeltran']

    # Versions
    version('master', branch='master')
    version('2022.7.21', sha256='5f7e6162d9a0aed2f770b9ccf5617ac1398a423cca815ae37cbf66d4cd62ea2f')

    # Dependencies
    depends_on('python@3.8:', type=('build', 'run'))
    depends_on('ambertools')
    depends_on('openbabel')
    depends_on('py-poetry-core')

    # Test
    @run_after('install')
    @on_package_attributes(run_tests=True)
    def check_install(self):
        python("-c", 'import acpype')
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *


class Ambertools(CMakePackage):
    """AmberTools is a free, useful standalone package and a prerequisite
    for installing Amber itself. The AmberTools suite is free of charge,
    and its components are mostly released under the GNU General Public
    License (GPL). A few components are included that are in the public
    domain or which have other, open-source, licenses. The libsander and
    libpbsa libraries use the LGPL license."""

    homepage = "https://ambermd.org/AmberTools.php"
    url = "https://ambermd.org/downloads/AmberTools22jlmrcc.tar.bz2"

    maintainers("d-beltran")

    # begin EBRAINS (added): add version
    version("23_rc6", sha256="debb52e6ef2e1b4eaa917a8b4d4934bd2388659c660501a81ea044903bf9ee9d")
    # end EBRAINS
    version("22jlmrcc", sha256="1571d4e0f7d45b2a71dce5999fa875aea8c90ee219eb218d7916bf30ea229121")

    depends_on("flex", type="build")
    depends_on("bison", type="build")
    depends_on("tcsh", type="build")
    depends_on("zlib", type=("build", "run"))
    depends_on("bzip2", type=("build", "run"))
    depends_on("blas", type=("build", "run"))
    depends_on("lapack", type=("build", "run"))
    depends_on("arpack-ng", type=("build", "run"))
    depends_on("netcdf-c", type=("build", "run"))
    depends_on("netcdf-fortran", type=("build", "run"))
    depends_on("fftw", type=("build", "run"))
    depends_on("readline", type=("build", "run"))
    depends_on("netlib-xblas~plain_blas", type=("build", "run"))
    # Specific variants needed for boost according to build logs
    depends_on(
        "boost+thread+system+program_options+iostreams+regex+timer+chrono+filesystem+graph",
        type=("build", "run"),
    )

    # Python dependencies
    # begin EBRAINS (modified): add version
    depends_on("python@3.8:3.10 +tkinter", type=("build", "run"), when="@22jlmrcc")
    depends_on("python@3.8: +tkinter", type=("build", "run"), when="@23_rc6")
    # end EBRAINS
    depends_on("py-setuptools", type="build")
    depends_on("py-numpy", type=("build", "run"))
    depends_on("py-matplotlib", type=("build", "run"))
    depends_on("py-scipy", type=("build", "run"))

    def cmake_args(self):
        # Translated from the ambertools build/run_cmake script
        # We also add the TRUST_SYSTEM_LIBS argument mentioned in the ambertools guide
        # https://ambermd.org/pmwiki/pmwiki.php/Main/CMake-Guide-to-Options
        args = [
            self.define("COMPILER", "GNU"),
            self.define("MPI", False),
            self.define("CUDA", False),
            self.define("INSTALL_TESTS", True),
            self.define("DOWNLOAD_MINICONDA", False),
            self.define("TRUST_SYSTEM_LIBS", True),
            # This is to avoid the x11 (X11_Xext_LIB) error
            # It is equivalent to the "-noX11" flag according to the docs:
            # https://ambermd.org/pmwiki/pmwiki.php/Main/CMake-Common-Options
            self.define("BUILD_GUI", False),
        ]
        return args

    def setup_run_environment(self, env):
        env.set("AMBER_PREFIX", self.prefix)
        env.set("AMBERHOME", self.prefix)

    def setup_build_environment(self, env):
        env.set("AMBER_PREFIX", self.prefix)
        env.set("AMBERHOME", self.prefix)

    @run_after("install")
    @on_package_attributes(run_tests=True)
    def check_install(self):
        make("test.serial")

    # Temporarily copy the netcdf.h header file to netcdf-fortran/include to pass the
    # Ambertools cmake check (quickest fix, will probably cause problems, needs to change)
    @run_before("cmake")
    def fix_check(self):
        cp = Executable("cp")
        cp(
            self.spec["netcdf-c"].headers.directories[0] + "/netcdf.h",
            self.spec["netcdf-fortran"].headers.directories[0],
        )
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *


class Apbs(CMakePackage):
    """
    APBS (Adaptive Poisson-Boltzmann Solver) solves the equations of continuum electrostatics
    for large biomolecular assemblages. This software was designed "from the ground up"
    using modern design principles to ensure its ability to interface with other computational
    packages and evolve as methods and applications change over time. The APBS code is
    accompanied by extensive documentation for both users and programmers and is supported
    by a variety of utilities for preparing calculations and analyzing results.
    Finally, the free, open-source APBS license ensures its accessibility to the entire
    biomedical community.
    """

    # Homepage and Github URL.
    homepage = "https://www.poissonboltzmann.org/"
    url = "https://github.com/Electrostatics/apbs/archive/refs/tags/v3.4.0.tar.gz"

    # List of GitHub accounts to notify when the package is updated.
    maintainers = ['thielblz', 'richtesn']

    # SHA256 checksum.
    version('3.4.0', sha256='572ff606974119430020ec948c78e171d8525fb0e67a56dad937a897cac67461')

    # Dependencies.
    depends_on('cmake@3.19:', type=('build'))
    depends_on('bison', type=('build'))
    depends_on('flex', type=('build'))
    depends_on('swig', type=('build'))
    depends_on('readline', type=('build', 'run'))
    depends_on('eigen', type=('build', 'run'))
    depends_on('boost', type=('build', 'run'))
    depends_on('blas', type=('build', 'run'))
    depends_on('arpack-ng', type=('build', 'run'))
    depends_on('suite-sparse', type=('build', 'run'))
    depends_on('maloc', type=('build', 'run'))
    depends_on('python@3.8:3.11', type=('build', 'run'))

    def cmake_args(self):
        # Min and max Python versions need to be set as variables to pass tests.
        # See tests/CMakeLists.txt lines 6-14.
        python_version = str(self.spec['python'].version)
        args = [
            self.define('PYTHON_MIN_VERSION', python_version),
            self.define('PYTHON_MAX_VERSION', python_version),
            self.define('BLAS_FOUND', True),
            self.define('BLAS_INCLUDE_DIRS', self.spec['blas'].prefix.include),
            self.define('BLAS_LIBRARIES', self.spec['blas'].libs.joined(';')),
        ]
        return args

    def setup_build_environment(self, env):
        # add suite-sparse libs to path because tests can't find them
        env.prepend_path('LD_LIBRARY_PATH', self.spec['suite-sparse'].prefix.lib)
        env.prepend_path('LD_LIBRARY_PATH', self.spec['blas'].prefix.lib)

    def setup_dependent_build_environment(self, env, dependent_spec):
        self.setup_build_environment(env)

    @run_after('install')
    @on_package_attributes(run_tests=True)
    def install_test(self):
        with working_dir(self.build_directory):
            # for testing, apbs needs to be in the path
            import os
            os.environ['PATH'] = self.prefix.bin + ':' + os.environ['PATH']
            ctest = which("ctest")
            ctest("-C", "Release", "--output-on-failure")

    def check(self):
        # this would run "make test" before installation,
        # so we override this and define install_test() instead
        pass
# Copyright 2013-2024 Lawrence Livermore National Security, LLC and other
# Spack Project Developers. See the top-level COPYRIGHT file for details.
#
# SPDX-License-Identifier: (Apache-2.0 OR MIT)
from spack.package import *


class Arbor(CMakePackage, CudaPackage):
    """Arbor is a high-performance library for computational neuroscience
    simulations."""

    homepage = "https://arbor-sim.org"
    git = "https://github.com/arbor-sim/arbor.git"
    url = "https://github.com/arbor-sim/arbor/releases/download/v0.9.0/arbor-v0.9.0-full.tar.gz"
    maintainers = ("thorstenhater", "ErbB4", "haampie")

    version("master", branch="master", submodules=True)
    version("develop", branch="master", submodules=True)
    version(
        "0.10.0",
        sha256="72966b7a2f45ce259b8ba167ca3e4f5ab9f212136a300267aaac0c04ed3fe3fc",
        url="https://github.com/arbor-sim/arbor/releases/download/v0.10.1/arbor-v0.10.0-full.tar.gz",
    )
    version(
        "0.9.0",
        sha256="5f9740955c821aca81e23298c17ad64f33f635756ad9b4a0c1444710f564306a",
        url="https://github.com/arbor-sim/arbor/releases/download/v0.9.0/arbor-v0.9.0-full.tar.gz",
    )
    version(
        "0.8.1",
        sha256="caebf96676ace6a9c50436541c420ca4bb53f0639dcab825de6fa370aacf6baa",
        url="https://github.com/arbor-sim/arbor/releases/download/v0.8.1/arbor-v0.8.1-full.tar.gz",
    )
    version(
        "0.8.0",
        sha256="18df5600308841616996a9de93b55a105be0f59692daa5febd3a65aae5bc2c5d",
        url="https://github.com/arbor-sim/arbor/releases/download/v0.8/arbor-v0.8-full.tar.gz",
    )
    version(
        "0.7.0",
        sha256="c3a6b7193946aee882bb85f9c38beac74209842ee94e80840968997ba3b84543",
        url="https://github.com/arbor-sim/arbor/releases/download/v0.7/arbor-v0.7-full.tar.gz",
    )
    version(
        "0.6.0",
        sha256="4cd333b18effc8833428ddc0b99e7dc976804771bc85da90034c272c7019e1e8",
        url="https://github.com/arbor-sim/arbor/releases/download/v0.6/arbor-v0.6-full.tar.gz",
    )
    version(
        "0.5.2",
        sha256="290e2ad8ca8050db1791cabb6b431e7c0409c305af31b559e397e26b300a115d",
        url="https://github.com/arbor-sim/arbor/releases/download/v0.5.2/arbor-v0.5.2-full.tar.gz",
    )

    variant(
        "assertions",
        default=False,
        description="Enable arb_assert() assertions in code.",
    )
    variant("doc", default=False, description="Build documentation.")
    variant("mpi", default=False, description="Enable MPI support")
    variant("python", default=True, description="Enable Python frontend support")
    variant(
        "vectorize",
        default=False,
        description="Enable vectorization of computational kernels",
    )
    variant(
        "gpu_rng",
        default=False,
        description="Use GPU generated random numbers -- not bitwise equal to CPU version",
        when="+cuda",
    )

    # https://docs.arbor-sim.org/en/latest/install/build_install.html#compilers
    conflicts("%gcc@:8")
    conflicts("%clang@:9")
    # Cray compiler v9.2 and later is Clang-based.
    conflicts("%cce@:9.1")
    conflicts("%intel")

    # begin EBRAINS (modified: added run dep)
    depends_on("cmake@3.19:", type=("build", "run"))
    # end EBRAINS

    # misc dependencies
    depends_on("fmt@7.1:", when="@0.5.3:")  # required by the modcc compiler
    depends_on("fmt@9.1:", when="@0.7.1:")
    # begin EBRAINS (modified: relaxed (upstream gave no info about update))
    # upstream adds: depends_on("fmt@10.1:", when="@0.9.1:")
    depends_on("googletest@1.12.1:", when="@0.7.1:")
    depends_on("pugixml@1.11:", when="@0.7.1:")
    # upstream adds: depends_on("pugixml@1.13:", when="@0.9.1:")
    depends_on("nlohmann-json@3.11.2:")
    depends_on("random123")
    # upstream adds: depends_on("random123@1.14.0:", when="@0.10:")
    # end EBRAINS (modified)

    with when("+cuda"):
        depends_on("cuda@10:")
        depends_on("cuda@11:", when="@0.7.1:")
        depends_on("cuda@12:", when="@0.9.1:")

    # mpi
    # begin EBRAINS (modified: added run dep)
    depends_on("mpi", when="+mpi", type=("build", "run"))
    # end EBRAINS (modified)
    depends_on("py-mpi4py", when="+mpi+python", type=("build", "run"))

    # python (bindings)
    with when("+python"):
        extends("python")
        depends_on("python@3.7:", type=("build", "run"))
        depends_on("python@3.9:", when="@0.9.1:", type=("build", "run"))
        depends_on("py-numpy", type=("build", "run"))
        depends_on("py-pybind11@2.6:", type="build")
        depends_on("py-pybind11@2.8.1:", when="@0.5.3:", type="build")
        depends_on("py-pybind11@2.10.1:", when="@0.7.1:", type="build")
        depends_on("py-pandas", type="test")
        depends_on("py-seaborn", type="test")

    # sphinx based documentation
    with when("+doc"):
        depends_on("python@3.10:", type="build")
        depends_on("py-sphinx", type="build")
        depends_on("py-svgwrite", type="build")

    @property
    def build_targets(self):
        return ["all", "html"] if "+doc" in self.spec else ["all"]

    def cmake_args(self):
        args = [
            self.define_from_variant("ARB_WITH_ASSERTIONS", "assertions"),
            self.define_from_variant("ARB_WITH_MPI", "mpi"),
            self.define_from_variant("ARB_WITH_PYTHON", "python"),
            self.define_from_variant("ARB_VECTORIZE", "vectorize"),
        ]
        if "+cuda" in self.spec:
            args.append("-DARB_GPU=cuda")
            args.append(self.define_from_variant("ARB_USE_GPU_RNG", "gpu_rng"))
        # query spack for the architecture-specific compiler flags set by its wrapper
        args.append("-DARB_ARCH=none")
        opt_flags = spack.build_environment.optimization_flags(
            self.compiler, self.spec.target
        )
        # Might return nothing
        if opt_flags:
            args.append("-DARB_CXX_FLAGS_TARGET=" + opt_flags)
        # Needed, spack has no units package
        args.append("-DARB_USE_BUNDLED_UNITS=ON")
        return args

    @run_after("install", when="+python")
    @on_package_attributes(run_tests=True)
    def install_test(self):
        python("-c", "import arbor")
        python("python/example/single_cell_model.py")
--- a/third_party/zlib/gzguts.h 1980-01-01 00:00:00
+++ b/third_party/zlib/gzguts.h 2023-04-03 12:23:10
@@ -3,6 +3,10 @@
* For conditions of distribution and use, see copyright notice in zlib.h
*/
+#ifndef _WIN32
+ #include <unistd.h>
+#endif
+
#ifdef _LARGEFILE64_SOURCE
# ifndef _LARGEFILE_SOURCE
# define _LARGEFILE_SOURCE 1
--- a/src/main/java/com/google/devtools/build/lib/bazel/rules/BazelRuleClassProvider.java
+++ b/src/main/java/com/google/devtools/build/lib/bazel/rules/BazelRuleClassProvider.java
@@ -181,6 +181,13 @@ public class BazelRuleClassProvider {
env.put("PATH", null);
}
+ Map<String, String> spackEnv = System.getenv();
+ for (String envName : spackEnv.keySet()) {
+ if (envName.startsWith("SPACK_")) {
+ env.put(envName, spackEnv.get(envName));
+ }
+ }
+
// Shell environment variables specified via options take precedence over the
// ones inherited from the fragments. In the long run, these fragments will
// be replaced by appropriate default rc files anyway.
--- a/src/main/java/com/google/devtools/build/lib/bazel/rules/BazelRuleClassProvider.java
+++ b/src/main/java/com/google/devtools/build/lib/bazel/rules/BazelRuleClassProvider.java
@@ -185,7 +185,7 @@ public class BazelRuleClassProvider {
Map<String, String> spackEnv = System.getenv();
for (String envName : spackEnv.keySet()) {
- if (envName.startsWith("SPACK_")) {
+ if ((envName.startsWith("SPACK_")) || (envName.equals("fcc_ENV")) || (envName.equals("FCC_ENV"))) {
env.put(envName, spackEnv.get(envName));
}
}
diff -Naur a/src/main/cpp/blaze_util_posix.cc b/src/main/cpp/blaze_util_posix.cc
--- a/src/main/cpp/blaze_util_posix.cc 1980-01-01 00:00:00.000000000 -0800
+++ b/src/main/cpp/blaze_util_posix.cc 2022-06-30 23:34:08.000000000 -0700
@@ -600,7 +600,7 @@
// Prefer OFD locks if available. POSIX locks can be lost "accidentally"
// due to any close() on the lock file, and are not reliably preserved
// across execve() on Linux, which we need for --batch mode.
- if (fcntl(fd, F_OFD_SETLK, lock) == 0) return 0;
+ if (fcntl(fd, F_SETLK, lock) == 0) return 0;
if (errno != EINVAL) {
if (errno != EACCES && errno != EAGAIN) {
BAZEL_DIE(blaze_exit_code::LOCAL_ENVIRONMENTAL_ERROR)
--- a/compile.sh
+++ b/compile.sh
@@ -63,7 +63,7 @@
log "Building output/bazel"
# We set host and target platform directly because we are building for the local
# host.
-bazel_build "src:bazel_nojdk${EXE_EXT}" \
+CC=$SPACK_CC CXX=$SPACK_CXX bazel_build "src:bazel_nojdk${EXE_EXT}" \
--action_env=PATH \
--host_platform=@local_config_platform//:host \
--platforms=@local_config_platform//:host \
--- a/compile.sh
+++ b/compile.sh
@@ -124,7 +124,7 @@
new_step 'Building Bazel with Bazel'
display "."
log "Building output/bazel"
- bazel_build "src:bazel${EXE_EXT}" \
+ CC=$SPACK_CC CXX=$SPACK_CXX bazel_build "src:bazel${EXE_EXT}" \
|| fail "Could not build Bazel"
bazel_bin_path="$(get_bazel_bin_path)/src/bazel${EXE_EXT}"
[ -e "$bazel_bin_path" ] \
--- a/compile.sh
+++ b/compile.sh
@@ -85,7 +85,7 @@
log "Building output/bazel"
# We set host and target platform directly since the defaults in @bazel_tools
# have not yet been generated.
-bazel_build "src:bazel${EXE_EXT}" \
+CC=$SPACK_CC CXX=$SPACK_CXX bazel_build "src:bazel${EXE_EXT}" \
--experimental_host_platform=//tools/platforms:host_platform \
--experimental_platforms=//tools/platforms:target_platform \
|| fail "Could not build Bazel"
--- a/src/main/java/com/google/devtools/build/lib/rules/cpp/HeaderDiscovery.java.orig 2020-03-25 08:54:37.914186251 -0400
+++ b/src/main/java/com/google/devtools/build/lib/rules/cpp/HeaderDiscovery.java 2020-03-25 08:55:01.356250657 -0400
@@ -148,7 +148,7 @@
if (execPath.startsWith(execRoot)) {
execPathFragment = execPath.relativeTo(execRoot); // funky but tolerable path
} else {
- problems.add(execPathFragment.getPathString());
+ // problems.add(execPathFragment.getPathString());
continue;
}
}
diff --color=auto --color=auto -Naur a/src/main/java/com/google/devtools/build/lib/rules/cpp/HeaderDiscovery.java b/src/main/java/com/google/devtools/build/lib/rules/cpp/HeaderDiscovery.java
--- a/src/main/java/com/google/devtools/build/lib/rules/cpp/HeaderDiscovery.java 1980-01-01 00:00:00
+++ b/src/main/java/com/google/devtools/build/lib/rules/cpp/HeaderDiscovery.java 2024-02-15 13:36:37
@@ -143,7 +143,7 @@
LabelConstants.EXPERIMENTAL_EXTERNAL_PATH_PREFIX.getRelative(
execPath.relativeTo(execRoot.getParentDirectory()));
} else {
- absolutePathProblems.add(execPathFragment.getPathString());
+ // absolutePathProblems.add(execPathFragment.getPathString());
continue;
}
}
diff --git a/third_party/ijar/zlib_client.h b/third_party/ijar/zlib_client.h
index ed6616362fcc..c4b051e0100c 100644
--- a/third_party/ijar/zlib_client.h
+++ b/third_party/ijar/zlib_client.h
@@ -16,6 +16,7 @@
#define THIRD_PARTY_IJAR_ZLIB_CLIENT_H_
#include <limits.h>
+#include <limits>
#include "third_party/ijar/common.h"