Swift Package Index

Build Information

Failed to build LlamaStackClient, reference 0.0.41 (cd1069), with Swift 5.8 for Linux on 10 Oct 2024 19:35:08 UTC.

Build Command

bash -c docker run --pull=always --rm -v "checkouts-4606859-0":/host -w "$PWD" registry.gitlab.com/finestructure/spi-images:basic-5.8-latest swift build --triple x86_64-unknown-linux-gnu 2>&1

Build Log

========================================
RunAll
========================================
Builder version: 4.55.0
Interrupt handler set up.
========================================
Checkout
========================================
Clone URL: https://github.com/meta-llama/llama-stack-client-swift.git
Reference: 0.0.41
Initialized empty Git repository in /host/spi-builder-workspace/.git/
hint: Using 'master' as the name for the initial branch. This default branch name
hint: is subject to change. To configure the initial branch name to use in all
hint: of your new repositories, which will suppress this warning, call:
hint:
hint: 	git config --global init.defaultBranch <name>
hint:
hint: Names commonly chosen instead of 'master' are 'main', 'trunk' and
hint: 'development'. The just-created branch can be renamed via this command:
hint:
hint: 	git branch -m <name>
From https://github.com/meta-llama/llama-stack-client-swift
 * tag               0.0.41     -> FETCH_HEAD
HEAD is now at cd10692 Merge pull request #4 from meta-llama/registry
Submodule path 'llama-stack': checked out 'dd9d34cf7d7c632b553d722438928a2ebef3d077'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch': checked out '9b6d4b4a7b9b8f811bb6b269b0c2ce254e3a0c1b'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/arm/third-party/ethos-u-core-driver': checked out '90f9df900acdc0718ecd2dfdc53780664758dec5'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/arm/third-party/serialization_lib': checked out '187af0d41fe75d08d2a7ec84c1b4d24b9b641ed2'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/arm/third-party/serialization_lib/third_party/flatbuffers': checked out '0100f6a5779831fa7a651e4b67ef389a8752bd9b'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/cadence/hifi/third-party/nnlib/nnlib-hifi4': checked out '6a9ea45e23ef591fe207442df33a5ebe88bbe8de'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/vulkan/third-party/Vulkan-Headers': checked out '0c5928795a66e93f65e5e68a36d8daa79a209dc2'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/vulkan/third-party/VulkanMemoryAllocator': checked out 'a6bfc237255a6bac1513f7c1ebde6d8aed6b5191'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/vulkan/third-party/volk': checked out 'b3bc21e584f97400b6884cb2a541a56c6a5ddba3'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/xnnpack/third-party/FP16': checked out '4dfe081cf6bcd15db339cf2680b9281b8451eeb3'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/xnnpack/third-party/FXdiv': checked out 'b408327ac2a15ec3e43352421954f5b1967701d1'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/xnnpack/third-party/XNNPACK': checked out '87ee0b46b834f67bad9025d4a82ed5654f3403d3'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/xnnpack/third-party/cpuinfo': checked out '16bfc1622c6902d6f91d316ec54894910c620325'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/xnnpack/third-party/pthreadpool': checked out '4fe0e1e183925bf8cfa6aae24237e724a96479b8'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/examples/third-party/fbjni': checked out '52a14f0daa889a20d8984798b8d96eb03cebd334'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/extension/llm/third-party/abseil-cpp': checked out 'eb852207758a773965301d0ae717e4235fc5301a'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/extension/llm/third-party/re2': checked out '6dcd83d60f7944926bfd308cc13979fc53dd69ca'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/extension/llm/third-party/sentencepiece': checked out '6225e08edb2577757163b3f5dbba4c0b670ef445'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/kernels/optimized/third-party/eigen': checked out 'a39ade4ccf99df845ec85c580fbbb324f71952fa'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/third-party/flatbuffers': checked out '595bf0007ab1929570c7671f091313c8fc20644e'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/third-party/flatcc': checked out '896db54787e8b730a6be482c69324751f3f5f117'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/third-party/gflags': checked out 'a738fdf9338412f83ab3f26f31ac11ed3f3ec4bd'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/third-party/googletest': checked out 'e2239ee6043f73722e7aa812a459f54a28552929'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/third-party/ios-cmake': checked out '06465b27698424cf4a04a5ca4904d50a3c966c45'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/third-party/prelude': checked out '4e9e6d50b8b461564a7e351ff60b87fe59d7e53b'
Submodule path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/third-party/pybind11': checked out '8c7b8dd0ae74b36b7d42f77b0dd4096ebb7f4ab1'
Submodule 'llama-stack' (https://github.com/meta-llama/llama-stack) registered for path 'llama-stack'
Cloning into '/host/spi-builder-workspace/llama-stack'...
From https://github.com/meta-llama/llama-stack
 * branch            dd9d34cf7d7c632b553d722438928a2ebef3d077 -> FETCH_HEAD
Submodule 'llama_stack/providers/impls/ios/inference/executorch' (https://github.com/pytorch/executorch) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch'
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch'...
Submodule 'backends/arm/third-party/ethos-u-core-driver' (https://review.mlplatform.org/ml/ethos-u/ethos-u-core-driver) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/arm/third-party/ethos-u-core-driver'
Submodule 'backends/arm/third-party/serialization_lib' (https://review.mlplatform.org/tosa/serialization_lib) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/arm/third-party/serialization_lib'
Submodule 'backends/cadence/hifi/third-party/nnlib/nnlib-hifi4' (https://github.com/foss-xtensa/nnlib-hifi4.git) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/cadence/hifi/third-party/nnlib/nnlib-hifi4'
Submodule 'backends/vulkan/third-party/Vulkan-Headers' (https://github.com/KhronosGroup/Vulkan-Headers) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/vulkan/third-party/Vulkan-Headers'
Submodule 'backends/vulkan/third-party/VulkanMemoryAllocator' (https://github.com/GPUOpen-LibrariesAndSDKs/VulkanMemoryAllocator.git) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/vulkan/third-party/VulkanMemoryAllocator'
Submodule 'backends/vulkan/third-party/volk' (https://github.com/zeux/volk) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/vulkan/third-party/volk'
Submodule 'backends/xnnpack/third-party/FP16' (https://github.com/Maratyszcza/FP16.git) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/xnnpack/third-party/FP16'
Submodule 'backends/xnnpack/third-party/FXdiv' (https://github.com/Maratyszcza/FXdiv.git) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/xnnpack/third-party/FXdiv'
Submodule 'backends/xnnpack/third-party/XNNPACK' (https://github.com/google/XNNPACK.git) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/xnnpack/third-party/XNNPACK'
Submodule 'backends/xnnpack/third-party/cpuinfo' (https://github.com/pytorch/cpuinfo.git) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/xnnpack/third-party/cpuinfo'
Submodule 'backends/xnnpack/third-party/pthreadpool' (https://github.com/Maratyszcza/pthreadpool.git) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/xnnpack/third-party/pthreadpool'
Submodule 'examples/third-party/fbjni' (https://github.com/facebookincubator/fbjni.git) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/examples/third-party/fbjni'
Submodule 'extension/llm/third-party/abseil-cpp' (https://github.com/abseil/abseil-cpp.git) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/extension/llm/third-party/abseil-cpp'
Submodule 'extension/llm/third-party/re2' (https://github.com/google/re2.git) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/extension/llm/third-party/re2'
Submodule 'extension/llm/third-party/sentencepiece' (https://github.com/google/sentencepiece.git) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/extension/llm/third-party/sentencepiece'
Submodule 'kernels/optimized/third-party/eigen' (https://gitlab.com/libeigen/eigen.git) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/kernels/optimized/third-party/eigen'
Submodule 'third-party/flatbuffers' (https://github.com/google/flatbuffers.git) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/third-party/flatbuffers'
Submodule 'third-party/flatcc' (https://github.com/dvidelabs/flatcc.git) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/third-party/flatcc'
Submodule 'third-party/gflags' (https://github.com/gflags/gflags.git) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/third-party/gflags'
Submodule 'third-party/googletest' (https://github.com/google/googletest.git) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/third-party/googletest'
Submodule 'third-party/ios-cmake' (https://github.com/leetal/ios-cmake) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/third-party/ios-cmake'
Submodule 'third-party/prelude' (https://github.com/facebook/buck2-prelude.git) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/third-party/prelude'
Submodule 'third-party/pybind11' (https://github.com/pybind/pybind11.git) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/third-party/pybind11'
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/arm/third-party/ethos-u-core-driver'...
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/arm/third-party/serialization_lib'...
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/cadence/hifi/third-party/nnlib/nnlib-hifi4'...
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/vulkan/third-party/Vulkan-Headers'...
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/vulkan/third-party/VulkanMemoryAllocator'...
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/vulkan/third-party/volk'...
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/xnnpack/third-party/FP16'...
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/xnnpack/third-party/FXdiv'...
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/xnnpack/third-party/XNNPACK'...
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/xnnpack/third-party/cpuinfo'...
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/xnnpack/third-party/pthreadpool'...
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch/examples/third-party/fbjni'...
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch/extension/llm/third-party/abseil-cpp'...
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch/extension/llm/third-party/re2'...
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch/extension/llm/third-party/sentencepiece'...
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch/kernels/optimized/third-party/eigen'...
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch/third-party/flatbuffers'...
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch/third-party/flatcc'...
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch/third-party/gflags'...
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch/third-party/googletest'...
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch/third-party/ios-cmake'...
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch/third-party/prelude'...
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch/third-party/pybind11'...
Submodule 'third_party/flatbuffers' (https://github.com/google/flatbuffers.git) registered for path 'llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/arm/third-party/serialization_lib/third_party/flatbuffers'
Cloning into '/host/spi-builder-workspace/llama-stack/llama_stack/providers/impls/ios/inference/executorch/backends/arm/third-party/serialization_lib/third_party/flatbuffers'...
Cloned https://github.com/meta-llama/llama-stack-client-swift.git
Revision (git rev-parse @):
cd10692e1d4ea411d464fecf32b30bf0806f88ce
SUCCESS checkout https://github.com/meta-llama/llama-stack-client-swift.git at 0.0.41
========================================
Build
========================================
Selected platform:         linux
Swift version:             5.8
Building package at path:  $PWD
https://github.com/meta-llama/llama-stack-client-swift.git
Running build ...
bash -c docker run --pull=always --rm -v "checkouts-4606859-0":/host -w "$PWD" registry.gitlab.com/finestructure/spi-images:basic-5.8-latest swift build --triple x86_64-unknown-linux-gnu 2>&1
basic-5.8-latest: Pulling from finestructure/spi-images
Digest: sha256:5112a149cbb5cb7c4578603c13e2541eac53725a20fec231c9f83c98beab2b47
Status: Image is up to date for registry.gitlab.com/finestructure/spi-images:basic-5.8-latest
error: 'spi-builder-workspace': package 'spi-builder-workspace' is using Swift tools version 5.10.0 but the installed version is 5.8.1
BUILD FAILURE 5.8 linux
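
The failure is a tools-version mismatch: the package manifest declares Swift tools version 5.10.0, while the basic-5.8-latest build image ships Swift 5.8.1, so SwiftPM refuses to build. As a minimal sketch (not the actual manifest of llama-stack-client-swift; the target layout below is illustrative only), a Package.swift that requires tools 5.10 starts with a swift-tools-version directive on its first line:

// swift-tools-version: 5.10
// The directive above must be the first line of Package.swift.
// A toolchain older than the declared version (here, Swift 5.8.1)
// rejects the manifest with the error shown in the log above.
import PackageDescription

let package = Package(
    name: "LlamaStackClient",
    targets: [
        // Hypothetical target, for illustration only.
        .target(name: "LlamaStackClient")
    ]
)

Building such a package requires a Swift 5.10 (or newer) toolchain; with Swift 5.8 the manifest is rejected before any sources are compiled.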