CUDA.jl 5.5: Maintenance release
Tim Besard
CUDA.jl 5.5 is a minor release that comes with a couple of small improvements and new features.
The only important change is that the minimum required Julia version has been bumped to 1.10, in anticipation of it becoming the next LTS release.
New features
Support for the upcoming Julia 1.11 release has been added, as well as for CUDA 12.6 (Update 1).
Launch overhead has been reduced by avoiding double argument conversion. Note that this does not apply to kernels that are obtained using @cuda launch=false.
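To illustrate the pattern that note refers to, here is a minimal sketch of compiling a kernel with @cuda launch=false and launching the returned object manually; the vadd! kernel and the array sizes are made up for the example, while @cuda, launch_configuration, and the kernel-object call are the documented CUDA.jl API.

```julia
using CUDA

function vadd!(c, a, b)
    i = threadIdx().x + (blockIdx().x - 1) * blockDim().x
    if i <= length(c)
        @inbounds c[i] = a[i] + b[i]
    end
    return
end

a = CUDA.rand(Float32, 1024)
b = CUDA.rand(Float32, 1024)
c = similar(a)

# Compile the kernel without launching it, then launch the returned kernel
# object manually. Arguments are converted when the object is called, which
# is the code path the launch-overhead optimization does not cover.
kernel = @cuda launch=false vadd!(c, a, b)
config = launch_configuration(kernel.fun)
threads = min(length(c), config.threads)
blocks = cld(length(c), threads)
kernel(c, a, b; threads, blocks)
```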
CUSOLVER's dense wrappers have been improved by Ben Arthur, now caching workspace buffers. This should greatly reduce the number of allocations needed for repeated calls.
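As a rough illustration of where this matters, the sketch below factorizes the same matrix twice on the GPU; the use of cholesky is just one example of a dense CUSOLVER code path, the sizes are arbitrary, and the exact allocation savings will vary.

```julia
using CUDA, LinearAlgebra

# A symmetric positive-definite test matrix, built on the CPU and uploaded.
B = rand(Float32, 512, 512)
A = CuArray(B * B' + 512 * I)

# Dense factorizations like this go through CUSOLVER; with cached workspace
# buffers, repeated calls should allocate noticeably less than before.
CUDA.@time cholesky(Hermitian(A))
CUDA.@time cholesky(Hermitian(A))
```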
Alexis Montoison has improved the CUSPARSE wrappers, adding conversions between sparse vectors and sparse matrices that enable a version of gemv which preserves the sparsity of the inputs.
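A minimal sketch of what such a sparsity-preserving product could look like is shown below; the sizes and densities are made up, and whether the high-level * call dispatches to the new gemv path is an assumption here, not something the release notes state.

```julia
using CUDA, CUDA.CUSPARSE, SparseArrays

# Sparse inputs built on the CPU and uploaded to the GPU.
A = sprand(Float32, 1_000, 1_000, 0.01)
x = sprand(Float32, 1_000, 0.05)

dA = CuSparseMatrixCSR(A)
dx = CuSparseVector(x)

# Assumption: with the new vector/matrix conversions, a sparse-times-sparse
# product like this can stay sparse instead of densifying the result.
dy = dA * dx
```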
CUDA.jl's CUFFT wrappers now support Float16, thanks to Erik Schnetter.
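As a usage sketch, assuming the usual AbstractFFTs entry points and CUFFT's documented restriction of half-precision transforms to power-of-two sizes:

```julia
using CUDA, CUDA.CUFFT

# A half-precision signal; the power-of-two length follows CUFFT's
# restriction for Float16 transforms.
x = CuArray(rand(ComplexF16, 1024))

X = fft(x)    # forward transform
y = ifft(X)   # round-trip back
```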