* using log directory 'd:/Rcompile/CRANpkg/local/4.4/vetiver.Rcheck'
* using R version 4.4.3 (2025-02-28 ucrt)
* using platform: x86_64-w64-mingw32
* R was compiled by
    gcc.exe (GCC) 13.3.0
    GNU Fortran (GCC) 13.3.0
* running under: Windows Server 2022 x64 (build 20348)
* using session charset: UTF-8
* checking for file 'vetiver/DESCRIPTION' ... OK
* this is package 'vetiver' version '0.2.6'
* package encoding: UTF-8
* checking package namespace information ... OK
* checking package dependencies ... OK
* checking if this is a source package ... OK
* checking if there is a namespace ... OK
* checking for hidden files and directories ... OK
* checking for portable file names ... OK
* checking whether package 'vetiver' can be installed ... OK
* checking installed package size ... OK
* checking package directory ... OK
* checking 'build' directory ... OK
* checking DESCRIPTION meta-information ... OK
* checking top-level files ... OK
* checking for left-over files ... OK
* checking index information ... OK
* checking package subdirectories ... OK
* checking code files for non-ASCII characters ... OK
* checking R files for syntax errors ... OK
* checking whether the package can be loaded ... [1s] OK
* checking whether the package can be loaded with stated dependencies ... [1s] OK
* checking whether the package can be unloaded cleanly ... [1s] OK
* checking whether the namespace can be loaded with stated dependencies ... [1s] OK
* checking whether the namespace can be unloaded cleanly ... [2s] OK
* checking loading without being on the library search path ... [1s] OK
* checking whether startup messages can be suppressed ... [1s] OK
* checking use of S3 registration ... OK
* checking dependencies in R code ... OK
* checking S3 generic/method consistency ... OK
* checking replacement functions ... OK
* checking foreign function calls ... OK
* checking R code for possible problems ... [9s] OK
* checking Rd files ... [1s] OK
* checking Rd metadata ... OK
* checking Rd cross-references ... OK
* checking for missing documentation entries ... OK
* checking for code/documentation mismatches ... OK
* checking Rd \usage sections ... OK
* checking Rd contents ... OK
* checking for unstated dependencies in examples ... OK
* checking installed files from 'inst/doc' ... OK
* checking files in 'vignettes' ... OK
* checking examples ... [22s] OK
* checking for unstated dependencies in 'tests' ... OK
* checking tests ... [42s] ERROR
  Running 'testthat.R' [41s]
Running the tests in 'tests/testthat.R' failed.
Complete output:
  > library(testthat)
  > library(vetiver)
  >
  > test_check("vetiver")
  Loading required package: ggplot2
  Loading required package: lattice

  Create a Model Card for your published model
  * Model Cards provide a framework for transparent, responsible reporting
  * Use the vetiver `.Rmd` template as a place to start
  This message is displayed once per session.
  This is mgcv 1.9-4. For overview type '?mgcv'.

  Attaching package: 'parsnip'

  The following object is masked from 'package:e1071':

      tune


  Attaching package: 'probably'

  The following objects are masked from 'package:base':

      as.factor, as.ordered


  Attaching package: 'tune'

  The following object is masked from 'package:e1071':

      tune

  The following object is masked from 'package:vetiver':

      load_pkgs


  Attaching package: 'rsample'

  The following object is masked from 'package:e1071':

      permutations

  The following object is masked from 'package:caret':

      calibration


  Attaching package: 'recipes'

  The following object is masked from 'package:stats':

      step

  Your rsconnect bundle has been created at:
  * D:/temp/2025_12_08_01_50_00_1885/RtmpEFnX0b/filedf2c9f23a17/bundledf2c7384ffd.tar.gz
  Saving _problems/test-xgboost-9.R
  [ FAIL 1 | WARN 2 | SKIP 70 | PASS 221 ]

  ══ Skipped tests (70) ══════════════════════════════════════════════════════════
  • On CRAN (70): 'test-api.R:16:1', 'test-api.R:77:1', 'test-attach-pkgs.R:2:5',
    'test-attach-pkgs.R:7:5', 'test-attach-pkgs.R:12:5', 'test-caret.R:22:1',
    'test-caret.R:65:5', 'test-choose-version.R:4:5', 'test-choose-version.R:33:1',
    'test-create-ptype.R:41:1', 'test-dashboard.R:13:5', 'test-gam.R:8:1',
    'test-gam.R:60:5', 'test-glm.R:7:1', 'test-glm.R:59:5', 'test-keras.R:1:1',
    'test-kproto.R:14:1', 'test-kproto.R:65:5', 'test-luz.R:1:1', 'test-mlr3.R:3:1',
    'test-mlr3.R:52:5', 'test-monitor.R:66:5', 'test-monitor.R:72:5',
    'test-monitor.R:79:5', 'test-monitor.R:124:5', 'test-pin-read-write.R:3:1',
    'test-pin-read-write.R:17:1', 'test-pin-read-write.R:132:5', 'test-predict.R:1:1',
    'test-probably.R:48:1', 'test-probably.R:98:5', 'test-probably.R:109:1',
    'test-probably.R:159:5', 'test-probably.R:170:1', 'test-probably.R:220:5',
    'test-probably.R:232:1', 'test-probably.R:282:5', 'test-ranger.R:9:1',
    'test-ranger.R:13:1', 'test-ranger.R:62:5', 'test-recipe.R:14:1',
    'test-recipe.R:58:5', 'test-rsconnect.R:18:5', 'test-sagemaker.R:4:5',
    'test-sagemaker.R:25:5', 'test-sagemaker.R:49:1', 'test-sagemaker.R:77:1',
    'test-sagemaker.R:98:1', 'test-sagemaker.R:112:1', 'test-sagemaker.R:161:1',
    'test-stacks.R:1:1', 'test-tidymodels.R:21:1', 'test-tidymodels.R:71:5',
    'test-type-convert.R:15:1', 'test-type-convert.R:31:1', 'test-type-convert.R:47:1',
    'test-write-docker.R:5:5', 'test-write-docker.R:17:5', 'test-write-docker.R:35:5',
    'test-write-docker.R:52:5', 'test-write-docker.R:65:5', 'test-write-docker.R:81:5',
    'test-write-docker.R:88:5', 'test-write-plumber.R:4:5', 'test-write-plumber.R:17:5',
    'test-write-plumber.R:38:5', 'test-write-plumber.R:57:5', 'test-write-plumber.R:71:5',
    'test-write-plumber.R:84:5', 'test-write-plumber.R:98:5'

  ══ Failed tests ════════════════════════════════════════════════════════════════
  ── Error ('test-xgboost.R:9:1'): (code run outside of `test_that()`) ───────────
  Error in `matrix(NA_real_, ncol = model$nfeatures, dimnames = list("", model$feature_names))`: non-numeric matrix extent
  Backtrace:
      ▆
   1. └─vetiver::vetiver_model(cars_xgb, "cars2") at test-xgboost.R:9:1
   2.   └─vetiver::vetiver_create_ptype(model, save_prototype, ...)
   3.     ├─vetiver::vetiver_ptype(model, ...)
   4.     └─vetiver:::vetiver_ptype.xgb.Booster(model, ...)
   5.       └─base::matrix(...)

  [ FAIL 1 | WARN 2 | SKIP 70 | PASS 221 ]
  Error: ! Test failures.
  Execution halted
* checking for unstated dependencies in vignettes ... OK
* checking package vignettes ... OK
* checking re-building of vignette outputs ... [10s] OK
* checking PDF version of manual ... [20s] OK
* checking HTML version of manual ... [7s] OK
* DONE
Status: 1 ERROR
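
For context on the failure above: base R's matrix() signals "non-numeric matrix extent" whenever an nrow/ncol argument is not numeric, for example when it is NULL. A minimal sketch, not taken from the vetiver sources and assuming model$nfeatures comes back NULL from the fitted booster (a hypothetical stand-in object is used here), reproduces the same error seen in the backtrace:

  # Hypothetical stand-in for the xgboost booster: assume `nfeatures` is
  # absent (NULL) while `feature_names` is still present.
  model <- list(nfeatures = NULL, feature_names = c("speed"))

  # Same call shape as in the backtrace; a NULL extent is non-numeric,
  # so matrix() stops before the dimnames are ever used.
  matrix(NA_real_, ncol = model$nfeatures,
         dimnames = list("", model$feature_names))
  #> Error in matrix(...) : non-numeric matrix extent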