Nextflow Modules

Showing module(s) with keyword "inference"

| Module | Keywords | Description |
| --- | --- | --- |
| nf-core/huggingface/download | download, gguf, huggingface, hub, inference, llm, model | Download a file from a Hugging Face Hub repository using the `hf` CLI (see the first sketch below) |
| nf-core/llamacpppython/run | inference, llama, llm, local-inference, offline-llm | Python wrapper for running a locally hosted LLM with llama.cpp (see the second sketch below) |
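
For illustration, here is a rough Python equivalent of what `nf-core/huggingface/download` does. The module itself invokes the `hf` CLI; this sketch uses the `huggingface_hub` library that backs that CLI, and the repository and file names are placeholders rather than values taken from the module.

```python
# Rough Python equivalent of the nf-core/huggingface/download module:
# fetch a single file (here a GGUF model) from a Hugging Face Hub repo.
# The repo_id and filename below are hypothetical placeholders.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="example-org/example-model-GGUF",  # hypothetical repository
    filename="example-model.Q4_K_M.gguf",      # hypothetical GGUF file
)
print(f"Model downloaded to: {local_path}")
```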
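
Similarly, a minimal sketch of the kind of local inference `nf-core/llamacpppython/run` wraps, using the `llama-cpp-python` bindings. The model path, prompt, and parameters are assumptions for illustration; the module's actual interface may differ.

```python
# Minimal local-inference sketch with the llama-cpp-python bindings,
# the kind of call the nf-core/llamacpppython/run module wraps.
from llama_cpp import Llama

llm = Llama(
    model_path="example-model.Q4_K_M.gguf",  # GGUF file, e.g. from the download module
    n_ctx=2048,                              # context window size (assumption)
)

# Run a single completion against the locally loaded model.
output = llm("Summarise Nextflow in one sentence.", max_tokens=64)
print(output["choices"][0]["text"])
```

Pairing the two modules this way (download a GGUF file, then run it offline with llama.cpp) matches the keywords they advertise: `gguf` on the download side, `local-inference` and `offline-llm` on the run side.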