nf-core/llamacpppython/run @ 0.0.0-4c44634

Python wrapper for running locally-hosted LLM with llama.cpp

Latest version: 0.0.0-4c44634
Total downloads: 4
Source: nf-core/modules
Maintainers: @toniher @lucacozzuto

Summary

Python wrapper for running locally-hosted LLM with llama.cpp

Get started

Add the following snippet to your workflow script to include this module.

include { LLAMACPPPYTHON_RUN } from 'nf-core/llamacpppython/run'
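A minimal sketch of wiring the module into a workflow, assuming the input tuple carries the meta map, prompt file, and GGUF model together; the file names and the `.out.output` channel name are illustrative assumptions, not taken from the module documentation:

```nextflow
include { LLAMACPPPYTHON_RUN } from 'nf-core/llamacpppython/run'

workflow {
    // Illustrative input channel: [ meta, prompt_file, gguf_model ]
    // (paths are placeholders — substitute your own prompt and model files)
    ch_input = Channel.of(
        [ [ id:'sample1' ], file('prompt.txt'), file('model.gguf') ]
    )

    LLAMACPPPYTHON_RUN ( ch_input )

    // Inspect the inference results; the output channel name is assumed here
    LLAMACPPPYTHON_RUN.out.output.view()
}
```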

License

MIT License

Process

Name: LLAMACPPPYTHON_RUN

Input (1 channel)

#1 tuple
  meta (map)
    Groovy Map containing sample information, e.g. [ id:'sample1' ]
  prompt_file (file)
    Prompt file. Structure: [ val(meta), path(prompt_file) ]
  gguf_model (file)
    GGUF model. Structure: [ val(meta), path(gguf_model) ]

Output (2 channels)

#1 output tuple
  meta (map)
    Groovy Map containing sample information, e.g. [ id:'sample1' ]
  ${prefix}.txt (file)
    File with the output of the LLM inference request
#2 versions_llama_cpp_python
  versions.yml (file)
    File containing software versions
versions.yml

Tool              Description                                          Homepage
llama-cpp-python  Python wrapper for the llama.cpp LLM inference tool  https://llama-cpp-python.readthedocs.io/en/latest/
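The `versions.yml` emitted by nf-core modules is a small YAML map of process name to tool versions. A hedged sketch of parsing it, using a minimal line-based parser (real pipelines would use a YAML library; the version number `0.2.90` below is a placeholder, not the module's pinned version):

```python
def parse_versions(text):
    """Minimal parser for an nf-core-style versions.yml: top-level keys are
    process names, indented lines are tool: version pairs."""
    versions = {}
    current = None
    for line in text.splitlines():
        if not line.strip() or line.lstrip().startswith('#'):
            continue  # skip blanks and comments
        if not line.startswith((' ', '\t')):
            # Top-level process name, e.g. "LLAMACPPPYTHON_RUN":
            current = line.rstrip(':').strip().strip('"')
            versions[current] = {}
        else:
            # Indented tool: version entry
            tool, _, ver = line.strip().partition(':')
            versions[current][tool.strip()] = ver.strip()
    return versions

# Placeholder version string for illustration only
sample = '"LLAMACPPPYTHON_RUN":\n    llama-cpp-python: 0.2.90\n'
print(parse_versions(sample))
```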
Version 0.0.0-4c44634
Commit ID 4c44634073c8b785e34735a3bd1b3c89123bc34b
Release Date 08 May 2026 15:00:41 (UTC)
Download URL https://registry.nextflow.io/api/v1/modules/nf-core%2Fllamacpppython%2Frun/0.0.0-4c44634/download
OCI Store URL https://public.cr.seqera.io/v2/nextflow/plugin/modules/nf-core/llamacpppython/run/blobs/sha256:d5fd803930bd603a8f280b6b9567ab1ca230de8accca7b8548784fc1b5258bad
Size 5.0 KB
Checksum sha256:d5fd803930bd603a8f280b6b9567ab1ca230de8accca7b8548784fc1b5258bad
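A downloaded module archive can be checked against the published sha256 checksum above. A small sketch, assuming the archive has been saved locally (the filename `module.tar.gz` is an assumption):

```shell
# Verify a file against an expected sha256 hex digest.
# $1 = expected digest, $2 = file path; exits non-zero on mismatch.
verify_sha256() {
  echo "$1  $2" | sha256sum -c -
}

# Example with the registry's published checksum (assumed local filename):
# verify_sha256 d5fd803930bd603a8f280b6b9567ab1ca230de8accca7b8548784fc1b5258bad module.tar.gz
```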
Downloads 3
Version        Date                        Status  Downloads  Size    Diff
0.0.0-4c44634  08 May 2026 15:00:41 (UTC)          3          5.0 KB
0.0.0-3a0ac08  07 May 2026 15:01:00 (UTC)          1          5.0 KB  -