A collection of recipes/notebooks showcasing use-cases of open-source models with Qubrid AI.
Testing function calling while running Llama 3.1 locally using Ollama.
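As a rough illustration of the function-calling pattern such a repo tests, the sketch below defines a tool schema in the JSON-schema shape that chat APIs like Ollama's accept, and a dispatcher that routes a model-emitted tool call to a local Python function. The tool name `get_weather` and the simulated call are hypothetical stand-ins; a real run would pass `tool_spec` to the model and dispatch whatever calls come back.

```python
import json

# Hypothetical tool the model may call; name and behavior are illustrative.
def get_weather(city: str) -> str:
    return f"Sunny in {city}"

TOOLS = {"get_weather": get_weather}

# JSON-schema style tool description, in the general shape chat APIs accept.
tool_spec = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

def dispatch(tool_call: dict) -> str:
    """Route a model-emitted tool call to the matching Python function."""
    fn = TOOLS[tool_call["function"]["name"]]
    args = tool_call["function"]["arguments"]
    if isinstance(args, str):  # some models emit arguments as JSON text
        args = json.loads(args)
    return fn(**args)

# Simulated tool call, standing in for what the model would return.
call = {"function": {"name": "get_weather", "arguments": {"city": "Pune"}}}
print(dispatch(call))  # Sunny in Pune
```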
Fuzzy logic toolbox with a GUI and file analysis.
A high-performance request router for vLLM — smart load balancing, response caching, prefill/decode disaggregation, semantic routing, Anthropic/OpenAI API translation, and operational tooling out of the box.
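One of the features listed above is response caching. As a minimal sketch of that idea (not the project's actual implementation), the class below keeps an LRU cache of responses keyed by a hash of the model name and prompt, so a router can return a cached completion for an identical repeated request instead of forwarding it to vLLM.

```python
import hashlib
from collections import OrderedDict

class ResponseCache:
    """Tiny LRU response cache, illustrating how a request router might
    skip re-running identical requests. Capacity and keying are assumptions."""

    def __init__(self, capacity: int = 128):
        self.capacity = capacity
        self._store: "OrderedDict[str, str]" = OrderedDict()

    @staticmethod
    def _key(model: str, prompt: str) -> str:
        # Hash model + prompt together so different models never collide.
        return hashlib.sha256(f"{model}\x00{prompt}".encode()).hexdigest()

    def get(self, model: str, prompt: str):
        key = self._key(model, prompt)
        if key not in self._store:
            return None
        self._store.move_to_end(key)  # mark as most recently used
        return self._store[key]

    def put(self, model: str, prompt: str, response: str) -> None:
        key = self._key(model, prompt)
        self._store[key] = response
        self._store.move_to_end(key)
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)  # evict least recently used

cache = ResponseCache(capacity=2)
cache.put("llama3.1", "hi", "hello!")
print(cache.get("llama3.1", "hi"))   # cache hit
print(cache.get("llama3.1", "bye"))  # cache miss -> None
```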
Python code generation from LaTeX expressions, using a synthetic dataset and the CodeT5-base model.
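To make the LaTeX-to-Python task concrete, here is a deliberately tiny rule-based translator for a few LaTeX patterns; the repository instead fine-tunes CodeT5-base to learn this mapping from data, so the function below is only an illustration of the input/output pairs involved.

```python
import re

def latex_to_python(expr: str) -> str:
    """Translate a small subset of LaTeX math into a Python expression.
    Handles \\frac, ^, and \\cdot; everything else passes through unchanged."""
    # \frac{a}{b} -> ((a)/(b)); loop so nested fractions resolve inside-out
    frac = re.compile(r"\\frac\{([^{}]+)\}\{([^{}]+)\}")
    while frac.search(expr):
        expr = frac.sub(r"((\1)/(\2))", expr)
    expr = expr.replace("^", "**")      # exponentiation
    expr = expr.replace(r"\cdot", "*")  # multiplication
    return expr

print(latex_to_python(r"\frac{a}{b} + x^2"))  # ((a)/(b)) + x**2
```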
Classification of sperm heads based on their morphological quality via Mask R-CNN.
An LLM inferencing benchmark tool focusing on device-specific latency and memory usage
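The core measurement such a benchmark takes, per-call latency and memory, can be sketched with the standard library alone. The helper below times one call with `time.perf_counter` and records peak Python heap allocation with `tracemalloc`; the `fake_infer` workload is a hypothetical stand-in for a model forward pass, and a real tool would report device memory (e.g. GPU) rather than only the Python heap.

```python
import time
import tracemalloc

def profile_call(fn, *args, **kwargs):
    """Measure wall-clock latency and peak Python heap allocation of one call,
    the kind of per-run numbers an inference benchmark reports."""
    tracemalloc.start()
    t0 = time.perf_counter()
    result = fn(*args, **kwargs)
    latency_s = time.perf_counter() - t0
    _, peak_bytes = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return result, latency_s, peak_bytes

# Stand-in workload for a model forward pass.
def fake_infer(n: int) -> int:
    buf = [0] * n  # allocate a buffer so peak memory is visible
    return sum(buf)

out, latency, peak = profile_call(fake_infer, 100_000)
print(f"result={out} latency={latency:.4f}s peak={peak} bytes")
```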