python313Packages.exllamav2
Inference library for running LLMs locally on modern consumer-class GPUs
- Name: exllamav2
- Version: 0.3.2
- Platforms: x86_64-windows, x86_64-linux
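To try the package, a minimal sketch using an ad-hoc nix-shell environment (the attribute names follow the page title; this assumes the package is available in your nixpkgs channel):

    nix-shell -p "python313.withPackages (ps: [ ps.exllamav2 ])"

Inside the resulting shell, importing exllamav2 from Python should work; running actual inference additionally requires a supported consumer-class GPU.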