liltom-eth/llama2-webui - Gource visualisation

Video Link: https://www.youtube.com/watch?v=ape1n8MJq-M
Duration: 0:17
Views: 15

Url: https://github.com/liltom-eth/llama2-webui
Author: liltom-eth
Repo: llama2-webui

Description: Run Llama 2 locally with a Gradio UI on GPU or CPU from anywhere (Linux/Windows/Mac). Supports Llama-2-7B/13B/70B with 8-bit and 4-bit quantization. Supports GPU inference (6 GB VRAM) and CPU inference.
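
For context, a minimal sketch of the kind of Gradio wrapper the description refers to, assuming the Hugging Face transformers + bitsandbytes stack for 4-bit loading; the model ID, prompt handling, and generation settings below are illustrative and are not taken from the repo's own code:

    import gradio as gr
    from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

    # Assumed checkpoint; the repo also targets 13B/70B and local model paths.
    MODEL_ID = "meta-llama/Llama-2-7b-chat-hf"

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        quantization_config=BitsAndBytesConfig(load_in_4bit=True),  # 4-bit weights to fit ~6 GB VRAM
        device_map="auto",  # place layers on GPU/CPU as memory allows
    )

    def chat(prompt):
        inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
        output_ids = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
        # Return only the newly generated tokens, not the echoed prompt.
        return tokenizer.decode(output_ids[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True)

    gr.Interface(fn=chat, inputs="text", outputs="text", title="Llama 2 (4-bit)").launch()
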
Starred: 560
Forked: 51
Watching: 6
Total commits: 22
Initial commit: Wed Jul 19 19:03:39 2023 -0700
Total number of files: 13
Total number of lines: 1575