main
ollama-run-llama3.1-8b
View on GitHub

  • 955 issues
  • 0 pull requests
  • 533 files
  • 1 active branch

CodeFactor Rating C+

Grade  Name                                             Complexity  Churn  Issues
A      llm\server.go                                    233         10     9
B      ml\backend\ggml\ggml\src\ggml-cpu\amx\mmq.cpp    226         1      15
A      ml\backend\ggml\ggml\src\ggml-alloc.c            206         1      3
A      cmd\cmd.go                                       204         6      6
A      llama\llama.cpp\src\llama-model-loader.cpp       196         1      4
A      llama\llama.cpp\src\unicode.cpp                  193         1      6
A      llama\llama.cpp\src\llama-chat.cpp               189         1      2
A      llm\gguf.go                                      162         1      6
A      parser\parser.go                                 160         6      6
A      server\images.go                                 158         3      4
A      server\create.go                                 154         3      4
A      openai\openai.go                                 137         3      4
A      server\sched.go                                  132         1      10
A      llama\llama.cpp\src\llama-kv-cache.cpp           124         1      2
A      llama\runner\runner.go                           124         4      9
A      readline\buffer.go                               123         2      2
B      discover\gpu.go                                  120         5      13
A      llm\ggml.go                                      116         1      2
A      cmd\interactive.go                               107         3      2
A      ml\backend\ggml\ggml\src\ggml-cpu\ggml-cpu.cpp   102         1      1
© 2025 CodeFactor
