
  • 955 issues
  • 0 pull requests
  • 533 files
  • 1 active branch

CodeFactor Rating: C+

Grade  Name                                                    Complexity  Churn  Issues
F      ml\backend\ggml\ggml\src\ggml-cpu\ggml-cpu-quants.c     245         1      58
F      llama\llama.cpp\src\llama.cpp                           1147        1      143
F      Dockerfile                                              -           5      25
F      ml\backend\ggml\ggml\src\ggml-cpu\ggml-cpu-aarch64.cpp  249         1      35
F      llama\llama.cpp\common\json.hpp                         3317        1      79
F      ml\backend\ggml\ggml\src\ggml-cpu\ggml-cpu.c            1661        1      78
F      ml\backend\ggml\ggml\src\ggml-quants.c                  1064        1      48
F      llama\llama.cpp\common\stb_image.h                      1621        1      25
C      ml\backend\ggml\ggml\src\ggml.c                         956         1      13
C      llama\llama.cpp\src\llama-arch.cpp                      11          1      19
B      openai\openai_test.go                                   32          2      8
B      ml\backend\ggml\ggml\src\ggml-cuda\convert.cu           -           1      14
B      server\routes.go                                        299         7      11
B      ml\backend\ggml\ggml\src\ggml-cpu\llamafile\sgemm.cpp   261         1      11
B      ml\backend\ggml\ggml\src\ggml-cpu\amx\mmq.cpp           226         1      15
B      llama\llama.cpp\examples\llava\clip.cpp                 255         2      16
B      discover\gpu_linux_test.go                              9           1      14
B      discover\gpu.go                                         120         5      13
A      api\types_test.go                                       13          1      -
A      api\client.go                                           56          1      -
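The Churn column above generally reflects how often a file has been modified recently; files that are both complex and frequently changed (such as llama.cpp, with complexity 1147 and 143 issues) are the usual refactoring hotspots. As a rough illustration of how a churn-style count could be derived from commit history, here is a minimal sketch; the metric and the `churn` helper are hypothetical, not CodeFactor's actual computation:

```python
from collections import Counter

def churn(commits):
    """Count how many commits touched each file.

    `commits` is a list of file-path lists, one list per commit --
    a stand-in for parsed `git log --name-only` output.
    """
    counts = Counter()
    for files in commits:
        counts.update(set(files))  # a file counts at most once per commit
    return counts

# Illustrative history using paths from the table above.
history = [
    ["server/routes.go", "api/client.go"],
    ["server/routes.go"],
    ["server/routes.go", "discover/gpu.go"],
]
print(churn(history)["server/routes.go"])  # → 3
```

In practice a tool would bound the window (e.g. the last N commits or months) so that old, stable files do not accumulate churn forever.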
© 2025 CodeFactor