main
ollama-run-llama3.1-8b

  • 955 issues
  • 0 pull requests
  • 533 files
  • 1 active branch

CodeFactor Rating C+

Grade  Name                                                 Complexity  Churn  Issues
A      server\routes_test.go                                101         3      3
A      server\routes_generate_test.go                       100         3      6
A      llama\llama.go                                       99          9      1
A      ml\backend\ggml\ggml\src\ggml-opt.cpp                95          1      2
A      llama\llama.cpp\src\llama-batch.cpp                  94          1      2
A      ml\backend\ggml\ggml\src\ggml-cpu\cpu-feats-x86.cpp  92          1      1
A      ml\backend\ggml\ggml\src\ggml-backend-reg.cpp        89          1      1
A      discover\amd_linux.go                                89          2      8
A      llama\llama.cpp\src\llama-mmap.cpp                   87          1      -
A      template\template.go                                 85          1      4
A      llama\llama.cpp\common\sampling.cpp                  82          1      2
A      server\download.go                                   81          4      1
A      server\sched_test.go                                 78          1      4
A      api\types.go                                         75          5      4
A      cmd\cmd_test.go                                      75          3      5
A      readline\readline.go                                 74          2      2
A      server\routes_create_test.go                         73          3      4
A      types\model\name.go                                  69          2      -
A      llm\filetype.go                                      67          1      3
A      convert\convert_test.go                              63          3      1