ollama-run-llama3.1-8b (branch: main)

  • 955 issues
  • 0 pull requests
  • 533 files
  • 1 active branch

CodeFactor Rating C+

Grade  Name                                                 Complexity  Churn  Issues
A      discover\amd_windows.go                              26          3      8
A      discover\amd_hip_windows.go                          25          1      1
A      llm\ggla.go                                          25          1      1
A      integration\concurrency_test.go                      25          1      3
A      server\prompt.go                                     24          2      3
A      convert\convert_bert.go                              24          1      1
A      server\layer.go                                      23          1      -
A      model\pixtral\imageproc_test.go                      22          1      1
A      discover\types.go                                    22          2      3
A      envconfig\config_test.go                             21          1      1
A      server\modelpath.go                                  21          2      -
A      discover\gpu_info_cudart.c                           20          1      1
A      llama\runner\stop.go                                 20          1      -
A      llama\llama.cpp\src\llama-kv-cache.h                 19          1      -
A      progress\progress.go                                 19          1      -
A      ml\backend\ggml\ggml\src\ggml-cpu\ggml-cpu-impl.h    19          1      1
A      llama\llama.cpp\src\llama-hparams.cpp                18          1      -
A      convert\tokenizer_spm.go                             18          1      -
A      discover\cuda_common.go                              18          1      1
A      model\imageproc\images_test.go                       18          1      -