Branches: main, ollama-run-llama3.1-8b

  • 955 issues
  • 0 pull requests
  • 533 files
  • 1 active branch

CodeFactor Rating C+

Grade  Name                                                   Complexity  Churn  Issues
A      macapp\src\index.ts                                    56          1      -
A      envconfig\config_test.go                               21          1      1
A      macapp\src\renderer.tsx                                -           1      -
A      ml\backend\ggml\ggml\include\ggml-opt.h                0           1      1
A      macapp\webpack.main.config.ts                          -           1      -
A      format\bytes_test.go                                   6           1      -
A      macapp\webpack.renderer.config.ts                      -           1      -
A      ml\backend\ggml\ggml\src\ggml-backend-reg.cpp          89          1      1
A      main.go                                                1           1      -
A      format\format_test.go                                  3           1      -
A      ml\backend\ggml\ggml\include\ggml-backend.h            0           1      -
A      cmd\runner\main.go                                     2           1      -
A      ml\backend\ggml\ggml\src\ggml-cpu\llamafile\sgemm.h    0           1      -
B      ml\backend\ggml\ggml\src\ggml-cpu\llamafile\sgemm.cpp  261         1      11
F      ml\backend\ggml\ggml\src\ggml-cpu\ggml-cpu-quants.c    245         1      58
A      ml\backend\ggml\ggml\include\ggml-rpc.h                0           1      -
A      ml\backend\ggml\ggml\include\ggml-cpu.h                0           1      -
A      app\lifecycle\updater_windows.go                       8           1      3
A      convert\convert_gemma2.go                              2           1      1
A      llama\llama.cpp\common\log.cpp                         56          1      1