main
ollama-run-llama3.1-8b
[Repository badge: CodeFactor rating C+]

  • 955 issues
  • 0 pull requests
  • 533 files
  • 1 active branch

CodeFactor Rating C+
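A rating like this is usually surfaced in a project README via CodeFactor's badge image. A minimal sketch, assuming the standard CodeFactor badge URL scheme; `<owner>` and `<repo>` are placeholders for the actual GitHub owner and repository name, which this report does not spell out:

```markdown
[![CodeFactor](https://www.codefactor.io/repository/github/<owner>/<repo>/badge)](https://www.codefactor.io/repository/github/<owner>/<repo>)
```

The image renders the current grade and the link leads back to the full report.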

| Grade | Name                                     | Complexity | Churn | Issues |
|-------|------------------------------------------|------------|-------|--------|
| A     | discover\gpu_info_darwin.h               | 0          | 1     | -      |
| A     | .github\workflows\test.yaml              | -          | 4     | -      |
| A     | discover\gpu_info_cudart.h               | 0          | 1     | 1      |
| A     | .github\ISSUE_TEMPLATE\config.yml        | -          | 1     | -      |
| A     | .golangci.yaml                           | -          | 2     | -      |
| A     | discover\gpu_info_nvml.h                 | 0          | 1     | -      |
| A     | discover\gpu_info.h                      | 0          | 1     | -      |
| A     | go.mod                                   | -          | 5     | 1      |
| A     | .github\ISSUE_TEMPLATE\10_bug_report.yml | -          | 2     | -      |
| A     | app\ollama_welcome.ps1                   | -          | 1     | -      |
| A     | .github\workflows\release.yaml           | -          | 15    | -      |
| A     | .github\workflows\latest.yaml            | -          | 1     | -      |
| A     | discover\gpu_info_oneapi.h               | 0          | 1     | 1      |
| A     | discover\gpu_info_nvcuda.h               | 0          | 1     | 1      |
| A     | discover\path.go                         | -          | 2     | -      |
| A     | llama\llama.cpp\common\common.go         | -          | 1     | -      |
| A     | llama\build-info.cpp                     | 0          | 4     | -      |
| A     | app\tray\wintray\messages.go             | -          | 1     | -      |
| A     | app\tray\commontray\types.go             | -          | 1     | -      |
| F     | Dockerfile                               | -          | 5     | 25     |
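The per-file grades above can be summarized with a quick tally; a sketch in Python, with the grade list transcribed from the table (19 files graded A, the Dockerfile graded F):

```python
from collections import Counter

# Per-file grades transcribed from the CodeFactor table above
# (file paths omitted; only the grade column matters for the tally).
grades = ["A"] * 19 + ["F"]

tally = Counter(grades)
print(tally)  # Counter({'A': 19, 'F': 1})
```

The single F (the Dockerfile, with 25 issues) is what drags the repository's overall rating down to C+ despite the otherwise uniform A grades.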
© 2025 CodeFactor