Branches: main, ollama-run-llama3.1-8b

  • 955 issues
  • 0 pull requests
  • 533 files
  • 1 active branch

CodeFactor Rating C+

Grade | Name | Complexity | Churn | Issues
----- | ---- | ---------- | ----- | ------
A | integration\max_queue_test.go | 8 | 1 | -
A | discover\gpu_info_cudart.h | 0 | 1 | 1
A | app\tray\wintray\messages.go | | 1 | -
A | discover\gpu_info_nvcuda.c | 27 | 1 | 1
A | app\lifecycle\logging_nonwindows.go | 1 | 1 | -
A | discover\gpu_info_nvml.c | 10 | 1 | -
A | app\tray\wintray\tray.go | 52 | 1 | 1
A | discover\gpu_info_oneapi.c | 30 | 1 | -
A | llama\llama.cpp\common\common.go | | 1 | -
A | discover\gpu_linux.go | 32 | 1 | 1
A | app\tray\wintray\winclass.go | 4 | 1 | -
A | discover\gpu_oneapi.go | 3 | 1 | 1
A | app\lifecycle\logging_windows.go | 2 | 1 | -
A | llama\llama.cpp\common\json-schema-to-grammar.cpp | 271 | 1 | 7
A | llama\llama.cpp\common\common.h | 3 | 1 | -
A | integration\llm_image_test.go | 2 | 1 | 1
A | format\format_test.go | 3 | 1 | -
A | app\lifecycle\paths.go | 11 | 1 | 3
A | app\lifecycle\updater_windows.go | 8 | 1 | 3
A | convert\convert_gemma2.go | 2 | 1 | 1
© 2025 CodeFactor
