main · ollama-run-llama3.1-8b
955 issues · 0 pull requests · 533 files · 1 active branch
CodeFactor Rating C+
| Grade | Name | Complexity | Churn | Issues |
|-------|------|------------|-------|--------|
| A | ml\backend\ggml\ggml\include\ggml-kompute.h | 0 | 1 | - |
| A | ml\backend\ggml\ggml\src\ggml-cuda\vendors\musa.h | 0 | 1 | - |
| A | integration\basic_test.go | 6 | 1 | 1 |
| A | ml\backend\ggml\ggml\src\ggml-cuda\wkv6.cuh | | 1 | - |
| A | ml\backend\ggml\ggml\include\ggml-opencl.h | 0 | 1 | - |
| A | ml\backend\ggml\ggml\src\ggml-metal\ggml-metal-impl.h | 0 | 1 | 3 |
| A | app\lifecycle\server_windows.go | 15 | 1 | - |
| A | ml\backend\ggml\ggml\src\ggml-opt.cpp | 95 | 1 | 2 |
| A | ml\backend\ggml\ggml\include\ggml-rpc.h | 0 | 1 | - |
| A | ml\backend\ggml\ggml\src\ggml-quants.h | 0 | 1 | - |
| A | integration\context_test.go | 4 | 1 | 1 |
| A | types\errtypes\errtypes.go | 1 | 1 | 1 |
| A | template\template_test.go | 43 | 1 | 2 |
| A | server\prompt_test.go | 17 | 1 | 1 |
| A | scripts\push_docker.sh | | 1 | - |
| A | integration\llm_image_test.go | 2 | 1 | 1 |
| A | app\lifecycle\updater_windows.go | 8 | 1 | 3 |
| A | convert\convert_gemma2.go | 2 | 1 | 1 |
| A | llama\llama.cpp\common\log.cpp | 56 | 1 | 1 |
| F | ml\backend\ggml\ggml\src\ggml-cpu\ggml-cpu.c | 1661 | 1 | 78 |