main / ollama-run-llama3.1-8b
955 issues · 0 pull requests · 533 files · 1 active branch
CodeFactor Rating: C+
Grade  Name                                                   Complexity  Churn  Issues
A      ml\backend\ggml\ggml\include\ggml-rpc.h                0           1      -
A      ml\backend\ggml\ggml\include\ggml-sycl.h               0           1      -
A      ml\backend\ggml\ggml\src\ggml-quants.h                 0           1      -
A      ml\backend\ggml\ggml\include\ggml-vulkan.h             0           1      -
A      ml\backend\ggml\ggml\src\ggml-threading.h              0           1      -
A      ml\backend\ggml\ggml\include\ggml.h                    0           1      2
A      ml\backend\ggml\ggml\src\ggml_darwin_arm64.go          -           3      -
A      ml\backend\ggml\ggml_debug.go                          -           1      -
A      ml\backend\ggml\ggml\src\ggml-backend-impl.h           0           1      -
A      ml\backend\ggml\ggml\src\ggml-blas\blas.go             -           3      -
A      llama\build-info.cpp                                   0           4      -
A      ml\backend\ggml\ggml\src\ggml-common.h                 0           1      -
A      ml\backend\ggml\ggml\src\ggml-cpu\amx\amx.h            0           1      -
A      ml\backend\ggml\ggml\src\ggml-cpu\amx\mmq.h            0           1      -
A      llama\llama.cpp\common\common.go                       -           1      -
A      ml\backend\ggml\ggml\src\ggml-cpu\cpu.go               -           4      -
A      scripts\build_darwin.sh                                -           8      2
A      scripts\build_docker.sh                                -           1      -
A      scripts\build_linux.sh                                 -           2      -
A      ml\backend\ggml\ggml\src\ggml-cpu\ggml-cpu-aarch64.h   0           1      -