main · ollama-run-llama3.1-8b
955 issues · 0 pull requests · 533 files · 1 active branch
CodeFactor Rating C+
| Grade | Name | Complexity | Churn | Issues |
|-------|------|------------|-------|--------|
| A | scripts\build_windows.ps1 | | 4 | - |
| A | ml\backend\ggml\ggml\src\ggml-cpu\cpu.go | | 4 | - |
| A | llama\sampling_ext.h | 0 | 3 | - |
| A | llama\mllama.cpp | 61 | 3 | 3 |
| A | llama\llama.cpp\examples\llava\llava.go | | 3 | - |
| A | ml\backend\ggml\ggml\src\ggml-blas\blas.go | | 3 | - |
| A | llama\llama_test.go | 10 | 3 | - |
| A | server\images.go | 158 | 3 | 4 |
| A | server\create.go | 154 | 3 | 4 |
| A | ml\backend\ggml\ggml\src\ggml_darwin_arm64.go | | 3 | - |
| A | ml\backend\ggml\ggml\src\ggml-metal\metal.go | | 3 | - |
| A | llama\llama.cpp\src\llama.go | | 3 | - |
| A | convert\convert.go | 40 | 3 | 1 |
| A | openai\openai.go | 137 | 3 | 4 |
| A | discover\amd_windows.go | 26 | 3 | 8 |
| A | convert\convert_test.go | 63 | 3 | 1 |
| A | parser\expandpath_test.go | 16 | 3 | 2 |
| A | ml\backend\ggml\ggml\src\ggml-cpu\llamafile\llamafile.go | | 3 | - |
| A | cmd\interactive.go | 107 | 3 | 2 |
| A | cmd\cmd_test.go | 75 | 3 | 5 |