main / ollama-run-llama3.1-8b
955 issues · 0 pull requests · 533 files · 1 active branch
CodeFactor Rating C+
Grade | Name                                  | Complexity | Churn | Issues
------|---------------------------------------|------------|-------|-------
A     | discover\cpu_common.go                | 4          | 1     | -
A     | llama\llama.cpp\src\llama-quant.h     | 0          | 1     | -
A     | app\lifecycle\lifecycle.go            | 15         | 1     | 3
A     | llama\llama.cpp\src\llama-sampling.h  | 0          | 1     | -
A     | app\tray\wintray\eventloop.go         | 43         | 1     | 2
A     | llama\llama.cpp\src\llama.go          |            | 3     | -
A     | discover\gpu_info.h                   | 0          | 1     | -
A     | llama\llama.cpp\src\unicode-data.h    | 0          | 1     | -
A     | api\examples\chat\main.go             | 3          | 1     | 1
A     | llama\llama.cpp\src\unicode.h         | 3          | 1     | -
A     | discover\gpu_info_cudart.h            | 0          | 1     | 1
A     | llama\llama_test.go                   | 10         | 3     | -
A     | app\tray\wintray\messages.go          |            | 1     | -
A     | llama\llama.cpp\common\base64.hpp     | 43         | 1     | 1
A     | convert\convert_bert.go               | 24         | 1     | 1
A     | llama\llama.cpp\common\common.cpp     | 265        | 1     | 4
A     | llama\llama.cpp\common\common.go      |            | 1     | -
A     | app\lifecycle\updater_nonwindows.go   | 1          | 1     | -
A     | convert\convert_commandr.go           | 4          | 1     | -
A     | llama\llama.cpp\common\common.h       | 3          | 1     | -