Repository: ollama-run-llama3.1-8b
Branch: main
Issues: 955
Pull requests: 0
Files: 533
Active branches: 1
CodeFactor Rating: C+

| Grade | Name | Complexity | Churn | Issues |
|-------|------|------------|-------|--------|
| A | ml\backend\ggml\ggml\src\ggml-cuda\binbcast.cu | | 1 | 3 |
| A | discover\gpu_info_cudart.h | 0 | 1 | 1 |
| A | app\tray\wintray\messages.go | | 1 | - |
| A | llama\llama.cpp\src\llama-mmap.h | 0 | 1 | - |
| A | ml\backend\ggml\ggml\src\ggml-cpu\ggml-cpu-hbm.h | 0 | 1 | - |
| A | .golangci.yaml | | 2 | - |
| A | ml\backend\ggml\ggml\include\ggml-cann.h | 0 | 1 | - |
| A | ml\backend\ggml\ggml\include\ggml-opt.h | 0 | 1 | 1 |
| B | ml\backend\ggml\ggml\src\ggml-cuda\convert.cu | | 1 | 14 |
| A | llama\llama.cpp\examples\llava\llava.go | | 3 | - |
| A | macapp\webpack.rules.ts | | 1 | - |
| A | llm\llm_linux.go | | 1 | - |
| A | ml\backend\ggml\ggml\src\ggml-cuda\arange.cuh | | 1 | - |
| A | ml\backend\ggml\ggml\src\ggml-cuda\count-equal.cu | | 1 | - |
| A | ml\backend\ggml\ggml\src\ggml-cuda\convert.cuh | | 1 | - |
| A | ml\backend\ggml\ggml\src\ggml-cuda\argsort.cu | | 1 | - |
| A | ml\backend\ggml\ggml\src\ggml-cuda\arange.cu | | 1 | - |
| A | ml\backend\ggml\ggml\include\ggml-vulkan.h | 0 | 1 | - |
| A | macapp\src\index.html | | 1 | - |
| A | llama\llama.cpp\examples\llava\llava.h | 0 | 1 | - |