ExternalVendorCode / whisper.cpp
Mirror of https://github.com/ggerganov/whisper.cpp.git, synced 2025-02-15 23:02:08 +00:00
Directory: ggml

Latest commit c49ee07ff4 by uvos (2025-02-03 22:00:57 +02:00):
HIP: add GGML_CUDA_CC_IS_* for AMD families, as increasing cc architectures for AMD GPUs are not supersets of each other (llama/11601)
This fixes a bug where RDNA1 GPUs other than gfx1010 were not handled correctly.
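The commit title carries the rationale: a feature check of the form "cc >= threshold" only makes sense when higher compute-capability values are supersets of lower ones, and AMD gfx architecture numbers do not stack that way, so each family gets its own predicate. Below is a minimal, self-contained sketch of what per-family GGML_CUDA_CC_IS_* checks can look like; the offset, constant values, and gfx-to-family ranges are illustrative assumptions, not the actual ggml definitions.

    /* Sketch only: the GGML_CUDA_CC_IS_* naming follows the commit title, but the
       offset, constants, and gfx-to-family ranges below are assumptions rather
       than the actual ggml definitions. */
    #include <stdio.h>

    #define GGML_CUDA_CC_OFFSET_AMD   1000000                          /* hypothetical marker for AMD cc values   */
    #define GGML_CUDA_CC_RDNA1       (GGML_CUDA_CC_OFFSET_AMD + 1010)  /* gfx101x: gfx1010, gfx1011, gfx1012, ... */
    #define GGML_CUDA_CC_RDNA2       (GGML_CUDA_CC_OFFSET_AMD + 1030)  /* gfx103x                                 */
    #define GGML_CUDA_CC_RDNA3       (GGML_CUDA_CC_OFFSET_AMD + 1100)  /* gfx110x                                 */

    /* Each family check is a closed range, not a ">= threshold" test, because a
       larger AMD architecture number does not imply a superset of the older one's features. */
    #define GGML_CUDA_CC_IS_RDNA1(cc) ((cc) >= GGML_CUDA_CC_RDNA1 && (cc) < GGML_CUDA_CC_RDNA2)
    #define GGML_CUDA_CC_IS_RDNA2(cc) ((cc) >= GGML_CUDA_CC_RDNA2 && (cc) < GGML_CUDA_CC_RDNA3)

    int main(void) {
        /* gfx1012, an RDNA1 part other than gfx1010, should take the RDNA1 path too. */
        int cc = GGML_CUDA_CC_OFFSET_AMD + 1012;
        printf("IS_RDNA1: %d, IS_RDNA2: %d\n", GGML_CUDA_CC_IS_RDNA1(cc), GGML_CUDA_CC_IS_RDNA2(cc));
        return 0;
    }

With range-based predicates like these, gfx1011 and gfx1012 fall into the same RDNA1 branch as gfx1010, which is the class of bug the commit message describes for RDNA1 GPUs other than gfx1010.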
Contents:

include           2025-02-03 22:00:57 +02:00    CUDA: use mma PTX instructions for FlashAttention (llama/11583)
src               2025-02-03 22:00:57 +02:00    HIP: add GGML_CUDA_CC_IS_* for AMD families, as increasing cc architectures for AMD GPUs are not supersets of each other (llama/11601)
.gitignore        2024-06-26 19:34:09 +03:00    whisper : reorganize source code + improve CMake (#2256)
CMakeLists.txt    2025-02-03 22:00:57 +02:00    cmake: add ggml find package (llama/11369)
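The CMakeLists.txt entry, "cmake: add ggml find package (llama/11369)", means an installed ggml can be located from another project through CMake's find_package. A minimal consumer sketch follows; the imported target name is an assumption and should be checked against the ggml-config.cmake that your installation actually ships.

    # Sketch of a downstream project consuming an installed ggml via find_package.
    # The plain "ggml" target name is an assumption; check the installed ggml-config.cmake.
    cmake_minimum_required(VERSION 3.14)
    project(ggml_consumer C CXX)

    find_package(ggml REQUIRED)          # locates ggml-config.cmake in the install prefix

    add_executable(ggml_consumer main.c)
    target_link_libraries(ggml_consumer PRIVATE ggml)

If ggml is installed to a non-standard prefix, point CMake at it when configuring, e.g. with -DCMAKE_PREFIX_PATH=/path/to/ggml/install.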