Yeah, unfortunately it seems this can’t be converted to a llama.cpp-compatible format yet, and that’s a pretty big tradeoff right now. Not surprising given how new it is, but we’ll have to wait before combining it with other improvements. Pretty exciting for the future, though.