For 4-bit quantization of cogvlm-chat-v1.1, how much RAM and how many CPUs do I need? #389
-
Tried with:
Answered by zRzRzRzRzRzRzR, Apr 24, 2024
Replies: 2 comments 1 reply
-
The process got killed too, while loading the checkpoint:
[2024-02-29 01:25:24,679] [INFO] [RANK 0] global rank 0 is loading checkpoint /hy-tmp/cogvlm-chat-v1.1/1/mp_rank_00_model_states.pt
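A quick sanity check, not from the thread: the SAT loader has to materialize the full unquantized checkpoint in host RAM before any quantization happens, so if the machine has less free RAM than the checkpoint on disk (plus some headroom), the Linux OOM killer terminates the load, which matches the log above. A minimal Python sketch, assuming psutil is installed and using the checkpoint path from that log:

```python
import os
import psutil
import torch

# Path taken from the log line above.
ckpt_path = "/hy-tmp/cogvlm-chat-v1.1/1/mp_rank_00_model_states.pt"

ckpt_size_gb = os.path.getsize(ckpt_path) / 1024**3       # on-disk size, roughly the fp16/bf16 weights
free_ram_gb = psutil.virtual_memory().available / 1024**3  # currently free host RAM

print(f"checkpoint: {ckpt_size_gb:.1f} GB, free RAM: {free_ram_gb:.1f} GB")

# Leave ~20% headroom for the loader itself; below that the OOM killer will likely strike.
if free_ram_gb < ckpt_size_gb * 1.2:
    raise SystemExit("Not enough host RAM to load this checkpoint; add RAM or swap.")

# map_location='cpu' keeps the GPU out of the load, but the full unquantized
# weights still have to fit in host RAM at this point.
state = torch.load(ckpt_path, map_location="cpu")
```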
-
Any recommended CPU and RAM?
This should be able to run (with 32 GB you need int4; an A100 with 40 GB or more is recommended). Try switching to a better CPU as well.
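For reference, a minimal sketch of one way to get a 4-bit load with much lower host-RAM pressure than loading the SAT checkpoint above: use the Hugging Face port THUDM/cogvlm-chat-hf with bitsandbytes. This is an illustration under those assumptions, not a maintainer recommendation; a 17B-parameter model at 4 bits is roughly 9 GB of weights, so GPU memory needs are far below the unquantized fp16 case.

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig, LlamaTokenizer

# Tokenizer used by the HF port of CogVLM-chat.
tokenizer = LlamaTokenizer.from_pretrained("lmsys/vicuna-7b-v1.5")

# 4-bit weight quantization via bitsandbytes.
quant_cfg = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.float16,
)

model = AutoModelForCausalLM.from_pretrained(
    "THUDM/cogvlm-chat-hf",
    quantization_config=quant_cfg,
    torch_dtype=torch.float16,
    trust_remote_code=True,    # CogVLM ships custom modeling code
    low_cpu_mem_usage=True,    # stream weights instead of building the full model in host RAM first
)
model.eval()
```

The low_cpu_mem_usage flag is the part relevant to the RAM question here: it avoids holding a second full copy of the weights in host memory during loading, which is where the killed processes above were running out.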