After updating `mlx-vlm` to version 0.1.14, the following error appears:
```
ValueError: Unrecognized image processor in /Users/ljoana/.cache/huggingface/hub/models--mlx-community--Qwen2.5-VL-3B-Instruct-bf16/snapshots/94c621f1696af5012c72e3c42c895aedf136e74c. Should have a `image_processor_type` key in its preprocessor_config.json of config.json, or one of the following `model_type` keys in its config.json: align, aria, beit, bit, blip, blip-2, bridgetower, chameleon, chinese_clip, clip, clipseg, conditional_detr, convnext, convnextv2, cvt, data2vec-vision, deformable_detr, deit, depth_anything, depth_pro, deta, detr, dinat, dinov2, donut-swin, dpt, efficientformer, efficientnet, flava, focalnet, fuyu, git, glpn, got_ocr2, grounding-dino, groupvit, hiera, idefics, idefics2, idefics3, ijepa, imagegpt, instructblip, instructblipvideo, kosmos-2, layoutlmv2, layoutlmv3, levit, llava, llava_next, llava_next_video, llava_onevision, mask2former, maskformer, mgp-str, mllama, mobilenet_v1, mobilenet_v2, mobilevit, mobilevitv2, nat, nougat, oneformer, owlv2, owlvit, paligemma, perceiver, pix2struct, pixtral, poolformer, pvt, pvt_v2, qwen2_5_vl, qwen2_vl, regnet, resnet, rt_detr, sam, segformer, seggpt, siglip, superglue, swiftformer, swin, swin2sr, swinv2, table-transformer, timesformer, timm_wrapper, tvlt, tvp, udop, upernet, van, videomae, vilt, vipllava, vit, vit_hybrid, vit_mae, vit_msn, vitmatte, xclip, yolos, zoedepth
```
This is already being discussed in the transformers repository in the following open issue.
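As a stopgap while waiting for the upstream fix, the error itself suggests one possible workaround: adding an `image_processor_type` key to the snapshot's `preprocessor_config.json`. The sketch below is a minimal, unofficial example of that patch; the processor class name `Qwen2VLImageProcessor` and the idea that this key alone resolves the error are assumptions, not something confirmed in this thread.

```python
import json
from pathlib import Path

def patch_preprocessor_config(snapshot_dir: str,
                              processor_type: str = "Qwen2VLImageProcessor") -> bool:
    """Add `image_processor_type` to preprocessor_config.json if it is missing.

    `processor_type` defaults to "Qwen2VLImageProcessor", which is an
    assumption for this model family, not a value taken from this issue.
    Returns True if the file was modified, False if the key was already there.
    """
    config_path = Path(snapshot_dir) / "preprocessor_config.json"
    config = json.loads(config_path.read_text())
    if "image_processor_type" in config:
        return False  # key already present, nothing to change
    config["image_processor_type"] = processor_type
    config_path.write_text(json.dumps(config, indent=2))
    return True
```

For the model in the traceback, `snapshot_dir` would be the cached snapshot path shown in the error message. Re-download or `git pull` of the model repo would overwrite this local edit.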
Hey @JoeJoe1313, thanks for reporting!
As discussed, it's an issue with transformers and will be fixed in their minor release.
Closing this issue for now, as we already have #209.