I’ve been having loads of fun playing around with all of the new LLMs that have come out on AWS recently. Last week we saw AWS announce support for the new gpt-oss models from OpenAI.
While those models are getting most of the press, others receive less attention. This week I put together a quick intro on how to deploy the Qwen2.5 VL model to Amazon Bedrock.
Even if this model is not the one you end up using in your stack, the process would be similar if you wanted to import other models from HuggingFace.
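As a rough sketch of what that import flow looks like with boto3 and the HuggingFace Hub (the bucket name, role ARN, and region below are placeholders I've made up, not values from the post; the actual snippet is in the Gist linked under Resources):

```python
import boto3
from huggingface_hub import snapshot_download

# 1. Pull the model weights and config from HuggingFace to a local folder
local_dir = snapshot_download(repo_id="Qwen/Qwen2.5-VL-7B-Instruct")

# 2. Upload that folder to S3, e.g. with the AWS CLI:
#    aws s3 sync <local_dir> s3://my-model-bucket/qwen2.5-vl/

# 3. Kick off a Bedrock Custom Model Import job pointing at the S3 location
bedrock = boto3.client("bedrock", region_name="us-east-1")
job = bedrock.create_model_import_job(
    jobName="qwen2-5-vl-import",
    importedModelName="qwen2-5-vl-instruct",
    roleArn="arn:aws:iam::123456789012:role/BedrockModelImportRole",  # placeholder role
    modelDataSource={"s3DataSource": {"s3Uri": "s3://my-model-bucket/qwen2.5-vl/"}},
)
print(job["jobArn"])  # track the import job until it completes
```

Once the job finishes, the imported model shows up in Bedrock and can be invoked like any other custom model.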
If you’re looking for support with your next AI project, drop me a message or reach out to the MakeOps Team.
Resources
The code snippet mentioned is in this GitHub Gist
HuggingFace model card for Qwen2.5 VL Instruct
Original blog post: Deploy Qwen models with Amazon Bedrock Custom Model Import