Poe on X: "Now on Poe: Mixtral 8x22B! This bot, hosted and fine-tuned by @FireworksAI_HQ, is among the first instruct-tuned variants of the new Mixtral 8x22B model. (1/2) https://t.co/oNwaey3Q9J"
Poe on X: "Groq-powered inference for Mixtral is now available on Poe! You can use Mixtral-8x7b-Groq and experience 400 tokens/second responses. (1/2) https://t.co/2oyM54WWjK"
Poe on X: "We’ve added Mixtral 8x7B as a base model for bot creation! All Poe users can now create prompt bots on Mixtral-8x7B-Chat, a chat fine-tuned version of one of the strongest open source models. We can’t wait to see what you build! https://t.co/IYzQqCRRaq"
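
For anyone who wants to call one of these bots from code rather than the Poe app, here is a minimal sketch using the fastapi_poe client library and its get_bot_response streaming helper. The library and an API key from poe.com/api_key are assumptions here (the posts above do not mention them); the bot name Mixtral-8x7b-Groq is taken from the second announcement.

    # Minimal sketch: stream a response from a Poe bot.
    # Assumes the fastapi_poe package is installed and that
    # YOUR_POE_API_KEY is replaced with a real key (hypothetical placeholder).
    import asyncio

    import fastapi_poe as fp

    async def main() -> None:
        api_key = "YOUR_POE_API_KEY"  # hypothetical placeholder
        message = fp.ProtocolMessage(
            role="user",
            content="Summarize mixture-of-experts models in one sentence.",
        )
        # get_bot_response yields partial responses as the bot streams them
        async for partial in fp.get_bot_response(
            messages=[message],
            bot_name="Mixtral-8x7b-Groq",  # bot name from the announcement above
            api_key=api_key,
        ):
            print(partial.text, end="", flush=True)

    if __name__ == "__main__":
        asyncio.run(main())

The same sketch should work for the other announced bots by swapping the bot_name argument, e.g. to Mixtral-8x7B-Chat from the third post.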