Perplexity LLM Not working (API Issue?)

Using Perplexity as the LLM in a block is not working.

I think there is an issue with the API connection?

The error reads: "Model Error: llama-3.1-sonar-small-128k-online: Invalid model 'llama-3.1-sonar-small-128k-online'. Permitted models can be found in the documentation at Home - Perplexity."

Or it will just read "LLM Error".

Using this agent: https://app.mindstudio.ai/agents/55752855-20d2-4cac-86bd-17b3a0206efd/edit

But I think this is a universal error right now.

Those Perplexity models were deprecated by the provider, and we've deprecated them on MindStudio as well and switched over to the replacement models. Thanks!
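
If you want to check which model names Perplexity still accepts, here is a minimal sketch of a direct API call. It assumes the OpenAI-compatible chat completions endpoint at https://api.perplexity.ai/chat/completions, an API key in a PERPLEXITY_API_KEY environment variable, and "sonar" as a currently permitted replacement model; consult the Perplexity documentation for the up-to-date model list.

```python
import os
import requests

# Minimal sketch, assuming the OpenAI-compatible Perplexity endpoint and that
# "sonar" is a currently permitted model. A deprecated name such as
# "llama-3.1-sonar-small-128k-online" is rejected with the "Invalid model"
# error quoted above.
API_URL = "https://api.perplexity.ai/chat/completions"

def ask_perplexity(prompt: str, model: str = "sonar") -> str:
    """Send a single-turn chat completion request and return the reply text."""
    response = requests.post(
        API_URL,
        headers={"Authorization": f"Bearer {os.environ['PERPLEXITY_API_KEY']}"},
        json={
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
        },
        timeout=30,
    )
    # A deprecated or misspelled model name surfaces here as an HTTP 400 error.
    response.raise_for_status()
    return response.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(ask_perplexity("Say hello."))
```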