My chat application isn't working

Hi,
My chat application isn't working. When I check it in the browser, I get these errors.

Which SDK are you using?

"name": "@convai/web-sdk",

"version": "0.2.4",


I also updated the SDK. I upgraded to version 1.2.0, but the result is still the same: I type a message, the widget shows "connecting", and I get no response.

Is there any progress? Has the problem been identified? Our manager is waiting for information from us.


The system works with models other than Anthropic's. However, since we get the best results with Anthropic models and have always used them, this problem needs to be fixed. I suspect an access problem on your side related to Anthropic models.

Can you share a bit of info on what you are using (React or vanilla JS), and which fields you are passing in your connect config?

Framework: React (Vite)
SDK Version: @convai/web-sdk v0.2.4
Connect Config:
apiKey: ""
characterId: ""
enableAudio: false

Widget Props:
defaultVoiceMode: false
showScreenShare: false

Connect Config:
{
  apiKey: "your-api-key",
  characterId: "your-character-id",
  enableVideo: false,
}
Widget Props:
<ConvaiWidget
  convaiClient={convaiClient}
  defaultVoiceMode={false}
  showScreenShare={false}
/>

Character responds correctly when using other LLM providers, but does not respond when using Anthropic or Meta LLM models.

SDK Version: @convai/web-sdk v0.2.4

Request Payload shows:

  • llm_provider: "dynamic"

The issue seems to be specific to Anthropic and Meta LLM integrations on Convai’s backend.

Could you please check if there’s an ongoing issue with Anthropic/Meta API connections?

Is it working on convai.com?

Yes, it is working on convai.com. We also have a WebGL (Unity) application and it works there too.

The issue is only with the Web SDK (@convai/web-sdk v0.2.4). Same character, same Anthropic/Meta LLM: it works in the Unity SDK and on convai.com, but not in the Web SDK.

I found more details on the matter. Do you have any updates or a solution?

Core API (/character/getResponse) works perfectly with Anthropic/Meta LLMs. I tested it and got responses successfully.
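For reference, here is roughly how that Core API check can be run. This is a sketch: the endpoint path is the one mentioned above, but the header name and form-field names are my assumptions and should be verified against Convai's API reference.

```javascript
// Sketch of the Core API check described above. Endpoint path is from this
// thread; header and field names are assumptions -- verify against the docs.

// Builds the form fields separately so they can be checked without a network call.
function buildGetResponseFields(characterId, text) {
  return {
    userText: text,
    charID: characterId,
    sessionID: "-1",        // assumption: "-1" starts a new session
    voiceResponse: "False", // text-only reply, matching enableAudio: false
  };
}

async function testGetResponse(apiKey, characterId, text) {
  const body = new FormData();
  for (const [key, value] of Object.entries(
    buildGetResponseFields(characterId, text)
  )) {
    body.append(key, value);
  }
  const res = await fetch("https://api.convai.com/character/getResponse", {
    method: "POST",
    headers: { "CONVAI-API-KEY": apiKey },
    body,
  });
  return res.json();
}
```

Calling `testGetResponse` with the same API key and character ID used in the widget isolates the LLM from the Realtime transport.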

However, Realtime API (WebSocket/LiveKit) does not work with Anthropic/Meta LLMs.

My analysis:

The pipeline seems to break at the LiveKit → WebSocket → Client stage:

  1. Message is sent ✅ (confirmed in the Network tab)

  2. LLM generates a response ✅ (the Core API works, so the LLM is responding)

  3. Response returns through LiveKit → WebSocket ❌ (this is where it fails)

This suggests the WebSocket channel is closing before the LLM response arrives. It only happens with Anthropic/Meta models; OpenAI and Gemini work fine.

Possible cause: Anthropic/Meta responses may have a different format or a longer response time that the Web SDK's LiveKit handler does not handle correctly.

Environment:

  • SDK: @convai/web-sdk v0.2.4 and v1.2.0 (tested both)

  • Works: Unity SDK, convai.com, Core API

  • Fails: Web SDK Realtime API only
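The "channel closes before the response arrives" hypothesis can be illustrated with a small sketch. The function and timeout below are purely illustrative, not actual @convai/web-sdk internals: if the pipeline races the LLM response against a fixed internal timeout, a provider with a longer time-to-first-token would appear silent while faster providers succeed.

```javascript
// Illustrative only -- not actual SDK code. Shows how a fixed internal
// timeout would silently drop slower providers while faster ones still work.
function waitForResponse(responsePromise, timeoutMs) {
  let timer;
  const timeout = new Promise((_, reject) => {
    timer = setTimeout(
      () => reject(new Error(`no response within ${timeoutMs} ms`)),
      timeoutMs
    );
  });
  // Whichever settles first wins; clear the timer either way.
  return Promise.race([responsePromise, timeout]).finally(() =>
    clearTimeout(timer)
  );
}
```

If something like this is happening inside the Realtime pipeline, a longer wait (or streaming partial tokens) would explain why only the slower providers fail.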

Hey, can you try out the 1.3.0 version once? This issue most probably has to do with the lipsync config from the previous versions.

I tried v1.3.0 but the issue persists. Important note: Everything was working fine until Friday.

We have not identified an issue with the WebSDK itself. The same WebSDK is also used in the Playground, and on our side it is working as expected.

At this point, there does not appear to be a need for any LLM-related change.

To help us investigate further, please share your Character ID, your implementation code, and all relevant logs. If this information is private, you can send it to us via DM instead.

Character ID: 390e41f6-2e96-11f1-bddc-42010a7be02c

SDK Version: @convai/web-sdk ^1.3.0 (reports as 1.2.2-beta.4 in invocation_metadata)

Implementation Code:

import React from 'react'
import { useConvaiClient, ConvaiWidget } from '@convai/web-sdk/react'

export default function App() {
  const convaiClient = useConvaiClient({
    apiKey: "[API_KEY]",
    characterId: "390e41f6-2e96-11f1-bddc-42010a7be02c",
    enableAudio: false,
    enableLipsync: true,
  })

  return (
    <div style={{ width: 360, height: 520 }}>
      <ConvaiWidget
        convaiClient={convaiClient}
        defaultVoiceMode={false}
        showScreenShare={false}
      />
    </div>
  )
}


I’ve tried it this way too, but the problem persists. I also tried creating a new project and testing it there.

import { ConvaiWidget, useConvaiClient } from "@convai/web-sdk";

export function App() {
  const convaiClient = useConvaiClient({
    apiKey: import.meta.env.VITE_CONVAI_API_KEY,
    characterId: import.meta.env.VITE_CONVAI_CHARACTER_ID,
    enableVideo: false,
    startWithAudioOn: false,
  });

  return <ConvaiWidget convaiClient={convaiClient} />;
}

export default App;
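Since this version reads the key and ID from Vite environment variables, it also needs a matching `.env` file in the project root; Vite only exposes variables prefixed with `VITE_` to client code. The values below are placeholders, with the character ID taken from earlier in this thread:

```shell
# .env (project root) -- restart the dev server after editing
VITE_CONVAI_API_KEY=your-api-key
VITE_CONVAI_CHARACTER_ID=390e41f6-2e96-11f1-bddc-42010a7be02c
```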

Pushing a fix on the backend side in about an hour. Will ping here once deployed.

I’m trying to chat on your new platform and I’m having the same problem there, so please check your platform as well. It works when I change the LLM model; Anthropic/Meta doesn’t work there either.

Thanks, I’ll be waiting.