When a User Input block is configured with a compiled Custom Interface (Interface Designer Beta) and connected to a Chat block, the submission does not hand off.

  1. What problem does this feature request solve?
    The Custom Interface (Interface Designer Beta) cannot hand off to a Chat block. When a compiled custom interface submits user input, the Chat block ignores it and resets to its own fresh interface instead of processing the submission and returning a response. The only workaround is replacing the Chat block with a Generate Text block, which eliminates conversation memory entirely.

  2. What is the use case for this feature?
    Anyone building a branded companion, coach, or assistant app needs both a custom visual entry experience AND a continuous conversation. These are not optional extras — they are the core of what makes a companion app feel like a companion. Right now you have to choose one or the other.

  3. Please describe the functionality of this feature request.
    When a User Input block with a compiled Custom Interface is connected to a Chat block, the submitted variable should be passed into the Chat block as the first user message — triggering the RAG pipeline and returning a response exactly as if the user had typed it directly into the Chat block’s own input. The Chat block should treat the handoff as the conversation opener, not a reset trigger.

  4. Is there anything else we should know?
    The Interface Designer is genuinely impressive. The Chat block with RAG is genuinely powerful. Right now they can’t talk to each other, and that gap makes both significantly less useful. I don’t think I’m the only one hitting this wall.

Hi @rwfelty,

Welcome to the community!

The Custom Interface doesn’t currently hand off directly to a Chat block, but there are two workarounds that can help:

  1. Add a Generate Text block between your User Input block and your Chat block. Set its Output Behavior to Display to User and its Chat History Behavior to Include. The Generate Text block handles the first response and adds it to the conversation history, so by the time the Chat block runs, it already has context to continue from.

  2. Save the Generate Text output to a variable, display it via a Generate Asset block with a button to continue to chat, and reference your variables in the System Prompt. The Chat block that follows will have access to everything in the System Prompt, so it can answer users’ questions based on that context.

Here’s an example of the second approach:
https://app.mindstudio.ai/agents/sample-data-source-query-template-2366e4fc/remix

I’ve also filed a feature request for the use case you’ve described. Appreciate you taking the time to outline the details!