### What would you like to see?

- In some cases, users may want to output the response from an agent flow directly to the chat window
- This is useful for smaller models that do not handle large amounts of data well
- It prevents the LLM from reprocessing the data before it is sent back to the user (see the sketch below)
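
To make the requested behavior concrete, here is a minimal, hypothetical sketch of the two paths. None of these names (`run_tool`, `llm_summarize`, `run_flow`, `direct_output`) come from the project; they only illustrate the difference between returning the flow's output as-is and feeding it back through the LLM first.

```python
def run_tool(user_message: str) -> str:
    """Stand-in for whatever the agent flow produces (API call, DB query, etc.)."""
    return f"raw tool output for: {user_message}"


def llm_summarize(text: str) -> str:
    """Stand-in for the extra LLM pass that smaller models often handle poorly."""
    return f"LLM-rewritten version of: {text}"


def run_flow(user_message: str, direct_output: bool = True) -> str:
    raw_result = run_tool(user_message)
    if direct_output:
        # Requested behavior: send the agent flow's response straight to the
        # chat window, with no further LLM processing.
        return raw_result
    # Current behavior: the LLM reprocesses the data before it reaches the user.
    return llm_summarize(raw_result)


if __name__ == "__main__":
    print(run_flow("list open orders"))
```

In practice this could be exposed as a per-node or per-flow toggle (the `direct_output` flag above is purely illustrative), so existing flows that rely on LLM post-processing keep working unchanged.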