Where's the "Prompting Expert Mode"? #4437
-
I just started using Dify and saw references to a "Prompting Expert Mode", but I can't find it anywhere in the latest version. After a quick search, it looks like expert mode has been removed.
Any help is appreciated.
-
You can use the LLM block in chatflows and workflows. In the LLM block you can set SYSTEM, USER, and ASSISTANT prompts.
Thank you for the reply. Yes, the LLM block allows input for those prompts, but they're high-level abstractions. I want direct control over the raw prompt that will be sent to the LLM.
I did some research, and it looks like Dify just makes API calls to inference backends and probably doesn't have control over the raw prompt. I'm using Xinference, so I guess I need to find out how to set the prompt template there.
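For anyone else landing here: as far as I can tell, the LLM block's prompts just get assembled into an OpenAI-style chat-completions `messages` array, and the backend (e.g. Xinference) renders that into the final raw prompt string using the model's own prompt template. A minimal sketch of roughly what that payload looks like (the model name and prompt text here are illustrative, not from Dify's actual code):

```python
import json

def build_chat_payload(system_prompt, user_prompt, history=()):
    """Assemble an OpenAI-style chat payload, roughly what Dify's LLM
    block sends to an OpenAI-compatible backend. The final raw prompt
    (e.g. ChatML) is rendered server-side from the model's prompt
    template, which is why Dify never exposes it directly."""
    messages = [{"role": "system", "content": system_prompt}]
    for role, content in history:  # prior USER / ASSISTANT turns, if any
        messages.append({"role": role, "content": content})
    messages.append({"role": "user", "content": user_prompt})
    return {"model": "my-model", "messages": messages}  # placeholder model name

payload = build_chat_payload(
    "You are a helpful assistant.",
    "Hello!",
    history=[("user", "Hi"), ("assistant", "Hi! How can I help?")],
)
print(json.dumps(payload, indent=2))
```

So to change the raw prompt, the place to look is the backend's prompt-template configuration rather than Dify itself.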