Allows CustomFunction to be set as an Ending Node. #1591
Conversation
interesting, what is your use case for having a custom function as the ending node?
@HenryHengZJ This is the case: my chatflow uses an LLM to generate SQL, passes it to a CustomFunction that sends it to the database, and finally constructs the final JSON result from that CustomFunction.
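For illustration, a hypothetical sketch of such an ending CustomFunction (not actual Flowise code; queryDatabase and the returned shape are made-up stand-ins for whatever database client the flow really uses):

```typescript
// Hypothetical illustration of the described flow, not actual Flowise code:
// an LLM produces SQL, the custom function runs it against the database and
// shapes the final JSON payload returned by the chatflow.
async function queryDatabase(sql: string): Promise<Array<Record<string, unknown>>> {
    // a real pg / mysql2 / sqlite call would go here; placeholder data instead
    return [{ id: 1, name: 'example', query: sql }]
}

async function endingCustomFunction(generatedSql: string): Promise<string> {
    const rows = await queryDatabase(generatedSql)
    // This JSON string becomes the chatflow's final output instead of chat text.
    return JSON.stringify({ sql: generatedSql, rows })
}

// endingCustomFunction('SELECT id, name FROM users LIMIT 1').then(console.log)
```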
@HenryHengZJ and @mokeyish, could this PR also solve this additional use case?:
This is interesting. It allows Flowise to return a custom API output! I can see a number of useful benefits for consumption by other APIs or custom chat handling. One thing I can think of for a current project I'm working on would be to return additional attribute(s) in the payload so that I can then perform additional logic on the client based on output beyond the "text" value returned to the chat bot.
I feel like it might make sense to include an additional API entry point distinct from prediction. Similar to internal-prediction, an additional variable could be added to buildChatflow: isCustomOutput. By doing this we should be able to simplify the logic in buildChatflow as well, reducing the changes to packages/server/src/index.ts and allowing additional validation logic to be added in the future (e.g. define a flow as custom output and only allow it to be called from the new entry point and not from chat by mistake <<< not something needed to launch a feature like this, but someone could certainly make the mistake of trying to chat with a custom output). Perhaps something along the lines of /api/v1/custom-prediction/:id would work?

@mokeyish Minor TS adjustment on the change to packages/components/nodes/utilities/CustomFunction/CustomFunction.ts: the value of isEndingNode on line 131 ends up being iffy (boolean | 0 | undefined) instead of just boolean. While 0 and undefined are both falsy, typecasting to boolean and using ternary logic makes the boolean clear. We can also leave off the check of Object.keys().length, since we're already checking that nodeData.outputs exists and it can only be undefined or an Object; an object with no attributes will still return an array to Object.values().
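For illustration, a minimal sketch of the typing point above (the INodeData interface, the sample values, and the 'EndingNode' string are simplified stand-ins, not the actual CustomFunction.ts code):

```typescript
// Simplified stand-in for the node data shape; illustrative only.
interface INodeData {
    outputs?: Record<string, string>
}

const nodeData: INodeData = { outputs: { output: 'EndingNode' } }

// Mixed && chain: the inferred type is boolean | 0 | undefined,
// because the falsy part of each operand's type leaks through.
const iffy = nodeData.outputs && Object.keys(nodeData.outputs).length && nodeData.outputs.output === 'EndingNode'

// Explicit boolean: optional chaining plus a strict comparison yields plain
// boolean, and the Object.keys().length guard becomes unnecessary since an
// empty object simply fails the comparison.
const isEndingNode: boolean = nodeData.outputs?.output === 'EndingNode'

console.log(iffy, isEndingNode)
```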
Also, I'm happy to help out on this PR if needed.
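For concreteness, here is a rough sketch of the separate custom-prediction entry point suggested above. The route path and the isCustomOutput flag come from the comment; the Express wiring and the buildChatflow signature are simplified stand-ins for the real server code:

```typescript
import express, { Request, Response } from 'express'

const router = express.Router()

// Simplified stand-in for the server's buildChatflow; the real function
// resolves the chatflow, runs the graph, and returns either the chat
// response or the raw CustomFunction output depending on isCustomOutput.
async function buildChatflow(req: Request, isInternal: boolean, isCustomOutput: boolean): Promise<unknown> {
    return { text: 'placeholder', internal: isInternal, custom: isCustomOutput }
}

router.post('/api/v1/custom-prediction/:id', async (req: Request, res: Response) => {
    // Only flows explicitly marked as custom output would be served here,
    // keeping the normal prediction endpoint free of this branching.
    const result = await buildChatflow(req, false, true)
    res.json(result)
})

export default router
```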
@Jaredude Thanks, I've updated it. But this is more concise and can also infer the boolean type.
It seems like it should be possible. Can you pull the code from this branch and test it?
@dkindlund I think for your use case we can't do that yet, because you can't take the output from an agent to a custom function, only from an LLMChain at the moment.

@mokeyish exact same thought. That's why we are working on a workflow feature which allows more sequential flows; that way you have a separate endpoint. Having said that, as a near-term solution we can use a custom function as the ending node.
@Jaredude good spot, cus we are hitting both |
In the init function, add a check to see if the output is Ending Node; if yes, don't execute the sandbox stuff. This way we can prevent running both init and run twice.
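For reference, a rough sketch of that check (illustrative only: the class shape, the SimpleNodeData type, and the 'EndingNode' string are stand-ins, not the actual CustomFunction.ts implementation):

```typescript
interface SimpleNodeData {
    outputs?: Record<string, string>
}

class CustomFunctionSketch {
    async init(nodeData: SimpleNodeData, input: string): Promise<unknown> {
        const isEndingNode = nodeData.outputs?.output === 'EndingNode'
        // When used as an Ending Node, skip the sandbox here and let run()
        // execute the user function, so it is not evaluated twice.
        if (isEndingNode) return undefined
        return this.executeSandbox(nodeData, input)
    }

    async run(nodeData: SimpleNodeData, input: string): Promise<unknown> {
        return this.executeSandbox(nodeData, input)
    }

    private async executeSandbox(nodeData: SimpleNodeData, input: string): Promise<unknown> {
        // The real node evaluates the user-supplied JS in a sandbox;
        // a placeholder echo stands in for that here.
        return { nodeOutputs: nodeData.outputs, input }
    }
}

// new CustomFunctionSketch().run({ outputs: { output: 'EndingNode' } }, 'SELECT 1')
```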
@mokeyish better yet, I think we can dual-purpose the logic at the top...

Then the logic on line 1659 is already taken care of. I'll add comments on the PR code.
Review thread on packages/components/nodes/utilities/CustomFunction/CustomFunction.ts (outdated, resolved)
@HenryHengZJ @Jaredude please review again.
I ran this through two separate LLM Chain flows and received the expected outcome. Here's a screenshot.
I also tested validations such as no end node, multiple JS functions, JS code that errors, and also existing node validation (e.g. LLM Chain with an Output Prediction and no end node). Everything is working as expected!
awesome, thanks guys!
closes #1590
With this PR, we can set CustomFunction as an Ending Node.
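For example, a hypothetical consumer of such a flow might look like the sketch below. The base URL, chatflow id, and response handling are assumptions; only the general /api/v1/prediction/:id shape mirrors Flowise's standard prediction endpoint.

```typescript
// Placeholder id; replace with the id of a chatflow ending in a CustomFunction.
const CHATFLOW_ID = 'your-chatflow-id'

async function predict(question: string): Promise<unknown> {
    const res = await fetch(`http://localhost:3000/api/v1/prediction/${CHATFLOW_ID}`, {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ question })
    })
    // With a CustomFunction ending node, the response body is whatever the
    // function returned (e.g. the constructed JSON) rather than plain chat text.
    return res.json()
}

predict('List the five most recent orders').then(console.log)
```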