This repository was archived by the owner on Dec 1, 2021. It is now read-only.
Just a related question:
Is there a possibility of using multiple FPGAs for inference?
(Basically, if the model is large, two FPGAs could each run inference for a different model, with the results synced into a combined output.)
Currently we don't have any plan to use multiple FPGAs for inference.
Using a bigger FPGA is an easier way to handle a large model than splitting it across multiple FPGAs.
Right now the runtime supports only one inference on the FPGA.
But running two models at the same time is sometimes useful, so we might need a feature for that.
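To make the "results are synced for combined output" idea concrete, here is a minimal host-side sketch. It assumes a hypothetical setup where each device (or model) exposes its own inference call; the `infer_model_a`/`infer_model_b` functions below are stand-ins (here just returning fixed probability vectors), not part of any real runtime API. The host runs both in parallel threads and averages the outputs:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-ins for two per-device inference calls.
# In a real multi-FPGA setup each call would dispatch to its own device.
def infer_model_a(x):
    return np.array([0.2, 0.8])  # model A's class probabilities (dummy)

def infer_model_b(x):
    return np.array([0.4, 0.6])  # model B's class probabilities (dummy)

def combined_inference(x):
    # Run both models concurrently, then sync by averaging their outputs.
    with ThreadPoolExecutor(max_workers=2) as pool:
        future_a = pool.submit(infer_model_a, x)
        future_b = pool.submit(infer_model_b, x)
        return (future_a.result() + future_b.result()) / 2.0

print(combined_inference(np.zeros(4)))  # [0.3 0.7]
```

Averaging is just one way to combine results; voting or concatenating partial outputs would follow the same pattern of parallel dispatch plus a host-side merge step.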