# Add support for Cerebras as an inference provider

## About Cerebras
At Cerebras, we've developed the world's largest and fastest AI processor, the Wafer-Scale Engine-3 (WSE-3). The Cerebras CS-3 system, powered by the WSE-3, represents a new class of AI supercomputer that sets the standard for generative AI training and inference with unparalleled performance and scalability.
With Cerebras as your inference provider, you can:

- Achieve unprecedented speed for AI inference workloads
- Build commercially with high throughput
- Effortlessly scale your AI workloads with our seamless clustering technology
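For context, here is a minimal sketch of what calling Cerebras inference could look like through an OpenAI-compatible client. The endpoint URL, model id, and environment variable name are assumptions for illustration and should be verified against Cerebras's current documentation.

```python
# Minimal sketch: Cerebras inference via an OpenAI-compatible client.
# The base URL, model id, and CEREBRAS_API_KEY env var are illustrative assumptions.
import os

from openai import OpenAI

client = OpenAI(
    base_url="https://api.cerebras.ai/v1",   # assumed OpenAI-compatible endpoint
    api_key=os.environ["CEREBRAS_API_KEY"],  # hypothetical env var holding the API key
)

response = client.chat.completions.create(
    model="llama3.1-8b",  # example model id; actual ids come from Cerebras docs
    messages=[{"role": "user", "content": "Hello from a Cerebras-backed provider!"}],
)

print(response.choices[0].message.content)
```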