docs/src/moe.md
The Mixture of Experts (MOE) surrogate model represents the interpolating function as a combination of other surrogate models. SurrogatesMOE is a Julia implementation of the [Python version from SMT](https://smt.readthedocs.io/en/latest/_src_docs/applications/moe.html).
MOE is most useful when we have a discontinuous function. For example, let's say we want to build a surrogate for the following function:
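The diff truncates before the function itself, so here is a minimal illustrative sketch instead (not the documentation's original example): a 1-D step function with a jump at `x = 0`, sampled and fit with SurrogatesMOE. The expert-structure names (`KrigingStructure`, `LinearStructure`) and the `MOE(x, y, expert_types)` constructor reflect the package's API as assumed here, and the hyperparameters are placeholders.

```julia
using Surrogates
using SurrogatesMOE

# Illustrative discontinuous test function: a step at x = 0.
discont_1D(x) = x < 0.0 ? -5.0 : 5.0

lb, ub = -1.0, 1.0
x = sample(50, lb, ub, SobolSample())  # low-discrepancy samples from Surrogates
y = discont_1D.(x)

# Assumed SurrogatesMOE usage: each entry of expert_types describes one
# local surrogate; MOE clusters the samples and fits one expert per region,
# so each side of the discontinuity gets its own model.
expert_types = [KrigingStructure(p = [1.0], theta = [1.0]), LinearStructure()]
moe = MOE(x, y, expert_types)

moe(0.5)  # evaluate the fitted surrogate at a new point
```

A single global surrogate would smooth across the jump; letting one expert own each continuous region is what makes MOE a better fit for functions like this.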