Commit 2f89acb

minor change to text
1 parent 2349678 commit 2f89acb

1 file changed, +1 -1 lines changed


docs/src/moe.md (+1 -1)
@@ -5,7 +5,7 @@
 
 The Mixture of Experts (MOE) Surrogate model represents the interpolating function as a combination of other surrogate models. SurrogatesMOE is a Julia implementation of the [Python version from SMT](https://smt.readthedocs.io/en/latest/_src_docs/applications/moe.html).
 
-MOE is most useful when we have a discontinuous function. For example, let's say we want to build a surrogate for:
+MOE is most useful when we have a discontinuous function. For example, let's say we want to build a surrogate for the following function:
 
 ### 1D Example
 
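For context, the sentence changed in this commit introduces the 1D example that follows in docs/src/moe.md, where an MOE surrogate is fit to a discontinuous function. Below is a minimal sketch of what such an example could look like; the `MOE`, `KrigingStructure`, and `LinearStructure` names and the constructor signature are assumptions about the SurrogatesMOE API and are not part of this commit.

```julia
# Minimal sketch (assumed API): fit an MOE surrogate to a discontinuous 1D function.
using Surrogates
using SurrogatesMOE  # assumed companion package providing the MOE model

# A discontinuous function: jumps from -5 to 5 at x = 0.
discont_1D(x) = x < 0.0 ? -5.0 : 5.0

lb, ub = -1.0, 1.0
x = sample(50, lb, ub, SobolSample())  # quasi-random training points
y = discont_1D.(x)

# Combine two expert surrogate types; the MOE model assigns/blends them per region.
expert_types = [KrigingStructure(p = 1.0, theta = 1.0), LinearStructure()]  # assumed names
moe = MOE(x, y, expert_types)  # assumed constructor signature

moe(0.5)  # evaluate the surrogate at a point
```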