Releases: RPBCACUEAIIBH/HexaPA
Reference Injection and 16K context length
Changes:
- Reference injection - The prompt now starts with a section defined on the "Rules" screen, which can be used to refer to the rules, so the AI is more likely to follow them. (A sketch follows below this list.)
- Switched to the latest version of the GPT-3.5 model, released yesterday, and added support for the new 16k variant with 4x the context length. To switch to the 16K variant, set an input token limit greater than 2048. (It should work up to 8192, since HexaPA only does an even split of input and output tokens for now; see the sketch after this list.)
- Rules can now be exported to JSON as presets, but cannot yet be used as presets. (Unfinished feature, may change... A minimal export sketch is shown below.)
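
The idea behind reference injection is just prepending the reference section from the "Rules" screen to whatever gets sent to the model. A minimal sketch, assuming the reference section, context, and user message are plain strings; the function and parameter names here are illustrative, not HexaPA's actual API:

```python
# Hypothetical sketch of reference injection; names are illustrative only.
def build_prompt(reference_section: str, context: str, user_message: str) -> str:
    """Prepend the reference section defined on the "Rules" screen to the prompt."""
    parts = [reference_section, context, user_message]
    # Skip empty parts so an unset reference section adds nothing.
    return "\n\n".join(part for part in parts if part)
```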
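The 16K switch works off the input token limit: anything above 2048 no longer fits the 4k model once input and output are split evenly, so the 16k variant gets selected instead. A sketch of that logic, assuming the OpenAI model names "gpt-3.5-turbo" and "gpt-3.5-turbo-16k"; the function and variable names are assumptions, not HexaPA's actual code:

```python
# Illustrative sketch of the model switch and even input/output split.
def select_model(input_token_limit: int) -> tuple[str, int]:
    """Pick the GPT-3.5 variant and the matching output budget."""
    # An input limit above 2048 exceeds the 4k model's even split,
    # so fall over to the 16k variant (works up to an 8192 input limit).
    model = "gpt-3.5-turbo-16k" if input_token_limit > 2048 else "gpt-3.5-turbo"
    output_token_limit = input_token_limit  # even split of input and output tokens
    return model, output_token_limit


model, output_limit = select_model(4096)  # -> ("gpt-3.5-turbo-16k", 4096)
```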
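The preset export is just the rules written out as JSON; loading them back in is the part that isn't done yet. A minimal sketch, where the file layout and field names are assumptions for illustration only:

```python
# Minimal sketch of exporting rules as a JSON preset; field names are assumed.
import json


def export_rules_preset(rules: list[str], reference_section: str, path: str) -> None:
    """Write the current rules and reference section to a JSON preset file."""
    preset = {"reference_section": reference_section, "rules": rules}
    with open(path, "w", encoding="utf-8") as f:
        json.dump(preset, f, indent=2)
```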
Seems functional on a basic level.
It's about as functional as ChatGPT, but you can already set token and context limits, define rules for the AI, and either generate context automatically or select it manually.
(The rules may or may not be obeyed; still working on that...)