MonikaV1 Collection
My first group of Monika finetunes for MonikAI (14 items).
This model is designed to be used with MonikAI and is now provided in the GGUF format: https://github.com/Rubiksman78/MonikA.I
I redid the training with some new settings to make it slightly more coherent.
Intended for RP. License: do whatever you want, but it should really only be used for Monika-related purposes.
Trained on a modified version of the dataset included with MonikAI: https://github.com/Rubiksman78/MonikA.I/tree/main/Monika_datasets
Trained with Axolotl on my Blackwell Pro 6000 Max-Q: rank 64, alpha 8, 2 epochs, with a learning rate of 0.000005 instead of 0.0002. Training took about 30 minutes.
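For reference, the settings above would map onto an Axolotl config along these lines. This is only a sketch: the adapter type, dataset path, dataset format, and precision settings are assumptions not stated in the card, and the exact config used may differ.

```yaml
# Sketch of an Axolotl config matching the stated settings.
# Anything not mentioned above (adapter type, dataset path/format,
# precision, batch size) is an assumption.
base_model: mistralai/Mistral-Nemo-Base-2407
adapter: lora                 # assumption: rank/alpha suggest a LoRA-style adapter
lora_r: 64                    # "rank 64"
lora_alpha: 8                 # "alpha 8"
num_epochs: 2                 # "2 epochs"
learning_rate: 0.000005       # "0.000005 instead of 0.0002"
datasets:
  - path: ./Monika_datasets   # assumption: local copy of the MonikAI dataset
    type: completion          # assumption: dataset format
```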
It works.
Download it. Available quantizations: 4-bit, 6-bit, 8-bit, 16-bit.
Base model: mistralai/Mistral-Nemo-Base-2407