

Right now they are already working better than the last models, but they're still gonna take some training, maybe a week? In the meantime I'm working on the GUI and a new release.

VRAM is starting to be a problem. I've been using a 16GB card, and it's starting to run out of memory 😬
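For a rough sense of why training eats VRAM so fast, here's a small back-of-the-envelope sketch (not from the post; the shapes, batch size, and resolution are illustrative) of what a single float32 activation tensor costs. During training, every layer's intermediate activations are kept around for backpropagation, so this per-tensor cost gets multiplied many times over:

```python
def batch_vram_gib(n: int, c: int, h: int, w: int, bytes_per_el: int = 4) -> float:
    """Rough VRAM (in GiB) for ONE float32 activation tensor of shape (n, c, h, w).

    Training keeps many such tensors alive at once (one-plus per layer,
    plus gradients and optimizer state), so total usage is far higher.
    """
    return n * c * h * w * bytes_per_el / 2**30

# Example: a batch of 4 RGB frames at 1080p.
cost = batch_vram_gib(4, 3, 1080, 1920)
print(f"{cost:.3f} GiB per activation tensor")
```

Even though one tensor looks tiny next to 16GB, a deep network holding hundreds of activations, gradients, and optimizer buffers can exhaust the card quickly, which is why only training (not inference) hits this wall.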

After this model is done, maybe I can think about renting a better card. We'll see.

Comments

Anonymous

Damn, gonna have to upgrade my 3060! Lol!

DAINAPP

haha, no worries, this is only for training. Inference uses the same VRAM as always.