cool models backup
A collection (2 items) backing up cool models for the community.
Version 1.2 of the Venus 120b lineup. Credit to nsfwthrowitaway69, the original author!
The exl2-* branches contain ExLlamaV2 quantizations. The 4.85 bpw quant should fit in 80 GB of VRAM, and the 3.0 bpw quant should (just barely) fit in 48 GB of VRAM with 4k context.

Warning: This model will produce NSFW content!
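The VRAM figures above can be sanity-checked with a quick back-of-the-envelope calculation. This is a rough sketch that counts weight storage only, ignoring the KV cache and runtime overhead (which is why the 3.0 bpw quant is "just barely" a fit at 4k context); the flat 120 billion parameter count is a nominal assumption taken from the model's name:

```python
def quant_weight_gb(n_params: float, bpw: float) -> float:
    """Approximate weight storage in decimal GB for a quantized model."""
    return n_params * bpw / 8 / 1e9  # bits -> bytes -> gigabytes

# Nominal parameter count (assumption; real count may differ slightly).
params = 120e9

print(round(quant_weight_gb(params, 4.85), 2))  # ~72.75 GB of weights -> fits in 80 GB
print(round(quant_weight_gb(params, 3.0), 2))   # 45.0 GB of weights -> tight in 48 GB
```

The gap between the weight footprint and total VRAM is what the KV cache and activation buffers consume, which is why the smaller quant only works at a modest 4k context.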
This version is better at following instructions than both v1.1 and v1.0, and doesn't seem to suffer from censorship issues. Overall, the original author likes this version the most of all the models they've created.