I need a general-purpose large language model that I would like to run on my own hardware. The issue is that I have a single 32 GB GPU, so I clearly cannot use very large (10B+ parameter) models. Can someone suggest open-source models that fit this constraint? I know of a few, such as LLaMA 2 and Fairseq, but I want to do a thorough capability analysis of different LLMs and find out which will work best on my GPU.
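
As a starting point, here is a rough back-of-the-envelope sketch (in Python) of the kind of memory check I have in mind: it estimates whether a model of a given parameter count fits in 32 GB at common inference precisions. The model names, parameter counts, and the 20% overhead factor are illustrative assumptions on my part, not measurements.

    # Rough VRAM estimate for inference only: parameters * bytes per weight,
    # plus a fudge factor for activations, KV cache, and framework overhead.
    # All numbers below are approximations, not measurements.

    GPU_MEMORY_GB = 32          # the single GPU I have available
    OVERHEAD_FACTOR = 1.2       # assumed ~20% extra for activations / KV cache

    # (model name, parameter count in billions) -- illustrative entries only
    candidate_models = [
        ("LLaMA 2 7B", 7),
        ("LLaMA 2 13B", 13),
        ("LLaMA 2 70B", 70),
    ]

    # bytes per parameter at common inference precisions
    precisions = {"fp16": 2, "int8": 1, "int4": 0.5}

    for name, billions in candidate_models:
        for precision, bytes_per_param in precisions.items():
            estimated_gb = billions * bytes_per_param * OVERHEAD_FACTOR
            fits = "fits" if estimated_gb <= GPU_MEMORY_GB else "does NOT fit"
            print(f"{name:>12} @ {precision}: ~{estimated_gb:5.1f} GB -> {fits}")

Of course this ignores things like batch size and context length, so I would still like pointers to models (and benchmark results) that people have actually run on a GPU of this size.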
