al4s@feddit.de to Selfhosted@lemmy.world · Is it possible to run a LLM on a mini-pc like the GMKtec K8 and K9? (English)
5 months ago

LLMs work by always predicting the most likely next token, and LLM detection works by checking how often the most likely next token was actually chosen. You can tell the LLM to pick less likely tokens more often (turn up the temperature parameter), but then you'll only get gibberish out. So no, there is not.
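A minimal sketch of what "turning up the temperature" does, in plain Python with made-up logits (no real model involved): dividing the logits by the temperature before softmax flattens the distribution, so unlikely tokens get sampled more often.

```python
import math
import random

def sample_next_token(logits, temperature=1.0):
    """Sample a token index from logits after temperature scaling.

    Low temperature -> distribution sharpens toward the most likely
    token (approaches greedy argmax). High temperature -> distribution
    flattens, so less likely tokens are chosen more often.
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return random.choices(range(len(logits)), weights=probs, k=1)[0]

# Made-up logits for a tiny 4-token vocabulary
logits = [4.0, 2.0, 1.0, 0.5]

# Near-zero temperature: effectively always picks token 0 (the argmax)
greedy = sample_next_token(logits, temperature=0.01)

# High temperature: any of the 4 tokens can come out
noisy = sample_next_token(logits, temperature=10.0)
```

This is why cranking up the temperature degrades output quality: the model stops preferring its best guesses and starts emitting near-random tokens.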
Skill issue