What happens after you close the browser? Is the model still stored locally?
Source code: https://github.com/nadchif/in-browser-llm-inference
Fantastic tool! Is the limit on the prompt text length model-dependent, or can it be tweaked in the GitHub repo?
It turns out to be a limit in the HTML: you can change the maxlength="512" attribute with the Chrome console to fit your text.
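For reference, a quick console sketch of that tweak; the textarea selector here is an assumption, since the actual element in the repo may be named differently:

    // Assumed selector: grab the prompt box by its current maxlength attribute.
    const promptBox = document.querySelector('textarea[maxlength="512"]');
    if (promptBox) {
      // Raise the limit so longer prompts aren't cut off at 512 characters.
      promptBox.setAttribute('maxlength', '4096');
    }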
Nice tool. I can see this actually being useful, especially in those times when sites go down for no reason.
Very nice
Thanks :)