I have a docker installation of Llama - rubbish.
I tried Gemini - if anything, it's worse, with added censorship. I asked it to translate "ass" (as in donkey) and it won't!
So far I can do better with a search engine
LLMs just aren’t that smart, so I doubt that. They can work well in certain scenarios, but expecting the world from them isn’t ideal. Think of them as auto complete on steroids or a lazy person’s summary of a given topic, not the revolutionary next step in computing as a whole.
Agreed, I was just playing around. Claude is the same as Gemini: prudish, and its knowledge only goes up to April 2024, so don't ask it anything current.
Try Uncensored Models
For example
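Here's a minimal sketch of what that could look like, assuming a local Ollama setup exposing its default HTTP API; the model tag is only an illustrative assumption, swap in whichever uncensored model you've actually pulled:

```python
import requests

# Query a locally hosted model via Ollama's HTTP API.
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    # Assumption: replace with the uncensored model tag you pulled locally.
    "model": "llama2-uncensored",
    "prompt": "Translate the word 'ass' (as in donkey) into French.",
    "stream": False,  # return one JSON object instead of a token stream
}

response = requests.post(OLLAMA_URL, json=payload, timeout=120)
response.raise_for_status()
print(response.json()["response"])
```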
Occasionally I refer to them as a naive brute force approach, especially in terms of the training requirements.
And as they're optimized to generate fast responses, the first, quick answer isn't necessarily the best solution. Just imagine a team of leading scientists who have to answer detailed questions… precisely.
From my point of view, playing around with them without serious intent is essentially a waste of computation time. It may not affect your power bill directly, but it's wasteful.
Spicy auto complete
I believe there are uses where an LLM can come in handy. The biggest of these is actually coding. I have tried JetBrains' implementation of that and it has been pretty good overall. It still makes mistakes, but it removes all of the boring boilerplate some languages have.
I do overall agree that LLMs, at the moment, don’t have much of a reason to exist.
No, there aren't, if for no other reason than that the datasets they're "taught" on are trash.
Neeva was the only one I found pretty decent. But it often gave "unpopular" (correct) answers, so it was shut down. Because reasons.