Not exactly an application (well, there is manga_ocr, I suppose), but I have discovered a truly awesome use for LLMs: translation.
With the right LLM you can get quality comparable to DeepL (and better than Google Translate) for back-and-forth translation between a lot of languages.
And the seemingly best model for this currently is this:
Which is an uncensored version of this:
Because we don’t want our translator to pretend impolite text is polite lol (wtf is wrong with the morons censoring translation tools?)
It requires about 8GB of VRAM to run purely on the GPU (with less you can still run it, but it will be slower).
Using a mix of ollama and bash, I’ve worked out a nice way to integrate this into my terminal. (I opted to put this stuff in my bash aliases instead of making it a script.) I added some comments so that you can easily follow it.
# Start the ollama server in the background (no-op if it is already up)
_run_ollama(){
    if [ "$(curl -s http://localhost:11434)" != "Ollama is running" ]; then
        export OLLAMA_KEEP_ALIVE=30 # Seconds of inactivity before the model is unloaded to free up VRAM
        export OLLAMA_FLASH_ATTENTION=1
        export OLLAMA_CONTEXT_LENGTH=32768 # Context size in tokens; roughly 4 * this = max characters in prompt
        ollama serve > /dev/null 2>&1 &
        echo "$!" # Hand the server PID back to the caller
        # Wait until the server actually answers instead of sleeping blindly
        until [ "$(curl -s http://localhost:11434)" = "Ollama is running" ]; do
            sleep 1
        done
    fi
}
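The check above keys off the exact plain-text reply that ollama’s root endpoint gives when the server is up. Here is the same logic in isolation with the response stubbed out, so you can see what it matches against without a server running (the helper name is mine, just for illustration):

```shell
# `curl -s http://localhost:11434` prints exactly this string when the server is up;
# anything else (including curl's empty output when nothing is listening) means "start it".
server_up(){
    [ "$1" = "Ollama is running" ]
}
server_up "Ollama is running" && echo "already up"       # prints: already up
server_up "" || echo "would start ollama serve"          # prints: would start ollama serve
```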
# Translate text to English
translate-text() {
    local MODEL='hy-mt-c-7b-abl:Q8' # https://huggingface.co/mradermacher/Huihui-Hunyuan-MT-Chimera-7B-abliterated-GGUF
    local PROMPT="Translate the following text to English: $*" # $* joins all arguments into one string
    local OLLAMA
    OLLAMA=$(_run_ollama)
    # Stop any other loaded model to free up VRAM
    ollama ps | awk -v exclude="$MODEL" 'NR>1 && $1 != exclude {print $1}' | xargs -I {} ollama stop {}
    ollama run "$MODEL" "$PROMPT"
}
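The ollama ps / awk / xargs pipeline is the VRAM housekeeping: it stops every loaded model except the one about to run. Here it is in isolation against a mocked-up listing (the rows are invented for the demo; the real ollama ps output has the same shape, a header line followed by one row per loaded model):

```shell
# Mocked-up `ollama ps` output: header line, then one row per loaded model
ps_output='NAME                ID      SIZE    PROCESSOR  UNTIL
g3-27b-abl:Q4       def456  17 GB   100% GPU   30 seconds from now
hy-mt-c-7b-abl:Q8   abc123  8.9 GB  100% GPU   30 seconds from now'

# NR>1 skips the header; $1 != exclude spares the model we are about to run
echo "$ps_output" | awk -v exclude='hy-mt-c-7b-abl:Q8' 'NR>1 && $1 != exclude {print $1}'
# prints: g3-27b-abl:Q4
```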
# Translate text files, and save the results to a subdirectory under their original names
translate-file() {
    local OUTDIR='translated'
    local OLLAMA
    OLLAMA=$(_run_ollama)
    mkdir -p "$OUTDIR"
    for file in "$@"; do # Quoted, so filenames containing spaces survive
        if [[ -f "$file" ]]; then
            echo "$file: $(cat "$file")"
            # basename flattens any path, so files given as dir/page.txt still land in $OUTDIR
            translate-text "$(cat "$file")" | tee "$OUTDIR/$(basename "$file")"
        fi
    done
}
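One bash footgun worth knowing about in functions like this: whether a loop iterates over $@ or "$@" decides how filenames with spaces are treated. A self-contained demonstration (the function is purely for illustration):

```shell
split_demo(){
    local n=0
    for f in $@; do n=$((n+1)); done    # Unquoted: word-splits on spaces
    echo "unquoted: $n"
    n=0
    for f in "$@"; do n=$((n+1)); done  # Quoted: one iteration per argument
    echo "quoted: $n"
}
split_demo "my manga page.txt" "notes.txt"
# prints: unquoted: 4
#         quoted: 2
```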
Also, I created two more handy functions: one to shoot a quick question at an AI from the terminal without all the usual fuss, and a more elaborate translation method.
# Ask a local LLM something from the terminal
# The model I am using here requires 16+GB VRAM for palatable speed, but it is probably the best general-purpose model you can run on a 16GB GPU at the time of writing.
query(){
    local MODEL='g3-27b-abl:Q4' # https://ollama.com/pidrilkin/gemma3_27b_abliterated
    local PROMPT="$*"
    local OLLAMA
    OLLAMA=$(_run_ollama)
    # Stop any other model to free up VRAM
    ollama ps | awk -v exclude="$MODEL" 'NR>1 && $1 != exclude {print $1}' | xargs -I {} ollama stop {}
    ollama run "$MODEL" "$PROMPT"
    if [ -n "$OLLAMA" ]; then
        echo "ollama server PID: $OLLAMA" # Only set when this call started the server
    fi
}
# Compare translations from multiple models; maybe I'll add Google Translate and DeepL as well later
translate-text-analyze(){
    local OLLAMA
    OLLAMA=$(_run_ollama)
    local MODELS=('hy-mt-c-7b-abl:Q8' 'floppa-g3-12b-unc:Q8' 'translategemma-12b-it:Q8' 'g3-27b-abl:Q4')
    local PROMPT="Translate the following text to English: $*"
    local PROMPT_ALT="Translate the following text to English, return only the translation and nothing else: $*"
    local PROMPT_ROMAJI="Convert this to romaji: $*"
    for model in "${MODELS[@]}"; do
        # Unload whatever is currently loaded before switching models
        ollama ps | awk 'NR>1 {print $1}' | xargs -I {} ollama stop {}
        echo "$model:"
        if [[ "$model" == *"translategemma"* ]] || [[ "$model" == *"g3-27b-abl"* ]]; then
            ollama run "$model" "$PROMPT_ALT" # These two need to be told not to add commentary
        else
            ollama run "$model" "$PROMPT"
        fi
    done
    # Only bother with a romaji reading when the input actually contains Japanese script
    if echo "$*" | perl -CSD -ne 'exit !/\p{Hiragana}|\p{Katakana}|\p{Han}/' 2>/dev/null; then
        echo "Romaji:"
        ollama run "${MODELS[-1]}" "$PROMPT_ROMAJI" # Negative array index needs bash 4.3+
    fi
}
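The perl one-liner at the end is what decides whether romaji is worth printing: it exits 0 only when the input contains hiragana, katakana, or kanji. Wrapped up on its own so it can be tried directly (the function name is mine):

```shell
has_japanese(){
    # Exit status 0 if the text contains hiragana, katakana, or kanji (-CSD = UTF-8 stdin/stdout)
    echo "$1" | perl -CSD -ne 'exit !/\p{Hiragana}|\p{Katakana}|\p{Han}/'
}
has_japanese 'こんにちは世界' && echo 'japanese detected'   # prints: japanese detected
has_japanese 'hello world' || echo 'no japanese'            # prints: no japanese
```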
Lastly, I have also discovered manga_ocr, which is an OCR model built specifically for grabbing Japanese text; for this task it is way better than tesseract, although it’s not perfect.
With this combination I have the beginnings of a process for translating stuff, in particular from Japanese to English.

I am probably gonna refine it more though. manga_ocr is good at reading the text, but it’s terrible in every other way (I mean, are you kidding me? You just run it in the background and it monitors your clipboard). I’m gonna have to tweak it so that I can just screenshot the manga and get the text straight to the clipboard or something.
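A rough sketch of what that could look like: drag-select a region of the screen, OCR it, and drop the result into the clipboard. This assumes maim for the screenshot, xclip for the clipboard, and manga_ocr’s documented Python API (MangaOcr()(image_path)); all three tool choices are my own, not something the pipeline already does:

```shell
# Hypothetical screenshot-to-clipboard helper; assumes maim, xclip and manga_ocr are installed
ocr-region(){
    local shot
    shot=$(mktemp --suffix=.png) || return 1
    maim -s "$shot" || return 1     # Drag-select the manga panel
    python3 -c "
from manga_ocr import MangaOcr
print(MangaOcr()(r'$shot'), end='')
" | xclip -selection clipboard      # Grabbed text lands in the clipboard
    rm -f "$shot"
}
```

Bound to a hotkey in your window manager, that would get you from manga panel to translatable text in one keystroke.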
Finally, something the LLMs are indisputably good at. And doing this is generally gonna turn out better than using an official translation, because the LLM is just translating the text and not injecting its political opinions into it.
