Selecting local AI models for myself and my colleagues in the network analysis field
Source: Dev.to
I develop in C++, often on‑site with customers. Working offline means no access to ChatGPT, Claude, Gemini, etc. To fill the gap I experimented with a handful of local LLMs. Below is a (still personal) summary of what I tried, what worked, and what didn’t.
How to Test?
- Create a Prompt – I first had to decide what I wanted the model to do.
- Extract Rankings – I pulled “rankings” from our analyzer, wrote a short description of each column, and added instructions on how to interpret the tables.
- Run the Model – I fed the prompt plus the table data to the LLM and evaluated the output.
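The steps above can be sketched in code. This is a minimal, illustrative version of steps 2–3: the column names, wording, and table layout are assumptions for the example, not the actual analyzer output.

```cpp
#include <sstream>
#include <string>
#include <vector>

// Turn analyzer "rankings" into a single prompt string for a local LLM.
// Column names and the interpretation sentence are placeholders.
std::string build_prompt(const std::vector<std::string>& columns,
                         const std::vector<std::vector<std::string>>& rows) {
    std::ostringstream out;
    out << "You are analyzing network traffic rankings.\n"
        << "Columns: ";
    for (size_t i = 0; i < columns.size(); ++i)
        out << (i ? ", " : "") << columns[i];
    out << "\nInterpret higher byte counts as heavier talkers.\n\n";
    for (const auto& row : rows) {
        for (size_t i = 0; i < row.size(); ++i)
            out << (i ? "\t" : "") << row[i];
        out << "\n";
    }
    return out.str();
}
```

The resulting string is what gets handed to each model, so every candidate sees identical input and the comparison stays fair.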
From Google to “Open Sesame”
| Model | First Impressions | Result Quality | Notes |
|---|---|---|---|
| Gemma‑3 (Google) | Versatile, fast, lightweight (for me) | Good, but sometimes missed details | Eventually abandoned because of occasionally incorrect or heavily rewritten answers. |
| Qwen‑3 (thinking model) | Slower, but much better quality than Gemma‑3 | Consistently accurate | Later switched to the instruct variant when it became available. |
| DeepSeek‑R1 | Failure – reports didn’t make sense. | — | — |
| LG EXAOne | Decent (on par with Qwen‑3‑Instruct) | Not usable commercially (license restriction) | — |
“Man, it’s way toooooooo slow” – Enter Mistral
The Qwen‑3 thinking model gave great answers but took ages to generate them. I tried to disable its “thinking” mode, but the switch didn’t work. When the Qwen‑3‑Instruct model finally arrived, I turned my attention to Mistral AI.
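For context, Qwen‑3 exposes a "soft switch" for its reasoning trace: appending `/no_think` to the user turn asks the model to skip thinking. A trivial sketch of that (whether the switch is actually honored depends on the serving stack, and in my case it wasn't):

```cpp
#include <string>

// Qwen-3 soft switch: append /no_think to the user message to request
// that the model skip its chain-of-thought. Support varies by runner.
std::string with_no_think(const std::string& user_prompt) {
    return user_prompt + " /no_think";
}
```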
| Model | Experience |
|---|---|
| Mistral Small (years ago) | Disappointing. |
| Mistral‑3 | Generates solid reports; rarely misses any instruction detail (unlike Gemma‑3). |
| DevStral‑Small 3.2 | Good for short code snippets; lacks FIM (Fill‑in‑the‑Middle) but still useful. |
| Qwen‑3‑Coder | Hallucinated code 100 % of the time (non‑existent STL‑ish calls). |
| Qwen‑3‑Coder‑Next | Correct, but overly verbose – “literary” C++ that adds no value for small snippets. |
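To clarify the FIM remark in the table: fill‑in‑the‑middle asks the model to complete code *between* a given prefix and suffix, which is what editor‑style completion needs. The sentinel tokens below follow the Qwen2.5‑Coder convention; other model families use different sentinels, and DevStral‑Small 3.2 offers none.

```cpp
#include <string>

// Build a FIM prompt: the model is expected to emit the code that
// belongs between the prefix and the suffix. Sentinel token names
// are the Qwen2.5-Coder ones; adjust for your model family.
std::string make_fim_prompt(const std::string& prefix,
                            const std::string& suffix) {
    return "<|fim_prefix|>" + prefix +
           "<|fim_suffix|>" + suffix +
           "<|fim_middle|>";
}
```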
Note: It’s the Lunar New Year holidays (설날) in Korea, so I’ll be taking a rest for now. I hope everyone likes the new setup!