Selecting local AI models for me and my colleagues in the network analysis field

Published: February 16, 2026 at 11:20 PM EST
2 min read
Source: Dev.to

I develop in C++, often on‑site with customers. Offline work means no access to ChatGPT, Claude, Gemini, etc. To fill the gap I experimented with a handful of local LLMs. Below is a (still personal) summary of what I tried, what worked, and what didn’t.

How to Test?

  1. Create a Prompt – I first had to decide what I wanted the model to do.
  2. Extract Rankings – I pulled “rankings” from our analyzer, wrote a short description of each column, and added instructions on how to interpret the tables.
  3. Run the Model – Feed the prompt + table data to the LLM and evaluate the output.

From Google to “Open Sesame”

| Model | First Impressions | Result Quality | Notes |
|---|---|---|---|
| Gemma‑3 (Google) | Versatile, fast, lightweight (for me) | Good, but sometimes missed details | Eventually abandoned because of occasional incorrect or overly‑changed answers. |
| Qwen‑3 (thinking model) | Slower, but much better quality than Gemma‑3 | Consistently accurate | Later switched to the instruct variant when it became available. |
| DeepSeek‑R1 | Failure | – | Reports didn’t make sense. |
| LG EXAOne | Decent (on par with Qwen‑3‑Instruct) | – | Not usable commercially (license restriction). |

“Man, it’s way toooooooo slow” – Enter Mistral

The Qwen‑3 thinking model gave great answers but took ages to generate them. I tried to disable its “thinking” mode, but the switch didn’t work. When the Qwen‑3‑Instruct model finally arrived, I turned my attention to Mistral AI.

| Model | Experience |
|---|---|
| Mistral Small (years ago) | Disappointing. |
| Mistral‑3 | Generates solid reports; rarely misses any instruction detail (unlike Gemma‑3). |
| DevStral‑Small 3.2 | Good for short code snippets; lacks FIM (Fill‑in‑the‑Middle) but still useful. |
| Qwen‑3‑Coder | Hallucinated code 100 % of the time (non‑existent STL‑ish calls). |
| Qwen‑3‑Coder‑Next | Correct, but overly verbose – “literary” C++ that adds no value for a small. |

Note: It’s the Lunar New Year holidays (설날) in Korea, so I’ll be taking a rest for now. I hope everyone likes the new setup!
