An award-winning devportal is more than words
Source: Dev.to

“A reflection on 3 years as a Head of Developer Content.”
The Viam docs cover a large and complex developer platform, SDKs in multiple languages, and APIs that move at the speed of, well, a startup. That work has earned the docs several awards.
But where did it start? This Web Archive capture shows what they looked like three years ago, before I started.
My team and I got to work, and a year after that humble starting point the docs had transformed into something worth reading. We won multiple awards at the Devportal Awards, including the Best Overall SME Devportal award — see the LinkedIn post. We were on the right track.
Improving the docs website
A good docs portal isn’t just about technical writing; it’s also a website that users interact with daily. With that in mind, we added several usability features.
- Dynamic elements for filtering changelog entries, tutorials, and modules (the latter long before the product offered it). A combination of Typesense (an open‑source search engine) and Algolia’s InstantSearch.js (an open‑source UI library) let us add these components to an otherwise static site.
- In‑text glossary items that show more info on hover. Hugo is a wonderful framework for building docs sites when you want customisation without a full front‑end framework. I leaned on the work the Kubernetes docs writers had built (also with Hugo).
- GIFs to show features and hardware in action. I decided early on to rely on imagery to bring the product to life, because hardware actuation is best conveyed visually. With Hugo I added shortcodes that serve GIFs as bandwidth‑friendly videos, and I used Hugo extensively to enforce SEO rules.
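As a rough illustration, a GIF‑to‑video shortcode in Hugo can be a small HTML template. The shortcode name `gif-video` and its parameters below are hypothetical, not Viam’s actual implementation:

```html
<!-- layouts/shortcodes/gif-video.html — illustrative sketch only -->
<!-- Serves an MP4 rendition of a GIF as an autoplaying, looping, muted video. -->
<video autoplay loop muted playsinline preload="metadata"
       {{ with .Get "class" }}class="{{ . }}"{{ end }}>
  <source src="{{ .Get "src" }}" type="video/mp4">
  {{ with .Get "alt" }}<p>{{ . }}</p>{{ end }}
</video>
```

A page would then embed it with something like `{{</* gif-video src="arm-demo.mp4" alt="A robot arm picking up a block" */>}}`, keeping the Markdown source clean while serving far less bandwidth than an animated GIF.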
Automation with GitHub Actions
Keeping the documentation surface area up‑to‑date for a complex platform while also maintaining a user‑facing web application is no small feat for a small team. We relied heavily on automation, primarily via GitHub Actions:
-
Style‑guide checking with Vale – a linter focused on prose. After creating regexes that cover our style‑guide rules, Vale automatically flags violations. I first discovered Vale at MongoDB and used it to write the rules that are still in use today (my Vale repo; later released by the community as the mongodb‑vale‑action).
-
Code‑sample testing – I built a testing suite that places code samples in files with project setup/teardown and markup to delineate each sample. GitHub Actions and Bash scripts then run the tests and generate the final samples from the markup. The upfront effort paid off: end‑to‑end tests catch API changes and outages, giving us confidence in the published code.
-
Dead‑link detection & forwarding –
htmltestis great for broken‑link detection, but handling links that have moved required extra work. I solved this with a Netlify integration (the “No more 404s” plugin). Netlify also helped us move from HTML‑level redirects to proper HTTP‑level forwarding.
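To give a sense of what a Vale rule looks like, here is a minimal sketch of an `existence`-style rule. The style name and flagged tokens are illustrative, not taken from the actual rule set:

```yaml
# styles/Hypothetical/Wordiness.yml — illustrative only
extends: existence
message: "Avoid '%s'; prefer a plainer verb."
level: warning
ignorecase: true
tokens:
  - utilize
  - leverage
  - in order to
```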
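The markup-delineation idea behind the code‑sample suite can be sketched in a few lines of Python. The `# ::start::`/`# ::end::` markers and the file layout here are hypothetical, not the actual suite:

```python
import re

# Hypothetical markers delineating a publishable sample inside a runnable
# test file; everything outside the markers is setup/teardown.
START = re.compile(r"^\s*# ::start (?P<name>[\w-]+)::\s*$")
END = re.compile(r"^\s*# ::end::\s*$")

def extract_samples(source: str) -> dict[str, str]:
    """Return {sample_name: code} for every marked region in a test file."""
    samples: dict[str, str] = {}
    name, buffer = None, []
    for line in source.splitlines():
        if (m := START.match(line)):
            name, buffer = m.group("name"), []
        elif END.match(line) and name is not None:
            samples[name] = "\n".join(buffer)
            name = None
        elif name is not None:
            buffer.append(line)
    return samples

test_file = """\
client = connect()  # setup, not published
# ::start move-arm::
arm.move_to_position(pose)
# ::end::
client.close()  # teardown, not published
"""
print(extract_samples(test_file))
```

Because the full file runs end‑to‑end in CI, the published snippet is guaranteed to have executed against the real API before it reaches the docs.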
Generative AI
We also experimented heavily with generative AI. I built a proof‑of‑concept tool that updates docs based on information from a PR, and later discovered a vendor that does this even better. While AI‑generated content still requires rigorous review, it’s a promising direction for the future. AI chat models have become far more advanced, and I was surprised by how useful they can be—provided they’re used with diligence.
Emergence in Fact‑Checking
I find myself wishing that people viewed generative‑AI output as a different kind of search, because that perspective keeps the need for critical review front‑and‑center. As an industry we still have work to do improving user literacy around how generative AI works—users too often expect capabilities it doesn’t have and even entrust AI chats with passwords and secrets.
The learnings from our AI usage could easily fill multiple blog posts, so I’ll keep it brief: if you’re looking for pre‑built, embeddable AI‑chat tooling, I recommend Inkeep.
A Single Source of Truth: SDK Docs and “Main” Docs
We leaned on automatic documentation generation to reduce our workload; for example, we generated parts of the main platform documentation from the SDK docs. When I realized people were copying code samples back and forth between the SDK docs and the main docs, I pushed for a single source of truth.
- Where code samples belong: Samples that demonstrate how to use individual API functions should live with the SDK.
- How we achieved it: The SDK docs are generated from the docstrings in the SDK code, so the documentation is colocated with the code and more likely to stay up‑to‑date.
- The workflow: We built a system to ingest the SDK docs and generate parts of the main docs automatically. As the system ran, we could see and review new changes and edit the SDK docs when needed.
The result was better consistency and less manual work.
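The ingest step can be sketched as follows, assuming a Python SDK and a hypothetical Markdown layout (the function and formatting here are illustrative, not the actual pipeline):

```python
import inspect

# Hypothetical SDK function standing in for a real module's API surface.
def move_to_position(pose):
    """Move the arm to the given pose.

    Raises an error if the pose is unreachable.
    """

def to_markdown(func) -> str:
    """Render one SDK function's signature and docstring as Markdown."""
    sig = inspect.signature(func)
    doc = inspect.getdoc(func) or "(undocumented)"
    return f"### `{func.__name__}{sig}`\n\n{doc}\n"

print(to_markdown(move_to_position))
```

Because the generator reads docstrings straight from the SDK source, a fix made next to the code flows into the main docs on the next run, rather than drifting in a hand-copied duplicate.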
Conclusion
While the docs weren’t the only thing I worked on, they are the work I’m most proud of. Three years later, as I hand over the documentation, there is still work to be done—but I’m happy with what we accomplished.
Take a look for yourself:




