autograd-cpp

Published: (December 16, 2025 at 09:31 PM EST)
2 min read
Source: Dev.to

Overview

A lightweight, high‑performance C++ automatic differentiation library using computational graphs.

  • Computational Graph‑based AD: Forward and backward propagation through dynamic graphs
  • Jacobian & Hessian: First‑ and second‑order derivative computations
  • Optimizers: SGD with learning‑rate scheduling (linear, exponential, cosine, polynomial)
  • Header‑mostly: Minimal dependencies, easy integration
  • CMake Package: FetchContent support for seamless integration

Installation

Using CMake FetchContent

include(FetchContent)

FetchContent_Declare(
    autograd_cpp
    GIT_REPOSITORY https://github.com/queelius/autograd-cpp.git
    GIT_TAG main
)

set(BUILD_EXAMPLES OFF CACHE BOOL "" FORCE)
set(BUILD_TESTS OFF CACHE BOOL "" FORCE)

FetchContent_MakeAvailable(autograd_cpp)

target_link_libraries(your_app PRIVATE autograd::autograd)

Building from source

git clone https://github.com/queelius/autograd-cpp.git
cd autograd-cpp
mkdir build && cd build
cmake ..
make -j$(nproc)

Run examples

./examples/simple_gradients
./examples/hessian_demo

Requirements

  • C++17 or later
  • CMake 3.14+
  • Optional: OpenMP for parallelization

Basic Usage

#include <autograd/autograd.hpp>   // umbrella header; exact include path assumed here

using namespace autograd;

int main() {
    // Create computation graph
    auto x = constant(3.0);
    auto y = constant(4.0);
    auto z = mul(x, y);                     // z = x * y
    auto result = add(z, constant(2.0));    // result = z + 2

    // Compute gradients
    result->backward();

    std::cout << "result = " << result->data[0] << std::endl;   // 14
    std::cout << "dresult/dx = " << x->grad[0] << std::endl;    // 4
    std::cout << "dresult/dy = " << y->grad[0] << std::endl;    // 3

    return 0;
}
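
The same graph machinery drives the Jacobian and Hessian features listed above. Since this post does not show the jacobian.hpp/hessian.hpp interfaces, the following is only a rough sketch, assuming a hypothetical hessian() helper that takes a scalar output node and its input nodes; see the hessian_demo example for the actual API.

// f(x, y) = x*x*y, so d²f/dx² = 2y, d²f/dxdy = 2x, d²f/dy² = 0
auto x = constant(3.0);
auto y = constant(4.0);
auto f = mul(mul(x, x), y);

// Hypothetical call; the real signature may differ.
auto H = hessian(f, {x, y});
// Expected entries: H[0][0] = 8, H[0][1] = H[1][0] = 6, H[1][1] = 0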

Library Structure

  • tensor.hpp – Tensor class with gradient tracking
  • ops.hpp – Operations (add, mul, exp, log, matmul, etc.)
  • jacobian.hpp – Jacobian matrix computation
  • hessian.hpp – Hessian matrix computation
  • optim.hpp – SGD optimizer with learning‑rate schedules
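
As a rough illustration of the optimizer layer, here is a minimal gradient-descent loop. The SGD interface shown (constructor, step(), zero_grad()) is assumed for this sketch and may not match optim.hpp exactly; only constant(), add(), mul(), and backward() appear in the usage example above.

// Minimize f(w) = w^2; the minimum is at w = 0.
auto w = constant(3.0);                  // parameter to optimize (a dedicated trainable-node factory may exist)
SGD opt({w}, /*learning_rate=*/0.1);     // hypothetical constructor: parameters + learning rate

for (int step = 0; step < 50; ++step) {
    auto loss = mul(w, w);               // rebuild the dynamic graph each iteration
    loss->backward();                    // populate w->grad
    opt.step();                          // hypothetical: w -= lr * grad
    opt.zero_grad();                     // hypothetical: clear gradients before the next pass
}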

Applications

The core automatic‑differentiation engine can be used as a foundation for:

  • Neural networks and deep learning
  • Statistical modeling and inference
  • Physics simulations requiring gradients
  • Optimization algorithms
  • General scientific computing

Design Goals

  • Minimal – Core AD functionality only; domain‑specific features can be built on top.
  • Efficient – Optimized for performance with optional OpenMP parallelization.
  • Flexible – Dynamic computational graphs support arbitrary computations.
  • Portable – Standard C++17, works on any platform.

License

[Specify your license]

Contributing

Contributions are welcome! This repository provides the core AD engine; domain‑specific extensions (e.g., neural networks, statistical models) should be developed as separate packages that depend on autograd-cpp.
