Why Manual YUV Video Encoding Fails Across Android Devices (and How to Fix It)

Published: January 13, 2026 at 12:00 PM EST
2 min read
Source: Dev.to

The Problem

Many developers take a naive approach: convert a Bitmap to YUV manually and push it to the encoder. It seems simple — but in production, it’s a minefield:

  • Stride assumptions differ per device (MediaTek often pads by 16–64 bytes)
  • Chroma plane ordering varies (planar vs. semi‑planar)
  • Surface locking can race, causing IllegalArgumentException
  • Encoder output threads can block indefinitely, hanging the app

The result? Crashes, corrupted files, and an unreliable video pipeline across devices.

Why Naive Approaches Fail

Manual YUV Conversion

fun bitmapToYuv420(bitmap: Bitmap, width: Int, height: Int): ByteArray {
    // Tightly packed I420-sized buffer: full-resolution Y plane + quarter-resolution U and V
    val yuv = ByteArray(width * height * 3 / 2)
    // Naive per-pixel RGB → YUV conversion (loop elided); no stride or plane-layout handling
    return yuv
}

What goes wrong

  • Assumes stride == width → fails on MediaTek devices
  • Ignores planar/interleaved layout → color corruption
  • Misses hardware alignment → encoder rejects frames

Even if it works on your test phone, it’s likely to fail on other devices.
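
As a sanity check, the encoder itself can report the layout it actually expects. The sketch below is illustrative, assuming an already-configured MediaCodec encoder; the stride and slice-height keys are optional and some vendors don't report them at all:

import android.media.MediaCodec
import android.media.MediaFormat
import android.util.Log

// Ask a configured encoder what buffer layout it expects (sketch; keys may be absent).
fun logEncoderLayout(codec: MediaCodec, width: Int, height: Int) {
    val input = codec.inputFormat   // valid once the codec has been configured
    val stride = if (input.containsKey(MediaFormat.KEY_STRIDE))
        input.getInteger(MediaFormat.KEY_STRIDE) else width
    val sliceHeight = if (input.containsKey(MediaFormat.KEY_SLICE_HEIGHT))
        input.getInteger(MediaFormat.KEY_SLICE_HEIGHT) else height
    // On padded devices stride > width, so copying rows as if stride == width
    // shears every line after the first and shifts the chroma planes.
    Log.d("Encoder", "stride=$stride sliceHeight=$sliceHeight for ${width}x$height frames")
}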

Naive Surface Handling

fun recordFrame(bitmap: Bitmap) {
    val canvas = surface.lockCanvas(null)      // no guard: concurrent calls can race the lock
    canvas.drawBitmap(bitmap, 0f, 0f, null)    // runs on whatever thread called this, often the main thread
    surface.unlockCanvasAndPost(canvas)        // never reached if drawing throws; no error recovery
}

Issues

  • No concurrency control → crashes if a new frame arrives while the previous one is still being drawn
  • Blocking the main thread → dropped frames or camera freezes
  • No error recovery → the entire recording fails

The Production Solution: Surface‑Based Encoding

The only vendor‑agnostic, production‑ready approach is Surface‑based encoding (COLOR_FormatSurface). The hardware handles stride, color conversion, and alignment internally.
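
A minimal configuration sketch, assuming an H.264 target; the bitrate, frame rate, and key-frame interval below are illustrative placeholders rather than values from the post:

import android.media.MediaCodec
import android.media.MediaCodecInfo
import android.media.MediaFormat
import android.view.Surface

// Sketch: configure an AVC encoder for Surface input and hand back its input Surface.
fun createSurfaceEncoder(width: Int, height: Int): Pair<MediaCodec, Surface> {
    val format = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height).apply {
        setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface)
        setInteger(MediaFormat.KEY_BIT_RATE, 4_000_000)   // placeholder bitrate
        setInteger(MediaFormat.KEY_FRAME_RATE, 30)        // placeholder frame rate
        setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1)   // placeholder key-frame interval
    }
    val codec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC)
    codec.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE)
    val inputSurface = codec.createInputSurface()  // must be called between configure() and start()
    codec.start()
    return codec to inputSurface
}

Anything drawn onto the returned Surface is consumed by the encoder directly, with no manual pixel copying or format conversion in app code.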

Key patterns

  • Frame dropping – skip a frame if the encoder is busy to prevent Surface lock conflicts
  • Short timeout – use a ~100 ms timeout with dequeueOutputBuffer() so shutdown stays responsive
  • Resource cleanup order – MediaMuxer → MediaCodec → Surface
  • Thread safety – use a dedicated encoder thread for all operations
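
For example, the frame-dropping guard around the encoder Surface:
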
val isEncodingFrame = AtomicBoolean(false)   // java.util.concurrent.atomic.AtomicBoolean

fun recordFrame(bitmap: Bitmap) {
    // If the previous frame is still being drawn, drop this one instead of racing the Surface lock
    if (!isEncodingFrame.compareAndSet(false, true)) return

    try {
        val canvas = encoderSurface.lockCanvas(null)
        canvas.drawBitmap(bitmap, null, dstRect, paint)
        encoderSurface.unlockCanvasAndPost(canvas)
    } finally {
        isEncodingFrame.set(false)
    }
}
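
The other patterns (the ~100 ms drain timeout and the MediaMuxer-first release order) look roughly like this; the helper names and the trackIndex field are illustrative, not from the original post, and both functions are meant to run on the dedicated encoder thread:

import android.media.MediaCodec
import android.media.MediaMuxer
import android.view.Surface

// Illustrative state for the sketch; in a real pipeline this lives on the encoder thread.
var trackIndex = -1

// Drain loop with a short timeout so shutdown stays responsive.
fun drainEncoder(codec: MediaCodec, muxer: MediaMuxer, endOfStream: Boolean) {
    val info = MediaCodec.BufferInfo()
    while (true) {
        when (val index = codec.dequeueOutputBuffer(info, 100_000 /* µs, i.e. ~100 ms */)) {
            MediaCodec.INFO_TRY_AGAIN_LATER -> {
                if (!endOfStream) return          // nothing ready yet; try again on the next frame
            }
            MediaCodec.INFO_OUTPUT_FORMAT_CHANGED -> {
                trackIndex = muxer.addTrack(codec.outputFormat)
                muxer.start()
            }
            else -> if (index >= 0) {
                val buffer = codec.getOutputBuffer(index) ?: return
                if (info.size > 0 && (info.flags and MediaCodec.BUFFER_FLAG_CODEC_CONFIG) == 0) {
                    muxer.writeSampleData(trackIndex, buffer, info)
                }
                codec.releaseOutputBuffer(index, false)
                if ((info.flags and MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) return
            }
        }
    }
}

// Teardown in the order the post recommends: MediaMuxer → MediaCodec → Surface.
fun releaseEncoder(codec: MediaCodec, muxer: MediaMuxer, surface: Surface) {
    runCatching { muxer.stop(); muxer.release() }
    runCatching { codec.stop(); codec.release() }
    runCatching { surface.release() }
}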

This approach works across Qualcomm, MediaTek, and Exynos devices on Android 10–15, including long recordings and high frame rates.

Trade‑offs and Lessons

  • Frame dropping vs. quality – a slightly choppy video is better than a crash
  • Memory & CPU – Surface encoding offloads conversion to the GPU, reducing memory footprint
  • Background processing – Android 12+ may kill encoder threads if the app is backgrounded; use a foreground service
  • Testing focus – prioritize MediaTek devices (Vivo, Oppo, Xiaomi) and high‑FPS scenarios

Key Takeaways

  • Avoid manual YUV conversion — it’s fragile and device‑specific
  • Use Surface‑based encoding — hardware handles quirks automatically
  • Test on multiple chipsets and Android versions — real‑world signal beats theory

With this approach, your video pipeline will be production‑ready, reliable, and maintainable — no device‑specific hacks required.
