Custom Machine Learning for Film Restoration

A reference-based restoration workflow for NukeX built on CopyCat and Inference. It trains small CNNs on real source/reference pairs to recover lost chroma or spatial detail in degraded film elements.

Not a plugin. A repeatable, documented workflow for archives, preservation teams, and restoration practitioners.

Watch the YouTube video walkthrough — a visual companion to this repository.

Recovery workflow overview (diagram).

Recovery Modes

Mode             | Use when                                                         | Ground truth target
---------------- | ---------------------------------------------------------------- | --------------------------
Chroma recovery  | Luma/detail intact; chroma faded, shifted, or collapsed          | Source Y + Reference Cb/Cr
Spatial recovery | Color acceptable; detail/sharpness/grain degraded vs. reference  | Reference Y + Source Cb/Cr

Start with chroma recovery unless your problem is clearly spatial. Do not combine both in the same target build — treat them as separate passes.
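The channel recombination behind each target build can be sketched in a few lines. This is a hypothetical NumPy illustration, not part of the workflow files: the actual build happens in Nuke, and the BT.709 full-range matrix used here is an assumption — match whatever working space your script uses.

```python
import numpy as np

# BT.709 full-range RGB -> YCbCr analysis matrix (an assumption for illustration).
M = np.array([[ 0.2126,  0.7152,  0.0722],
              [-0.1146, -0.3854,  0.5000],
              [ 0.5000, -0.4542, -0.0458]])
M_INV = np.linalg.inv(M)

def rgb_to_ycbcr(rgb):
    return rgb @ M.T

def ycbcr_to_rgb(ycc):
    return ycc @ M_INV.T

def build_target(source_rgb, reference_rgb, mode="chroma"):
    """Combine channels per recovery mode.

    chroma : source Y + reference Cb/Cr
    spatial: reference Y + source Cb/Cr
    """
    src = rgb_to_ycbcr(source_rgb)
    ref = rgb_to_ycbcr(reference_rgb)
    out = np.empty_like(src)
    if mode == "chroma":
        out[..., 0] = src[..., 0]    # keep surviving source luma
        out[..., 1:] = ref[..., 1:]  # take reference chroma
    else:  # "spatial"
        out[..., 0] = ref[..., 0]    # take reference luma/detail
        out[..., 1:] = src[..., 1:]  # keep acceptable source chroma
    return ycbcr_to_rgb(out)
```

Note the symmetry: each mode swaps exactly one component group, which is why the two passes must stay separate — combining both would just reproduce the reference.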

Getting Started

Follow these in order:

  1. Shared Workflow — Stages 0-2: Resolve export, Nuke setup, dataset curation, alignment, shared crop, and the branch decision.
  2. Chroma Recovery — Stage 3 onward: chroma target build, training, inference, validation.
  3. Spatial Recovery — Stage 3 onward: spatial target build, training, inference, validation.
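For the validation stage in either branch, one simple objective check is frame-wise PSNR between the Inference output and held-out target frames that CopyCat never trained on. The helper below is a hypothetical sketch (the workflow documents validation inside Nuke; this is just the underlying metric):

```python
import numpy as np

def psnr(output, target, peak=1.0):
    """Peak signal-to-noise ratio in dB between two float images.

    peak=1.0 assumes normalized [0, 1] pixel values.
    """
    mse = np.mean((output - target) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / mse)
```

Rising PSNR across held-out frames suggests the model generalizes rather than memorizing the training crops; a large gap between training and held-out frames is a sign to revisit dataset curation in Stage 2.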

Supporting Material

Requirements

  • Foundry NukeX with CopyCat and Inference (GPU: Apple Silicon or NVIDIA)
  • A source scan with surviving image information
  • A reference with stronger color or spatial detail
  • Resolve (or equivalent) for pre-alignment and container prep
  • ACES/OCIO color management


Workflow template for film preservation and restoration research.