Bringing Old Films Back to Life

Ziyu Wan¹, Bo Zhang², Dongdong Chen³, Jing Liao¹
¹City University of Hong Kong, ²Microsoft Research, ³Microsoft Cloud+AI
CVPR 2022

Our method first restores real-world old films by fully leveraging spatio-temporal information, and then propagates colors from keyframes to the whole video.

Abstract

We present a learning-based framework, the recurrent transformer network (RTN), to restore heavily degraded old films. Instead of performing frame-wise restoration, our method leverages hidden knowledge learned from adjacent frames, which contain abundant information about occlusions; this is beneficial for restoring challenging artifacts in each frame while ensuring temporal coherency. Moreover, contrasting the representation of the current frame with the hidden knowledge makes it possible to infer scratch positions in an unsupervised manner, and such defect localization generalizes well to real-world degradations. To better resolve mixed degradation and to compensate for flow-estimation errors during frame alignment, we propose to leverage more expressive transformer blocks for spatial restoration. Experiments on both a synthetic dataset and real-world old films demonstrate the significant superiority of the proposed RTN over existing solutions. In addition, the same framework can effectively propagate color from keyframes to the whole video, ultimately yielding compelling restored films.
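
To make the recurrent design concrete, below is a minimal PyTorch sketch of the per-frame loop the abstract describes: a hidden state carries knowledge from previous frames, is aligned to the current frame via estimated optical flow, and is fused with the current features by a spatial restoration module. All names here (warp, RecurrentRestorer, and the internal layers) are illustrative placeholders; the flow estimator and the transformer blocks of the actual RTN are reduced to plain convolutions for brevity, so this is a structural sketch under those assumptions, not the released implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F


def warp(x: torch.Tensor, flow: torch.Tensor) -> torch.Tensor:
    """Backward-warp feature map x with a dense optical-flow field."""
    b, _, h, w = x.shape
    # Build a normalized sampling grid in [-1, 1] for grid_sample.
    ys, xs = torch.meshgrid(
        torch.arange(h, device=x.device, dtype=x.dtype),
        torch.arange(w, device=x.device, dtype=x.dtype),
        indexing="ij",
    )
    grid_x = 2.0 * (xs + flow[:, 0]) / max(w - 1, 1) - 1.0
    grid_y = 2.0 * (ys + flow[:, 1]) / max(h - 1, 1) - 1.0
    grid = torch.stack((grid_x, grid_y), dim=-1)  # (b, h, w, 2)
    return F.grid_sample(x, grid, align_corners=True)


class RecurrentRestorer(nn.Module):
    """Hypothetical recurrent pipeline: align the hidden state from the
    previous frame, fuse it with the current frame's features, and decode
    a restored frame while updating the hidden state."""

    def __init__(self, feat_ch: int = 64):
        super().__init__()
        self.feat_ch = feat_ch
        self.encode = nn.Conv2d(3, feat_ch, 3, padding=1)
        # Stand-in for a learned flow network used for frame alignment.
        self.flow = nn.Conv2d(6, 2, 3, padding=1)
        # Stand-in for the transformer-based spatial restoration blocks.
        self.restore = nn.Sequential(
            nn.Conv2d(2 * feat_ch, feat_ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(feat_ch, feat_ch, 3, padding=1),
        )
        self.decode = nn.Conv2d(feat_ch, 3, 3, padding=1)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, time, 3, h, w)
        b, t, _, h, w = frames.shape
        hidden = torch.zeros(b, self.feat_ch, h, w, device=frames.device)
        prev = frames[:, 0]
        outputs = []
        for i in range(t):
            cur = frames[:, i]
            # Align the hidden state to the current frame with estimated flow.
            flow = self.flow(torch.cat([cur, prev], dim=1))
            aligned = warp(hidden, flow)
            feat = self.encode(cur)
            # Fuse current features with temporally aligned hidden knowledge.
            hidden = self.restore(torch.cat([feat, aligned], dim=1))
            outputs.append(self.decode(hidden))
            prev = cur
        return torch.stack(outputs, dim=1)

The key property this structure gives is that the hidden state aggregates evidence across all frames seen so far; the disagreement between the current features and the aligned hidden state is the signal that, in the paper's formulation, enables unsupervised scratch localization.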

Video

More Results

BibTeX

@inproceedings{wan2022oldfilm,
  title={Bringing Old Films Back to Life},
  author={Wan, Ziyu and Zhang, Bo and Chen, Dongdong and Liao, Jing},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
  year={2022}
}