28/03/2025
Hot off the press - our research paper, Go-with-the-Flow, will be presented!
Based on our research, we believe these techniques could let artists direct the motion in generated videos, empowering creative control across a wide range of video applications: cut-and-drag animation, transferring movement between videos, first-frame editing, camera control via depth warping, and text-to-video 3D scene creation.
Kudos to the amazing team: Ryan Burgert, Yuancheng Xu, Wenqi Xian, Oliver Pilarski, Pascal Clausen, Mingming He, Li Ma, Yitong Deng, Lingxiao Li, Mohsen Mousavi, Michael Ryoo, Paul Debevec, and Ning Yu, from Eyeline Studios, Scanline VFX - Powered by Netflix, Netflix, Stony Brook University, University of Maryland, and Stanford University.
This is part of ongoing research and development at Eyeline, and we hope to see adoption of these techniques and workflows soon.
Paper: https://arxiv.org/pdf/2501.08331
Web: https://eyeline-research.github.io/Go-with-the-Flow/
Code: https://github.com/Eyeline-Research/Go-with-the-Flow
Models: https://huggingface.co/Eyeline-Research/Go-with-the-Flow/tree/main