Here are some more of my experiments with slit-scan image processing…
And a little bit of info on the process:
My earlier slit-scan methodology, seen in the Driving Things videos here, used a relatively simple GPU-driven setup in TouchDesigner:
These new videos use a more complex CPU-driven setup in Houdini:
While slower, the new method can handle much larger frame counts without worrying about maxing out the graphics card's memory.
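To give a rough sense of why the GPU route tops out, here's a quick back-of-envelope calculation. The resolution, bit depth, and frame count below are illustrative assumptions, not the actual specs of this footage:

```python
# Back-of-envelope memory math for keeping a whole clip in memory at once.
# All numbers here are illustrative assumptions, not the specs of these videos.
width, height, channels = 1920, 1080, 3   # 1080p, 8-bit RGB
num_frames = 3000                          # a few seconds of high-speed footage

total_bytes = width * height * channels * num_frames
print(f"{total_bytes / 1e9:.1f} GB")       # ~18.7 GB, well beyond most graphics cards
```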
All of these Broadway videos started with some slow motion footage like this:
And then the process swaps either the width or height pixel axis of the image with the time axis. In these cases it’s always the width getting swapped with time, so the more frames that were originally captured, the wider the final video is:
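For the curious, the core of that swap is essentially a transpose of the frame stack. Here's a minimal NumPy sketch of the idea, with placeholder dimensions rather than anything taken from the actual Houdini setup:

```python
import numpy as np

# Minimal sketch of the width/time swap, assuming the clip is already loaded
# as a (frames, height, width, channels) array. Dimensions are placeholders.
T, H, W = 300, 96, 160
frames = np.zeros((T, H, W, 3), dtype=np.uint8)   # stand-in for the loaded footage

# Swap the width axis (axis 2) with the time axis (axis 0):
# output frame x is column x gathered from every input frame,
# so the result has W frames, each T pixels wide.
slitscan = np.swapaxes(frames, 0, 2)

print(frames.shape, "->", slitscan.shape)   # (300, 96, 160, 3) -> (160, 96, 300, 3)
```

The shape printout shows the relationship directly: the number of captured frames becomes the width of each output frame, which is why longer captures produce wider videos.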