Model Chaining with Runway Gen-2
In this section, we will chain the output of Midjourney (an image) to the input of Runway's Gen-2 image-to-video service.
First, let’s download an example Gen-2 video, produced from a prompt asking the model to create a video of an astronaut driving a rover on Mars:
!curl -L https://raw.githubusercontent.com/Denis2054/Transformers_3rd_Edition/master/Chapter19/Gen-2_Mars.mp4 --output "Gen-2_Mars.mp4"
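If you prefer to stay in Python rather than shelling out to curl, the same download can be done with the standard library. This is a sketch using `urllib.request.urlretrieve`; the URL is the one from the curl command above:

```python
import urllib.request

def download(url: str, dest: str) -> str:
    """Fetch url and save it to dest; returns the destination path."""
    urllib.request.urlretrieve(url, dest)
    return dest

# Same file as the curl command above:
# download("https://raw.githubusercontent.com/Denis2054/Transformers_3rd_Edition/"
#          "master/Chapter19/Gen-2_Mars.mp4", "Gen-2_Mars.mp4")
```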
We can now view it in the chapter notebook:
from moviepy.editor import VideoFileClip
# Load Gen-2_Mars.mp4 and select the subclip 00:00:00 - 00:00:07
clip = VideoFileClip("Gen-2_Mars.mp4").subclip(0, 7)
# Loop the short clip 5 times for easier viewing
clip = clip.loop(5)
clip.ipython_display(width=900)
The video you can view in the notebook is quite promising.
Figure 19.14: A screenshot of an AI-generated video about an astronaut on Mars
Now, let’s chain Midjourney to Gen-2.
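The chaining idea can be sketched as two steps, where the image produced by the first feeds the second. Neither Midjourney nor Gen-2 exposes these exact functions; both are hypothetical placeholders standing in for "prompt in, image out" and "image in, video out":

```python
def midjourney_image(prompt: str) -> str:
    # Placeholder: in practice, submit the prompt to Midjourney,
    # download the resulting image, and return its local path.
    return prompt.replace(" ", "_") + ".png"

def gen2_video(image_path: str) -> str:
    # Placeholder: in practice, upload the image to Gen-2
    # and return the path of the generated video.
    return image_path.rsplit(".", 1)[0] + ".mp4"

# Chain the two: the first step's output is the second step's input.
video_path = gen2_video(midjourney_image("a ship in the galaxy"))
print(video_path)  # a_ship_in_the_galaxy.mp4
```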
Midjourney: Imagine a ship in the galaxy
Let’s download an image...