Runway Gen-2 adds multiple motion controls to AI videos with Multi Motion Brush

Key Points:

  • Runway’s Gen-2 model has been updated with Multi Motion Brush, enabling creators to add multiple directions and types of motion to their AI video creations.
  • Runway’s Gen-2 model offers a variety of features to control video outputs, including text-, video-, and image-based generation, the ability to extend clips up to 18 seconds, and options to choose the style of the video to be produced.
  • In the creative AI market, Runway competes with companies like Pika Labs and Leonardo AI, but challenges remain in generating clear and consistent images and videos.

Summary:

AI video creation is advancing rapidly, and New York City-based startup Runway has updated its Gen-2 foundation model with a new tool called Multi Motion Brush, which lets creators apply multiple directions and types of motion within a single AI-generated video.

The introduction of Multi Motion Brush strengthens the suite of controls Runway offers over Gen-2's video outputs. Alongside the new brush, the model supports text-, video-, and image-based generation, can extend clips up to 18 seconds, and lets users choose the style of the generated video.

Runway faces competition in the creative AI market from players such as Pika Labs and Leonardo AI. These competitors are also building AI video platforms, but the field as a whole still struggles to produce images and videos that are not blurry, incomplete, or inconsistent.


©2024 The Horizon