🎬 Video generation: new Gen-3 Alpha and Dream Machine updates
Runway has introduced Gen-3 Alpha, a new hyper-realistic model for video generation. The model can generate detailed and realistic videos up to 10 seconds long with high fidelity, a variety of emotional expressions, and camera movements.
🔑 Key features
⚫️Photorealistic Humans. Gen-3 Alpha excels at generating expressive human characters with a wide range of actions, gestures, and emotions, unlocking new storytelling opportunities over its Gen-2 predecessor.
⚫️Multimodal Learning. Gen-3 Alpha will expand Runway's suite of tools to include text2video, image2video, and text2image.
⚫️Increased generation speed. A 10-second video is generated in about 90 seconds, which is significantly better than previous models.
⚫️Extended features. In addition to the classic Motion Brush, Advanced Camera Controls, and Director Mode, new tools will provide more precise control over structure, style, and movement. The model can also interpret a wide range of cinematic styles.
The model is currently only accessible to selected companies and has not been released publicly.
🪄 Just a week ago, Luma Labs released the Dream Machine video generator, and today published an update.
Dream Machine can now generate continuous videos up to 60 seconds long. Coming soon are a generation library for inspiration and tools for editing generated videos, such as changing backgrounds, characters, and animations. Watch the latest clip 🔼
Due to high demand, Luma has limited free users to 5 generations per day. Paid subscribers get priority and unlimited generations.
More on the topic:
CapCut is an AI-Powered Video Editing Tool
Sora Updates: AI Sound Effects by ElevenLabs, Comparison with Runway Gen-2, and New Videos
Multi-Motion Brush from Gen-2 and Lumiere from Google Research
#news @hiaimediaen