These Runway AI Video Updates are Insane!
TLDR
This week's AI film news covers the unveiling of a new tool called Krea, which enhances sketches with AI for higher-quality images and storyboards. Runway's algorithm update leads to improved video quality, with a comparison to Pika Labs showing Runway's edge in text-to-video conversion. Nicolas Neubert's drone footage created in Runway demonstrates the potential of AI in filmmaking. The discussion also touches on the future of feature films and TV using AI for establishing shots and other details. Runway's Motion Brush feature allows for selective editing of image frames, and Pika Labs announces its upcoming 1.0 version with a website interface. Meta's research into a new video editing model and its Emu Edit tool are highlighted, along with MoonValley's new image-to-video model and Playday's AI video tool. OpenAI's Dev Day brings updates including an extended knowledge base and the ability to input longer prompts, impacting screenwriting and analysis. The introduction of AI chatbots allows for customized bots for specific tasks. The video concludes with a study from Oxford showing AI's potential to predict heart attacks up to a decade in advance.
Takeaways
- A new tool called 'Krea' uses AI to enhance the quality of hand-drawn sketches and storyboards, making them more realistic.
- Runway has updated its algorithm for better quality, offering more control over camera movements and direction in generated videos.
- Runway's text-to-video quality is superior to Pika Labs', with more cinematic results, despite an exposure issue in the initial video frames.
- AI filmmaker Nicolas Neubert created drone footage entirely within Runway, demonstrating the potential of AI in film production.
- A presentation suggests that within 12 months, the film and TV industry will use AI for establishing shots and details, indicating a shift in the industry.
- Runway introduced a 'Motion Brush' feature allowing users to selectively edit parts of an image frame for animation.
- Pika Labs announced its upcoming version 1.0, offering a website-based platform and showcasing new animation potential.
- OpenAI extended its knowledge base and introduced the ability to input long prompts, beneficial for script analysis and feedback.
- OpenAI's Dev Day highlighted the introduction of AI chatbots, allowing customization and sharing of bots for specific tasks.
- Midjourney's fine-tuning model allows for consistent style creation across images, which could be useful for filmmakers.
- The 1979 'Jesus' film is using AI for translation and lip-syncing into 200 additional languages, expanding its audience reach.
Q & A
What new tool was mentioned in the video that uses AI to enhance the quality of sketches?
-The new tool mentioned is called Krea, which allows users to sketch and then automatically enhances the quality of those sketches using AI, making rough sketches and storyboards more realistic.
How does the updated Runway algorithm compare to Pika Labs in terms of image-to-video conversion?
-Both Runway and Pika Labs produce incredible results, but Pika Labs has a slight edge in terms of realism for cinematic movements. However, Runway offers more control over camera movement and direction.
What issue does Runway's new algorithm still struggle with?
-Runway's new algorithm still struggles with an exposure problem where the first few frames of a video may become brighter or darker depending on the uploaded asset, which users need to work around.
How does Runway's motion brush feature work?
-Runway's motion brush feature allows users to select parts of an image frame they want to edit by brushing in those areas. Everything else in the frame will remain static, providing a way to animate specific elements with control over movement and direction.
What significant update did OpenAI make to their knowledge cutoff and API?
-OpenAI extended their knowledge cutoff to April 2023, making the model's knowledge more up-to-date. They also allowed API prompts to be up to 300 book pages long, enabling users to upload entire scripts for feedback and analysis.
What is the significance of the AI chatbots introduced by OpenAI?
-AI chatbots allow users to customize a specific bot for a task that is likely to be repeated frequently. These bots can be shared with others, and they can be programmed with custom instructions and capabilities, which can be turned on or off as needed.
What does the study by Oxford suggest about AI's ability to predict heart attacks?
-The study suggests that AI can predict heart attacks up to 10 years in the future. It is currently being used to analyze heart scans for over 350,000 people each year and is more effective than humans at detecting irregularities that might otherwise go unnoticed.
What is the impact of the SAG strike resolution on AI filmmaking?
-The SAG strike resolution has several implications for AI filmmaking. It mandates that actors must give consent for the creation of digital replicas of their likeness, and if their likeness is used, they must be paid the day rate as if they were on set. Additionally, actors receive residual payments when their likeness is used on screen.
What are the conditions under which studios can use a digital double or synthetic actor without consent?
-Studios can use a digital double or synthetic actor without consent as long as the story remains substantially as scripted, performed, or re-recorded. Also, digital doubles or synthetic actors can be used without consent for projects meant for comment, criticism, scholarship, satire, parody, docudrama, or historical or biographical work.
What is the role of the new tool called Blockade in virtual productions?
-Blockade is a tool that creates a 360-degree world based on a user's prompt. It can generate realistic environments that can be used for VR development, reflection maps, or virtual productions, which is significant for the creation of immersive and cost-effective virtual sets.
What is the significance of the update from Midjourney regarding their fine-tuning model?
-The fine-tuning model by Midjourney allows users to create consistent styles for their images. By selecting preferred styles from a grid, users can generate a code that can be applied to future prompts to maintain a specific look, which is particularly useful for filmmakers seeking a uniform aesthetic across their work.
Outlines
AI Tools Revolutionizing Sketch and Storyboard Quality
The video script introduces a new tool called Krea, which uses AI to enhance the quality of hand-drawn sketches and storyboards, making them more realistic. It also discusses advancements in AI technology, particularly the updates to Runway's algorithm, which now offers better quality in image-to-video and text-to-video conversion, although it still struggles with exposure issues. The script highlights the potential of AI in the film industry, showcasing examples from Nicolas Neubert and discussing the future of filmmaking with AI.
Runway's New Features and Industry Competitions
The script covers Runway's new motion brush feature, which allows users to selectively edit parts of an image frame. It also mentions a drone footage example created entirely within Runway. The video discusses the potential of AI in feature films and TV, with a presentation suggesting that AI will be used for establishing shots within a year. It also talks about Runway's time remapping feature and the AI holiday film competition by Curious Refuge and Epidemic Sound, offering a prize of up to $5,000.
OpenAI's Developments and CEO Changes
The script discusses OpenAI's updates, including an extended knowledge base, the ability to input longer prompts, and improved performance for enterprise customers. It also covers the introduction of AI chatbots, which can be customized for specific tasks and shared with others. The video mentions the recent departure of Sam Altman as CEO of OpenAI and the potential for his return due to investor pressure.
Real-time Language Models and Virtual Production Tools
The video script introduces Elon Musk's new language model, Grok, with less censorship and real-time access to Twitter data. It also discusses Midjourney's fine-tuning model for creating consistent styles across images, which the speaker finds less useful for filmmakers at the moment. The script highlights a new tool called Blockade for virtual productions, which generates 360-degree worlds based on prompts, and mentions an AI filmmaking course with opportunities in the industry.
AI in Filmmaking and Hollywood Strikes
The script discusses how AI is being used to translate and dub films into additional languages with native lip-syncing, as demonstrated by the 1979 Jesus movie. It also covers the end of the biggest strike in Hollywood history, with new agreements ensuring wage increases, streaming bonuses, and residuals for actors whose likenesses are used in films. The use of digital replicas and synthetic actors in various contexts is also explored, including the need for consent and payment for recognized actors.
AI Film Showcase and Predictive Healthcare
The video script showcases a fashion project by a Curious Refuge community member, a film called 'The Merge' created for the Pika Labs Halloween contest, and a rap video made with AI-generated content. It concludes with a study from Oxford showing that AI can predict heart attacks up to 10 years in advance by analyzing heart scans, outperforming humans in detecting irregularities.
Keywords
AI Film Making
Runway
Pika Labs
Motion Brush
Text to Video
AI Chatbots
Digital Replica
Residual Payments
Virtual Productions
AI Video Editing
SAG Strike
Highlights
A new tool called Krea is introduced, which uses AI to enhance the quality of hand-drawn sketches and storyboards, making them more realistic.
Runway updates its algorithm for better quality in AI film production, offering more control over camera movements.
Runway's text-to-video quality is now superior to Pika Labs', with more cinematic results and improved realism.
Nicolas Neubert, a leading AI filmmaker, demonstrates the creation of drone footage entirely within Runway, showcasing the potential of AI in film production.
A presentation by Shelby and the speaker predicts that AI will be used for establishing shots in feature films and TV within 12 months.
Runway introduces a motion brush feature, allowing users to selectively edit parts of an image frame for animation.
Pika Labs announces its upcoming version 1.0, offering a website-based platform and showcasing new animation potential.
Curious Refuge and Epidemic Sound host an AI holiday film competition with prizes up to $5,000.
Meta is researching a new video editing model using AI, allowing for a multimodal editing experience.
MoonValley releases a new image-to-video model, although it leans towards a cartoonish style.
OpenAI extends its knowledge cutoff to April 2023 and allows for longer input prompts, impacting screenwriting and analysis.
AI chatbots are introduced by OpenAI, allowing for task-specific customization and sharing.
Sam Altman departs as CEO of OpenAI, with a potential return amidst investor pressure.
Elon Musk announces Grok, a new language model with less censorship and real-time access to Twitter data.
Midjourney's fine-tuning model is showcased, allowing for consistent styles in film shots.
The BFX Festival at Bournemouth University features films from the Curious Refuge community.
Blockade, a new tool for virtual productions, generates 360-degree worlds from prompts for VR development and other uses.
AI is used to translate and add native lip-syncing to films, expanding audience reach.
The SAG strike results in increased wages for performers, streaming bonuses, and consent requirements for digital replicas.
A study by Oxford shows that AI can predict heart attacks up to 10 years in advance by analyzing heart scans.