These Runway AI Video Updates are Insane!

Curious Refuge
21 Nov 2023 · 27:37

TLDR: This week's AI film news covers the unveiling of a new tool called Krea, which enhances sketches with AI to produce higher-quality images and storyboards. Runway's algorithm update brings improved video quality, and a side-by-side comparison with Pika Labs shows Runway's edge in text-to-video generation. Nicolas Neubert's drone footage, created entirely in Runway, demonstrates the potential of AI in filmmaking, and the discussion also touches on feature films and TV using AI for establishing shots and other details. Runway's motion brush feature allows selective animation of parts of an image frame, and Pika Labs announces its upcoming 1.0 version with a website interface. Meta's research into a new video editing model and its Emu Edit tool are highlighted, along with Moon Valley's new image-to-video model and Playday's AI video tool. OpenAI's Dev Day brings updates including an extended knowledge cutoff and the ability to input much longer prompts, which matters for screenwriting and script analysis, and the introduction of customizable AI chatbots allows purpose-built bots for specific tasks. The video concludes with a study from Oxford showing AI's potential to predict heart attacks up to a decade in advance.

Takeaways

  • 🎨 A new tool called 'Krea' uses AI to enhance the quality of hand-drawn sketches and storyboards, making them more realistic.
  • 📈 Runway has updated its algorithm for better quality, offering more control over camera movements and direction in generated videos.
  • 📊 Runway's text-to-video quality is superior to Pika Labs', with more cinematic results, despite an exposure issue in the initial video frames.
  • 🚀 AI filmmaker Nicolas Neubert created drone footage entirely within Runway, demonstrating the potential of AI in film production.
  • 📚 A presentation suggests that within 12 months, the film and TV industry will use AI for establishing shots and details, indicating a shift in the industry.
  • ⏰ Runway introduced a 'motion brush' feature allowing users to selectively edit parts of an image frame for animation.
  • 🌟 Pika Labs announced its upcoming version 1.0, offering a website-based platform and showcasing new animation potential.
  • 📝 OpenAI extended its knowledge cutoff and introduced the ability to input long prompts, beneficial for script analysis and feedback.
  • 🤖 OpenAI's Dev Day highlighted the introduction of AI chatbots, allowing customization and sharing of bots for specific tasks.
  • 🌐 Midjourney's fine-tuning model allows for consistent style creation across images, which could be useful for filmmakers.
  • 📽️ The 1979 Jesus movie is using AI for translation and lip-syncing into 200 additional languages, expanding its audience reach.

Q & A

  • What new tool was mentioned in the video that uses AI to enhance the quality of sketches?

    -The new tool mentioned is called Krea, which lets users sketch and then automatically enhances those sketches with AI, making rough sketches and storyboards more realistic.

  • How does the updated Runway algorithm compare to Pika Labs in terms of image-to-video conversion?

    -Both Runway and Pika Labs produce incredible results, but Pika Labs has a slight edge in realism for cinematic movements. However, Runway offers more control over camera movement and direction.

  • What issue does Runway's new algorithm still struggle with?

    -Runway's new algorithm still struggles with an exposure problem where the first few frames of a video may become brighter or darker depending on the uploaded asset, which users need to work around.

  • How does Runway's motion brush feature work?

    -Runway's motion brush feature allows users to select parts of an image frame they want to edit by brushing in those areas. Everything else in the frame will remain static, providing a way to animate specific elements with control over movement and direction.

  • What significant update did OpenAI make to their knowledge cutoff and API?

    -OpenAI extended their knowledge cutoff to April 2023, making responses more up-to-date. They also allowed prompts to be up to 300 book pages long, enabling users to upload entire scripts for feedback and analysis.

  • What is the significance of the AI chatbots introduced by OpenAI?

    -AI chatbots let users customize a specific bot for a task that is likely to be repeated frequently. These bots can be shared with others and programmed with custom instructions and capabilities, which can be turned on or off as needed.

  • What does the study by Oxford suggest about AI's ability to predict heart attacks?

    -The study suggests that AI can predict heart attacks up to 10 years in the future. It is currently being used to analyze heart scans for over 350,000 people each year and is more effective than humans at detecting irregularities that might otherwise go unnoticed.

  • What is the impact of the SAG strike resolution on AI filmmaking?

    -The SAG strike resolution has several implications for AI filmmaking. It mandates that actors must give consent for the creation of digital replicas of their likeness, and if their likeness is used, they must be paid the day rate as if they were on set. Additionally, actors receive residual payments when their likeness is used on screen.

  • What are the conditions under which studios can use a digital double or synthetic actor without consent?

    -Studios can use a digital double or synthetic actor without consent as long as the story remains substantially as scripted, performed, or re-recorded. Also, digital doubles or synthetic actors can be used without consent for projects meant for comment, criticism, scholarship, satire, parody, docudrama, or historical or biographical work.

  • What is the role of the new tool called Blockade in virtual productions?

    -Blockade is a tool that creates a 360-degree world based on a user's prompt. It can generate realistic environments that can be used for VR development, reflection maps, or virtual productions, which is significant for the creation of immersive and cost-effective virtual sets.

  • What is the significance of the update from Midjourney regarding their fine-tuning model?

    -The fine-tuning model by Midjourney allows users to create consistent styles for their images. By selecting preferred styles from a grid, users can generate a code that can be applied to future prompts to maintain a specific look, which is particularly useful for filmmakers seeking a uniform aesthetic across their work.

Outlines

00:00

🎨 AI Tools Revolutionizing Sketch and Storyboard Quality

The video script introduces a new tool called Krea, which uses AI to enhance the quality of hand-drawn sketches and storyboards, making them more realistic. It also covers the updates to Runway's algorithm, which now offers better quality in image-to-video and text-to-video generation, although it still struggles with exposure issues. The script highlights the potential of AI in the film industry, showcasing examples from Nicolas Neubert and discussing the future of filmmaking with AI.

05:01

📹 Runway's New Features and Industry Competitions

The script covers Runway's new motion brush feature, which allows users to selectively edit parts of an image frame. It also mentions a drone footage example created entirely within Runway. The video discusses the potential of AI in feature films and TV, with a presentation suggesting that AI will be used for establishing shots within a year. It also talks about Runway's time remapping feature and the AI holiday film competition by Curious Refuge and Epidemic Sound, offering a prize of up to $5,000.

10:04

🚀 OpenAI's Developments and CEO Changes

The script discusses OpenAI's updates, including an extended knowledge cutoff, the ability to input longer prompts, and improved performance for enterprise customers. It also covers the introduction of AI chatbots, which can be customized for specific tasks and shared with others. The video mentions the recent departure of Sam Altman as CEO of OpenAI and the potential for his return due to investor pressure.

15:06

🌐 Real-time Language Models and Virtual Production Tools

The video script introduces Elon Musk's new language model with less censorship and real-time access to Twitter data. It also discusses Midjourney's fine-tuning model for creating consistent styles across images, which the speaker finds less useful for filmmakers at the moment. The script highlights a new tool called Blockade for virtual productions, which generates 360-degree worlds based on prompts, and mentions an AI filmmaking course with opportunities in the industry.

20:08

📚 AI in Filmmaking and Hollywood Strikes

The script discusses how AI is being used to translate and dub films into additional languages with native lip-syncing, as demonstrated by the 1979 Jesus movie. It also covers the end of the biggest strike in Hollywood history, with new agreements ensuring wage increases, streaming bonuses, and residuals for actors whose likenesses are used in films. The use of digital replicas and synthetic actors in various contexts is also explored, including the need for consent and payment for recognized actors.

25:09

🎬 AI Film Showcase and Predictive Healthcare

The video script showcases a fashion project by a Curious Refuge community member, a film called 'The Merge' created for the Pika Labs Halloween contest, and a rap video made with AI-generated content. It concludes with a study from Oxford showing that AI can predict heart attacks up to 10 years in advance by analyzing heart scans, outperforming humans in detecting irregularities.

Keywords

💡AI Filmmaking

AI filmmaking refers to the use of artificial intelligence (AI) in the creation and production of films. This includes using AI algorithms to generate images, edit videos, and even write scripts. In the video, AI filmmaking is the central theme, as it discusses various AI tools that are transforming the industry, such as Runway and Pika Labs, which are used to create more realistic and cinematic results from sketches and text.

💡Runway

Runway is an AI tool mentioned in the video that has been updated to improve the quality of AI-generated content. It is used for tasks such as image-to-video conversion, where it can take a still image and create a video with cinematic movements. The video highlights a side-by-side comparison of Runway with another tool called Pika Labs, noting that while Runway offers more control over camera movements, Pika Labs may provide more realism in certain aspects.

💡Pika Labs

Pika Labs is another AI filmmaking tool that is compared with Runway in the video. It is noted for its ability to create more realistic cinematic movements, although it may not offer as much control over the camera as Runway does. Pika Labs is also mentioned to be releasing a new version, indicating ongoing development and improvement in AI filmmaking technology.

💡Motion Brush

Motion Brush is a feature of the Runway tool that allows users to selectively edit parts of their image frame. It is showcased in the video as a way to animate specific elements of a scene, such as smoke and fire in a burning-house shot, while keeping other elements static. This feature is highlighted as an innovative aspect of AI technology that contributes to the future of filmmaking and animation.

💡Text to Video

Text-to-video is a process where AI takes textual input and generates video content based on it. The video discusses the quality of text-to-video output from Runway, noting that it appears more realistic and cinematic compared to other tools. This process is significant as it represents the potential for AI to interpret and visualize textual concepts, a key aspect of storytelling in filmmaking.

💡AI Chatbots

AI chatbots are customizable AI agents that can be programmed to perform specific tasks. In the context of the video, they are mentioned as a new feature from OpenAI that can be used for various purposes, including screenwriting advice. The video demonstrates creating a chatbot that provides screenwriting tips for short films, showcasing the potential for AI to assist in creative processes.

💡Digital Replica

A digital replica refers to a computer-generated version of a real person, often used in filmmaking for purposes such as stunt doubles or creating a younger version of an actor. The video discusses the legal and ethical considerations surrounding the use of digital replicas, particularly the requirement for consent from the individuals being replicated.

💡Residual Payments

Residual payments are additional payments made to actors, directors, and other creative professionals when their work is reused or rebroadcast. In the video, it is mentioned that actors will receive residual payments when their digital likeness is used on screen, which is a significant development for the industry as it pertains to the use of AI-generated content.

💡Virtual Productions

Virtual Productions involve the use of virtual reality, game engine technology, and other digital tools to create film sets and environments without the need for physical locations. The video mentions a tool called Blockade that generates 360-degree worlds based on prompts, which can be used for virtual productions, indicating a shift towards more digital and AI-assisted production methods.

💡AI Video Editing

AI Video Editing refers to the use of AI algorithms to assist in the editing process of video content. The video discusses a tool created by First AI Machine that uses Runway to generate video edits through a physical device, suggesting a future where AI plays a larger role in post-production processes.

💡SAG Strike

The SAG Strike refers to a labor dispute involving the Screen Actors Guild (SAG), which affects the film and television industry. The video mentions the resolution of a major strike in Hollywood, highlighting changes such as wage increases and streaming bonuses for performers, which are significant for the industry and the individuals working within it.

Highlights

A new tool called Krea is introduced, which uses AI to enhance the quality of hand-drawn sketches and storyboards, making them more realistic.

Runway updates its algorithm for better quality in AI film production, offering more control over camera movements.

Runway's text-to-video quality is now superior to Pika Labs', with more cinematic results and improved realism.

Nicolas Neubert, a leading AI filmmaker, demonstrates the creation of drone footage entirely within Runway, showcasing the potential of AI in film production.

A presentation by Shelby and the speaker predicts that AI will be used for establishing shots in feature films and TV within 12 months.

Runway introduces a motion brush feature, allowing users to selectively edit parts of an image frame for animation.

Pika Labs announces its upcoming version 1.0, offering a website-based platform and showcasing new animation potential.

Curious Refuge and Epidemic Sound host an AI holiday film competition with prizes up to $5,000.

Meta is researching a new video editing model using AI, allowing for a multimodal editing experience.

Moon Valley releases a new image-to-video model, although it leans towards a cartoonish style.

OpenAI extends its knowledge cutoff to April 2023 and allows for much longer input prompts, impacting screenwriting and analysis.

AI chatbots are introduced by OpenAI, allowing for task-specific customization and sharing.

Sam Altman departs as CEO of OpenAI, with a potential return amid investor pressure.

Elon Musk announces a new language model with less censorship and real-time access to Twitter data.

Midjourney's fine-tuning model is showcased, allowing for consistent styles in film shots.

The BFX Festival at Bournemouth University features films from the Curious Refuge community.

Blockade, a new tool for virtual productions, generates 360-degree worlds from prompts for VR development and other uses.

AI is used to translate and add native lip-syncing to films, expanding audience reach.

The SAG strike results in increased wages for performers, streaming bonuses, and consent requirements for digital replicas.

A study by Oxford shows that AI can predict heart attacks up to 10 years in advance by analyzing heart scans.