

Genmo Mochi 1 Preview

Mikhail Shevtsov

๐Ÿ“ Note: Thumbnail was generated using Mochi-1-Preview model with help of ComfyUI;
This article was created with help of llama 3.1:8b using Ollama;
๐ŸŽฅ Don't have time to read? - Watch the video we created: What is Mochi 1

Hey there, fellow tech enthusiasts! 👋 I'm Mikhail, and welcome to my article on the latest developments in AI-generated video. Today, we're diving into the world of Mochi 1, an open-source Text2Video model that's taking the industry by storm 🔥.

What You Need to Know About Mochi 1 🤔

Mochi 1 is a Text2Video model released under the Apache 2.0 license. This means anyone can use the model, its output, and derivatives without paying any monthly subscription or royalty fees 💸. Just download and use it! However, there's a catch (there always is 😜): to run the model as released, you need at least four NVIDIA H100 GPUs, which cost around $25,000 each 🤯.
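Since the weights are openly published, "just download it" really is a one-liner. Here is a minimal sketch using the huggingface_hub library; the repository id and local path are my assumptions for illustration, not details from this article:

```python
# Minimal sketch: download the openly licensed Mochi 1 preview weights.
# The repo id "genmo/mochi-1-preview" and the local directory are assumptions,
# not stated in this article; adjust them to whatever you actually use.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="genmo/mochi-1-preview",    # assumed Hugging Face repo id
    local_dir="models/mochi-1-preview", # where to place the weights locally
)
print(f"Weights downloaded to: {local_dir}")
```

Because the license is Apache 2.0, what you generate from those weights is yours to use, no fees attached.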

ComfyUI Mochi-1 Module 🎉

But here's the thing: someone has taken it upon themselves to create a quantized version of the model and get it running inside ComfyUI on an ordinary RTX 4090. This means you can now generate a decent-quality video in about half an hour at 50 steps, or a very-good-quality one at 200 steps in about 2 hours 🕒.
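To see why quantization is the trick that makes this fit on a consumer card, here is a rough back-of-the-envelope sketch. The parameter count (roughly 10 billion for Mochi 1's diffusion transformer) is my assumption; only the 50-step / 30-minute and 200-step / 2-hour figures come from this article:

```python
# Back-of-the-envelope sketch. All inputs are assumptions except the
# 50-step ~30 min and 200-step ~2 h figures quoted above.
PARAMS = 10e9            # assumed ~10B parameters for Mochi 1's transformer
BYTES_FP16 = 2           # 16-bit weights
BYTES_FP8 = 1            # 8-bit quantized weights
RTX_4090_VRAM_GB = 24

weights_fp16_gb = PARAMS * BYTES_FP16 / 1e9  # ~20 GB: little room left for activations
weights_fp8_gb = PARAMS * BYTES_FP8 / 1e9    # ~10 GB: fits comfortably on a 24 GB card
print(f"fp16 weights ~{weights_fp16_gb:.0f} GB, fp8 weights ~{weights_fp8_gb:.0f} GB "
      f"(RTX 4090 has {RTX_4090_VRAM_GB} GB)")

# Generation time scales roughly linearly with sampling steps:
minutes_per_step = 30 / 50                   # ~0.6 min/step at the quoted 50-step timing
print(f"200 steps ≈ {200 * minutes_per_step / 60:.0f} hours")  # ≈ 2 hours, matching the article
```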


You can find more information about this amazing achievement in the [ComfyUI Mochi-1 module](https://github.com/kijai/ComfyUI-MochiWrapper) repository. This is a huge step forward for the industry, and we're excited to see where it takes us 🔜.
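If you want to try it yourself, installing a ComfyUI custom node usually comes down to cloning its repository into ComfyUI's custom_nodes folder and restarting ComfyUI. Here is a small sketch of that step; the ComfyUI install path is an assumption, only the repository URL comes from this article:

```python
# Sketch: install the ComfyUI-MochiWrapper custom node by cloning it into
# ComfyUI's custom_nodes directory. The ComfyUI location below is an assumption;
# point it at your own ComfyUI checkout, then restart ComfyUI to load the node.
import subprocess
from pathlib import Path

comfyui_dir = Path.home() / "ComfyUI"          # assumed install location
custom_nodes = comfyui_dir / "custom_nodes"

subprocess.run(
    ["git", "clone",
     "https://github.com/kijai/ComfyUI-MochiWrapper",
     str(custom_nodes / "ComfyUI-MochiWrapper")],
    check=True,
)
```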

The Quality: A Game-Changer? 🤔

But what about quality? We took exactly the same prompt from OpenAI's long-promised Sora and here's the result 🤯. As you can see, Mochi 1 is capable of producing high-quality videos that are almost indistinguishable from real-life footage 🔥.

What's Next? 🤔

Looking further ahead, Genmo promises to release the full version of Mochi 1 by the end of this year 📆. It will include a 720p HD version with enhanced fidelity and even smoother motion 🎥. They're also working on image-to-video capabilities, which is going to be huge for the industry 🔜.

The Future: Integration and Innovation 💡

We're already seeing more and more companies integrate Mochi 1 into their products 📈. Heck, we're planning to add it to our own new project, KraftStream 🤯. This is just the beginning of a new era in AI-generated video, and we can't wait to see what's next 🔜.

Conclusion: The Future Is Now! 🎉

In conclusion, Mochi 1 is an open-source Text2Video model that's changing the game for the industry 🤯. With its high-quality output and permissive, fee-free Apache 2.0 license, it's a must-try for anyone looking to get into AI-generated video 🔥.

So, what are you waiting for? Try it out yourself and see the magic of Mochi 1 in action 🔮. And if you're interested in learning more about this technology, be sure to check out our blog at https://blog.wiregate.io 📚.

Don't forget to like and subscribe for more updates on the latest developments in AI-generated videos! 👍

Prompt 📝

```
What you are looking at right now is not what you think!
This moment never happened in real life.
This scene is completely artificial.
Want to learn more?
Stay with me and I’ll show “How deep the rabbit hole goes?”.
Hi there! 👋 My name is Niki.
Today is October 30, 2024, and you are watching Breaking Brains.
About a week ago a startup company Genmo released an open source Text2Video model called mochi-1-preview.
And as you can see the first impression results are not good… 😕 for 3D Animators, Movie Actors, and Movie Editors 😁
I would strongly recommend them to start looking for a new job as this little AI rocks! 🤘
Let's dig deeper to understand why this model is so important and what makes it so special.
Most probably you’ve already seen some AI-generated videos and now thinking: “What’s the fuss about? nothing new!”
First and foremost Mochi 1 is released under Apache 2 license.
This means that One can use the output and derivatives of this little AI without paying any monthly subscription or royalty fees.
Just download and use it!
Hi Stability AI!
However, there is a catch.
There is always a catch.
To run this model you need at least four H100 NVIDIA GPU’s ~ about 25 grand each.
Genmo’s Mochi 1 model allows One to generate videos up to 5 seconds with 480p resolution.
Thanks to this guy ☝️ you don’t have to sell your kidney to try Mochi 1.
So what he did is literally brain-breaking.
On the same day that mochi-1-preview was released, he created a quantized version of the model and forced it to run inside of the ComfyUI on an ordinary RTX 4090.
Which means it now takes about half an hour to generate a decent quality of Alien versus Mario dream with only 50 steps.
Or very good quality with 200 steps in about 2 hours.
Anyway... Performance is great!
What about quality?
We took exactly the same prompt from the long-promised SORA AI and here is the result.
This meme will explain everything.
By the way, almost everything that you saw was also generated by this little AI.
What’s next?
Looking further these guys are promising to release a full version of Mochi 1 by the end of this year.
It will include a 720p HD version with enhanced fidelity and even smoother motion.
By the way, they are working on image-to-video capabilities.
That's one small step for AI, one giant leap for mankind.
We now will see that more and more companies are integrating this little AI into their products.
Heck… We are already planning to add it to our new project called KraftStream.
You don’t have to believe us.
Just try it yourself.
Like the content we create? - check out our blog.
Link in the description.
Don’t forget to Like and Subscribe!
See you in the next one!
```
Given above script for the video write a LinkedIn article at least 500 words.
Include a lot of emoji's in the text to make it more rich.
You must write in Markdown syntax.
You must use `##` for marking the paragraphs.
You must use brackets and parenthesis for the links.
Author of the text is Mikhail Shevtsov.

Link to the blog: https://blog.wiregate.io
You must include following links in the article:
- https://github.com/kijai/ComfyUI-MochiWrapper - ComfyUI Mochi-1 module