With Runway you can use powerful machine learning models in a simple way, allowing you to experiment with new creative workflows. Now, you can also connect models together, so you can explore an even wider set of possibilities.
The latest release of Runway (Beta version 0.7.0) introduces a new experimental 🧪 feature: you can now use the output of one machine learning model as the input for another one 🤯.
This is something we have wanted to build for a long time, and it has been one of the most requested features from our users. So much so that some of them went ahead and built their own model-chaining prototype interfaces:
> A Cambrian explosion of DIY machine learning is nearing: training and chaining models will be way more accessible w/ tools like @runwayml + @lobe_ai. Hopefully chaining will be as easy as this @framer demo I made -- it talks to pre-trained models in Runway's beta. pic.twitter.com/8yNOjV3PmU
> Drew Stock (@drewbuttons), February 18, 2019
With all the feedback we have gathered and the great suggestions from all of you, we are really excited to make this new experimental feature available for everyone to try. You can chain up to five models together (CPU- or GPU-based), and you can run and stop them all at once.
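Conceptually, chaining is just a pipeline: each model's output becomes the next model's input. The sketch below illustrates the idea in Python with hypothetical stand-in "models" (plain functions); it is not Runway's actual API, just a minimal illustration of the data flow.

```python
# Conceptual sketch of model chaining: feed each model's output
# into the next model in the chain. The "models" here are toy
# stand-in functions, not real Runway models.

def chain(models, initial_input):
    """Run each model in sequence, piping output to input."""
    result = initial_input
    for model in models:
        result = model(result)
    return result

# Two toy "models": one produces a caption, the next stylizes it.
def caption_model(subject):
    return f"a photo of {subject}"

def style_model(text):
    return text.upper()

print(chain([caption_model, style_model], "a cat"))
# prints "A PHOTO OF A CAT"
```

In Runway, the same idea applies to real models: for example, an image-generation model's output frame could feed a style-transfer model's input.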
Check out the following resources to learn more about how to start using chaining in Runway.
More is coming!
This is our first public iteration of model chaining in Runway. We know that more powerful and complex workflows can be achieved with other types of interfaces, and we would love to hear from you. We can't wait to see what you create!
Try the new model chaining feature and let us know what you think! Download or update to the latest release here: https://runwayml.com/download