New Balance's Onur Yüce Gün on how Runway's AI capabilities are revolutionizing design
Onur Yüce Gün is a computational design specialist and instructor who currently leads the computational design team at New Balance Athletics, Inc. He has been developing computational design workflows and futuristic concepts, with a keen interest in dfAM (design for additive manufacturing), since his early career days in architecture at Kohn Pedersen Fox NY.

A frequent international lecturer, Onur has acted as a critic at MIT (where he earned his master's and PhD), Harvard, Columbia University, the University of Pennsylvania, and various other institutions around the world. He has helped develop innovative design curricula for some of the industry's leading institutions and supported software companies in implementing computational modeling workflows.

A longtime user of Runway, Onur spoke with us about how he's incorporating the platform into his daily work and teaching methods, and what makes the software so beneficial for both practitioners and instructors.

Hi Onur! It's been two years since we last spoke about how you brought Runway into your design processes at New Balance. How would you say things have changed since then?

In two years, a lot has changed. In terms of emerging AI tools, there's been a crazy increase in interest within the last six or more months. The dealbreaker is, of course, diffusion models coming into play and designers getting very excited about them, at times even getting addicted to them. With that, the challenge we face is this: if you can do so much, how do you make something relevant?

So for that, I think two things are needed. First, lots of insight and knowledge about what you're trying to do, and what you want to do. The second thing is having the right tool to work with. These two things will be pretty crucial moving forward. 

The last thing, specifically in terms of what's happening at New Balance, is that computational design has become a much clearer entity. And for the implementation of these AI tools, we have a lot more visibility and accessibility compared to two years ago.

Are you still working with Runway?

Yes. When I first started using Runway 2+ years ago, it was mostly about SGANs and model training. I did around 15-20 projects, both in-house and in my personal work, but my usage of Runway was probably every other month. Then, when Runway started releasing the Magic Tools, it became more of a tool I went back to weekly. And then, as the diffusion models came out and you started adding more Magic Tools, it became a tool that got used almost every day. It's also not just me using Runway anymore, but a team.

So the biggest change is the frequency of use. That has changed a lot. 

How do you find Runway works in a team setting?

I have to say, the way it works within a team setting is pretty amazing. That’s been mind-blowing for me. You can collaborate in real time, and it is platform agnostic, meaning PC or Mac doesn’t matter. The excitement for Runway is ramping up a lot. I’m seeing more and more people getting curious and jumping on the platform to connect and use it.

Outside of your work at New Balance, you also teach. How are you using Runway in your lessons?

I taught a workshop at the Penn State University Architecture Department two weeks ago, and for that, I got access for over 20 students to explore Runway. These were freshman-year students who were learning about creating spatial relationships by using rule-based systems.

Until then, they had been doing everything by hand and didn't know how to do drafting or modeling on computers. I walked in one day and gave this five-hour workshop using Runway. We used the Text to Image, Image to Image, Infinite Image, and image interpolation Magic Tools. Suddenly the students' vision and understanding of spatial concepts and relationships started widening. The final review became an interesting surprise for the instructors, because nobody expected students to come up with such conceptual visuals to describe their design intentions. The students learned by generating images, then making models inspired by those, and then referring back to the AI-generated images to describe their work.

In this instance, I think Runway worked as an accelerator for the understanding of spatial compositions, 3D clusters, and so forth. I've been teaching for years, and I can see Runway becoming an integral part of the toolset for teaching in design studios.

That’s great to hear. What specifically about Runway makes it a great platform for those teachable moments?

The learning curve is a simple line, not a parabolic curve. One of the challenges we had in the past was trying to first teach technical skills to the students so they could use advanced digital tools to develop and represent their ideas. Now they have a tool to explicate and represent their ideas in a much easier way, using natural language. I think making something so accessible and so quick to learn is good, but it will also bring challenges. But for the purpose of this kind of accelerated learning (as in running a condensed workshop), I think it's really great.

What makes Runway unique compared to other software?

I believe advanced modeling software and specialized tools are here to stay, but Runway is opening up a new way of making by combining emerging visual design tools in one platform. What makes Runway really unique is that it offers all of these Magic Tools together. So once you create something, an unfolding starts: it's not the end, and every step you take leads your creation elsewhere. The other thing that makes Runway different is the ability to collaborate in real time. For instance, in the workshop I ran, the students were able to see what everybody else was doing right on the screen, so they could jump in, look into others' shared content, and learn from them. I find that sort of collaboration very powerful.

Can you tell us more about why you like the collaborative aspect?

Simply the fact that assets are available to everyone, ubiquitously and without any storage requirements. And then, the interface being platform agnostic. Our designers use both PCs and Macs, which has caused problems in the past.

The collaborative video interface is very flexible. We tend to simultaneously use different portions of the timeline. Now we can divide up the work and see what's happening elsewhere without having to send new links. The same goes for preparing presentations. In InDesign, I need to package everything and send it, and then there'll be a compatibility issue.

The third thing is the hardware requirements for computationally heavy tasks. When I think about the company-scale hardware, some people's laptops were renewed six months ago, some were not. That creates a huge discrepancy for anything we are processing. Just having that computational power and the functionality all in the same place is a game-changer.

In terms of larger conversations about tools like Runway and utilizing AI as part of our processes, what are you interested in exploring more? What are you trying to stay away from? 

The thing that I'm looking into directly, and encouraging other people to look into, is how to use these tools to create meaning and value, as opposed to creating noise. 

I'm trying to stay away from any tool that is super generative and super easy to use, which can give people an infinite play space. It's too fast. You create a lot of things, and let's say they are all beautiful. Well, if you have, like, a million good images, what do you do with them?

I think our understanding of design concepts needs to be altered. We need to filter out the noise. There needs to be some sort of intelligence and sensitivity in the way in which we use these tools. The fact that you can make images and videos and edit things rapidly doesn't mean everything is going to change overnight.

In architecture and consumer products, there’s a huge value chain that you need to be aware of. So, whenever you are dealing with these kinds of tools, you have to be aware that this is going to be plugged in somewhere and it should do something meaningful for you.

So that's what we are working on now. And I like the fact that you can handle this in different ways. You can use it for creative purposes, but also for data processing purposes. So it can be very technical, or it can be very artistic. I think that's going to be our goal: to use these tools to their full potential to create meaningful, performance-oriented, beautiful-looking products.

What are some of the tools on Runway that you’re using the most or finding the most interesting?

When you go into Magic Tools, the first five tools are the ones we use pretty extensively. Text to Image, Image to Image, Infinite Image, and Frame Interpolation: I use these a lot. I've been doing interpolation for such a long time in other software, and it was such a cumbersome process before. Having that done instantly is mind-boggling for me.

Are there any tools you would like to see more development in?

I'm interested in further development of the Custom Generator, and then 3D texture painting, Image Variation, and just more functionality across all the tools. I'm expecting something to emerge in the 3D space in addition to the Magic Tools in the future.

In terms of 3D tools, can you give me an example of one you'd love to see?

I am really interested in seeing further development of text-to-3D tools. Some emergent algorithms are able to generate 3D meshes with full texture maps. Similarly, there are models that can build 3D models from a single image. These tools are showing us where the technology has reached and how much further it can go in the near future!

If Runway takes that leap and implements something in 3D, it’s going to be a total game changer because it will become more of a design tool. For buildings and for products, that will be a dealbreaker.

Find more of Onur and his work on Instagram.