
Revolutionising Fashion Prototyping with AI and High-Performance Computing

Background

Suhasish Basak, a second-year PhD student in Digital Fashion at Ulster University’s Belfast School of Art, is exploring how artificial intelligence (AI) can transform how fashion designers create and test clothing ideas, especially before anything is physically made.

Traditionally, garment prototyping involves time-consuming hands-on work with fabrics and sketches. Suhasish’s work aims to make this process faster and more flexible by using generative AI tools to create realistic virtual designs.

He has developed two creative methods, or pipelines, which help bring clothing ideas to life:

  • A Text-to-Design Model that turns written descriptions into design images.
  • A Sketch-to-Design Model that transforms hand-drawn sketches into detailed, high-quality visuals.

Both use advanced AI technology (Stable Diffusion XL with ControlNet) to generate high-quality, lifelike images of garments at high resolution (1024x1024 pixels) in less than five minutes per design.

This work could make fashion design more accessible, reduce waste, and speed up early-stage prototyping, all without needing to cut or sew a single piece of fabric.

The Challenge: Making Fashion Prototyping Faster and More Sustainable

Creating physical samples of new clothing designs is often slow, expensive, and wasteful. Suhasish initially developed a prototype system using a macOS laptop, but generating just one virtual design took 40 to 80 minutes, far too long for real-time experimentation or collaboration.

To better understand designers’ needs, Suhasish surveyed 50 professional designers from the UK, India, and the US. They identified two major obstacles:

  1. A lack of easy-to-follow tutorials for advanced 3D design software.
  2. Frequent software crashes when creating complex virtual designs.

Suhasish also surveyed over 200 consumers about digital fashion products like AR garments and NFTs. The results showed that after price, design quality was the most important factor in buying decisions. Clearly, there was a need for faster, more reliable, and more accessible virtual design tools for both designers and end users.

The Solution: High-Performance Computing on Kelvin2

Access to NI-HPC’s GPU cluster has been transformative for Suhasish Basak’s research in digital fashion. He used memory- and speed-saving techniques, such as VAE slicing, UniPC scheduling (a fast diffusion sampler), and CPU offloading, to get the most out of the system.
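A minimal sketch of how these optimisations are typically enabled in the Hugging Face diffusers library. The model ID, step count, and exact settings here are illustrative assumptions, not the published configuration:

```python
# Sketch of the memory- and speed-saving setup, using Hugging Face's
# diffusers library. Model ID and step count are illustrative assumptions.
try:
    import torch
    from diffusers import StableDiffusionXLPipeline, UniPCMultistepScheduler
    DIFFUSERS_AVAILABLE = True
except ImportError:  # lets this sketch load on machines without a GPU stack
    DIFFUSERS_AVAILABLE = False

def build_optimised_pipeline(model_id="stabilityai/stable-diffusion-xl-base-1.0"):
    """Load SDXL with the optimisations described above."""
    pipe = StableDiffusionXLPipeline.from_pretrained(
        model_id, torch_dtype=torch.float16, variant="fp16"
    )
    # UniPC scheduling: a fast multistep sampler that reaches good quality
    # in roughly 20-30 denoising steps instead of 50+.
    pipe.scheduler = UniPCMultistepScheduler.from_config(pipe.scheduler.config)
    # VAE slicing: decode latents in slices to lower peak GPU memory use.
    pipe.enable_vae_slicing()
    # CPU offloading: park idle sub-models (text encoders, UNet, VAE) in
    # system RAM, moving each to the GPU only while it runs.
    pipe.enable_model_cpu_offload()
    return pipe

# Usage (needs a CUDA GPU and a one-off model download):
# pipe = build_optimised_pipeline()
# image = pipe("an emerald satin evening gown, studio lighting",
#              height=1024, width=1024, num_inference_steps=25).images[0]
```

Each of these optimisations trades a little speed or convenience for a large cut in peak GPU memory, which is what makes 1024x1024 SDXL generation practical on shared cluster nodes.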

His AI pipelines, which originally took up to 80 minutes per garment design, now complete in just 2–5 minutes on Kelvin2, making them fast enough for real-time creative feedback and interactive workshops with designers.

This dramatic speed-up has enabled:

  • Rapid, real-time design iterations during workshops and live sessions.
  • Scalable, sustainable virtual prototyping, replacing the need for physical samples.
  • Integration of advanced AI techniques (like VAE slicing and CPU offloading) for efficient GPU use.
  • A solid foundation for future developments, including eco-aware design metrics and brand-specific AI tuning.

Using Kelvin2 has turned an experimental concept into a fast, flexible, and practical design tool, empowering both creative exploration and commercial potential in digital fashion.

Turning Words and Sketches into Fashion Designs
Text-to-Design Model
Suhasish created a smart tool that helps fashion designers bring their ideas to life just by typing them. Designers can simply describe what they want, like the type of fabric, colours, patterns, and shape, and the tool turns those words into realistic digital clothing designs.

To make the designs look more realistic, the system includes a special feature called “enhance_fabric_realism.” This function automatically adds extra detail to the description. For example, if the fabric is satin, it might add the phrase “ultra-smooth sheen” to better reflect how satin looks in real life. This extra layer of detail helps the AI generate highly realistic and visually accurate garment images. Designers have full control over things like colour, fabric, and shape, making it easy to explore creative ideas without losing their personal touch.
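As a rough illustration, a feature like "enhance_fabric_realism" could be implemented as a simple prompt-augmentation step. The phrase table below is a hypothetical example, not the actual implementation:

```python
# Hypothetical sketch of an "enhance_fabric_realism" step: it appends
# fabric-specific detail phrases to the designer's prompt before the image
# model sees it. The phrase table is an illustrative assumption.
FABRIC_DETAILS = {
    "satin": "ultra-smooth sheen, soft light reflections",
    "denim": "visible twill weave, faded indigo texture",
    "wool":  "matte fibrous surface, fine knit detail",
    "silk":  "delicate drape, subtle lustre",
}

def enhance_fabric_realism(prompt: str, fabric: str) -> str:
    """Append realism cues for the named fabric, if any are known."""
    details = FABRIC_DETAILS.get(fabric.lower())
    return f"{prompt}, {details}" if details else prompt

print(enhance_fabric_realism("a fitted evening gown", "satin"))
# -> a fitted evening gown, ultra-smooth sheen, soft light reflections
```

Because the augmentation only appends cues rather than rewriting the prompt, the designer's own description of colour, fabric, and shape stays intact.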

Sketch-to-Design Model

Designers can also start from a quick drawing: they upload a hand-drawn sketch, and the system cleans it up and turns it into a realistic digital fashion design. It keeps the original style and details, so the final result stays true to the designer’s vision. Even when creating different versions, the tool makes sure key elements stay the same.
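A hedged sketch of how this kind of sketch conditioning is commonly wired up with ControlNet in the diffusers library. The model IDs, the conditioning scale, and the assumption that the uploaded sketch is already a clean line drawing are all illustrative:

```python
# Illustrative sketch-to-design conditioning with ControlNet: the ControlNet
# locks the garment's outline while the diffusion model fills in realistic
# fabric and colour. Model IDs and settings are assumptions.
try:
    import torch
    from diffusers import ControlNetModel, StableDiffusionXLControlNetPipeline
    from PIL import Image
    DEPS_AVAILABLE = True
except ImportError:  # lets this sketch load without the GPU stack installed
    DEPS_AVAILABLE = False

def sketch_to_design(sketch, prompt: str):
    """Generate a garment image that follows the lines of a sketch.

    `sketch` is assumed to be a PIL image already reduced to clean dark
    lines on a light background (an edge-map-like input).
    """
    controlnet = ControlNetModel.from_pretrained(
        "diffusers/controlnet-canny-sdxl-1.0", torch_dtype=torch.float16
    )
    pipe = StableDiffusionXLControlNetPipeline.from_pretrained(
        "stabilityai/stable-diffusion-xl-base-1.0",
        controlnet=controlnet, torch_dtype=torch.float16,
    )
    pipe.enable_model_cpu_offload()
    # controlnet_conditioning_scale trades fidelity to the sketch (higher)
    # against creative freedom for the model (lower).
    return pipe(prompt, image=sketch,
                controlnet_conditioning_scale=0.8,
                height=1024, width=1024).images[0]
```

Keeping the conditioning scale below 1.0 is one plausible way to let the model vary fabric and colour across versions while the silhouette, the "key elements", stays fixed by the sketch.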

Key Outcomes

  • Speeds up design: What used to take hours now takes just a few minutes. Designers can try new ideas quickly and even do live sessions.
  • Better for the planet: by replacing physical samples with virtual ones, the process helps reduce fabric waste and cut carbon emissions, supporting more eco-friendly design practices.
  • Designed with Users in Mind: Suhasish worked closely with fashion designers and tech experts through focus groups, using their feedback to shape the tool’s features. This helped tackle real-world issues like the lack of easy tutorials, which have slowed down the wider use of digital design tools.
  • Room to Grow: With access to high-performance computing, the system can be expanded further. Future upgrades may include specialised AI tuning using fashion-specific data, making the tool even smarter and more accurate.


Future Direction

Suhasish plans to embed biodegradability metrics by collaborating with textile chemists to incorporate end-of-product lifecycle data, and to integrate live fashion brand data so the system can generate brand-specific designs.

He also intends to develop micro-learning modules to onboard non-technical designers and to extend industry partnerships for commercial trials of this SaaS-style prototyping tool.

