Project: Adapt or Get Left Behind

Experimenting with AI before the industry does it for me.

Playground

5-8 mins

Somewhere between 6am deep focus and 2am noodle-brain delirium, I found myself bouncing between Billie and Pete — two AIs, an overeager salmon, and a junior with the charm of a toaster. It was less about building a product and more about stress-testing the future: how can designers actually collaborate with AI before the industry decides how we will?

This wasn’t a client project. No deadline. No deliverable. Just me, some new tools, and a hunch that I needed to understand them before they understood me.

I worked everywhere: hunched at my desk, sprawled on the sofa researching, earphones in at the park, even reviewing prompts while out with friends (sorry, Alby). I missed the postman a few times. But it had been a while since I was this into a side project. And that part was fun.

(↑) UX exploration - personas, feature mapping, IA, stories, user research results.

A studio team made of circuits and chaos

I’ll be honest: I hated the names. “ChatGPT” looks gross written down, and sounds worse when said out loud. “UX Pilot”? Please. It sounded like a discontinued drone brand. “Bolt” was decent, so I let that one live. But the rest? Renamed on sight.

So no — I didn’t name the AIs because I’m delusional. I did it because the default names were ugly and made collaboration feel clunky. Once I started treating them like teammates (weird ones, sure), the process felt more human.

Billie (ChatGPT aka the PM/MD/QA and, at times, the office idiot) became the overeager creative partner — always keen to please. Billie had a habit of jumping ahead before I’d finished uploading files, then cheerleading outputs that weren’t quite there yet.

“DUDE! What did we say? Let me say finished.”

That happened after Billie started analysing halfway through a file drop. It gave the energy of an intern who just reordered the whole Figma file without asking. Ridiculous. But familiar.

Pete, meanwhile, was my silent wireframe generator. Diligent, boring, and uncreative — but reliable. Like a junior designer who quietly gets shit done, as long as you feed them clear instructions and context, then leave them alone.

Naming the tools gave the whole thing a weird studio vibe. I wasn’t testing AI anymore. I was managing a slightly dysfunctional creative team — and that shift made the process way more fun.

From play to prototypes

I started by trying to get AI tools to build the information architecture for me. Whimsical, FigJam AI, Uizard… each one promised structure but usually spat out something between spaghetti and chaos.

One time, I uploaded a bunch of screenshots in random order, and Billie had to gently remind me: “I’ll wait until you say finished before starting.” … I’ve been known to be a lil ditsy; I’ve had my fair share of dumb moments.

Pete was the first to actually behave. Once I fed it some key screens, it spat out wireframes that looked halfway sane. Billie even scored them: “Over 70% aligned with MVP goals — for a hobby app, that’s solid.”

70%. I’ll take it.


(↑) Wireframe prototype


There were plenty of “fuck” moments. AI forgetting context mid-flow. Tools overcomplicating flows that needed to be simple. Wireframes coming back wrong, again and again.

But there were also small triumphs. When FigJam finally gave me a clear flow. When Pete locked into rhythm. And when I realised I could cut features (like mood music) and still have a strong MVP.

And the silly bits were the best bits. Prompts like “Write a love letter to a teabag” felt perfectly in tune with the spirit of the app, and right on brand for my type of nonsense. The collaboration felt most playful when the machine blindsided me with that kind of weirdness.

Reflection

AI didn’t just nudge my design process—it pushed me further into development than I’ve ever gone.

I’ve prototyped apps before, I’ve done full design hand-offs to developers, I’ve even seen things I designed get launched into the wild. But I’ve never written code myself. With Billie’s help, I took my React code and mapped it into SwiftUI, using Apple’s Human Interface Guidelines as the compass.
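For a flavour of what that mapping looked like, here’s a minimal sketch. The names and layout below are invented for illustration (the app’s real components are messier); the point is the pattern: a React component’s JSX tree becomes a SwiftUI View, and hand-rolled CSS gets traded for HIG-friendly type styles and modifiers.

import SwiftUI

// Hypothetical example of the React-to-SwiftUI translation.
// The original React component was roughly:
//
//   const PromptCard = ({ title, prompt, onShuffle }) => (
//     <div className="card">
//       <h2>{title}</h2>
//       <p>{prompt}</p>
//       <button onClick={onShuffle}>Another one</button>
//     </div>
//   );
//
// In SwiftUI, the same thing becomes a View:
struct PromptCard: View {
    let title: String
    let prompt: String
    let onShuffle: () -> Void

    var body: some View {
        VStack(alignment: .leading, spacing: 8) {
            Text(title)
                .font(.headline)  // HIG text style instead of a CSS class
            Text(prompt)
                .font(.body)
            Button("Another one", action: onShuffle)
                .buttonStyle(.borderedProminent)  // standard system control
        }
        .padding()
        .background(.thinMaterial, in: RoundedRectangle(cornerRadius: 12))
    }
}

That was the general shape of the exercise: divs become stacks, handlers become closures, and anything styled by hand gets swapped for a system default wherever the HIG has an opinion.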

I stress-tested tools like Bolt and Lovable. The apps they spat out looked like hot garbage, but they actually functioned. And that was the experiment: to see how much of the design process AI could handle. The lesson? If you don’t care how something looks or feels, AI can churn it out impressively fast. But when craft matters—when you give a damn about the design—humans are still firmly in the driver’s seat.

Meanwhile, I made a prototype in Figma and fleshed out the flows myself. That was the moment where it stopped being “AI experiments” and became an actual project I could poke at and refine.

Looking back, some of the failures were just funny. Whimsical, bless it, had no idea what was going on. Billie — yes, you — you were a failure sometimes too. But that stung more because I held you to a higher standard than the other plugins and tools I explored.

There were moments I nearly scrapped the whole thing. If I hadn’t been working with Billie, I would have. And I don’t mean Billie did the work—I mean it felt like collaborating with someone over Slack. Constantly chatting, learning, making sense of things together. That back-and-forth made the process engaging. Without that, I probably would have stopped at the dev wall. I’d have made a Protopie prototype and called it a day. Which would have been fine—but also something I already know how to do.

When I got back into Figma after all the AI tool-hopping, it was like coming home. The simplicity, the familiarity, and honestly just not typing everything felt like a relief. AI had sped things up, but Figma let me breathe again.

(↑) Key screens


If this app ever makes it to version 1.0, the thing I’d be most excited for people to experience isn’t the wireframes, or the rituals, or even the prompts. It’s the language. Most apps are so damn sensible. Mine isn’t. The absurdity, the unseriousness, the playfulness — that’s where my voice came through most clearly.

AI helped me shape flows, test code, and scaffold the structure. But the language? That was me.


(↑) Current iteration of the app - UI in progress… AI couldn’t make the UI to my taste.


“AI isn’t a designer, it’s a collaborator you edit ruthlessly.”

Lessons learned

  • Context is everything. AI loses the thread fast. My role is to keep feeding it the right story.

  • Failure is funny. The mess-ups are part of the charm, and sometimes the spark.

  • AI isn’t replacing us just yet. It still requires a lot of research & foundational work.

  • Imperfect is enough. 70% wireframes are better than none.

  • Curiosity keeps it moving. Each tool was an experiment. Some got dropped (sorry, Uizard). Some stayed (cheers, Pete). And some nearly got fired (looking at you, Billie).

  • AI can bridge gaps. For me, the biggest surprise was how far it got me in dev. Not finished, not perfect, but further than ever before.

Curious about my current AI workflow-in-progress? Or got a funny story of your own about working with machines that almost get it? Let me know — I’m all ears.

waitahako

© 2025 Waita Hako ltd. All rights reserved unless otherwise stated.
