We Fed Our Mascot to AI - Here's What Happened

16th April 2025

Caitlin

AI didn't create this. We did. But it helped bring it to life.

We built Pecobot to give our systems a face. He announces when something deploys, when something breaks, and when things move. He's not a mascot for fun. He's a character born from function. A visual voice for automation.

He started as a sketch, drawn by Hajni, our head designer. Over time, he found his way into Slack messages, plush prototypes, and internal jokes. He became a part of how we worked and communicated, a quiet representative of the machine behind the scenes.

Recently, we ran a test: what happens when you run a fully human-made character like Pecobot through modern AI tools? What gets lost? What carries through? And can we use those tools without losing authorship?

How We Did It

We didn't use AI to replace our ideas. We used it to support them.

Here's the process we followed:

  • We designed Pecobot from scratch as a 2D illustration
  • We built a 3D render using Blender
  • We made a plush prototype with a manufacturing partner
  • We wrote a highly detailed visual prompt using ChatGPT
  • We generated a static illustration based on that prompt
  • We animated it in Runway, keeping the motion minimal

The most important part was the prompt itself. We didn't leave the visual direction open. We treated ChatGPT like an art director's assistant, carefully designing the scene before we moved to any visual tool.

"Create a vibrant, playful scene featuring the uploaded character smiling and waving. Place the character in front of an imaginative, softly illustrated automation pipeline. Use pastel colors, soft gradients, light sketchy outlines, and an artistic, dreamy texture, similar to a hand-drawn, slightly surreal style. The pipeline should have flowing arrows, data packets, and soft mechanical details, keeping everything fun, abstract, and eye-catching. The character and background must blend together naturally in the same dreamy, colorful, illustrated style."

We weren't trying to see what AI could come up with. We knew what we wanted. The goal was to see whether the system could understand and respect what we'd already built.

Once we had the static image, we moved to Runway. But again, with guardrails. We only asked it to add motion. No new content. No story generation. Just light animation, a blink, a wave, a soft bounce. The less we asked of the model, the more of ourselves stayed in the output.

Why We Used ChatGPT and Runway

We used ChatGPT to help write the visual prompt. That gave us more control over what the image generator returned, because AI performs best when you know exactly what you want. From there, we passed the image into Runway.
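If you wanted to script the static-image step instead of working through the ChatGPT interface, it might look something like the sketch below. This is not what we ran: it's a hypothetical outline using the OpenAI images API, with the model choice as a placeholder and the prompt abbreviated from the one quoted above. Our actual workflow also uploaded Pecobot as a reference image, which a plain text-to-image call like this doesn't take.

```python
# Hypothetical sketch of the image-generation step, assuming the OpenAI Python SDK
# and an OPENAI_API_KEY in the environment. Illustration only; we worked through
# the ChatGPT and Runway interfaces rather than the API.
from openai import OpenAI

client = OpenAI()

# The art-directed prompt, abbreviated here; the full version is quoted above.
prompt = (
    "A vibrant, playful scene featuring Pecobot smiling and waving in front of "
    "a softly illustrated automation pipeline, pastel colors, soft gradients, "
    "light sketchy outlines, hand-drawn, slightly surreal style..."
)

result = client.images.generate(
    model="dall-e-3",   # placeholder model choice
    prompt=prompt,
    size="1024x1024",
)

# The returned URL points at the static frame we would then hand to Runway,
# asking only for light motion: a blink, a wave, a soft bounce.
print(result.data[0].url)
```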

The motion layer was minimal, but it gave Pecobot a sense of life. The key was giving the AI less to do. The more you ask it to invent, the more it drifts. The more you constrain it, the more it reflects your original vision.

That's how we see AI in our process: a tool for production, not invention.

The Results (And A Few Surprises)

What came back felt familiar, but slightly uncanny. Pecobot didn't feel like he had changed. But how people saw him did.

Some assumed the whole thing was AI-generated, even though the concept, design, copy, and tone were all ours. The system just animated what we had already made. Still, the visual aesthetic carried the baggage of generative work. It blurred authorship.

That ambiguity is what stood out most. Not what AI made, but what people thought it made.

Our Take

We're not using AI to replace our creative process. We use it where it makes sense, to speed things up, prototype faster, or add polish. But the vision, the ideas, and the strategy always come from people.

That's not just a value statement. It's how we get better. Tools don't improve your taste. Making things does.

Pecobot is still ours. Every part of him was made by hand, from his personality to the last pixel. But now, he moves. And that movement came from a tool we controlled, not one that controlled us.

We're not using AI to come up with ideas. We'll keep using it where it helps, but always working from our own designs.
