We built Pecobot to give our systems a face. He announces when something deploys, when something breaks, and when things move. He's not a mascot for fun. He's a character born from function. A visual voice for automation.
He started as a sketch, drawn by Hajni, our head designer. Over time, he found his way into Slack messages, plush prototypes, and internal jokes. He became a part of how we worked and communicated, a quiet representative of the machine behind the scenes.
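To make that concrete: Pecobot's announcements are ordinary automation messages with his name and face attached. Here's a minimal sketch of one, assuming a Slack incoming webhook; the URL, emoji, service name, and wording are placeholders rather than our actual setup.

```python
import requests

# Illustrative only: a deploy announcement posted to Slack via an incoming webhook.
# The webhook URL, emoji, service name, and wording are placeholders, not our real setup.
SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"


def announce_deploy(service: str, version: str) -> None:
    payload = {
        "username": "Pecobot",      # display overrides apply to legacy webhooks;
        "icon_emoji": ":pecobot:",  # newer Slack app webhooks may ignore them
        "text": f":rocket: {service} {version} just deployed.",
    }
    response = requests.post(SLACK_WEBHOOK_URL, json=payload, timeout=10)
    response.raise_for_status()


announce_deploy("billing-api", "v1.42.0")
```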
Recently, we ran a test: what happens when you run a fully human-made character like Pecobot through modern AI tools? What gets lost? What carries through? And can we use those tools without losing authorship?
We didn't use AI to replace our ideas. We used it to support them.
Here's the process we followed:
The most important part was the prompt itself. We didn't leave the visual direction open. We treated ChatGPT like an art director's assistant, carefully designing the scene before we moved to any visual tool.
"Create a vibrant, playful scene featuring the uploaded character smiling and waving. Place the character in front of an imaginative, softly illustrated automation pipeline. Use pastel colors, soft gradients, light sketchy outlines, and an artistic, dreamy texture, similar to a hand-drawn, slightly surreal style. The pipeline should have flowing arrows, data packets, and soft mechanical details, keeping everything fun, abstract, and eye-catching. The character and background must blend together naturally in the same dreamy, colorful, illustrated style."
We weren't trying to see what AI could come up with. We knew what we wanted. The goal was to see whether the system could understand and respect what we'd already built.
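For readers who would rather script this step than work in the ChatGPT interface, a rough equivalent via OpenAI's Images API might look like the sketch below. A plain text-to-image call can't see an uploaded reference, so a written description of Pecobot has to live in the prompt itself; the model name, output size, and that substitution are our assumptions here, not part of the original workflow.

```python
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

# Wiring the prompt through the Images API is an illustration, not the exact
# workflow from the post (we worked in ChatGPT with the original artwork uploaded).
PROMPT = (
    "Create a vibrant, playful scene featuring Pecobot smiling and waving. "
    "Place the character in front of an imaginative, softly illustrated "
    "automation pipeline. Use pastel colors, soft gradients, light sketchy "
    "outlines, and an artistic, dreamy texture..."
    # rest of the prompt quoted above, with a written description of Pecobot
    # standing in for "the uploaded character"
)

result = client.images.generate(
    model="dall-e-3",  # assumed model; any text-to-image model would do
    prompt=PROMPT,
    size="1024x1024",
    n=1,
)
print(result.data[0].url)  # URL of the generated still frame
```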
Once we had the static image, we moved to Runway. But again, with guardrails. We only asked it to add motion. No new content. No story generation. Just light animation: a blink, a wave, a soft bounce. The less we asked of the model, the more of ourselves stayed in the output.
The order mattered. ChatGPT helped us write the visual prompt, which gave us more control over what the image generator returned, because AI performs best when you know exactly what you want. Only once the still image was right did we hand it to Runway.
The motion layer was minimal, but it gave Pecobot a sense of life. The key was giving the AI less to do. The more you ask it to invent, the more it drifts. The more you constrain it, the more it reflects your original vision.
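That last step can be scripted as well. The sketch below assumes Runway's image-to-video developer API and its Python SDK; we actually did this in the Runway app, and the model identifier, field names, and image URL here are assumptions rather than a record of our setup.

```python
import time

from runwayml import RunwayML  # assumed: Runway's official Python SDK

# A sketch of the "motion only" step. We worked in the Runway app; the model
# name and fields below are assumptions and may not match the current API exactly.
client = RunwayML()  # expects RUNWAYML_API_SECRET in the environment

task = client.image_to_video.create(
    model="gen3a_turbo",
    prompt_image="https://example.com/pecobot-scene.png",  # the still from the previous step
    prompt_text=(
        "Subtle motion only: the character blinks, waves, and bounces softly. "
        "No new elements, no camera moves, no scene changes."
    ),
)

# Poll until Runway finishes rendering the short clip.
task_id = task.id
while True:
    task = client.tasks.retrieve(task_id)
    if task.status in ("SUCCEEDED", "FAILED"):
        break
    time.sleep(5)

print(task.status, getattr(task, "output", None))  # output holds the video URL(s) on success
```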
That's how we see AI in our process: a tool for production, not invention.
What came back felt familiar, but slightly uncanny. Pecobot didn't feel like he had changed. But how people saw him did.
Some assumed the whole thing was AI-generated, even though the concept, design, copy, and tone were all ours. The system just animated what we had already made. Still, the visual aesthetic carried the baggage of generative work. It blurred authorship.
That ambiguity is what stood out most. Not what AI made, but what people thought it made.
We're not using AI to replace our creative process. We use it where it makes sense, to speed things up, prototype faster, or add polish. But the vision, the ideas, and the strategy always come from people.
That's not just a value statement. It's how we get better. Tools don't improve your taste. Making things does.
Pecobot is still ours. Every part of him was made by hand, from his personality to the last pixel. But now, he moves. And that movement came from a tool we controlled, not one that controlled us.
We're not using AI to come up with ideas. We'll keep using it where it helps, but strictly on the basis of our own designs.