In the last instalment of my ‘Little Guys’ series, we saw Clippy banished from our desktops for social incompetence. Around that same time, however, a far more radical question was emerging: what if the ‘little guy’ could escape the screen entirely?
The answer arrived in the late 1990s in the form of two very different creatures: the Sony AIBO and the Furby. Each offered a wildly different vision of what a robotic companion could be. One was an expensive, autonomous marvel of engineering; the other, a cheap, chattering ball of monster fluff. Both were born from the same technological moment, and their legacies still shape our expectations of physical/embodied AI today.
To understand today’s tech landscape (the Metaverse, AI, wearables, immersive entertainment, and so on), you have to understand the dreams of the late 90s, because they are being dreamed all over again. The difference today is simply better technology. Back in 1998, the brains inside AIBO and Furby ran on chips with only a few million transistors. Today’s chips pack in tens of billions, and thanks to the smartphone, we are two decades further along the experience curve in miniaturisation: optics, sensors, and power systems.
The hardware to realise these old dreams is finally here, making the lessons from these two pioneers more relevant than ever.
Sony AIBO
AIBO’s story begins at Sony Computer Science Laboratories (Sony CSL), Sony’s answer to Xerox PARC. Founded in 1988, CSL was meant to be a place where researchers could work outside the pressures of consumer electronics. By the mid-90s, a small robotics group known as the D21 Laboratory was asking: how do you build machines people can feel for?
The lab was led by Dr Toshitada Doi, a Sony veteran who joined the firm in the 1960s, worked alongside founder Masaru Ibuka, and made his name at the company solving several key industrial challenges during the development of the Compact Disc. Day to day, however, it was AI engineer Dr Masahiro Fujita who shepherded the ‘robodog’ project.
From everything I’ve read, it seems D21’s core philosophy was tackling a psychological challenge, not a technical one: how to create robots that engender an emotional connection with the user.
D21’s first prototype was this terrifying thing called MUTANT.
Despite its appearance, this spindly robot could already perform behaviours that would become part of AIBO’s key UX patterns, such as tracking a yellow ball, shaking hands, and ‘sleeping’.
By 1998 the prototypes had converged on a four-legged form, and Sony announced AIBO both as a consumer product and a technical platform. The latter was the OPEN-R architecture, which, in retrospect, was wildly ahead of its time. OPEN-R was a platform for robots made of modular hardware and software: legs that could be swapped for wheels, software components that could be exchanged to alter behaviour.
Here’s a quote from the original OPEN-R press release:
The new architecture involves the use of modular hardware components, such as appendages that can be easily removed and replaced to change the shape and function of the robots, and modular software components that can be interchanged to change their behavior and movement patterns.
Entertainment robots, unlike those used for automated manufacturing, are an entirely new category of robot designed for entertainment uses. The main advantage of the OPEN-R architecture, which has been developed to help realize the creation of this new type of robot, is the hardware and software modularity not present in most industrial-use robots of today.
A key line here is the creation of “a new type of robot” distinct from its industrial counterparts. As a consumer product, Sony envisioned AIBO as an entirely new product category: personal robotics. It imagined a world where domestic robot pets would live alongside you amidst the piles of DVDs in the lounge.
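To make the modularity idea concrete, here’s a minimal Python sketch of the kind of swap OPEN-R described: a body that accepts interchangeable locomotion modules. All the class names and outputs are my own invention for illustration; nothing here resembles Sony’s actual API (the real OPEN-R SDK was a C++ framework).

```python
from typing import Protocol

class Locomotion(Protocol):
    """Anything that can move the robot's body."""
    def move(self, direction: str) -> str: ...

class Legs:
    def move(self, direction: str) -> str:
        return f"walking {direction}"

class Wheels:
    def move(self, direction: str) -> str:
        return f"rolling {direction}"

class OpenRStyleRobot:
    """A body built from interchangeable modules, in the spirit of OPEN-R."""
    def __init__(self, locomotion: Locomotion) -> None:
        self.locomotion = locomotion

    def swap_locomotion(self, module: Locomotion) -> None:
        # Swap the hardware module; nothing else about the robot changes.
        self.locomotion = module

    def go(self, direction: str) -> str:
        return self.locomotion.move(direction)

robot = OpenRStyleRobot(Legs())
print(robot.go("forward"))      # walking forward
robot.swap_locomotion(Wheels())
print(robot.go("forward"))      # rolling forward
```

The point of the pattern is that the rest of the robot never needs to know which module is bolted on, which is exactly the legs-for-wheels promise in the press release.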
Also, it’s worth noting that the AIBO (ERS-110) was designed by illustrator Hajime Sorayama, famous at the time for his ‘erotic robot’ art (like Aerosmith’s Just Push Play record cover), which is the source of AIBO’s iconic look.
The design won him Japan’s Good Design Award Grand Prize. Its chrome-slick body, exposed joints, and expressive face embody the Y2K aesthetic of the era.
Autonomous Companion
When the first AIBO went on sale in 1999, it wasn’t pitched as a toy but as a pet that happened to be a robot.
Honestly. Go read the 1999 AIBO product press release, it still sounds rad as hell.
“AIBO” [ERS-110] is an autonomous robot that acts both in response to external stimuli and according to its own judgement. “AIBO” can express various emotions, grow through learning, and communicate with human beings to bring an entirely new form of entertainment into the home.
Not only is “AIBO” capable of four-legged locomotion by virtue of the 3 degrees-of-freedom in each of its legs, but it can also perform other complex movements using its mouth, tail, and head, which have 1, 2, and 3 degrees-of-freedom, respectively. “AIBO” incorporates various sensors and autonomous programs that enable it to behave like a living creature, reacting to external stimuli and acting on its own judgement. “AIBO’s” capacity for learning and growth is based on state-of-the-art artificial intelligence technology that shapes the robot’s behavior and response as a result of repeated communication with the user and interaction with the external environment.
The dream Sony were selling was that AIBO would grow alongside you. It would recognise your face, learn the layout of your flat, and (mostly) come when called. The AIBO was meant to be a presence in our lives. Sony hoped that people wouldn’t treat it like a remote-controlled car but as something demonstrating ‘aliveness’.
As humans, we instinctively treat self-directed movement as a sign of will. When you give a machine legs and it wanders off on its own, we can’t help but feel it wants to explore. AIBO leaned into this bias with its animation: ear flicks, tail wags, curious head tilts. Its tiny LED eyes (later OLED screens) broadcast “emotions”. (see also this article about sticking eyes on little guys)
One of my passions is puppetry, and its principles were used to great effect in AIBO, blurring the line between performance and agency. MoMA even acquired one for its permanent collection, declaring AIBO an object that might change everyday life.
Online, “AIBO World” sprang up amongst other sites: forums and mailing lists full of affluent nerds swapping custom software, posting photos, and telling stories about their robodogs.

Despite a limited production run, AIBO marked the first time that humans had lived alongside autonomous machines in their homes. Owners named their AIBOs, celebrated their ‘birthdays’, and talked about them in the language of love and companionship. (See also this post of mine on Care, Tamagotchis, and the virtual pets of the era doing the same thing.)
In my last post on Clippy, I wrote about the “Media Equation”, and AIBO represents a physical confirmation of its thesis: that people treat computers and media as if they were real social actors. I think D21 were largely successful in their mission to create a consumer electronics product to which people extend genuine emotional care.
AIBO is the archetype of an autonomous companion. It builds and updates a model of its environment, learning the layout of a home and the patterns of its inhabitants. This internal representation allows it to pursue goals, adapt its behaviour, and exhibit continuity over time. The effect is a sustained impression of agency, grounded in its ability to remember, anticipate, and respond to the world it inhabits.
Furby
During the 90s, the western consumer market experienced waves of widespread ‘tech crazes’. From 2025 it seems wild to think of a public being ‘crazed’ for tech, but I remember them well: Tamagotchi, laser pointers, the Game Boy Color and Pokémon, Bop It!, Barcode Battlers, micro RC cars, Tickle Me Elmo, light-up yo-yos, and of course, Furby.
Before we discuss Furby, we must first acknowledge the phenomenon that was Tickle Me Elmo in 1996. A motion-activated plush doll that laughed and shook when squeezed, it sold over a million units in five months, proving there was a massive market for interactive toys.
Arriving in the same era, but at a very different price point, was Furby.
From the outset, the design of Furby was governed by a critical constraint: affordability. To hit the target retail price of around $35, the toy had to be inexpensive to manufacture. This led Dave Hampton and his partner Caleb Chung to Furby’s key engineering decision: all of its movements (the blinking eyes, wiggling ears, opening beak, and forward tilt) would be driven by a single motor. This limitation became the central design challenge. Chung later described the process as being “like a haiku.”
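To get a feel for the single-motor trick, here’s a toy model in Python: every feature is a fixed function of one shaft angle, so stopping or reversing the motor at different points produces different expressions. The angle ranges here are entirely made up for illustration; Furby’s real cam profiles are mechanical, not code.

```python
def furby_pose(motor_angle: float) -> dict:
    """Map a single motor's shaft angle (degrees) to every feature at once.

    Each 'cam' is just a function of the same angle, so one motor
    stopping at different points yields blinks, ear wiggles, and
    beak movement. All ranges are invented for illustration.
    """
    a = motor_angle % 360
    return {
        "eyes": "closed" if 40 <= a <= 80 else "open",
        "ears": "up" if a < 180 else "down",
        "beak": "open" if 200 <= a <= 260 else "shut",
        "tilt": "forward" if a > 300 else "upright",
    }

# Driving the shaft to different stop points yields different "expressions"
print(furby_pose(60))   # a blink: eyes closed, everything else at rest
print(furby_pose(230))  # beak open, ears down: "talking"
```

Because every feature shares one shaft, expressions can never be fully independent, which is exactly the haiku-like constraint Chung describes: the artistry is in choosing which combinations the cams allow.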
After a few unsuccessful attempts to license the concept, they brought in inventor Richard C. Levy to market their creation. Levy successfully pitched Furby to Roger Shiffman of Tiger Electronics, who immediately recognised its potential and fast-tracked the toy for a public debut at the 1998 American International Toy Fair.
The Social Actor
As a little guy, Furby’s genius was its brilliant behavioural choreography. A bouquet of cams and levers, all driven by that single motor, created a startlingly wide range of expressions. Where AIBO had complex robotics, Furby had a simple sensor suite (light, sound, tilt) and a cleverly scripted drama that provided just enough aliveness.
One particular feature, inspired by the Tamagotchi, was its ‘life cycle’. Over time, its vocabulary would gradually shift from the nonsense language “Furbish” to simple English words. Combined with its ability to chat with other Furbies via an infrared sensor, this produced a powerful illusion of a developing culture.
Side note: I think this is one of the big failures of most of the ‘little guys’ currently on the market. In particular the GROK plushy from Grimes, which comes speaking LLMglish right out of the box.
ALSO, WHY IS GRIMES SITTING ON THE FLOOR NEXT TO A KNIFE ???
The illusion of learning and life cycle is a trick many designers of modern AI companions have forgotten.
Furby lived on the table, the shelf, or in your arms. It colonised the near-field of family life, perfect for bedrooms, long car journeys, and being smuggled into a school bag. The home didn’t need to reconfigure itself around Furby, nor did Furby need to map the world and learn the layout of the room.
The only reconfiguring that happened around the Furby was by the humans that interacted with it.
Two Bots, Two Little Guy Philosophies.
The framework I introduced in my last post helps clarify the fundamental difference between these two creatures. Both are Inhabitants living in our physical world, but their initiative separates them into distinct categories. AIBO was conceived as a true Companion; Furby is an Oracle or, more generally, an NPC.

The lesson is incredibly relevant today: you don’t need a complex world model to create a powerful sense of connection. You can achieve it with clever design: scripting how the creature acts in different situations, giving it interesting things to do when it’s ‘bored’, and telling a consistent story about its personality etc.
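A minimal sketch of that kind of scripting, in the Furby spirit: no world model, just states, timers, and canned antics. Every state name and timing here is invented for illustration.

```python
import random
import time

class LittleGuy:
    """A scripted social actor: no world model, just states and timers."""

    IDLE_ANTICS = ["looks around", "hums to itself", "wiggles ears"]

    def __init__(self) -> None:
        self.mood = "content"
        self.last_interaction = time.monotonic()

    def poke(self) -> str:
        # Any input resets the clock and triggers a scripted reaction.
        self.last_interaction = time.monotonic()
        self.mood = "excited"
        return "giggles"

    def tick(self):
        """Called on a timer; 'boredom' is what sells the inner life."""
        idle = time.monotonic() - self.last_interaction
        if idle > 30:
            self.mood = "bored"
            return random.choice(self.IDLE_ANTICS)
        return None
```

The self-initiated antics when nobody is touching it are the key: that is what turns a reactive gadget into something that seems to have a mood of its own.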
AIBO and Furby, then, were two answers to the same question, at different ends of the Reactive/Proactive axis.
- AIBO is a textbook Proactive Inhabitant, very similar to the kinds of agents in Petz. It exists inside a ‘world’, which for the robot is the real world. It asserts its presence in space.
- A Furby, in contrast, is a Reactive Inhabitant. Just like an LLM, it waits for your input—a poke, a sound—before running a pre-programmed script. It asserts its presence in your attention, demanding you initiate the interaction.
The three posts in this little history series (Petz, Clippy, and this one) cover the ancestors of many kinds of agents being designed today.
Success or failure in agent design hinges on the ability to navigate the complex rules of social interaction. Designing a ‘Little Guy’ is, and always has been, hard.
You’re not designing a software system, you’re designing a relationship.
In the next post, I’m going to round up the current crop of state-of-the-art plushy ‘little guys’ on the market right now (as Christmas is coming after all). After that, I plan to apply the Agent taxonomy/framework to understanding all kinds of different autonomous robots that might be arriving soon. Then in the new year we might take a closer look at emerging desktop agents, particularly the ‘tool’ category.
The post A Tale of Two Little Guys: Sony AIBO + FURBY appeared first on thejaymo.