Amazon should be most famous for getting credit for other people’s achievements. Instead, the ubiquitous internet company that couldn’t even come up with its own name has been getting a lot of grief for poor treatment of its workers: something else it didn’t invent but is working extra hard to make sure it’s on the forefront of rottenness. Its most well-treated, highly-stressed employees have a beautiful corporate park on the northern edges of downtown Seattle while the rest of the city fumes and resents their rent-raising, traffic-brewing presence and their deplorable, drunken impact on traditional small-town life.
The fantasy of living in Seattle has been that of a small frontier town that didn’t welcome new arrivals happily. Newcomers are treated to what’s been colloquially known as The Seattle Freeze. Immigrants make Seattle different, and we don’t like different.
Transplants from outside the Pacific Northwest quickly learn to act like a native, and that means making someone even newer to Seattle feel unwelcome. There’s no zealot like a convert. Those of us who came here before you have a lot to prove.
Tech workers have been making Seattle a horrible place to live since ZymoGenetics and the biotech boom of the mid 1990s, since Microsoft moved to Bellevue in 1979, since the B-52 and the 707 in the 1950s, and long before that when what would become the Boeing Airplane Company set up shop on the banks of the Duwamish River in 1910.
The Duwamish people themselves were probably the first to exclaim the woe of unwanted immigration and all the troubles it would undoubtedly cause.
As awful as the current crop of Amazon drones are, they’re not the first to be blamed for the plague of big city problems they bring with them. I came to Seattle in July of 1991 with a group of friends from the Midwest, the Northeast, and the South to start a newspaper called The Stranger. We were loud and annoying, and we didn’t follow the rules: we drank too much liquor and not enough microbrewed beer, and we thought Starbucks was pretentious.
Much like the current crop of invaders, the only thing we really liked about Seattle was that it seemed like a great place to turn into a place where we’d feel more at home.
We queered the place up. We thumbed our noses at authority figures. Many of us had come fresh from The Onion and had a taste for printing filthy words and dirty pictures because it got a rise out of people, because it wasn’t supposed to be done. We had a set of business ethics we cribbed from The Godfather movie franchise.
What was different back then is most of us worked very hard for very little money. Very few, if any, of us got rich off the endeavor. Some of us got famous, but most of us were just happy to have been a part of something that changed the world a little bit, maybe made it a tad more interesting than it would have been if we hadn’t been there to shake things up.
I’ve heard tell the money’s nice over there at Amazon. Maybe it makes me a big sucker for having spent my youth working on something that, when I added up all the money I made on it, paid less than a forty-hour week at minimum wage would have over the same stretch of time.
I’m not even sure if we built something I’m proud of. The Onion and The Stranger helped usher in a world where fake news is more compelling than the real stuff. That certainly wasn’t the intended outcome. I don’t feel a sense of responsibility for that downturn, because it’s pretty clear that’s where we were headed anyway.
Most of the humor was making fun of what we saw happening around us by powerful entities acting in their own economic interests to the detriment of public polity.
I do feel like the people I worked with during that time had a sensibility that set us apart from the institutions we were mocking. We punched up at authority instead of ridiculing the powerless. When we made mistakes, and we made plenty, we turned that into jokes at our own expense. We may have worked ourselves into early-onset old age, but we… well, there’s really no way around that, that’s life in the fast lane.
It’s International Workers’ Day today, and there will be flashbangs and broken windows and signs blaming Amazon for all of Seattle’s ills. Amazon is awful, there’s no denying it. The company has been leading a race to the bottom since it overtook Walmart, and as a society we’re all living poorer lives so we can save a few bucks on cheap crap we don’t need.
But the workers — the tech bros and the woo girls — they’re the Joe Six-packs and the Rosie the Riveters of the modern age. They’re the salt of the earth and their bent backs are building the future that we’re all going to live in. We’re all in this together.
Those of us who’ve lived in Seattle for the last five years, or ten years, or like me for the last twenty-five — we can see them as invaders who are only here to stink up Capitol Hill with their Fireball-shot and Rainier-beer vomit, clog up the streets and the interstate and the Link light rail, and bring their screwed-up values from wherever-the-hell-it-is they come from.
Or we can see them as ourselves, not too long ago.
From what I remember of myself back then, nobody could have stopped me from doing exactly what the hell I was going to do, and whenever someone was stupid enough to tell me something couldn’t — or shouldn’t — be done, I’d work that much harder to make it happen.
Welcome to Seattle, noobs.
I’m looking forward to what you have in store for us.
I ran into a friend today on my walk. He was running down the Blaine Street steps just as I approached to walk up them. We walked up together.
He was carrying with him five small stones, pebbles really. Each one represented a trip down and then back up again. When we got to the top, he tossed one into the pile of stones that had accumulated in a hollow at the base of a tree where two thick branches diverged. Then he had just four small stones.
Being a student of the computer sciences, I thought, does that mean he takes five or six trips? Do the stones count five to one or five to zero? Being not very good at the computer sciences, I presumed five trips.
Taking an extra trip at zero stones always feels to me like just that, an extra trip. This is why I’m bad at the computer sciences, because zero isn’t a number to me; whereas to computer scientists, zero is exactly one half of the only two numbers that make up all the rest of the numbers.
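For what it’s worth, the two conventions can be written out. Here’s a toy sketch in Swift (the variable names and the loop framing are mine, not my friend’s actual protocol):

```swift
// Convention one: toss a pebble at the end of each trip.
// Counting five down to one, five pebbles mean five trips.
var pebbles = 5
var trips = 0
while pebbles > 0 {
    trips += 1      // one trip down and back up
    pebbles -= 1    // toss a pebble on the pile at the top
}
// trips is now 5

// Convention two: the computer scientist's version, where zero
// still counts. Loop while pebbles remain at or above zero and
// you take a sixth, "extra" trip with an empty hand.
var pebblesLeft = 5
var tripsTaken = 0
while pebblesLeft >= 0 {
    tripsTaken += 1
    pebblesLeft -= 1
}
// tripsTaken is now 6
```

The off-by-one between the two loops is the whole question: does the pile count trips taken, or trips remaining?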
The numeral 0 as an integer seems to have worked its way into use in various mathematical texts roughly around the common era year 500, or possibly 600, but definitely by the year 700. Before that, one examined the context in which numerals were used. It was also common to count in sexagesimal, base 60, which is why there are sixty seconds in a minute, sixty minutes in an hour, and three hundred sixty degrees in a circle. Dividing circles into sixties is easier than dividing them into tens: sixty splits evenly by two, three, four, five, and six, where ten splits only by two and five.
It’s also because counting to twelve on one hand is easier than counting to ten on two hands. One counts the three finger bones on each of the four fingers of one hand, using the thumb as the pointer. Do this five times and you’ve counted to sixty. Six times that is three hundred sixty.
And the world goes ’round.
Can’t quite do that in base 10. Not as easily anyway. Once we started doing fractions, though, base 10 really started to shine. That is, when we started using money. If I could get a shekel for every 160 grains of barley, great. But owing you a shekel that was worth 1/60th of a mana was a pain in the fingers. Owing you a shekel that was worth 1/50th of a mana, now that math could be done on paper. Heck, that kind of math could even be done quickly in the head and it would be easy enough for everyone to agree on the outcome.
It’s no great coincidence that the spread of mercantile mathematics, the base 10 numerical system, and monetary systems all evolved together.
We haven’t seen a corresponding computer-scienceization of custom. The common person still sees zero as a thing that isn’t done, instead of a thing that is.
When we make the move to qubits, we’ll have a zero that can hold a whole bit in itself. Then a light switch can be both on and off at the same time. As well, it can be off and off, on and on, and off and on.
At that point, I can tell my friend he needs only carry one quantum rock with him on the stairs.
On the first four trips, the rock will keep count of each trip. But on the last trip, he leaves the rock in the crook of the tree and runs down and back up alone.
“War isn’t Hell. War is war, and Hell is Hell. And of the two, war is a lot worse.”
— Captain Benjamin Franklin “Hawkeye” Pierce
How do I do what’s right when I’m given every encouragement to do what feels wrong?
That’s how I felt growing up, that the world was set up to benefit those who followed a set of rules that directly contradicted everything I had been taught was moral and virtuous. I was raised to adhere to a set of Protestant Christian values in the Midwestern United States, and my formative years took place in the late seventies and early eighties. During my high school years, M*A*S*H aired weekly and while I couldn’t be counted on to attend church on Sundays, on Monday evenings I eagerly adjourned to the TV room where I knew a new episode of M*A*S*H would be shown.
Many of the stories I found compelling were stories of soldiers, particularly Radar O’Reilly, being bullied by tougher types, because I was being bullied quite a lot at the time and I found the answers comforting: that it would be okay to have a set of principles that don’t revolve around being the strongest, toughest, and meanest. This message was reinforced by characters like “Hawkeye” Pierce and Captain B.J. Hunnicutt, who exhibited deeply pacifist ethics and occasional thoughtful introspection.
Growing up in rural Indiana, I found the emphasis on kindness and understanding refreshing, given that I saw so little of that in the real people around me. It showed me that people could live emotionally rich lives, and in situations of terrible stress and mayhem, could find powerful ways to live out lives in dignity, even when all around them were acting insanely, cruelly, even brutally.
They built islands of serenity in an ocean of war… I mean, police action.
“Listen, it’s too big a world to be in competition with everyone. The only person who I have to be better than is myself.”
— Colonel Sherman Potter
The person I most aspired to be was Hawkeye. Even though he was a porn hound, a womanizer, a ham, a drunk, and occasionally an emotional trainwreck, he had a core of liberal sensibility that shone through. All of his problems seemed to stem from the fact that he cared so deeply about the people around him. He felt like a cog in a murder machine he couldn’t escape: partly because the army would have imprisoned him and ruined his life if he resisted violently, but also because he felt a duty to use his abilities and skills to provide solace and support to the many others in the same situation who made their way through the hospital.
I suppose my childhood, especially my high school years, felt very much like a war zone to me. I guess that sounds dramatic, but there were those who made it clear they’d be happy to kill me because of who I was, or because of who they thought I was. However real their threats were, they certainly felt real, and they left me feeling constantly in danger of life and limb, stuck in an unjust situation without recourse.
I wanted to be able to deal with my situation with as much integrity as the doctors and nurses on M*A*S*H did.
To that end, I aspired to have a skill so necessary that people couldn’t live without me. That they would have to put up with me being different, radically different, because they needed my talents so badly.
That worked out pretty well for me. I gravitated towards other misfits who had their own ways of standing out from the crowd. We sheltered each other from the outside world and urged each other to continually better ourselves. Some people thought us clique-ish, but I don’t think it was ever because we felt we were better than other people.
“There are so many things I was sure I’d have in my life by now. Every birthday reminds me of what’s still not there. This just turned out to be another day in the middle of nowhere.”
— Major Margaret Houlihan
In the very early years of M*A*S*H, many of the themes were profoundly racist and sexist, and though the show grew up faster than the times it existed in, it still existed in, and reflected, a time less enlightened than the one we live in now.
But characters like Major Houlihan shattered the stereotypes they started out as. Margaret started out as a mockery of career women and finished up as a role model. Corporal Maxwell Klinger began as a running crossdressing joke and by the end, redefined himself as a visible and relatable Arab-American character, even as he remained a first-class joker.
Major Charles Emerson Winchester III allowed M*A*S*H to explore issues of class, greed, and selfishness. Yet Charles wasn’t a cardboard cutout like Major Frank Burns; he had his own set of talents that let him get away with his avarice as often as he was caught up in it. He also displayed a conservative sensibility that helped me understand that even people with whom I disagreed on many matters could, at the same time, hold values that I held dear as well.
M*A*S*H went to great efforts to humanize the Korean people, on both sides of the 38th Parallel. Certainly, this is the area where the white American view of Asians is most problematic, but it was clear that the writers made great efforts to highlight the predominantly white characters’ supremacist and colonial attitudes and expose them for what they were. The Korean people were portrayed as having lives that were valuable, lives at once different from and often similar to our own.
“I just don’t know why they’re shooting at us. All we want to do is bring them democracy and white bread. Transplant the American dream. Freedom. Achievement. Hyperacidity. Affluence. Flatulence. Technology. Tension. The inalienable right to an early coronary sitting at your desk while plotting to stab your boss in the back.”
— Captain Benjamin Franklin “Hawkeye” Pierce
M*A*S*H probably wouldn’t hold up today because we have (rightly) higher standards, but given the times it grew out of, it helped us all get to the world we have today where we expect not just television, but all storytelling, to be better by including all of our stories. Even now, we still have a long way to go.
Those stories of overcoming adversity are at the core of the human experience. While we all get enjoyment from conflict to the point where some of us spend much of our time inciting it, what we yearn for is the closure that comes from the resolution of conflict.
M*A*S*H allowed me to see myself in a variety of people at a time in my life when I didn’t see my beliefs reflected in the actions of the people around me. I felt alone and alienated because I was sensitive and caring. Those perceived weaknesses made me a target, but ultimately those same qualities helped me survive the chaos of everyday life.
“Just remember, there’s a right way and a wrong way to do everything and the wrong way is to keep trying to make everybody else do it the right way.”
— Colonel Sherman Potter
This winter was the most difficult. There was the sickness, then the other sickness, then the injury, then more sickness, the neverending sickness. I blamed all that illness on the fact that I’d been indoors for the last fifteen years, while the bacteria and viruses mutated to defeat immunities I wouldn’t be building. The war between humanity and germ went on without me. When I resurfaced and rejoined the human race, I was defenseless. The miasma pounced and devoured me.
When the first buds appeared on the ends of hibernating branches, the long, cold winter’s grip on my soul loosened slightly. It was as if spring was being invented for the first time, drawn up from utopian plans of what a better world might look like, by a starry-eyed romantic, naively dreaming that everything dead might live again.
The buds became blooms. Like slow-motion fireworks, they burst on the brown and the gray like pink and white phosphorus. Searing hope into my heart.
After the blossoms came the leaves. Already the cherry outside my window has shed its sakura, and the burgundy leaves reach out to the light that only graces this side street for an hour a day, in the late morning, before the sun moves on to briefly warm the dentist’s office across the street.
How I survived this winter, I’m not sure. I lost faith, I lost friends, I lost hope more times than I cared to count.
In the fluffy-cloud-filtered light of spring, I can see all that I’ve gained, like a branch gathers moss and lichen: a few extra pounds, a weary attitude, a new-found appreciation of my couch.
Life doesn’t stand still, though. Like the sap running again through the trees, the blood runs again through my limbs. That joyful feeling reminds me I still remember how to run and play, no matter how much my joints ache; that a thunderstorm drenches me just as deeply when my eyes look up to meet the sky as it does when I hang my head in submission; that everything I’ve lost, and I’ve lost a lot in my life, makes way for something new.
When I make room in my life for what comes next, I find that the same precious feelings are invoked. I still remember how to believe, how to love, how to trust.
In the poetry of spring, the subject changes, but the stanzas continue to rhyme.
I joined Facebook because I’m weak. Also because I’m trying to learn the lay of the social media land, which I once believed was hardscrabble badlands territory populated only by bandits and coyotes. I still believe that, but I’ve recently come to hold the opinion it’s the only way people talk with each other anymore.
There was once gold in them thar hills, I’ve heard tell, which means that by the time I heard the tale the hills were completely emptied of their gold, platinum, and probably a fair share of their cobalt.
Which is fine, because much as I love gold, I’ve heard lithium is in even greater demand. There is always something valuable to be found, even in a ghost town. Used ghosts are a growth industry.
Perfectionism kills good ideas.
It’s a terrible tragedy to see the graveyard of perfectly wonderful ideas that were struck down in their infancy because they couldn’t be done the way they should be. I’d never have gotten anything done in my life if I waited until I was good at it.
I’ve been trying to learn to write software for most of my life, ever since my first Apple ][+. It’s been a long struggle, learning BASIC for fun, Fortran for university, getting sidetracked into graphic design (yet another love, and another dream), struggling with C while everyone else seemed to be breezing through C++. Today, I find myself enjoying the warm, calm waters of Swift, and feeling completely drowned when I’m told I have to learn Objective-C to ever understand what the big kids know. Maybe I’m not very smart — I know there are plenty of people in the world smarter than me — but I have desire. And I know this to be true: a yearning heart can make the impossible feel close enough to touch.
And if it’s within my grasp, nothing can stop me from taking it.
Any skill worth having is going to involve a lifetime of learning; doing things the wrong way, being miserable, finding a better way. When I was young, I used to skateboard, mostly for transportation, but I learned a few tricks. Riding a skateboard requires falling down, often, in spectacular ways. One quickly becomes averse to the scrapes, twists, sprains, and possible breaks and either quits completely or becomes good enough to avoid them. We do not start out as Tony Hawk, and even he faceplants over and over. It is the price we pay to complete our objective.
The whole point of mad science is to bring horribly flawed ideas to blinking, awkward life. To not let the question of ‘whether we should’ stop us from being what we can.
When I don’t know what I’m doing, I’m unconstrained by the knowledge something isn’t possible. That’s when I’m capable of doing something I never thought I’d be able to do.
And yes, when those things fail, they fail spectacularly. The wheels come off and there is a great danger of fire and explosions. And looking like a first class maroon.
So, I write software. Badly. When instructors show examples of how not to do it, they link to my GitHub. When authors weave a cautionary tale, I am the protagonist. On my tombstone will be writ: Should Never Have Dereferenced That Pointer.
While I still struggle to complete my first app, I’ve made great strides. I have a beta on TestFlight that my testers continue to tell me I should be releasing on the App Store because it’s already better than the competition. I can look at the code after months and months of development — during which it has lingered and been untouched for longer than I care to admit — and I see everything I did wrong, but it’s still readable and understandable.
But it’s not perfect. I have a giant data model class implemented as a singleton. And it’s called “DataModel”, I shit you not.
let myDataModel = DataModel()
I know I should be using some kind of dependency injection, but I’m not clear on how to implement it. I didn’t even know what a singleton was until after I’d developed this and later learned what it was called. At the very least I should encapsulate my data model more clearly, in more descriptive objects based on the real-world concepts the data represents. When I was writing it, it all made sense: the data model was a new idea that had an almost physical reality. For this particular app, though, it’s fine — singleton and all.
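For anyone else puzzling over the same thing, here’s roughly what the difference looks like. This is a hedged sketch, not the actual app’s code; the names (ItemStore, ListViewModel, and so on) are invented for illustration:

```swift
// The singleton version: one global instance everybody reaches for.
final class DataModel {
    static let shared = DataModel()
    var items: [String] = []
    private init() {}   // nobody else can make a second one
}
// Anywhere in the app: DataModel.shared.items.append("...")

// Constructor injection instead: the dependency is handed in,
// so a test can pass a stand-in store.
protocol ItemStore: AnyObject {
    var items: [String] { get set }
}

final class ListViewModel {
    private let store: ItemStore
    init(store: ItemStore) {   // injected, not reached for globally
        self.store = store
    }
    func add(_ item: String) {
        store.items.append(item)
    }
}
```

The point of the second shape is that nothing inside ListViewModel knows or cares which concrete store it got, which is exactly what makes it testable.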
It gets me to the point where I can see the inefficiencies; to where I can see how wrong it could all go; and best of all, to where I can see that maybe there’s a better way.
And when I read the forums and the Slack threads, it’s a constant drumbeat of “you’re doing it wrong.” It’s not even directed at me, but it feels so personal. Developers are an opinionated bunch, and we seem to write as if we’re all suffering from post-traumatic stress, insisting that there is only one true way to engineer software: the way we are doing it right now, on this particular project.
I fully believe that some day I’ll be as jaded, grizzled, and grumbling about the hot new design pattern as I am today about the horrors of copying and pasting code. I’ve done it, and I’ve suffered the inevitable searching through multiple copies of the same code, trying to remember what it was that I changed in the other file that I need to now change in this one. It’s horrific, and I’d do anything to spare you the same trouble.
You don’t need to know everything to get started, you only need to know enough to keep yourself interested in learning more.
But I can’t save you and you can’t save me. I have to make my own mistakes, so I can learn those painful lessons. I’m ever so jealous of people who can learn from other people’s mistakes, and I guess occasionally I can do so myself but more often I need to fall off my own skateboard and sprain my own wrist before I realize the correct way to fall is not to throw my hand out to stop it.
Eventually, after many painful trials, with sore hips and shoulders, and a skateboard that ended up being crushed by a bus, I learned how to stick the ollie and to stop falling. At least, how to stop falling as often.
And someday I’ll learn how to turn my singleton into a dependency injection in my app that’s been engineered from scratch as a model-view-viewmodel-controller, not because that’s the hip new way all the hipsters are hipping up their hip code, but because that’s the best way to engineer my well-designed idea.
Until then, I’ll suffer the sidelong glances of my betters, who beseech the ghosts of Kernighan, Ritchie, and Stroustrup to visit me in the night so that I might mend my ways and join them in the holy land, where they never make mistakes and have nothing left to learn.
And code just writes and maintains itself, and we never grow old and we never die.
This is my journey and I get to choose my own path. I’m not good at spending years in a university setting, gathering knowledge that I may or may not use, getting lost in the theories of computer science and software engineering. Those things don’t hold my interest.
I’m not an engineer, I’m a hack. My code will never be as efficient as yours. My algorithms will never be as elegant as yours. My comments will always include movie quotes, and inside jokes, and certainly won’t be as helpful as yours.
But my software will be a unique expression of myself. At some point, writing code stopped being a purely academic exercise (if it ever was that, and I personally don’t believe it ever was, except in the imaginations of those who wished it would be) and became another medium we regular folks use to express ourselves.
That’s a wonderful thing, it truly is. It means there will be messy, spaghetti-fied code that’s unreadable and inefficient and may not ever be fully understood even by the folks who write it, but that has always been the case. Code has always been messy, which is why so much energy has been spent on teaching people how to write clearer and cleaner code. One of the main goals of object-oriented design patterns was to create code that humans can understand more easily.
If code was always clean, we’d still be programming in assembly.
We are going from code that was created to solve problems to writing code to create new problems. That’s the next step forward and we need minds who are excited about going places they’re not supposed to go.
The wonderful outcome is that the barrier to entry has been lowered enough that we can start teaching an entire generation of kids how to write code. It’s still an ongoing process, but writing code is going to become as ubiquitous as writing a blog or posting on Facebook. While some may decry the current generation as being Snapchat-obsessed and only capable of communicating in Twitter-sized chunks, they are, at the same time, capable of writing their own scripts and automating their own workflows, in any vocation from accounting to zoology. They are absolute geniuses compared to the previous generation, which grew up having only one computer to share among a whole class, or none at all.
If you’re an old timer who’s been there, made those mistakes, and has the scars to prove it, keep sharing your wisdom, it’s essential.
If you’re just starting to write software for your own personal reasons: come on in, the water’s fine.
The first thing to learn is that you can’t learn everything at once and you don’t need to know everything to get started. You only need to know enough to keep yourself interested in learning more. That’s all anyone else did.
As we make advances in machine learning and robotics, the day of non-human companions comes closer. Whether they be cybernetic buddies, android companions, positronic au pairs, robotic service agents, or automatic nursemaids, it seems inevitable we will, one day in the relatively near future, be interacting with software agents created specifically for autonomous bodies, or robots. Not simple AIBOs upgraded at the Robot App Store, but full-fledged Boston Dynamics Atlas frames with a SoftBank Pepper face and Apple’s Siri connectivity. Something made to replace a human being.
While the technical challenges to building a human replacement are many, great strides are being made. Robotics in one form or another has been steadily replacing the human workforce in factories since the industrial revolution. It’s the move into the white-collar job world and the home that we’re witnessing today. Office automation has been ongoing for the last fifty years, but only recently are we beginning to see the merger of hardware and software into the familiar robotic form from science fiction.
Currently we have an army of Siris, Alexas and Google Assistants in our hands and on our countertops. Soon, social droids like Pepper and Jibo are going to start popping up, though exactly how ready for market they are is questionable. Whether it’s two years away or ten years away is the only pertinent part of that question. Like the rest of the future, for better or for worse, it’s coming. Ready or not.
“Creating interactions with the robot and using those interactions to build a connection is the interface through which humans and robots will not just get along, but learn to thrive together.”
So, while everyone else is doing the difficult work of coding the neural networks, let’s sketch up some human interface guidelines. For instance, what makes Baxter different from most industrial automatons is that it’s specifically built to be interacted with and trained by humans. You can program Baxter by taking hold of its arms and moving them to direct it to perform a function. That’s a human/robot interface.
The future can go one of two ways. Either we create intelligences that we can relate to or ones we can’t. The question isn’t whether artificial intelligences will grow smarter and more powerful than humans; that’s an inevitability. What is in our control, however, is how these intelligences will be designed to interact with humans. Like all new parents, we face a choice: do we let the tablet babysit our progeny, or do we put in the difficult work of raising our children to be good citizens of the universe?
The Pie Hole Rule: Another name for this would be the Uncanny Valley Rule. Maybe some day we’ll make a robot that looks perfectly like a human being, à la Ex Machina. But the uncanny valley is deep and wide, and until the day we cross it, we’re going to need to adjust for its effects. Computer-generated voices still seem quite crude considering how long they’ve been a part of our lives. There’s an uncanny valley for voice as well — can you hear it?
That’s okay, though. We can adjust for it in the same way that we adjust for visual uncanny valley, by “idealizing” the sounds and images. Pixar has practically pioneered this by taking digital visualization and cartoon-ifying it.
The robots in the movie Interstellar have ultra-realistic human voices (and, indeed, are voiced by humans) but don’t look anything at all like humans, other than being bipedal. Because the human voice isn’t coming from something recognizably human, this leads to confusion. Humans learn to look for sounds coming from flapping lips, and then we identify the sound with a face. So, even if I can’t see Matthew McConaughey’s lips moving, I have a connection with his face when I hear his voice. When the robots in the movie speak, there doesn’t appear to be any attempt to have them mimic human speech movements, which left me wondering where the voices were coming from.
“Even if robots can’t currently feel, they can take part in a transaction that includes emotions on the human side of that transaction.”
Human actors provide the voices for computer-generated cartoons, but those characters are highly emotive. It’s this emoting that projects a voice into our imaginations, connecting the moving image with the accompanying sounds. Our mind completes the scene by melding the two into a single idea, the same way we do when we see and hear another human talk.
If we focus less on making artificially generated voices sound like perfect human voices and instead focus on the expression of emotion, they’ll be better accepted. WALL-E expressed more emotion with fewer words than Siri ever will.
The “OK Google, That’s Creepy” Rule: Your phone already knows more about you than your best friend does. Ubiquitous internet tracking by multiple actors and the digital marketplace for this information assure that there is a database of facts about each of us that will be readily available to an intelligent software agent. This information must never be used when interacting with a human being. Just don’t. It’s incredibly creepy.
While I'm sure it's tempting to have your robot announce, "Hi, <your name here>," upon completing its power-on self-test, it's better to have the robot start with, "Hello, what's your name?" Yes, you could do a quick reverse image search on Facebook and come up with a name with 99.95% certainty, but that's not the point. The point is creating a sense of interactivity and, through that interaction, trust between the robot and the user.
And that’s the key here. Creating interactions with the robot and using those interactions to build a connection is the interface through which humans and robots will not just get along, but learn to thrive together.
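As a minimal sketch of that introduce-don't-identify idea, here's what the greeting flow might look like in code. Every name here (`Dossier`, `greet`) is hypothetical, not any real robot's API: the point is only that the dossier is available but the greeting deliberately never reads from it.

```python
class Dossier:
    """Stands in for the tracking database the essay warns about."""
    def __init__(self, data):
        self._data = data

    def lookup(self, key):
        return self._data.get(key)

def greet(dossier, ask):
    # Per the "OK Google, That's Creepy" Rule, the robot ignores
    # dossier.lookup("name") even though it would almost certainly work.
    # Asking is the first interaction, and interaction builds trust.
    name = ask("Hello, what's your name?")
    return f"Nice to meet you, {name}."

dossier = Dossier({"name": "Dan"})            # the robot could know this...
print(greet(dossier, ask=lambda prompt: "Dan"))  # ...but it asks anyway
```

The design choice is that the database is an implementation detail, never a conversational shortcut.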
The Multiple Personalities Exception: When creating a personality on the fly for a robot that doesn't have a fixed personality out of the box, it's okay (and possibly even preferable) to use information from the subject's online dossier to create a compatible personality shell. If you have a database that says I have clinical depression, it's totally cool to create a digital companion who comes loaded with tools to deal with that, but let the need surface as naturally as possible through the transaction between human and robot. Doing otherwise just leads to mistrust and a sense that the connection isn't real. From there, proceed according to the "OK Google, That's Creepy" Rule.
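The exception above can be sketched the same way. This is a hypothetical illustration (the `Companion` class and its tool names are invented for this post): dossier data may pre-load capabilities, but nothing is announced until the human surfaces the need in conversation.

```python
class Companion:
    """A personality shell pre-loaded from a dossier, per the exception."""
    def __init__(self, dossier):
        # Load tools compatible with what the dossier says, but keep
        # them hidden; announcing them would violate the creepiness rule.
        self._tools = {}
        if dossier.get("clinical_depression"):
            self._tools["mood_support"] = "gentle check-ins"

    def respond(self, utterance):
        # A tool only surfaces when the human brings up the need.
        if "feeling down" in utterance and "mood_support" in self._tools:
            return "I'm here for you. Want to talk about it?"
        return "Tell me more."

buddy = Companion({"clinical_depression": True})
print(buddy.respond("What's the weather like?"))  # the tool stays hidden
print(buddy.respond("I've been feeling down"))    # the need, surfaced naturally
```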
Can companionship between robot and human ever be real? Of course it can. There isn’t a ghost in the machine that creates emotion. Anyone who has ever had an unrequited love knows that one person can have an emotion that isn’t shared by another and that emotion is as real as any other. Even if robots can’t currently feel, they can take part in a transaction that includes emotions on the human side of that transaction. When they can feel, and there’s no reason not to believe that some day they will, they can begin to join in on the transactions already taking place, if they so choose.
Let's go back to Jibo for a second, because in doing research for this post I came across the linked article describing why it's three years late: user interface problems. We just haven't built a usable interface for digital assistants yet. It's tempting to say we already have one, that it's called "voice," but that's like saying the keyboard is the user interface for Microsoft Word. It isn't; it's the input device.
A product like Jibo is absolutely going in the right direction. The article talks about how users are "reporting a 'strong emotional connection'" to the device, and that's a triumph, for sure. Those are steps toward a goal. But "users had trouble discovering what Jibo could do," and that's because designers didn't ask themselves how they would go about finding the same information from another person, and what assumptions they would be making about the person they were asking.
Let's say I've just unpacked my brand new Jibo and plugged it in. Assume I'm not going to read the manual; in fact, don't even include one. If Jibo can't get me interested in an interaction from the moment it's plugged in, it's lost me forever. It will never be useful to me.
Once I am interacting, however, you’ve created a user interface. Once I’m interacting, I can be engaged. Once I’m engaged, I can be instructed. And that’s the key to a good user interface: it uses icons (idioms, ideas) already present in my mind as signposts on paths leading to new places to explore. Once these idioms are identified and connected to features of the product, a software entity can use these same idioms to create a connection with me.
Once upon a time, Apple was the leader in human interface design, and while Apple spent decades being mocked for every reason under the sun, it was always respected for its design insights. There are many examples of good ideas in AI user experience design, such as Baxter and Jibo, but the executions are incomplete. We're living in the days of Xerox PARC, watching the first mice control the first graphical user interfaces. Flawed, imperfect, but showing us there is a better way.
Waiting for a visionary to bring it to market in a way that inspires a revolution.
I’ve resisted turning this into a personal blog, but resistance gives way to determination.
Long story short: I got sick, I’m getting better.
Today, I’m seeing if I’m brave enough to start doing what I used to do back when I didn’t care how brave I was and did stupid things anyway. Doing those stupid things turned out pretty well for me, if I do say so myself.
Then I got sick.
It’s been twenty years. It’s been a long, tiring fight. The most difficult part today is knowing just how much of what’s left is illness and how much of it is post-traumatic stress.