Cover Photo: © 2017 Ava Jarvis
© 2017 Ava Jarvis

Fungibility Is A Broken Concept

Fungibility is a much-lauded quality for an ideal employee, but is that due to flexibility or to replaceability?

DESPITE having friends who are full-stack devs, and being more than capable of the same myself, I hate the way "full-stack" as a qualifier is treated by hiring companies and managers.

I didn't know why until I remembered that when I worked as a software engineer in high tech, I had managers take me to task for "over-specializing." I was supposed to be flexible enough to take over any position whatsoever on any development team. To concentrate too much in one speciality or another would make me a worse software developer. The advice was that I strive to be fungible.

At the time I thought they simply wanted me to be flexible. It's a reasonable conclusion for someone who wants to believe that systems and people are better than they actually are: that there's a mutual beneficence in a system giving a person critical feedback for improvement, and in that person striving to do their best to incorporate that feedback into their career. A growth opportunity, as it were, for both parties.

And so I strived to be a perfect cog in the machine of our CEO's ambitions for years. 

But in striving to be anything my company wanted me to be, I ended up being nothing that I wanted myself to be—and I didn't even know it.

Fungible didn't mean flexible, or at least, if it meant flexible, flexibility mattered far less than replaceability. I was meant to be flexible because it was cheaper for the company to plug me in anywhere than to hire and vet a person who could do each job better.

LET'S go back to the concept of the "full-stack" developer. 

A full-stack developer is a software developer who can, as a single person, code, design, and deploy every layer of an application. From the user interface, to the networking code, to the transactional database layers, they're a superb generalist: a jack-of-all-trades and supposedly a master of all of them. Such a developer can be a real savings to a company, since hiring managers don't need to spend time or money looking for the people best at each specific discipline.

And for non-technical company heads, a full-stack developer sounds like a good deal, because it's all tech and it all just sounds the same, doesn't it? It's all just code, isn't it? 

What non-technical folks don't understand, and what I believe some technical people have a vested interest in keeping them ignorant of, is that every layer of an application requires vastly different skill sets, as well as ways of thinking that are often at odds with each other. You wouldn't ask your accountant to also design your brochures—and yet that's exactly what "full-stack" strives to accomplish. Writing a backend layer is so different from writing a frontend layer that it would be, all in all, much easier for everyone if we treated them as entirely separate disciplines.

"Full stack" applies the idea of fungibility at a smaller scale: the individual developer rather than the whole business. It's damaging to a software engineer's self-development because spreading yourself across so many different, conflicting disciplines means you simply don't have time to develop deep skills, or even to discover whether you actually like, or actually hate, a given technical area. Being full-stack means never having to investigate how best to do any one particular thing; it means relying on frameworks you don't understand to be absolutely reliable; and, most of all, it means embracing speed over quality.

And while many companies profess that quality is overhyped and speed is the only thing that matters, I think this idea breaks people and breaks careers—and you don't even necessarily have to be as brutal as an Amazon or a Microsoft to do both. 

The best intentions don't lead to heaven. 

THE worst truth I ever had to confront about my tech career was that I never got to find out what I really loved. I wasn't given the room to figure that out. I loved software, but that love wasn't allowed to grow. There was simply no time to do so.

While some companies do encourage engineers to explore different disciplines, I have rarely seen such encouragement come without the caveat that exploring will reduce your dedication to, and efficiency in, your current role. A great example of this bait-and-switch is Google's "20% time"—if you do take it, you're expected to also increase your company project time by 20%. In other words, Google's 20% is actually 120%.

And since most tech jobs are salaried positions instead of hourly ones, there's no such thing as overtime. Companies may not be allowed to directly order salaried employees to spend their weekends on work, but they can certainly bring other pressures—all related to career advancement—to bear to achieve a similar effect.

Plus, many tech companies take a proactive stance against their employees writing code for open source projects, which are a common way for developers to learn what they do and don't like. I remember the chill that went down my spine the morning an official company email arrived in my inbox telling me that developers were no longer allowed to participate in code project marathons.

Apparently the only marathoning allowed, even on our own time, was for the company. 

"Fungibility" was not the only reason I left the tech field—there's a lot of social toxicity there in the first place—but it was a huge contributing factor. The very brokenness with which this otherwise inoffensive descriptor was applied made tech depressing and soul-sucking. I'm sure the social toxicity did a lot to rot the concept of fungibility further. 

There's nothing like peer pressure driven by management-encouraged competitive behavior to drive morale into the ground.

SOME may say that I became an artist because I wanted to be a special snowflake all along.

Yet I tried for over a decade to be a perfect cog. I don't think I would have done that if all I wanted was to be special. I didn't want accolades—I just wanted to love what I did for a living, and to become better at what I loved. 

I live to strive. But that wasn't what the company wanted.

Anyway, as an artist on my own (commercial artists working for large companies are expected to be just as fungible as their technical counterparts), I've been able to explore.

Unbound by the requirement to be fungible, I've found much more freedom in developing specialities than in covering generalities. I can work on what I love, and I can explore other artistic disciplines in enough depth to learn whether I truly love or truly hate them.

I can strive to be a better artist in a way that I was never allowed to strive to be a better developer after I entered the tech industry. 

Perhaps it's a little sad. To be honest, I cried for two years after finding out how much striving to be fungible had broken me. 

But now things are better.

So it goes.

Ink and watercolor artist living in the Pacific Northwest. Vietnamese-American US citizen, child of refugees, nb, they/aj, disabled, jinja shintoist, bi ace, trauma survivor. Plays boardgames. Also writes.

I used to write software but no longer do so.