Twitter as a Cure for Perfectionism

Perfectionism can cripple productivity: it stops you from even getting started. Why do something now if it will not be perfect? Why not shave this yak first? The problem is particularly insidious in programming and design, where you can argue that a “perfect” way of doing it really does exist: the global optimum of some function, or the minimum amount of ink needed to convey an idea. You can work forever and never achieve perfection. There is a cure, and it comes from an unlikely place: Twitter.

I was (and to some degree still am) guilty of perfectionism, but the most therapeutic device I found for combating it was having a space to spout off half-cocked ideas into the ether and watch them linger. I noticed that no matter how much or how little time I spent crafting these short messages, the result always fell somewhat short of perfect, and it did not matter.

If you find yourself bored one day, log into a streaming Twitter client (e.g. TweetDeck) and add a trending hashtag column. What you’ll see immediately is that almost no tweet is significant. You could craft the most beautiful, intelligent short poem and post it to an unknown number of people. Might no one see it? Perhaps, though that is unlikely. Might someone see it and move on to the thousands of other messages they’re trying to consume? Absolutely.
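If you would rather quantify the effect than eyeball it, the same exercise can be scripted. Here is a rough sketch, assuming the tweepy library (pre-4.0 API) and your own set of Twitter API credentials; the hashtag, the credential values, and the FirehoseGauge class are placeholders of mine, not anything TweetDeck or Twitter provides. It simply counts how fast tweets on a trending topic scroll by.

```python
import time
import tweepy

# Placeholder credentials -- substitute your own Twitter API keys.
CONSUMER_KEY = "your-consumer-key"
CONSUMER_SECRET = "your-consumer-secret"
ACCESS_TOKEN = "your-access-token"
ACCESS_TOKEN_SECRET = "your-access-token-secret"


class FirehoseGauge(tweepy.StreamListener):
    """Counts how quickly tweets on a trending topic scroll past."""

    def __init__(self):
        super(FirehoseGauge, self).__init__()
        self.count = 0
        self.started = time.time()

    def on_status(self, status):
        self.count += 1
        # Every 100 tweets, report the rate at which messages fly by.
        if self.count % 100 == 0:
            elapsed = time.time() - self.started
            print("%d tweets in %.0fs (%.1f tweets/s)"
                  % (self.count, elapsed, self.count / elapsed))


auth = tweepy.OAuthHandler(CONSUMER_KEY, CONSUMER_SECRET)
auth.set_access_token(ACCESS_TOKEN, ACCESS_TOKEN_SECRET)

# "#trending" is a stand-in; track whatever hashtag is trending today.
stream = tweepy.Stream(auth=auth, listener=FirehoseGauge())
stream.filter(track=["#trending"])
```

On a genuinely trending tag, the rate alone makes the point: your carefully crafted message is one of hundreds flying by every minute.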

This exercise reinforces that letting something go matters more than holding it to perfection. Someone may call your work mediocre, but that is just as insignificant: if someone sees it and doesn’t think it’s great, it’s unlikely they’ll do anything but move on (unbeknownst to you). A kind of social nihilism.

This echoes across almost all social networks, where the software offers no real way to be negative (e.g. favorite, like, heart), Hater App excluded.

While this technique may not work in quite the same way if you have over 100k followers, it addresses the real problem at hand: worrying that you’re not as great as you want to be and that people will find out. Tweet more, ship more, write more, design more. Who cares?

College graduates are depressed, and they should (and shouldn’t) be

Recent college graduates, sold a guaranteed future in exchange for undischargeable debt, are increasingly finding that the emperor has no clothes: working sucks, and not in a small, whiny way. Graduating into some of the worst economic conditions in almost a century, young people are at the whim not of the educated but of the established money-and-power structure. This is not to say that we should all be black bloc anarchists burning effigies of “fat cats” or the like. Blame cannot be placed (entirely) on a given person or class of people, but rather on what is essentially the fastest pace of advancement in human history.

The New York Times recently published an article on the lives of 20-somethings in this country (for future reference, the United States). The article resonates with almost everyone I know personally: we work long hours for little pay and little hope of advancement. The only industry in which this might not be entirely true is software/tech. Yet even within startup culture you have long hours at below-market pay, all for the hope of some big payoff down the road (sound familiar?).

It is easy to focus on marketing, “creative work”, and the like when observing these trends, but they appear even in what were traditionally solid careers in science, technology, engineering, and math (STEM) fields. Analyses of the current social and economic climate tend to ignore the role of advancing technology, at least at a high level.

Beyond the oft-heard complaints of being “always on” with smartphones and the like, the real role technology plays is far deeper: it is shifting work, as it always has, toward greater efficiency. Worker output has increased dramatically since 1947, while compensation has stagnated for over a decade, as seen in the chart below:

That major bump in productivity in the mid-to-late ’90s is no coincidence: computer and Internet technology has vastly increased our productivity, while employers have opted not to increase compensation in kind. And why should they?

Employers have the upper hand: jobs are scarce while talent is not. Increased productivity also means that fewer workers are needed to complete the same amount of work. Experience is dangled over workers new to the workforce as justification for a lack of compensation and advancement. More insidious than this, of course, are the unpaid, borderline-illegal internships that litter the job landscape and further depress wages. Oftentimes those striving for a high-level profession, such as medicine, have to suffer through incredibly harsh hours with no compensation in order to qualify for college applications. Creatives in the film and book industries see this as well.

The underlying trend in all of this is that the young are asked to sacrifice tremendously in the hope of a payoff later in life. This has been the song of the ages: you reap what you sow. The question is what the final payoff really is. It is far from guaranteed to materialize, at least at a level commensurate with the effort.

It is not all gloom and doom, though. Looking forward, technology will continue to improve our lives and make the world better for all. Perhaps this is a period of adjustment, a recalibration to the new level of what is possible.