trenchant.org

by adam mathes

The Apple Watch Will Make Me Cool Or Die Trying

It’s easy to be cynical about new technology and products, particularly from Apple. It’s also easy to be caught up in the hype before a product launch, which maybe I am.

But I’m more excited about the launch of Apple Watch than I’ve been for anything since the iPhone (which we could only imagine) because it feels very, very different from any other high profile tech launch.

Fashionable or Bust

This is a product intended to be worn and seen on your person. It’s going to touch your skin all day. It’s going to tap your wrist to get your attention. It’s designed to be noticed by others.

Mobile devices are personal, intimate, and used in public — the whole package matters. And therefore how these devices make us feel, and what these devices say about you to others matter. […] Wearables (watches, glasses, whatever else comes next) will be even more about fashion as these things will be even more a part of how we present ourselves to others.

me, October 2013

Google’s offerings (Glass, Wear) tried but weren’t cool.

Apple Watch is going to make you feel fashionable and cool or die trying.

Apple Watch looks serious, designed, and beautiful in a way that the competitors in this space as of yet do not. I was skeptical of the Pebble; now I wear one every day and I love it, but as an adult it is borderline embarrassing to wear. (The new Pebble Time looks much better, and I’ve preordered one.)

That tech pundits are skeptical of Apple Watch is irrelevant. The early adopter smartwatch phase of Pebble and Android Wear wasn’t that fun. It was kind of boring, the devices were clunky, the software was slow and hard to use, and the devices were just pretty fugly.

The Apple Watch is something else.

Evaluating it purely as a functional object misses the point. It’s a watch and a piece of jewelry as much as anything else - and it’s nicer than lots of the existing traditional watch competitors in this price range. But, also, it’s a hyper-awesome-space-age-internetworked computer on your wrist while it does that.

Wrists Are On Human Beings

Having 38mm and 42mm sizes is one of the few acknowledgments in a major technology product launch that there are customers who matter who are not, statistically speaking, served by a product designed for and by tall white men. In fact there are people in this world who need products in shapes and sizes beyond what is designed around a statistically average white American male! Like me. And many women. And all kinds of people all over the world.

In an ideal world we’d expect products to regularly be designed to account for a range of human bodies and this wouldn’t be so out of the ordinary. With wearables a one size fits all approach is unlikely to be as successful. Comfort matters. And you have to invest early on to get it right.

Making designs work at multiple screen sizes in a v1 is something Apple didn’t do with the iPod, iPhone or iPad at launch. It would have been a lot less work for Apple to just pick one size, but I think they rightly concluded that it would have made for a significantly worse experience for everyone.

Mediating, Replacing, and Creating New User Experiences

I think the Apple Watch is going to succeed in large part because it acknowledges fashion, comfort, consumer desire, and luxury in a way other wearables haven’t even begun to, but I’m a hobbyist in that realm.

I have a lot more experience in the world of technology-mediated user experiences, and while I think evaluating any wearable purely on functional terms misses a big part of the story, here’s my take.

The time and effort to glance at a watch is fractional compared to pulling out a device from a pocket, powering it on, unlocking it, and then interacting with an application. It’s seconds vs a fraction of a second.

That’s awesome and exciting.

The premise - and promise - of something like the Apple Watch is three-fold:

  1. Mediate and filter phone/tablet/computer experiences to save you time
  2. Replace existing experiences with faster, easier, more pleasant ones
  3. Create new experiences that are only possible with a wearable

Initial critical analyses of wearables from technology pundits tend to focus overly on how the pundits don’t really want or need (1), claim (2) is not yet feasible because of supposed constraints, and fail to properly imagine (3). I think these are short-sighted critiques.

Mediating Experiences

Because you almost always have your phone with you, and it’s nearly always connected to a network, smartphones quickly became better mediators/filters/messengers than their larger predecessors.

Smartphones mediate experiences - based on an email, a notification, or something else, you might interact quickly with an application on your phone, or it may trigger a longer, more complex interaction on a computer later.

Push notifications on smartphones are victims of their own success - by providing information outside of the application context, notifications inform and help users to decide whether to re-engage with an application. They help us to filter and mediate (or addict us, depending on your perspective) but are abused by many applications, so often fail to “scale” as you use your smartphone more.

Because we now live in a world of constant pinging, ringing, dinging, txting, liking, retweeting, snapchattering, instabrogramming, hyperspacefumbling, and spammy-re-engaging the idea of having these things show up on our wrists (especially to tech-heavy journalists addicted to social media) is some combination of awesome and/or terrifying for many.

Pebble (in its first version) really focused on this. Having used a Pebble for a year, if you take active control over your devices, notifications, and permissions this can actually be a great experience.

It’s actually quite simple on iOS: go into Settings, then Notification Center. Turn off everything (notifications, badges, sounds, etc.) for everything but phone calls. Wait. Breathe deeply. Take a walk.

Meditate.

Cleanse your soul.

Only when you are spiritually and emotionally stable - selectively turn on the 2-3 things you might actually need to be interrupted for (phone calls, text messages, VIP email.) If those then actually show up on your wrist and you can glance at them without taking out your phone, it might actually make things easier for you.
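The principle here is just a whitelist: everything is silenced unless you explicitly allow it through. A minimal sketch of that idea in Python (the names and notification shape are invented for illustration; this is not any real iOS or watch API):

```python
# A toy model of whitelist-based notification filtering:
# nothing reaches the wrist unless its source is explicitly allowed.
ALLOWED_SOURCES = {"phone", "sms", "vip-email"}  # the 2-3 things worth an interruption

def should_reach_wrist(notification):
    """Return True only for notifications from explicitly allowed sources."""
    return notification.get("source") in ALLOWED_SOURCES

incoming = [
    {"source": "phone", "text": "Call from Mom"},
    {"source": "social-app", "text": "Someone liked your post"},
    {"source": "sms", "text": "running late, omw"},
]

# Only the phone call and the text message get through.
for n in incoming:
    if should_reach_wrist(n):
        print(n["text"])
```

The important design choice is the default: deny everything, then opt in, rather than the reverse.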

On Android it’s also simple, especially if you’re on Lollipop with the “priority” notification features: just unlock your Android device, spend minutes in vain trying to find out how any of it works and why dozens of apps notify you without permission, scream in a combination of futile anger and anguish, then smash your device against the nearest concrete block, drop out of consumer society, go to a 14 day silent meditation retreat, move to New Mexico, switch to a pre-paid phone plan and put your SIM in a used Motorola Razr.

Seeing who’s calling, reading a text message, and catching other little bits without pulling out your phone will be an easy win for the Apple Watch, and it’s the easiest part to understand, though some are skeptical it will make things easier. (I was when Pebble was announced and mocked it as a device for those too lazy to pull out their phones.)

Apple Watch will likely do a better job of it than the Pebble, which was already useful pretty much based solely on that (the apps on Pebble are too clunky, slow, and unusable.) That you will then be able to have super tiny instant reactions / interactions with it will make it not just faster than pulling out your phone, but better in some cases if they get it right.

Think of how many “coordination” text messages happen (are you there, are you on your way, etc.) that would be nice to respond to with a single tap on your wrist (‘omw’, ‘5 minutes’, or ‘running late’) rather than futzing with your phone while you’re trying to get somewhere.

Android Wear tried to do some of this, but my experience was that the interfaces, navigation, and hardware were too clunky to get it right quite yet. My expectation is Apple will do better on their first try, Android Wear will improve, and overall this sort of thing is inevitable for some segment of users.

Replacing Experiences

In the pre-smartphone era, you might have gotten on your laptop in the morning to check traffic for the morning commute. Today that would seem cumbersome (and your laptop probably isn’t on the bedside table like it was in 2005.)

In 2010 if you were on the cutting edge you might have checked your smartphone, using an application, perhaps seeing if other routes were better.

In 2015, you probably use your smartphone, but most modern iPhones or Android devices are smart enough to pull that information and experience outside the context of an application and provide it to you faster via Google Now or Apple’s Today experiences.

It’s not hard to imagine how you can take those sorts of experiences and replace them with something “glanceable” on a wearable that gets most of the value, but faster, with near-zero interaction required.

Interactions and interfaces are often the enemy of great experience - if you can do what you want without impediment, or with as little action as possible, you’ve saved more time in your day for “real” life.

In the early days, many people didn’t imagine that smartphones would swallow up as many “big device” activities as they have - maps, web browsing, messaging, social networking, photography. But as the hardware and software accelerated, it wasn’t just possible to replicate activities you’d do on a laptop, in many cases it was more pleasant. And this happened very, very fast — a period of a few years. Compared to the adoption of personal computers, the web, and other technologies, smartphone adoption and power is just mind-boggling.

If you squint, you can imagine something similar happening with wearable technology, with Apple Watch leading the effort. Directions, short messages, weather, traffic, and a whole host of things might be better experienced and general usage eclipsed with smaller, smarter, zero-interaction or “minimal” single tap experiences if we (as product and technology people) get our shit together.

In the same way that the constraints of the smartphone led to a whole world of simpler experiences vs. the bloated WIMP and web modes, wearables offer that same blank slate. This doesn’t mean the other platform experiences disappear, but if you can get 80% of the value for 80% of the use cases without the bigger form factor, interface, and annoyance, that’s a huge change in behavior and time spent.

Creating New Experiences

This is the part that is hard to really grok. It’s also the most exciting.

It’s easy to be cynical about how things will not work; it’s easy to speculate about how battery life isn’t great yet. It’s hard to imagine how things might actually work in the future and what will be most impressive.

When I see things like the “send a heartbeat” feature - this is a feature that has to be evaluated on an emotional level, not just a technological one. It’s a magical, awesome thing: you just can’t get the same visceral, emotional, magical thing to happen on any other platform.

Things that weren’t possible before that don’t necessarily even make sense at first glance - that’s what’s most exciting. You don’t get Instagram without the iPhone. You don’t get Snapchat without smartphones either, and you don’t get it by starting with something everyone understands.

New sensors, new inputs, and new contexts offer up completely new worlds.

The kinds of things that seem exciting to me:

  • self-evaluation - imagine a watch that listens to the tone of the voices around me and my heart rate to understand mood, and quietly pulses my wrist to try and get me to breathe deeply and lower my pulse
  • communications - imagine a watch face that shows my loved ones’ physical status (sitting/walking/running), and one tap to urge them on if they’re exercising
  • swapnote reborn
  • proximity alerts on my wrist if friends are nearby
  • a million other things I can’t even imagine

The point is: I’m probably buying one and if it’s awful I’ll be sad.

Fourteen

I remember nights when I would sit at my computer, staring at the screen, and telling myself I had something to write. Something to say. And even if it didn’t seem important I had to put something down. Even if I thought it was garbage. Because that’s the only way I’d ever get anything out there and get better at it and get over the dread that every thought I ever have is garbage and boring. (Only some of them are.)

I used to sometimes worry that nobody would ever read this.

14 years later I sometimes think nobody will ever read this! How liberating.

Social media solved the audience problem for personal web communication. People can find an audience on centralized social media sites. It makes writing into the cryptic blackhole void of the independent web nearly as strange now as when it first began. But the instant audience and feedback and hyper-virality is its own nightmare.

I used to think I wasn’t internet famous enough and what was I doing wrong but now I just crave less attention and I wonder if Snapchat is the only authentic communications modality in 2015.

Happy 14th birthday trenchant daily. I didn’t understand being 14, or other teenagers even when I was one, so I don’t expect to understand you either.

Reinventions

The thought that keeps coming back to me these days is - Forum 2000 already did this and it was funnier.

Platform Changes

I recently assembled a new PC to go along with a 34” ultrawide (21:9 aspect ratio) monitor.

Spending money on a bunch of PC components and then assembling them into a working computer by myself is something I have mostly avoided in my life, opting for the simplicity and peace of mind that comes from being primarily a Mac user since OS X 10.0 debuted, and letting companies build PCs for me the few times I’ve bought them.

But the cost/performance advantage of building it yourself over buying pre-built seems larger than I remember, and also, why not?

The sole purpose of this machine was gaming, and the simpler answer would have been to just buy a next generation console. But the latest generation of consoles seems to have almost all of the downsides of consoles and other “modern” hardware platforms - you can’t run arbitrary software - while also exhibiting the downsides of modern Microsoft-powered PCs: nothing ever seems to just work, and you’re constantly barraged by bad user experiences and multi-gigabyte downloads before you can even do anything.

The days when you could buy a cartridge of READ-ONLY memory and expect it to work more or less flawlessly on hardware are so long gone as to seem quaint. If Microsoft and Sony want to release PCs for the living room that can’t run the software I want (cough Steam cough) or the most interesting hardware (Oculus Rift), and force me to buy software from them and deal with their authentication/social experiences, then the UX/annoyance hits seem so high I might as well just deal with the headaches of a real PC that can play games better on my own terms. And probably I should also build the damned thing from parts to give myself a crash course on what I’ve missed in PC stuff from the past 15 years.

Also, did I mention the 34” ultra-wide monitor? That’s never going to work with an Xbox One. (It barely works with PC games half the time.)

And I am now able to use my Oculus Rift DK2 and have it properly sync at 75Hz. (Trying to use it with a 2012 Macbook Pro with Retina Display just made me sad.)

Overall the entire process of building a PC from parts is about a million times easier than I remember from previous decades (Cases! cord management! All so easy to use now!)

I had one “oh god I’ve made a huge mistake” moment when I couldn’t get the BIOS screen to show up on my new monitor. Turns out the motherboard’s onboard HDMI output wouldn’t sync to my crazy monitor. (I realized this when it synced to a television at 720p just fine.) Putting one of my two video cards in and connecting via DisplayPort made everything copacetic.

It’s really satisfying to see it all come together - except that after all that careful thought, purchasing, anxiety about possible DOA parts, and assembly, you turn on the machine, install, and boot into Microsoft Windows 8.1, which is just so disappointing. Expected, but disappointing.

After dealing with the “PC tax” of drivers, software upgrades, Steam, and the rest of it, it’s kind of nice.

In the past weeks I have played and completed:

  • Far Cry 4
  • Call of Duty: Ghosts
  • Wolfenstein: The New Order
  • Call of Duty: Advanced Warfare

running at 60FPS at 3440×1440 resolution and that has been pretty cool.

For more on running games like that see WSGF.

Dot Files All The Way Down

Is it too late to start keeping a .plan file as my canonical web expression?

Scholar

Only at Google, of course, would the world’s most popular scholarly search service be seen as a relative backwater.

Steven Levy, Making the world’s problem solvers 10% more efficient

I had the privilege of working with Anurag Acharya a little during my time at Google. He is a treasure, and his contributions to the world of scholarly publishing are enormous.

Happy 10th birthday, Google Scholar.

Changing Keycaps

I’ve now become the sort of curmudgeon who not only insists on a mechanical keyboard, but gets custom keycaps to replace the Windows keys.

Because they are distasteful.

~club, Unix and The Commoditization of Community

With Tilde Club Paul Ford reminds everyone that Unix is powerful even if we forgot it.

Writing about it on medium instead of ftrain makes me worry that the dream of the early web came true and we can never go back.

The web is a Heraclitus river but Michael Sippey is blogging again and Matt Haughey is blogging again amongst other highlights. I can’t even begin to keep up.

The BeOS batmobiles are long gone, but the M1A1 Abrams Linux tanks keep on rolling.

When I was a child a Unix workstation was so far out of my reach and my world that it cost as much as a car. When I was a teenager in the ’90s Linux made faux-Unix available for a fraction of the cost on commodity hardware, but you wouldn’t have confused the performance of a 486 with a Sun workstation. You were just pretending on the PC. (The “real work” of scientific computing and programming in academia was still done on the big, out of reach Sun and SGI workstations.)

Now Unix is everywhere. Unix is so pervasive and boring that now it’s fascinating, but we don’t use it like we did in the previous decades.

On the desktop OS X became the most mainstream Unix derived from the “legitimate” BSD legacy, only to be supplanted by Android’s use of the semi-legit Linux, which means everywhere all around me are little Unix boxes in people’s pockets, talking to a million anonymous Unix boxes in data centers.

For fun I have a $35 Linux computer. The faux-Unix on it is more powerful than those workstations and mainframes I could only imagine having as a child.

Tilde club reminds us that we have the tools to spin up digital virtual communities on software and hardware that is so cheap and abundant that it’s hard to fathom. The hardest part is actually doing something interesting with it.

And We're Back

I had a great wedding and honeymoon. Thanks for asking!

Maybe it’s time to update my web site again.

Here are books I read over the past few months that I liked:

Imagine Being Surrounded Only By Things That Bring You Joy

The Life-Changing Magic of Tidying Up: The Japanese Art of Decluttering and Organizing by Marie Kondo - my wife is the acquiring editor for the US edition of this book about how to rationalize our relationship with the objects around us.

The secret to tidying and organizing is you start by throwing things out! My favorite. For the past few years I’ve been trying this sort of thing and it does actually matter.

It’s only when you discard the awful that you have space for the wonderful. (Physically, emotionally, etc.)

Also it has a story about cell phone disposal that made me tear up. (Don’t tell anyone.)

The Phantom Phone Vibrations In Your Pocket

The Distraction Addiction by Alex Soojung-Kim Pang - Pang is a futurist, and this is his attempt to define, theoretically and practically, contemplative computing: a way to thoughtfully approach information technology use to enhance our lives (rather than letting it cause destructive behaviors.)

Similar lessons: rationalizing our relationship with technology to hack our way to a better world. Get rid of the bad. Focus on the good. It’s not so much that less is more with our devices and connections but that everything all over all the time hurts. Some of the little software tools I dabble with are about this.

Do-Over

Seconds: A Graphic Novel by Bryan Lee O’Malley - the new graphic novel from the creator of Scott Pilgrim - is incredibly beautiful, fun, and enjoyable, even if predictable.

The Dawn of Modern Web Computing

Close to the Machine: Technophilia and Its Discontents by Ellen Ullman

In 1997, the computer was still a relatively new tool—a sleek and unforgiving machine that was beyond the grasp of most users. With intimate and unflinching detail, software engineer Ellen Ullman examines the strange ecstasy of being at the forefront of the predominantly male technological revolution, and the difficulty of translating the inherent messiness of human life into artful and efficient code. Close to the Machine is an elegant and revelatory meditation on the dawn of the digital era.

The book feels authentic, personal, and resonant in a way that conveys that period and programming culture better than anything else I’ve read.

Conclusion

Turns out there’s more time for books when you give up Twitter.

The Mainstream Internet

Twitter feels like it’s my last connection to the hyper-connected social “mainstream” internet, and with my bots I feel like I’m about ready to automate myself out of it.

What’s next?

I feel too old for Snapchat, too tired to reboot 4uhm.

Like A Magic Spell

Cake flavors, place card designs, hors d’oeuvre options, playlists, and a million other small choices - but it’s the decisions between words that feel qualitatively different.

Marvel Cinematic Multiverse

https://twitter.com/mcu_movies

When I saw that Guardians of the Galaxy 2 was being made, I realized that all Marvel properties could now be made into movies, and the Marvel Cinematic Universe could live forever.

So I made a bot for it, that tweets things like -

Marvel Cinematic Universe Phase 39 - Avengers 39: Pet Avengers, Squirrel Girl 14, Imperial Guard, Starfox, Champions 4
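Output like the line above can come from a very small generator: pick a handful of properties at random, number some of them like sequels, and glue on a phase number. A minimal sketch in Python (the property list is a stand-in for illustration; the real bot’s data and code aren’t shown here):

```python
import random

# Hypothetical property list - stand-ins for illustration,
# not the bot's actual data set.
PROPERTIES = ["Pet Avengers", "Squirrel Girl", "Imperial Guard",
              "Starfox", "Champions", "Darkhawk", "Great Lakes Avengers"]

def phase_announcement(phase):
    """Generate one fake Marvel Cinematic Universe phase announcement."""
    slate = random.sample(PROPERTIES, k=random.randint(3, 5))
    films = []
    for title in slate:
        # Some entries get a sequel number to mimic an endless franchise.
        if random.random() < 0.5:
            films.append("%s %d" % (title, random.randint(2, 40)))
        else:
            films.append(title)
    return "Marvel Cinematic Universe Phase %d - %s" % (phase, ", ".join(films))

print(phase_announcement(39))
```

The joke does all the work; the code is barely more than `random.sample` and string formatting, which is part of why bots like this are so easy to excrete.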

Every time I make a Twitter bot I feel bad because why am I making art (?!) on Twitter and also aren’t there real people pouring their souls out onto the internet still and I’m just excreting out the same one-line joke over and over again forever digitally.