By the time I got to the theaters last month to see “Booksmart,” actor Olivia Wilde’s directorial debut, it had already engendered a ton of hand-wringing over its poor box office performance relative to its widespread critical plaudits. Financially, it was declared dead on arrival. I’m a fan of independent movies, and I like stories about women made by women, so I figured I’d buy a ticket and do my part in supporting the kind of cinema that I want to see more of. Unfortunately I found “Booksmart” incredibly unpleasant to watch. Not only are its plot and characters remarkably—even aggressively—flimsy and inconsistent, but it traffics in a vision of teenagers as slightly downsized twenty-somethings—with all of the ravenous consumer and sexual appetites of urban professionals, hampered only by modestly limited spending power—that I personally find to be lazy and offensive.
My wife and I actually got into a bit of an argument over how “Booksmart” compares with “Always Be My Maybe,” which was released on Netflix at almost exactly the same time. “Always” is another diversity breakthrough in that it was written by and stars two Asian American leads, and is set in a milieu that’s almost entirely Asian American. I wouldn’t make the case that “Always Be My Maybe” is a masterpiece, but I did think the essential motivations of its plot and characters followed sound logic, which in my view is a claim you can’t make for “Booksmart.” That’s really all I want from most movies: a reasonably accurate simulacrum of the way real people act and behave, regardless of how outlandish or unrealistic their circumstances might be. Plus, if a movie bills itself as a comedy, it would be nice if it were also genuinely funny. I burst out laughing several times during “Always Be My Maybe.” I think I might have audibly chuckled once during “Booksmart.”
One more comment about “Booksmart”: despite my distaste for it, I left it more convinced than ever that Beanie Feldstein, one of its two leads, is among the most watchable actors working today. She’s a dynamo.
Speaking of independent cinema, I also went to see “The Last Black Man in San Francisco,” a startlingly rich portrait of gentrification told from the perspective of utterly unique African American characters. I couldn’t believe my eyes for the first ten or fifteen minutes; first-time director Joe Talbot practically plows over you with the assuredness of his vision. The rest of the movie can’t quite live up to its opening, but it’s never less than mesmerizing.
Unfortunately, in the third act Talbot also succumbs to a common trap in auteurist filmmaking: trying to make his point through the conceit of a play staged by his characters. I’ve learned through many bad experiences that a play-within-a-film is almost invariably a signal both that the director and screenwriter are very serious and that they are stumped as to how to present their very important ideas. John Turturro’s horrific “Illuminata” and Alejandro González Iñárritu’s pompous “Birdman” are two debacles I barely endured that come to mind. Luckily “The Last Black Man in San Francisco” never stoops as low as those and is well worth a watch. I can’t wait to see what Talbot does next.
Aside from that it was a relatively light month of movie watching. Here is my full list of all twelve that I saw.
“Always Be My Maybe” (2019) Keanu’s cameo got the most attention, and it’s not even the best part!
“The Sting” (1973) Rewatched. An almost perfect little fairy tale of the confidence game.
A box of a hundred jumbo paperclips costs a little less than US$2 at Staples but the folks at gift and home accessories brand Areaware sell a box that will set you back more than ten times that for just thirty of them. Still, if you’re going to spend 67¢ per paperclip, you might as well get these, created by Dutch studio Daphna Laurens. They’re shaped into almost extravagantly unconventional forms and yet they also remain instantly recognizable for their purpose, which is the kind of savvy aesthetic accomplishment that I personally find to be really f’ing cool. I’ve got a box and I only use them on my best stacks of paper.
If you like them, you can spend your hard-earned money on them over at areaware.com.
There are so many Dropbox integrations available that the service seems essential, or at least difficult to imagine doing without. Over the years I’ve hooked numerous apps and services into my Dropbox account, which is why I started paying for the professional plan seven years ago. And yet each year, at renewal time, I think a bit more deeply about the question of whether Dropbox is in fact so indispensable. This is the very boring story of how I came to realize that it’s not.
It’s no secret that online storage has become more and more common, but I was still taken aback a bit when I recently took an accounting of all the places where I have access to at least a terabyte or more of it: iCloud Drive, where I also have all of our family photos, and Google Drive, where I have my email and office productivity, are the most prominent examples. But I also have backup storage on my Synology NAS, and “cold storage” that I’ve set up on Amazon S3 Glacier. And those are just the options that I pay for myself; at work I can store files on Microsoft OneDrive and even on an enterprise version of Dropbox.
Clearly, storage is a commodity now. And while Dropbox has worked hard to differentiate itself with new features, at its core, it’s still hard to argue that the service is truly much more than storage. Even the company’s elegantly designed and reasonably popular Paper app hardly feels additive; it’s hard to make a case for innovation when the key value-add is something as basic as word processing.
Of course, part of the reason I stuck with it for so long was because the prospect of untangling Dropbox from my life had daunted me for years. That’s the genius of a service like this: at first you think you’re just buying storage, but little by little it finds its way into everything you use.
The only way to kick a habit though is to start kicking it. I logged into the Dropbox site and then opened the “Connected Apps” section of my settings screen. That displayed a list of over three dozen apps and services that I had hooked into the service over the past decade or so of usage. That’s when I realized that surprisingly few of them actually seemed all that critical anymore, if they ever even were. Many of them I hadn’t used in ages, and most others could easily be replaced by iCloud Drive. A very few would be difficult to use without Dropbox, but I realized that the actual storage they required was effectively de minimis and that they could easily reside on Dropbox’s free tier, if necessary.
There was a standout exception on this list though: 1Password, the absolutely essential password manager, whose password vaults I sync between devices via Dropbox. It’s possible to sync 1Password with iCloud, but neither iCloud nor Dropbox is as robust and secure for this purpose as AgileBits’ own 1Password membership option. That comes at a cost of course, but a 1Password family plan, which covers up to five users and costs half the price of a Dropbox subscription for just me, struck me as far more economically wise anyway.
Having confirmed that my apps would be largely unaffected, I turned my focus to shifting my stuff off of Dropbox—or deleting it—at scale. I started with my projects folder, the largest single directory I have on the service. It proved surprisingly difficult to shift to iCloud Drive; for several days, Dropbox seemed stalled or out of sync between my devices as it worked overtime to delete hundreds of thousands of individual files. Eventually it sorted itself out, and I then started rooting through all the other directories where I’d stashed photos, screen grabs, stock art, fonts, music, PDFs, video clips and countless other random items over the years, deleting most of them and refiling others on iCloud.
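Incidentally, for bulk moves like this a command-line copy with a quick file-count check afterward can be less flaky than dragging folders around in Finder and waiting for sync to settle. Here’s a minimal sketch, assuming the default macOS locations for Dropbox and iCloud Drive (the function name and example paths are illustrative, so adjust for your own machine; `rsync -a` would work here too and can resume an interrupted copy):

```shell
# Copy the contents of one folder into another, then print rough
# file counts so you can sanity-check before deleting the originals.
migrate_dir() {
  src="$1"
  dest="$2"
  mkdir -p "$dest"
  # "src/." copies the *contents* of src, preserving subdirectories
  cp -R "$src/." "$dest/"
  echo "source: $(find "$src" -type f | wc -l) files"
  echo "dest:   $(find "$dest" -type f | wc -l) files"
}

# Example usage (hypothetical paths):
# migrate_dir "$HOME/Dropbox/projects" \
#   "$HOME/Library/Mobile Documents/com~apple~CloudDocs/projects"
```

Only after the counts match on both sides does it make sense to start deleting anything from the Dropbox side.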
One of the hardest things to figure out was how to handle a shared directory that my wife and I use to store bills and paperwork. iCloud Drive’s folder sharing is still only in beta and won’t officially be ready until this fall, and even then my wife has historically been less aggressive about upgrading than I am. As an interim plan, I decided to make copies of the files I knew I would need continued access to, then disconnect my account from the directory altogether for the time being (my wife retains access), until we can recreate the share on iCloud Drive. It’ll be inconvenient for us to coordinate these items in the meantime, but not insurmountable.
As with any file system in continual use for nearly a decade, my Dropbox account had accumulated a ton of cruft. All in all, it took me three weeks of persistent pruning, bit by bit, to whittle it down from nearly a terabyte to just seventy-eight megabytes. It was annoying, and not at all the way I wanted to spend even a little bit of my life, but then again I had a very nice feeling of satisfaction when I was able to cancel the pending auto-renewal of my Dropbox account, which was set to happen this coming week.
More to the point, disconnecting from Dropbox was time consuming, but it wasn’t difficult. The process was high friction but relatively straightforward, and not at all technically challenging. At the outset I had expected that switching away from Dropbox would break many parts of my workflow; in practice, very little of it has been disturbed at all, even though the iCloud Drive features that are ostensibly allowing me to switch are still in beta. I’m certainly not extolling the virtues of leaving Dropbox if you find it indispensable in your own work—it’s still the best option if you need to share files across non-Apple platforms. But the relative ease with which I was able to leave it illustrates Steve Jobs’s famous criticism well: Dropbox is a feature, not a product. We often mistakenly believe that software features are irreplaceable, but they rarely are, especially in categories as thoroughly commoditized as storage.
This is what usually happens: a film creates a compelling fantasy world and fans clamor for more. So sequels build that world out; they show more of its mechanics, its people, its history. But “John Wick: Chapter 3—Parabellum” demonstrates one little-acknowledged principle of escalated world building: the inevitable outcome is bureaucracy.
If nothing else, this latest chapter in the surprisingly successful Keanu Reeves franchise makes it clear how fan service can undermine everything that made the franchise interesting in the first place. “Parabellum” spends a distressing amount of time literally explicating the bureaucracy of the Wick-verse—pointlessly. The outcome is tedium so pervasive that even the fighting—and the fighting is why you watch a John Wick movie in the first place!—left me bored. Read my whole review here.
Longtime readers may remember that a little more than two years ago my friend Scott Ostler and I released a Mac app called Bumpr. Well today we have a major update in the App Store. Before I go into what’s new, here’s a refresher: Bumpr is a simple utility that lets you click on any web link and choose which browser to open it with, on the fly. It looks like this:
Tons of Bumpr users have told us that, like me, they’ve come to rely on it for the power to choose their favorite browsers and email apps every day, dozens of times a day. I still use it religiously, and it’s among the first two or three things I install on any new Mac.
Today’s release brings a few new features that lots of users have been asking for. The first is the ability to define custom rules so that your favorite domains always open in the browsers you choose.
The second major feature is a pair of extensions for Safari and Chrome, which bring Bumpr functionality to you as you surf the web. Once installed in either browser, you can click on the new Bumpr toolbar icon to open the current page in any other browser, or right-click on any link to do the same. This makes Bumpr even handier for controlling the flow of what you see and where.
This upgrade is free for all current owners of Bumpr. If you haven’t tried it yet, we’re having a special sale right now: Bumpr is thirty percent off at just US$3.99. This new version is in the App Store right now. Once you’ve installed it, you won’t know how you got along without it before.
The folks at Creative Mornings recently interviewed me about blogging, which is a topic you don’t really hear a lot about these days. They did this in collaboration with WordPress, who are of course proponents of people publishing their own websites—and owning those sites—using their seminal blogging software.
In fact, the interview is hosted at a WordPress-owned site called Own Your Content, a beachhead for an eponymous campaign that encourages creative professionals to “own their content, platform, and the future of their work.” Unsurprisingly, one of the questions addresses the issue of centralized writing platforms—which, frankly, means Medium—and whether or not I believe that people should be using them or should be using independent publishing tools like WordPress. In my answer, I try to draw a distinction between the idea of publishing “content” and “writing”:
…far be it from me to pretend that I know what most people should be doing. Many terrific careers have been born of creating works on centralized platforms, where the creator has only the most tenuous ownership over what he or she is creating or its brand.
That said, I personally can’t imagine handing over all of my labor to a centralized platform where it’s chopped up and shuffled together with content from countless other sources, only to be exploited at the current whims of the platform owners’ volatile business models. I know a lot of creators are successful in that context, but I also see a lot of stuff that gets rendered essentially indistinguishable from everything else, lost in the blizzard of ‘content.’
Not that the work I do is all that important or memorable, but I prefer to think of it as ‘writing’ rather than as ‘content.’ And for me, that’s an important distinction. Content and writing are not the same thing, at least the way that we’ve come to define them in contemporary society. Content is inherently transactional; its goal is to drive towards some kind of conversion, some kind of exchange of value. This is why platforms just think of it all as ‘content’; for the most part, they’re indifferent to whether it’s good or bad writing, or even if it’s writing at all. It doesn’t matter whether it has any kind of inherent worth, whether it’s video or animated GIFs or whatever—so long as it’s driving clicks, time spent, purchases, etc.
Again, I’m not suggesting that what I do has any superior worth at all, but what I will say is that the difference between content that lives on a centralized blogging platform and what I do on a site that I own and operate myself—where I don’t answer to anyone else but me—is that my writing on Subtraction.com has a high tolerance for ambiguity. It’s generally about design and technology, but sometimes it’s about some random subject matter, some non sequitur, some personal passion. It’s a place for writing and thinking, and ambiguity is okay there, even an essential part of it. That’s actually increasingly rare in our digital world now, and I personally value that a lot.
In retrospect, my view on content is a bit too harsh, I think. Content is an unavoidable reality of the contemporary Internet because it’s virtually impossible to do anything online today without being involved in a transaction of some kind. And there’s a lot of good content out there too, much of it on Medium, in fact. What I regret though is that it’s almost all become content, and that there is relatively little writing on the Internet these days that isn’t transactional, that actually has a tolerance for ambiguity. Read the full interview at ownyourcontent.wordpress.com.
I’m incredibly humbled by the whole thing but I have to say it’s directly a function of the rich culture of innovation at Adobe. I’ve said before that the reason I work there is that, as the only multibillion-dollar professional creativity company in the world, Adobe allows people like me to work on problems that no other company would even be interested in, much less realize the potential of. It’s an honor to get this recognition, but I’m only one of many, many people at Adobe who are frankly having the time of our lives reimagining what professional creativity can be.
You can read the write-up about me in Fast Company’s ranking at fastcompany.com. Also, note that I’m one slot ahead of actor and icon Michelle Pfeiffer. I can’t tell you how good it feels to finally have an edge in my years-long rivalry with Michelle Pfeiffer.
I got out to theaters twice last month to see “Shazam!” and “Avengers: Endgame,” both jam-packed with superhero action and special effects (and, incidentally, virtually indistinguishable from one another). But the most thrilling new movie I saw in April was Steven Soderbergh’s very odd “High Flying Bird”—on my iPad.
Despite its abundant verve and daring, this original Netflix release from director Steven Soderbergh is almost perfectly designed to be swallowed up whole by today’s media landscape, before anyone notices. It’s ostensibly a drama about the world of professional basketball but it includes virtually no basketball; its cast is noticeably lacking in star power, even if the performances by up-and-comers André Holland and Zazie Beetz are transfixing; and its plot is so oblique as to practically defy any buzz spread by word-of-mouth. It’s almost unsurprising that when it debuted back in February, it was met with a mixed reception before promptly sinking into the deep, mealy swamp that is Netflix’s bottomless catalog.
Still, I found it riveting. “Bird” is the latest vehicle for Soderbergh’s fascination with iPhone cinematography, and the result is, if not uniformly pleasing, never less than alive, imbued with a powerful, hyper-aware detail and immediacy. In many ways the aesthetic is perfectly matched by the unapologetically ambitious script from screenwriter Tarell Alvin McCraney (who also wrote the Oscar-winning “Moonlight”). Both are intensely precise—deep focus imagery and dense, nuanced dialogue—yet paradoxically vague and open to interpretation, and both are beautiful in inelegant, even brutalist ways. You never quite know what you should be looking at or even listening to, but the wallop they pack together is undeniable.
Here is a full list of all seventeen movies I watched in April.
Here is a presentation that I made last week about how to understand the design process, explained through the lens of Thanos, that lovable scamp from “Avengers: Endgame.” (Mild spoilers included.)
If we want more awareness of and appreciation for our work, explaining design to people who aren’t already well-versed in the field is one of the most worthwhile things that we can do as professionals. It’s also one of the hardest. Which is probably why I procrastinated so much in preparing for this talk last week, when Gimlet Creative invited me to come and help them “get a little smarter about design.” Gimlet is of course the production house behind “Wireframe,” the podcast about design that I host, but the invitation was to address the entire team of audio producers working on many different shows on many different topics.
I lecture fairly frequently and so have a good number of talks on hand, but I didn’t have one that laid out the basics of the design process for an uninitiated audience. Some people can extemporize grandly on anything even vaguely relevant to their areas of expertise, but I always need plenty of time to write and rehearse. In my head, I had expected to devote one day early last week to writing it from scratch and another to rehearsing, but suffice it to say, that didn’t quite work out. I ended up cramming it all into Wednesday night, the day before my appearance at Gimlet.
Actually, to be totally frank, what happened was that I came by two tickets for “Avengers: Endgame” on Tuesday, and so of course, nothing got done. By the time I sat down to start writing on Wednesday, I had that feeling of being in a real jam, as I was due to give my spiel on Thursday at 10:30 a.m. Not only was I running out of time, but I was completely stumped as to how to tell a story that would resonate. How does one explain a subject as expansive and nuanced as design without boring the heck out of an audience as smart and discerning as this one? And how to figure that out the night before?
After an hour or so of panic, I had a realization that there was an “in” here that would, at the very least, make the topic more accessible for me: I could explain design through the lens of “Avengers: Endgame.” This would require accepting a pretty silly conceit: the idea that the master plan that Thanos, the central villain, enacts in “Endgame” and its predecessor, “Infinity War,” was in fact a kind of design. Or, at the very least, it’s an example of design gone wrong, and that in itself could be a useful way of explaining how design works.
Settling on that concept allowed me to power through the rest quickly—I just used the notion of Thanos being a fairly incompetent designer as a framework on top of which I could hang a bunch of stuff about design that I already knew. The whole talk is hardly genius, but I would contend that it’s mildly fun, at least, which is a useful step towards making design a little bit more relatable. And based on the massive box office receipts for “Endgame,” even if this makes design more relatable for a tiny fraction of moviegoers, that would be a victory.
The full presentation is embedded above. Of course, without my talk track my intentions aren’t always apparent, so I’ve added excerpts from it to selected slides. For maximum legibility though, the deck is available over at speakerdeck.com. Enjoy.
An article published yesterday in The Washington Post demonstrates the danger of design’s failure to broaden popular understanding of our craft. It tells the story of hackers compromising Nest Cams in private homes by taking advantage of lax security on the cameras. And it pins the blame for this on technology companies’ focus on reducing “what Silicon Valley calls ‘friction’—anything that can slow down or stand in the way of someone using a product.” The assertion is that Nest and other companies could better secure devices like the Nest Cam by requiring measures such as two-factor authentication of user accounts, but are reluctant to do so because that would make the products more difficult to use.
It’s certainly true that more could be done to encourage better security practices for Nest Cams (and in fact for most every other smart home device; the category is in desperate need of a privacy and security overhaul). But the concept of user experience writ large is not to blame here; what’s actually at fault is bad user experience practice.
There are at least a few other designs that could have been more conducive to users’ interests here: Nest could force users to consciously opt out of two-factor authentication; it could more clearly warn users of the danger of not opting into two-factor authentication; it could offer an option where account access is restricted entirely to local IP addresses; and more besides. Privacy and security are not at odds with user experience; in fact they are raw materials that designers must use to create good products.
Nest just happened to make an injudicious design decision. But the article frames a focus on low-friction user experience design as suspicious at best, and inherently compromising at worst. Any professional product designer knows that’s hogwash, of course, but the gospel of our profession—the idea that designers are motivated to make people’s lives better—is lost on the audience of a mainstream news organization like The Post.
We could chalk this up to lazy journalism, but in fact the fault lies with us, with designers who have utterly failed to explain to the world at large what it is that we do. There is little comprehension of what design does or how to define user experience, and of what possibilities exist within these broad, amorphous concepts for everyday people. Design, as I’ve argued many times, is still a mystery to the uninitiated—including otherwise savvy reporters. In the absence of understanding, suspicion and fear rush to fill the void, which is what’s on display here.