Deliberate Rest

A blog about getting more done by working less

Month: November 2012

Designing interfaces is designing users

About once a month I’ll come across an article or book that makes me think, I really wish I’d met that person when I was in Cambridge. This month it’s computer science professor Alan Blackwell, whose article “The Reification of Metaphor as a Design Tool” makes this point:

When we propose that a computer be presented as a metaphorical office or typewriter, one of the things we are really describing is the intended user of this computer, describing him or her as an office worker or typist. When we designer-researchers in HCI imagine a UI to be a work of literature, we are describing ourselves as creative authors rather than mundane technologists. The relationship between users and designers structures the commercial and social context of HCI, and is the basis of our academic and professional discipline.

This idea that technologies implicitly or explicitly design users, and that this is a process we need to pay attention to if we want to use our devices more mindfully, is central to one of the chapters of my book (which is now available for pre-order, did I mention?). Our interactions with computers, I argue, change the way we think about ourselves and the way we value human abilities like memory and cognition – and usually, after a few turns of Moore’s Law, we notice that computers keep getting faster and more powerful while we seem to stay the same.

The solution to the problem of working in bed is NOT a wifi-enabled bed

The Wall Street Journal reports that researchers “who study work habits say a new generation reared on mobile devices is increasingly accustomed to using them while propped against pillows, lying down or in a fetal curl:”

Half of 1,000 workers polled this year by Good Technology, a Sunnyvale, Calif., mobile-security software company, said they read or respond to work emails from bed. A study of 329 British workers found nearly 1 in 5 employees spends two to 10 hours a week working from bed, according to the 2009 poll by Credant Technologies, a London-based data-security company.

The Good Technology survey found:

the average American puts in more than a month and a half of overtime a year – just by answering calls and emails at home…. [M]ore than 80 percent of people continue working when they have left the office – for an average of seven extra hours each week – almost another full day of work. That’s a total of close to 30 hours a month or 365 extra hours every year. They’re also using their cell phones to mix work and their personal life in ways never seen before.

While 60 percent do it simply to stay organized, almost half feel they have no choice because their customers demand quick replies. Thirty-one percent of respondents admit to continuing to work at home as they find it hard to ‘switch off.’ Half of Americans can’t even put their phone down while in bed, as they read or respond to work emails after climbing under the covers.

(Good, btw, has found similar things in other countries: 88 percent of people in a recent Australian survey “said that they worked on their mobile devices outside regular working hours,” with “35 percent checking their phones and emails in bed,” and “24 percent of respondents fought with their partners over working on their mobile devices outside of work.”)
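For what it’s worth, the quoted totals do follow from the seven-extra-hours-a-week figure. Here’s a quick back-of-the-envelope check (a minimal sketch of my own, not from the survey; the weeks-per-month conversion is my assumption):

    # Sanity check of the overtime figures quoted above. The only input taken
    # from the survey is "seven extra hours each week"; weeks-per-month is an assumption.
    extra_hours_per_week = 7
    weeks_per_year = 52
    weeks_per_month = weeks_per_year / 12      # about 4.33

    per_month = extra_hours_per_week * weeks_per_month   # ~30.3 hours
    per_year = extra_hours_per_week * weeks_per_year     # 364 hours

    print(f"Extra hours per month: {per_month:.1f}")     # "close to 30 hours a month"
    print(f"Extra hours per year:  {per_year}")          # roughly the "365 extra hours every year"
    print(f"Eight-hour workdays:   {per_year / 8:.1f}")  # ~45.5 days, i.e. "more than a month and a half"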

The habit of working in bed is problematic because it can contribute to insomnia, ergonomic problems, and strain in your relationships. Of course, this isn’t just a problem; for some, it’s a catastra-tunity! The Wall Street Journal explains:

Reverie, a Walpole, Mass., maker of adjustable beds… is pushing to change the hospital image of adjustable beds to appeal to younger consumers, showing them how elevating the head or foot can ease strain while watching TV or working.

Reverie also offers a built-in power outlet in the base of its beds to plug in lamps, televisions or laptops. Both the outlet and the bed’s movement can be operated with a hand-held remote, or with the user’s smartphone or tablet via built-in Wi-Fi and Bluetooth.

Granted, some people do manage to get good work done in bed, so this isn’t necessarily bad for everyone; but it strikes me that it’s not a trend we should encourage, and that as users we should avoid it if we can.

The Internet is the Problem

Saw this in a parking lot in Mountain View this afternoon.



I don’t believe it for a minute, but I was still amused to see it.

Distraction Addiction Flickr set and slideshow

I thought it would be interesting to create a Flickr set that mixes images I use in my contemplative computing talks with photos from Cambridge and my year writing the book.

It’s currently rather on the long side, but perhaps I’ll get it cut down to a clean 100 pictures, or 50, or something that doesn’t exhaust everyone. Right now there’s a kind of Brian-Eno-ambient-imagery-gone-mad quality to it.

Ancient multitasking is older than we thought: Evidence from stone spear use and arrow manufacture

When I was at Cambridge (almost two years ago!), I stumbled on the work of cognitive archaeologists Lambros Malafouris and Colin Renfrew. Renfrew is a professor at Cambridge with whom I had a really interesting lunch, while Lambros is a fellow at Oxford (and one of those brilliant young academics who in a more generous and expansive era would have gotten tenure years ago). They in turn led me to studies of ancient multitasking, particularly Monica Smith's reconstruction of multitasking in everyday ancient life and Lyn Wadley's work on hafting.

So naturally, new research indicating that hafting is a much older practice than we realized caught my eye. Science has a new article on the subject, which is summarized on the AAAS Web site:

Early humans were lashing stone tips to wooden handles to make spears and knives about 200,000 years earlier than previously thought, according to a study in the 16 November issue of Science.

Attaching stone points to handles, or “hafting,” was an important technological advance that made it possible to handle or throw sharp points with much more power and control. Both Neandertals and early Homo sapiens made hafted spear tips, and evidence of this technology is relatively common after about 200,000 to 300,000 years ago.

Jayne Wilkins of the University of Toronto and colleagues present multiple lines of evidence implying that stone points from the site of Kathu Pan 1 in South Africa were hafted to form spears around 500,000 years ago. The points’ damaged edges and marks at their base are consistent with the idea that these points were hafted spear tips.

So why does this matter? The Guardian explains,

The invention of stone-tipped spears was a significant point in human evolution, allowing our ancestors to kill animals more efficiently and have more regular access to meat, which they would have needed to feed ever-growing brains. "It's a more effective strategy which would have allowed early humans to have more regular access to meat and high-quality foods, which is related to increases in brain size, which we do see in the archaeological record of this time," said Jayne Wilkins, an archaeologist at the University of Toronto who took part in the latest research.

The technique needed to make stone-tipped spears, called hafting, would also have required humans to think and plan ahead: hafting is a multi-step manufacturing process that requires many different materials and skill to put them together in the right way. "It's telling us they're able to collect the appropriate raw materials, they're able to manufacture the right type of stone weapons, they're able to collect wooden shafts, they're able to haft the stone tools to the wooden shaft as a composite technology," said Michael Petraglia, a professor of human evolution and prehistory at the University of Oxford who was not involved in the research. "This is telling us that we're dealing with an ancestor who is very bright."

This joins recent work on arrow-making, which demonstrates both that the manufacture and use of arrows are older than we thought, and that their complexity suggests ancient multitasking abilities:

"These operations would no doubt have taken place over the course of
days, weeks or months, and would have been interrupted by attention to
unrelated, more urgent tasks," observes paleoanthropologist Sally
McBrearty of the University of Connecticut in a commentary accompanying
the team’s report. "The ability to hold and manipulate operations and
images of objects in memory, and to execute goal-directed procedures
over space and time, is termed executive function and is an essential
component of the modern mind," she explains.

McBrearty, who has long argued that modern cognitive capacity evolved at
the same time as modern anatomy, with various elements of modern
behavior emerging gradually over the subsequent millennia, says the new
study supports her hypothesis. A competing hypothesis, advanced by
Richard Klein of Stanford University, holds that modern human behavior
only arose 50,000 to 40,000 years ago, as a result of some kind of
fortuitous genetic mutation that kicked our ancestors’ creativity into
high gear. But discoveries of symbolic items much older than that
supposed mutation–and older than the PP5-6 arrowheads for that
matter–have cast doubt on Klein’s theory. And other finds hint that
Neandertals, too, engaged in symbolic behaviors, which would suggest
that the capacity for symbolic thinking arose in our common ancestor
perhaps half a million years ago.

You damn kids get off my lawn, with your misinterpretations of neuroplasticity and media history!

I'm just getting around to Carl Wilkinson's recent Telegraph essay on writers "Shutting out a world of digital distraction." It's about how Zadie Smith, Nick Hornby and others deal with digital distraction, which for writers is particularly challenging. Successful writing requires a high degree of concentration over long periods, but the Internet can be quite useful for doing the sort of research that supports imaginative writing (not to mention serious nonfiction). Add in communicating with agents, getting messages from fans, and the temptation to check your Amazon rank, and you have a powerfully distracting device.

Unfortunately, the piece also has a couple of paragraphs featuring that mix of technological determinism and neuroscience that I now regard as nearly inevitable. Editors seem to require a section like this:

the internet is not just a distraction – it’s actually changing our brains, too. In his Pulitzer Prize-nominated book The Shallows: How the Internet is Changing the Way We Think, Read and Remember (2010), Nicholas Carr highlighted the shift that is occurring from the calm, focused “linear mind” of the past to one that demands information in “short, disjointed, often overlapping bursts – the faster, the better”….

Our working lives are ever more dominated by computer screens, and thanks to the demanding, fragmentary and distracting nature of the internet, we are finding it harder to both focus at work and switch off afterwards.

“How can people not think this is changing your brain?” asks the neuroscientist Baroness Susan Greenfield, professor of pharmacology at Oxford University. “How can you seriously think that people who work like this are the same as people 20 or 30 years ago? Whether it’s better or worse is another issue, but clearly there is a sea change going on and one that we need to think about and evaluate…. I’m a baby boomer, not part of the digital-native generation, and even I find it harder to read a full news story now. These are trends that I find concerning.”

As with Nick Carr's recent piece, Katie Roiphe's piece on Freedom, everything Sven Birkerts has written since about 1991, and the rest of the "digital Cassandra" literature (Christopher Chabris and Daniel Simons called it "digital alarmism"), I think the problem here is that statements like these emphasize the flexibility of neural structure in a way that ironically diminishes our sense of agency and capacity for change. The argument works like this:

  1. The world is changing rapidly.
  2. Our media are changing very rapidly.
  3. Brains adapt to their (media) environments.
  4. Therefore our brains must be changing very rapidly.
  5. These changes are beyond our control.

I don't want to argue, pace Steven Poole, that this is merely neurobollocks (though I love that phrase). (Nor do I want to single out Baroness Greenfield, who's come in for lots of criticism for the ways she's tried to talk about these issues.)

All I want to argue is that 1-4 can be true, but that doesn't mean 5 must be true as well.

Technological determinism is not, absolutely not, a logical consequence of neuroplasticity.

It's possible to believe that the world is changing quickly, that our brains seek to mirror these changes or adapt to them in ways we're only starting to understand (and have a long way to go before we completely comprehend), and that a lot of this change happens without our realizing it, before we're aware of it, and becomes self-reinforcing.

But – and this is the important bit, so listen up – we also have the ability to observe our minds, to retake control of the direction in which they develop, and to use neuroplasticity for our own ends.

Because we can observe our minds at work, we can draw on a very long tradition of practice in building attention and controlling our minds – no matter what the world is doing. Yes, the great Jeff Hammerbacher line that "The best minds of my generation are thinking about how to make people click ads" is absolutely true*, but when all is said and done, even Google hasn't taken away free will.

We can get our minds back. It's just a matter of remembering how.

* And can even be represented in graphical form.

A Thomas Merton line I must use one day

In the opening paragraph of Mystics and Zen Masters, Merton contrasts "the monk or the Zen man" to "people dedicated to lives that are, shall we say, aggressively noncontemplative."

That pretty much describes us all these days. Though it's worth noting that Merton published his book in 1961, before the Internet, when even long-distance telephone calls were a rarity in some parts of the U.S.

Pre-order The Distraction Addiction!

My book is now available for pre-order. Amazing.



It’s been about two and a half years since I first wrote the words “contemplative computing” (or typed them on my iPhone, actually), and nearly two years since I started my fellowship at Microsoft Research Cambridge to explore the idea in depth. It’s a bit surreal to watch the book move closer to the bookshelves, but it’s also very cool.

Google is a search engine, not a Free Will Destruction Machine

The Memory Network has published a new essay by Nick Carr on computer versus human memory. This is a subject I’ve followed with great interest, and when I was at Microsoft Research Cambridge I had the good fortune to be down the hall from Abigail Sellen, whose thinking about the differences between human and computer memory is far subtler than my own.

Carr himself makes points about how human memory is imaginative, creative in both good and bad ways, changes with experience, and has a social and ethical dimension. This isn’t new: Viktor Mayer-Schönberger’s book Delete: The Virtue of Forgetting in the Digital Age is all about this (though how well it succeeds is a matter of argument), and Liam Bannon likewise argues that we should regard forgetting as a feature, not a bug.

The one serious problem I have with the piece comes after a discussion of Betsy Sparrow’s work on Internet use and transactive memory:

We humans have, of course, always had external, or “transactive,” information stores to supplement our biological memory. These stores can reside in the brains of other people we know (if your friend Julie is an expert on gardening, then you know you can use her knowledge of plant facts to supplement your own memory) or in media technologies such as maps and books and microfilm. But we’ve never had an “external memory” so capacious, so available and so easily searched as the Web. If, as this study suggests, the way we form (or fail to form) memories is deeply influenced by the mere existence of outside information stores, then we may be entering an era in history in which we will store fewer and fewer memories inside our own brains.

To me this paragraph exemplifies both the insights and the shortcomings of Carr’s approach: in particular, with the conclusion that “we may be entering an era in history in which we will store fewer and fewer memories inside our own brains,” he ends on a note of technological determinism that I think is both incorrect and counterproductive. Incorrect because we continue to have, and to make, choices about what we memorize, what we entrust to others, and what we leave to books or iPhones or the Web. Counterproductive because thinking we can’t resist the overwhelming wave of Google (or technology more generally) disarms our ability to see that we can still choose to use technology in ways that suit us, rather than in ways that Larry and Sergei, or Tim Cook, or Bill Gates, want us to use it.

The question of whether we should memorize something is, in my view, partly practical, partly… moral, for lack of a better word. Once I got a cellphone, I stopped memorizing phone numbers, except for my immediate family’s: in the last decade, the only new numbers I’ve committed to memory are my wife’s and kids’. I carry my phone with me all the time, and it’s a lot better than I am at remembering the number of the local taqueria, the pediatrician, and so on. However, in an emergency, or if I lose my phone, I still want to be able to reach my family. So I know those numbers.

Remembering the numbers of my family also feels to me like a statement that these people are different, that they deserve a different space in my mind than anyone else. It’s like birthdays: while I’m not always great at it, I really try to remember the birthdays of relatives and friends, because that feels to me like something that a considerate person does.

The point is, we’re still perfectly capable of making rules about what we remember and what we don’t, and of making choices about where in our extended minds we store things. Generally I don’t memorize things that I won’t need after the zombie apocalypse. But I do seek to remember all sorts of other things, and despite working in a job that invites perpetual distraction, I can still do it. We all can, despite the belief that Google makes it harder. Google is a search engine, not a Free Will Destruction Machine. Unless we act like it’s one.

A forgotten piece of the book: an essay on Digital Panglosses versus Digital Cassandras

Chad Wellmon has a smart essay in The Hedgehog Review arguing that “Google Isn’t Making Us Stupid…or Smart.”

Our compulsive talk about information overload can isolate and abstract digital technology from society, human persons, and our broader culture. We have become distracted by all the data and inarticulate about our digital technologies…. [A]sking whether Google makes us stupid, as some cultural critics recently have, is the wrong question. It assumes sharp distinctions between humans and technology that are no longer, if they ever were, tenable…. [T]he history of information overload is instructive less for what it teaches us about the quantity of information than what it teaches us about how the technologies that we design to engage the world come in turn to shape us. 

It’s something you should definitely read, but it also reminded me of a section of my book that I lovingly crafted but ultimately edited out, and indeed had pretty much forgotten about until tonight. It describes the optimistic and pessimistic evaluations of the impact of information technology on – well, everything – as a prelude to my own bold declaration that I was going to go in a different direction. (It’s something I wrote about early in the project.)

I liked what I wrote, but ultimately I decided that the introduction was too long; more fundamentally, I was really building on the work of everyone I was talking about, not trying to challenge them. (I don’t like getting into unnecessary arguments, and I find you never get into trouble being generous in giving credit to people who’ve written before you.) Still, the section is worth sharing.

