Deliberate Rest

A blog about getting more done by working less

Month: January 2011

Reading Mihaly Csikszentmihalyi’s Flow

This weekend I started reading Mihaly Csikszentmihalyi's Flow. First published in the early 1990s, it's now considered something of a classic in psychology, and has influenced scientists in a number of areas, from psychoanalysis to human factors research.

The first and maybe most important thing to know is that the author's name is pronounced "chick-sent-me-high" (or "cheek-saint-me-high-ee" depending on who you ask), with the accent on the first syllable.

It's a fascinating, rich book; a couple of parts are a bit dated (the information-theory vision of the brain and consciousness shows its age, but that doesn't fatally affect his argument). What I really like about the book is that it's interested in the fundamental question: what is happiness? Csikszentmihalyi's answer is a bit counterintuitive, but quite rich and interesting. This means he's not just interested in isolated experiences, but in the overall shape and tone of life: it's an interest in values rather than merely specific ends. "Happiness is not something that happens," he argues:

It is a condition that must be prepared for, cultivated, and defended privately by each person. People who learn to control inner experience will be able to determine the quality of their lives, which is as close as any of us can come to being happy…. It is by being fully involved with every detail of our lives, whether good or bad, that we find happiness, not by trying to look for it directly. [Reaching happiness involves] a circuitous path that begins with achieving control over the contents of our consciousness. (2)

So what defines that control?

Continue reading

Joshua Lederberg on the sociability of computer centers

Something brought to mind Joshua Lederberg's description of his early computing efforts at Stanford, and the way that the computer center became a meeting-ground for all kinds of people:

In those days, we had a B220 – which would match a fairly feeble PC today – as the first campus machine. Its operating system would accept decks of punched cards in serial batch mode, with output either from the printer or new punched cards. The usual turnaround time was about 12 hours. If you got to the computer room around midnight, you might get another pass by 2 A.M. The democracy and night-owl ambience of the batch system was a social mixer for several enthusiasts from wide-ranging disciplines. (I particularly recall Tony Hearn, who was starting his symbolic algebra system, REDUCE, on the IBM 7090). The impedance of a one-pass-per-day turnaround certainly did filter out all but the most enthusiastic.

I suspect there's a whole social history of the early campus computer center as an intellectual crossroads waiting to be written. From the bits and pieces I've picked up over the years, it sounds like computer centers were like printer's shops in the 16th century (as described by Elizabeth Eisenstein in The Printing Press as an Agent of Change): spaces that attracted and mixed together all sorts of people, who shared an interest in a new technology and – thanks to their desire to be close to the machine – discovered that they had many other mutual interests as well.

He also had this little "kids today have no values" throwaway line, which I've heard from other old-school programmers (like my father-in-law):

You also spent a lot of energy trying to simulate the machine in your own thought, in contrast to the casual, experimental mode – "Let's see if this works" – of today's interactive systems.

Last thoughts on Digital Cassandras and sabbaths

I've spent the last week working through the literature criticizing the effects of the Internet on our brains, on the balance between our inward (private, contemplative) and outward (public, social) lives, and on our ability to read novels, and I wrapped up with arguments for unplugging. Next week I start constructing the response.

Fundamentally, while I'm sympathetic to these arguments, I think the way they've framed the problem and its solutions is seriously flawed. I've already talked about the problems with equating literary reading with thinking and intelligence more generally; three other issues concern me here.

Continue reading

Digital sabbaths

For the last day I've been reading about the digital sabbath movement, using it to try to better understand critiques of digital culture and life: how people self-diagnose the problems of being connected, and how they frame their responses.

Continue reading

The literary undergrowl of the old regime*

When I first got into arguments about “what the Web is doing to our culture and our brains,” I was managing editor of the Encyclopaedia Britannica. It was an extraordinary personal experience and a fascinating time to be in reference publishing, but those arguments were also very much shaped by the interests of the time: in particular, debates over the impact of the Internet were taking place in the shadow of the culture wars and arguments about the future of literature and criticism. Reading the literature on how the Web is making us dumb, I’m struck by the fact that while we no longer talk about how Derridean the experience of reading on the Web is (even though publishers, authorities, and traditional ideas about grammar and quality are busy deconstructing themselves), the opponents of that view are still going strong. They’re still using literature as a proxy for intelligence and education, and are still implicitly making the case for the importance of literature in the formation of the self.

Continue reading

Reading and contemplation

I've seen this in two recent essays. The first is David Ulin's widely circulated essay "The Lost Art of Reading," which draws a link between reading and contemplation. He begins by recalling how

Sometime late last year — I don't remember when, exactly — I noticed I was having trouble sitting down to read…. It isn't a failure of desire so much as one of will. Or not will, exactly, but focus: the ability to still my mind long enough to inhabit someone else's world, and to let that someone else inhabit mine.

So what was going on?

Such a state is increasingly elusive in our over-networked culture, in which every rumor and mundanity is blogged and tweeted. Today, it seems it is not contemplation we seek but an odd sort of distraction masquerading as being in the know. Why? Because of the illusion that illumination is based on speed, that it is more important to react than to think, that we live in a culture in which something is attached to every bit of time.

Reading matters, Ulin argues, because it "is an act of contemplation,"

perhaps the only act in which we allow ourselves to merge with the consciousness of another human being. We possess the books we read, animating the waiting stillness of their language, but they possess us also, filling us with thoughts and observations, asking us to make them part of ourselves….

Contemplation is not only possible but necessary, especially in light of all the overload… [in an age where] time collapses into an ever-present now. How do we pause when we must know everything instantly? How do we ruminate when we are constantly expected to respond? How do we immerse in something (an idea, an emotion, a decision) when we are no longer willing to give ourselves the space to reflect? This is where real reading comes in — because it demands that space, because by drawing us back from the present, it restores time to us in a fundamental way.

The second is Sven Birkerts's recent American Scholar essay on reading in the digital age.

I read novels in order to indulge in a concentrated and directed sort of inner activity that is not available in most of my daily transactions. This reading, more than anything else I do, parallels—and thereby tunes up, accentuates—my own inner life, which is ever associative, a shuttling between observation, memory, reflection, emotional recognition, and so forth. A good novel puts all these elements into play in its own unique fashion.

We always hear arguments about how the original time-passing function of the triple-decker novel has been rendered obsolete by competing media. What we hear less is the idea that the novel serves and embodies a certain interior pace, and that this has been shouted down (but not eliminated) by the transformations of modern life. Reading requires a synchronization of one’s reflective rhythms to those of the work.

Concentration is no longer a given; it has to be strategized, fought for. But when it is achieved it can yield experiences that are more rewarding for being singular and hard-won.

Hamlet’s BlackBerry

A few weeks ago I read William Powers' book Hamlet's BlackBerry, but with the move and everything I couldn't really write much about it. Still, it's worth noting.

Powers' book mainly consists of a series of case studies of reactions to new media innovations in the past – from the inevitable discussion of the Phaedrus (does anyone read The Republic any more?) to the equally inevitable reading of Marshall McLuhan. The case studies are made in support of an argument that has three big parts.

Continue reading

Wading into The Shallows

Yesterday I read Nicholas Carr's The Shallows: How the Internet is Changing the Way We Think, Read, and Remember. Frankly, I was prepared to severely dislike it – his first book, Does IT Matter?, drove me around the bend – but I was a lot more sympathetic to this effort.

The Shallows' main argument is easy enough to summarize: all the time we're spending online is making it harder for us to think deeply, to read intensively, and to remember things. These three changes reinforce each other: the kind of reading Carr talks about is intellectually strenuous; memory turns out to be an important resource for deeper cognitive functions; and our ability to think is implicated in memory and attention.

And losing the cognitive abilities we used to develop by reading novels – the sine qua non of deep thinking – is a big loss indeed: "The linear, literary mind has been at the center of art, science and society," Carr contends. "It may soon be yesterday's mind."

Continue reading

Smiles, emotion, and embodied knowledge

A very interesting new study on smiles, which nicely illustrates the connection between emotion, cognition, and embodiment.

Paula Niedenthal… and her colleagues have surveyed a wide range of studies, from brain scans to cultural observations, to build a new scientific model of the smile. They believe they can account not only for the source of smiles, but how people perceive them. In a recent issue of the journal Behavioral and Brain Sciences, they argue that smiles are not simply the expression of an internal feeling. Smiles in fact are only the most visible part of an intimate melding between two minds.

One of the things her team has analyzed is how people recognize smiles.

One way people recognize smiles is comparing the geometry of a person’s face to a standard smile. A second way is thinking about the situation in which someone is making an expression, judging if it’s the sort where a smile would be expected.

But most importantly, Dr. Niedenthal argues, people recognize smiles by mimicking them. When a smiling person locks eyes with another person, the viewer unknowingly mimics a smile as well…. A happy smile, for example, is accompanied by activity in the brain’s reward circuits, and looking at a happy smile can excite those circuits as well. Mimicking a friendly smile produces a different pattern of brain activity…. Embodying smiles not only lets people recognize smiles, Dr. Niedenthal argues. It also lets them recognize false smiles. When they unconsciously mimic a false smile, they don’t experience the same brain activity as an authentic one. The mismatch lets them know something’s wrong.

So one of the ways we recognize a smile is to mimic it, and to evaluate its veracity based on how that mimicry feels.

One compelling piece of evidence for the importance of this evaluation tool is that you can short-circuit this system by freezing the facial muscles of the person who’s looking at a smile:

In one study, she and her colleagues are testing the idea that mimicry lets people recognize authentic smiles. They showed pictures of smiling people to a group of students. Some of the smiles were genuine and others were fake. The students could readily tell the difference between them.

Then Dr. Niedenthal and her colleagues asked the students to place a pencil between their lips. This simple action engaged muscles that could otherwise produce a smile. Unable to mimic the faces they saw, the students had a much harder time telling which smiles were real and which were fake.

Reinventing HCI for the 21st century

One of the interesting and challenging things about coming to Microsoft Research is learning about the recent history of HCI, and learning it well enough to do two things: be reasonably credible among the people I'm talking to, and understand how to frame my work so that it more or less fits in the discipline. We often think that really good work is marked by being completely novel and different from what's come before; but in fact, influential scholarship and science almost always clearly builds on its predecessors, and speaks to interests that lots of people in the field share.

I've spent the last few days reading about emotion in HCI, and through it I'm getting a better understanding of what's happening in the field. At least around here, the field is in the midst of a paradigm shift. HCI started in the 1970s and 1980s, when computers were just emerging from university science centers, and the big questions it asked reflected those origins. But are the kinds of questions HCI researchers asked in an era when personal computing was novel, when our interactions with PCs and terminals happened mainly in schools and the workplace, and when the big challenge was to make computers that could be used by people other than programmers, still that important in an era of iPhones, Facebook, and RFID tags in clothes and credit cards?

Continue reading

© 2018 Deliberate Rest
