A Commonplace Book

From Steven Johnson’s Where Good Ideas Come From: The Natural History of Innovation:

Darwin’s notebooks lie at the tail end of a long and fruitful tradition that peaked in Enlightenment-era Europe, particularly in England: the practice of maintaining a “commonplace” book. Scholars, amateur scientists, aspiring men of letters—just about anyone with intellectual ambition in the seventeenth and eighteenth centuries was likely to keep a commonplace book. The great minds of the period—Milton, Bacon, Locke—were zealous believers in the memory-enhancing powers of the commonplace book. In its most customary form, “commonplacing,” as it was called, involved transcribing interesting or inspirational passages from one’s reading, assembling a personalized encyclopedia of quotations. There is a distinct self-help quality to the early descriptions of commonplacing’s virtues: maintaining the books enabled one to “lay up a fund of knowledge, from which we may at all times select what is useful in the several pursuits of life.”

John Locke first began maintaining a commonplace book in 1652, during his first year at Oxford. Over the next decade he developed and refined an elaborate system for indexing the book’s content. Locke thought his method important enough that he appended it to a printing of his canonical work, An Essay Concerning Human Understanding. Locke’s approach seems almost comical in its intricacy, but it was a response to a specific set of design constraints: creating a functional index in only two pages that could be expanded as the commonplace book accumulated more quotes and observations:

When I meet with any thing, that I think fit to put into my common-place-book, I first find a proper head. Suppose for example that the head be EPISTOLA, I look unto the index for the first letter and the following vowel which in this instance are E. i. if in the space marked E. i. there is any number that directs me to the page designed for words that begin with an E and whose first vowel after the initial letter is I, I must then write under the word Epistola in that page what I have to remark.
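Locke's scheme is mechanical enough to express in code. Below is a minimal, purely illustrative sketch (the function name and the "E.i" key format are my own) of how a head word maps to its cell in the two-page index: the initial letter plus the first vowel that follows it.

```python
def locke_index_key(head):
    """Locke's index cell for a head word: the initial letter plus
    the first vowel that follows it (e.g. EPISTOLA -> "E.i")."""
    head = head.lower()
    initial = head[0]
    for ch in head[1:]:
        if ch in "aeiou":
            return f"{initial.upper()}.{ch}"
    # No vowel after the initial letter: file under the letter alone.
    return initial.upper()

print(locke_index_key("Epistola"))  # E.i
```

Because many headings share a cell, the two-page index stays a fixed size while the body of the commonplace book grows without bound, which is exactly the design constraint Locke was answering.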

Locke’s method proved so popular that a century later, an enterprising publisher named John Bell printed a notebook entitled “Bell’s Common-Place Book, Formed generally upon the Principles Recommended and Practised by Mr Locke.” The book included eight pages of instructions on Locke’s indexing method, a system which not only made it easier to find passages, but also served the higher purpose of “facilitat[ing] reflexive thought.” Bell’s volume would be the basis for one of the most famous commonplace books of the late eighteenth century, maintained from 1776 to 1787 by Erasmus Darwin, Charles’s grandfather. At the very end of his life, while working on a biography of his grandfather, Charles obtained what he called “the great book” from his cousin Reginald. In the biography, the younger Darwin captures the book’s marvelous diversity: “There are schemes and sketches for an improved lamp, like our present moderators; candlesticks with telescope stands so as to be raised at pleasure to any required height; a manifold writer; a knitting loom for stockings; a weighing machine; a surveying machine; a flying bird, with an ingenious escapement for the movement of the wings, and he suggests gunpowder or compressed air as the motive power.”

The tradition of the commonplace book contains a central tension between order and chaos, between the desire for methodical arrangement, and the desire for surprising new links of association. For some Enlightenment-era advocates, the systematic indexing of the commonplace book became an aspirational metaphor for one’s own mental life. The dissenting preacher John Mason wrote in 1745:

Think it not enough to furnish this Store-house of the Mind with good Thoughts, but lay them up there in Order, digested or ranged under proper Subjects or Classes. That whatever Subject you have Occasion to think or talk upon you may have recourse immediately to a good Thought, which you heretofore laid up there under that Subject. So that the very Mention of the Subject may bring the Thought to hand; by which means you will carry a regular Common Place-Book in your Memory.

Others, including Priestley and both Darwins, used their commonplace books as a repository for a vast miscellany of hunches. The historian Robert Darnton describes this tangled mix of writing and reading:

Unlike modern readers, who follow the flow of a narrative from beginning to end, early modern Englishmen read in fits and starts and jumped from book to book. They broke texts into fragments and assembled them into new patterns by transcribing them in different sections of their notebooks. Then they reread the copies and rearranged the patterns while adding more excerpts. Reading and writing were therefore inseparable activities. They belonged to a continuous effort to make sense of things, for the world was full of signs: you could read your way through it; and by keeping an account of your readings, you made a book of your own, one stamped with your personality.

Each rereading of the commonplace book becomes a new kind of revelation. You see the evolutionary paths of all your past hunches: the ones that turned out to be red herrings; the ones that turned out to be too obvious to write; even the ones that turned into entire books. But each encounter holds the promise that some long-forgotten hunch will connect in a new way with some emerging obsession. The beauty of Locke’s scheme was that it provided just enough order to find snippets when you were looking for them, but at the same time it allowed the main body of the commonplace book to have its own unruly, unplanned meanderings. Imposing too much order runs the risk of orphaning a promising hunch in a larger project that has died, and it makes it difficult for those ideas to mingle and breed when you revisit them. You need a system for capturing hunches, but not necessarily categorizing them, because categories can build barriers between disparate ideas, restrict them to their own conceptual islands. This is one way in which the human history of innovation deviates from the natural history. New ideas do not thrive on archipelagos.


State of the Shirt, March 2012

Day of the Shirt continues to delight me. I added a fun but subtle feature back in January: updating @dayoftheshirt’s Twitter avatar to today’s date. Day of the Shirt makes a point of being “refreshed every day,” so since it’s tweeting out a daily list of shirts, it makes sense to have a fresh daily avatar too.

In addition to delight, Day of the Shirt is moderately successful too: while not hockey-sticks, I’m seeing 50-100% unique-visitor growth month-over-month, and ~80% of my daily traffic is returning visits. Day of the Shirt went from an average of about 50 unique daily visitors in October 2011 to an average of 1,100 in March. Adding the “Like on Facebook” widget at the beginning of February boosted new visitors too: in 2 months, Day of the Shirt has nearly twice as many Facebook “Likes” as it does Twitter followers, and Facebook drives ~150% more traffic than Twitter.

In terms of the future, Day of the Shirt is reaching the limit of its current architecture (rewriting a static HTML file), which was cute originally. Day of the Shirt launched with 5 daily shirts and now aggregates 13 daily (and a few semi-weekly) shirts. Putting it on an actual framework would make adding new shirts and testing/updating the templates much simpler and more reliable. I’m a bit caught up on what I’ll move to (Django is in the lead), but I expect the experience to stay the same.
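The static-file architecture can be sketched roughly like this: a hypothetical reconstruction, not the site's actual code. A cron-driven script renders the day's shirt list into one HTML page and overwrites the file; the shirt list, template, and filename here are invented for illustration.

```python
from datetime import date
from string import Template

# Illustrative template for a one-page daily aggregator.
PAGE = Template("""<html><head><title>Day of the Shirt - $today</title></head>
<body>
<h1>Daily shirts for $today</h1>
<ul>
$items
</ul>
</body></html>""")

def render_page(shirts, today=None):
    """Render the day's aggregated shirt list into one static HTML page."""
    today = today or date.today().isoformat()
    items = "\n".join(f'<li><a href="{url}">{name}</a></li>'
                      for name, url in shirts)
    return PAGE.substitute(today=today, items=items)

if __name__ == "__main__":
    # A cron job would run this once a day, clobbering yesterday's page.
    shirts = [("Example Tee", "https://example.com/daily")]
    with open("index.html", "w") as f:
        f.write(render_page(shirts))
```

The appeal and the limit are the same thing: there is no application server and nothing to go down, but every new shirt source means editing the script and re-testing the template by hand.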


Methodological Belief

From Peter Elbow’s “The Believing Game”:

The doubting game represents the kind of thinking most widely honored and taught in our culture. It’s sometimes called “critical thinking.” It’s the disciplined practice of trying to be as skeptical and analytic as possible with every idea we encounter. By trying hard to doubt ideas, we can discover hidden contradictions, bad reasoning, or other weaknesses in them–especially in the case of ideas that seem true or attractive. We are using doubting as a tool in order to scrutinize and test.

In contrast, the believing game is the disciplined practice of trying to be as welcoming or accepting as possible to every idea we encounter: not just listening to views different from our own and holding back from arguing with them; not just trying to restate them without bias; but actually trying to believe them. We are using believing as a tool to scrutinize and test. But instead of scrutinizing fashionable or widely accepted ideas for hidden flaws, the believing game asks us to scrutinize unfashionable or even repellent ideas for hidden virtues. Often we cannot see what’s good in someone else’s idea (or in our own!) till we work at believing it. When an idea goes against current assumptions and beliefs–or if it seems alien, dangerous, or poorly formulated—we often cannot see any merit in it.*

And from the asterisk:

* I’m on slippery ground when I equate the doubting game with critical thinking, since critical thinking has come to mean almost any and every kind of thinking felt to be good. Consider the opening definition at the website of the Foundation for Critical Thinking:

Critical thinking is the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action. In its exemplary form, it is based on universal intellectual values that transcend subject matter divisions: clarity, accuracy, precision, consistency, relevance, sound evidence, good reasons, depth, breadth, and fairness.

It entails the examination of those structures or elements of thought implicit in all reasoning: purpose, problem, or question-at-issue; assumptions; concepts; empirical grounding; reasoning leading to conclusions; implications and consequences; objections from alternative viewpoints; and frame of reference. Critical thinking — in being responsive to variable subject matter, issues, and purposes — is incorporated in a family of interwoven modes of thinking, among them: scientific thinking, mathematical thinking, historical thinking, anthropological thinking, economic thinking, moral thinking, and philosophical thinking.

Critical thinking can be seen as having two components: 1) a set of information and belief generating and processing skills, and 2) the habit, based on intellectual commitment, of using those skills to guide behavior. … People who think critically consistently attempt to live rationally, reasonably, empathically. (Scriven and Paul)

Who could ever resist anything here (except the prose)?

I’d argue, however, that despite all attempts to de-fuse the word “critical,” it nevertheless carries a connotation of criticism. The word still does that work for many fields that use it for a label. For example, in “critical theory,” “critical literacy,” and “critical legal theory,” the word still actively signals a critique, in this case a critique of what is more generally accepted as “theory” or “literacy” or “legal theory”. The OED’s first meaning for critical is “Given to judging; esp. given to adverse or unfavourable criticism; fault-finding, censorious.” Not till the sixth meaning do we get past a censorious meaning to a sense of merely “decisive” or “crucial.”

In the simple fact that “critical thinking” has become a “god term” that means any kind of good thinking, I see striking evidence of the monopoly of the doubting game in our culture’s conception of thinking itself. (“Burke refers to a word like honor as a god-term, because it represents an aspiration towards a kind of perfection. The ultimate term, of course, is God himself.” [Goodhart])


Protest shirts

Regular readers of this blog are aware that posts rarely reference the present, let alone the contemporary. But with Day of the Shirt, I felt compunction: a daily t-shirt aggregator is nothing but contemporary—it’s a 1-page website. So I took down Day of the Shirt today to protest the SOPA/PIPA legislation.

And, as any organizer can tell you, going on strike takes more time and effort than not striking—cron scripts don’t just turn themselves off.


2011 in review

2011 was a year of transitions: plenty of new starts and sad endings.

Shuttering the Transmission Project: in August our funding for the Digital Arts Service Corps ended. Despite the disappearance of our funding (and a pointless federal audit to boot), it was also one of the Transmission Project’s best years in terms of our staff, Corps members and the work we did in our twilight. I can’t not mention, though, the frustration of seeing the Digital Arts Service Corps end at the same time other media/technology-based national service programs were coming online.

Building 549 Columbus: with 4 months of unemployment, I had time to help build a cooperative coworking space in the South End.

Sundowned two websites: Both MeetAmeriCorps and MappingAccess were taken down in 2011; it felt good to recognize that they were long neglected.

Many new websites: DrunkenStumble, Print & Share, and Day of the Shirt (it launched in Oct 2010, but I spent a lot of time building traffic in 2011).

Goodbye Boston: I moved to San Francisco in November.

Founding the Boston Cyclists Union: following Danielle Martin’s 1st Principle of Community Organizing, “Keep showing up”, I am proud to be a founding board member of the BCU.

BetterBio: that didn’t go so well, but I learned a lot.

Other stuff: I did the layout again for my second edition of Survival News. I made my first serious pitches for a project (“All the Right People”, which was the thoughtchild of my coworkers Howie and Billy). I did a bunch of contract work for WGBH, Utile, LexCreative and SocialContxt. I had the pleasure of filing for unemployment insurance. I presented at the SF WordPress Meetup (speedgeeking on “wp_filter”). Angelina’s friend Anna and Greg were married in Iowa. I attended the Code for America Summit, the Nonprofit Software Developers Conference, and DrupalCon.

Places I’ve slept

  • York, ME (I woke up there on the 1st)

  • Jamaica Plain, MA

  • Seattle, WA

  • Chicago, IL

  • Okoboji, IA

  • Lenox, MA

  • San Francisco, CA

  • Poway, CA



Literacy is more than reading

Below is a year-old memo I wrote for the Transmission Project that was later polished into a more general statement on media literacy:

Synopsis: Digital Literacy Training should not continue the skills-based approach of the basic Computer Literacy that forms its foundation. To effectively prepare participants for 21st century employment, Digital Literacy Training must focus upon the motivations and context for using new media and social networking technologies within business, nonprofit and community environments.

Internet technologies—and the ways in which individuals and organizations utilize them—have undergone a paradigm shift over the past decade. Broader computer literacy skills coupled with high-speed broadband and the mobile internet have altered the barriers to becoming technology users and technology creators. The skills necessary to create modern internet technologies—rich media and interactive websites, custom social networks and mobile applications—have become professionalized: the education and experience necessary for their creation has moved beyond the scope of a community-centered training program. Meanwhile, those same technologies have undergone a process of consumerization, making their usage accessible to those with only basic computer literacy training. Today, millions of people create online, interactive, and rich-media experiences (for example, by embedding YouTube videos on their Facebook profiles), yet do not have the skills necessary to build a basic website, nor do they need those skills to engage through these modern technologies.

While Basic Computer Literacy skills form the foundation for a Digital Literacy curriculum, an effective Digital Literacy Training will provide participants with the motivations and context for utilizing modern internet technologies and applications within business, nonprofit and community environments. This educational model requires a broader focus upon training participants in community building, decision making, and team facilitation. Training participants to apply digital technologies within these given contexts will best prepare them for 21st century employment. The job interview of tomorrow will not ask “What new media tools can you use?”, but rather “Given a business problem, how would you use new media tools in its solution?”

…of course, “you’re burying people at the bottom of Maslow’s hierarchy” is a pretty easy criticism to make about many things.


The work itself isn’t inhumane

Two weeks ago, while riding BART from the East Bay to San Francisco, I was offered a transit survey by a BART Survey Team member. Pointing to what seemed to me to be a large number of written-response questions for a survey being delivered on a moving train, I asked her, “Do you have to read and code all these surveys?”

The answer was on my mind as I was in the Bay Area to attend a Summit at Code for America, the organization where I will be serving as a Fellow for 2012. During this trip, I observed (though this is by no means a fresh observation) that geeks are drawn to architecting databases, “open” no less, without recognizing or paying much attention to the experience of the people who will be entering and maintaining the data.

As someone who has spent a large amount of my life sorting (and re-sorting) spreadsheets in Excel—and who has built an open database of my own—I can say that the experience of maintaining data isn’t particularly liberating (nor is compiling federal grant reports, for that matter). The experience reminds me of what Joel Johnson wrote for Wired Magazine in “1 Million Workers. 90 Million iPhones. 17 Suicides. Who’s to Blame?”:

“…the work itself isn’t inhumane—unless you consider a repetitive, exhausting, and alienating workplace over which you have no influence or authority to be inhumane. And that would pretty much describe every single manufacturing or burger-flipping job ever.”

Which is to say that maintaining a database is work, and doubly so when the “data driven decisions” are not the maintainers’ to make. I think of this whenever the licensing costs of “closed” commercial databases are cited as driving the need for “open”; I agree that data in the public interest should be “free,” but I recognize that freedom comes with a cost.

At the Summit, Tim O’Reilly said, “Data is infrastructure.” If that’s the case, I hope we data “architects” recognize the experience of the people—the data pavers, data plumbers, and data janitors—who maintain such “virtual” infrastructure if we expect to gain greater liberties than our current physical landscape provides.

The BART Survey Team member’s answer on that shaky train was “Yes”.


The point where creativity and invention occur

From the preface to Arnold Pacey’s The Maze of Ingenuity: Ideas and Idealism in the Development of Technology:

So far I have written about efforts to inaugurate a new direction for technical progress as if the chief problem is a lack of methods and discipline. But there are other problems too. Technology does not exist apart from the people who create and use it, and its precise forms have a lot to do with the way these people choose to organise their society. One of the problems about the use of intermediate or appropriate technology in the developing countries is that the people there often do not have suitable forms of local organisation to make effective use of the equipment being offered to them. Frequently, it is equipment devised by well-meaning Westerners who have little understanding of the social component of technology or of complex local patterns of social organisation.

In the industrialized countries also, we do not have many social structures with suitable organisation to use alternative technology. And although the necessary changes in society may come partly through unconscious evolution, or through individual efforts to organise self-help groups, village societies or communes, change will be needed at the political and legislative level also. And Dickson sees the great weakness of much alternative technology as its neglect of the ‘political dimension’ – neglect which implies ‘an idealistic concept … that does not coincide with the social reality of technology as it has been experienced’.

This is fair criticism in many respects, but it is a mistake to think that the political dimension is the over-riding totality within which all other aspects of technology are worked out – and my book is very largely about some of the other dimensions of technological change. The distinction becomes clear when we consider the symbolic purposes which technology is made to serve, about which Dickson has useful things to say. For example, individuals buy automobiles or household goods and nations buy armaments, not solely with a view to their utilitarian value but because of what they symbolize. Discussions about more modest lifestyles for an age of zero growth, or about disarmament, rarely acknowledge this, and so become confused as people invent phoney utilitarian or practical purposes for their acquisitions, and nations invent unreal threats to justify their arms.

The Report from the Iron Mountain almost a decade ago explained how the armies, structures and industries associated with preparedness for war in fact perform many non-military functions. Many of these functions can be described in terms of the ‘symbolic objectives’ discussed in this book and have to do with ‘ideological clarification’ and building national unity.

As a partial substitute for the non-military functions of war, the Iron Mountain report suggested that a massive space programme could fill the place of the armaments industry in the economy and would provide an equally potent, but less dangerous, symbolism to express national goals and national prestige – rather as the building of cathedrals in the 12th century provided an effective substitute for the non-military functions of the Crusades (p. 42).

Dickson’s argument is that the symbolism of armaments, or of cathedrals, is largely invented by the ruling groups within society as a means of controlling the mass of the people. Thus Dickson sees the building of the cathedrals as a way in which the Church could extend its influence over craftsmen, artisans, and I would add, merchants.

There is much truth in this, but to present such political aspects of a creative technological movement as the whole of the picture seems wrong. From the viewpoint of the architects and stone masons who built the cathedrals, the work was something that carried conviction because of its symbolic meanings, whether concerning the New Jerusalem, the glory of God or the prestige of their own home town. It was these things which fired the imagination and sparked the immense burst of artistic and technological creativity which the cathedrals represent. We need to understand the reality of the symbolism, and not just its political uses, if we wish to understand the ideals and objectives which give rise to discovery and invention in technology. So I do not agree with Dickson that ‘technological development is essentially a political process’. It is partly a political process, but at the point where creativity and invention occur, it is the values and ideals of individuals that matter, and personal appreciations of ‘quality’ or fitness for purpose. The convictions and sensitivity of the technologist have a validity beyond just the social environment which shapes them, important though that is.


The prevailing worldview of the present

From the preface to The Vision of Islam by Sachiko Murata and William C. Chittick:

In this book we try to pry open the door to the Islamic universe. We are not interested in evaluating Islam from within those dominant perspectives of modern scholarship that make various contemporary modes of self-understanding the basis for judging the subject. Instead, we want to portray Islam from the perspective of those great Muslims of the past who established the major modes of Koranic interpretation and Islamic understanding.

This is not to say that we will simply translate passages from the classical texts in the manner of an anthology. The classical texts ask too much from beginning readers. They were not written for people coming from another cultural milieu. Rather, they were written for people who thought more or less the same way the authors did and who shared the same world view. Moreover, as a general rule they were written for those with advanced intellectual training, a type of training that is seldom offered in our graduate schools, much less on the undergraduate level.

The classical texts did not play the same role as contemporary textbooks, which attempt to explain everything in a relatively elementary format. On the contrary, they were usually written to present a position in a broad intellectual context. Frequently the texts would present only the outline of the argument—the rest was supplied orally by the teacher. Students did not borrow these books from the library and return them the following week. They would often copy the text for themselves (by hand, of course), and spend several months or years studying it word by word with a master. We ourselves have attended sessions in which classical texts were being studied in the Islamic World, and we can attest to how easily a good teacher can choose a word or a sentence and draw out endless meaning from it.

[…]

We are perfectly aware that many contemporary Muslims are tired of what they consider outdated material: they would like to discard their intellectual heritage and replace it with truly “scientific” endeavors, such as sociology. By claiming that the Islamic intellectual heritage is superfluous and that the Koran is sufficient, such people have surrendered to the spirit of the times. Those who ignore the interpretations of the past are forced to interpret their text in light of the prevailing world view of the present. This is a far different enterprise than that pursued by the great authorities, who interpreted their present in the light of a grand tradition and who never fell prey to the up-to-date—that most obsolescent of all abstractions.

The introductory texts on Islam that we have encountered devote a relatively small proportion of space to the Muslim understanding of reality. The reader is always told that the Koran is of primary importance and that Muslims have certain beliefs about God and the afterlife, but seldom do the authors of these works make more than a cursory attempt to explain what this means in actuality. Usually the reader encounters a short history of Islamic thought that makes Muslim intellectuals appear a bit foolish for apparently spending a great amount of time discussing irrelevant issues. More sympathetic authors try to explain that these issues were important in their historical context. Rarely is it suggested that these issues are just as important for the contemporary world as they were for the past, and that they are constantly being discussed today in our own culture, though with different terminology.

We like to think that the Islamic tradition provides many examples of great answers to great questions. The questions are those that all human beings are forced to ask at one time or another, even if contemporary intellectual predispositions tend to dismiss them as irrelevant or immature or unanswerable or self-deconstructing. We have in mind the great whys and whats that five-year-olds have the good sense to ask—though they soon learn to keep quiet in order to avoid the ridicule of their elders. Why are we here? What is the meaning of life? Where did we live before we were born? Where do we go after we die? Where did the world come from? Where does God come from? What are angels? Why is the world full of evil? What are devils? If God is good, why did he create Satan? Why does God allow good people to suffer? How can a merciful God predestine people to hell? Why do I have to go through all this?

Texts on Islam often tell the reader, in extremely cursory fashion, what Muslim thinkers have concluded about such issues; what they do not address is the universe of discourse that informs Islamic thinking and allows the conclusions to make sense. Studies usually highlight the differences of opinion; what they do not clarify is that the logic of either/or is not always at work. Perspectives differ in accordance with differing interpretations of the sources, and the perspectives do not necessarily exclude each other. We are told that people took sides, for example, on free will and predestination. But any careful reading of a variety of texts will show that the common intuition was that the true situation is neither/nor, or both/and. The extreme positions were often formulated as intellectual exercises to be struck down by the thinker himself, if not by his followers.

[…]

Readers need to be warned at the outset that this book is not designed to provide the “historical facts.” In the last section of the book, we will say something about the Islamic view of history. That will help explain why the concerns of the modern critical study of history are not our concerns. To write history, after all, is to read meaning into the events of the past on the basis of contemporary views of reality. The events themselves cannot make sense until they are filtered through the human lens. If the Koran and the Islamic tradition are read in terms of contemporary scholarly opinions or ideologies, their significance for the Islamic tradition is necessarily lost to sight.

Naturally, we as authors have our own lenses. In fact, some people may criticize us for trying to find Islam’s vision of itself within the Islamic intellectual tradition in general and the Sufi tradition in particular. But it is precisely these perspectives within Islam that provide the most self-conscious reflections on the nature of the tradition. If we did not take seriously the Muslim intellectuals’ own understanding of their religion, we would have to replace it with the perspectives of modern Western intellectuals. Then we would be reading the tradition through critical methodologies that have developed within Western universities. But why should an alien perspective be preferable to an indigenous perspective that has survived the test of time? It does not make sense to us to employ a methodology that happens to be in vogue at the moment and to ignore the resources of an intellectual tradition that is still alive after a thousand-year history.