Keeping tabs

In The Public and its Problems, John Dewey (1927) wrote about the need for a vibrant Public as a necessary condition for a healthy democracy. This meant the development of what’s been called a “critical, socially-engaged intelligence”: citizens who respect each other’s diverse viewpoints, who inquire about their society, and who are actively engaged in improving it. Doing this requires open dialogue, the ability to speak without reprisals, and, it should go without saying, the confidence that one is not being hounded by agencies that operate in secret with little check on their behavior.

A colleague of mine recently showed me her copy of Jane Addams’s FBI file. It’s 188 pages of prying into the life of one of the greatest Americans. Hull House, which she co-founded (1889), was a settlement house for thousands of immigrants in Chicago, providing food, shelter, education, arts, English classes, and an introduction to participatory democracy. It was the beginning of sociology, social work, and public health. The first little theater in America, the first playgrounds in Chicago, and so much more came out of Addams’s lifetime of dedication to building a better world for all.

It’s true, Addams protested against World War I. For that she won the Nobel Peace Prize (1931), but also the attention of the Federal Bureau of Investigation, thanks in part to the machinations of J. Edgar Hoover. At least those files are finally public. You can judge for yourself whether our Nation was more secure because of that assault on personal liberty.

Another colleague pointed me to a court case in Canada related to the Canadian Security Intelligence Service’s (CSIS) file on Tommy Douglas. Douglas was premier of Saskatchewan from 1944 to 1961, and led the development of a universal, publicly funded, single-payer health care system. The success there led eventually to the national Medicare plan in Canada. In part for that work, Douglas was selected as “The Greatest Canadian” of all time.

Now, CSIS argues in an affidavit filed in court last month that “Secrecy is intrinsic to security intelligence matters.” They say that full disclosure of Douglas’s file could endanger the lives of confidential informants.

Since the file is secret, we can’t know whether CSIS is correct in saying that it’s absolutely necessary to maintain secrecy about its operations. For that matter, how do we know that their reluctance isn’t based on covering up their own violations? Reading the file on Addams, I’d say that the only people who might want to hide it are those in the FBI. One thing we do know is that there’s no public basis for these security investigations: no violence, no plots, no attempts to undermine the government.

If the file on Douglas is anything like the one on Addams, we should all ask about the extent to which our security services really serve the public good. Why do we create dossiers on our great leaders, without any public evidence? Does security intelligence trump all other values? Does it help create a vibrant Public when secret agencies are given free rein to explore and document our lives, with little oversight on their own actions?

References

CSIS trying to block release of Tommy Douglas’s file. Toronto Sun.

Dewey, John (1927). The public and its problems. New York: Holt.

Maloney, Steven Douglas. The public and its problems.

The land of forms

Far across the sea, there’s a certain land in which curious practices began to emerge some time ago. These practices began with the idea of documenting the work people were doing. Someone had the brilliant idea to ask each person to fill out a form to show how much they had done at such and such a time. It was never clear that the information so collected had any bearing on the work or the people involved, but the form was beautiful and quickly evolved from a few simple questions into a formidable document.

Soon, it was decided that forms would be useful in health care, asking all kinds of questions about the body, regardless of whether that information would be used. There were then forms for voting, for taxes, for getting a job, for running a business, for schools, for shopping, for clubs, for religion, for travel, for sports, for software, indeed for every aspect of the people’s lives. In the early stage, the typical form would fit on a sheet of paper. But that stage was short-lived. The forms began to grow, soon needing special, long sheets of paper, or multiple sheets. Then, online forms appeared, with checkboxes, open fields, Previous and Next buttons and all sorts of other helpful features.

An especially useful feature was “Are you absolutely sure that the information you have entered is accurate and complete? Severe penalties for non-compliance will ensue.” This one was good because the forms were inevitably obscure and self-contradictory, making it a challenge to know what one had just filled out, much less whether it was accurate and complete.

A major advance in the practice of forms was to create forms to determine whether you were filling out other forms properly. Ethics compliance forms were established to check that other activities, inevitably themselves involving forms, were properly conducted. As with the other uses of forms, the genesis was quite understandable. For example, people had been incorrectly filling out forms to issue driving licenses, thus endangering the public. A new form arose to ensure more ethical behavior. The fact that ethical abuses escalated following the introduction of the new ethical form led to a now-familiar phenomenon: The form was expanded. Again, the link between ends and means was tenuous at best.

An especially interesting aspect of the forms culture was that some forms could not be completed without first doing another form. Completing the second form would lead to the production of a control number to be entered on the first, assuming of course that it, the second one, could be properly completed, submitted, and reviewed. This practice reached its zenith with the realization that form number two could itself require the completion of another form, and so on.

In this way, the forms began to come alive, each connected to the others through a complex, essentially unknowable rhizomatic network. Forms naturally spawned other forms in an ever-growing ecology of forms in multiple media.

As the forms ecology grew, some people began to raise questions about whether it was possible to complete a form if doing so entailed completing other forms in an endless succession. Fortunately, there were philosophers and mathematicians to weigh in on this question. One school of thought, the Infinitists, began to argue that the chains of forms were infinite, meaning that some forms were uncompletable, a seeming tragedy in the forms world. Others claimed that the total number of forms had to be finite, but that there were circular chains such that a form could be completed only by being already completed.
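The dispute can be restated as a question about a dependency graph: model “form A requires form B first” as an edge, and a form is uncompletable exactly when its chain of prerequisites loops back on itself. A toy sketch in Python (the form names and the `has_cycle` helper are invented for illustration, not part of any real forms system):

```python
# Detect circular chains in a forms-require-forms graph using
# depth-first search with three node states.

def has_cycle(requires):
    """True if some chain of prerequisite forms loops back on itself."""
    WHITE, GRAY, BLACK = 0, 1, 2          # unvisited / in progress / done
    color = {form: WHITE for form in requires}

    def visit(form):
        color[form] = GRAY
        for dep in requires.get(form, ()):
            c = color.get(dep, BLACK)     # unknown forms count as completable
            if c == GRAY or (c == WHITE and visit(dep)):
                return True
        color[form] = BLACK
        return False

    return any(color[form] == WHITE and visit(form) for form in list(requires))

circular = {"A": ["B"], "B": ["C"], "C": ["A"]}   # A needs B needs C needs A
linear = {"A": ["B"], "B": [], "C": []}
print(has_cycle(circular), has_cycle(linear))  # → True False
```

On this reading, the Infinitists worried about a graph too large to ever traverse, while the Circularists’ forms are simply the cyclic ones.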

This latter view is reminiscent of Schopenhauer’s demand on the reader in his The World as Will and Idea. Schopenhauer says that his book has but one idea. That idea is an organic whole that cannot be expressed by a book with “a first and a last line.” His compromise solution to this conundrum is to ask the reader to read the book twice or not at all. The Circularists, as those who believed in the circular chains of forms came to be called, adopted a similar view: They argued that although the circular topology prevented the form from ever being completed, repeated revisitings could lead to a kind of oneness with the form akin to grokking Schopenhauer’s one organic idea.

Pragmatists of the Peircean variety were quick to see the ever-increasing complexity of the forms ecology, with its convoluted topologies and possible lack of finitude. But they emphasized an additional wrinkle that had passed by even some of the great connoisseurs of forms. The forms were not static; they could change in small and large ways at a moment’s notice. This meant, among other things, that having completed a form on one day was no assurance that one would not be required to complete it again the next.

There was also a curious aspect of the storage of forms data. I’ve remarked on the separation of the forms from the daily life and purposes they purported to address. But beyond that, they spoke among themselves in what some deemed to be a fractured dialect. Forms completed at a doctor’s office could not communicate with the apparently similar form at the physical therapy facility whose purpose was to implement the doctor’s prescription. And neither of those forms could speak to the pharmacy forms or those of the medical supply. This occurred even when the facilities were all part of the same organization.

On the other hand, even though the forms were disconnected from daily life and each other, they had a remarkable ability to retain and communicate data in a dysfunctional fashion. For example, no matter how grudgingly and circumspectly people had revealed details of their lives or how many assurances had been made, these details were regularly transmitted throughout the land. The word for “privacy” disappeared from the language, as it no longer had a use.

Despite the massive accumulation and dissemination of data engendered by the forms, people seemed to know less and less about one another or the concrete problems they faced in their lives. The reason was clear: Police spent time on forms, not on preventing crime; health providers likewise became adept at forms, but not at ensuring health; teachers knew every line and checkbox, but had little time for details such as students.

Over time, the people learned that nothing was real in their lives unless it could fit on a form: their wealth, their citizenship, their job, their spouse, and so on. What could not be form-alized did not exist. The forms became the reality they originally sought only to document. They infiltrated every aspect of the people’s lives and slithered with ease across natural and political boundaries. While the forms ecology had a beginning in specific times and places, it warmed the hearts of forms aficionados to know that there was no way to stop their spread.

I welcome comments on this little story. There’s a form below for your convenience.

Your very own barcode

Tech-Ex talks about the Google doodle becoming a barcode, and then how to make your own. That article also includes a little history of the barcode:

The first item scanned was a pack of chewing gum scanned at an Ohio supermarket in 1974. On June 26, 1974, Clyde Dawson pulled a 10-pack of Wrigley’s Juicy Fruit gum out of his basket and it was scanned by Sharon Buchanan at 8:01 AM. The pack of gum and the receipt are now on display in the Smithsonian Institution.

Click here to create your own, and you too could be scanned: The Barcode Printer, a free barcode generator by Barcodes Inc.
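The UPC symbol scanned on that pack of gum follows a simple arithmetic rule that scanners still verify: eleven data digits plus a twelfth check digit. A minimal sketch of the published UPC-A check-digit calculation (the sample number below is illustrative, not the Juicy Fruit code):

```python
# UPC-A check digit: odd-position digits (1st, 3rd, ...) weigh 3,
# even-position digits weigh 1; the check digit brings the total
# to a multiple of 10.

def upc_check_digit(data):
    """Check digit for an 11-digit UPC-A payload."""
    if len(data) != 11 or not data.isdigit():
        raise ValueError("expected 11 digits")
    digits = [int(c) for c in data]
    total = 3 * sum(digits[0::2]) + sum(digits[1::2])
    return (10 - total % 10) % 10

print(upc_check_digit("03600029145"))  # → 2
```

A scanner recomputes this digit on every read, which is how a smudged bar on a gum wrapper gets rejected rather than rung up as something else.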


hear you are — [murmur]

[murmur] is a documentary oral history project that records stories and memories told about specific geographic locations. We collect and make accessible people’s personal histories and anecdotes about the places in their neighborhoods that are important to them. In each of these locations we install a [murmur] sign with a telephone number on it that anyone can call with a mobile phone to listen to that story while standing in that exact spot, and engaging in the physical experience of being right where the story takes place. Some stories suggest that the listener walk around, following a certain path through a place, while others allow a person to wander with both their feet and their gaze…

All our stories are available on the [murmur] website, but their details truly come alive as the listener walks through, around, and into the narrative. By engaging with [murmur], people develop a new intimacy with places, and “history” acquires a multitude of new voices. The physical experience of hearing a story in its actual setting – of hearing the walls talk – brings uncommon knowledge to common space, and brings people closer to the real histories that make up their world.

Being cellphone-impaired, and far from Toronto, I’m reduced to listening to the stories on the website, but they still convey a sense of the city and its history. The site’s a well-designed example of integrating oral history, geographic information systems, and mobile phones.

Community Inquiry Labs

Inquiry cycle


Community Inquiry Labs (aka CILs or CILabs) is rising again!

What is CILabs?

Drawing from the work of John Dewey and others, showing that education begins with the curiosity of the learner, CILabs promotes an iterative process of inquiry: asking questions, investigating solutions, creating new knowledge, discussing experiences, and reflecting on new-found knowledge, in a way that leads to new questions.

In addition to the standard features found on group support sites, such as Ning, Google, Yahoo, and Moodle, CILabs offers a means for building Inquiry Units based on the Inquiry Cycle. Also, unlike most university-supported software, there is a secure means for users without university netIDs to participate. This is crucial for university-community collaborations.

CILabs (aka iLabs) are currently being used in courses such as Will Patterson’s Hip Hop as Community Informatics and Martin Wolske’s Intro to Network Systems. Projects such as Youth Community Informatics use it, as do a variety of other projects and organizations.

The redesign

Despite filling a need for many individuals and groups since 2003, use of CILabs fell off after a security hole was discovered in CILabs 3. That led to a temporary shutdown and a major redesign on the Drupal platform.

Thanks to the support of Robert Baird at CITES EdTech, a project to rebuild CILabs was led by Alan Bilansky with Julieanne Chapman as lead programmer. Claudia Serbanuta represented GSLIS and the CILabs user base. The new CILabs is now hosted by the University of Illinois College of Education, thanks to Ryan Thomas and John Barclay. This represents an unusual and successful collaboration across two colleges and CITES, with support from the Center for Global Studies, Community Informatics Initiative and the Illinois Informatics Institute.

I encourage you to give it a try now, and to let us know how to improve it.

The birth of computer networking

I had arrived at Bolt Beranek and Newman (BBN) in the summer of 1971, knowing of the important work there in artificial intelligence, computer simulations in psychology, and natural language understanding. But I understood only vaguely the explosive potential of the work on computer networking.

Computer Networks – The Heralds of Resource Sharing was a movie made to accompany the public demo of the ARPANET at the 1st International Conference on Computer Communications in Washington DC in October, 1972, about a year after my arrival. Unfortunately, the movie wasn’t finished in time for the demo, but it was released before the end of that year. I didn’t have anything to do directly with the movie or the work described, but knew many of the people and projects that are featured.

The movie represents both a thoughtful account and a primary source itself for the general history of computing and communication. It also tells us about successful collaboration, as the participants at the time themselves described it. I think it also gives a good account of the motivations behind the ARPANET, forerunner of the Internet, and a good basic description of how it works.

The origins of mobile phone and email

Martin Cooper and Raymond S. Tomlinson have just been granted the 2009 Prince of Asturias Award.

Cooper invented “the first handheld mobile telephone and supervised the ten years that were necessary to commercialize the product. He … formulated the Law on Spectrum Efficiency, also known as Cooper’s Law, which states that the maximum amount of information that is transmitted over a given amount of radio spectrum doubles every 30 months.”
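Cooper’s Law, as quoted, is a doubling rule, so the implied capacity growth is just an exponential in the elapsed time. A minimal sketch (the function name and the example figures are mine, not from the award citation):

```python
# Cooper's Law: spectral capacity doubles every 30 months.

def cooper_multiplier(months, doubling_period=30.0):
    """Capacity growth factor after `months` months under Cooper's Law."""
    return 2.0 ** (months / doubling_period)

# A decade at Cooper's rate: 2 ** (120 / 30) = 16x the capacity.
print(cooper_multiplier(120))  # → 16.0
```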

Tomlinson worked at Bolt Beranek and Newman (now BBN Technologies). He helped develop the TENEX operating system, which had several innovative features, including a full virtual memory system, a user-oriented command line interpreter, and a command escape recognition system. In 1971 he developed the ARPANET’s first application for email by combining the SNDMSG and CPYNET programs so messages could be sent to users on other computers. He selected the @ sign to identify the user’s computer. Before long, that sign became the icon of the digital era.

Ray and I were colleagues at BBN, and teammates on the Great Swamp Volleyball Team, but I was just an ordinary user of that ground-breaking operating system and that early form of email.

References

Kapitzke, Cushla, & Bruce, Bertram C. (2005). The arobase in the libr@ary: New political economies of children’s literature and literacy. Computers and Composition: Special Issue on the Influence of Gunther Kress’ Work, 22(1), 69-78. [doi:10.1016/j.compcom.2004.12.014]

Single-payer health care: Why not?

I’ve been fortunate to have traveled many places, and to have lived for extended periods in China, Australia, France, and Ireland. During those travels, my family has received health care on many occasions, including for our small children in China and Australia, my wife in Scotland, and my 87-year-old mother in Ireland.

This health care has come in a variety of forms, including treatment for my ten-year-old daughter’s eyes at the Hospitaller Order of St. John of God or Fatebenefratelli (see left), located on San Bartolomeo, the only island in the Tiber River in Rome. That hospital was built in 1584 on the site of the Aesculapius temple.

We also faced emergency surgery for my mother’s hip at Beaumont Hospital in Dublin, Ireland and subsequent rehab at the Orthopaedic Hospital of Ireland in Clontarf (right). In China, we were served in medical facilities with separate queues for Western medicine (our choice) and traditional Chinese medicine (below left). I donated blood many times at the Hôtel-Dieu de Paris, founded in 651 on the Ile de la Cité (below right). I’ve also observed, though not had to depend upon, health care in Russia and even in economically oppressed places such as Haiti.

On the whole, I’ve received excellent care in a variety of conditions. Individual health providers have been courteous, knowledgeable, and dedicated to their professions. For myself and my family, the experience of care did not depend on the setting or language, but rather on the ailment or the specific people providing care.

And yet, one thing stands out: Among the industrialized nations, the United States is the only one without universal health care. All of the others provide health care for all. They also do it primarily through single-payer systems.

The United States operates instead through a complex bureaucracy of insurance policies, doughnut-hole prescription drug coverage, forms and regulations galore, massive administration, unnecessary and excessive procedures, and complex and confusing tax codes, all leading to escalating costs and unfair coverage. The inequity of care actually costs all of us more in the end, because of lack of preventive care, inefficient delivery (e.g., emergency rooms), and lost productivity. Our system costs much more, as much as double that found in other countries.

If we were to find that spending a few dollars more gave us better care, there might be little room for argument. But in comparable economies, people spend much less, yet have longer, healthier lives (American Health Care: A System to Die For: Health Care for All). Why, then, is the system that works in Canada, Japan, Europe, Australia, etc., not even under consideration here?

The answer is unfortunately all too obvious: Americans, unlike citizens in other countries, have ceded control of their own health care to profit-making insurance companies, hospitals, clinics, laboratories, pharmaceutical companies, and other entities. The best we can do is an occasional feeble cheer when someone asks why our government can’t even consider a single-payer system. Then we listen to an answer that mostly obfuscates and lays the blame for it back on our own timidity.

Visual literacy in the information age

Ching-Chiu Lin is a founding member of the Youth Community Informatics project. Her work with Timnah, Lisa, and Karen at the Urbana Middle School integrated art, music, story-telling, cultural heritage, and multimedia in an after-school program. That’s one of the models for our current work.

Ching-Chiu’s dissertation, A qualitative study of three secondary art teachers’ conceptualizations of visual literacy as manifested through their teaching with electronic technologies, analyzed similar arts and new media projects in three schools. I learned a little while ago that it was awarded second place for the 2008 Eisner Doctoral Research Award. This was officially announced at the National Art Education Association (NAEA) convention in Minneapolis this month.

Congratulations, Ching-Chiu!

How Europe underdeveloped Africa

Amita’s interesting post, Education for Liberation…., and the materials she cites (Challenging White Supremacy workshop), reminded me of a book that had a big influence on me. I read it shortly before the author, Walter Rodney, was assassinated, in 1980, at the age of 38.

The book, How Europe Underdeveloped Africa, is now available online.

Rodney presents a new way of thinking about Africa’s so-called “underdevelopment.” The question was “why are some areas of the world rich and others poor?” I had been taught many reasons for this—that successful countries had better inventions, more adventurous explorers, greater natural resources, geographical advantages, better climate, less corruption, or just good fortune. The implication was that the successful countries mostly deserved their status, as did the less successful ones. Africa’s underdevelopment was thus to a large extent Africa’s fault. Of course, a generous impulse might lead us to help those less fortunate to develop and share the goods of the world, maybe not to achieve full equality, but at least enough to meet their minimal needs.

Rodney challenges that entire view. He describes an Africa that is more developed than Europe in most ways except military conquest. When Europe fails to compete on even terms with Africa and Asia it turns to war and colonization to take by force what it cannot achieve through fair trade. Africa is then consciously exploited by European imperialists, leading directly to the modern underdevelopment of most of the continent. Thus, “underdeveloped” is an active verb, with an agent who does the underdeveloping; it’s not just a descriptive adjective.

Rodney’s thesis was highly influential. James M. Blaut’s works, The Colonizer’s Model of the World: Geographical Diffusionism and Eurocentric History (Guilford, 1993) and 1492: The Debate on Colonialism, Eurocentrism, and History (Africa Research & Publications, 1993), extend the basic thesis with more detailed economic analyses.

Other writers have criticized aspects of Rodney’s work, but the general idea seems even more salient in an era of neocolonialism. For example, Haiti today struggles under a crushing external debt. Nearly half of that was incurred under the Duvaliers, puppet dictators of the US. The Duvaliers stole the resources of the Haitian people, then assumed debts that oppress their children and grandchildren. Debt service, a burden essentially imposed by the US, makes economic growth nearly impossible. Yet commonplace accounts would say that “they” (the Haitian people) can’t manage finances, don’t know how to protect their natural resources, have a corrupt economy, lack creativity or initiative, or otherwise are to blame for their fate.

For many countries in Africa, for Haiti, and for other colonized areas, the forcible appropriating of indigenous human and natural resources means underdeveloping those areas. When we turn “underdevelop” into a past participle, “underdeveloped,” we make it easy to forget how that happened. Rodney puts it this way:

The question as to who, and what, is responsible for African underdevelopment can be answered at two levels. Firstly, the answer is that the operation of the imperialist system bears major responsibility for African economic retardation by draining African wealth and by making it impossible to develop more rapidly the resources of the continent. Secondly, one has to deal with those who manipulated the system and those who are either agents or unwitting accomplices of the said system. The capitalists of Western Europe were the ones who actively extended their exploitation from inside Europe to cover the whole of Africa. In recent times, they were joined, and to some extent replaced, by the capitalists from the United States; and for many years now even the workers of those metropolitan countries have benefited from the exploitation and underdevelopment of Africa. (§1.2)

Rodney’s account of Africa, written 37 years ago, is still relevant for Africa today. But it extends to other international regions and even to communities within so-called “developed” countries. When we see, and label, communities as underdeveloped, low-resource, impoverished, disadvantaged, economically depressed, troubled, or marginalized, we follow the lead of the 1965 Moynihan report, which described a “tangle of pathology,” locating problems within the community with causes in the distant past.

We should ask not only how these communities compare to privileged ones, or even what useful things we might do to help them. We need to look first at the structures and mechanisms of power that caused these conditions in the first place, and now, continue to maintain them. This means turning from the conceit that underdevelopment just happens, that an appropriate and full response is to “give” to those less fortunate. It requires collaborative struggle in which all participants are willing to examine the roots of oppression and to engage in the practice of freedom.