Pat Logan's Web Log
This personal Web page is not an official University of Rhode Island Web page. See URI disclaimer
Jan. 25, 2013
Critical Enabling Technology
Web Development in My Academic World
As an academic, I'm at least as enmeshed in use of the World Wide Web as anyone. I use it for everything: academic instruction, research, administrative duties, correspondence, recreation, weather reports, and social networking. When the internet goes down, I morph into a dysfunctional zombie, stumbling around wondering what to do with my life. Most of my colleagues and students (texting or constantly connected to the internet via laptops, iPads, or smartphones) are similarly cybernetic. We merely await another techno-doubling in Kurzweil's singularity before joining the Borg. Resistance, after all, is futile. We live in the internet age and it has engulfed us.
People at the University are engrossed in consuming web content. But how good are people at the University at producing web content? After all, someone has to do it, and if not the faculty, then who? If the internet is ubiquitous and vital, are we preparing ourselves to use it to create, rather than merely to mine, its important content?
These questions were on my mind as I attended a conference held by URI's Harrington School of Communication and Media, Jan. 16 & 17, "Convergence and Community: Preparing Future Workers for a New Knowledge Network of Libraries, Newsrooms, Studios, and Agencies." The goals of the conference were to explore issues that unite journalism, communication, education, and library/information science; to generate ideas for curriculum renewal and programmatic change for URI's Journalism and Library Science programs; and to imagine how the University can catalyze advancement of society's information and economic needs.
As one of three dozen attendees, I was particularly interested in how the two principal targets (Journalism and Library) viewed web development. This is an old interest for me, mainly because it involves a substantial part of the teaching at URI that affords me a living. But it is also an area that applies throughout the University. These are my perceptions of what I learned.
A Little Background
First, a little history that may inform how I view the relative significance of web development as something to be taught at universities, along with my experience of the prevailing views at URI and, I suspect, most universities:
In a Cornell University report of a Research Futures Task Force for the Physical and Biological Sciences and Engineering from a decade ago, there was this recognition of the nature of cross-cutting technologies in need of focused investment by the University:
We focus much of the report on areas which are characterized by a breadth of impact in basic and applied research throughout the sciences, the social sciences and the humanities, with an importance over decades. To this end the Task Force suggests that the following broad research themes are likely to strongly influence future scientific research: (1) genomics and integrative molecular biology, (2) information sciences, and (3) advanced materials. Cornell currently has enormous faculty resources in or directly related to these three strategic enabling areas. These resources are dispersed across many of the departments and all of the colleges of the University. Within this broad, loosely connected community of researchers, we need to develop a strategy for enhancing the effectiveness of resource commitments.
In my mind, the essence of the concept captured by the phrase "strategic enabling areas" has morphed into "critical enabling technologies," but the core concept remains the same: Are there vital fields of technology so important across many parts of the University that they warrant a strategic investment? In Cornell's case, the exercise was one of identifying targets for adding University resources, which were to be taken from elsewhere in Cornell. That, of course, has political implications, and I'm sure the usual turf and cultural wars must have ensued. But these are the necessary, and sometimes courageous, decisions that all evolving institutions must confront. Sometimes you just have to get with a new program if you want to stay in business.
Similarly, I view "information sciences" to be those technologies that share a common pragmatic dependency for implementation on the internet and internet information-exchange technologies, the most ubiquitous of which is the familiar browser-based use of documents created using hypertext markup and related web languages, an area of technology usually referred to broadly as web development. That is, I take web development to be at the heart of what at least one prominent University sees as critical to the functioning of at least several parts of the institution.
Web Development at URI
I used the Cornell report in a document I prepared in January 2005 for the URI Curricular Affairs Committee, "Web Development Curricula in Research Universities," which was used to put an end to a 3-year CAC refusal to approve two proposals I had submitted to teach web development. The Committee, who had no one familiar with the area, initially informed me that such courses were not necessary "because every high school graduate knows how to make a web page using FrontPage" (a Microsoft Program used to generate proprietary, non-standard, bloated code which no serious web developer would ever use, IMHO).
By conducting an informal survey of 134 universities (Carnegie old-system research I & II plus URI "peers" commonly used for local benchmarking), I was able to point to a substantial number of leading institutions which supported web development courses (from HTML basics all the way to graduate-level work using C# at Harvard), saying, in essence, "see, web development is being taught in serious colleges and universities." Cornell, Harvard, NYU, Northeastern, UC San Diego, Washington U, and Cal Poly Pomona were among 80 institutions hosting significant web-related courses in my online database of universities, colleges, and departments, and listings of course descriptions. The summary report is still online (here), although the database is not currently available. The Committee's resistance was, to say the least, an extraordinary barrier to creating coursework in web development. And remember, this was only eight years ago!
How Do Journalism and Library Science View Web Development in Their Future?
The "Convergence" part of the conference title has three possible meanings (they weren't a focus of the discussion, so I'm free to imagine):
- The phrase "convergent media" recognizes that separate fields—journalism, television, the web, and film or digital media—are increasingly entwined. Conference participants seemed to share a gloomy sense that print media is waning. The New York Times, for example, publishes both the daily paper version of its product and a complete online equivalent, which I subscribe to as my principal daily news source, along with the Washington Post and liberal (so to speak) doses of MSNBC (you are not shocked that I'm not a Fox devotee, I presume). The local Providence Journal has been downsizing, and there is a feeling of despair about its future. Students get much of what passes as news from various internet postings: in my personal observation, this generation of students seems largely uninterested in the politics, economics, or ecology of the planet around them. Increasingly, the lines between print, television, web, and wireless information sources are blurring; the fields are converging.
- I learned that librarians and journalists both see themselves as not only dispensers of news (my former view of librarians only), but also as creators (still the primary function of journalists). Librarians increasingly see a role as developers of critical resources to meet the information needs of local communities. Journalists draw on a vast array of stored data to assemble more comprehensive reporting. To a great extent, these "fields" are beginning to overlap. I noted (although this was not discussed) a parallel in an underlying strong commitment to community outreach and service, once a hallmark of land grant (public) universities. The missions are converging.
- I sense also a congruency in recognizing that all forms of media (including libraries) feel a renewed and vigorous commitment to the traditional "Fourth Estate." There was an expressed, albeit subdued, worry about the implications for a weakening of the critical analytic and public accounting roles of the media, an essential function of the information sector (my phrase), including Universities broadly. The role is shared across all forms of media.
From my outsider's perspective, a critical unifying technical need (critical enabling technology, if you will)—a dependency shared by all forms of media, to an increasing extent as they converge in the senses above—is a pragmatic ability (skill set) with contemporary web development technology. There was, however, among conference participants (including a few nationally prominent people who managed complex media databases) a relative lack of interest in, or recognition that, web technology is essential within a forward-looking media curriculum. My reason for saying this is that when given an opportunity to identify web development as a preference for a break-out session, no one expressed interest.
How do I view this? My career is long enough that I compare it to the state of mind that shaped the attitudes of scientists working at the University as I began my career (in the 1970s). When I first began working as a university researcher (entomology), my colleagues generally wrote manuscripts in a way that was several decades old. That is, we would work from a top-down manuscript, written either on a typewriter or by hand. First draft complete, we'd literally get a marker, scissors, stapler or tape and physically cut and paste parts of the document onto fresh sheets of paper, adding additional text as needed. I describe this for a current generation that probably never actually did the "cut and paste" in anything but an electronic environment. As often as possible, this and subsequent drafts were handed over to a full-time secretary for retyping, a process with an average turnaround time of 3-5 days. In the late 1970s, the ratio of secretaries to faculty was generally 1 to 4 at URI. The attitude? "Of course I don't type my own manuscripts. What an unimaginable waste of my very important time that would be!"
A decade later, there was typically a single secretary working for eight or more faculty in the average URI academic department. The difference? The mind-set that rejected the notion that faculty or researchers would ever waste their time on such a thing as typing gradually yielded to the enabling technology of computer-dependent word processing. The average manuscript went from undergoing 3-4 drafts to typically seeing a dozen or more revisions, with multiple revisions involving many authors being turned around in fractions of days. Today, it is impossible to imagine the inefficiency of the former system. The learning curve of a competent producer using available technology is taken for granted as a minor yet essential inconvenience. And the attitude? "Why would I waste time and money having someone else do work that I can do almost instantly while producing a superior product? What a waste it would be to do it otherwise!"
I don't see anything unusual here. There is resistance to change, but that is the norm in academia. Why?
Particularly from outside of universities, it is hard to appreciate the investment faculty make in the curriculum. Individually, what faculty teach is the essential justification for their job. Clearly, all faculty tend to see their particular classes and interests as important not only to themselves but, by extension, obviously to the department and the field itself (and of course to the future of human knowledge and civilization). I love the things I teach. They interest and at times fascinate me. I can see why they are useful or even vital to the future of my students. Naturally, they belong in the curriculum.
Academic curricula quickly fill up with the things the members of a department take seriously. Sure, there may be a give-and-take and some conflict in perspectives. But eventually the entomologist teaching pesticide mode of action (because pesticides are vital to agriculture and human welfare) exchanges with a colleague teaching natural biological control (because pesticides contaminate the environment and make us sick) the acknowledgment that both courses are necessary as part of a well-rounded graduate's education, and of such good intentions are 180 credits built.
But having invested in the correct components, how are faculty to reverse course to make room for innovation or revolution? What would we cut to make room? What was not so important to teach after all: surely something the junior faculty or the about-to-retire professor was involved in! And so it goes. Despite the branding of their political critics, universities are hotbeds of conservatism in this sense. As a friend once observed, "When the cool, fresh winds of change begin to blow, shut the damn window!"
I'll draw a similar conclusion as I consider a second topic likewise sidestepped at the conference, the question of creating enhanced science awareness in journalists or librarians (next post). For now, two suggestions:
In short order, URI needs to examine and make some decisions on its own commitment to web development, which I continue to see as a critical enabling technology across the entire university. One senior academic teaching a couple of web courses to a handful of students will not produce a generation of web-competent producers. In the near future, no graduate of the Harrington School's six programs should leave URI without practical working abilities as a web content creator. Room must be made.
An enhanced web development component across more than one curriculum requires commitment of resources (fortunately, building and maintaining web teaching labs isn't terribly expensive relative to the public return on investment of having a broadly web-enabled workforce). The Harrington Departments (currently web is a foster child of Communication Studies, which is not its usual home at other institutions) or the School itself (lowering former department walls through enhanced cross-collaboration) need to facilitate discussion with the Computer Science Department and the College (Arts & Sciences) to determine where web development belongs within a broad array of programs, and the requisite workforce to offer what is needed. I suspect it will be more than one person working without support.
Failure to engage will leave the Harrington School and URI less than fully enabled to be an important center of communications and media. I see no other future than one where this is recognized and made a priority throughout the entire community. I don't think that time can remain very far off. As someone who could not function without daily production of web content for teaching (or this blog or similar academic exercises), I can barely imagine how my colleagues live without this vital suite of skills: I don't know how modern academics can do their jobs without being producers of web content. I sincerely hope that a broad technological epiphany will soon transform URI.