Beyond Star Flashes: The Elements of Web 2.0 style

Bradley Dilger
Western Illinois University, Dept. of English & Journalism, 1 University Circle, Macomb, IL 61455, United States

Computers and Composition 27 (2010) 15–26
doi:10.1016/j.compcom.2009.12.002

Abstract

In his “Web 2.0 How-to Design Guide,” Ben Hunt identifies the stylistic elements shared by Web 2.0 sites, including “star flashes,” circular badges reminiscent of sale price stickers. However, Hunt’s approach to style is limited to cataloging surface features. A site designed using his guide would certainly look like Flickr, YouTube, or LibraryThing but might not employ the approach or functionality of those sites. While composition teachers can and should embrace Web 2.0, we must do so critically, by considering what Francis-Noël Thomas and Mark Turner would call the “conceptual stand” of Web 2.0, its fundamentals of writer, reader, thought, language, and their relationships. This approach to style recognizes that separating style and substance, however convenient, is misleading. In this essay, I map the conceptual stand of Web 2.0, providing a structure for critically evaluating sites that claim the “2.0” moniker. Given these elements of Web 2.0 style, composition teachers can better understand, employ, and engage Web 2.0 in teaching and scholarship.

© 2009 Elsevier Inc. All rights reserved.

Keywords: Style; Web; Web 2.0; Function; Network; Design

1. Introduction

Few have done more to codify the meaning of Web 2.0 than publisher Tim O’Reilly, co-founder of the Web 2.0 conference and author of the widely cited “What is Web 2.0?” (2005). O’Reilly used the term Web 2.0 to acknowledge a broad shift in the Web precipitated by the turning point of the dot-com boom and bust. For O’Reilly, the seven principles outlined in “What is Web 2.0?” form a new “pattern language” or “mindset” sharply distinct from the broadcast-oriented model that the first web boom borrowed from the mass media. Second-generation developers saw the Web as a rich platform for applications and services, not just as a simple medium. They welcomed a wide variety of browsing technologies and imagined users not only as readers but also as writers. According to O’Reilly, Kevin Kelly (2005), and others, the Web had reinvented itself with folksonomies, weblogs, feeds, and social networks about a decade after the Web first took off. The new Web was different and merited a new name. Hence “Web 2.0.” But from the start, the concept of “Web 2.0” was sharply criticized as a marketing ploy, nothing more than a new label for the same old irrational exuberance of the dot-com boom, the same corporations selling the same commodities at the behest of the same advertisers. Indeed, “What is Web 2.0?” begins with the acknowledgment that critics like Tim Bray (2005) had already labeled “Web 2.0” as “a meaningless marketing buzzword.” Trebor Scholz (2008) also used this language, calling Web 2.0 “a good example of marketers entering the discussion about the Internet.” For Scholz, Web 2.0 is little more than an “ideology. . . a framing device of professional elites that define what enters the public discourse about the impact of the Internet on society.” Ted Dziuba’s criticisms of Web 2.0 are well known for

their humor and profanity (Sloan, 2007); but more interestingly, he wrote of Web 2.0 using the past tense (Dziuba, 2008)—as if it were another bubble that had already burst and gone. For these critics, slapping a “2.0” label on the Web is a deliberate misrepresentation that seeks to hide its lack of substance. Without the sharpness of Scholz or Dziuba, Ben Hunt (2006) made a similar claim with his “Web 2.0 How-to Design Guide,” which catalogs fifteen features of the visual style of Web 2.0. Among them are simplicity, a central layout, gradients, reflections, and star flashes—the “star-shaped labels that you see stuck on web pages” which “work by evoking price stickers in low-cost stores.” Hunt warned that simply adding these features to an existing site “doesn’t make a design ‘2.0’—or good!” But Hunt did not discuss what might make a design “2.0,” electing to remain at the level of visual style. Nor did he suggest ways that stylistic elements like simplicity or transparency might extend beyond visual style to other areas of content. I want to suggest that Hunt has the right concept—style—but the wrong definition: one that opposes style and substance. As Richard Lanham’s The Economics of Attention (2005) argued, that always-problematic distinction has broken down completely given the rise of an attention-based economy focused on managing an abundance of information, as opposed to traditional worries about the scarcity of resources. But that doesn’t mean style is unimportant—quite the reverse. As consumers more often make decisions based on aesthetics and appearance rather than functionality and engineering, attention comes to the foreground in more ways than one. As Virginia Postrel wrote in The Substance of Style (2003), it has to; nearly every product and service comes in a wide array of styles, colors, and patterns, and consumers are required to make more aesthetically oriented choices nearly every time they shop—whether they are purchasing toasters, cars, milk, or babysitting. In an age where design is increasingly powerful, and increasingly visible, style is more and more critical. Lanham suggests a new rhythm of attention has emerged: “oscillatio” (p. xiii), a back and forth between “stuff and fluff” that contrasts with traditional approaches to style that view it as mere seasoning added once the real work is done (as with Ramus’s reaction to the vagaries of Renaissance rhetoric). He contends that we must engage a “revisionist thinking” (p. 254) that engages oscillatio to inspire and create, as well as seek to develop technologies and methodologies that support rapid movement between attention to style and attention to substance. Though I applaud Lanham’s sentiments, I am afraid his approach maintains the style and substance dichotomy. For me, it would be better to shift the definition of style beyond Hunt’s star flashes to more fully acknowledge its connection to and inclusion of substance—the commonalities of stylistic elements of all kinds, not only those manifested in surface features. If we deploy a comprehensive definition, style can provide both the “critical perspective on Web 2.0” that Scholz wishes to offer and the pointed critiques Dziuba and others desire. For composition teachers, an approach to style that does not divorce it from substance is not only familiar but valuable in providing a nuanced approach to aesthetics and design suitable for the complex attention and information economies of the Web. 
To get to the heart of Web 2.0 style, I turn to the definition Francis-Nöel Thomas and Mark Turner (1996) offered in Clear and Simple as the Truth: Writing Classic Prose. For them, a style is “defined by its conceptual stand on truth, presentation, writer, reader, thought, language, and their relationships” (p. 4). Style is never optional, as the common sense opposition of style to substance wrongly indicates: “Nothing we do can be done ‘simply’ and in no style, because style is something inherent in action, not something added to it” (p. 10). Everything we do has a style of some kind. However, because we learn to speak, write, or perform any activity with a default style embedded, the stylistic dimensions of speaking, writing, or any activity are customarily ignored. We cannot perceive a Southern drawl as a style of speech until we hear a Harvard honk or an Irish brogue, nor can we appreciate that typewriters use typefaces until we see computers that present multiple options. However, Thomas and Turner argue that through analysis, practice, and comparison to other styles, we can learn to recognize and differentiate styles, understand their conceptual stands, and enumerate their stylistic elements, which are “fundamental” principles that provide “a small number of starting points at a high level of generality from which all the details of the subject follow” (p. 20). For any style, the elements of the conceptual stand are “definite and few,” never derived from one another and never including the surface features and mechanical rules that correspond to a particular style. Rather, these principles answer questions about the “series of relationships” implied by Thomas and Turner’s definition: What can be known? What can be put into words? What is the relationship between thought and language? Who is the writer addressing and why? What is the implied relationship between writer and reader? What are the implied conditions of discourse? (p. 22) For example, in the conceptual stand of the classic style, Thomas and Turner’s writer converses with receptive readers by offering prose that derives its persuasive value from elegant, efficient presentation of similarly elegant truths. Writing

is a tool with which thoughts are shared, and truths expressed, in an uncommon manner that remains accessible to all receptive to it (see pp. 27-72). On the other hand, plain style “is communal, its model scene in which speakers reaffirm for each other common truths that are the property of all” (p. 76). Thomas and Turner’s definition of style does not cast aside questions about surface features. Rather, they insist we can only understand style by articulating sentence length, vocabulary, syntax, and other surface elements with the conceptual stands that underlie them. Plain style’s lack of ornamentation matters because of its vision of truth as simple and pure, the equal property of all writers and readers. Similarly, to truly understand Web 2.0 style, instead of building lists of surface features (after Hunt) or focusing only on ideological motivations (like Scholz), we should seek to understand the relationships between truth, presentation, writer, reader, thought, and language that Web 2.0 embodies. We can consider this a focus on the “back end” of style, analogous to the databases and algorithmic systems that do the heavy lifting of Web 2.0. As O’Reilly later wrote (2007), “Web 2.0 is not about front end technologies. It’s precisely about back-end, and it’s about meaning and intelligence in the back end.” By naming the elements of Web 2.0 style, I provide a road map to this back end, giving direction to those in rhetoric and composition who, like me, look toward Web 2.0 for inspiration, methodology, and content. Certainly, Scholz and Bray are right that some sites that claim “2.0” status do so only for marketing purposes. For writing teachers, approaching Web 2.0 as a matter of style offers a familiar critical instrument that can help separate the adopters from the pretenders. Thomas and Turner provide not only my definition of style (though slightly modified, as I will soon note), but also my approach. Like Thomas and Turner, I begin defining Web 2.0 style by describing its elementary principles, then describe “trade secrets,” fundamentals of style that seem to contradict Web 2.0’s conceptual stand but are in fact essential. I conclude each section and the essay with discussion of implications for composition studies and the teaching of networked writing.

2. Function comes first

Thomas and Turner begin their definition of style with truth. As their title suggests, truth is at the heart of classic style, from its enabling convention that “truth can be known” (p. 31) to its role as motivation for both reader and writer (p. 35). Truth carries throughout all of the elements of classic style, bearing upon the writer’s stance (pp. 57–59), the presentation (p. 60), and the relationship between thought and language (pp. 64–71). Notably, the ambiguity in Thomas and Turner’s title is deliberate, as their inclusion and description of trade secrets shows. Classic style only appears to be clear and simple, though it is never easy to produce; it purports to present the truth as simple, not contingent, and unmotivated, though it simultaneously acknowledges that this is impossible (pp. 106–107). For Thomas and Turner, then, their definition of style is fitted to the principal element of their target’s conceptual stand: classic style. If generalized, their definition would read, “A style is defined by its conceptual stand on a single core value, presentation, writer, reader, thought, language, and their relationships.” For classic style, this value is truth; for plain style, it is simplicity.
In Web 2.0 style, this core value is function. Many writers consider centrality of function the most important difference between Web 2.0 and its predecessors (Garrett, 2000; Kelly, 2005). Function is the first element O’Reilly (2005) names in “What is Web 2.0,” in his first subhead, “The Web As Platform.” As opposed to the “strategic positioning” of the Web as a conduit for information delivered in discrete units metaphorically called “pages,” Web 2.0 imagines itself as a platform much like a computer operating system on which applications are run and services delivered. This function-oriented approach has been a part of web development theory and practice for quite some time. For example, Jakob Nielsen’s massively influential work focuses on usability, calling for a functionalist approach in both writing style and site design (Morkes & Nielsen, 1997; Nielsen, 2000). As I point out in “Tabling the Grid” (Dilger, in press), the rise of a minimalist, function-oriented approach to design around the year 2000 was, in part, a strong counterpoint to the heterogeneity and chaos of the Web. Today, several schools of design that advance minimalism and functionalism remain influential (Cloninger, 2001; McManus, 2004), including Hunt’s “Web 2.0 visual style.” User-centered design and user-experience design, arguably dominant in web development, focus heavily on function through methods such as task analysis, user testing, and rapid prototyping. Finally, standards-compliant web design, the very popular approach to creating web pages engaged by numerous Web 2.0 sites, privileges function by calling for its “graceful degradation” or “forward compatibility” (Zeldman, 2006, p. 42) through separate encoding of structure, presentation, and behavior. What does function mean for Web 2.0 sites? The site does something, providing a service or helping users with a task. Web 2.0 delivers information that serves its users, supports functions that users shape and control, and/or establishes connections between users and other web entities. This centrality of function provides a simplifying influence that

cascades throughout every element of Web 2.0 style. As every element of classic style’s conceptual stand can be connected to truth, Web 2.0’s conceptual stand revolves around function.

3. Functionality is layered

For Web 2.0 sites, functionality often works on several levels. Quite a few Web 2.0 sites began in the “scratching an itch” manner Eric Raymond (2000) used to describe open source development: as coders solve a problem for themselves—for LibraryThing, quickly creating a catalog of one’s personal library, or for Delicious, a more portable approach to saving web bookmarks—the coders create a tool that is generally useful. Additional functions can be layered over the core function that drives the site. For example, using LibraryThing, I can find other books similar to mine, share my notes about books with other users, read the reviews others have written, and build and deliver reading lists for book clubs, courses, or other purposes. These “over the shoulder” functions often leverage network effects, making these seemingly secondary uses profoundly different from the primary function. In the case of LibraryThing, aggregation allows site users—registered or unregistered—to find out the popularity of books, to quickly learn the differences between editions of books, or to discover trends for subject areas on a given day or month. O’Reilly described this phenomenon as “harnessing collective intelligence,” tapping into what James Surowiecki (2004) called “the wisdom of crowds.” The best Web 2.0 sites successfully connect this collectively developed information to the individual engagement with core functionality that makes this connection possible. Sites often explicitly delineate levels of functionality so registration or payment is not necessary for basic functionality, only for more advanced functions. The programming site Stack Overflow (2009) requires registration only for building reputation points. Asking questions and reading answers is open to all. This holds true for online retailer Amazon.com, the most widely used library catalog on the Web; users do not have to buy anything from Amazon to enjoy its speedy and flexible searching, robust metadata, and integrated product reviews (Dilger & Thompson, 2008). Making an account opens the door to more features, such as lists, automated recommendations, and one-click ordering. Sites like photosharing leader Flickr, which use the “freemium” approach, extend this model even further, offering some functionality at no cost, and additional functions to paid users. Anyone who creates a Flickr account can upload 100MB of photos every month, tag photos, comment on others’ photos, and participate in pools and groups. However, purchasing a subscription unlocks additional features, such as unlimited uploads and the ability to include videos as well (Flickr, 2008).

4. Though often sophisticated, functions are never hidden

Every Web 2.0 site has a core function which is quickly apparent, has a low threshold of effort, and delivers clear benefit to end users. Google provides the best example: type something into the box, click “search,” and results appear a second later. No directions are required, and the interface could be no simpler. Yet a wide variety of Web 2.0 sites with far more sophisticated functions are almost as straightforward, allowing account creation with little more than an email address and providing basic functionality without demanding complex planning by the user.
For example, the social bookmark manager Delicious arguably works best if users plan their usage of tags, deciding whether to use plurals, multi-word tags, certain kinds of punctuation, and rectifying overlap between tags before tagging a bunch of sites. However, the site doesn’t encourage this kind of pre-planning by suggesting that new users read “Delicious 101” before starting to add and tag bookmarks. Rather, the mental model is learning by doing, which we can summarize as “function before complete understanding”—or even “complete understanding comes through function.” This ethos extends to a wide variety of sites that promise that users can “get started today” whether or not they check out the short tour. On site after site, quick starts and readily apparent results are the norm. I still remember the rush I felt when, after using Delicious for an hour or so, I realized it had solved my bookmark problem. Because I use many different computers in multiple locations, some with multiple web browsers, saving bookmarks inside browsers just does not work for me. After just a few minutes, I could tell I would benefit from Delicious’ core functionality of saving bookmarks to the network, and I committed to using it by reading some of the site’s recommendations and then organizing and standardizing my hastily entered bookmarks. This required that I use more advanced tools that required a bit more effort to learn, but allowed me to benefit from the secondary functionalities that make the site so useful: comparing my bookmarks with others, copying bookmarks others had already tagged, and using my bookmark stream to build lists of useful sources for students. As Delicious creator Joshua Schacter observed, layering functionality is

critical for sites that lack the user base to create the network effects some mistakenly assume are the sole heart of Web 2.0: “For a system to be successful, the users of the system have to perceive that it’s directly valuable to them. . . If you need scale in order to create value, it’s hard to get scale, because there’s little incentive for the first people to use the product. Ideally, the system should be useful for user number one” (as cited in Surowiecki, 2006, para. 6). With function up front and a quick route to competency—which is usually the case on sites of this type—other sections, features, or visuals never overshadow the core functionality. As noted above, the design of Google’s front page epitomizes this approach. It does not get simpler than a title and a single search box. Though signing up for a Google account allows customization of the Google front page and provides access to a host of other functionalities through Gmail, Blogger, and other sites, account creation is by no means necessary. John Battelle (2005) has argued that Google’s decision to privilege search over its other services led to increased user satisfaction and comfort which made later adoption of those other services more likely. Other sites that embrace the Web 2.0 style have followed suit: the Basecamp project management software of 37Signals; BeerAdvocate’s catalog of beer reviews and beer-related places; and YouTube’s rotation of videos being watched by users at the current moment on its front page.

5. Function, not presentation, provides identity

Function can be independent of the site itself. The creators of early commercial web sites obsessed over “stickiness,” assuming that the longer users stayed on a site, the more money they spent (Siegel, 1997). Nielsen perhaps unwittingly encouraged this “keep your site in front of the user” mentality by repeatedly yelling, “The rest of the Web is just a click away!” Some overeager web sites even used scripting to disable the “back” button, or pop-ups to trap users on sites, or redirection to prevent cross-site embedding of media. Many Web 2.0 sites, however, offer their core function transparently with no intention of forcing their presentation on readers. Feeds are the key to this approach. Syndication via Really Simple Syndication (RSS) or Atom provides a stream of data that can be read or remixed at will. To use Thomas and Turner’s terms, readers are in control of presentation, not writers. For example, Google allows search results to appear in other places via its application programming interface (API) and other “syndicated search” options. Most photo- and video-sharing sites allow the embedding of media hosted on their servers in other pages—and quite a few even encourage it by making code readily available, as I note below. Usually, tiny indicia identify the hosting site, such as the “YouTube” logo that appears in the lower right of embedded videos, but in some cases, there is no identification at all. Sites that deliver function offsite via APIs or hotlinking often request a link back to the page where the embedded media or information can be found. The bond between identity and functionality, as opposed to identity and presentation, explains why Web 2.0 sites do not allow users to customize the appearance of their profile or user pages. Though the standards-compliant web design used by many Web 2.0 sites makes user control of appearance and design technically possible, even easy, few sites allow it.
Compare Facebook and Myspace: both allow the addition of custom applications (“boxes”), but only Myspace allows wholesale changes in appearance, and a cottage industry of sites has grown to support it (Yardi, Luther, Diakopoulos, & Bruckman, 2008). Flickr offers limited stylistic control; Delicious none. Site users who wish to customize the presentation of their collected data must do so via external means—adding custom Cascading Style Sheets (CSS) or layouts to a mashup created and hosted elsewhere. Browser extensions such as Greasemonkey and sites such as ProgrammableWeb enable creation of content that combines the functionality offered by distinct sites, often reforming presentation and eliminating the branding customarily included on a web site (Ballentine, 2009). Branding and imaging, so important in early stages of the Web and still a huge part of consumer culture, take a back seat to functionality. In Web 2.0 style, readers’ engagement with content is more important than conventional approaches to branding and design, which insist on presentation as created by writers or designers. Many sites continue to use email alerts and links to create user traffic, but as alternative ways to view content increase (e.g., RSS, alternative sites for mobile devices), the center of identity grows ever more functional.

6. There are many kinds of readers and writers

Web 2.0 style understands that both “reader” and “writer” are in many senses plural, layered, and complex, much like Web 2.0’s approach to function. There are many ways to be the writer of a Web 2.0 site, far more than for traditional web presences. Contributing users can take on the following writerly roles:

1) Designing sites, or broadly speaking, being members of the teams who code back ends to create and update site functions;
2) Administering sites, again working as team members who ensure that sites run smoothly;
3) Serving as community leaders who help build and maintain connections between users, whether employees, end users serving with the blessing of site administrators, or end users serving because of their community standing;
4) Contributing content to their accounts on the site;
5) Connecting their content to other users’ content;
6) Allowing their content to be aggregated;
7) Communicating with other users and/or site designers about content, aggregations, or other communication;
8) Providing direct feedback about site designs to designers, administrators, or community leaders who can modify them (usually via surveys or similar means);
9) Providing indirect feedback when behavior is tracked individually, as in error logs, and thereby influencing site design;
10) Providing indirect feedback when behavior is tracked in the aggregate, through usage patterns analyzed by designers and administrators, and thereby influencing site design.

External non-human agents can be writers, too:

11) Search engines crawling the site, collecting data, and representing it in their indexes;
12) Other sites and services contributing content through the use of an API.

Similarly, Web 2.0 sites have many kinds of readers, both human and machine. Human users can engage content, whether their own, others’, or aggregated. They can observe or follow connections between content, both theirs and others’. They can read commentary on their own content, others’ content, or communication about content. Or they can engage sites’ infrastructures: documentation, demonstrations, or feedback capabilities. Search engines and other automated agents are also readers, accessing individual and aggregated content streams as well as commentary of all kinds. Though I have probably missed a few, there are at least twenty distinct ways for human and non-human agents to be writers or readers of Web 2.0. Obviously, there is room for overlap as well. Sites will have active, engaged users who fit several of these profiles and move quickly between and/or combine several of these roles. Again, this is why Web 2.0 sites must be truly committed to user-centered design: tremendous resources are required to establish feedback infrastructure, determine what user profiles exist, discover which profiles best fit users, and refine methodologies for listening to one’s own users. Attention to community, then, is not necessarily a pernicious attempt to attract users who can be quickly alienated from the value of their labor (Scholz’s position), but an essential element of approaching an audience that potentially breaks down into hundreds of distinct use cases. The many kinds of readers and writers that populate Web 2.0 arise directly from the emphasis Web 2.0 gives to function, the real engine of its much-ballyhooed participatory nature.

7. Code is a window

Thomas and Turner showed that for classic style, the relationship of thought and language is not seen as problematic. Thanks to the invisible labor of the classic writer (p. 60), “it is always possible to achieve a perfect fit between a thought and its expression” (p. 65). In the conceptual stand of classic style, prose is a perfect, efficient, exacting window. Of course, classic writers realize prose cannot live up to this standard, but they accept this “enabling convention”
(p. 33) because it allows classic style to differentiate itself from other styles. For Web 2.0 style, interoperation, not representation, must be transparent so that function can ascend to its primary position and the display of information need not be tied to a single device (O’Reilly, 2005). That is, like classic writers whose communicative efficacy depends on an agreement to see language as a vehicle for thought, web designers realize a functional Web by agreeing to make their code work with other sites as much as possible. This enabling convention of sharing makes code the window of Web 2.0. In Web 2.0 style, the ideal encoding has the least possible complexity and is easily understood by both human and machine readers. Consider permalinks, web addresses that point to specific weblog entries. With this “trivial piece of functionality,” bloggers can easily refer to other bloggers’ postings, creating citation pathways for human readers and search engines as well (Coates, as cited in O’Reilly, 2005). A little transparent code goes a long way, and

similar encodings, often using markup like HTML, are everywhere in Web 2.0. On many sites, web addresses use simple encoding to display a variety of information sets and/or search results. For example, adding a tag to the end of my Delicious address shows all my bookmarks for “style,” and appending a second tag shows the bookmarks I’ve tagged both “style” and “web.” To switch from browser-based display to an RSS feed, one need only change the URL slightly. Many of these encodings can be quickly decoded and manipulated by end users, a practice called “URL Hacking” (Jerz, 2000) that most Web 2.0 sites do little if anything to discourage. In fact, Google actively supports this kind of learning with its advanced search form. For example, to search for PDF files that contain the phrase “Web 2.0” and the word “style,” users should:

1) Type “Web 2.0” in the “Exact wording or phrase” box,
2) Type “style” in the “All these words” box,
3) Select “Adobe Acrobat PDF” from the “File type” menu, and
4) Type “.edu” in the “Search within a site or domain” box.

As the form is filled out, Google builds a complex search query (“Web 2.0” style filetype:pdf) in plain sight. As a result, users can quickly learn the simple encoding Google’s servers use to identify phrases and specific file types. This approach to encoding parallels the way many writers learned to write web pages—by comparing code exposed through “view source” with rendered sites (Rice, in press). On the other hand, Web 2.0 sites recognize that many of their readers and writers do not speak these languages and do not want to. Users do not have to remember “filetype:pdf”—they can use Google’s form instead. Similarly, Delicious allows users to create the tag unions demonstrated above by selecting links or manipulating code in URLs, whichever works better for particular readers or writers. YouTube, Flickr, and other photo- and video-sharing sites offer code snippets isolated in form controls, allowing users to quickly embed and share media by cutting and pasting the code into other sites or software without knowing the code. Many sites offer slightly different code blocks for different purposes. A common approach includes several kinds of code targeting email, web sites, and blogs, as well as forums and bulletin boards. Numerous comment and posting forms use the TinyMCE editor or similar browser-based software to allow word-processor-style text editing, including the addition of links or media. As Gregory Ulmer has observed (2003), “cut and paste” coding has become a fundamental strategy of web writing, recalling ancient pedagogical strategies such as imitatio. In sum, approaching code as a necessarily simple tool for readers to enable functionality often involves writers doing the work of coding on behalf of readers, thereby facilitating the establishment of relationships between writers through intermediary readers. Empowered by these technologies, users can engage the same manipulations and produce the same results as those who are fluent users of code—and if they choose, leverage these affordances into learning. For those who do engage code, most Web 2.0 sites offer direct access to data via APIs, which use a variety of technologies. This code is often standardized, following open specifications such as RSS or JavaScript Object Notation (JSON), allowing users who learn a particular method of encoding to apply that knowledge to a wide variety of sites. Standardization is also the engine behind widespread automation: when a critical mass of sites use one approach, network effects further boost its adoption (Benkler, 2006). Metatools for handling automated use of APIs become possible, such as sites like FriendFeed which aggregate the automated display of data from multiple social networking sites, or software libraries like Magpie, which programmers can use to quickly convert RSS to HTML. The API functions for two reasons: first, developers choose to provide open access to encoding systems, and do not conceal or restrict access via patents, licensing, or other means. In most frameworks, intellectual property is not open for the taking, but selected portions which support relevant reader- and writer-oriented functions are made available for limited use. Secondly, non-human agents can gain access through an API, as opposed to approaches that use techniques that ensure human users are doing the clicking. In turn, this function empowers human agents, who no longer have to generate data sets by hand and can instead read and write using the automata’s output. 
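To make the transparency of these encodings concrete, the sketch below shows the kind of RSS-to-HTML conversion that libraries like Magpie perform, written here in Python with the feedparser library rather than PHP; the feed address is a placeholder of my own, not a URL taken from any particular site.

```python
# Illustrative sketch only: convert an RSS or Atom feed into a simple HTML list,
# in the spirit of libraries like Magpie. Uses the Python "feedparser" library;
# the feed URL below is a placeholder, not a documented endpoint.
import feedparser
from html import escape

def feed_to_html(feed_url: str, limit: int = 10) -> str:
    """Fetch a feed and return its newest entries as an HTML unordered list."""
    feed = feedparser.parse(feed_url)
    items = []
    for entry in feed.entries[:limit]:
        title = escape(entry.get("title", "Untitled"))
        link = escape(entry.get("link", "#"), quote=True)
        items.append(f'  <li><a href="{link}">{title}</a></li>')
    return "<ul>\n" + "\n".join(items) + "\n</ul>"

if __name__ == "__main__":
    # Placeholder address following the tag-feed pattern discussed above.
    print(feed_to_html("http://example.com/rss/tag/style"))
```

The point is only that a standardized encoding such as RSS lets a few lines of general-purpose code re-present another site's content, which is the enabling convention of sharing at work.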
Code begets function and vice versa.

8. Networking changes everything

The programming language Perl is known (positively and negatively) for its philosophy of “no unnecessary limits.” Perl can read a file of any size, create arrays with any number of elements, and use all available memory for its operations (Schwartz, Phoenix, & Foy, 2005). Web 2.0 style takes the same approach to networking, bearing out the last and most

important part of Thomas and Turner’s definition, “and their relationships” (p. 4). In the conceptual stand of Web 2.0 style, readers and writers never ask how networking should be limited, but rather the exact opposite. The answer to the question, “What can be networked?” is always “Everything.” This remains true writ large as readers and writers ask, “Can we make these data the central content of a site, or this function its core activity?” It is also true writ small, when they consider, “Should we enable connections between these elements?” Always answering “Yes” to these and similar questions maximizes potential network effects, long recognized as critical for the function of any communications network but especially important for Web 2.0 (Porter, 2008). Of course, Web 2.0 style does not undercut functionality with excessive articulation. Wikipedia, for example, does not link every possible cross-reference. Elements related to core functionality, however, are usually just a click away. The term “networking” itself serves as shorthand, perhaps problematically, for a stunning heterogeneity of connections, articulations, exchanges, and movements. Clay Spinuzzi (2008) points out the many different approaches to understanding “network,” the result of fundamental differences between the many theoretical frameworks applied to networks. Like Spinuzzi, who does not “reconcile” approaches but “attempts to keep them in productive tension, yielding a productive dialogue,” (p. 16), Web 2.0 style welcomes a variety of conceptualizations of “networking.” Web 2.0 writers and readers applaud multiplicity, articulating the “relationships” in the conceptual stand both in the manner of actor-network theorists who “foreground the continual recruiting of new allies—both human and non-human,” and as activity theory, where the conceptual stand is “grounded in the orientation of particular activities toward particular objects” (p. 16). Web 2.0 style’s conceptual stand can approach networks using any of the conceptualizations that Spinuzzi documents: the diversity and interconnection of rhizomes; the tree-like structures of hierarchies; or the splicing and weaving required to make connections or establish division points in net work. Web 2.0 style also recognizes that the changes networking provokes are fundamental (Latour, 2006). When I moved my bookmarks from computers scattered throughout my workspaces to Delicious, I did not simply put my bookmarks online, but rather I transformed their very nature over and above the taxonomies (folksonomies) Delicious facilitates. This change restored my confidence in bookmarking, changed my bookmarking patterns, and enabled my bookmarks to be connected to other bookmarks and users. In fact, networking my bookmarks was less a single change than a matrix of changes with cascading effects that could in turn be examined, rearticulated, and refocused. When networked, my list of bookmarks becomes a potential source of information with uses and connections that are simply impossible when the bookmark list exists in isolation (Shirky, 2005). As I add bookmarks, changes to my Delicious stream become a source of information as well, showing what research projects I am currently working on or changes in my thinking given the way I tag and comment on certain links, and so on. The articulation of other functionalities (user-generated tagging, description, aggregation) to the core function of saving bookmarks redefines the core act itself, and so on. 
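A brief sketch may help illustrate the kind of aggregation that becomes possible once bookmarks are networked. The data and the counting logic below are invented stand-ins for the folksonomy-driven features described above, not Delicious’s or LibraryThing’s actual code.

```python
# Minimal sketch with invented data: once bookmarks are networked, simple
# aggregation across users surfaces popularity and related tags -- a stand-in
# for the folksonomy-driven features described above, not any site's real code.
from collections import Counter
from itertools import combinations

# (user, url, tags) records; entirely hypothetical.
bookmarks = [
    ("ann",  "http://example.com/clear-simple", {"style", "writing"}),
    ("ben",  "http://example.com/clear-simple", {"style", "rhetoric"}),
    ("cara", "http://example.com/web2-guide",   {"style", "web", "design"}),
    ("dev",  "http://example.com/clear-simple", {"writing"}),
]

# Popularity: how many distinct users saved each URL.
popularity = Counter(url for _, url, _ in bookmarks)

# Tag co-occurrence: which tags travel together across the whole collection.
pairs = Counter()
for _, _, tags in bookmarks:
    pairs.update(frozenset(p) for p in combinations(sorted(tags), 2))

print(popularity.most_common(1))   # the most-saved bookmark
print(pairs.most_common(3))        # tags most often used together
```

Even this toy aggregation shows how the networked collection yields information (popularity, related tags) that no isolated bookmark list could.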
Following Thomas and Turner’s approach, I now turn to the two “trade secrets” of Web 2.0 style. As Thomas and Turner argued (p. 103), expert practitioners of a style understand not only the elements of that style and the principles of the conceptual stand that support them, but also the style’s limitations and contradictions. These “trade secrets” may be widely known but are seldom discussed.

9. Users are more important than any user

In late 2006, Time named “You” as its “Person of the Year,” proclaiming the end of the “Great Man” narrative as far as the Web is concerned, with a cover that included a Mylar mirror intended to “literally reflect the idea that you, not we, are transforming the information age” (Stengel, 2006, p. 1). As Lev Grossman (2006) wrote, the biggest story of 2006 was not “about conflict or great men,” but “community and collaboration on a scale never seen before. . . the many wresting power from the few and helping one another for nothing and how that will not only change the world, but also change the way the world changes” (p. 40). Echoing much of the dot-com boom’s ebullient language, though with caveats about the possibility of complete failure, Grossman spoke of “an explosion of productivity and innovation. . . a massive social experiment” (p. 41) that would be very interesting to observe and even more fascinating to be a part of. Grossman’s article used “we” more often than it used “you”: the latter appears five times but the former nine. (Stengel’s “we,” quoted above, was not the “we” of web users in the aggregate, but of the editorial staff of Time, similar magazines, and their parent companies—what most people call “the media.”)


Though the Time special issue implies otherwise, it is not the you in Web 2.0 that makes it different but the we, in two senses: first, the aggregation of data known by a variety of different names (crowdsourcing, the wisdom of crowds, prediction markets, etc.); second, the suturing of communication between “you’s” of all kinds, offering the possibility of producing thousands of overlapping and interlocked “we’s.” Kevin Kelly had it right: “We are the Web” in our ability to build networks of all kinds, involving all manner of subject matter; in our roles as many kinds of writers and readers; in our abilities to do things using the Web; in our abilities to talk about all of these things. Web 2.0 must be about this “we” because the elements of its conceptual stand depend on the audience’s makeup of “we’s” of different overlapping groups. Much of Amy Shuen’s Web 2.0: A Strategy Guide (2008) speaks to a balance between extracting value from or capitalizing on users and providing them services that take in their needs and are useful to them. Scholz (2008) also suggests there is middle ground between exploitative grabbing of user content and an empowering “social web.” Since the network effects upon which much of Web 2.0 is built depend on a critical mass of users, the conceptual stand of Web 2.0 style requires considering all users before taking care of any particular user. Furthermore, Clay Shirky (2003) observes that, as online systems develop dedicated groups of “core users,” the community’s health becomes more important than any individual user’s rights. These ideas imply the cultivation of a new kind of design: users-centered, as opposed to user-centered, balancing the functions and needs of user-as-individual with those of users-as-groups—an activity familiar to practitioners of user-centered design but raised in importance by the “us” and the “we” of Web 2.0 style. Indeed, the rise of user-experience design can be seen as acknowledgment of the need to refine methodologies to focus less on individual users and more on their aggregated experiences. The increased importance of this balancing act has remarkable consequences for theorizing and teaching concepts of audience in composition.

10. Weak ties are enough

In its early days, the Web was criticized, even attacked, by hypertext theorists who saw its single-function one-way links as weak substitutes for the multiple-function, two-way links of Storyspace or similar proprietary software (Brooke, 2008). Today, Eastgate Systems still uses the motto “serious hypertext,” implying that the Web’s version of hypertext falls short in many ways. Arguably, this implication is not true, given the richness of interaction supported by technologies such as Flash and the Ajax development approach. Furthermore, the scope, scale, and cultural influence of the Web clearly show that, however simple, web-based hypertext can be stunningly powerful. For Web 2.0 style in particular, weak ties are always enough. For Mark Granovetter (1973), weak ties are the bonds between acquaintances, less powerful than friendships but valuable because they connect communities to each other. As Bill Thompson and I argue about cataloging, “[B]ecause weak ties can be made quickly, we can afford to have many of them without expecting rewards on the short or long term. And different types of ties are more amenable to individuals for different reasons” (Dilger & Thompson, 2008, p. 43). The Web’s diversity of uses for links, for example, becomes a strength rather than a weakness.
A link used ironically or humorously, such as targeting an image with the text “Fail,” still makes a connection regardless of its intention. Why is the weakness of links a secret? Two reasons. First, the language of Web 2.0 often portrays the weak ties foundational to its networks as strong ties by shifting the frame from the sterility of computer networking to warmer metaphors of friendship and the home. Social networking sites use language like “friend” far more often than language like “contact,” though the latter is arguably more accurate. The metaphor of “home page” remains strong, even though the personal home page is supposedly dead (Hopkinson, 2008). “Home” is used in other ways; Wordpress.org, for instance, uses the slogan “Your blogging home.” Common sense validates the need for these deep and strong ties, with the “friends” of social networking services (still seen by many as an epitome of Web 2.0) often ridiculed in the popular press (e.g., Rosen, 2007). Secondly, the common sense of “networking” suggests a physical connection: a tie, a link, a net, a piece of string, or sticky tape. But as Spinuzzi (2008) points out, this narrow view can be problematic. Though networking includes transports such as “wire, wood, and glass,” and many imagine charts with lines drawn between agents, “networking” represents a huge diversity of activities: creating lists, grouping items, building friendships, writing online, and more. Weak ties can always be formed with links—the currency of the Web, thanks to Google—but they can also be formed in a variety of other ways. Web 2.0 style recognizes this and does not try to get past it: its most successful sites revolve around low-risk, low-effort networking, offering a variety of ways to create, comment on, and engage with others’ networks.
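Granovetter’s point can be illustrated with a toy example of my own, not the article’s: in the sketch below, which assumes the Python networkx library, two tightly knit clusters are joined by a single acquaintance tie, and that weak tie is the only bridge between the communities.

```python
# Toy illustration of Granovetter's weak ties (not from the article): two dense
# friendship clusters joined by one acquaintance tie. The lone cross-cluster
# edge is the "bridge" that connects otherwise separate communities.
import networkx as nx

G = nx.Graph()
# A cluster of close friends.
G.add_edges_from([("ann", "ben"), ("ben", "cara"), ("ann", "cara")])
# A second, separate cluster.
G.add_edges_from([("dev", "eli"), ("eli", "fay"), ("dev", "fay")])
# One weak tie between acquaintances in different clusters.
G.add_edge("cara", "dev")

# nx.bridges yields edges whose removal would disconnect part of the graph;
# here only the weak tie qualifies, e.g. [('cara', 'dev')].
print(list(nx.bridges(G)))
```

The edges inside each triangle are redundant; the single weak tie is what connects the two communities at all, which is exactly the value Granovetter assigns to it.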


11. Our own Web 2.0?

I began this essay by connecting criticisms of “Web 2.0” as a marketing buzzword to the familiar but problematic question of “style vs. substance.” I noted that Lanham’s The Economics of Attention (2005) suggests this distinction has become irrelevant in our information- and attention-based economy. For me, Lanham’s work shows that a robust concept of style, like Thomas and Turner’s, is more important than ever. With a rich definition that acknowledges function, presentation, writer, reader, thought, language, and their relationships, we can elevate style from Ramus’ dustbin, motivating it for the critical activities Scholz and others rightfully call for, such as determining whether a new web service is, after Dziuba, “not based on technology, but on a dog-and-pony show” (as cited in Sloan, 2007). Given that many educators and educational institutions are embracing Web 2.0 with enthusiasm, even hastiness, critical apparatus is badly needed. Ray Henderson, an executive with oft-criticized courseware vendor Blackboard, recently launched a weblog and asserted that Blackboard was “going to communicate more often and more openly” (2009a). Henderson (2009b) also promised more adoption of and “leadership” in standards-based development. Blackboard (2009) also heavily promoted the “Web 2.0 look-and-feel” of its new courseware, asserting that it would be “an open foundation for whatever complementary technologies [educational institutions] need to support their approach to teaching and learning” (Redden, 2009). Indeed, the new Blackboard has the Web 2.0 look that Hunt documents as well as many of the features O’Reilly (2005) describes as the Web 2.0 pattern language. However, its embrace of the stylistic principles I offer remains an open question which must be actively engaged by scholar-technologists: is Blackboard really offering courseware that leverages function to create network effects and empower end users? Or is it adding star flashes, rounded corners, blog modules, and faux-participatory features to sell software? That is, has Blackboard embraced the conceptual stand that I argue underlies Web 2.0 style? Similarly, we should ask hard questions about Web 2.0 in the educational enterprise. My institution, Western Illinois University (2009), has established its own Web 2.0 presence, including a Delicious account, with links to those who have done the same. On the one hand, this approach is clearly wiser than the passive alternative, assuming that Web 2.0 can be ignored since it is just a soon-to-pass marketing fad. The aggressive alternative, actively blocking Web 2.0 sites because they are distractions that suck up too much bandwidth, is even less constructive, as Brigham Young University’s recent decision to stop blocking YouTube shows (Haddock, 2009). On the other hand, I find myself wondering what it means for a university to have a Delicious account: who makes the decisions about what is bookmarked? Most of WIU’s bookmarks point to the university itself. If all Delicious users used the site in that manner, cataloging only their own web presences, much of its robustness would vanish. In stylistic terms, this approach would ignore readers, or at best would imagine their interests as limited solely to gathering the information supplied by writers. A purely self-referential Web 2.0 would have no relationships on which to build a style.
Henry Jenkins (2008) suggested that universities should not build their own alternatives to Web 2.0 and similar social networking sites because, despite best intentions, these homegrown versions of YouTube often fall short:

Many universities are trying to figure out how they can build “something like YouTube” to support their educational activities. Most of them end up building things that are very little like YouTube in that they tend to lock down the content and make it hard to move into other spaces and mobilize in other conversations. In a sense, these university based sites are about disciplining the flow of knowledge rather than facilitating it. (para. 16)

The path Jenkins outlines involves critical engagement with Web 2.0 on its own turf, so to speak. Given suitable infrastructure, institutions can mix the two approaches—creating their own software installations that use RSS and similar technologies to allow content to be integrated with mainstream Web 2.0 sites, or for that matter any site that affords it. Composition teachers mobilizing Web 2.0 or asking students to do the same can evaluate the efficacy of their work by asking a single question based on Thomas and Turner’s definition of style: “How does our work approach function, presentation, writers, readers, thought, language, and their relationships?” As I suggested earlier, and as Jenkins’s other work with community (2006) also indicates, the final three words of that question are the most important—and the most complicated. We will never truly understand our approach to Web 2.0 by making an inventory of its stylistic elements, visual or otherwise. Instead, however complex it may be, we must continue to investigate the complex connections between all of the elements that Thomas and Turner name and seek to discover and document the principles that built, support, and guide the relationships between them.

Bradley Dilger is an associate professor of English at Western Illinois University, where he studies and teaches software studies, technical communication, writing studies, web accessibility, and new media.


References

Ballentine, Brian D. (2009). Hacker ethics & Firefox extensions: Writing and teaching the “grey” areas of Web 2.0. Computers and Composition Online, Fall 2009. Retrieved from http://www.bgsu.edu/cconline/Ballentine/
Battelle, John. (2005). The search: How Google and its rivals rewrote the rules of business and transformed our culture. New York: Portfolio.
Benkler, Yochai. (2006). The wealth of networks: How social production transforms markets and freedom. New Haven: Yale University Press.
Blackboard. (2009). Release 9 | What’s new | Web 2.0. Retrieved from http://www.blackboard.com/Release9/Release-9/What-is-New-in-9/Web-20.aspx
Brooke, Collin Gifford. (2008). Revisiting the matter and manner of linking in new media. In Byron Hawk, David M. Reider, & Ollie Oviedo (Eds.), Small tech: The culture of digital tools (pp. 69–79). Minneapolis: University of Minnesota Press.
Cloninger, Curt. (2001). Fresh styles for web designers: Eye candy from the underground. Berkeley: New Riders.
Dilger, Bradley, & Thompson, Bill. (2008). Ubiquitous cataloging. Radical cataloging: Essays at the front. Jefferson, NC: McFarland & Company.
Dilger, Bradley. (in press). Tabling the grid. In Bradley Dilger & Jeff Rice (Eds.), From A to : Keywords of markup. Minneapolis: University of Minnesota Press.
Dziuba, Ted. (2008). The fall of the house of crunch. Retrieved from http://uncov.com/fall-of-the-house-of-crunch
Flickr. (2008). Can I pay to keep more of my photos on Flickr? Retrieved from http://www.flickr.com/help/limits/#28
Garrett, Jesse James. (2000). The elements of user experience. Retrieved from http://www.jjg.net/elements/pdf/elements.pdf
Granovetter, Mark. (1973). The strength of weak ties. American Journal of Sociology, 78(6), 1360–1380.
Grossman, Lev. (2006, December 25). Time’s person of the year: You. Time, 168(26), 38–41.
Haddock, Marc. (2009, June 26). BYU unblocks on-campus access to YouTube. Deseret News. Retrieved from http://www.deseretnews.com/article/705313132/BYU-unblocks-on-campus-access-to-YouTube.html
Henderson, Ray. (2009a, June 9). On beginning. Ray Henderson. Retrieved from http://www.rayhblog.com/blog/2009/06/on-beginning.html
Henderson, Ray. (2009b, June 23). Openness and standards at Blackboard. Ray Henderson. Retrieved from http://www.rayhblog.com/blog/openness-and-standards-at-blackboard.html
Hopkinson, Jim. (2008, August 21). Episode 18: Death of the website home page. Retrieved from http://thehopkinsonreport.com/2008/08/21/episode-18-death-of-the-website-home-page/
Hunt, Ben. (2006, December 20). Web 2.0 how-to design style guide. Web design from scratch. Retrieved from http://www.webdesignfromscratch.com/web-2.0-design-style-guide.cfm
Jenkins, Henry. (2006). Convergence culture. New York: NYU Press.
Jenkins, Henry. (2008, October 13). Why universities shouldn’t create “something like YouTube” (Part One). Confessions of an aca-fan. Retrieved from http://henryjenkins.org/2008/10/why_universities_shouldnt_crea.html
Jerz, Dennis. (2000). URL-hacking: Do-it-yourself navigation. Jerz’s literacy weblog. Retrieved from http://jerz.setonhill.edu/writing/e-text/url-hacking.htm
Kelly, Kevin. (2005, August). We are the web. Wired, 13(08). Retrieved from http://www.wired.com/wired/archive/13.08/tech.html
Latour, Bruno. (2006). Reassembling the social: An introduction to actor-network-theory. Oxford: Oxford University Press.
MacManus, Richard. (2004, April 28). The evolution of corporate web sites. Digital Web Magazine. Retrieved from http://www.digitalweb.com/articles/the_evolution_of_corporate_web_sites/
Morkes, John, & Nielsen, Jakob. (1997). Concise, scannable, and objective: How to write for the web. Retrieved from http://useit.com/papers/webwriting/writing.html
Nielsen, Jakob. (2000). Designing web usability: The practice of simplicity. Indianapolis: New Riders.
O’Reilly, Tim. (2005, September 30). What is Web 2.0: Design patterns and business models for the next generation of software. O’Reilly radar. Retrieved from http://www.oreillynet.com/pub/a/oreilly/tim/news/2005/09/30/what-is-web-20.html
O’Reilly, Tim. (2007, October 4). Today’s Web 3.0 nonsense blogstorm. O’Reilly radar. Retrieved from http://radar.oreilly.com/archives/2007/10/web-30-semantic-web-web-20.html
Porter, Joshua. (2008). Designing for the social web. Indianapolis: New Riders.
Postrel, Virginia. (2003). The substance of style: How the rise of aesthetic value is remaking commerce, culture, and consciousness. New York: Harper Perennial.
Raymond, Eric. (2000). The cathedral and the bazaar. Retrieved from http://www.catb.org/~esr/writings/cathedral-bazaar/cathedral-bazaar/
Redden, Elizabeth. (2009, January 27). Blackboard, 9.0. Inside Higher Ed. Retrieved from http://www.insidehighered.com/news/2009/01/27/blackboard
Rice, Jeff. (in press). English . In Bradley Dilger & Jeff Rice (Eds.), From A to : Keywords of markup. Minneapolis: University of Minnesota Press.
Rosen, Christine. (2007). Virtual friendship and the new narcissism. The New Atlantis, Summer 2007. Retrieved from http://www.thenewatlantis.com/publications/virtual-friendship-and-the-new-narcissism
Scholz, Trebor. (2008). Market ideology and the myths of Web 2.0. First Monday, 13(3). Retrieved from http://www.uic.edu/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/2138/1945
Schwartz, Randal L., Phoenix, Tom, & Foy, Brian D. (2005). Learning Perl (3rd ed.). Sebastopol: O’Reilly Media.
Shirky, Clay. (2003). A group is its own worst enemy. Retrieved from http://shirky.com/writings/group_enemy.html
Shirky, Clay. (2005). Ontology is overrated: Categories, links, and tags. Retrieved from http://www.shirky.com/writings/ontology_overrated.html
Shuen, Amy. (2008). Web 2.0: A strategy guide. Sebastopol: O’Reilly Media.
Siegel, David. (1997). Creating killer web sites: The art of third-generation site design. Indianapolis: Hayden Books.


Sloan, Julie. (2007). Q&A: Foul-mouthed blogger Ted Dziuba tells why most startups fail. Retrieved from http://www.wired.com/techbiz/people/news/2007/10/dzubia_qa
Spinuzzi, Clay. (2008). Network. Cambridge: Cambridge University Press.
Stack Overflow. (2009). Frequently asked questions. Retrieved from http://stackoverflow.com/faq
Stengel, Richard. (2006, December 25). Now it’s your turn. Time, 168(26), 1.
Surowiecki, James. (2004). The wisdom of crowds: Why the many are smarter than the few and how collective wisdom shapes business, economies, societies and nations. New York: Anchor/Random House.
Surowiecki, James. (2006). TR35 2006 young innovator: Joshua Schacter, 32. Technology Review. Retrieved from http://www.technologyreview.com/tr35/Profile.aspx?Cand=T&TRID=432
Thomas, Francis-Noël, & Turner, Mark. (1996). Clear and simple as the truth: Writing classic prose. Princeton: Princeton University Press.
Western Illinois University. (2009). New media: University Relations. Retrieved from http://www.wiu.edu/U-Relations/newmedia.php
Yardi, Sarita, Luther, Kurt, Diakopoulos, Nick, & Bruckman, Amy. (2008). Opening the black box: Four views of transparency in remix culture. 2008 Association for Computing Machinery conference on computer supported cooperative work (CSCW 2008). Retrieved from http://kurtluther.com/pdf/cscw08_w12_yardi.pdf
Zeldman, Jeffrey. (2006). Designing with web standards (2nd ed.). Berkeley: New Riders.