
1. Geo-Temporal Argumentation: Whether created with traditional GIS tools (such as ESRI's ArcGIS), 3D visualization applications (such as Maya or Google SketchUp), or basic KML editors (such as Google My Maps), all scholarship published in HyperCities is parsed as KML (and can also be exported as such). KML is now widely recognized as the standard of choice for the geo-spatial web and has a robust developer community. KML files can also be viewed in any geo-browser, including Google Maps/Earth, Microsoft Virtual Earth, and NASA World Wind.
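To make the format concrete, the following is a minimal sketch of what a geo-temporal KML record looks like and how it can be read programmatically. The placemark content (name, year, coordinates) is hypothetical, invented here for illustration; only the KML element names and namespace come from the KML 2.2 standard.

```python
import xml.etree.ElementTree as ET

KML_NS = "http://www.opengis.net/kml/2.2"

# A hypothetical, minimal KML document with one time-stamped placemark,
# roughly the shape of an object in a geo-temporal collection.
kml_doc = f"""<kml xmlns="{KML_NS}">
  <Document>
    <Placemark>
      <name>Potsdamer Platz</name>
      <TimeStamp><when>1928</when></TimeStamp>
      <Point><coordinates>13.3759,52.5096,0</coordinates></Point>
    </Placemark>
  </Document>
</kml>"""

def placemarks(kml_text):
    """Return (name, when, lon, lat) tuples for each Placemark in a KML string."""
    root = ET.fromstring(kml_text)
    ns = {"kml": KML_NS}
    out = []
    for pm in root.iter(f"{{{KML_NS}}}Placemark"):
        name = pm.findtext("kml:name", namespaces=ns)
        when = pm.findtext("kml:TimeStamp/kml:when", namespaces=ns)
        coords = pm.findtext("kml:Point/kml:coordinates", namespaces=ns)
        # KML coordinates are longitude,latitude,altitude
        lon, lat, _ = (float(c) for c in coords.split(","))
        out.append((name, when, lon, lat))
    return out

print(placemarks(kml_doc))
```

Because the file is plain XML conforming to a published schema, the same record can be opened unchanged in any of the geo-browsers mentioned above.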

2. Publications for the World of Web 2.0: Unlike traditional monographs, which tend to be single-authored, fixed, discrete print publications, "digital cultural mappings" are often collaboratively produced, interactive, iterative, and hypermedia in format. Humanists work with technologists and designers to create the digital files, which are then made available to end-users, who can view, navigate, and even contribute to or manipulate the KML files within HyperCities. Navigation can be either curated by the author or free-form, should the user so choose (comparable to public notes in the margin). Scholarly content co-exists and even intermingles with community-generated content, allowing new interactions between traditionally separated venues without compromising the integrity of either individual collection. And, finally, the scholarship is iterative—that is to say, it can be expanded, changed, and revised at the will of the author. Since the KML file is delivered as a live network link (rather than as a downloadable file), new versions are instantly accessible to the viewing public.
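The live-link delivery described above corresponds to KML's standard NetworkLink element, which tells a geo-browser to fetch (and periodically re-fetch) a collection from a server rather than from a static file. A minimal sketch follows; the URL is hypothetical, but the element names and refresh settings are standard KML 2.2.

```xml
<kml xmlns="http://www.opengis.net/kml/2.2">
  <NetworkLink>
    <name>Example collection (hypothetical URL)</name>
    <Link>
      <!-- The geo-browser re-requests this URL on an interval,
           so revisions by the author appear without redistribution. -->
      <href>http://hypercities.example.org/collections/1234.kml</href>
      <refreshMode>onInterval</refreshMode>
      <refreshInterval>3600</refreshInterval>
    </Link>
  </NetworkLink>
</kml>
```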

3. Hypermedia: Unlike books, which rest upon the linearity of print, strict pagination, and the limits of select illustrations, the publications within HyperCities are truly hypermedia maps that include unique narratives, troves of illustrations, sound, cartographic renderings, 3D models, and relevant datasets. The user can choose to follow a specific pathway or to access the material according to his or her own organizational ideas and needs. There is an infinite number of possible routes through the material.

4. Scholarly Rigor and Peer Review: The emphasis is placed upon geo-temporal argumentation, interpretation, and critique. These studies are not simply about space and time; rather, they are part of a cartographic visualization engine that uses representations of space and time to make the argument. As with traditional publications, scholarly rigor and peer review are critical for success as well as for the wider acceptance of digital publications inside and outside the academy. Cognizant of the criteria for evaluating new media publications developed by the MLA, HASTAC, the University of Maine, and numerous other institutions, our review process for publications within HyperCities and for the development of the platform itself asks questions such as the following:

  • Does the work present and advance an original argument that could not be made as effectively in a single medium or in a traditional print publication?
  • Is the mode of navigation (and kinetic "sign posting") appropriate for the argument?
  • Does it make effective use of hypermedia elements to strengthen the argument?
  • Can the publication be deployed and enhanced by putting it in new contexts or in new digital environments with similar projects?
  • Is it extensible and iterative (i.e., can it continue to grow as more research is done either by the author or other people)?
  • Is it collaborative? Is there a participatory dimension beyond clicking on icons?
  • How does the scholarship support HyperCities' federative (non-silo based) approach to scholarly publishing?
  • Does it allow the audience to see new connections and make new discoveries that would not be possible otherwise?
  • Does the publication engage a wide cross-section of audiences (across disciplines in the academy as well as in the community)?

Challenges to sustainability

  1. The first challenge to sustainability is that the application itself is in (what seems to be) perpetual beta. New functionalities and new design features are still being developed based on user needs and demands, and these developments sometimes introduce new bugs. While we would like to release a HyperCities API within the next year or two in order to allow others to develop on our code, we have not reached the level of stability necessary for a public release. Iterative development of both the application and the content has become a standard feature of HyperCities, which sometimes means that content developed for one version is not easily transferable to the next. To minimize such problems, we have focused our development on an open-source, standards-compliant browser (Firefox), made all data available in KML, the standard of choice for geo-temporal mark-up, and stored all data with a standard character encoding in a MySQL database.
  2. Processing Power and Managed Growth: Our current production environment consists of two Dell PowerEdge R905 servers running Linux, each with 32 GB of RAM and 5 TB of storage. One server functions as the primary web and database server; the other functions as the map server and runs Apache, PHP, and a custom-built map tile generator. The servers are maintained and backed up at UCLA as part of the Academic Technology Services center, which provides staff support and technology infrastructure for large-scale projects. These servers were purchased and configured through external grants; additional grants will be necessary to cover the costs of maintenance and upgrades. Clearly, one challenge for the sustainability of a project of this scope is the cyberinfrastructure to support it: servers with adequate processing power and storage capacity as well as the staff support to maintain them. Our current production environment is adequate for our current user base, but what happens if we have ten thousand or one hundred thousand users? A clear challenge is how to manage sustainable growth, particularly as the project becomes more and more distributed in scope and scale. With regard to content, we have considered a cloud computing solution (through Amazon or Google) should speed and storage become compromised in our current environment; however, we have so far decided against this solution because the data disappears once the lease is over, and we have no way of guaranteeing an indefinite lease with a commercial company. Finally, a further challenge of sustainability has to do with managing the sheer number of collaboration requests from external institutions, museums, archives, and other community groups: as the user base grows, so too does the number of collaboration requests.
While we are certainly delighted by the international interest in the project and the possibility of new collaborations, it has also become clear that—without a full-time complement of project representatives and programmers—we cannot possibly vet and pursue every request that comes in.
  3. Institutional Support: Even though projects like HyperCities are not "boutique" projects (but rather offer common solutions for faculty working in numerous fields and teaching many different classes, ranging from archaeology and classics to history, architecture, literature, and cultural studies), digital media projects are still treated as such, reflecting the individual needs and ambitions of the faculty directors. These projects certainly benefit from institutional cyberinfrastructure, but they are not yet an integrated part of this cyberinfrastructure. This, in my opinion, is the crux of the matter: How do digital research projects that offer common solutions for advancing and publishing scholarship become part of the institutional cyberinfrastructure of the campus, or, for that matter, of the cyberinfrastructure of inter-institutional networks? This would require the university to invest in targeted projects as part of its own mission-critical investment in infrastructural support for research, teaching, and service. Here, I entirely agree with Kathleen Fitzpatrick's eloquent account that digital projects must be seen as playing "an indispensable role in the university’s mission,...[such that] scholarly publishing units must be treated as part of the institution’s infrastructure, as necessary as the information technology center, as indispensable as the library [and other] service-oriented organizations increasingly central to the mission of the twenty-first century university." Kathleen Fitzpatrick, Planned Obsolescence: Publishing, Technology and the Future of the Academy (New York: NYU Press/MediaCommons Press). Accessed online at: (External Link)
  4. Dependence on Google APIs: The HyperCities project benefits tremendously from the Google Maps and Earth APIs, allowing us to design a scholarly research, teaching, and publication environment around "digital cultural mapping" without the need to pay licensing fees for world satellite imagery, develop a 2D or 3D earth browser, or re-create many of the system functionalities that Google has already developed for producing and annotating maps and integrating data. However, the API key can be turned off by Google at any time, and periodic updates to the Google APIs, although often resulting in new functionalities, still require some programming on our end. The question is how our reliance on commercial companies impacts the project's development and long-term sustainability. Such reliance will need to be negotiated over and over again as commercial enterprises (as well as certain nonprofit ones, such as Wikipedia) start to play central roles in shaping the academic mission of the university and the stewardship of knowledge in the twenty-first century.
  5. The Learning Curve: It is one thing to upload content or create a map in the geo-temporal environment of HyperCities, something that amounts to a fairly straightforward process of geo-locating objects, assigning time stamps, and coordinating or authoring a collection. But it is something else to conceive of a multi-dimensional argument within HyperCities as something that could only be imagined through hypermedia "time-layers" and experienced through time-space navigation. To be sure, there are shades of gradation connecting the two, and both bespeak a learning process that is not common in traditional scholarly output: choices about design, organization of collections (both hierarchical and synchronic), creation and deployment of media objects in 2D and 3D environments, use of base-maps, openness to user-generated content, kinetic guideposts and navigation decisions, symbology, creation of network links to distributed content, and multimedia authorship all become critical questions in the organization of the publication. Needless to say, this is not something that can be learned in a single afternoon or something for which there exist long-standing precedents. The questions raised for sustainability run as follows: What kind of institutional support is necessary to help scholars develop, design, test, and deploy such arguments? How are they iteratively versioned, refined, and maintained over time? What does it mean to place decisions about design, functionality, and presentation on the same level as decisions about content, sources, and argumentation (that is to say, to show that design decisions are already argumentations)?
  6. Distributed and Fungible Content: The goal of HyperCities, as mentioned earlier, is not to become a meta-repository but rather to behave as the connective tissue between interlinked archival resources (such as historical maps, photograph collections, 3D models, oral histories, and other digital assets) as well as interlinked community resources (such as YouTube videos, Flickr photostreams, Tweets, user-generated maps, and other privately produced and uploaded materials). While the distributed nature of the content allows for the development of an ever-expanding and ever-changing network of resources that can be linked together, such a network is always fragile, and even minor changes in metadata standards or query and display protocols can cause disruptions in the system. The issue for sustainability concerns the implementation and maintenance of shared metadata standards and the development of a set of best practices that all contributing archives can easily follow. While archival content hosted at institutions tends to be fairly reliable (vetted content accessed by a permalink, with metadata formatted according to standards), content on commercial servers is extremely fungible: here today, gone tomorrow. This is especially true when the content is created across the world, uploaded to YouTube, and variously embedded in user-created maps and other collections. Despite the fungibility and sometimes even unreliability of this content, HyperCities has pursued a strategy of opening up participation as broadly as possible, even if this means that not all content is "archive-ready" or "archive-quality."
To this end, we have simultaneously pursued vetted content (authorized by institutions and peer review) and public content, in which virtually anything goes: Anyone can create a public or private collection and start adding material right away; however, only vetted contributions become "featured collections" or are accepted as "partner collections" by our editorial board.
  7. This raises the perennial sustainability question of what's worth saving and what's not. To be sure, not everything in HyperCities should be copied and saved in perpetuity. HyperCities is currently at an experimental stage of development with digital publications in an environment that is rapidly changing. I strongly believe that it would be a mistake to short-circuit this process of experimentation, which inevitably entails the casting about for new scholarly models, interdisciplinary methodologies, hybrid forms of media content, and alternative modes of authorship. It's not clear to me that we are currently able to answer the question of what's worth saving and what's not. Instead, we should facilitate an open environment for experimentation and risk-taking, knowing that some projects and platforms will fail or, at least, need to be radically reconceived or even abandoned. At this point in the development cycle of HyperCities, I am interested in seeing what can be done within this platform (and, of course, what cannot be), such that some projects will push us further to develop the platform, while other projects will fall short. The primary question, at this stage, is not so much "what should or should not be saved or preserved" (a question of selection) but rather "what can and cannot be thought" (a question of imagination).
  8. This raises a related question, namely how one documents this process of experimentation and how one preserves an experiential, hypermedia environment. Many of the most interesting collections within HyperCities cannot be easily "translated" into traditional media formats without much loss, because such translation results in the decontextualization of the objects, removing them from their time/space junctures and stripping them of the rich interactivity (or potential for interactivity) that they have within the HyperCities environment. Not only is the whole greater than the sum of the parts (for example, the Tehran election protests collection), but the whole in the context of other wholes is greater still. That's because the collections are meant to be navigated, and navigation depends on the choices that users make for how they want to move through the materials or, for that matter, curate and create new material. Before we can make firm determinations about what to preserve, it would probably be worth documenting and beginning to historicize the contemporary experimentations in scholarly publishing. How, for example, does one preserve a scholarly environment, even one that only exists for a few years? How do such environments inform longer-term shifts and developments in knowledge production and scholarly publishing?
  9. Finally, there is the larger social and institutional issue of legitimation of this kind of scholarship. It places a high demand on the "reader" to invest the necessary time to traverse the collection in ways that were both intended and perhaps unintended by the author/curator. Certainly, the imprimatur of a university press would go a long way to legitimizing the scholarship (at least in the minds of many in the academy) and also recognizing the vitally generative nature of these kinds of publications and publication environments. But this kind of recognition is also risky since business models need to be rethought, the editing and design process needs to be entirely reconceived, and traditional distribution channels can no longer be pursued (at least not in themselves). What this really entails is not only a fundamental rethinking of how knowledge gets designed and created, but also a fundamental rethinking of what knowledge looks and sounds like, who gets to create and interact with knowledge, when it is "done" or transformed, how it gets authorized and evaluated, and how it is made accessible to a significantly broader (and potentially global) audience. The twenty-first century university and university press have the potential to generate, legitimate, and disseminate knowledge in radically new ways, on a scale never before realized, involving technologies and communities that rarely (if ever) were engaged in a global knowledge-creation enterprise. We are just starting to understand and leverage that potential, and the question for me is how to sustain (and not short-circuit) this critical process of experimentation and risk-taking.

Source: OpenStax, Online Humanities Scholarship: The Shape of Things to Come. OpenStax CNX. May 8, 2010. Download for free at http://cnx.org/content/col11199/1.1