I’ve been trying to define what kind of scholar I am for five years now. My answer remains fluid — a bit more like Silly Putty now, but not yet firm like concrete — perhaps to the dismay of my current adviser. The journey of discovery is a more finely honed process than initially expected. Arriving in grad school, I simply thought I’d be trained to become a scholar — you know, like every other scholar. Of course, I quickly learned that this involved a game of Twister, placing hands and feet on established fields, theoretical perspectives, and myriad schools of thought, as well as playing tug-of-war with my own critical insights, situated knowledges, and bees in various bonnets. Thankfully, my first cohort (at UI-Chicago) happened to be one that landed in front of Kevin Barnhurst for class one, semester one: Philosophy of Communication.
This month saw publication of The Oxford Handbook of Music & Virtuality, containing my chapter, "Hatsune Miku, 2.0Pac and Beyond: Rewinding and Fast-forwarding the Virtual Pop Star."
In it, I survey a history of virtuality in pop music stars, from the Chipmunks and the Archies up to Gorillaz and Dethklok — many of the non-corporeal, animated characters that presaged current virtual pop stars like Hatsune Miku and the Tupac resurrection.
When researching and writing about (or designing and producing) hologram simulations, there’s always an initial coming-to-terms with the terms.
When I analyzed the discourses of simulation designers, nearly all of them made some attempt to square and/or pare the language of their field. Designers and artists usually opened interviews with this, eager to make sure I understood that while we call these things “holograms” they’re not actual holography. “The words ‘hologram’ and ‘3D,’ like the word ‘love,’ are some of the most abused words in the industry,” one commercial developer told me. Michel Lemieux at Canada’s 4D Art echoed a common refrain: “A lot of people call it holography. At the beginning, 20 years ago, I was kind of always saying, ‘No, no, it’s not holography.’ And then I said to myself, ‘You know, if you want to call it holography, there’s no problem.’” In my own talks and presentations, I’ve let go of the constant scare-quotes. The Tupac “hologram” has graduated to just being a hologram.
It gets stickier when we begin parsing the myriad and important differences between virtual reality (VR) and augmented reality (AR). Many of us think we have an understanding of both, largely as a result of exposure to special effects in movies and TV — where the concept of a hologram underwent its most radical evolution, from a mere technologically produced semi-static 3D image to a computer-projected, real-time, fully embodied and interactive communication medium — but it’s AR people usually grasp more than VR. They’ll say “virtual reality,” but they’ll describe Princess Leia’s message, the haptic digital displays in “Minority Report,” or the digital doctor on “Star Trek: Voyager.” None of these is VR, in which the user dons cumbersome gear to transport her presence into a world inside a machine (think William Gibson’s cyberspace or jacking into “The Matrix”); they are AR, which overlays digital information onto existing physical space.
Yet both VR and AR refer to technologies requiring the user to use some sort of eyewear — the physical reality-blinding goggles of Oculus Rift (VR) or the physical reality-enhancing eye-shield of HoloLens (AR). Volumetric holograms — fully three-dimensional, projected digital imagery occupying real space — remain a “Holy Grail” (see Poon 2006, xiii) in tech development, and we may need a new term with which to label that experience. One developer just coined one.
The film “2010” — the 1984 sequel to the vaunted “2001” adaptation from ’68 — opens with its protagonist facing a huge decision: whether or not to embark on a long mission fraught with danger, prone to failure, and a threat to his marriage. He soon wakes up far from home in a bewildering technical environment among a cohort that speaks a different language. They struggle to collaborate on their first project, a research mission in which they find something unexpected, some groundbreaking new knowledge. Then their computer crashes and erases all the new data.
I see it now. It’s a movie about grad school.
Life is great like this: I spent an afternoon this week unpacking the remainder of my library (delayed, as often happens, months after moving in), and basking in the intense comfort of having treasured volumes once again within reach; then, I sat down with a well-earned cocktail and opened Illuminations, a collection of Walter Benjamin essays recently added to my to-read shelf — and what to my wondering eyes should appear but the anthology’s first selection: “Unpacking My Library.”
The Tao that can be explained is not the enduring and unchanging Tao.
— Lao Tzu
Before beginning my graduate communication studies, I knew I was entering a conflicted field. The fact that every scholar I’ve spoken to or studied with defines communication slightly differently and cites different theoretical perspectives is exciting, not daunting — and, surprisingly, not that confusing. It is large, this field; it contains multitudes. Translation: there’s still much to be done — more than ever, now that the communication of information is a vaunted pillar of modern society — so come on aboard.
Thus, a new missive questioning the standing, ambition and overall health of communication scholarship — “Communication Scholars Need to Communicate” by USC Annenberg’s dean, the earnest Ernest J. Wilson III — is merely the latest in a long series of semi-perennial glances toward our brainy navels. The field, it seems, is still fermenting.
I'm THOMAS CONNER, communication researcher and culture journalist.