We have been discussing what we can learn about visualization from gaming. One broad area is how games use HUDs (Heads-Up Displays). Another is how games use the player's time. Perhaps the most counterintuitive use of time is the postponement typical of pet simulators, including the recently popular (and recently translated) Neko Atsume: Kitty Collector. In pet simulations like the Tamagotchi, the chronotope is not the intense, fast, immersive experience of a first-person shooter, but the slow everyday rhythms and spaces of life. You carry the toy with you and feed your pet in real time. For long stretches you can't do much unless you speed up time. The play lies in sustaining it through small interventions over time. What if we had visualizations that postponed gratification?
Reading Geoffrey's posting on the inaugural talk given by Erik Champion, and his mention of the dimension of 'soundscape' that seems to have been relevant even for the first cave paintings (these paintings tend to be in spots that are acoustically prominent in terms of echo effects and the like), I wonder whether it wouldn't make sense to think of media channels as dimensions. In other words, 3D, which we nowadays automatically equate with topographical three-dimensionality, could actually also be visual 2D plus sound, touch, smell, time, etc. Doesn't an image take on a new dimensionality when it is enhanced by any of these? Why restrict the notion of 'dimension' to the visual axes in the first place?
Erik Champion, author of Critical Gaming: Interactive History and Virtual Heritage, gave the first guest lecture for the 3DH project. Erik was originally trained as an architect and now works in interactive history and digital culture. He has led a number of projects that adapt game engines for cultural heritage. He gave us a great tour of various 3D examples to encourage us to think of games and virtual spaces as visualizations.