Media Technology MSc

Course Material “Perceptualization”

Fall 2016
by Edwin van der Heide & Maarten Lamers

Media Technology MSc program, Leiden University

The term “perceptualization” was coined specifically for this course. It describes the translation of signals and information into modalities that appeal to any of the human senses. As such, it generalizes the terms “visualization” and “sonification” to include all other senses. We study such perceptualizations, with particular focus on how the properties of perceptual systems can be used to optimally convey information. Aspects to consider include perceptual resolution, dealing with time, the psychology of perception, cross-modal effects, the simultaneous conveying of different features, and so on. Theory, practice, history, and examples of information perceptualization are studied and discussed. Lectures are combined with reading homework, student presentations, and a student project. Full attendance is compulsory.

Students from outside the Media Technology MSc program who wish to participate should contact the program coordinator.

Lecturers: Maarten Lamers and Edwin van der Heide
Contact: via perceptualization@liacs.leidenuniv.nl
Level/credits: level 400 (specialized course), 3 EC
Lecture dates: see the Media Technology calendar
Location: room 413 of the Snellius building
Communication: via the Media Technology forum under "course announcements". Check it regularly!
Course vault: the course vault is password protected
Grading: the final grade is composed of homework writing assignment 4 (40%) and the final project (60%), rounded to the nearest valid grade.


Course Schedule


Lecture | Date              | Topic                                         | Homework due
1       | Mon Nov 7         | Course intro & Introducing the senses         | -
-       | Mon Nov 14        | no class (CR instead)                         | -
2       | Mon Nov 21        | Sonification                                  | Assignment 1
-       | Mon Nov 28        | no class (MM instead)                         | -
3       | Tue Dec 6, 10-12h | Psychology of perception & Haptification      | Assignment 2
4       | Mon Dec 12        | Cross-modal perception & Sensory substitution | Assignment 3
5       | Mon Dec 19        | Discussion of project proposals               | Project proposal
6       | T.b.a.            | Project question hour                         | -
7       | Tue Jan 17        | Project presentations                        | Final project & paper


Assignment 1


Compulsory reading:

  • Introduction (pp 1-4) and Chapter 2 (Theory of Sonification) of [Hermann 2011]

Suggested reading:

Assignment 2

Read:

Assignment 3

Read:

Project

In teams of two students, create a perceptualization system. It should make a dataset perceivable and interpretable by mapping dimensions (or qualities) of the data onto dimensions (or qualities) of the human sense(s). How you map these dimensions onto each other is the most important factor in the evaluation/grading of your work.

Choose one of the provided datasets to perceptualize (download here):

  1. offensive statistics from the 2008 USA Major League Baseball season
  2. match statistics (both women's and men's) from four major tennis tournaments in 2013
  3. tennis players' actions (non-hit, hit, serve)
  4. one-dimensional Game-of-Life data (includes Processing source code)
  5. daily-averaged measurements of 15 weather-related variables in De Bilt, NL (n=2522)
  6. very simple simulation of light-sensitive Euglena cells (Processing source code)
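
Purely as an illustrative sketch (not a required approach, and not part of the assignment itself), the Python fragment below shows what a minimal parameter-mapping sonification could look like: one numeric column of a CSV file is mapped linearly onto the pitch of successive sine tones and written to a WAV file. The file name "weather.csv" and the column name "temperature" are assumptions made for illustration only; they loosely refer to dataset 5 above, but do not describe its actual structure.

    # Illustrative parameter-mapping sonification (Python 3, standard library only).
    # Assumptions: a CSV file "weather.csv" with a numeric column named "temperature";
    # each row becomes a short sine tone whose pitch follows the data value.
    import csv, math, struct, wave

    def load_column(path, column):
        with open(path, newline="") as f:
            return [float(row[column]) for row in csv.DictReader(f)]

    def sonify(values, out_path="sonification.wav", rate=44100, note_dur=0.15):
        lo, hi = min(values), max(values)
        frames = bytearray()
        for v in values:
            # Map the data range linearly onto a pitch range of 220-880 Hz.
            freq = 220.0 + (v - lo) / ((hi - lo) or 1.0) * 660.0
            for i in range(int(rate * note_dur)):
                sample = int(32767 * 0.5 * math.sin(2 * math.pi * freq * i / rate))
                frames += struct.pack("<h", sample)
        with wave.open(out_path, "wb") as w:
            w.setnchannels(1)   # mono
            w.setsampwidth(2)   # 16-bit samples
            w.setframerate(rate)
            w.writeframes(bytes(frames))

    sonify(load_column("weather.csv", "temperature"))

A direct value-to-pitch mapping like this is only one of many possible choices; motivating why a given data dimension belongs on a given perceptual dimension is what the project is evaluated on.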

If you wish to use another dataset, you must propose it in the lecture of Mon Dec 19 and have it explicitly accepted by the lecturers.

You can write software to perceptualize the dataset, or create a more "static" perceptualization for, for example, taste or smell. If you are technically capable, you could build a device to tactify the dataset. Visualization is not accepted.

Demonstrate your system to the class in under 5 minutes, in such a way that others can experience it. If your system involves coding, make it possible to feed another dataset (with the exact same structure) into your system, so that it can be demonstrated on other data. Besides the demonstration, explain the mapping/translation that you made, and give your system a title.

In a paper of max 750 words, describe your system, the mapping, and language/interpretation issues such as strengths and weaknesses. Send your paper in PDF format to perceptualization@liacs.leidenuniv.nl before the presentation session.

Evaluation criteria include:

  • does the work match the assignment?
  • does the work achieve the goals of the intended translation? (important)
  • is the outcome interpretable? (important)
  • how are the choices that you made motivated?

Bibliography

For copyright reasons, some of the articles can be downloaded only from within the university's network.

[Ando] Hideyuki Ando, Tomofumi Yoshida, Taro Maeda, and Junji Watanabe (2007), Save YourSelf: Galvanic Vestibular Stimulation Interface, website
[Bach-y-Rita] Paul Bach-y-Rita and Stephen W. Kercel (2003), Sensory Substitution and the Human-Machine Interface, Trends in Cognitive Sciences 7(12), December 2003, pp. 541-546. See also: Paul Bach-y-Rita and Neuroplasticity, PBS Wired Science short documentary about the work of Bach-y-Rita, December 2007; and Paul Bach-y-Rita - Neuroplasticity, BBC documentary segment, exact source unknown, est. 2010.

[Barrett] Natasha Barrett (2016), Interactive Spatial Sonification of Multidimensional Data for Composition and Auditory Display, Computer Music Journal 40(2), pp 47-69
[Benali-Khoudja] Mohamed Benali-Khoudja, Moustapha Hafez, Jean-Marc Alexandre, and Abderrahmane Kheddar (2004), Tactile Interfaces: a State-of-the-Art Survey, 35th International Symposium on Robotics, Paris, March 2004
[Brewster] S.A. Brewster and L.M. Brown (2004), Tactons: structured tactile messages for non-visual information display, Australasian User Interface Conference, 18-22 January 2004, ACS Conferences in Research and Practice in Information Technology, Vol 28, pp. 15-23
[Cain] WS Cain, R de Wijk, C Lulejian, F Schiet, and L-C See (1998), Odor Identification: Perceptual and Semantic Dimensions, Chemical Senses Vol 23, pp 309-326
[Cassinelli] Alvaro Cassinelli, Carson Reynolds, and Masatoshi Ishikawa (2006), The Haptic Radar; Extended Skin Project
[Chouvardas] VG Chouvardas, AN Miliou, and MK Hatalis (2008), Tactile Displays: Overview and Recent Advances. Displays 29(3), pp 185-194
[Elsenaar] Arthur Elsenaar and Remko Scha (1997), Arthur & The Solenoids, video on youtube.com
[Fisher] Madeline Fisher (2007), Balancing Act (about the work of Paul Bach-y-Rita and Mitch Tyler), On Wisconsin magazine, Spring 2007
[Gilbert] Avery N. Gilbert, Robyn Martin, and Sarah E. Kemp (1996), Cross-modal Correspondence Between Vision and Olfaction: The Color of Smells, The American Journal of Psychology 109(3), pp. 335-351
[Heimbecker] Steve Heimbecker (2003), Wind Array Cascade Machine
[Hermann 1999] T. Hermann and H. Ritter (1999), Listen to your Data: Model-Based Sonification for Data Analysis, In Advances in Intelligent Computing and Multimedia Systems, G.E. Lasker (Editor), International Institute for Advanced Studies in System Research and Cybernetics, pp. 189–194
[Hermann 2011] T Hermann, A Hunt, and JG Neuhoff (Eds.), The Sonification Handbook, Logos Verlag Berlin GmbH, 2011
[Ho 2014] Hsin-Ni Ho, Daisuke Iwai, Yuki Yoshikawa, Junji Watanabe and Shin'ya Nishida (2014), Combining Colour and Temperature: A Blue Object is More Likely to be Judged as Warm Than a Red Object, Scientific Reports 4:5527
[ICAD] International Community for Auditory Display, www.icad.org
[Infosthetics] Information Aesthetics website, infosthetics.com
[Irving] Lucy Irving, Sensory Substitution project, www.sensorysubstitution.co.uk
[Kaye] Joseph Kaye (2004), Making Scents: Aromatic Output for HCI, Interactions 11(1), Jan/Feb 2004
[Loftin] R. Bowen Loftin (2003), Multisensory Perception: Beyond the Visual in Visualization, Computing in Science and Engineering 5(4), pp. 56-58, Jul/Aug 2003
[Noë] Alva Noë and Evan Thompson (2002), Vision and Mind: Selected Readings in the Philosophy of Perception, MIT Press
[Robles-De-La-Torre] Gabriel Robles-De-La-Torre (2006), The Importance of the Sense of Touch in Virtual and Real Environments, IEEE Multimedia, July-September 2006, pp 24-30
[Salisbury] K Salisbury, F Conti, and F Barbagli (2004), Haptic Rendering: Introductory Concepts, IEEE Computer Graphics and Applications, March/April 2004, pp 24-32
[Sturm] Bob L. Sturm (2002), Surf Music: Sonification of Ocean Buoy Spectral Data, Proceedings of the International Conference on Auditory Display, Kyoto (Japan), July 2002, http://imi.aau.dk/~bst/publications/Sturm2002.pdf
[Turin] Luca Turin (2005), Luca Turin on the Science of Scent, talk on TED.com, February 2005
[Warwick 2004] Kevin Warwick and Mark Gasson (2004), Practical Interface Experiments with Implant Technology, in "Computer vision in human-computer interaction: ECCV 2004 Workshop on HCI", Nicu Sebe, Michael Lew, Thomas Huang (eds), LNCS 3058, May 2004, pp. 7-16.
[Warwick 2005] Kevin Warwick, Mark Gasson, B Hutt, and I Goodhew (2005), An attempt to extend human sensory capabilities by means of implant technology, IEEE Int Conf on Systems, Man and Cybernetics, October 2005, pp. 1663-1668
[Warwick 2008] Kevin Warwick (2008), Upgrading Humans via Implants - Why Not?, 19: Interdisciplinary Studies in the Long Nineteenth Century, Issue 7 "Mind, Body, Machines", October 2008
[Washburn] Donald A. Washburn and Lauriann M. Jones (2004), Could Olfactory Displays Improve Data Visualization?, Computing in Science and Engineering 6(6), pp. 80-83, Nov/Dec 2004
[xSonify] Introduction of Sonification, Chapter 1 of the documentation for the xSonify sonification software, NASA, http://spdf.gsfc.nasa.gov/research/sonification/documents/Chapter1.pdf