Media Technology MSc

Homepage Banner Image

The banner image on our homepage is dynamic: students and staff can create new images. Banner images are shown on the Media Technology homepage for a fixed period of time. A suitable banner image
  • must be exactly 750 x 112 pixels, 96 dpi, GIF or JPG format,
  • must be somehow related to the Media Technology MSc program,
  • must look good on the homepage and be pleasing to the eye,
  • cannot contain (significant) moving elements,
  • cannot be offensive, strange, an advertisement, etc.
Naturally, we reserve the right to accept or reject banner images at our own discretion. To submit a banner image, send it to the Media Technology office, and include
  • your full name,
  • a short (max 45 characters) title for your banner image,
  • a one-paragraph (max 100 words) description/explanation of your image, which may include hyperlinks and will be published on the website,
  • possible start and end dates, if your banner image relates to some event in time, such as an upcoming exhibition of your work.
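For illustration only, the fixed requirements above lend themselves to an automated pre-check. The following is a minimal sketch under my own assumptions (the function name and the word-count reading of the 100-word limit are mine, and "JPEG" is treated as a synonym for JPG; the 96 dpi requirement and the rule against significant moving elements still need a manual or metadata check):

```python
def check_banner_submission(width_px, height_px, image_format, title, description):
    """Check a banner submission against the published requirements.

    Returns a list of problems; an empty list means the submission passes
    these automated checks (editorial approval still applies).
    """
    problems = []
    if (width_px, height_px) != (750, 112):
        problems.append("image must be exactly 750 x 112 pixels")
    if image_format.upper() not in ("GIF", "JPG", "JPEG"):
        problems.append("image must be in GIF or JPG format")
    if len(title) > 45:
        problems.append("title must be at most 45 characters")
    if len(description.split()) > 100:
        problems.append("description must be at most 100 words")
    return problems
```

For example, `check_banner_submission(750, 112, "jpg", "Lowlands study", "A short description.")` returns an empty list, while a 750 x 200 PNG would be flagged on both counts.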

Current Images

Media Technology research team at Lowlands music festival

Media Technology research team at Lowlands music festival, where they studied how stories change as they are passed from one person to another. In three experiments they collected data from festival participants, together with students and PhD candidates.

Read more about this study on the webpage of the Creative Research Lab (CIL), which is the research lab associated with the Media Technology MSc program.

Student discovery: things memorized in virtual reality are easily lost from memory

Context-dependency in memory-recall tasks is a known effect: when you memorize information in one context (such as underwater), you recall the information best in that same context. Student Maik Lanen realized that this had never been studied with virtual reality as a context. In a well-designed experiment, he discovered that things memorized in VR are much more difficult to recall outside VR, typically leading to a 30% drop in items correctly recalled. Maik presented his results at an international conference on VR.

Using fungi and bacteria to grow virtual worlds for gaming

Noticing the similarity between mountain ranges seen from above and the shape of fungal and bacterial cultures, PhD student Wim van Eck set out to create the one using the other. By growing cultures of bacteria and fungi on a consumer-grade flatbed scanner, he generates a real-time feed of landscapes that can be (and were) used as virtual worlds in real-time gaming applications. Read the published paper about this work.

If it smells like coffee, tastes like coffee, and feels like coffee, could it be coffee?

For their exhibition project "Clear is the New Black", students Dagmar Geerlings and Jasper Schelling set out to create coffee that differed in visual form from the brown liquid we know. They managed to recreate the smell, taste and feel of coffee in a fully transparent liquid. Beautifully presented, abstracting even further from the familiar sight of a steaming brew, this project shows the originality and perseverance that our students display.

Does being Superman in Virtual Reality make you stronger?

It is known from prior research (Rosenberg et al. 2013) that experiencing yourself as Superman via virtual reality encourages pro-social behaviour. Basically, it makes you more friendly. This prompted student Maarten Lodewijk to question whether one also becomes stronger. A logical question. Maarten studied this for his master's thesis research project, co-supervised by none other than Robin Rosenberg (USA), author of the aforementioned study.

Effects of Drone Flight Patterns on Foraging Birds, by student Rinus Bot

The rapid increase in the prevalence of unmanned flying drones means that animals will be confronted with them in unexpected ways. The study of interactions between animals and drones should therefore be developed with scalable data gathering in mind. Against this background, student Rinus Bot undertook a case-study experiment involving wild birds and drones. It shows that different bird species react differently to nearby drone flights, and it affirms the usability of the proposed data-gathering method.

Exploring “Exactitudes” portraits with deep learning, by Sam Verkoelen

Deep learning neural networks have shown unprecedented results in concept identification in collections of images. For his graduation project, student Sam Verkoelen applied them to the collection of portraits from the Exactitudes artwork by Dutch artists Ari Versluis and Ellie Uyttenbroek. Through these portraits, the artists attempt to document and identify different subcultures in societies worldwide.

By applying such unsupervised machine learning techniques to the photo collection, Sam aimed to understand the relationship between portraits and series, uncover underlying features and dimensions of subculture imagery, and generate new images. As such, the work provides a useful case example of interpreting the structure uncovered by deep neural networks, as well as a tool for analyzing the underlying structure of a collection of visual artworks, and a very first step towards a robot curator.

“The Augmented Zebrafish”, shadow based physical interaction with complex data

The Augmented Zebrafish project lets users interact with multi-modal images (X-ray, MRI, photo, etc.) of zebrafish development, by physically casting shadows onto a physical model of such a fish. This work by Marcello Gómez-Maureira, Carolien Teunisse and Fons Verbeek was presented at multiple scientific conferences. See the project website.

Detail of “cricket-controlled PacMan” experiment

For his graduation project, student Wim van Eck experimented with animal-controlled "non-player characters" in video games.

Cover detail of “AR[t]” magazine, a Media Technology collaboration

The Media Technology programme cooperates within the Augmented Reality Lab, together with the Royal Academy of Creative and Performing Arts and TU Delft. The lab aims to operate on the boundary between augmented reality research and artistic practice. It publishes the "AR[t]" magazine twice yearly.

Shape-changing material designed by student Alice Bodanzky

For her graduation project, Alice Bodanzky developed a material whose shape can be changed programmatically. Alice applied it to explore the expressive potential of self-actuating surfaces (surfaces with the actuating systems embedded inside them); she wants designers to better understand their properties and possibilities.

Award-winning “Room Racers” mixed-reality game, by Lieven van Velthoven

Student Lieven van Velthoven designed and developed the "Room Racers" mixed-reality car-racing game for his own entertainment, and it became his graduation project. It projects virtual cars among real everyday objects, and lets them interact in real time. The game's strengths are its playability and its extremely fast response to changes in the real world. Room Racers received several awards and nominations, among which the 2011 Dutch Game Award for Best Student Game.

Student project for the “Meta Media” course

For their project within the Meta Media course, students installed a public painting canvas in downtown Leiden (2010). In the Meta Media classes students learn to integrate the choice of a medium into the creative process.

“Ik zie, ik zie wat jij niet ziet” by Casper Schipper and Bastiaan Terhorst

Image from page 26 of the book "Ik zie, ik zie wat jij niet ziet": communicating science to a larger audience. The book discusses perception from three perspectives: biological, philosophical and psychological. Scientific theories are explained in simple terms, placing the emphasis not on the theories themselves, but on how they add to our understanding of perception.

The “Cyclotactor” platform for musical interaction, by Staas de Jong

“Cyclotactor” is a finger-based tactile i/o device for musical interaction, built by PhD student Staas de Jong. It was designed as part of his Media Technology graduation project, and now offers a basis for his PhD research. The device allows for cyclical relations between tactile input and output. Staas studies the possibilities that this tactile closed loop offers for musical interaction.

“Sound Illusion Cube” installation by student Thijs Eerens

Volunteer in the Sound Illusion Cube installation by student Thijs Eerens. It investigates whether people can be spatially disoriented by combining localized sound with physical movement of the body. The subject is blindfolded and seated in a chair whose movement is tightly synchronized with localized sound rotating around the subject. This localized sound is generated by eight speakers, one in each corner of the cube.

Smelly artificial creature, by students Ella Keijzer and Bastiaan Terhorst

“Gouden Regen” is the name of the smelly creature built by students Ella Keijzer and Bastiaan Terhorst. It was made as part of the Artificial Creatures workshop, in which students had to incorporate uncommon aspects of lifelikeness into an artificial creature. This particular creature marks its territory with a pungent smell.

Detail photograph of the “Globe4D” student installation

Detail photograph of the Globe4D installation, by students Rick Companje, Nico van Dijk, Hanco Hogenbirk, and Danica Mast (2005). It is an actual globe that can be freely rotated along all axes to view the world from any direction. And by rotating a ring that surrounds it, the Earth travels through the dimension of time. Continents drift apart or connect, ice ages form and melt before your eyes, sea levels rise and land disappears. The installation was presented at several international scientific conferences and festivals, and has won multiple scientific awards.

Detail photograph of the “Big Mean Steam Machine” student installation

Detail photograph of the Big Mean Steam Machine installation, by students Maarten Bennis, Wim van Eck and Willem van Vliet (2004). The machine consists of one thousand small glass jars, arranged in arrays of five by five. These jars can be filled with a coloured liquid, turning them into ‘pixels’. This way each array of jars can display one character of the SMS alphabet. The jars are filled by a computer-controlled arm that hovers above them. It takes about three hours for one full message to be written.

“PingPongPixel” installation, by Jonathan den Breejen and Marenka Deenstra

Detail photograph of the PingPongPixel installation, by students Jonathan den Breejen and Marenka Deenstra (2005). It is simply a gigantic image display device, built from over 8000 pingpong balls in six different grey tones that together form an image. The display measures roughly 2 by 3 meters, with a resolution of 45 by 60 pixels and a refresh rate of once per 2.5 hours (or about 0.0001 Hertz). PingPongPixel weighs over 500 kg and was exhibited on several occasions around the globe.