24 October 2014

In the Literature: Flexible Surfaces & Displays

After further thought, my group have decided to explore flexible surfaces and displays. We did not feel that the wearable ring idea offered enough scope to make a really good interactive device.

In recent years, there has been interest in the research community in novel displays that can shape-shift. Our idea is to build a system that can project images onto a flexible material such as paper, even when the paper is bent. The aim, though, is not the display itself but to design new ways of interacting with these curved surfaces. For instance, if a user bends the paper into a cylinder, a map can be visualised on the surface, and the user can interact with it by swiping across the cylinder or rotating it.
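As a rough sketch of how the cylinder interaction might work, the code below maps a detected rotation of the paper cylinder to a horizontal pan of the projected map. The function name, the angle-to-pan mapping, and all values are our own assumptions for illustration, not taken from any paper:

```python
import math

def cylinder_rotation_to_pan(prev_angle_rad, new_angle_rad, cylinder_radius_mm):
    """Convert a detected rotation of the paper cylinder into a horizontal
    pan distance (in mm) for the projected map. Hypothetical mapping."""
    delta = new_angle_rad - prev_angle_rad
    # Wrap the difference into (-pi, pi] so a small backwards twist is not
    # mistaken for a near-full forwards rotation.
    delta = math.atan2(math.sin(delta), math.cos(delta))
    # Arc length travelled on the cylinder surface = radius * rotation angle.
    return cylinder_radius_mm * delta

# Rotating a 40 mm cylinder by 90 degrees pans the map by ~62.8 mm.
pan = cylinder_rotation_to_pan(0.0, math.pi / 2, 40.0)
```

Tying the pan distance to surface arc length would make the map feel "stuck" to the paper, which seems like the most direct interaction metaphor.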

The rest of this blog post explores some of the literature in this field.

"Flexpad: Highly Flexible Bending Interactions for Projected Handheld Displays", J. Steimle et al. CHI 2013

In this paper, a flexible interactive system is built from a depth camera (a Kinect) and a projector. The system detects and analyses deformations of the surface in real time, which allows it to project a correspondingly adjusted image, also in real time.
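To see why the projected image needs adjusting at all, here is a toy one-dimensional version of the problem: when the paper bends, adjacent projector columns land on a longer stretch of surface, so the image must be resampled along the surface's arc length. This is our own simplified illustration, not Flexpad's actual algorithm:

```python
import math

def arc_lengths(depth_profile, dx=1.0):
    """Cumulative arc length along a 1-D depth profile of the bent paper,
    sampled at horizontal spacing dx."""
    lengths = [0.0]
    for i in range(1, len(depth_profile)):
        dz = depth_profile[i] - depth_profile[i - 1]
        lengths.append(lengths[-1] + math.hypot(dx, dz))
    return lengths

def resample_columns(image_row, depth_profile):
    """For each projector column, pick the source column whose arc-length
    position matches it, so the image appears undistorted on the paper."""
    s = arc_lengths(depth_profile)
    total = s[-1]
    out = []
    for si in s:
        # Map the arc-length position back to a source column index.
        col = round(si / total * (len(image_row) - 1))
        out.append(image_row[col])
    return out

# Flat paper: no depth change, so columns map straight through unchanged.
flat = resample_columns([10, 20, 30, 40], [0.0, 0.0, 0.0, 0.0])
```

The real system has to do this over a 2-D surface with a full deformation model, but the arc-length idea is the same.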

An algorithm is presented that can be used to capture complex deformations in high detail and in real-time. This sort of algorithm could be used when we come to implement our interactive device. Our use cases are different to the ones described in the paper so we would have to adjust the algorithms to work with our use cases. However, this paper provides a good starting point.

"Novel user interaction styles with flexible/rollable screens", S. Nagaraju CHItaly 2013

This paper presents a concept for a device with a rollable display and novel interactions. The proposed display has multiple modes, including a full-fold mode, a fragment mode, and a stand mode. Using sensors built into the display (such as a gyroscope), the device determines which mode it is currently in, and from this which parts of the display the user can see (i.e. the exposed sides). The UI is then adjusted to suit the current display mode.
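A minimal sketch of this kind of mode detection might classify sensor readings against thresholds. The mode names come from the paper, but the choice of inputs (fold angle and tilt) and every threshold below are our own guesses:

```python
def classify_mode(fold_angle_deg, stand_tilt_deg):
    """Hypothetical display-mode detection from two sensor readings:
    the fold angle of the screen and the tilt of the device.
    All thresholds are illustrative assumptions, not from the paper."""
    if fold_angle_deg < 30:
        return "full-fold"   # display folded almost shut
    if fold_angle_deg < 150:
        # Partially folded: propped upright reads as stand mode,
        # otherwise only a fragment of the screen faces the user.
        return "stand" if stand_tilt_deg > 45 else "fragment"
    return "flat"            # fully unrolled

mode = classify_mode(100, 60)   # "stand"
```

A real device would likely fuse several sensors and smooth the readings over time to avoid flickering between modes.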

The paper then goes on to describe new user interfaces that can exploit the new display modes. This is something my group can think about for our own interactive device - how should the interface change for each form factor?

Our own idea is different from that explained here because we are using a projector that projects onto paper whereas this uses a dedicated display device.

Other papers:

PyzoFlex: Printed Piezoelectric Pressure Sensing Foil [2012]
  1. Used piezoelectric sensors to detect pressure and temperature

FlexSense: A Transparent Self-Sensing Deformable Surface [2014]
  1. This paper builds on the PyzoFlex work above, which is based on piezoelectric sensors.
  2. The authors used machine learning to detect the shape of the deformed surface in real-time.
  3. Applications described in the paper include a transparent smart cover for tablets and an external high-DoF (degrees of freedom) input device.
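As a toy stand-in for that machine learning step, a nearest-neighbour lookup can match a vector of sensor values to the closest previously recorded deformation. This is only a sketch of the general idea; the sensor values and shape labels below are invented, and FlexSense's actual models are more sophisticated:

```python
def nearest_shape(sensor_reading, labelled_examples):
    """Toy 1-nearest-neighbour classifier: return the label of the
    recorded example whose sensor vector is closest (squared Euclidean
    distance) to the current reading. All data here is made up."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(labelled_examples, key=lambda ex: dist2(ex[0], sensor_reading))[1]

# Hypothetical recorded (sensor vector, deformation label) pairs.
examples = [
    ([0.0, 0.0, 0.0], "flat"),
    ([0.9, 0.1, 0.0], "corner bend"),
    ([0.5, 0.5, 0.5], "full curl"),
]
shape = nearest_shape([0.8, 0.2, 0.1], examples)   # "corner bend"
```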
