Rather than developing a peripheral device, such as a haptic feedback glove, that responds to handling material in CGI, I began by thinking about the material itself as a starting point.
I thought about clay and how humans have moulded it ever since first digging it from the ground, forming the models they held in their minds.
I'm sure we've all had the experience of making 'mud pies' in a field or coiling clay to make a pot...
Thousands of years ago, humans formed goddess figurines, which can be found in every part of the globe.
And if you are lucky enough to have worked in animation, you may have formed small objects from animation clay to build every aspect of a world: puppets, all manner of props, and even the set itself can be made from clay.
If you haven't seen any of Adam Elliot's work, I'd recommend checking it out. Mary and Max is an epic stop-motion animation.
If, like me, you have experienced a disconnect when using CGI modelling tools, you will have thought about the contrast between modelling in CGI and modelling with clay. Maybe you've tried ZBrush or 3ds Max/Maya. You will have thought about the bridge between physical materials and their virtual representations on screen.
Well, when I explored this in my research, I was looking for ways to bridge that disconnect.
I asked whether the tension was one of:
- Cultural values and beliefs
- Material interaction
- Peripheral Device Design
- Interface Design
- Something Else...
However, when I woke up this morning, I began to wonder whether it would be possible to wrap nanoparticles around each particle of clay so that it could transmit information about where it sits in space.
Moulding the clay would shift each particle's XYZ position, and that information could be represented on screen.
The resulting form could then be printed using a 3D printer or simply kept as a representation on screen.
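To make the idea a little more concrete, here is a minimal sketch of the software side, under the assumption that each tagged particle can report its XYZ position. Everything here is hypothetical: the particle positions are simulated, the `pinch` function is a crude stand-in for a thumb pressing into the clay, and the output is an ASCII PLY point cloud, a format many 3D tools and print pipelines can read.

```python
# Hypothetical sketch: treat each tagged clay particle as a point whose
# XYZ position is reported by its nanoparticle wrapper. A real system
# would read these positions from hardware; here they are simulated.
import math
import random


def sample_ball(n, radius=1.0, seed=42):
    """Simulate n particle positions inside a ball of clay."""
    rng = random.Random(seed)
    points = []
    while len(points) < n:
        p = tuple(rng.uniform(-radius, radius) for _ in range(3))
        if math.dist(p, (0.0, 0.0, 0.0)) <= radius:
            points.append(p)
    return points


def pinch(points, centre, strength=0.5):
    """Crude 'moulding' step: pull every particle towards a pinch point,
    more strongly the closer it is (a stand-in for a thumb press)."""
    moved = []
    for x, y, z in points:
        d = math.dist((x, y, z), centre)
        f = strength * math.exp(-d * d)  # influence falls off with distance
        moved.append((x + f * (centre[0] - x),
                      y + f * (centre[1] - y),
                      z + f * (centre[2] - z)))
    return moved


def to_ply(points):
    """Serialise the point cloud as ASCII PLY for viewing or printing."""
    header = ["ply", "format ascii 1.0",
              f"element vertex {len(points)}",
              "property float x", "property float y", "property float z",
              "end_header"]
    body = [f"{x:.4f} {y:.4f} {z:.4f}" for x, y, z in points]
    return "\n".join(header + body)


cloud = sample_ball(200)            # the lump of clay
cloud = pinch(cloud, (0.0, 0.0, 1.0))  # one moulding gesture
print(to_ply(cloud).splitlines()[0])   # prints "ply"
```

Each moulding gesture would simply be another transformation of the point cloud, with the PLY (or a meshed version of it) re-exported whenever you want to view or print the current state.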
It's just an idea; I have no idea if it's practical or not.