Abstract
Introduction
This paper presents an ongoing Human-Computer Interaction (HCI) project. The project’s aim is to demonstrate the future potential of tactile interaction for design development and manufacturing. Its output is a hybrid design medium which introduces tactility and materiality into the digital design cycle and upgrades design activity to a complete practice of making by integrating design development and manufacturing. It aims to provide:
- interaction with the digital design object through hands and physical tools,
- opportunity to work with physical materials during digital modelling,
- simultaneous coordination between the physical model, digital model and on-site product,
- remote and close collaboration.
It is an experimental prototype set-up which allows empirical analysis of the opportunities of tactile interaction and proposes a new methodological and theoretical approach to HCI in design. This paper presents its functionalities and samples produced by using it, in order to open up a discussion of these analyses and propositions.
Motivation
Modernist design methods celebrate the eye as a sense organ superior to the others. They propose seeing as the main source of perception and separate design from manufacturing. In the age of professionalism and expertise, the principal responsibility of the designer is to produce rhetoric, and rhetoric is produced principally by and for the eye because, as Pallasmaa notes, sight is the only sense fast enough to keep pace with the astounding increase of speed in the technological world (Pallasmaa 1996).
Among the senses through which knowledge is acquired, Diderot claims that the eye is the most superficial, the ear the haughtiest, smell the most voluptuous, taste the most superstitious and inconstant, and touch the most profound and philosophical (Diderot 2009). What is profound about touch is that it is not only a way of receiving but also a way of transmitting. It is reciprocal, and it therefore enables mutual engagement with the things we touch. This reciprocity enables the hybrid assemblage of brains, bodies and things, which Malafouris defines as the way we think (Malafouris 2013).
A profound critique of studies in Virtual Reality is that they attempt to make a world inside the computer (Weiser 1999) and invite individuals to witness it through their eyes. However, we need to develop computer systems that enhance the existing world instead of simulating it, so that they can fit into human environments (Weiser 1999). If theoretically and methodologically well structured, studies in HCI provide us with tools and methods that offer opportunities to revitalize the forgotten notions of tactility and materiality in design. In order to do so, we need to reconceptualise the practice of design and provide critical strategies for developing diverse tools, which, as defined by Latour and Yaneva, are the aids of imagination and instruments of thinking tied to the body (Latour and Yaneva 2008).
Research Methods
The research is conducted by developing a use-inspired design tool as a proof of concept. A comprehensive analysis of the relevant work was conducted with respect to the aims mentioned above. The most important works with aims similar to those of this project are: pioneering works such as Seek (Negroponte 1973) and 3D Input for CAAD Systems (Aish 1979); and more recent works such as Hand Gestures in CAD Systems (Tumkor, Esche, and Chassapis 2013), Finger-Based Multi-touch Interface For Performing 3D CAD Operations (Radhakrishnan et al. 2013), KidCAD (Follmer and Ishii 2012), Easigami (Huang and Eisenberg 2012), Recompose (Blackshaw et al. 2011), 3D Sketching Using Interactive Fabric For Tangible and Bimanual Input (Leal et al. 2011), Raw Shaping Form Finding (Wendrich 2010), Tangible Augmented Prototyping of Digital Handheld Products (Park, Moon, and Lee 2009), Tangible Design Support System Using RFID Technology (Hosokawa et al. 2008), FlexM (Eng et al. 2006), iSphere (Lee, Hu, and Selker 2006), Twister (Llamas et al. 2003), SmartSkin (Rekimoto 2002), CUBIK (Lertsithichai and Seegmiller 2002), Gesture Modelling (Gross and Kemp 2001), Computational Building Blocks (Anderson et al. 2000), the integration of SpaceBall and DataGlove (Zhai et al. 1999), and Direct and Intuitive Input Device For 3D Shape Deformation (Murakami and Nakajima 1994).
The unique aspect of this project is the new relationships it constructs between existing tools and technologies. It integrates: a motion sensing input device for capturing the intuitive movements of the designer’s hands and tools during model making, a 3D printing pen for physical model making, a CAD software plug-in for digital geometric modeling, and a 4-axis robot arm for simultaneous on-site manufacturing.
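As a rough illustration of how such a pipeline might be coordinated, the following Python sketch mirrors each tracked sample into both a digital geometry buffer and a fabrication queue. The class and function names are hypothetical stand-ins; the project’s actual sensor, CAD and robot interfaces are not described here.

```python
# Hypothetical sketch of the integration loop: a motion-sensing device streams
# tracked pen-tip positions, and each sample is forwarded both to the CAD
# plug-in's geometry buffer (digital model) and to the robot arm's toolpath
# queue (on-site fabrication). All names are illustrative, not the project's
# actual APIs.
from dataclasses import dataclass, field
from typing import List, Tuple

Point = Tuple[float, float, float]  # x, y, z in millimetres


@dataclass
class DigitalModel:
    """Stand-in for the CAD plug-in's geometry buffer."""
    strokes: List[List[Point]] = field(default_factory=list)

    def add_point(self, p: Point) -> None:
        if not self.strokes:
            self.strokes.append([])
        self.strokes[-1].append(p)


@dataclass
class RobotQueue:
    """Stand-in for the 4-axis arm's toolpath queue."""
    toolpath: List[Point] = field(default_factory=list)

    def enqueue(self, p: Point) -> None:
        self.toolpath.append(p)


def route_sample(p: Point, model: DigitalModel, robot: RobotQueue) -> None:
    """One tracked sample updates both the digital model and the fabrication queue."""
    model.add_point(p)
    robot.enqueue(p)


if __name__ == "__main__":
    model, robot = DigitalModel(), RobotQueue()
    # Simulated samples from the motion sensor while the designer draws with the 3D pen.
    for sample in [(0.0, 0.0, 0.0), (5.0, 2.0, 1.0), (10.0, 4.0, 2.5)]:
        route_sample(sample, model, robot)
    print(len(model.strokes[0]), "model points,", len(robot.toolpath), "toolpath points")
```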
The project was developed in three phases. The first phase aims to integrate motion sensing with 3D digital modeling. A CAD plug-in was developed for this purpose: it captures the movements and gestures of the designer’s hands and uses them as input for geometric modeling. This feature addresses one of the project aims by enabling the use of the hands in real 3D space during digital modeling. The second phase aims to integrate physical model making by using a 3D printing pen. This feature enables the use of not only the hands but also physical tools during model making. It also brings a real model-making material (ABS or PLA plastic) into the process and allows it to act as a design agent with its own content and requirements. The third phase aims to integrate a desktop-scale robot arm for on-site manufacturing, allowing simultaneous coordination between the production of the physical model, the digital model and the on-site product.
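The paper does not detail how the raw tracking data is conditioned before it becomes geometry, but a plausible first-phase step is to smooth and thin the noisy, densely sampled hand positions before using them as curve control points. The sketch below illustrates this with a simple moving average and minimum-spacing resampling; the window size and spacing are illustrative assumptions, not values from the project.

```python
# Hypothetical sketch of the first-phase capture step: raw hand/pen positions
# from the motion sensor are noisy and densely sampled, so they are smoothed
# and resampled before being used as control points for a curve in the digital
# model. Filter parameters are illustrative.
from math import dist
from typing import List, Tuple

Point = Tuple[float, float, float]


def moving_average(points: List[Point], window: int = 3) -> List[Point]:
    """Simple moving-average smoothing to suppress hand tremor."""
    smoothed = []
    for i in range(len(points)):
        lo, hi = max(0, i - window // 2), min(len(points), i + window // 2 + 1)
        xs, ys, zs = zip(*points[lo:hi])
        smoothed.append((sum(xs) / len(xs), sum(ys) / len(ys), sum(zs) / len(zs)))
    return smoothed


def resample(points: List[Point], min_spacing: float = 2.0) -> List[Point]:
    """Drop samples closer than min_spacing (mm) to keep the control polygon light."""
    kept = [points[0]]
    for p in points[1:]:
        if dist(p, kept[-1]) >= min_spacing:
            kept.append(p)
    return kept


raw = [(0.0, 0.0, 0.0), (0.4, 0.1, 0.0), (1.1, 0.3, 0.1), (2.6, 0.9, 0.3), (4.8, 1.7, 0.6)]
control_points = resample(moving_average(raw))
print(control_points)  # these would be passed to the CAD plug-in as curve control points
```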
The project development is still ongoing in order to extend the functionalities of the medium. However, all three phases already have concrete outputs to be presented and discussed, and these discussions will contribute to further development.
Results
The first two phases of the project are fully completed, and the features developed in these phases currently work robustly, so the aim of integrating physical model making with simultaneous digital modelling has been achieved. The third phase, which aims to integrate robotic manufacturing into the cycle, is completed with minor problems: the 4-axis robot arm is not yet able to construct very complex geometries. The best solution to this problem would be a 6-axis arm, which is currently not available for this project. We therefore plan to optimize the geometries to be produced in the next tests.
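One possible way to optimize geometry for the 4-axis arm is to check candidate toolpaths against a simplified model of the arm’s limits and flag the points it cannot build. The sketch below assumes an arm that keeps the extruder vertical within a fixed reach, height range and maximum overhang angle; these limits and the whole approach are illustrative assumptions, not the project’s actual planning method.

```python
# Hypothetical sketch of the planned geometry optimization for the 4-axis arm.
# Assumptions (not from the paper): the arm keeps the extruder vertical, works
# within a radial reach and height range, and cannot build overhangs steeper
# than a threshold angle. Violating toolpath points are flagged so the geometry
# can be adjusted before fabrication.
from math import atan2, degrees, hypot
from typing import List, Tuple

Point = Tuple[float, float, float]

MAX_REACH_MM = 300.0     # illustrative radial reach of the desktop arm
MAX_HEIGHT_MM = 200.0    # illustrative vertical range
MAX_OVERHANG_DEG = 45.0  # steepest printable overhang with a vertical extruder


def unreachable(points: List[Point]) -> List[int]:
    """Return indices of toolpath points the simplified 4-axis model cannot build."""
    bad = []
    for i, (x, y, z) in enumerate(points):
        if hypot(x, y) > MAX_REACH_MM or not (0.0 <= z <= MAX_HEIGHT_MM):
            bad.append(i)
            continue
        if i > 0:
            px, py, pz = points[i - 1]
            run, rise = hypot(x - px, y - py), z - pz
            # Angle from vertical: a large horizontal run with little rise is a steep overhang.
            if rise > 0 and degrees(atan2(run, rise)) > MAX_OVERHANG_DEG:
                bad.append(i)
    return bad


path = [(50.0, 0.0, 0.0), (50.0, 0.0, 10.0), (120.0, 0.0, 12.0), (400.0, 0.0, 12.0)]
print(unreachable(path))  # [2, 3]: point 2 needs a steep overhang, point 3 is out of reach
```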
The first tests were done with a small group of architecture students, and two more workshops are planned for June and July. It is therefore expected that more samples, and more importantly a richer body of experience, will be available for discussion by August.
Discussions
The project contributes to HCI studies by providing a new theoretical and methodological approach. It allows us to discuss new ways of interaction in both theoretical and practical terms. The first tests conducted with the medium demonstrate the applicability of the project aim, which is to include tactility in design by using real materials and tools in collaborative environments where design development and manufacturing are integrated.
The most important discussion that may stem from this project is that design is again capable of being performed through making. It therefore becomes possible to discuss the applicability of a contemporary vernacular approach with the support of such hybrid systems. The vernacular here does not necessarily refer to an economic or legal context, but to design knowledge and cognition. Unlike modern conceptions, in the vernacular context cognitive and physical activities merge and design development and evaluation are performed during manufacturing, so design refers to a whole practice of making. It is expected that the aims of this project can contribute to discussing design as an activity of making.
The project’s output is currently a proof of concept; it supports the sketching of conceptual design ideas by using physical modelling, digital modelling and robotic manufacturing at the same time, which is a unique approach in the field. It thereby enables designers to rediscover the importance of tactility and material input for design development and evaluation with the support of digital technologies. However, in order to address real-world problems, it needs to be equipped with a more diverse set of tools, materials and operations of making.
It has the potential to support collaborative making where tacit knowledge is necessary, for example in education. It will allow the actors of design education (tutors and students) to work together on the design task through shared tacit knowledge, and will therefore enable learning by collaborative doing. This model of learning is analogous to the relationship between a master and an apprentice, and is therefore very different from the current common methods of design education, which are based on the tutor’s verbal critiques of students’ work.
Keywords
References
Aish, R. 1979. “3D Input for CAAD Systems.” Computer-Aided Design 11 (2): 66–70.
Anderson, David, Jonathan S. Yedidia, James L. Frankel, Joe Marks, Aseem Agarwala, Paul Beardsley, Jessica Hodgins, Darren Leigh, Kathy Ryall, and Eddie Sullivan. 2000. “Tangible Interaction + Graphical Interpretation: A New Approach to 3D Modeling.” In Proceedings of the 27th Annual Conference on Computer Graphics and Interactive Techniques - SIGGRAPH ’00, 393–402. New York: ACM Press.
Blackshaw, Matthew, Anthony DeVincenzi, David Lakatos, Daniel Leithinger, and Hiroshi Ishii. 2011. “Recompose.” In Proceedings of the 2011 Annual Conference Extended Abstracts on Human Factors in Computing Systems - CHI EA ’11, 1237–42. New York: ACM Press.
Diderot, Denis. 2009. “Letter on the Blind for the Use of Those Who Can See.” In Diderot’s Early Philosophical Works, edited by Margaret Jourdain, 68–157. South Carolina: BiblioLife.
Eng, Markus, Ken Camarata, Ellen Yi-Luen Do, and Mark D. Gross. 2006. “FlexM: Designing a Construction Kit for 3D Modeling.” International Journal of Architectural Computing 4 (2): 27–47.
Follmer, Sean, and Hiroshi Ishii. 2012. “KidCAD.” In Proceedings of the 2012 ACM Annual Conference on Human Factors in Computing Systems - CHI ’12, 2401–10. New York: ACM Press.
Gibson, James. 1983. The Senses Considered as Perceptual Systems. California: Praeger.
Gross, Mark, and Ariel Kemp. 2001. “Gesture Modelling: Using Video to Capture Freehand Modeling Commands.” In Computer Aided Architectural Design Futures 2001: Proceedings of the Ninth International Conference Held at the Eindhoven University of Technology, 6:33–46. Dordrecht: Kluwer Academic Publishers.
Hosokawa, Takuma, Yasuhiko Takeda, Norio Shioiri, Mitsunori Hirano, and Kazuhiko Tanaka. 2008. “Tangible Design Support System Using RFID Technology.” In Proceedings of the 2nd International Conference on Tangible and Embedded Interaction - TEI ’08, 75–78. New York: ACM Press.
Huang, Yingdan, and Michael Eisenberg. 2012. “Easigami: Virtual Creation by Physical Folding.” In Proceedings of the Sixth International Conference on Tangible, Embedded and Embodied Interaction - TEI ’12, 41–48. New York: ACM Press.
Latour, Bruno, and Albena Yaneva. 2008. “Give Me a Gun and I Will Make All Buildings Move: An ANT’s View of Architecture.” In Explorations in Architecture: Teaching, Design, Research, edited by Reto Geiser, 80–89. Basel: Birkhäuser.
Leal, Anamary, Doug Bowman, Laurel Schaefer, Francis Quek, and Clarissa K Stiles. 2011. “3D Sketching Using Interactive Fabric for Tangible and Bimanual Input.” In GI ’11 Proceedings of Graphics Interface 2011, 49–56. Ontario: Canadian Human-Computer Communications Society.
Lee, Jackie Chia-Hsun, Yuchang Hu, and Ted Selker. 2006. “iSphere: A Free-Hand 3D Modeling Interface.” International Journal of Architectural Computing 4 (1): 19–31.
Lertsithichai, Surapong, and Matthew Seegmiller. 2002. “CUBIK: A Bi-Directional Tangible Modeling Interface.” In CHI ’02 Extended Abstracts on Human Factors in Computing Systems - CHI ’02, 756–57. New York: ACM Press.
Llamas, Ignacio, Byungmoon Kim, Joshua Gargus, Jarek Rossignac, and Chris D Shaw. 2003. “Twister: A Space-Warp Operator for the Two-Handed Editing of 3D Shapes.” In SIGGRAPH ’03 ACM SIGGRAPH 2003 Papers, 663–68. New York: ACM Press.
Malafouris, Lambros. 2013. How Things Shape the Mind: A Theory of Material Engagement. Boston: MIT Press.
Murakami, Tamotsu, and Naomasa Nakajima. 1994. “Direct and Intuitive Input Device for 3-D Shape Deformation.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems Celebrating Interdependence - CHI ’94, 465–70. New York: ACM Press.
Negroponte, Nicholas. 1973. The Architecture Machine: Toward a More Human Environment. Boston: The MIT Press.
Pallasmaa, Juhani. 1996. The Eyes of the Skin: Architecture and the Senses. New Jersey: John Wiley & Sons.
Park, Hyungjun, Hee-Cheol Moon, and Jae Yeol Lee. 2009. “Tangible Augmented Prototyping of Digital Handheld Products.” Computers in Industry 60 (2): 114–25.
Radhakrishnan, Srinivasan, Yingzi Lin, Ibrahim Zeid, and Sagar Kamarthi. 2013. “Finger-Based Multitouch Interface for Performing 3D CAD Operations.” International Journal of Human-Computer Studies 71 (3): 261–75.
Rekimoto, Jun. 2002. “SmartSkin: An Infrastructure for Freehand Manipulation on Interactive Surfaces.” In CHI ’02 Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 113–20. New York: ACM Press.
Sontag, Susan. 1977. On Photography. London: Penguin Books.
Steiner, Rudolf. 1981. Man as a Being of Sense and Perception. London: Rudolf Steiner Press.
Tumkor, Serdar, Sven K. Esche, and Constantin Chassapis. 2013. “Hand Gestures in CAD Systems.” In ASME 2013 International Mechanical Engineering Congress and Exposition. California: ASME.
Weiser, Mark. 1999. “The Computer for the 21st Century.” ACM SIGMOBILE Mobile Computing and Communications Review 3 (3): 3–11.
Wendrich, Robert E. 2010. “Raw Shaping Form Finding: Tacit Tangible CAD.” Computer-Aided Design and Applications 7 (4): 505–31.
Zhai, Shumin, Eser Kandogan, Barton A. Smith, and Ted Selker. 1999. “In Search of the ‘Magic Carpet’: Design and Experimentation of a Bimanual 3D Navigation Interface.” Journal of Visual Languages & Computing 10 (1): 3–17.