Hi, I'm Christian Weichel.

I work on the next big thing for developers, used to do HCI research, and make stuff.

Here's my story.

Publications


SPATA: Spatio-Tangible Tools for Fabrication-Aware Design

Best Paper Award

Christian Weichel, Jason Alexander, Abhijit Karnik and Hans Gellersen. 2015.

The physical tools used when designing new objects for digital fabrication are mature, yet disconnected from their virtual accompaniments. SPATA is the digital adaptation of two spatial measurement tools that explores their closer integration into virtual design environments. We adapt two traditional measurement tools: calipers and protractors. Both tools can measure, transfer, and present size and angle. Their close integration into different design environments makes tasks more fluid and convenient. We describe the tools’ design, a prototype implementation, integration into different environments, and application scenarios validating the concept.

In Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction (TEI '15). ACM, New York, NY, USA, 189-196.

MixFab: A Mixed-Reality Environment for Personal Fabrication

Best Paper Award

Christian Weichel, Manfred Lau, David Kim, Nicolas Villar and Hans Gellersen. 2014.

Personal fabrication machines, such as 3D printers and laser cutters, are becoming increasingly ubiquitous. However, designing objects for fabrication still requires 3D modeling skills, thereby rendering such technologies inaccessible to a wide user-group. In this paper, we introduce MixFab, a mixed-reality environment for personal fabrication that lowers the barrier for users to engage in personal fabrication. Users design objects in an immersive augmented reality environment, interact with virtual objects in a direct gestural manner and can introduce existing physical objects effortlessly into their designs. We describe the design and implementation of MixFab, a user-defined gesture study that informed this design, show artifacts designed with the system and describe a user study evaluating the system’s prototype.

In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '14). ACM, New York, NY, USA, 3855-3864.

Exploring Interactions with Physically Dynamic Bar Charts

Faisal Taher, John Hardy, Abhijit Karnik, Christian Weichel, Yvonne Jansen, Kasper Hornbæk and Jason Alexander. 2015.

Visualizations such as bar charts help users reason about data, but are mostly screen-based, rarely physical, and almost never physical and dynamic. This paper investigates the role of physically dynamic bar charts and evaluates new interactions for exploring and working with datasets rendered in dynamic physical form. To facilitate our exploration we constructed a 10×10 interactive bar chart and designed interactions that supported fundamental visualisation tasks, specifically: annotation, navigation, filtering, comparison, organization, and sorting. The interactions were evaluated in a user study with 17 participants. We identify the preferred methods of working with the data for each task (e.g. directly tapping rows to hide bars), highlight the strengths and limitations of working with physical data, and discuss the challenges of integrating the proposed interactions together into a larger data exploration system. In general, physical interactions were intuitive, informative, and enjoyable, paving the way for new explorations in physical data visualizations.

In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI '15). ACM, New York, NY, USA, 3237-3246. DOI: 10.1145/2702123.2702604

ShapeClip: Towards Rapid Prototyping with Shape-Changing Displays for Designers

John Hardy, Christian Weichel, Faisal Taher, John Vidler and Jason Alexander. 2015.

This paper presents ShapeClip: a modular tool capable of transforming any computer screen into a z-actuating shape-changing display. This enables designers to produce dynamic physical forms by ‘clipping’ actuators onto screens. ShapeClip displays are portable, scalable, fault-tolerant, and support runtime re-arrangement. Users are not required to have knowledge of electronics or programming, and can develop motion designs with presentation software, image editors, or web-technologies. To evaluate ShapeClip we carried out a full-day workshop with expert designers. Participants were asked to generate shape-changing designs and then construct them using ShapeClip. ShapeClip enabled participants to rapidly and successfully transform their ideas into functional systems.

In Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems (CHI '15). ACM, New York, NY, USA, 19-28. DOI: 10.1145/2702123.2702599

ReForm: Integrating Physical and Digital Design through Bidirectional Fabrication

Christian Weichel, John Hardy, Jason Alexander, and Hans Gellersen. 2015.

Digital fabrication machines such as 3D printers and laser cutters allow users to produce physical objects based on virtual models. The creation process is currently unidirectional: once an object is fabricated it is separated from its originating virtual model. Consequently, users are tied into digital modeling tools, the virtual design must be completed before fabrication, and once fabricated, re-shaping the physical object no longer influences the digital model. To provide a more flexible design process that allows objects to iteratively evolve through both digital and physical input, we introduce bidirectional fabrication. To demonstrate the concept, we built ReForm, a system that integrates digital modeling with shape input, shape output, annotation for machine commands, and visual output. By continually synchronizing the physical object and digital model it supports object versioning to allow physical changes to be undone. Through application examples, we demonstrate the benefits of ReForm to the digital fabrication process.

In Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology (UIST '15). ACM, New York, NY, USA, 93-102. DOI: 10.1145/2807442.2807451

Shape Display Shader Language (SDSL): A New Programming Model for Shape Changing Displays

Christian Weichel, John Hardy and Jason Alexander. 2015.

Shape-changing displays' dynamic physical affordances have inspired a range of novel hardware designs to support new types of interaction. Despite rapid technological progress, the community lacks a common programming model for developing applications for these visually and physically dynamic display surfaces. This results in complex, hardware-specific custom code that requires significant development effort and prevents researchers from easily building on and sharing their applications across hardware platforms. As a first attempt to address these issues we introduce SDSL, a Shape-Display Shader Language for easily programming shape-changing displays in a hardware-independent manner. We introduce the (graphics-derived) pipeline model of SDSL, an open-source implementation that includes a compiler, runtime, IDE, debugger, and simulator, and show demonstrator applications running on two shape-changing hardware setups.

In Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '15). ACM, New York, NY, USA, 1121-1126. DOI: 10.1145/2702613.2732727
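
The shader-like idea is easiest to see in code. The sketch below is not SDSL itself (its actual syntax and pipeline stages are described in the paper); it is a minimal Python illustration of the model the abstract hints at: a hardware-independent, per-actuator function of position and time whose normalized output a driver would map onto a concrete shape display. All names here (height_shader, render_frame, the 10x10 grid) are hypothetical.

    # Illustrative sketch only -- not SDSL syntax. It shows a shader-style model:
    # a per-actuator function of normalized position and time, evaluated over the
    # display grid; a hardware driver would turn the values into actuator commands.
    import math
    import time

    def height_shader(u, v, t):
        """Return a normalized height in [0, 1] for the actuator at (u, v)."""
        return 0.5 + 0.5 * math.sin(2 * math.pi * (u + t))  # travelling wave

    def render_frame(cols, rows, t, shader=height_shader):
        """Evaluate the shader over the whole grid for one frame."""
        return [[shader(x / (cols - 1), y / (rows - 1), t) for x in range(cols)]
                for y in range(rows)]

    # Hypothetical 10x10 shape display driven frame by frame.
    frame = render_frame(10, 10, t=time.time() % 1.0)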

Mixed Physical and Virtual Design Environments for Digital Fabrication

Christian Weichel. 2015.

Digital Fabrication (3D printing, laser-cutting or CNC milling) enables the automated fabrication of physical objects from digital models. This technology is becoming more readily available and ubiquitous, as digital fabrication machines become more capable and affordable. When it comes to designing the objects that are to be fabricated, however, there are still barriers for novices and inconveniences for experts. This thesis looks at combining digital and physical spaces for designing fabricable artifacts.


Connected Tools in Digital Design

Christian Weichel, Jason Alexander, Abhijit Karnik, and Hans Gellersen. 2015.

As digital fabrication and digital design become more pervasive, the physical tools we use in conjunction with them will have to catch up. With the Internet of Things, cyberphysical systems, and Industry 4.0 in our midst, connecting and integrating measurement tools into design processes is a logical step. Here, the authors describe the first steps in that direction, coming from a variety of communities: academics, makers, and industry alike. In particular, they present their spatio-tangible (SPATA) tools for fabrication-aware design.

IEEE Pervasive Computing, vol. 14, no. 2, pp. 18-21, Apr.-June 2015. DOI: 10.1109/MPRV.2015.29

EyeContext: Recognition of High-level Contextual Cues from Human Visual Behaviour

Andreas Bulling, Christian Weichel and Hans Gellersen. 2013.

In this work we present EyeContext, a system to infer high-level contextual cues from human visual behaviour. We conducted a user study to record eye movements of four participants over a full day of their daily life, totalling 42.5 hours of eye movement data. Participants were asked to self-annotate four non-mutually exclusive cues: social (interacting with somebody vs. no interaction), cognitive (concentrated work vs. leisure), physical (physically active vs. not active), and spatial (inside vs. outside a building). We evaluate a proof-of-concept EyeContext system that combines encoding of eye movements into strings and a spectrum string kernel support vector machine (SVM) classifier. Our results demonstrate the large information content available in long-term human visual behaviour and open up new avenues for research on eye-based behavioural monitoring and life logging.

In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '13). ACM, New York, NY, USA, 305-308. DOI: 10.1145/2470654.2470697
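
For readers unfamiliar with spectrum string kernels, the sketch below illustrates the general technique named in the abstract: string-encoded sequences classified with a precomputed-kernel SVM. It is not the authors' code; the eye-movement encoding, labels, and k-mer length are hypothetical placeholders.

    # Illustrative sketch of a k-spectrum string kernel with an SVM classifier.
    # Not the EyeContext implementation; data and encoding are made up.
    from collections import Counter
    import numpy as np
    from sklearn.svm import SVC

    def spectrum_features(s, k=3):
        """Count all length-k substrings (k-mers) of an encoded string."""
        return Counter(s[i:i + k] for i in range(len(s) - k + 1))

    def spectrum_kernel(a, b, k=3):
        """K(a, b) = sum over shared k-mers of count_a * count_b."""
        fa, fb = spectrum_features(a, k), spectrum_features(b, k)
        return sum(fa[m] * fb[m] for m in fa if m in fb)

    def gram_matrix(strings, k=3):
        """Pairwise kernel matrix for a list of encoded strings."""
        n = len(strings)
        K = np.zeros((n, n))
        for i in range(n):
            for j in range(i, n):
                K[i, j] = K[j, i] = spectrum_kernel(strings[i], strings[j], k)
        return K

    # Hypothetical data: eye movements encoded as characters (e.g. saccade
    # directions), with binary labels for one cue such as social vs. non-social.
    strings = ["LRLUDLRU", "UUDDLRLR", "LRLRLRUD", "DUDULRLR"]
    labels = [1, 0, 1, 0]

    clf = SVC(kernel="precomputed").fit(gram_matrix(strings), labels)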

Enclosed: A Component-Centric Interface for Designing Prototype Enclosures

Christian Weichel, Manfred Lau and Hans Gellersen. 2013.

This paper explores the problem of designing enclosures (or physical cases) that are needed for prototyping electronic devices. We present a novel interface that uses electronic components as handles for designing the 3D shape of the enclosure. We use the .NET Gadgeteer platform as a case study of this problem, and implemented a proof-of-concept system for designing enclosures for Gadgeteer components. We show examples of enclosures designed and fabricated with our system.

In Proceedings of the 7th International Conference on Tangible, Embedded and Embodied Interaction (TEI '13). ACM, New York, NY, USA, 215-218. DOI: 10.1145/2460625.2460659

Overcoming Interaction Blindness through Curiosity Objects

Steven Houben and Christian Weichel. 2013.

In recent years there has been a widespread installation of large interactive public displays. Longitudinal studies, however, show that these interactive displays suffer from interaction blindness – the inability of the public to recognize the interactive capabilities of those surfaces. In this paper, we explore the use of curiosity-provoking artifacts (curiosity objects) to overcome interaction blindness. Our study confirmed the interaction blindness problem and shows that introducing a curiosity object results in a significant increase in interactivity with the display as well as changes in movement in the spaces surrounding the interactive display.

In CHI '13 Extended Abstracts on Human Factors in Computing Systems (CHI EA '13). ACM, New York, NY, USA, 1539-1544. DOI: 10.1145/2468356.2468631

Ingredients for a New Wave of Ubicomp Products

Thomas Kubitza, Norman Pohl, Tilman Dingler, Stefan Schneegaß, Christian Weichel, and Albrecht Schmidt. 2013.

The emergence of many new embedded computing platforms has lowered the hurdle for creating ubiquitous computing devices. Here, the authors highlight some of the newer platforms, communication technologies, sensors, actuators, and cloud-based development tools, which are creating new opportunities for ubiquitous computing.

IEEE Pervasive Computing, vol. 12, no. 3, pp. 5-8, July-Sept. 2013. DOI: 10.1109/MPRV.2013.51

A day in the life of our eyes

Christian Weichel. 2012.

Using the eyes as an input modality has a long history in human-computer interaction. However, it is only recently that eye movement has been considered a possibility for activity recognition. Yet, current video-based eye-trackers do not allow long-term data collection in a daily-life setting. In this work we built an experimental system, including an activity-centric study design and technical implementation.


Adapting Self-Organizing Maps to the MapReduce Programming Paradigm

Christian Weichel. 2010.

We present an adaptation of the self-organizing map (SOM) useful for cluster analysis of large quantities of data, such as music classification or customer behavior analysis. The algorithm is based on the batch SOM formulation, which has been successfully adapted to other parallel architectures and perfectly suits the MapReduce programming paradigm, thus enabling the use of large cloud computing infrastructures such as Amazon EC2.
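
As a rough illustration of how the batch SOM maps onto MapReduce, the sketch below expresses one training iteration as a map step (per-partition partial sums) and a reduce step (aggregate the partials and update the codebook). It is a minimal Python/NumPy sketch, not the paper's implementation; the grid size, Gaussian neighborhood, and in-memory "partitions" are assumptions standing in for a real MapReduce job.

    # Illustrative sketch of one batch-SOM iteration in map/reduce form.
    # Not the paper's implementation; data, map size, and neighborhood are assumed.
    import numpy as np

    def neighborhood(grid, bmu, sigma):
        """Gaussian neighborhood weight of every map node w.r.t. the BMU."""
        d2 = np.sum((grid - grid[bmu]) ** 2, axis=1)
        return np.exp(-d2 / (2 * sigma ** 2))

    def map_step(partition, codebook, grid, sigma):
        """Per-partition partial sums: weighted inputs and weights per map node."""
        num = np.zeros_like(codebook)
        den = np.zeros(len(codebook))
        for x in partition:
            bmu = np.argmin(np.linalg.norm(codebook - x, axis=1))
            h = neighborhood(grid, bmu, sigma)
            num += h[:, None] * x
            den += h
        return num, den

    def reduce_step(partials, codebook):
        """Aggregate partials from all mappers and compute the new codebook."""
        num = sum(p[0] for p in partials)
        den = sum(p[1] for p in partials)
        mask = den > 0
        new = codebook.copy()
        new[mask] = num[mask] / den[mask, None]
        return new

    # Hypothetical setup: a 4x4 map over 3-dimensional data split into partitions.
    rng = np.random.default_rng(0)
    grid = np.array([(i, j) for i in range(4) for j in range(4)], dtype=float)
    codebook = rng.random((16, 3))
    partitions = [rng.random((100, 3)) for _ in range(4)]  # stands in for input splits

    partials = [map_step(p, codebook, grid, sigma=1.0) for p in partitions]
    codebook = reduce_step(partials, codebook)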