Since its inception, computer aided 3D modeling has primarily relied on the Windows, Icons, Menus, Pointer (WIMP) user interface. WIMP has rarely been able to tap into the natural intuitiveness and imagination that accompany any design process. A brain-computer interface (BCI) is a novel modality that uses a user's brain signals to enable natural and intuitive interaction with an external device. BCI's potential as a natural interaction modality for 3D modeling remains largely unexplored. In theory, using a BCI, a user could create any 3D model simply by thinking about it. This paper presents a basic framework for using BCI as an interface for computer aided 3D modeling. The framework involves recording and recognizing electroencephalogram (EEG) brain wave patterns and electromyogram (EMG) signals corresponding to facial movements. The recognized EEG/EMG signals and associated keystrokes are used to activate and control different commands of a CAD package. Eight sample CAD models, created with the Emotiv EEG headset based BCI interface and Google SketchUp, are presented to demonstrate the efficacy of the developed system. To further examine the BCI's usability, human factors studies were carried out on subjects from different backgrounds. Based on preliminary results, it is concluded that EEG/EMG based BCI is suitable for computer aided 3D modeling. Issues in signal acquisition, system flexibility, integration with other modalities, and data collection are also discussed.
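The pipeline the abstract describes, recognized EEG/EMG events translated into keystrokes that drive CAD commands, can be sketched as a simple event-to-key mapping. The event names and key bindings below are illustrative assumptions, not the paper's actual vocabulary or the Emotiv SDK API; a real system would receive classified events from the headset's recognition layer and inject the resulting keystrokes into the CAD application.

```python
# Hypothetical sketch of the BCI-to-CAD command layer described in the abstract.
# Event names ("push", "smile", ...) and SketchUp shortcuts are assumptions
# for illustration only, not the framework's actual mapping.

# Recognized mental/facial events -> CAD keyboard shortcuts (assumed bindings).
EVENT_TO_KEY = {
    "push": "p",       # e.g. a "push" mental command -> Push/Pull tool
    "smile": "l",      # facial EMG smile -> Line tool
    "look_left": "o",  # eye/facial movement -> Orbit tool
    "neutral": None,   # resting state -> no action
}

def translate(events):
    """Convert a stream of recognized EEG/EMG events into keystrokes
    to be sent to the CAD package; unrecognized or neutral events are dropped."""
    keys = []
    for event in events:
        key = EVENT_TO_KEY.get(event)
        if key is not None:
            keys.append(key)
    return keys
```

For example, `translate(["push", "neutral", "smile"])` yields the keystroke sequence `["p", "l"]`, which a keystroke-injection layer would then deliver to the modeling application.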
