As computing becomes more ubiquitous, there is a need for distributed intelligent human-computer interfaces that can perceive and interpret a user's actions through sensors that see, hear, and feel. A perceptually intelligent interface enables more natural interaction between a user and a machine, in the sense that the user can look at, talk to, or touch an object instead of using a machine language. The goal of the present work on a Sensing Chair is to enable a computer to track, in real time, the sitting postures of a user through contact sensors that act like a layer of artificial skin. This is accomplished with surface-mounted pressure distribution sensors placed on the backrest and the seat pan of an office chair. Given the similarity between a pressure distribution map from the contact sensors and a grayscale image, computer vision and pattern recognition algorithms, such as Principal Components Analysis (PCA), are applied to the problem of classifying steady-state sitting postures. A real-time multi-user sitting posture classification system has been implemented in our laboratory. The system is trained on pressure distribution data from subjects with varying anthropometrics, and performs at an overall accuracy of 96%. Future work will focus on the modeling of transient postures as a user moves from one steady-state posture to the next. A robust, real-time sitting posture tracking system can lead to many exciting applications, such as automatic control of airbag deployment forces, ergonomics of furniture design, and biometric authentication for computer security.
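The image-based approach described above can be illustrated with a minimal sketch: treat each flattened pressure map as a grayscale image vector, project it onto the principal components of the training data, and classify in the reduced eigenspace with a nearest-centroid rule. All names, the 8x8 sensor grid, the synthetic posture patterns, and the nearest-centroid classifier here are illustrative assumptions, not the system's actual implementation details.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: each "pressure map" is an 8x8 grid of sensel
# readings (a real sensor sheet has far more sensels). Two synthetic
# posture classes with distinct mean patterns plus sensor noise.
def make_map(posture, noise=0.1):
    base = np.zeros((8, 8))
    if posture == 0:            # e.g. "seated upright": pressure centered
        base[2:6, 2:6] = 1.0
    else:                       # e.g. "leaning left": pressure shifted
        base[2:6, 0:4] = 1.0
    return base + noise * rng.standard_normal((8, 8))

# Training set: flatten each map into a vector, as with a grayscale image.
X = np.array([make_map(p).ravel() for p in [0] * 20 + [1] * 20])
y = np.array([0] * 20 + [1] * 20)

# PCA via SVD of the mean-centered data matrix.
mean = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
components = Vt[:4]                 # keep the top 4 "eigen-postures"
Z = (X - mean) @ components.T       # project training data into eigenspace

# Nearest-centroid classifier in the reduced space.
centroids = np.array([Z[y == c].mean(axis=0) for c in (0, 1)])

def classify(pressure_map):
    z = (pressure_map.ravel() - mean) @ components.T
    return int(np.argmin(np.linalg.norm(centroids - z, axis=1)))

print(classify(make_map(0)), classify(make_map(1)))
```

In the actual system, the same projection step would run on each incoming frame from the backrest and seat-pan sensors, which is what makes real-time classification feasible: distances are computed in a low-dimensional eigenspace rather than over the raw sensel grid.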