Projects

Hand and Finger Recognition for Image Manipulation using Kinect
Today we control our computers with devices such as the mouse and keyboard, yet both feel unnatural: we are used to interacting with the world through our own hands, our bodies, and our voices. This is why natural user interfaces have become so popular in recent years. In this project, we used hand gestures such as pinch, rotate, and wave to manipulate images, i.e., to scale, rotate, and slide photos, respectively. Kinect data was used to accomplish these tasks. To implement the gestures, geometric properties of the hand were computed to extract key information such as the fingertips and the center of the hand. (Video)
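
As a rough illustration of this geometric approach (a minimal sketch, not the project's exact pipeline), the snippet below segments the hand from a Kinect depth frame by depth thresholding, estimates the palm center from the contour's image moments, and proposes fingertip candidates from convexity defects using OpenCV; the depth band and defect threshold are illustrative assumptions.

```python
import cv2
import numpy as np

def find_hand_features(depth_mm, near=400, far=700):
    """Estimate palm center and fingertip candidates from a Kinect depth frame.

    depth_mm : 2-D uint16 array of depth values in millimetres (assumed input).
    near/far : depth band (mm) in which the hand is assumed to lie.
    """
    # Segment pixels that fall inside the assumed hand depth band.
    mask = ((depth_mm > near) & (depth_mm < far)).astype(np.uint8) * 255

    # Keep the largest blob, assumed to be the hand (OpenCV 4 return signature).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None, []
    hand = max(contours, key=cv2.contourArea)

    # Palm center from the contour's image moments.
    m = cv2.moments(hand)
    if m["m00"] == 0:
        return None, []
    center = (int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"]))

    # Fingertip candidates: hull points bordering deep convexity defects.
    hull_idx = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull_idx)
    fingertips = []
    if defects is not None:
        for start, end, _, depth in defects[:, 0]:
            if depth > 10000:  # defect depth threshold (tunable assumption)
                fingertips.append(tuple(hand[start][0]))
                fingertips.append(tuple(hand[end][0]))
    return center, fingertips
```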

Collaborators: Hossein Mousavi (University of California, Irvine)

Comparing “Pick and Place” Task in Spatial Augmented Reality versus Non-immersive Virtual Reality for Rehabilitation Setting
Introducing computer games to the rehabilitation market has led to the development of numerous Virtual Reality (VR) training applications. Although VR has provided tremendous benefits to patients and caregivers, it has inherent limitations, some of which might be addressed by replacing it with Augmented Reality (AR). The pick-and-place task, which is involved in many activities of daily living (ADLs), is one of the impaired functions stroke survivors most wish to recover. We developed an exercise consisting of moving an object between various points, following a flashing light that indicates the next target. The results show superior performance of subjects in the spatial AR setting versus the non-immersive VR setting. This could be due to the extraneous hand-eye coordination required in VR, which is eliminated in spatial AR.

Publication: EMBC2013, Japan
Collaborators: Hossein Mousavi (University of California, Irvine)
A Spatial Augmented Reality Rehab System for Post-Stroke Hand Rehabilitation
Stroke rehabilitation is a challenging process. Movement impairments after stroke typically require intensive treatment, including hands-on physical and occupational therapy, for several weeks after the initial injury. Unfortunately, due to economic pressures on health care providers, stroke patients are receiving less therapy and going home earlier. An important goal for Rehabilitation Engineering (RE) is therefore to develop technology that allows individuals with stroke to practice intensive movement training without the expense of an always-present therapist. We have developed a low-cost, Spatial Augmented Reality system that allows individuals with stroke to practice hand and arm movement exercises at home or in the clinic with minimal intervention from a therapist.

Publication: NextMed/MMVR20, San Diego
Collaborators: Hossein Mousavi (University of California, Irvine)
Haptic Augmented Reality to Monitor Human Arm’s Stiffness in Rehabilitation
Augmented Reality (AR) is a live, direct or indirect view of a physical, real-world environment whose elements are overlaid with virtual, computer-generated objects. In this paper, AR is combined with haptics to observe the human arm's stiffness. A handheld haptic device is used to measure the arm's impedance. While a computer vision system tracks and records the position of the hand, a computer screen displays the impedance diagrams superimposed on the hand in a real-time video feed. The visual augmentation is also performed with a video projector that projects the diagrams onto the hand as it moves.
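
A minimal sketch of the visual-augmentation step only (the haptic measurement and hand tracking are assumed to happen elsewhere): given the hand's pixel position and a 2x2 stiffness matrix, draw the corresponding stiffness ellipse on the current video frame with OpenCV. The matrix values and the pixels-per-N/m scale below are illustrative assumptions.

```python
import cv2
import numpy as np

def draw_stiffness_ellipse(frame, hand_xy, K, scale=0.5):
    """Overlay a stiffness ellipse on a video frame at the tracked hand position.

    frame   : BGR image from the live camera feed.
    hand_xy : (x, y) pixel position of the hand from the vision tracker.
    K       : 2x2 symmetric stiffness matrix [N/m] (assumed measured elsewhere).
    scale   : pixels per N/m, purely for visualisation (illustrative value).
    """
    # Eigen-decomposition: eigenvalues give the ellipse semi-axes,
    # eigenvectors give its orientation in the plane.
    vals, vecs = np.linalg.eigh(K)
    axes = (int(abs(vals[1]) * scale), int(abs(vals[0]) * scale))
    angle = float(np.degrees(np.arctan2(vecs[1, 1], vecs[0, 1])))

    cv2.ellipse(frame, tuple(map(int, hand_xy)), axes, angle, 0, 360,
                (0, 255, 0), 2)
    return frame

# Example with an illustrative stiffness matrix (values are assumptions).
frame = np.zeros((480, 640, 3), dtype=np.uint8)
K = np.array([[300.0, 80.0],
              [80.0, 150.0]])
draw_stiffness_ellipse(frame, (320, 240), K)
```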


Publication: IECBES2012
Collaborators: Hossein Mousavi (University of California, Irvine)
Use of a Portable Device for Measuring Arm’s Planar Mechanical Impedance During Motion
This project proposes the design and use of a portable device for measuring the impedance of the human arm during reaching motion. While the device has the look and weight of an ordinary mug, it contains a DC motor, which rotates an eccentric mass, and a 2-axis inertial sensor, which monitors vibration. The centrifugal force of the rotating eccentric mass applies a perturbation to the hand; correlating the acceleration signals with the perturbing force gives the posture-dependent mechanical impedance in the form of ellipses. The method can construct an impedance ellipse in 0.1 seconds and can repeat this continuously over the course of reaching; hence, for every second of a reaching trial, the device obtains 10 impedance ellipses, which is considerably faster than previous methods. Experimental data collected by the device during reaching motion is also presented.
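
The sketch below illustrates the general idea under stated assumptions, not the paper's exact algorithm: over one 0.1 s window, the perturbing centrifugal force is reconstructed from the eccentric mass's rotation, a 2x2 matrix relating the measured acceleration to that force is fitted by least squares, and its singular values and vectors define an ellipse. The mass, radius, and spin rate are illustrative values, not the device's actual parameters.

```python
import numpy as np

def ellipse_from_window(acc_xy, theta, m_ecc=0.01, r_ecc=0.02, omega=2 * np.pi * 10):
    """Fit a 2x2 matrix relating acceleration to perturbing force over one window.

    The singular values and vectors of that matrix define an ellipse, standing in
    here for the posture-dependent impedance ellipse described above.

    acc_xy : (N, 2) accelerations from the 2-axis inertial sensor over ~0.1 s.
    theta  : (N,) rotation angle of the eccentric mass at the same samples.
    m_ecc, r_ecc, omega : eccentric mass [kg], radius [m], spin rate [rad/s]
                          (illustrative values, not the device's parameters).
    """
    # Centrifugal perturbation force rotating with the eccentric mass.
    f = m_ecc * r_ecc * omega**2 * np.column_stack([np.cos(theta), np.sin(theta)])

    # Least-squares fit of f ≈ Z @ a for a single 2x2 matrix Z over the window.
    X, *_ = np.linalg.lstsq(acc_xy, f, rcond=None)
    Z = X.T

    # Singular values give the ellipse semi-axes; the first left singular
    # vector gives the orientation of the major axis.
    U, s, _ = np.linalg.svd(Z)
    angle = np.arctan2(U[1, 0], U[0, 0])
    return s, angle

# Example on synthetic data (purely illustrative).
t = np.linspace(0.0, 0.1, 100)
theta = 2 * np.pi * 10 * t
acc = np.column_stack([0.5 * np.cos(theta), 0.35 * np.sin(theta)])
semi_axes, orientation = ellipse_from_window(acc, theta)
print(semi_axes, np.degrees(orientation))
```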

Publication: IECBES2012
Collaborators: Hossein Mousavi (University of California, Irvine)
Optical Illusion in Augmented Reality
Augmented Reality has numerous compelling applications, but many of them will not be fulfilled until we understand how to display graphical objects relative to real-world objects. Although many researchers are tackling primary problems in developing AR systems, perceptually correct augmentation remains a critical challenge. In this paper, we focus on how to correctly display and accurately convey size with respect to real-world objects. We conducted a user study to examine how subjects judge the relative size of virtual objects augmented in a real scene. The results confirmed that optical illusions occur in Augmented Reality applications if the size of virtual objects relative to real-world ones is not considered.

Publication: submitted to VRST2012, Toronto
Collaborators: Hossein Mousavi (University of California, Irvine)
Monitoring Intake Gestures using Sensor Fusion (Microsoft Kinect and Inertial Sensors) for Smart Home Tele-Rehab Settings
Smart home technologies help post-stroke patients complete activities of daily living (ADLs) independently, saving them time, money, and effort; otherwise, patients must visit rehabilitation clinics for formal care. Toward this goal, we present our approach to spotting the specific ADLs of eating and drinking in a home setting. We fuse inertial sensors and the Microsoft Kinect to monitor the patient's intake gestures, including cutting, loading food, and maneuvering the food to the mouth. For both sides of the body, we measured (i) the position of the wrist, elbow, and shoulder; (ii) the angular displacements at the elbow and shoulder joints; and (iii) the acceleration of the spoon/fork/cup held by the subject. The use of Kinect makes it possible to distinguish between the healthy and paralyzed sides of the body, a common challenge in tele-rehab. The system was tested successfully on healthy subjects; because stroke patients move more slowly and over a shorter range, the system should serve them at least equally well.
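
As one small piece of such a pipeline, the snippet below computes a joint angle (e.g., elbow flexion) from three 3-D Kinect skeleton joints; the joint coordinates shown are assumed example values rather than live Kinect data.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle (degrees) at joint b formed by points a-b-c, e.g. shoulder-elbow-wrist.

    a, b, c : 3-D joint positions (metres), as provided by the Kinect skeleton
              stream (here plain lists, i.e. assumed inputs).
    """
    u = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    v = np.asarray(c, dtype=float) - np.asarray(b, dtype=float)
    cos_angle = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

# Illustrative frame: right elbow flexion while bringing food to the mouth.
shoulder = [0.20, 0.45, 1.80]
elbow    = [0.35, 0.20, 1.75]
wrist    = [0.25, 0.40, 1.60]
print("right elbow angle:", round(joint_angle(shoulder, elbow, wrist), 1), "deg")
```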

Publication: IEEE HIC2012, Houston
Collaborators: Hossein Mousavi (University of California, Irvine)
MusiPad©
MusiPad is a gamified finger rehabilitation platform for stroke, cerebral palsy, and similar patients. The platform consists of a Synaptics TouchPad™ and an open-source game called Frets on Fire©, a Python-based clone of the famous Guitar Hero©. Patients with these conditions have difficulty with finger individuation and/or grasp, and "practice makes perfect" is exactly the case for finger rehabilitation. MusiPad combines the force-measuring capability of the TouchPad with the engaging, motivational nature of Frets on Fire to encourage patients to keep practicing over a long period. Subjects press individual fingers to reach a certain force threshold at the times specified by notes streamed on the LCD. The force threshold can be adjusted for patients with different strengths and levels of impairment. The results demonstrate the effectiveness of MusiPad in providing controlled levels of challenge during an engaging computer game, as well as its efficacy in quantifying finger individuation in post-stroke or cerebral palsy patients.
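
A toy sketch of the scoring rule described above, under assumptions: the per-finger force readings, thresholds, and timing window below are illustrative values, and the real system reads forces from the TouchPad rather than from a list.

```python
def score_notes(force_samples, notes, window=0.15):
    """Count notes hit by exceeding the finger's force threshold in time.

    force_samples : list of (time_s, finger_id, force_N) readings
                    (assumed to come from the TouchPad driver).
    notes         : list of (time_s, finger_id, threshold_N) from the song chart.
    window        : timing tolerance in seconds (illustrative value).
    """
    hits = 0
    for note_t, finger, threshold in notes:
        hit = any(finger_id == finger and force >= threshold
                  and abs(t - note_t) <= window
                  for t, finger_id, force in force_samples)
        hits += hit
    return hits, len(notes)

# Illustrative data: index finger (0) hits its note, middle finger (1) is too weak.
samples = [(1.00, 0, 2.4), (1.05, 0, 3.1), (2.00, 1, 0.8)]
chart = [(1.0, 0, 2.5), (2.0, 1, 1.5)]
print(score_notes(samples, chart))  # -> (1, 2)
```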

Presentation: UIST2012 (Innovation Contest), Boston
Collaborators: Mingming Fan, Hossein Mousavi (University of California, Irvine)
WhoBot
WhoBot is a shopping application that gives users a real-time news feed of products that their friends have recently purchased or want to recommend to their social group. The application is meant to be a place where people come to talk about shopping only: it has its own social layer, where users can build a network from their contact lists and share their shopping adventures with friends. We hope to create a more reliable and enjoyable shopping experience for our users.

Collaborators: Hossein Mousavi (University of California, Irvine)
Who-is-in
An augmented reality, location-based application that helps users explore indoor spaces. The application has been customized for University of California, Irvine buildings, but the idea could be extended to other university or corporate settings. The goal is to present information about classrooms, theaters, etc., to students, faculty, and staff, using a handheld device with video see-through. Users hold up a smartphone or tablet to capture an image of a room number; once the device identifies the number, the relevant information is retrieved based on features of the building and the room number and overlaid on the screen to tell users about their room of interest.
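
A rough sketch of the recognize-and-overlay step (the deployed app runs on a handheld device; the OCR library choice, the room entries, and the overlay style here are assumptions for illustration):

```python
import cv2
import pytesseract

# Hypothetical room database for a single building (illustrative entries only).
ROOM_INFO = {
    "ICS 174": "Lecture hall, capacity 70, projector available",
    "ICS 243": "Graduate seminar room",
}

def annotate_room(frame):
    """OCR the room-number sign in the frame and overlay its description."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    text = pytesseract.image_to_string(gray).strip()

    for room, info in ROOM_INFO.items():
        if room in text:
            # Overlay the room description on the live camera image.
            cv2.putText(frame, f"{room}: {info}", (10, 30),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
            break
    return frame
```
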
CaloMeter
A context-aware, location-based Android application that promotes self-improvement through awareness of the calories in the foods and beverages served at restaurants on the UCI campus.

Screenshots:

Collaborators: Mingming Fan (University of California, Irvine)
Proteus
A design and simulation tool that runs either as a standalone application or on the web and can be used to model, design, and simulate non-linear dynamical systems.
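
As an example of the kind of model such a tool works with (a generic sketch, not Proteus's actual model format or solver), the snippet below simulates the Van der Pol oscillator, a classic non-linear dynamical system, with SciPy.

```python
import numpy as np
from scipy.integrate import solve_ivp

def van_der_pol(t, state, mu=1.5):
    """Van der Pol oscillator: a standard non-linear dynamical system."""
    x, v = state
    return [v, mu * (1.0 - x**2) * v - x]

# Integrate from an arbitrary initial condition (values are illustrative).
sol = solve_ivp(van_der_pol, t_span=(0.0, 20.0), y0=[2.0, 0.0],
                args=(1.5,), dense_output=True)
t = np.linspace(0.0, 20.0, 500)
x, v = sol.sol(t)
print("final state:", x[-1], v[-1])
```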

Screenshots:


Collaborators: Jasleen Singh, Sam Zhang, Gao Lei, Prof. Yap Fook Fah (Nanyang Technological University)
Instapol
An Audience Response System that is easy to set up and use, built from COTS (commercial, off-the-shelf) components. This wireless student response and voting system enables educators, trainers, and presenters to develop and administer real-time assessments of student participants. It can seamlessly identify and confirm student understanding, increase student attentiveness, and gather, rank, and report critical information while you deliver live, face-to-face events.

Screenshots:

Collaborators: Jasleen Singh, Prof. Yap Fook Fah (Nanyang Technological University)
PDWMS
A Parallel and Distributed Workflow Management System that executes workflows in parallel using a novel partitioning algorithm. Compared to its rivals, it benefits from parallel partitioning, distributed enactment, and peer-to-peer data movement.
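
A simplified sketch of the general idea of partitioning a workflow for parallel execution (not the system's own partitioning algorithm): group the tasks of a workflow DAG into levels such that every task's dependencies lie in earlier levels, so all tasks within a level can run in parallel.

```python
from collections import defaultdict, deque

def level_partition(tasks, deps):
    """Partition a workflow DAG into levels of tasks that can run in parallel.

    tasks : iterable of task names.
    deps  : dict mapping task -> set of tasks it depends on.
    """
    indegree = {t: len(deps.get(t, ())) for t in tasks}
    dependents = defaultdict(list)
    for t, parents in deps.items():
        for p in parents:
            dependents[p].append(t)

    ready = deque(t for t, d in indegree.items() if d == 0)
    levels = []
    while ready:
        level = list(ready)   # all currently runnable tasks form one level
        ready.clear()
        levels.append(level)
        for t in level:
            for child in dependents[t]:
                indegree[child] -= 1
                if indegree[child] == 0:
                    ready.append(child)
    return levels

# Illustrative workflow: B and C depend on A; D depends on B and C.
print(level_partition("ABCD", {"B": {"A"}, "C": {"A"}, "D": {"B", "C"}}))
# -> [['A'], ['B', 'C'], ['D']]
```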

Screenshots:

SMS-Shopper
A mobile application for shopping on the fly. The idea is to let customers do their shopping through their cell phones wherever they are. With the application installed on their phones, customers can see the most up-to-date products from the stores of their choice, ranging from supermarkets to restaurants to bakeries. After they order what they need through SMS, the products are delivered right to their homes.