Research Interests
- Natural Interaction
- Augmented Reality
- Computer Vision
"Augmented reality (AR) is a live, direct or indirect, view of a physical, real-world environment whose elements are augmented by computer-generated sensory input such as sound, video, graphics, or GPS data." It is a fascinating concept that has been around for a number of years, but it has recently gained momentum as smartphones have become powerful enough to meet the processing requirements.
I'd like to push the boundaries of what we know as AR today. The movie Iron Man is a fantastic example of what we should aim to accomplish in the future. It uses augmented reality as part of the Jarvis vision system and demonstrates it well: the system segments the view into the objects it can recognize and overlays information on them. I know this is a futuristic view of what AR could do and how we could hypothetically achieve it, but I believe it is the right direction. We should set aside the input and output media used so far and think about the simplest, most natural way of feeding information to processors and displaying the results back. That is, using gestures as input and the 3D environment around us as output.
Publications and Presentations (Link to Google Scholar)
- Comparing “Pick and Place” Task in Spatial Augmented Reality versus Non-immersive Virtual Reality for Rehabilitation Setting
- M. Khademi, L. Dodakian, S. C. Cramer, and C. V. Lopes, “Comparing ‘Pick and Place’ Task in Spatial Augmented Reality versus Non-immersive Virtual Reality for Rehabilitation Setting,” to appear in the proceedings of the 2013 IEEE EMBC Conference, Osaka, Japan, July 3-7, 2013.
- A Spatial Augmented Reality Rehab System for Post-Stroke Hand Rehabilitation
- H. M. Hondori, M. Khademi, L. Dodakian, S. C. Cramer, and C. V. Lopes, “A Spatial Augmented Reality Rehab System for Post-Stroke Hand Rehabilitation,” appeared in the proceedings of the 2013 Conference on Medicine Meets Virtual Reality (NextMed/MMVR20), San Diego, USA, Feb 20-23, 2013.
- Haptic Augmented Reality to Monitor Human Arm’s Stiffness in Rehabilitation
- M. Khademi, H. M. Hondori, L. Dodakian, S. C. Cramer, and C. V. Lopes, “Haptic Augmented Reality to Monitor Human Arm’s Stiffness in Rehabilitation,” appeared in the proceedings of the 2012 IEEE EMBS Conference on Biomedical Engineering and Sciences (IECBES), Dec 17-19, 2012.
- Use of a Portable Device for Measuring Arm’s Planar Mechanical Impedance during Motion
- H. M. Hondori, M. Khademi, and C. V. Lopes, “Use of a Portable Device for Measuring Arm’s Planar Mechanical Impedance during Motion,” appeared in the proceedings of the 2012 IEEE EMBS Conference on Biomedical Engineering and Sciences (IECBES), Dec 17-19, 2012.
- Optical Illusion in Augmented Reality
- M. Khademi, H. M. Hondori, and C. V. Lopes, “Optical Illusion in Augmented Reality,” appeared in the proceedings of the 18th ACM Symposium on Virtual Reality Software and Technology, Toronto, Canada, Dec 10-12, 2012.
- Monitoring Intake Gestures using Sensor Fusion (Microsoft Kinect and Inertial Sensors) for Smart Home Tele-Rehab Setting
- H. M. Hondori, M. Khademi, and C. V. Lopes, “Monitoring Intake Gestures using Sensor Fusion (Microsoft Kinect and Inertial Sensors) for Smart Home Tele-Rehab Setting,” appeared in the proceedings of the 2012 HIC EMBS Conference on Healthcare Innovation, Houston, TX, USA, Nov 7-9, 2012.
- MusiPad: A Multi-touch Force-sensitive Gaming Pad for Finger Rehabilitation and Treatment in Stroke and Cerebral Palsy
- M. Khademi, M. Fan, H. M. Hondori, and C. V. Lopes, “MusiPad: A Multi-touch Force-sensitive Gaming Pad for Finger Rehabilitation and Treatment in Stroke and Cerebral Palsy,” presented at the Innovation Contest of the 25th ACM Symposium on User Interface Software and Technology (UIST 2012), Boston, USA, Oct 7-10, 2012.
- Real-time Measurement of Arm's Mechanical Impedance with Augmented Reality Illustration
- H. M. Hondori, A. W. Tech, M. Khademi, and C. V. Lopes, “Real-time Measurement of Arm's Mechanical Impedance with Augmented Reality Illustration,” presented at the IEEE-EMBC 2012 Unconference on Robotic Rehabilitation, San Diego, California, USA, Aug 29-Sep 1, 2012.
- Who-is-in: An Augmented Reality Application to Explore Campus Buildings
- M. Khademi, “Who-is-in: An Augmented Reality Application to Explore Campus Buildings,” poster presented at the CRA-W Grad Cohort Workshop, Bellevue, WA, USA, April 2012.
- A Dynamic Web Service Scheduling & Deployment Framework for Grid Workflow
- S. Shahand, S. Turner, C. Wentong, and M. Khademi, “A Dynamic Web Service Scheduling & Deployment Framework for Grid Workflow,” appeared in the proceedings of the 10th International Conference on Computational Science, Amsterdam, June 2010.
- Distributed Execution of Workflow Using Parallel Partitioning
- M. Khademi, C. Wentong, S. Turner, and S. Shahand, “Distributed Execution of Workflow Using Parallel Partitioning,” appeared in the proceedings of the 2009 IEEE International Symposium on Parallel and Distributed Processing, China, Aug 2009.
- Distributed Execution of Workflow
- M. Khademi, “Distributed Execution of Workflow,” Master of Engineering thesis, SCE, NTU, Singapore, Aug 2009.