Contact: Emil Venere
venere@purdue.edu
765-494-4709
Purdue University
WEST LAFAYETTE, Ind. - People can let their fingers - and hands - do the talking with a new touch-activated system that projects onto walls and other surfaces and allows users to interact with their environment and each other.
The system identifies which of a person's fingers are touching any plain surface. It also recognizes hand posture and gestures, and can tell individual users apart by the distinctive traits of their hands.
"Imagine having giant iPads everywhere, on any wall in your house or office, every kitchen counter, without using expensive technology," said Niklas Elmqvist, an assistant professor of electrical and computer engineering at Purdue University. "You can use any surface, even a dumb physical surface like wood. You don't need to install expensive LED displays and touch-sensitive screens."
The new "extended multitouch" system allows more than one person to use a surface at the same time and also enables people to use both hands, distinguishing between the right and left hand.
Research indicates the system is 98 percent accurate in determining hand posture, which is critical to recognizing gestures and carrying out commands. The technology has many possible applications, said Karthik Ramani, Purdue's Donald W. Feddersen Professor of Mechanical Engineering.
"Basically, it might be used for any interior surface to interact virtually with a computer," he said. "You could use it for living environments, to turn appliances on, in a design studio to work on a concept or in a laboratory, where a student and instructor interact."
Findings are detailed in a research paper being presented this week during the Association for Computing Machinery Symposium on User Interface Software and Technology (ACM UIST 2012) in Cambridge, Mass. The paper was written by doctoral students Sundar Murugappan and Vinayak (who uses only one name), together with Elmqvist and Ramani.
The system uses the Microsoft Kinect camera, which senses three-dimensional space.
"We project a computer screen on any surface, just a normal table covered with white paper," Ramani said. "The camera sees where your hands are, which fingers you are pressing on the surface, tracks hand gestures and recognizes whether there is more than one person working at the same time." The Kinect camera senses depth, making it possible to see how far each 3-D pixel is from the camera. The researchers married the camera with a new computer model for the hand.
"So we can isolate different parts of a hand or finger to show how far they are from the surface," Elmqvist said. "We can see which fingers are touching the surface. With this technology, you could potentially call up a menu by positioning your hand just above the surface." That camera coupled with the hand model allows the system to locate the center of each hand, which is necessary for determining gestures and distinguishing between left and right hands.
Researchers explored possible applications, including one that allows the user to draw a sketch with a pen and then modify it with their hands.
"We can detect gestural interactions between more than one hand and more than one user," Ramani said. "You could do precision things, like writing with a pen, with your dominant hand and more general things, such as selecting colors, using the non-dominant hand."
Researchers tested the concept in two user studies, one with 14 volunteers and the other with nine. Findings from one study indicated display features should be no smaller than 18 millimeters, or roughly seven-tenths of an inch, to be used efficiently.
"While new and more precise cameras will improve accuracy, we have established the necessary hand models and principles for the system," Ramani said.
The other user study showed the system can effectively determine hand posture and whether the right or left hand is being used.
"We wanted to see how accurate we could be while figuring out different configurations, such as touching with all 10 fingers, which hand is being used and so on," Elmqvist said.
That study indicated the system was 98 percent accurate in determining hand posture.
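The release does not describe the handedness classifier itself; one plausible geometric heuristic, shown below purely as an illustration, exploits the fact that viewed from above with the palm down, the thumb lies counterclockwise from the middle finger on a right hand and clockwise on a left hand, which the sign of a 2-D cross product captures.

```python
def handedness(palm, thumb, middle):
    """Guess 'right' or 'left' from three 2-D points (x right, y up,
    overhead camera, palm down, fingers pointing away from the user).
    On a right hand the thumb lies counterclockwise from the middle
    finger, making this 2-D cross product positive."""
    mx, my = middle[0] - palm[0], middle[1] - palm[1]
    tx, ty = thumb[0] - palm[0], thumb[1] - palm[1]
    return 'right' if mx * ty - my * tx > 0 else 'left'

# Example: palm at origin, middle finger straight ahead, thumb out to
# the left -- the geometry of a right hand.
print(handedness((0, 0), (-1.0, 0.3), (0, 1.0)))  # -> 'right'
```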
Patents are pending on the concept.
###
The research has been supported by the National Science Foundation and the Donald W. Feddersen Chaired Professorship at Purdue's School of Mechanical Engineering. (A video about the system can be viewed at https://Engineering.Purdue.edu/cdesign/wp/?p=1423.)
Writer: Emil Venere, 765-494-3470, venere@purdue.edu
Sources: Niklas Elmqvist, 765-494-0364, elm@purdue.edu
Karthik Ramani, 765-494-5725, ramani@purdue.edu
Related websites:
Niklas Elmqvist: http://engineering.purdue.edu/~elm/
Karthik Ramani: https://engineering.purdue.edu/~ramani
IMAGE CAPTION:
This composite image shows how fingers and hands are detected and modeled in a new touch-activated system that projects onto walls and other surfaces and allows people to interact with their environment and each other. (Purdue University image)
A publication-quality image is available at http://news.uns.purdue.edu/images/2012/elmqvist-multitouch.jpg
ABSTRACT
Extended Multitouch: Recovering Touch Posture and Differentiating Users using a Depth Camera
Sundar Murugappan (1), Vinayak (1), Niklas Elmqvist (2), Karthik Ramani (1,2)
(1) School of Mechanical Engineering, (2) School of Electrical and Computer Engineering
Purdue University
Multitouch surfaces are becoming prevalent, but most existing technologies are only capable of detecting the user's actual points of contact on the surface and not the identity, posture, and handedness of the user. In this paper, we define the concept of extended multitouch interaction as a richer input modality that includes all of this information. We further present a practical solution to achieve this on tabletop displays based on mounting a single commodity depth camera above a horizontal surface. This will enable us to not only detect when the surface is being touched, but also recover the user's exact finger and hand posture, as well as distinguish between different users and their handedness. We validate our approach using two user studies, and deploy the technique in a scratchpad tool and in a pen + touch sketch tool.