New Delhi: A team of researchers, including one of Indian origin, has developed a new tool that lets users operate a smartphone by combining gaze control with simple hand gestures.
While smartphones have grown to accommodate the bigger screens and higher processing power that more demanding activities require, they frequently need a second hand or voice commands to operate.
The new tool, called EyeMU, was developed by researchers at Carnegie Mellon University in the US. It shows how gaze estimation using a phone’s user-facing camera can be paired with motion gestures to enable rapid interaction on handheld phones.
“Current phones only respond when we ask them for things, whether by speech, taps or button clicks,” said Andy Kong, a senior majoring in computer science at the university.
“Imagine how much more useful it would be if we could predict what the user wanted by analysing gaze or other biometrics,” Kong added.
The first step was a programme that used a laptop’s built-in camera to track the user’s eyes and, in turn, move the cursor around the screen.
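To give a sense of what such a laptop prototype might look like, here is a minimal, hypothetical sketch in Python. It is not the researchers’ code: it uses MediaPipe Face Mesh (which exposes iris landmarks when refinement is enabled) to find the eye in the webcam frame and moves the OS cursor with pyautogui. The direct mapping from iris position to screen coordinates is a simplifying assumption.

```python
# Hypothetical sketch of a webcam gaze-to-cursor prototype; not EyeMU's code.
import cv2
import mediapipe as mp
import pyautogui

pyautogui.FAILSAFE = False  # allow the cursor into screen corners for this demo
face_mesh = mp.solutions.face_mesh.FaceMesh(refine_landmarks=True)  # enables iris landmarks
screen_w, screen_h = pyautogui.size()

cap = cv2.VideoCapture(0)  # laptop's built-in camera
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        # Landmark 468 is an iris-centre point in MediaPipe's refined mesh.
        iris = results.multi_face_landmarks[0].landmark[468]
        # Crude assumption: map the normalised iris position straight to screen space.
        pyautogui.moveTo(int(iris.x * screen_w), int(iris.y * screen_h))
    if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
        break
cap.release()
```

A real system would calibrate per user and compensate for head movement; this sketch only illustrates the camera-to-cursor loop.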
“We asked the question, ‘Is there a more natural mechanism to use to interact with the phone?’ And the precursor for a lot of what we do is to look at something,” said Karan Ahuja, a doctoral student in human-computer interaction.
Kong and Ahuja advanced that early prototype by using Google’s Face Mesh tool to study the gaze patterns of users looking at different areas of the screen and render the mapping data.
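The kind of mapping described, from face landmarks to areas of the screen, can be illustrated with a simple calibration sketch: record Face Mesh landmarks while the user fixates known on-screen points, then fit a regressor from landmarks to screen coordinates. The feature choice and the ridge regression below are illustrative assumptions, not EyeMU’s actual pipeline.

```python
# Hypothetical calibration step: learn a landmarks -> screen-position mapping.
import numpy as np
from sklearn.linear_model import Ridge

def landmarks_to_features(face_landmarks):
    """Flatten the (x, y) of every mesh landmark into one feature vector."""
    return np.array([c for lm in face_landmarks.landmark for c in (lm.x, lm.y)])

def fit_gaze_model(calibration_samples):
    """calibration_samples: list of (face_landmarks, (target_x, target_y)) pairs
    gathered while the user looked at dots shown at known screen positions."""
    X = np.stack([landmarks_to_features(lms) for lms, _ in calibration_samples])
    y = np.array([target for _, target in calibration_samples])
    model = Ridge(alpha=1.0)  # regularised linear map: landmarks -> (x, y)
    model.fit(X, y)
    return model

def predict_gaze(model, face_landmarks):
    """Return the estimated (x, y) screen position for a new frame."""
    return model.predict(landmarks_to_features(face_landmarks)[None, :])[0]
```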
Next, the team developed a gaze predictor that uses the smartphone’s front-facing camera to lock in what the viewer is looking at and register it as the target.
They made the tool more productive by combining the gaze predictor with the smartphone’s built-in motion sensors to enable commands.
For example, a user could look at a notification long enough to secure it as a target and flick the phone to the left to dismiss it or to the right to respond to the notification.
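A hedged sketch of that two-modality interaction follows: a notification becomes the locked target after a sustained gaze (a dwell), and a subsequent left or right flick, read from the accelerometer, dismisses it or opens a reply. The thresholds and method names here are illustrative assumptions, not EyeMU’s actual parameters.

```python
# Hypothetical gaze-dwell + flick controller; thresholds are illustrative.
import time

DWELL_SECONDS = 0.8      # how long a gaze must rest on the notification to lock it
FLICK_THRESHOLD = 12.0   # lateral acceleration (m/s^2) counted as a flick

class GazeFlickController:
    def __init__(self):
        self.gaze_start = None
        self.target_locked = False

    def on_gaze(self, gaze_on_notification: bool):
        """Call each frame with whether the gaze predictor hits the notification."""
        if gaze_on_notification:
            self.gaze_start = self.gaze_start or time.monotonic()
            if time.monotonic() - self.gaze_start >= DWELL_SECONDS:
                self.target_locked = True
        else:
            self.gaze_start = None
            self.target_locked = False

    def on_accelerometer(self, lateral_accel: float):
        """Call with the phone's lateral acceleration; positive means rightward."""
        if not self.target_locked:
            return None
        if lateral_accel <= -FLICK_THRESHOLD:
            self.target_locked = False
            return "dismiss"   # flick left: dismiss the notification
        if lateral_accel >= FLICK_THRESHOLD:
            self.target_locked = False
            return "respond"   # flick right: open a reply
        return None
```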
“The real innovation in this project is the addition of a second modality, such as flicking the phone left or right, combined with gaze prediction. That’s what makes it powerful. It seems so obvious in retrospect, but it’s a clever idea that makes EyeMU much more intuitive,” said Chris Harrison, an associate professor at the varsity.
(With inputs from IANS)