Minput, a device designed to solve the problem of limited space and visibility on small devices and screens, is being developed by Chris Harrison and Scott Hudson, both of Carnegie Mellon’s Human-Computer Interaction Institute. Harrison has already been featured in The Tartan for his work on Skinput, which turns the arm into a touchpad.
When a user slides Minput over a surface, sensors on its underside register the movement and control software on the device. In this respect, Minput is a mouse with the computer screen attached, and it can be manipulated without the user’s fingers blocking the screen.
Minput allows mouse control and optical tracking to be applied to small mobile devices.
According to a demonstration in Harrison’s video, Minput can be operated anywhere a user could plausibly use the device, including on a table, on the leg of one’s pants, and on a user’s palm. Because there are two sensors on the device, it can detect a variety of motions, including being twisted on a table, and according to www.chrisharrison.com, the optical sensors consume negligible power.

The video shows that Minput supports three “input modalities,” or ways the user can manipulate the device. The gestural input modality involves motions like flicks and twists and was demonstrated with photo-browsing software: flicking the device in a certain direction changes the image, while twisting changes the number of images on the screen, like a zooming function. These inputs can be remapped to many functions in different programs, letting software change settings without making the user navigate complex menus.
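The two-sensor idea can be illustrated with a short sketch. This is a hypothetical reconstruction, not Minput’s actual code: it assumes each optical sensor reports a small (dx, dy) displacement per frame, with sensor 1 mounted left of sensor 2, and classifies the pair of readings as a flick (both sensors move together) or a twist (the sensors move in opposing directions).

```python
import math

def classify_gesture(d1, d2, threshold=2.0):
    """Classify motion from two optical sensors on the device's underside.

    d1, d2 -- (dx, dy) displacement vectors from each sensor (sensor 1
    assumed left of sensor 2; all names here are illustrative).
    Returns "idle", ("flick", (dx, dy)), or ("twist", "cw"/"ccw").
    """
    if max(math.hypot(*d1), math.hypot(*d2)) < threshold:
        return "idle"  # too small to count as a deliberate gesture
    # If both sensors move the same way, the whole device translated.
    dot = d1[0] * d2[0] + d1[1] * d2[1]
    if dot > 0:
        avg = ((d1[0] + d2[0]) / 2, (d1[1] + d2[1]) / 2)
        return ("flick", avg)
    # Opposing motion means the device pivoted about its center.
    # With screen coordinates (y grows downward), a clockwise twist
    # moves the left sensor up and the right sensor down.
    return ("twist", "cw" if d1[1] < d2[1] else "ccw")

# A photo browser could then advance images on "flick" and zoom on "twist".
print(classify_gesture((5, 0), (5, 0)))   # both sensors slide right
print(classify_gesture((0, -5), (0, 5)))  # sensors move oppositely
```

The dot product is what separates the two cases: parallel sensor motion gives a positive product (translation), while antiparallel motion from a pivot gives a negative one.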