


G+_Mike Tsirigotis

I'll just toss this out here and see what kinds of responses I get... I really think Google should look at extending the UI of Android, and making it adaptable to various environments.

 

For example, the OS UI already responds to environmental conditions reported to Android by the hardware. The best example of this is the Portrait and Landscape orientations: the OS UI is different depending on how you are holding the device. Likewise, this information is passed to Apps, and developers can choose to give the user a different UI based on orientation. While in 90% of cases this different UI appears to the user as just a rotation and reorganization of the screen, from a developer's point of view it's a completely different UI that must be programmed into the app. For Apps that are "universal" between phones and tablets, the UIs for each of those platforms (and orientations) are little more than embedded options in the App which are called up based on configuration.
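The configuration-driven lookup described here can be sketched in plain Java. This is an illustrative model only, not the Android resource framework (in a real app, the qualifier-matched directories `res/layout/` and `res/layout-land/` do this job automatically):

```java
import java.util.Map;

// Illustrative model (plain Java, not the Android SDK) of a qualifier-based
// layout lookup: one logical screen, several embedded UIs, and the active
// configuration deciding which one gets used.
public class LayoutSelector {
    public enum Orientation { PORTRAIT, LANDSCAPE }

    // Each value stands in for a view hierarchy the app ships with.
    private static final Map<Orientation, String> LAYOUTS = Map.of(
            Orientation.PORTRAIT,  "layout/main.xml",
            Orientation.LANDSCAPE, "layout-land/main.xml");

    public static String pickLayout(Orientation current) {
        return LAYOUTS.get(current);
    }

    public static void main(String[] args) {
        // The same app presents a different UI purely because the
        // environment (orientation) changed.
        System.out.println(pickLayout(Orientation.PORTRAIT));   // layout/main.xml
        System.out.println(pickLayout(Orientation.LANDSCAPE));  // layout-land/main.xml
    }
}
```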

 

If we take this a step further, it seems reasonable for Android to report other environmental conditions to the OS and Apps, so that similarly adaptive UIs can be presented.

 

Specifically: docking an Android phone with a full-size monitor, keyboard, and mouse, and using your phone like a desktop. Android already supports USB keyboards, mice, and storage (thumb drives), and it supports HID devices over Bluetooth. The Galaxy Nexus had an HDMI dock which seemed well suited to this purpose, and Asus's line of Transformer tablets, with their keyboard and trackpad docks, are ready-made laptop replacements. There is also a string of smaller Android manufacturers who make hardware that connects to a primary display over HDMI, and we are even seeing startups raising money to build "Android" desktops.

 

Why not? My Android phone has as much power, RAM, and storage as desktops of just 5 years ago, and every year that gap gets narrower. The problem is that while Android is a fantastic "touch" OS, this method of input does not lend itself well to the workaday world of data entry and creation. These sorts of jobs require full-sized keyboards, big (touch-less) screens, and precision pointing.

 

All the connectivity and HID inputs are there, and the OS is mostly there. It just needs a bit of tweaking to make it fully HID friendly: things like "right-click" context menus, click-and-drag selection, and mouse-wheel rather than swipe scrolling. Of course these UI elements might interfere with normal touch operation, which is why the OS would only turn them on when the device is docked, when a user connects a USB or Bluetooth HID, or when the user simply turns the functions on. Again, as with orientation, App developers can choose to make their Apps more HID friendly as an additional UI option, or they can simply leave them touch only.
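The conditional switch-on could look something like the following plain-Java sketch. The class and method names here are hypothetical, not Android APIs (Android's real hook for dock changes is the `Intent.ACTION_DOCK_EVENT` broadcast); this only models the decision logic:

```java
import java.util.EnumSet;
import java.util.Set;

// Hypothetical policy class (plain Java, not an Android API): the
// pointer-centric behaviours switch on only when the device is docked,
// a HID peripheral connects, or the user opts in manually.
public class HidUiPolicy {
    public enum Feature { RIGHT_CLICK_MENU, DRAG_SELECT, WHEEL_SCROLL }

    public static Set<Feature> enabledFeatures(boolean docked,
                                               boolean hidConnected,
                                               boolean userOptIn) {
        if (docked || hidConnected || userOptIn) {
            return EnumSet.allOf(Feature.class);   // full desktop-style input
        }
        return EnumSet.noneOf(Feature.class);      // pure touch, unchanged
    }

    public static void main(String[] args) {
        // Undocked, no HID, no opt-in: nothing interferes with touch.
        System.out.println(enabledFeatures(false, false, false).isEmpty()); // true
        // Docked: all three desktop behaviours come on.
        System.out.println(enabledFeatures(true, false, false).size());     // 3
    }
}
```

Because the features are off by default, the normal touch experience is untouched until an environmental trigger or an explicit user choice flips them on.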

 

 

Taking it a step further, Google could define a few discrete environmental conditions that they will support and pass to App developers. These could be things like a "lean back" TV experience, where the UI is displayed on a larger screen and navigation is done via a d-pad or by swiping the device's screen without looking at it.

 

Another environmental condition could be exporting the display to a screen on or in the dashboard of a car. This could give drivers access to navigation and media playback (like Google Music or Pandora) via a simplified, big-icon touch interface. If such connections are wireless, then the driver could have access to all the power, connectivity, and information of her Android phone right on the car's console while her hands stay on the wheel and the phone stays in her pocket.
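These discrete environments could be delivered to developers as a simple mode value, much as Android's `Configuration.uiMode` already distinguishes constants like `UI_MODE_TYPE_TELEVISION` and `UI_MODE_TYPE_CAR`. A plain-Java sketch of the dispatch, with illustrative names rather than actual API calls:

```java
// Illustrative dispatch over discrete environmental conditions (the names
// here are hypothetical; Android's Configuration.uiMode carries comparable
// mode constants that apps can already query).
public class EnvironmentUi {
    public enum Environment { HANDHELD, DESKTOP, TELEVISION, CAR }

    public static String navigationStyle(Environment env) {
        switch (env) {
            case TELEVISION: return "lean-back: d-pad or blind swipe on the handset";
            case CAR:        return "big-icon touch on the dash console";
            case DESKTOP:    return "keyboard, mouse, right-click menus";
            default:         return "direct touch";
        }
    }

    public static void main(String[] args) {
        // One app, four UIs, selected by the reported environment.
        for (Environment e : Environment.values()) {
            System.out.println(e + " -> " + navigationStyle(e));
        }
    }
}
```

As with orientation today, an app that doesn't handle a given mode would simply fall back to its default touch UI.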

 

Just some thoughts.....

 

 
