One of the big topics nowadays in client computing is sensors: from touch screens and GPS devices to webcams, microphones, proximity sensors, accelerometers, and temperature sensors. Now they are coming to the browser, which could radically change the web.
Sensors provide information about the environment around the computer, and can make the user experience more natural (e.g. "pinching" a touch screen) or open up new features such as GPS navigation.
But consider the progress made recently by browsers:
- Apple created a multi-touch browser. Touch events can be handled automatically by Safari or passed through to the web developer. Firefox is not far behind.
- A new W3C standard for geolocation is emerging, with initial support from Apple Safari, Google Gears and Mozilla Firefox.
- Mozilla and Google have started work on camera and microphone support inside the browser.
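To make the touch point concrete, here is a minimal sketch of handling Safari-style multi-touch in page script. The event names (touchstart, touchmove) and the event.touches list are Safari's touch API; the pinch-distance helper and the scale logging are our own illustration, not part of any spec.

```javascript
// Distance in pixels between the first two touch points of a gesture.
function touchDistance(touches) {
  const dx = touches[1].pageX - touches[0].pageX;
  const dy = touches[1].pageY - touches[0].pageY;
  return Math.sqrt(dx * dx + dy * dy);
}

// Wire up a simple pinch detector, but only in a touch-capable browser.
if (typeof document !== 'undefined' && 'ontouchstart' in document) {
  let startDistance = 0;
  document.addEventListener('touchstart', function (e) {
    // A pinch begins when two fingers land on the screen.
    if (e.touches.length === 2) startDistance = touchDistance(e.touches);
  });
  document.addEventListener('touchmove', function (e) {
    if (e.touches.length === 2 && startDistance > 0) {
      // Ratio of current finger spread to the starting spread.
      const scale = touchDistance(e.touches) / startDistance;
      console.log('pinch scale: ' + scale.toFixed(2));
    }
  });
}
```

The same events would let a page implement its own gestures rather than relying on the browser's built-in pinch-to-zoom.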
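And the geolocation piece is already simple to try where it is supported. Under the draft W3C API, navigator.geolocation.getCurrentPosition takes success and error callbacks, and the success callback receives a position whose coords carry latitude and longitude; the describePosition helper below is just our formatting sketch around that.

```javascript
// Format a W3C geolocation position object into a human-readable string.
function describePosition(position) {
  const c = position.coords;
  return 'You are at ' + c.latitude.toFixed(4) + ', ' + c.longitude.toFixed(4);
}

// Ask the browser for the user's location, if the API is available.
if (typeof navigator !== 'undefined' && navigator.geolocation) {
  navigator.geolocation.getCurrentPosition(
    function (position) { console.log(describePosition(position)); },
    function (error) { console.log('Location unavailable: ' + error.message); }
  );
}
```

Browsers that implement the draft prompt the user for permission before handing coordinates to the page, which is exactly the kind of security question sensors will keep raising.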
Even without an overarching sensor framework, support for geolocation, audio and video should have an incredible effect on the web:
- Accessibility: dictate your search to Google, via a microphone
- Games: play human Pac-Man by watching on a map where your friends are in the city
- Communication: Make phone calls using your browser
- Market share: yet more reasons to use the web stack, rather than a client stack
- Security: we'll need strong security to prevent people from hacking into your microphone