Supporting sensors in Windows 8

Recent advances in sensor technology are catalysts for the acceleration and evolution of user experiences on PCs. The ability to react to changes in ambient light, motion, human proximity, and location is becoming a common and essential element of the computing experience. Even something simple, like an ambient light sensor that adjusts display brightness in a room with changing light, is potentially a basic scenario for desktop PCs. Of course, we also want to make sure you have full control over the use of these peripherals, since we know that different sensors open up opportunities for risk or abuse that some folks might not be comfortable with. This post looks at the details of supporting sensors in Windows 8 and was authored by Gavin Gear, a PM on the Device Connectivity team.
--Steven
<hr />
The first thing we explored about sensors was how Windows 8 should use them at the system level, to adapt the PC to the environment while preserving battery life.

[size=5]Adaptive brightness[/size]

The first system feature was automatic display brightness control, or what we call “adaptive brightness.” This is a feature we first introduced in Windows 7 using ambient light sensors (ALS), and it is targeted at mobile form factors like slates, convertibles, and laptops. With today’s display panels supporting brightness levels at approximately twice the intensity of what was common just a few years ago, this feature is more important than ever. By dynamically controlling screen brightness based on changing ambient light conditions, we can optimize the level of reading comfort, and save battery life when the screen is dimmed in darker environments.

[i]A tablet PC in harsh outdoor lighting with adaptive brightness (left), and without (right)[/i]

You can see here that adaptive brightness helps you see content on the screen more clearly, since the screen automatically gets brighter when the tablet enters a bright environment. And for those of you who use your desktop PCs in a sunny room, you know this same thing can happen at different times of the day in different seasons.

[size=5]Automatic screen rotation[/size]

Many smartphones and other mobile devices have established the expectation that when you rotate the device, the graphic display will also rotate and adapt to the new orientation (including adapting to aspect ratio changes). Data from an accelerometer allows the device to determine its basic orientation. By automatically rotating the screen, people can use their devices (primarily slates and convertibles) in a more natural and intuitive way, without needing to manually rotate the screen with software controls or hardware buttons.

[img]http://blogs.msdn.com/cfs-file.ashx/__key/communityserver-blogs-components-weblogfiles/00-00-01-29-43-metablogapi/3348.Samsung_5F00_Device_5F00_Landscape_5F00_500_5F00_3C093380.jpg[/img] [img]http://blogs.msdn.com/cfs-file.ashx/__key/communityserver-blogs-components-weblogfiles/00-00-01-29-43-metablogapi/2604.Samsung_5F00_Device_5F00_Portrait_5F00_500_5F00_46C68AD5.jpg[/img]

[i]Windows 8 Start screen in landscape and portrait orientations[/i]
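To make the idea concrete, here is a minimal sketch (not the algorithm Windows actually uses) of how raw accelerometer data can be turned into a coarse orientation: whichever screen axis the gravity vector is mostly aligned with determines the posture. The axis and sign conventions below are placeholders, since they vary by device.

[code]
// Illustrative only: derive a coarse screen orientation from the
// accelerometer's gravity vector (values in g). A real implementation would
// also ignore near-flat postures, where gravity falls mostly on the Z axis.
function basicOrientation(accelX, accelY) {
    if (Math.abs(accelX) > Math.abs(accelY)) {
        return (accelX > 0) ? "landscape" : "landscape-flipped";
    }
    return (accelY > 0) ? "portrait-flipped" : "portrait";
}
[/code]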
[size=5]Developer support for sensors[/size]

Beyond figuring out the basics of how a Windows 8 system might use sensors, we also needed to think about how apps might use sensors. We looked at a variety of examples of sensor-enabled apps, including games, commercial applications, tools, and utilities, to help us determine which scenarios to support.

First on the list was the ability for apps to understand motion and screen rotation. This requires an accelerometer: a device that can be used to measure the force due to gravity, and the motion of the device itself. But most scenarios require more than just an understanding of motion and gravity. Orientation is also an important requirement for many applications. To enable a PC to understand orientation, we needed to integrate the functionality of a compass. Supporting a compass would at minimum require a 3D accelerometer (which measures acceleration on three axes) and a 3D magnetometer (which measures magnetic field strength on three axes). This combination of sensors is called a 6-axis motion and orientation sensing system, and it can support a basic tilt-compensated compass, screen rotation, and certain casual game apps, like a labyrinth-style game.

However, in our testing and prototyping, we found the 6-axis motion sensing system has two key drawbacks: sporadic compass inaccuracy, and a lack of the responsiveness required by 3D interactive games. Recently, a new type of sensor has started to emerge on phone platforms: the gyro sensor. Gyro sensors measure angular speed, typically along three axes. You can also use the data from gyro sensors to increase the responsiveness and accuracy of 3D motion-sensing systems. A gyro sensor is very sensitive, but it lacks any form of orientation reference (such as gravity or north heading). This diagram shows how gyro data is represented as a set of three rotations along the three primary axes of the device:

[img]http://blogs.msdn.com/cfs-file.ashx/__key/communityserver-blogs-components-weblogfiles/00-00-01-29-43-metablogapi/0525.Yaw_2D00_Pitch_2D00_Roll_2D00_Slate_2D00_Win8_5F00_7B66E710.png[/img]

[i]Yaw is rotation about the Z axis, pitch about the X axis, and roll about the Y axis[/i]

Initially, some thought that the need for such sensors was scoped to very few apps, such as specialized games. But the more we examined the 3D motion and orientation sensing problem, the more we realized that applications are much more immersive and attractive if they react to the kinds of motion humans can easily understand, such as shakes, twists, and rotations in multiple dimensions. With these kinds of sensors it would certainly be possible to build very immersive 3D games, but they would also enable lots of other apps to respond more naturally to a variety of motions, including mapping and navigation applications, measuring utilities, interactive (between two machines) applications, and simple apps like casual games.
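That missing orientation reference is worth dwelling on. Because a gyro reports only angular speed, an app has to integrate its readings over time to obtain an angle, and any small bias in the readings accumulates. A minimal sketch of this limitation, with hypothetical names:

[code]
// Illustrative only: integrating gyro angular speed over time yields a
// relative angle, but with no absolute reference (gravity, north heading),
// sensor bias accumulates as drift.
var relativeYawDegrees = 0;
function onGyroSample(angularVelocityZDegPerSec, deltaTimeSeconds) {
    relativeYawDegrees += angularVelocityZDegPerSec * deltaTimeSeconds;
    return relativeYawDegrees; // drifts slowly even when the device is still
}
[/code]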
[size=5]Engineering challenges[/size]

We started our exploration into motion apps by prototyping some 3D experiences. The first challenge was to map the physical orientation of the device directly to a virtual 3D environment in the app. We decided to model a simple augmented reality experience by emulating a tablet as a window into a virtual world. The concept was fairly simple: when you move the device while looking at the screen, the virtual environment (the inside of a room) should appear to stay stationary.

Initially, we tried an experiment using the accelerometer to map up-and-down movement of the device to up-and-down movement of the 3D environment. When you hold the device still, the scene should remain stable. When you tilt the device, the view should tilt up or down. Right away we encountered an issue: “noise” in the data from the accelerometer was causing jittery movement of the 3D environment even when the device was held stationary. We were able to see this noise clearly by capturing accelerometer data and charting it.

[img]http://blogs.msdn.com/cfs-file.ashx/__key/communityserver-blogs-components-weblogfiles/00-00-01-29-43-metablogapi/8623.Raw_2D00_accelerometer_2D00_data_2D002D002D00_stationary_2D00_device_5F00_3411911E.jpg[/img]

[i]Raw accelerometer data from a stationary device: the X and Y traces jitter near 0 g, and the Z trace jitters near -1 g[/i]

Without noise, the lines on the chart would be straight, with no vertical deviation. The conventional way to remove such noise is to apply a low-pass filter to the raw data stream. When we implemented this mitigation in our prototype, the resulting motion was smooth and stable (jitter-free). But the low-pass filter introduced another problem: the app lost responsiveness and felt sluggish when responding to motion. We needed a way to compensate for this jitter without reducing responsiveness.
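For the curious, a first-order low-pass filter of the kind we applied can be just a few lines. This is a generic sketch, with a purely illustrative smoothing factor, and it shows exactly where the sluggishness comes from:

[code]
// Illustrative only: first-order low-pass filtering (exponential smoothing)
// of a raw sensor stream, one filter instance per axis. A smaller alpha
// removes more jitter but makes the output lag further behind real motion,
// which is the sluggishness described above.
function makeLowPass(alpha) {
    var state = null;
    return function (sample) {
        state = (state === null) ? sample : state + alpha * (sample - state);
        return state;
    };
}

var smoothX = makeLowPass(0.1);
var smoothY = makeLowPass(0.1);
var smoothZ = makeLowPass(0.1);
[/code]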
The next experiment was to provide the ability to “look left” and “look right” in our virtual 3D environment app. We used a 6-axis compass solution (3D accelerometer + 3D magnetometer) to support this type of movement. Although this [i]kind of[/i] worked, the movement was not consistent due to the general instability of the 6-axis compass. It was also challenging to blend the up-and-down movement with the left-and-right movement. From these experiments it was clear that this combination of sensors could not provide the fluid and responsive experience we wanted. The accelerometer was not providing clean data, and could not be used alone to determine device orientation. The magnetometer was slow to update and was susceptible to electromagnetic interference (think of a compass needle that sticks in one position occasionally). We had yet to experiment with the gyro sensors, but because gyros can only determine rotational speed, it wasn’t clear how they could help.

[size=5]Creating “sensor fusion”[/size]

Further experimentation demonstrated that using all three sensors together could solve the problem. It turns out that an accelerometer, a magnetometer, and a gyro can complement each other’s weaknesses, effectively filling in gaps in data and data responsiveness. Using a combination of these sensors, it is possible to create a better, more responsive, and more fluid experience than the sensors can provide individually. Combining the input of multiple sensors to produce better overall results is a process we call sensor fusion. Essentially, sensor fusion is a case where the whole is greater than the sum of the parts.

A typical sensor fusion system uses a 3D accelerometer, a 3D magnetometer, and a 3D gyro to create a combined “9-axis sensor fusion” system. To understand how this system works, let’s take a look at the inputs and outputs.

[img]http://blogs.msdn.com/cfs-file.ashx/__key/communityserver-blogs-components-weblogfiles/00-00-01-29-43-metablogapi/6433.Hardware_2D00_Sensor_2D00_Outputs_5F00_0AA25FC3.jpg[/img]

[i]9-axis sensor fusion system: hardware (3D accelerometer, 3D gyrometer, 3D magnetometer) feeds both a pass-through path and a sensor fusion path, producing the sensor outputs (3D accelerometer, 3D gyrometer, 3D compass, 3D inclinometer, device orientation)[/i]

This diagram shows two types of outputs: pass-through outputs, in which the sensor data is passed directly to an application, and sensor fusion outputs, in which the sensor data is synthesized into more powerful data types. Some applications can use pass-through sensor data directly, at “face value,” for a variety of scenarios. One such scenario is an app that implements a pedometer to count your steps as you walk. The graph below shows the output of the accelerometer for a person walking with a tablet PC. It clearly shows that it is possible to detect every step the person took.

[img]http://blogs.msdn.com/cfs-file.ashx/__key/communityserver-blogs-components-weblogfiles/00-00-01-29-43-metablogapi/1185.Raw_2D00_accelerometer_2D00_data_2D002D002D00_user_2D00_walking_2D00_with_2D00_device_5F00_215C9767.jpg[/img]

[i]Raw accelerometer data for a user walking with the device: all three axes show the regular variation produced by each step[/i]
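A naive step detector of this sort can work directly on the pass-through data by watching for spikes in the magnitude of acceleration. A sketch under that assumption (the threshold is a placeholder, and real pedometers are considerably more robust):

[code]
// Illustrative only: at rest, the magnitude of acceleration is about 1 g
// (gravity); each step shows up as a spike above that.
var stepCount = 0;
var wasAboveThreshold = false;
function onAccelerometerSample(x, y, z) {
    var magnitude = Math.sqrt(x * x + y * y + z * z);
    var isAbove = magnitude > 1.2; // placeholder threshold, in g
    if (isAbove && !wasAboveThreshold) {
        stepCount++; // count only the rising edge of each spike
    }
    wasAboveThreshold = isAbove;
}
[/code]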
But, as our experiments revealed, many applications can’t effectively use the raw sensor data. Some of these applications include:

[list]
[*]Compass apps
[*]Enhanced navigation and augmented reality apps
[*]Casual games
[*]3D gaming apps
[/list]

Here’s a screenshot from a 3D game sample:

[img]http://blogs.msdn.com/cfs-file.ashx/__key/communityserver-blogs-components-weblogfiles/00-00-01-29-43-metablogapi/2235.MySimple3DGameScreenshot_5F00_112DA328.jpg[/img]

[i]3D first-person shooter game (shown at //build/)[/i]

These applications need sensor fusion data in order to support the features they implement. The “magic” of sensor fusion is to mathematically combine the data from all three sensors to produce more sophisticated outputs, including a tilt-compensated compass, an inclinometer (exposing yaw, pitch, and roll), and more advanced representations of device orientation. With this kind of data, more sophisticated apps can produce fast, fluid, and responsive reactions to natural motions. By integrating a sensor fusion solution, Windows 8 provides a complete solution for the full range of applications. Sensor fusion in Windows solves the problems of jittery movement and jerky transitions, reduces data integrity issues, and provides data that allows a seamless representation of full device motion in 3D space (without any awkward transitions).
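To give a feel for how complementary sensors fill in each other’s gaps, here is a sketch of a complementary filter, one of the simplest fusion schemes. It conveys the principle only, and is not the algorithm Windows 8 uses: the gyro supplies responsive short-term changes, while a pitch estimate derived from the accelerometer’s gravity vector slowly corrects the gyro’s drift.

[code]
// Illustrative only: blend a fast-but-drifting gyro integral with a
// noisy-but-absolute gravity-based pitch estimate.
var fusedPitchDegrees = 0;
function fusePitch(gyroRateXDegPerSec, accelPitchDegrees, deltaTimeSeconds) {
    var gyroWeight = 0.98; // trust the gyro short-term, gravity long-term
    fusedPitchDegrees =
        gyroWeight * (fusedPitchDegrees + gyroRateXDegPerSec * deltaTimeSeconds) +
        (1 - gyroWeight) * accelPitchDegrees;
    return fusedPitchDegrees;
}
[/code]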
[size=5]Working with hardware partners[/size]

While designing a sensor fusion solution for Windows, we also needed to help hardware designers take advantage of this solution by partnering with them early. Designing a sensor fusion system is relatively easy if you’re designing a single device. But Windows runs on many kinds of PCs in many form factors, using hardware components from many different manufacturers. We needed to provide a solution that enabled the entire ecosystem of Windows hardware partners to participate.

The first step was to provide a baseline of performance for sensor packages that would work with Windows’ sensor fusion solution. Using Windows certification guidelines, we provided specifications for sensor performance. To help hardware companies verify that their solutions were compatible with Windows, we built a number of tests, which we provide with the Windows Certification Kit.

Reducing the cost of developing and supporting drivers was another challenge. To make things simpler for sensor hardware manufacturers and PC makers, we wrote a single Microsoft-supplied driver that works with all Windows-compatible sensor packages connected over USB, and even over lower-power buses like I2C. This sensor class driver enables hardware companies to innovate with sensor hardware while ensuring that their hardware can be supported easily with drivers that ship with the Windows operating system.

To help speed adoption of the class driver, Microsoft worked with industry partners to introduce the specification into public standards. In July 2011, the standard for sensors was introduced in the HID (Human Interface Device) specification of the USB-IF (HID spec version 1.12, introduced with [url=http://www.usb.org/developers/hidpage/HUTRR39b.pdf]review request #39[/url]). This standardization enables any sensor company to build a sensor package that is compatible with Windows 8 by following the public standard USB-IF specifications for compliant device firmware. This reduces the time and cost required to integrate sensor hardware with Windows 8 PCs. Other benefits include a lower support cost and more consistent hardware capabilities for Windows 8 PCs that are equipped with sensors.

But beyond standardizing the class driver, we also wanted to optimize the performance of the sensor fusion solution, and minimize its impact on battery life. Each active sensor on a system draws power, and sending data up the stack consumes both memory and CPU time. We minimized the power and performance impact of sensor fusion on Windows 8 in two major ways:

1. We architected the sensor fusion interfaces in Windows 8 to enable much of the processing of sensor fusion data to happen at the hardware level. This hardware-level sensor fusion capability means that computationally expensive algorithms don’t have to run on the main CPU, saving power and CPU cycles.

2. We implemented powerful filtering mechanisms that are tied directly to the needs of the sensor apps running at any given point in time. This pay-for-play data and event model means that sensor data is only sent up the stack at the rate that apps need it, and no faster. This results in greatly reduced CPU utilization for sensor data throughput.
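From an app’s point of view, the pay-for-play model surfaces through the WinRT sensor API described in the next section, for example as a sensor’s reportInterval property. A minimal sketch (the 16 ms target is illustrative):

[code]
// Illustrative only: ask for sensor data no faster than the app can use it;
// here, roughly once per display frame. The Math.max clamp keeps us from
// requesting a faster rate than the hardware supports.
var accelerometer = Windows.Devices.Sensors.Accelerometer.getDefault();
if (accelerometer) {
    var desiredIntervalMs = 16;
    accelerometer.reportInterval =
        Math.max(desiredIntervalMs, accelerometer.minimumReportInterval);
}
[/code]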
[size=5]Sensors and Metro style apps[/size]

To pull all of this together, our final challenge was to make the power and promise of sensor fusion available to those writing Metro style apps. To enable this, we designed a sensor API as part of the new WinRT. Through these APIs, developers can access the power of sensor fusion from any Metro style app. These APIs are clean and simple, and at the same time give developers access to the data needed to support everything from casual games to virtual reality applications. Of course, these capabilities are also available as Win32 APIs for game developers or other uses in desktop applications. The following JavaScript snippet shows one representative way to get access to an accelerometer and subscribe to events using the Windows Runtime:
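[code]
// Get the default accelerometer and subscribe to readingchanged events.
var accelerometer = Windows.Devices.Sensors.Accelerometer.getDefault();
if (accelerometer) {
    accelerometer.addEventListener("readingchanged", function (e) {
        var accelX = e.reading.accelerationX; // acceleration in g, per axis
        var accelY = e.reading.accelerationY;
        var accelZ = e.reading.accelerationZ;
        // React to the new reading here (update game state, UI, etc.).
    });
}
[/code]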

For more information about support for sensors in the Windows Runtime, please see this //build/ session on [url=http://channel9.msdn.com/events/BUILD/BUILD2011/PLAT-781T]using location & sensors in your app[/url].

You may be wondering at this point how you can try out sensor fusion on Windows 8, or even write some apps that use these new capabilities. Developers who attended the //build/ conference in 2011 received the Samsung Windows 8 Developer Preview slate PC, which included a full package of sensors. Only about 4,000 of those were given out, so of course not everyone had the opportunity to get one. The good news is that the same 9-axis sensor fusion system that was built into the Windows Developer Preview device is now available online for purchase from ST Microelectronics. The “ST Microelectronics eMotion Development Board for Windows 8” (model # STEVAL-MKI119V1) attaches via USB, and works with the HID sensor class driver that’s included in Windows 8. If you’ve downloaded the Developer Preview version of Windows 8 and are itching to try out the sensor experience, you should consider getting one of these devices.

[img]http://blogs.msdn.com/cfs-file.ashx/__key/communityserver-blogs-components-weblogfiles/00-00-01-29-43-metablogapi/3718.eMotion_5F00_Board_5F00_500_5F00_5ADC7B18.jpg[/img]

[b]ST Microelectronics eMotion Development Board for Windows 8 [/b]

Now let’s take a look at sensor fusion in action!

[url=http://video.ch9.ms/ch9/bfda/824ab48a-2b96-49b4-85de-9fda0142bfda/SupportSensorsinWin8_low_ch9.mp4]Watch the video: supporting sensors in Windows 8[/url]


Download this video to view it in your favorite media player:


[url=http://video.ch9.ms/ch9/bfda/824ab48a-2b96-49b4-85de-9fda0142bfda/SupportSensorsinWin8_high_ch9.mp4]High quality MP4[/url] | [url=http://video.ch9.ms/ch9/bfda/824ab48a-2b96-49b4-85de-9fda0142bfda/SupportSensorsinWin8_low_ch9.mp4]Lower quality MP4[/url]

-- Gavin

Source: [url=http://blogs.msdn.com/b/b8/archive/2012/01/24/supporting-sensors-in-windows-8.aspx]Windows 8 Blog[/url]
 