Accelerometers and InvenSense

We have been integrating accelerometers into a lot of the systems we build. We typically take a module, integrate it into the OS we are using, and ship it out. About two weeks ago I (along with Ajit M. B.) got to meet InvenSense, a company that more or less created the accelerometer market. This post is a data dump of that meeting.

InvenSense has been a pioneer in the marketplace, introducing a 2-axis gyro in 2003; recently they introduced a 9-axis device. We went to them because we were using a head tracker from Hillcrest in the Kopin Golden-i, and we were looking for something smaller that we could put down on the board. We found out that Hillcrest was using their sensor.

Today, if you use an accelerometer, it is likely from InvenSense (the Samsung Galaxy S II, for example). Not only do they design the MEMS devices, they also develop the device drivers, so we can integrate the parts into our products. Typically the sensor requires a dedicated microcontroller to process the data coming out of it. The Hillcrest device, we found out, has an MSP430 that does all the processing of the sensor input, which it converts into a mouse-type input over the SPI bus interface.
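To give a flavor of the kind of work that dedicated microcontroller does, here is a minimal sketch of the first step: converting a raw 16-bit sample into engineering units. The sign convention and sensitivity figure here are assumptions for a generic ±2 g, 16-bit MEMS accelerometer, not taken from any specific InvenSense or Hillcrest datasheet.

```python
def raw_to_g(raw, full_scale_g=2.0):
    """Convert a raw 16-bit two's-complement accelerometer sample to g.

    Assumes a hypothetical generic 16-bit MEMS part at the given
    full-scale range, not a specific vendor register map.
    """
    if raw >= 0x8000:                    # sign-extend the 16-bit value
        raw -= 0x10000
    sensitivity = 32768 / full_scale_g   # LSB per g at this range
    return raw / sensitivity

# Half of positive full scale corresponds to 1 g at the +/-2 g setting:
print(raw_to_g(0x4000))   # -> 1.0
```

The real firmware then runs this kind of scaled data through calibration and fusion before anything like a mouse event comes out the SPI port.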

InvenSense is focused on the sensor market; there are other means of detecting movement, like camera input. Windows 8 requires that every device have a 9-axis sensor in it (on a separate note, I met with the sensor team at TI, which is having a hard time providing drivers for its sensors).

Their target markets are wearable sensors, gesture detection, pointing, and tracking. Their secret sauce is an application-specific library that they license to companies integrating their sensors; it includes auto-calibration and sync. We also talked a bit about the math behind their sensors and the orientation matrix, which is the heart of their system, and about the life of a sensor (6 years or so) and the fact that it can survive a 10,000 g shock for 0.2 ms.
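To make the orientation-matrix idea concrete, here is a toy sketch: integrate a constant gyro rate into a 3×3 rotation matrix by composing small per-sample rotations. The axis, rate, and sample period are made-up numbers, and real fusion code also folds in accelerometer and magnetometer data to correct drift, which this ignores.

```python
import math

def rot_z(theta):
    """3x3 rotation matrix for an angle theta (radians) about the z axis."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0.0],
            [s,  c, 0.0],
            [0.0, 0.0, 1.0]]

def matmul(a, b):
    """Multiply two 3x3 matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Integrate a constant yaw rate of 90 deg/s for 1 s at 100 Hz.
R = [[1.0, 0.0, 0.0],
     [0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0]]           # identity = starting orientation
rate = math.radians(90.0)       # rad/s, hypothetical gyro reading
dt = 0.01                       # 100 Hz sample period
for _ in range(100):
    R = matmul(R, rot_z(rate * dt))

# After 1 s the matrix represents a 90-degree yaw, i.e. roughly rot_z(pi/2).
```

The library's job is essentially a far more careful version of this loop, which is why its real-time behavior matters so much.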

The typical size of their sensors is 4 mm by 4 mm by 1 mm, drawing about 6 mA of current, available in QFN and LGA packages. We discussed their library requirements a bit. They have libraries available for the MSP430, the STM32 (Cortex-M3), the AVR, and other Atmel parts.

The typical library size is 22K on a microcontroller, which is quite a lot. They mentioned that their drivers are integrated into Android. They recommended an independent microcontroller to convert the raw sensor input into coherent input, and the importance of the real-time nature of that processing became extremely apparent when we saw their demo.

The demo is a tablet that displays the inside of a large cathedral. This is an augmented/virtual-reality application that gives you the feeling of a complete view of the inside of a room. All of us took turns trying to fool the sensor. Think of it as a tablet that you are holding in front of you: what you see is one part of the cathedral. Move it to the left, to the right, above your head, or below you, and you see the corresponding part of the room. Bring the tablet back in front of you and you are back to the view you started with. It is one of the best spatial-recognition demos I have seen. Technically, that means I can walk around the room with a sensor on and it will know where I am and the direction I am facing. There are some very interesting possibilities with that…the first that comes to my mind is a camera that follows where I am in the room. Nice!

Companies that build their sensors into their devices include AKM, Yamaha and Alps.

The intention of writing this up is to start a chain of additional information about the work we are doing with sensors. Please add a comment to this post if you have worked with a sensor: the project, the part that was used, and anything specific that you needed to do to get it to work.
