The iPhone, like many high-end smartphones these days, comes with a number of sensors: camera, accelerometer, GPS module, and digital compass. We're entering a period of change: more and more users expect these sensors to be integrated into the "application experience." If your application can make use of them, it probably should.
He is the author of a number of books, and from time to time he also stands in front of cameras. You can often find him at conferences talking about interesting things, or deploying sensors to measure them. He recently rolled out a mesh network of five hundred sensor motes covering the whole of Moscone West during Google I/O. He's still recovering.
He sporadically writes blog posts about things that interest him, or more frequently provides commentary in 140 characters or fewer. He is a contributing editor for MAKE magazine and a contributor to the O'Reilly Radar.
A few years ago he caused a privacy scandal by uncovering that the iPhone was recording its owner's location all the time. This led to several class action lawsuits and a U.S. Senate hearing. Several years on, he still isn't sure what to think about that.
Alasdair is a former academic. As part of his work he built a distributed peer-to-peer network of telescopes which, acting autonomously, reactively scheduled observations of time-critical events. Notable successes included contributing to the detection of what was—at the time—the most distant object yet discovered, a gamma-ray burster at a redshift of 8.2.