Andrew Campbell - mobile sensing (from IT Conversations)

The subtle difference in architecture gives at least some hope that the privacy issues can be dealt with. The active badge assumed computing infrastructure outside of you, monitoring the badges and doing things; with smartphones, the smartphone is mine, I carry it around with me, though I don't control all of the signals that emanate from it. He talks about space-time tracking data held by phone companies to show how pervasively your whereabouts are known, or at least knowable. The fact that these sensors are at least nominally under my control gives us a path to addressing the privacy challenges.

There are open problems. For instance, when you download an application, your smartphone will typically ask whether that application can use the accelerometer, camera, GPS, microphone, etc. To some degree you have the ability to say yes or no, but it is difficult these days to keep track of which applications you have given access to which hardware sensors. Once access to a sensor is granted, that information can be exported off the phone, so as we think about this new area of smartphone sensing we also have to think about ways to maintain the privacy and anonymity of users who choose it.

CenceMe: instead of updating your Facebook page through text, it automated that. The phone could determine whether you are sitting, walking, biking, or whatever, and push that sensing information to your Facebook page. I'd be fine with sharing what I'm doing and where I am with my social network.

Is there truly a way to be protected? The phone is always on because of 911 service, so the cell phone company knows where you are traveling; that is a political or legal issue rather than a technological one. People will know where you are. If it's not you, someone will pick up the Bluetooth signal coming from your phone.
Passive monitoring of your presence will be everywhere. Is there going to be any privacy if I'm running an app on my phone that is doing some form of sensing?

Senseapp: the microphone is the most ubiquitous, powerful sensor in the world; there are microphones in a huge number of cell phones. Campbell used the microphone to make inferences about human behavior (what am I doing?) and context (what is happening around me?). The phone could reason about whether you are in a coffee shop or using an ATM; the microphone can learn to tell that you are in the coffee shop.

With this sort of sensing application, third parties who are around you are also within range of the microphone. So how do you address what's called the "second-hand smoker" problem: how do you protect the privacy of third parties who happen to be nearby while you are running such an application? These are open questions that depend on the sensing modality: the accelerometer, the camera, the gyroscope, and the microphone each present different sorts of privacy issues. They also depend on the application; for instance, if you are running a healthcare application monitoring your EKG, you do not want to leak that information.

The next phase in the evolution is moving from the smartphone to the cognitive phone: the ability of the phone to reason about what you are doing and the environment around you. If you consider a person a sensor collecting information about their life, there will be a huge amount of sensing information associated with each individual. If that information is stored in the cloud somewhere, the ownership of that data becomes a fundamental problem: how do we protect our data, and how do we allow people to use it, but only through our involvement? We have to have control over our data and how it is used. The other challenges are technical and engineering challenges.
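A minimal sketch of the microphone-based context inference described above, under stated assumptions: the feature choices (RMS energy and zero-crossing rate), the hand-picked centroids, and the names `extract_features`/`classify` are all illustrative inventions, not Campbell's actual system, which would learn its model from labelled training audio.

```python
import math

def extract_features(samples):
    """Reduce a frame of PCM audio samples to two simple features:
    RMS energy (loudness) and zero-crossing rate (how often the
    waveform changes sign, a crude proxy for spectral content)."""
    n = len(samples)
    rms = math.sqrt(sum(s * s for s in samples) / n)
    zcr = sum(1 for a, b in zip(samples, samples[1:])
              if (a < 0) != (b < 0)) / (n - 1)
    return (rms, zcr)

# Hand-picked centroids for illustration only; a real system would
# learn these from labelled recordings of each environment.
CENTROIDS = {
    "coffee shop": (0.30, 0.20),   # louder, busy, speech-like audio
    "quiet office": (0.05, 0.05),  # low energy, few sign changes
}

def classify(samples):
    """Nearest-centroid classification in the 2-D feature space."""
    f = extract_features(samples)
    return min(CENTROIDS, key=lambda label: math.dist(f, CENTROIDS[label]))
```

A loud, noisy frame lands near the "coffee shop" centroid and a faint tone near "quiet office"; the privacy point stands regardless of the model, since any such classifier necessarily hears bystanders too.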
Continuous sensing applications access the hardware sensors on the phone and use signal processing techniques to extract information, called features, from the raw data. These features are the input to classification (machine learning) algorithms running on the phone or in the cloud. This pipeline of raw data, feature extraction, classification, and communication with the back-end presents a number of different technical problems.

Post-2008: CenceMe on Nokia phones went from 30 phones (students and faculty on campus) to 10K users distributed around the world, using Amazon infrastructure rather than a server in the lab. As researchers, we're in a different ball game. In 2011, on Android and iOS, VibeIt presented the vibe of the city: it takes audio snippets from smartphones and puts them on a map, giving a sense of what is happening in a coffee shop or club. Campbell implemented it on Amazon; the student who did the back-end coding found a bug in the code that was issuing requests and generating a huge amount of computational processing on Amazon (the bill for December was $10K instead of $500).

What types of sensors are missing from smartphones? Typical smartphones come with ambient light sensors, proximity sensors (how the display turns on/off as you hold the phone to make a call), GPS, an accelerometer (used for movement), a gyroscope (used for orientation), a digital compass (for direction), a microphone, a camera, and radios (useful for determining proximity to other people; if I'm interested in how people interact socially, I can use Bluetooth interactions, for example). In the next phase, body-area sensors will appear. We have already seen this to some degree with recreational sensors such as Nike+: a sensor in the shoe interacts with your phone and lets you capture your runs and interact with the community. Then comes the availability of medical and more specialized sensors: EKG and EEG types of sensors.
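The raw data → feature extraction → classification pipeline described above can be sketched roughly as follows for the accelerometer; the window size, features, thresholds, and labels here are illustrative assumptions, not the CenceMe implementation, which used trained classifiers rather than hand-set cutoffs.

```python
import math

def features(window):
    """Feature extraction stage: reduce a window of raw (ax, ay, az)
    accelerometer readings to the mean and standard deviation of the
    acceleration magnitude."""
    mags = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in window]
    mean = sum(mags) / len(mags)
    std = math.sqrt(sum((m - mean) ** 2 for m in mags) / len(mags))
    return mean, std

def classify_activity(window):
    """Classification stage, as a toy stand-in: thresholds on movement
    variability replace the machine-learning model that would run on
    the phone or in the cloud. Threshold values are made up."""
    _, std = features(window)
    if std < 0.5:
        return "sitting"
    if std < 3.0:
        return "walking"
    return "running"
```

In a real deployment the final stage would follow: the inferred label (or the features themselves) is shipped to a back-end, which is exactly where the data-export and privacy concerns discussed earlier enter the pipeline.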
A wrist band with a GSR (galvanic skin response) sensor can tell you how you are doing emotionally. An earphone that can read your blood pressure will interact with your phone to support healthcare-type applications. It is difficult to say which sensors will be included. Campbell is excited about the current set of sensors, not the next set.