I guess driving on 101 was still better than flying all the way from Japan or India or Russia.
The second day was much more eventful than the first.
I started my second day with 'Inside Android User Interface' by Karim Yaghmour. He explained how the Android display stack differs from the standard Linux display stack, how WindowManager is used to add views, and also touched upon StatusBarManagerService. Did you know you can send commands to any system service from an adb shell? I thought that was pretty neat, especially for debugging.
For example, if you want to expand the status bar, you can do this from an adb shell:
$ service call statusbar 1
To close the notification tray expanded by the previous call, send this command:
$ service call statusbar 2
The general form of the command is service call <service-name> <transaction-code>; you can see the available service names with service list.
The highlight of the second day was a keynote from Romain Guy and Chet Haase. They talked about what's new in 4.1 and 4.2; nothing much was different from their Google I/O session apart from the recent 4.2 enhancements. Nevertheless, it's always fun to listen to these Android ninjas.
Then it was time to meet people at the exhibition hall, umm ... actually it was time to collect swag and t-shirts. Many of the companies were focused on either testing or monetizing apps. The only interesting spots were the Google booth for Android robots, the Amazon booth because it was Amazon, the Qualcomm booth with an MDP and Snapdragon demo, and the Immersion booth, because that was something new for me: using haptics with Android hardware can make apps more accessible. I also got a chance to ask Romain Guy a few questions, and the discussion was quite insightful.
This was an interesting slide from Qualcomm's lightning talk.
On Thursday, the lunch was special too. Robin Jeffries, a Google engineer and a member of the Board of Advisors for the Anita Borg Foundation, hosted a Women in Android lunch. It was nice to see more geek chicks in the room, and more importantly, Robin's talk was quite inspiring.
After lunch, I attended 'Optimizing Android UI - Tips and Tricks' by Jason Ostrander, the author of the Android UI Fundamentals book. He quoted many examples from his book, and overall it was a good session. He touched upon Handlers and Services to offload work from the main thread, Loaders for grabbing data from network sources, the ViewHolder pattern for smooth-scrolling lists, and much more ...
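To make the ViewHolder point concrete, here is a minimal sketch of my own (not code from the session): cache the findViewById() lookups on the recycled row view so that getView() stays cheap while the list scrolls.

import android.content.Context;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.ArrayAdapter;
import android.widget.TextView;
import java.util.List;

// Classic ListView-era ViewHolder pattern: child views are looked up once per
// recycled row and reused on every subsequent bind.
public class CountryAdapter extends ArrayAdapter<String> {

    private static class ViewHolder {
        TextView name; // cached reference, filled in only when the row is first inflated
    }

    public CountryAdapter(Context context, List<String> items) {
        super(context, android.R.layout.simple_list_item_1, items);
    }

    @Override
    public View getView(int position, View convertView, ViewGroup parent) {
        ViewHolder holder;
        if (convertView == null) {
            convertView = LayoutInflater.from(getContext())
                    .inflate(android.R.layout.simple_list_item_1, parent, false);
            holder = new ViewHolder();
            holder.name = (TextView) convertView.findViewById(android.R.id.text1);
            convertView.setTag(holder); // stash the holder on the recycled view
        } else {
            holder = (ViewHolder) convertView.getTag(); // reuse the cached lookup
        }
        holder.name.setText(getItem(position));
        return convertView;
    }
}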
The session was followed by a coffee and ice-cream break and a chance to collect more goodies from the exhibition hall. Then it was time for yet another keynote, by Mike Shaver, Director of Mobile Engineering at Facebook. The theme of his talk was 'scale', and it was by far the most entertaining talk. He went over the challenges posed by Android fragmentation: more than 900 devices, with many more still shipping with Gingerbread ...
One more day, four more sessions, and a hope to win a Nexus 7.
I was really looking forward to attending 'Extending the Android Vibrate Function for Games', especially after the brief discussion I had with Bob at the exhibition hall.
Android supports an API to specify the duration of a vibration, but there is no API to control the frequency, voltage, or any other parameters. Immersion focuses on haptics technologies, and to promote their adoption they have developed a library of over 120 vibration effects. What was more interesting to me was the possibility of making apps more accessible using these effects. In their presentation, they proposed the idea of a Haptics Clock, which anyone could use to check the time without even looking at the screen.
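For reference, this is roughly all the stock API lets you do (a minimal sketch of my own, not code from the talk); duration, or an on/off pattern, is the only knob, and it needs the VIBRATE permission in the manifest.

import android.app.Activity;
import android.content.Context;
import android.os.Bundle;
import android.os.Vibrator;

// The built-in Vibrator API: you can say how long to buzz, nothing else.
public class BuzzActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        Vibrator vibrator = (Vibrator) getSystemService(Context.VIBRATOR_SERVICE);
        vibrator.vibrate(300); // buzz for 300 ms; no frequency or strength control
    }
}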
I have this idea, though I am not sure if it's feasible. Assume a message is rendered in Braille and someone does not want it read aloud. They could touch the screen, every dot would generate a distinct vibration pattern, and it would be possible to read any arbitrary message.
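Just to make the idea concrete, here is a purely hypothetical sketch; the dot-to-pattern mapping is something I made up for illustration, not anything Immersion ships.

import android.os.Vibrator;

// Hypothetical helper for the Braille idea above: each dot position gets its
// own on/off rhythm, played when the finger lands on that dot.
public class BrailleBuzzer {

    // One pattern per dot position 1..6: {delay, on, off, on, ...} in milliseconds.
    private static final long[][] DOT_PATTERNS = {
            {0, 80},             // dot 1: one short pulse
            {0, 80, 60, 80},     // dot 2: two short pulses
            {0, 200},            // dot 3: one long pulse
            {0, 200, 60, 80},    // dot 4: long then short
            {0, 80, 60, 200},    // dot 5: short then long
            {0, 200, 60, 200},   // dot 6: two long pulses
    };

    private final Vibrator vibrator;

    public BrailleBuzzer(Vibrator vibrator) {
        this.vibrator = vibrator;
    }

    // Called when a touch lands on a dot rendered on screen.
    public void onDotTouched(int dotIndex) {
        if (dotIndex >= 1 && dotIndex <= DOT_PATTERNS.length) {
            vibrator.vibrate(DOT_PATTERNS[dotIndex - 1], -1); // -1 = play once, don't repeat
        }
    }
}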
Anyway, the library is free for developers and really easy to use. Immersion works with OEMs to include more precise motors and to implement the low-level APIs that control them. Apparently, OEMs are interested in this technology because it lets them differentiate their devices with an enhanced gaming experience ... The Samsung Galaxy S3 and the NOOK have such motors and support the library natively.
Later they announced the winners of various raffles. Oh btw someone gave away an iPad!
The next session was a long one: 'Battle Tested Patterns in Android Concurrency'. The speaker had a lot of slides and a lot of information to share.
In a nutshell, here are the main points (a small sketch of a couple of them follows the list):
- get off the main UI thread
- don't leak activity references
- don't just spin up raw background threads; they keep running even after the Activity is destroyed if not handled carefully
- the same applies to AsyncTask; it is not tied to the Activity lifecycle
- Loaders are aware of the Activity lifecycle and post data to the running instance; they are managed by LoaderManager and are much better than AsyncTask
- one caveat: Loaders cannot throw exceptions for an activity to catch
- keep the thread count at (# cores - 1) to leave some CPU for the UI
- if you have to, use old-school CountDownLatches to achieve synchronization
- use ExecutorService, which manages its own thread pool (plain old Java)
- LocalBroadcasts can be used for local data exchange between loaders and activities; they are super cheap and work like a callback registry
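Tying the ExecutorService and LocalBroadcast points together, here is a small sketch of my own (the class and action names are made up, not the speaker's code): a fixed pool sized to cores - 1 does the background work and hands results back through LocalBroadcastManager, so no worker ever holds a reference to a possibly destroyed activity.

import android.content.Context;
import android.content.Intent;
import android.support.v4.content.LocalBroadcastManager;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Background work scheduler: fixed thread pool plus local broadcasts as a
// cheap, lifecycle-safe way to deliver results.
public class DownloadScheduler {

    public static final String ACTION_DONE = "com.example.DOWNLOAD_DONE"; // hypothetical action name

    // Leave one core free for the UI; never drop below a single worker.
    private final ExecutorService pool = Executors.newFixedThreadPool(
            Math.max(1, Runtime.getRuntime().availableProcessors() - 1));

    public void schedule(final Context appContext, final String url) {
        pool.execute(new Runnable() {
            @Override
            public void run() {
                String result = fetch(url); // long-running work, safely off the main thread

                // Any activity that registered a receiver for ACTION_DONE gets the
                // result; nothing here holds on to a destroyed activity.
                Intent done = new Intent(ACTION_DONE).putExtra("result", result);
                LocalBroadcastManager.getInstance(appContext).sendBroadcast(done);
            }
        });
    }

    private String fetch(String url) {
        // Placeholder for real network code.
        return "payload from " + url;
    }
}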
Google gave away a few (only 4 or 5) plush Android toys; I was happy to grab one :)
Overall it was a great experience!