'Tis the Season to be Merry and Mobile
Come to Our Virtual Office Hours
Starting this week, we're going to be holding regular IRC office hours for Android app developers in the #android-dev channel on irc.freenode.net. Members of the Android team will be on hand to answer your technical questions. (Note that we will not be able to provide customer support for the phones themselves.)
We've arranged our office hours to accommodate as many different schedules as possible, for folks around the world. We will initially hold two sessions each week:
- 12/15/09, Tuesday, 9 a.m. to 10 a.m. PST
- 12/17/09, Thursday, 5 p.m. to 6 p.m. PST
- 12/22/09, Tuesday, 9 a.m. to 10 a.m. PST
- 01/06/10, Wednesday, 9 a.m. to 10 a.m. PST
- 01/07/10, Thursday, 5 p.m. to 6 p.m. PST
Check Wikipedia for a helpful list of IRC clients. Alternatively, you can use a web interface such as the one at freenode.net. We will try to answer as many questions as we can get through in each hour.
We hope to see you there!
Optimize your layouts
Writing user interface layouts for Android applications is easy, but it can sometimes be difficult to optimize them. Most often, heavy modifications made to existing XML layouts, like shuffling views around or changing the type of a container, lead to inefficiencies that go unnoticed. The layoutopt tool, found in the tools directory of the SDK, can help you spot some of these problems; point it at a layout file or a directory of layouts to get a list of suggested fixes:
$ layoutopt samples/
samples/compound.xml
7:23 The root-level <FrameLayout/> can be replaced with <merge/>
11:21 This LinearLayout layout or its FrameLayout parent is useless
samples/simple.xml
7:7 The root-level <FrameLayout/> can be replaced with <merge/>
samples/too_deep.xml
-1:-1 This layout has too many nested layouts: 13 levels, it should have <= 10!
20:81 This LinearLayout layout or its LinearLayout parent is useless
24:79 This LinearLayout layout or its LinearLayout parent is useless
28:77 This LinearLayout layout or its LinearLayout parent is useless
32:75 This LinearLayout layout or its LinearLayout parent is useless
36:73 This LinearLayout layout or its LinearLayout parent is useless
40:71 This LinearLayout layout or its LinearLayout parent is useless
44:69 This LinearLayout layout or its LinearLayout parent is useless
48:67 This LinearLayout layout or its LinearLayout parent is useless
52:65 This LinearLayout layout or its LinearLayout parent is useless
56:63 This LinearLayout layout or its LinearLayout parent is useless
samples/too_many.xml
7:413 The root-level <FrameLayout/> can be replaced with <merge/>
-1:-1 This layout has too many views: 81 views, it should have <= 80!
samples/useless.xml
7:19 The root-level <FrameLayout/> can be replaced with <merge/>
11:17 This LinearLayout layout or its FrameLayout parent is useless
Note for Windows users: if the layoutopt script does not work, open layoutopt.bat in the tools directory of the SDK and, on the last line, replace %jarpath% with -jar %jarpath%.
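To illustrate the first class of warnings above, a layout whose root <FrameLayout/> adds nothing can attach its children directly to its parent with <merge/>. This is a made-up example, not one of the samples/ files scanned above, and the drawable and string resources are invented; <merge/> only works when the file is inflated into a parent that can host the children directly, for instance via an <include/> tag:

```xml
<!-- Hypothetical layout: instead of wrapping the children in a redundant
     root FrameLayout, <merge/> adds them straight to whatever parent this
     file is inflated into, saving one level of the view hierarchy. -->
<merge xmlns:android="http://schemas.android.com/apk/res/android">
    <ImageView
        android:layout_width="fill_parent"
        android:layout_height="fill_parent"
        android:scaleType="center"
        android:src="@drawable/golden_gate" />
    <TextView
        android:layout_width="wrap_content"
        android:layout_height="wrap_content"
        android:layout_gravity="center_horizontal|bottom"
        android:text="@string/photo_caption" />
</merge>
```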
New version of Google Mobile App for iPhone in the App Store
In this version, we have a redesigned search results display that shows more results at once and, more importantly, opens web pages from the results within the app. This will get you to what you need faster, which is always our goal at Google.

For those less utilitarian and more flamboyant, we've exposed our visual tweaks settings, called "Bells and Whistles" -- some of our users had already discovered these in previous versions. You can style your Google Mobile App in any shade: red, taupe, or even heliotrope. If you're on a faster iPhone, like the iPhone 3GS, you may want to try the live waveform setting, which turns on, as the name suggests, a moving waveform when you search by voice.
On the subject of searching by voice, you can now choose your spoken language or accent. For example, if you're Australian but live in London, you can improve the recognition accuracy by selecting Australian in the Voice Search settings. And now both Mandarin and Japanese are supported languages as well.
If you don't have Google Mobile App yet, download it from the App Store or read more about it. If you have any suggestions or comments, feel free to join in on our support forums or suggest ideas in our Mobile Products Ideas page. You can also follow us on Twitter @googlemobileapp.
Posted by Alastair Tse, Software Engineer
Mobile Search for a New Era: Voice, Location and Sight
A New Era of Computing
Mobile devices straddle the intersection of three significant industry trends: computing (or Moore's Law), connectivity, and the cloud. Simply put:
- Phones get more powerful and less expensive all the time;
- They're connected to the Internet more often, from more places; and
- They tap into computational power that's available in datacenters around the world.
Just think: with a sensor-rich phone that's connected to the cloud, users can now search by voice (using the microphone), by location (using GPS and the compass), and by sight (using the camera). And we're excited to share Google's early contributions to this new era of computing.
Search by Voice
We first launched search by voice about a year ago, enabling millions of users to speak to Google. And we're constantly reminded that the combination of a powerful device, an Internet connection, and datacenters in the cloud is what makes it work. After all:
- We first stream sound files to Google's datacenters in real time;
- We then convert utterances into phonemes, into words, and into phrases;
- We then compare phrases against Google's billions of daily queries to assign probability scores to all possible transcriptions; and
- We do all of this in the time it takes to speak a few words.
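To make the flavor of that last scoring step concrete, here is a deliberately tiny, self-contained sketch in Java -- not Google's actual system -- where a hard-coded query-frequency table stands in for the billions of daily queries used to rank acoustically plausible transcriptions:

```java
import java.util.List;
import java.util.Map;

// Toy illustration of ranking candidate transcriptions of one utterance
// by how often each phrase appears in a query log, mimicking the idea of
// using query statistics as a language model. All data here is invented.
public class TranscriptionRanker {
    // Hypothetical query-frequency table standing in for real query logs.
    static final Map<String, Integer> QUERY_COUNTS = Map.of(
        "pizza near me", 900,
        "piazza near me", 40,
        "pisa near me", 15
    );

    // Return the candidate phrase with the highest query count;
    // phrases never seen in the log count as zero.
    public static String best(List<String> candidates) {
        String best = candidates.get(0);
        int bestCount = QUERY_COUNTS.getOrDefault(best, 0);
        for (String c : candidates) {
            int count = QUERY_COUNTS.getOrDefault(c, 0);
            if (count > bestCount) {
                best = c;
                bestCount = count;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // The recognizer produced three acoustically similar candidates.
        List<String> candidates =
            List.of("piazza near me", "pizza near me", "pisa near me");
        System.out.println(best(candidates)); // prints "pizza near me"
    }
}
```

In the real system the "counts" would be probabilities combined with acoustic scores, but the shape of the decision is the same: the language statistics break ties between things that sound alike.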

Looking ahead, we dream of combining voice recognition with our language translation infrastructure to provide in-conversation translation [video] -- a UN interpreter for everyone! And we're just getting started.
Search by Location
Your phone's location is usually your location: it's in your pocket, in your purse, or on your nightstand, and as a result it's more personal than any PC before it. This intimacy is what makes location-based services possible, and for its part, Google continues to invest in things like My Location, real-time traffic, and turn-by-turn navigation. Today we're tackling a question that's simple to ask, but surprisingly difficult to answer: "What's around here, anyway?"
Suppose you're early to pick up your child from school, or your drive to dinner was quicker than expected, or you've just checked into a new hotel. Chances are you've got time to kill, but you don't want to spend it entering addresses, sifting through POI categories, or even typing a search. Instead you just want stuff nearby, whatever that might be. Your location is your query, and we hear you loud and clear.

Of course our future plans include more than just nearby places. In the new year we'll begin showing local product inventory in search results [video]; and Google Suggest will even include location-specific search terms [video]. All thanks to powerful, Internet-enabled mobile devices.
Search by Sight
When you connect your phone's camera to datacenters in the cloud, it becomes an eye to see and search with. It sees the world like you do, but it simultaneously taps the world's info in ways that you can't. And this makes it a perfect answering machine for your visual questions.
Perhaps you're vacationing in a foreign country, and you want to learn more about the monument in your field of view. Maybe you're visiting a modern art museum, and you want to know who painted the work in front of you. Or maybe you want wine tasting notes for the Cabernet sitting on the dinner table. In every example, the query you care about isn't a text string, or a location -- it's whatever you're looking at. And today we're announcing a Labs product for Android 1.6+ devices that lets users search by sight: Google Goggles.
In a nutshell, Goggles lets users search for objects using images rather than words. Simply take a picture with your phone's camera, and if we recognize the item, Goggles returns relevant search results. Right now Goggles identifies landmarks, works of art, and products (among other things), and in all cases its ability to "see further" is rooted in powerful computing, pervasive connectivity, and the cloud:
- We first send the user's image to Google's datacenters;
- We then create signatures of objects in the image using computer vision algorithms;
- We then compare signatures against all other known items in our image recognition databases;
- We then figure out how many matches exist;
- We then return one or more search results, based on available metadata and ranking signals; and
- We do all of this in just a few seconds.
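Again purely as a toy illustration -- not the real Goggles pipeline -- the signature-comparison step can be sketched as nearest-neighbor search over compact image signatures, here invented 64-bit values compared by Hamming distance:

```java
import java.util.Map;
import java.util.Optional;

// Toy sketch of image matching: each known item is represented by a
// made-up 64-bit "signature", and a query signature is matched to the
// closest database entry by Hamming distance, with a cutoff so that
// unrecognized images return no result. All signatures are invented.
public class SignatureMatcher {
    static final Map<String, Long> DATABASE = Map.of(
        "Eiffel Tower",   0xF0F0F0F0F0F0F0F0L,
        "Mona Lisa",      0x123456789ABCDEF0L,
        "Cabernet label", 0x00FF00FF00FF00FFL
    );

    // Accept a match only if at most this many bits differ.
    static final int MAX_DISTANCE = 8;

    public static Optional<String> match(long querySignature) {
        String bestLabel = null;
        int bestDistance = Integer.MAX_VALUE;
        for (Map.Entry<String, Long> e : DATABASE.entrySet()) {
            // Hamming distance: count of differing bits between signatures.
            int d = Long.bitCount(querySignature ^ e.getValue());
            if (d < bestDistance) {
                bestDistance = d;
                bestLabel = e.getKey();
            }
        }
        return bestDistance <= MAX_DISTANCE
            ? Optional.of(bestLabel) : Optional.empty();
    }

    public static void main(String[] args) {
        // A query signature differing from the Eiffel Tower entry by 2 bits.
        long query = 0xF0F0F0F0F0F0F0F3L;
        System.out.println(match(query).orElse("no match")); // prints "Eiffel Tower"
    }
}
```

Real signatures are far richer than a single long, but the principle holds: "seeing" becomes comparing the query's signature against everything the database already knows.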
Computer vision, like all of Google's extra-sensory efforts, is still in its infancy. Today Goggles recognizes certain images in certain categories, but our goal is to return high quality results for any image. Today you frame and snap a photo to get results, but one day visual search will be as natural as pointing a finger -- like a mouse for the real world. Either way we've got plenty of work to do, so please download Goggles from Android Market and help us get started.
The Beginning of the Beginning
Posted by Vic Gundotra, Vice President of Engineering
Android SDK Updates
Today we are releasing updates to multiple components of the Android SDK:
- Android 2.0.1, revision 1
- Android 1.6, revision 2
- SDK Tools, revision 4
Android 2.0.1 is a minor update to Android 2.0. This update includes several bug fixes and behavior changes, such as application resource selection based on API level and changes to the value of some Bluetooth-related constants. For more detailed information, please see the Android 2.0.1 release notes.
To differentiate its behavior from Android 2.0, the API level of Android 2.0.1 is 6. All Android 2.0 devices will be updated to 2.0.1 before the end of the year, so developers will no longer need to support Android 2.0 at that time. Of course, developers of applications affected by the behavior changes should start compiling and testing their apps immediately.
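As a hypothetical example of the resource-selection change, resources can be qualified by API level, so a project could provide Android 2.0.1-specific values alongside defaults (the file and style names here are invented, not taken from the release notes):

```xml
<!-- res/values-v6/styles.xml: selected on API level 6 (Android 2.0.1) and up;
     devices on lower API levels fall back to res/values/styles.xml. -->
<resources>
    <style name="AppTheme" parent="android:Theme.Light" />
</resources>
```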
We are also providing an update to the Android 1.6 SDK component. Revision 2 includes fixes to the compatibility mode for applications that don't support multiple screen sizes, as well as SDK fixes. Please see the Android 1.6, revision 2 release notes for the full list of changes.
Finally, we are also releasing an update to the SDK Tools, now in revision 4. This is a minor update with mostly bug fixes in the SDK Manager. A new version of the Eclipse plug-in that embeds those fixes is also available. For complete details, please see the SDK Tools, revision 4 and ADT 0.9.5 release notes.
One more thing: you can now follow us on Twitter @AndroidDev.
Keep your starred items in sync with Google Maps
Google Maps for mobile has long allowed you to add stars on a map to mark your favorite places. You may have noticed a few months ago that Google Maps for desktop browsers introduced the ability to star places as well. Unfortunately, there was no way to keep these starred places in sync with Google Maps on your phone. With today's release of Google Maps for mobile 3.3 on Windows Mobile and Symbian phones, you'll now be able to keep the starred places on your phone and on your computer completely synchronized. It's like magic, but magic that you can use. Let me show you how:
My colleague Andy is at his desk right now, and he wants to check out some comedy in London tonight. Google Maps lists the 4th result as Upstairs at the Ritzy -- it sounds like a great spot: cheap, fun and comfortable. With one click, Andy stars the item and he's done. When he walks out of the office and turns on Google Maps on his Nokia phone, Upstairs at the Ritzy will be the top place in his list of Starred Items, and it will show up as a star on his map. From there he can call the theater, get walking directions, or even SMS the address to a friend.


Starring on Google Maps for desktop computers and Google Maps for mobile
Starring places also works great when you're out on the town and you find cool spots using your phone. I was in Paris with my wife recently. We visited the obvious tourist spots like la tour Eiffel and le Musée du Louvre, but we also found a few interesting places we hadn't expected. While wandering the streets of Paris, we stumbled upon a cafe...the sort of place you'll remember forever, but immediately forget the name. I started Google Maps on my Nokia phone, searched for the name of the cafe (Les Philosophes) and starred it, knowing that when I come back to Google Maps on my computer at home, it will be starred, right there, on my map. How cool is it to create a trail of interesting places from your phone?!
For users upgrading from an older version of Google Maps for mobile, you'll be asked, when you log in, whether you'd like to synchronize your existing starred items with your Google Account. This means you can preserve all the work you've put into customizing your map on your mobile, and have it show up, conveniently, in Google Maps in your desktop browser.
To enjoy the benefits of all this mobile synchronization goodness, download Google Maps for mobile for your Symbian or Windows Mobile phone by visiting m.google.com/maps in your mobile browser. And don't worry, we're busy building this same functionality into our other mobile versions of Google Maps -- so sit tight.
Posted by Flavio Lerda and Andy McEwan, Software Engineers