Google I/O 2014 — Keynote

Ladies and gentlemen, please welcome Senior Vice President, Android, Chrome and Apps, Sundar Pichai.

[Sundar Pichai — Senior Vice President for Android, Chrome and Apps]

Thank you, everyone. It's great to see all of you. Welcome to Google I/O. Every year, we look forward to this date. We've been hard at work since last I/O evolving our platforms so that developers like you can build amazing experiences. So thank you for joining us in person.

I/O is a pretty global event. We have viewing parties in over 597 locations in 85 countries, on six continents. And there are over one million people watching this on the live stream today. Let's say hello to a few locations.

London. Hello, London.

Let's say hello to Brazil. Everyone is talking about Brazil today. So if it weren't for I/O, I would be there for the World Cup. I'm tempted to shout "goal!"

Finally, let's go to Nigeria, where we are thrilled to have an all-female developer group. We are working hard to elevate women in computer science, so we look forward to seeing what they develop one day. In fact, at I/O this year, we are very excited that there is over 20% female participation, which is up from 8% last year.

And we are even more excited to be joined by over 1,000 women in this room today. So thank you.

Of course, I/O is when we talk about our two large computing platforms, open platforms, Android and Chrome, which are built from the ground up for developers like you. Today, we are going to give you an update on the momentum we are seeing in mobile. We are living in amazing times. So we want to talk about the mobile momentum we see and how we are evolving our platforms to support that momentum. And more importantly, we are beginning to think and evolve our platforms beyond mobile. You will hear about that from us today.

And, finally, we want to talk to you as developers as to how you can achieve success on top of our platforms, including an update on Google Cloud Platform and Google Play. So let's get started.

If you look at global smartphone shipments, the numbers are stunning. The industry shipped over 300 million phones last quarter, so they're on track to ship well over a billion phones each year.

So how is Android doing in the face of this momentum? So in the past, we have talked about cumulative activations of Android. We are switching and focusing on 30-day-active users, users who are currently using their Android devices globally. And you can see the number has been doubling every year. We have gone from 220 million to over 530 million as of last year's I/O. We are very excited that, as of this year's I/O, we are at over one billion 30-day-active users.

The robot is pretty happy as well. So let's internalise what one billion users actually means. Android users send over 20 billion text messages each and every day. More importantly, perhaps, they take around 93 million selfies every day. The team tells me about 31 million of these are duck faces. We estimate Android users take around 1.5 trillion steps per day, and they pull out their phones and check them over 100 billion times each day. These are important use cases which we are working on addressing, and you'll hear about them later today.

Developers are building profound experiences on top of smartphones. Stories we hear every day. A few examples. In Kenya, 40% of the GDP flows through M-PESA, giving unbanked people access to financial transactions throughout the country. NETRA-G is a company that uses a smartphone and just off-the-shelf accessories to measure your eye prescription, and they are as accurate as the $50,000 equipment you find in optometrists' offices, providing very, very affordable care to many people.

And finally, at the University of Michigan, they monitor subtle changes in their patients' voice quality using a smartphone to detect early signs of bipolar disorder. So the kind of experiences we are seeing on top of these phones are amazing.

So far, I've been talking about phones. Let's shift to tablets.

We are seeing amazing growth in Android Tablets as well. So there is tremendous adoption of these devices. And if you look at how we are doing vis-à-vis the overall market, Android Tablets accounted for 39% of all shipments two years ago. That number increased to 46% as of last year's I/O.

As of this year's I/O, Android Tablets account for 62% of the overall market. We don't include other variants of Android like Kindle. If you add that, it would go up a few percentage points. You know, these are shipment numbers. Again, we care about usage, so we view these as leading indicators of where usage would be.

If you take a look at tablet usage, we're going to use YouTube as a way — as a proxy to understand usage. A year ago, the total tablet viewership of YouTube, 28% was from Android. That number has gone up, again, to 42%. So we are seeing usage track shipments, and we are very excited people are adopting these devices as well.

Another metric of engagement is app installs. App installs just this year alone on tablets are up by over 200%. So people are really engaging with these devices.

So we are very excited we have a billion users. But we talked about this at last year's I/O: our goal is to reach the next five billion people in the world. If you look at a map of the world today, in all the regions in blue, the emerging markets, the majority of users don't have a smartphone.

When I go back home to India and other countries like that — thank you — it is exciting to see the impact phones have on people's lives. But it's disappointing that less than 10% of the population has access to smartphones. We want to change that. So we've been working hard with our ecosystem on a very important initiative which we call Android One.

So let me talk to you about Android One.

What we are doing for the first time, if you look at all the OEMs in these countries, each of them has to reinvent the wheel, and in the fast-paced mobile industry, they have to build a new smartphone within nine months. So we want to pool resources and help everyone. So we are working on a set of hardware reference platforms. We identify the components which go into a next-generation smartphone. These are high-quality, affordable smartphones. We qualify vendors so that we provide a turnkey solution for OEMs to more easily build a smartphone.

In addition to hardware, we are working on software as well. So the software on Android One is the same software you see running on Stock Android, Nexus phones and Google Play edition phones. In addition, through Play, we allow OEMs and carriers to add locally-relevant applications on the device which users have full control over.

And finally, we provide full automatic updates. All the software in Android One comes from Google, so we will keep them updated, just like we do with Nexus and Google Play edition phones.

Let's take a look at one example device which we are working on.

So this is a device with Micromax. You can see it has a 4.5-inch screen. It has features which matter in a country like India: dual SIM, removable SD cards and FM radio. I'm used to cutting-edge phones, and I've been using this phone for a while. And it is really good. And it costs less than $100.

We are working with many partners, and we are going to be launching this around the world. But we start this journey in India. And we are launching this with three OEMs in India in the fall of this year: Micromax, Karbonn, and Spice. We are also working with carriers in these markets to provide affordable connectivity packages with these devices.

What we are excited about is that this is a leveraged, turnkey solution, so that at scale we can bring high-quality, affordable smartphones to market and get the next billion people access to these devices. And we can't wait to see the impact that it would have.

So we have talked about the momentum in mobile. The next thing we want to talk to you about is how we are evolving our platforms, Android and Chrome. And today, for the first time since we launched Android with an open SDK, we are going to give you a preview of the upcoming L release. You will be able to download this later on your development devices. We've been working very hard. This is one of the most comprehensive releases we have done. It has over 5,000 new APIs, and we are thinking about the L release not just for mobile, but for form factors beyond mobile.

One of the things, as we thought about L, we wanted to take a radical new approach to design. User experiences are evolving rapidly, and we wanted to rethink the user design experience in Android to have a fresh, bold, and new look.

To talk about the design for L, let me invite Matias Duarte.

[Matias Duarte — Vice President of Design]

Thank you, Sundar. Design is essential in today's world. It defines your experiences and your emotions. So we challenged ourselves to create a design that was not just for Android phones and tablets. We worked together — Android, Chrome, and across all of Google — to craft one consistent vision for mobile, desktop, and beyond. We wanted a design that was clear and simple and that people would intuitively understand.

So we imagined, what if pixels didn't just have colour, but also depth? What if there was an intelligent material that was as simple as paper but could transform and change shape in response to touch? And this led us to a way of thinking that we call material design.

We drew inspiration from paper and ink. However, unlike real paper, our digital material can expand, reform and reshape intelligently. Material has physical surfaces and edges because the human mind is wired at its most primitive level to instinctively understand objects and their relationships. These seams and shadows provide meaning about what you can touch and how it will move.

In the real world, every small change in position and depth creates subtle but important changes in lighting and shadows. So as part of the L preview, we'll now allow app developers to specify an elevation value for any UI surface, and the framework will render it in correct perspective with virtual light sources and real-time shadows.
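
For developers, that elevation is a single property on any view. Here is a minimal sketch of the idea in Java, assuming a hypothetical card view in an Activity's layout (the resource IDs are illustrative, not from the keynote):

```java
import android.app.Activity;
import android.os.Bundle;
import android.util.TypedValue;
import android.view.View;

public class ElevationDemoActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_elevation_demo); // hypothetical layout

        // Hypothetical card surface defined in that layout.
        View card = findViewById(R.id.card);

        // Elevation is specified in pixels, so convert 8dp for density independence.
        float elevationPx = TypedValue.applyDimension(
                TypedValue.COMPLEX_UNIT_DIP, 8,
                getResources().getDisplayMetrics());

        // The framework casts a real-time shadow for this surface at the given depth.
        card.setElevation(elevationPx);
    }
}
```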

Material design is beautiful and bold because clean typographic layouts are simple and easy to understand. Your content is the focus. So the L preview will allow app developers to easily colourise all framework elements in your app to match the theme to your brand. And we're previewing a new support library that we call Palette, to easily extract colours from images and really put those vivid pictures front and center.
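
As a rough illustration of how the Palette library can be used, here is a small helper that tints a view with the vibrant colour extracted from a bitmap; the class and method names here are our own, not from the keynote:

```java
import android.graphics.Bitmap;
import android.support.v7.graphics.Palette;
import android.view.View;

public final class PaletteColorizer {

    // Extracts colours from the image off the UI thread and, if a vibrant
    // swatch is found, applies it as the background of the target view.
    public static void tintFromImage(final View target, Bitmap image) {
        Palette.generateAsync(image, new Palette.PaletteAsyncListener() {
            @Override
            public void onGenerated(Palette palette) {
                Palette.Swatch vibrant = palette.getVibrantSwatch();
                if (vibrant != null) {
                    target.setBackgroundColor(vibrant.getRgb());
                }
            }
        });
    }

    private PaletteColorizer() {}
}
```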

We're giving designers familiar tools like baseline grids that work across screens. Grids ensure apps have a consistent rhythm and character, and this will allow you to start with a design on a phone and logically and easily bring that same design to tablets and laptops.

Now one design doesn't mean one size fits all. Our guidelines allow you to appropriately adapt the UI, so your users will already know their way around your app no matter what screen they use it on.

And we've also updated our system font, Roboto, so that designers and developers can use one typeface, designed and optimised for every screen from your watch to your laptop to your television.

So now let's talk about animation. It's delightful when your touch is rewarded with motion. Material surfaces slide around with the physics of card stock, but they respond to touch with splashes of virtual ink that are like ripples in a pond.

As part of the L preview, all of your application's UI building blocks have been updated to include rich, animated touch feedback. And no detail is too small to bring a smile to your face, like when the reload button loops around or the playback controls change.

Finally, in the real world, nothing teleports from one place to another and that's why it's so important to animate every change on screen in a way that makes sense. In the L preview, Android developers will now be able to create seamless animations from any screen to any other between activities and even between apps.

So you're probably wondering what this looks like in practice. We're going to give you a sneak peek at one of our familiar Google applications in the new material design.

Here you can see step by step how we update the design. The typography, the grid changes, and finally the surfaces and bold colours.

And a few small changes make a really big difference. And you can also see how easy it is to take the same design to different screens. Now, I've talked about only a few of the highlights of material design and just some of the APIs that you can try out in the Android L preview.

But as you know, people spend an enormous amount of time on the web and especially the mobile web. Last year at I/O, we announced Polymer, which is a powerful new UI library for the web. Today we're bringing all of the material design capabilities to the web through Polymer. As a web developer, you will be able to build applications out of material design building blocks with all of the same surfaces, bold graphics and smooth animations at 60 frames per second.

So between the L preview and Polymer, you can bring the same rich, fluid material design to every screen. And to help you take full advantage of this framework, we've also completely redesigned and created one unified set of style guidelines for every screen and for all devices. These guidelines will help designers and developers alike understand best practices and build consistent, beautiful experiences.

We're releasing the first draft of these guidelines as part of our preview today at Google.com/design.

Now that you've seen our new look and feel, I'd like to invite Dave Burke to show you some of the new features in the Android L developer preview.

[Dave Burke — Android Director of Engineering]

All right. So over the last eight months, our team has been busy cooking up the biggest release in the history of Android. And as Sundar mentioned, we've added over 5,000 new APIs touching nearly every aspect of the system.

Now we don't have time to even come close to covering everything in L today. So instead, what I'd like to do is walk you through some of the highlights of the tremendous steps we're taking on the user experience and on the performance of the Android platform.

So let's start with user experience. Now, bringing material to L is of course a big part of what we're trying to do here. We've added a new material theme, so it's a new style for your application that includes new system widgets, transition animations and animated touch feedback.

We've also added new animation support, so a new drawable for animated ripples, a reveal animator to animate a clipping circle to reveal views and we've extended views to not just have an X and Y component, but also a Z component to provide elevation. So you can float elements of your UI and the framework will cast a real-time shadow for you.
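
A minimal sketch of that reveal animator using the L preview's ViewAnimationUtils; the helper wrapper is illustrative:

```java
import android.animation.Animator;
import android.view.View;
import android.view.ViewAnimationUtils;

public final class RevealHelper {

    // Shows a hidden panel by expanding a clipping circle from its centre.
    public static void reveal(View panel) {
        int cx = panel.getWidth() / 2;
        int cy = panel.getHeight() / 2;
        float finalRadius = (float) Math.hypot(cx, cy);

        Animator animator =
                ViewAnimationUtils.createCircularReveal(panel, cx, cy, 0f, finalRadius);
        panel.setVisibility(View.VISIBLE);
        animator.start();
    }

    private RevealHelper() {}
}
```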

My favorite feature that we've added in support of material is the ability to customise activity entry and exit animations. You can even include shared "hero" elements, for example, an image that starts in one activity and animates seamlessly through translation and scaling into another.
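
A hedged sketch of launching such a shared "hero" element with the L preview APIs, assuming a hypothetical DetailActivity whose matching view declares the same transition name:

```java
import android.app.Activity;
import android.app.ActivityOptions;
import android.content.Intent;
import android.view.View;

public final class HeroTransitions {

    // Starts DetailActivity (hypothetical) so that heroImage animates seamlessly
    // from this screen into the corresponding view in the next one. Both views
    // must share the same transitionName ("hero_image" here).
    public static void openDetails(Activity from, View heroImage) {
        Intent intent = new Intent(from, DetailActivity.class);
        ActivityOptions options = ActivityOptions
                .makeSceneTransitionAnimation(from, heroImage, "hero_image");
        from.startActivity(intent, options.toBundle());
    }

    private HeroTransitions() {}
}
```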

So let's take a look at this in practice. Let's have a look at an app we're all familiar with, which is the phone dialer. Thanks, Marcelo.

Okay. So the first thing you will notice when you fire up the phone dialer are those bold material colours and shadows. And you will see the ripple touch effect as I touch each of the tabs, and you get a more subtle material touch effect on the recent calls. You'll see that the dialer button has its elevation set, so it's floating above the UI, and as I tap it you get these really nice, delightful animations.

Now another feature we added to support material is something we call nested scrolling, and the idea is that as I scroll, we propagate the scroll events up the view hierarchy and different parts of your views can respond differently. So for example, as I scroll upwards here you will notice that the recent call to Marcelo will start to shrink and disappear, then the search box will start getting pushed upwards and the tabs will lock into place. It's a really nice effect.

So let's go over to the dial pad. So it turns out my mom's a big fan of material design. I need to go call her up and tell her about how to set elevations on her views. I know she'll love that. So let's start dialing. You will see that ripple touch effect again emanating out from the buttons. Then when I go to place a call, you will see the reveal animator, and it will animate into the in-call screen, like so. So it's a really nice effect.

Okay. So that's a quick taster of material in L and what you're seeing here is really a sneak peek of work in progress. We wanted to give you guys early access so you could start bringing material to your apps.

And we also recognise that changing the UI in such a big way will take some time. So we started with the dialer as a showcase. Over the coming summer months, we'll be extending material to all aspects of our apps and the system, and the result is going to be a dramatically enhanced, fresh user experience.

Okay. So another area where we've improved the user experience in L is around notifications. One of the most frequent reasons we all take our phones out of our pockets every day is to respond to incoming notifications. We all do this dozens and dozens of times a day. So we wanted to streamline the process, everything from the phone buzzing to you acting on the notification.

In L we give you instant interactive access to notifications right from the lock screen. So now you can read, open and dismiss in seconds.

So let's take a look at my device. The first thing you'll see are all my top notifications on the lock screen and we're rendering them as sheets of material, they animate really beautifully. If I touch them you can see that material touch effect.

Now in L, we've improved the way Android organises and prioritises notifications by analyzing this behavior to make sure only the most useful, relevant notifications are presented to you. I can swipe down and I get my full list of notifications. And we've done a clever thing here where we've merged the notification shade, something that's been in Android since 1.0, with the lock screen.

And so from here I can double tap on a notification to launch the corresponding app, or if there is something I don't need, I can just dismiss with a single swipe. And to unlock the phone, well this is just the notification shade, so you just swipe it away and you're straight into the device. Fast and simple.

We've also introduced a new type of notification in L that we call the heads-up notification. And this can be used to let you know about something urgent without interrupting what you're doing. So let's say I'm playing my new favorite game, Piano Tiles, and I'm going along here, I'm getting my highest score ever, and then all of a sudden I get a call from Marcelo. So from here I can keep going, or if I want to act on it I can, or if I'm busy, I can swipe it away. And then I can go back to my game and get the highest score I've ever got in public. Yeah. All right. That's actually the worst score I've ever got.
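
For app developers, both behaviours shown here hang off flags on the notification itself. A minimal sketch, with placeholder text and intents, of a call-style notification that shows its content on the lock screen and is eligible for heads-up display:

```java
import android.app.Notification;
import android.app.NotificationManager;
import android.app.PendingIntent;
import android.content.Context;

public final class CallNotifier {

    // Posts a notification that appears on the lock screen and, because it is
    // high priority with sound/vibration defaults, surfaces as a heads-up card.
    public static void notifyIncomingCall(Context context, PendingIntent answerIntent) {
        Notification notification = new Notification.Builder(context)
                .setSmallIcon(android.R.drawable.sym_call_incoming)
                .setContentTitle("Incoming call")
                .setContentText("Marcelo")                         // placeholder caller
                .setCategory(Notification.CATEGORY_CALL)
                .setPriority(Notification.PRIORITY_HIGH)           // eligible for heads-up
                .setVisibility(Notification.VISIBILITY_PUBLIC)     // full content on the lock screen
                .setDefaults(Notification.DEFAULT_ALL)
                .setContentIntent(answerIntent)
                .build();

        NotificationManager manager = (NotificationManager)
                context.getSystemService(Context.NOTIFICATION_SERVICE);
        manager.notify(1, notification);
    }

    private CallNotifier() {}
}
```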

Okay, anyway, let's move on. So while we've made the notifications more powerful, if you're one of the approximately 15% of people who have a PIN or pattern lock, you waste many minutes a day, cumulatively, on the fiddly task of entering your PIN. So we figured there's got to be a better way.

In L we're introducing a new concept we call personal unlocking. And personal unlocking enables the device to determine if it's in a trusted environment, say in the owner's hand or beside the owner on a table. Personal unlocking uses signals such as locations you designate, Bluetooth devices that are visible, even your unique voice print. So for example, let's have a look at this device. Thanks, Marcelo.

So I currently have a pattern unlock on this device, but because I'm wearing a Bluetooth watch my phone knows it's me who is present and so it doesn't challenge me with an unlock. So for example, if I just swipe up, the phone will unlock just like that.

Now, let me just reset that. If I take my watch off, let me just hand it to Marcelo. Okay. So now my phone can no longer see the watch. And because of that, my phone cannot ascertain that it's me who is present. As a result, my phone will lock down its security. So now when I go to unlock the device it presents me with a PIN lock. It's a really great feature.

So that's a few of the user experience improvements we've made to support material and notifications.

Another area of L where we're significantly improving the user experience is around how we've integrated the mobile web into the platform. So to learn more, let me invite Avni Shah up to the stage.

[Avni Shah — Director of Product Management]

Thanks, Dave. A core part of your experience with mobile devices is the mobile web. Just to get a sense of the growth that we've been seeing, at the beginning of last year, we had 27 million monthly active users of Chrome on mobile. Today we have more than 300 million. That's 10x growth. Yeah, it's awesome. It's 10x growth in just the last year alone.

What that means for us is that we need to make the mobile web work well for our developers and our users. Today, I'm going to talk about three ways we're going to do that.

We're enabling material design experiences on the mobile web. We're redesigning recents to help you multitask. And we're extending the capabilities of app indexing to help people get to where they want to go faster.

First let's talk about material design. One of the big parts of your experience with the mobile web is, obviously, the web sites themselves. They need to work well. They need to look great. They need to be fun to use. You heard Matias earlier talking about the philosophy of material design, a bold, consistent, intuitive experience that just works across screens. We've been working really hard at making those experiences not just possible, but the new standard for the mobile web.

To show you what this looks like, my good friend Tom here is going to walk us through an exploration of google.com search results on the mobile web re-envisioned with material design.

So, Tom, let's go ahead and do that search for a "starry night".

Now, the first thing that you see here is that this panel is rendered as a beautiful material-style card. You notice the use of colour: the title is on a blue background that was actually programmatically matched to the painting. If Tom clicks to expand the card, you'll notice it fills the screen with a continuous animation. If he scrolls, the header will shrink. It won't pop into place, but it has a smooth animation that just makes sense.

Now let's go ahead and click on the suggestion at the bottom to get more of Van Gogh's artwork. You'll see those search results also smoothly animated into place. Tom is going to continue to give us some demo eye candy over here.

And while this is just an exploration that you're seeing, I want to mention that this is fast, fluid, continuous animation at 60 frames per second. This just wasn't possible a year ago.

We've been working really hard at improving the performance and predictability of the platform to make things like this possible. For example, this demo shows off the work that we've done on touch latency, giving you as a developer notice of touch events earlier in the frame so you have more time to act.

As Matias mentioned earlier, with Polymer — our UI toolkit for the web — all of you can build web experiences that feel as awesome as this.

The next big area we've been thinking about is how to help you multitask. And we think the recents feature in Android is one way we can actually make this easier, especially as tasks span the web and apps, as they often do. So once again, Tom is going to walk us through the changes here.

So, Tom, let's go ahead and click on the recents icon in the lower right. Now, as Tom scrolls through, the first thing that you will notice is recents has also been grounded in material design. You'll notice the overlapping cards have been rendered with realistic shadows and perspective.

But there is another thing going on here that may not be immediately apparent: Tom's Chrome tabs also sit here as well. He's been researching restaurants to go to in SF, so he has articles from the New York Times and the SF Chronicle here as individual items. You'll notice the site icons and the favicons there.

As he scrolls back a bit further, you'll notice he has been researching in the Urbanspoon app, and he has a doc open where he's been collaborating with some friends.

So let's go ahead and click on that doc and see what his friends have to say.

I've heard great things about State Bird Provisions. Let's check out that article here. Now, what you see here as this loads, this is actually loading as a web site in Chrome. You'll notice the URL up at the top. Now as Tom pops back into recents, that page is now listed there, along with all of his other open stuff.

I want to point out what the big difference is here. This is a view you couldn't get before today. If you wanted to get to all your open websites, you'd have to go into Chrome and kind of flip through them there. But by bringing all of your individual Chrome tabs here and listing them in your recents view, we're making it really easy for you to move between the web and apps, making multitasking just that much easier.

And last, but not least, this change to Chrome is actually built on top of a new API in L that allows apps to populate multiple items in recents. So for all you app developers, if this kind of thing makes sense for you, you can make use of it as well.
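
A minimal sketch of that platform mechanism: launching content with the new document flags so each item gets its own card in recents (how Chrome itself wires this up is not detailed in the keynote):

```java
import android.app.Activity;
import android.content.Intent;

public final class RecentsDocuments {

    // Launches the given content intent as its own document, so it appears as a
    // separate card in the recents list rather than inside the app's main task.
    public static void openAsDocument(Activity activity, Intent contentIntent) {
        contentIntent.addFlags(Intent.FLAG_ACTIVITY_NEW_DOCUMENT
                | Intent.FLAG_ACTIVITY_MULTIPLE_TASK);
        activity.startActivity(contentIntent);
    }

    private RecentsDocuments() {}
}
```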

Going a step further, we're also making it easy for you to find content using Google search, whether it's deep in the web or deep in an app. So last fall, we announced app indexing. As a developer, this capability lets you get your users to app content directly from the search results page. Since then, we've been working on a ton of UI improvements and extending some APIs to make this more powerful.

But let me just give you a quick refresher of what this capability enables. So let's go ahead and do a search for Waterbar Restaurant, I've heard good things about it, over by the Embarcadero. As Tom scrolls through the search results, you'll see close to the top there's a link for the home page to Waterbar. And near the bottom of the screen, there's — oh, actually, in the middle of the screen, there's a result for OpenTable.

Now, what's different about this UI is that this link to OpenTable, instead of going to the website, is actually going to take us to the OpenTable app, because Tom happens to have the app installed.

So let's go ahead and click on that link. And you'll see it takes us directly to Waterbar within the OpenTable app. Up until now, this was only available to a few applications. But today we're opening it up to all Android app developers globally, along with some tools to get you started.

And going further, if your site requires — sorry, if your app requires your users to sign in, you'll be able to use Google+ sign-in in the coming months to have your public content show up in search as well. You know, we thought this would be even better if we could help your users rediscover content that they've already found in your apps. So we're adding a new API in Google Play services to do just that.

So let's quickly show you how this works. Tom found this really cool 3D tour of the Ferry Building earlier and he wants to get back to it. So starting with the search box on his home screen, he's going to do a search for Ferry Building. And what you will notice at the bottom of the screen is that there are search suggestions for Ferry Building Marketplace in the Google Earth app. And this is there because this was the app that he was using when he found that tour before, even if he himself didn't remember. With a single click, he'll get taken directly to the tour of the Ferry Building within the Google Earth app.

Now, this is possible because the app is making its content available based on its user's previous actions. We just showed you this with Google Earth. But any app that utilises this new API will have the same capability. For developers, we think this is a great way for you to help your users rediscover content right when they're looking for it.

With that, I'll hand it back to Dave who is going to take you through more of the enhancements you can look forward to in L.

[Dave Burke — Android Director of Engineering]

Thanks, Avni. So we've covered some of the highlights of the user experience. But there's lots of other user experience improvements in L, for example, the new keyboard UI, Do Not Disturb mode, your quick settings and much, much more.

Android Virtual Machine

But in the interest of time, let's move on to the second major theme of L. And that's performance. Let's start with the Android Virtual Machine. So you might remember that we made a very early version of our new runtime, ART (Android Runtime), available as a developer option in KitKat. But we got some really great feedback from you guys, as well as some excellent open source contributions from ARM and Intel and MIPS. And I'm excited to say that we're finally ready to pull the trigger on this bad boy, because the L release runs exclusively on the new ART runtime.

So we wrote ART from the ground up to support a mix of ahead-of-time (AOT) compilation, just-in-time compilation and interpreted code. It's truly cross-platform, so it supports ARM, x86, and MIPS. We put a lot of effort into optimizing ART's back-end compilers. And this has resulted in a 2x performance improvement over Dalvik. And best of all, this one is on us. You don't have to make a single change. All of your app code just gets the performance improvement for free.

ART also has a brand-new garbage collector and memory allocator. So this dramatically reduces the number of pauses and duration of pauses associated with a garbage collection event. As a result, your app runs more smoothly. So if you take a look, for example, at Google Maps on both Dalvik and ART, first you'll notice the number of pauses has been reduced from two to one. But also the pause duration has been reduced from roughly 10 milliseconds down to about 2 to 4 milliseconds. So now it fits comfortably in a vsync window. No more application stutters.

And there's more. ART doesn't just bring better performance, it's also more memory efficient. So it's smart about when the app is put into the background, in which case we'll apply a slower but more intensive moving collector to save anything from hundreds of kilobytes to many megabytes.

And finally, ART is fully 64-bit compatible. In fact, we've adapted and optimised the entire platform to take advantage of the new 64-bit architectures. So now you can benefit from larger number of registers, newer instruction sets and increased memory addressable space.

So to take advantage of 64-bit, we've added support for new ABIs in the NDK: ARMv8, x86-64 and MIPS64. And of course if your app is written in Java, then it will work with absolutely no modification on new 64-bit hardware.

Okay. So that's CPU performance. The other side of the coin is GPU performance, graphics. And I'm really excited about some of the things that we're doing in L in this area.

So historically, mobile graphics has lagged behind desktop by virtue of the fact that mobile GPUs are smaller and more power-constrained. But that's changing quickly. Mobile GPU performance is catching up with console graphics and even PC graphics. So in L, we specifically wanted to close the gap between desktop DX11-class graphics capabilities and mobile. And we're doing that with something we call the Android Extension Pack.

So we set out to work with GPU vendors, including NVIDIA, Qualcomm, ARM, and Imagination Technologies, and together we defined the Android Extension Pack. It's a set of features that includes things like tessellation, geometry shaders, compute shaders, and advanced ASTC texture compression.
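
Before relying on those features, an app can check whether the device actually advertises the pack. A small sketch in Java, assuming a current GL context on the calling thread:

```java
import android.opengl.GLES20;

public final class AepSupport {

    // Extension string advertised by devices implementing the Android Extension Pack.
    private static final String AEP = "GL_ANDROID_extension_pack_es31a";

    // Must be called on a thread that has a current OpenGL ES context.
    public static boolean isSupported() {
        String extensions = GLES20.glGetString(GLES20.GL_EXTENSIONS);
        return extensions != null && extensions.contains(AEP);
    }

    private AepSupport() {}
}
```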

So let's take a look at the Android Extension Pack in action. And what you're about to see is Epic's Unreal Engine 4 desktop rendering pipeline running on Android L on the latest NVIDIA tablet hardware. Now the Android Extension Pack enables much more advanced shaders, so we can have more realistic environments, more realistic characters, and vastly improved lighting.

So let's start this off. We have a bit of a flicker here.

Well, if you're squeamish, you might not want to look now.

So you have to excuse some of the flickers. It's a veritable electromagnetic storm up here.

Okay, so as I mentioned, this isn't just cut-scenes, it's actually live and we can play through the world. Hopefully we can get that up in a moment. Some of the rendering that you saw was truly incredible. There were really amazing reflections in the water and lighting effects, and tessellation was being used for the smoke effects.

And starting with the L release in the fall, you're going to see new high-end tablets and phones shipping on Android with this level of graphics capability. So quite literally, this is PC gaming graphics in your pocket.

The last performance enhancement I want to take you through is battery. And we've worked hard to make sure that the battery keeps up with the performance. And of course, there are a variety of systems and components that tax the battery on a modern phone or tablet: Wi-Fi radios, cell radios, GPS, GPU, et cetera.

And you might remember we've had some previous efforts to improve quality in other releases: Project Butter for UI smoothness in Jelly Bean, and Project Svelte for memory footprint in KitKat.

Well, in the same vein, and brought to you by those same project-naming geniuses, we have Project Volta. And the goal of Project Volta is to optimise how the expensive subsystems of the device are used and to improve overall battery life. So the first thing we did was improve our instrumentation of battery data. You can't improve unless you can measure. So we created a tool that we call Battery Historian, and it helps you visualise battery usage information on a time axis.

Now you can correlate battery discharge with what was happening on the device at the time. So here's an example of Battery Historian from a real device. On the top graph, you can see an issue where the radio is waking up approximately every 20 seconds. Battery Historian helps us quickly identify the issue so we can fix it and therefore improve battery life. In fact, we're using this tool to make the system and Google apps more efficient, so you can expect a significant battery improvement in L.

We've also added a new JobScheduler API to help you optimise power consumption in your apps. So using the JobScheduler API, you can make your application more efficient by allowing the platform to coalesce non-urgent network requests from multiple apps. As a result, the platform can keep the radio asleep a higher percentage of the time, thus saving significant power.
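
A minimal sketch of scheduling such a deferred job with the new API; SyncJobService is a hypothetical JobService subclass in your app:

```java
import android.app.job.JobInfo;
import android.app.job.JobScheduler;
import android.content.ComponentName;
import android.content.Context;

public final class SyncScheduler {

    private static final int JOB_ID = 42; // arbitrary app-defined id

    // Defers a non-urgent sync until the device is charging and on an unmetered
    // network, letting the platform batch radio wake-ups across apps.
    public static void scheduleDeferredSync(Context context) {
        JobInfo job = new JobInfo.Builder(JOB_ID,
                new ComponentName(context, SyncJobService.class))
                .setRequiresCharging(true)
                .setRequiredNetworkType(JobInfo.NETWORK_TYPE_UNMETERED)
                .setPersisted(true) // survives reboots
                .build();

        JobScheduler scheduler = (JobScheduler)
                context.getSystemService(Context.JOB_SCHEDULER_SERVICE);
        scheduler.schedule(job);
    }

    private SyncScheduler() {}
}
```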

So you can use the JobScheduler API for your app to schedule a maintenance task while the phone is connected to the charger, for example, or even just to download application updates. I'm sure you've all had that experience where you're out and about, your battery is at about 4%, and then applications start updating themselves. It's like, "No!" And then finally, we've added a new Battery Saver Mode in L.

So Battery Saver allows you to clock down the CPU and the refresh rate, and even turn off background data to conserve battery. And you can trigger it manually or configure it to come on automatically when the battery level is low. So Battery Saver is really great if you're about to embark on a long hike, or maybe a long protest, and you want the battery to last even longer.

So on the Nexus 5 running in Battery Saver mode, you can extend your battery life by up to 90 minutes of usage within a typical single day's use.

So I just gave you a quick whirlwind tour of some of the highlights of L: how we're improving the user experience through steps like improved design, smarter notifications, and intuitive authentication, and also the enhancements on the performance side, so a faster runtime, better graphics, and stronger battery performance.

But I've only scratched the surface of L. And as I mentioned at the start, this is our biggest release to date. You're going to find things like better multitasking, Bluetooth 4.1, a burst mode camera API, USB audio support and much, much more.

Tomorrow morning, we're going to be making the L developer preview SDK available from developer.android.com and also posting early system images for the Nexus 5 and Nexus 7 so you can start developing for L today.

So with that, let me hand back to Sundar. Thank you.

[Sundar Pichai]

Thank you, Dave. As Dave said, the L release with 5,000 new APIs is one of our most comprehensive, and we are very excited we are sharing it today.

We have a whole new design with L, tons of UX features, and a whole slew of performance improvements. When you take a step back and you look at what we are doing with Android, the approach we are taking is very unique and distinctive. We aren't building a vertically integrated product. What we are doing is building an open platform at scale. We work with hundreds of partners globally to bring the product and the platform that touches billions of people. And we want to do it in a way in which we are innovating at a very, very fast pace.

If you take a look at the innovation that's happening in Android and if you look at some of the recent announcements from others, you can see that things like custom keyboards, widgets, those things happened in Android four to five years ago. We are working very, very hard to bring an open platform and innovate on it at an unprecedented scale. We want to make sure we ship these features to users as fast as possible. That's where Google Play Services comes in.

Google Play Services ships every six weeks. And 93% of our users are on the latest version of Google Play Services across all versions of Android. In fact, by shipping every six weeks, we in many ways can iterate faster than typical OS release cycles. While it's open platform and we want to innovate fast, we want to make sure it's very, very secure as well. So we take security very seriously.

Let's take malware protection as an example. In Google Play, we automatically scan every single application for malware. And if users opt in, we even scan applications from outside of Google Play to make sure they are malware-free. Given the popularity of Android, there's a whole vested industry around security, given there's a lot at stake. But based on all the data we see, well, well less than half a percent of users ever run into any malware issues. And increasingly, we are pushing security updates through Google Play. So any security updates related to Google server communications, we are now pushing those through Google Play Services so that we can get them to users within six weeks.

With L, we are also launching factory reset protection, so that if your phone gets stolen, users have full control to disable their phones.

Finally, privacy is an important part of security. So with L release, for the first time, we have a centralised setting, what we call universal data controls, where users can go and manage their important privacy protections. They can control data that is shared from the device, like location history, et cetera. And so we are doing that in L as well.

So far, we have been talking about L release in the context of mobile phones and tablets. But users increasingly are living in a multiscreen world. You are using other connected devices: the television in your living room, you're increasingly wearing things on your body, you are — when you get into your car, you expect a connected experience. We want to work to create a seamless experience across all these connected devices.

So with L, as well with Chrome, we started laying some foundational principles on how we evolve our platforms to support these new connected experiences. So here are a few principles.

We are making everything contextually aware. We want to understand whether you're at home with your kids and you want to be entertained, or you're at work trying to be productive, or maybe you are traveling. We want to bring the right information to you at the right time. We want the experience to be voice-enabled. We are building the most advanced voice recognition infrastructure in the world. And we want to help users interact with computing devices in an intuitive way. For example, when they are driving or cooking, we want voice to be a major source of input. We want the experience to be seamless. It shouldn't matter which device you were using before. We want to pick up where you left off.

And, finally, users always have their smartphone with them. So we want to make sure all these connected experiences work based on your smartphone, be it your wearables, be it your car, or, like we have shown with Chromecast, your television.

So both with L release and Chrome, we are bringing a whole set of new experiences to many connected screens around you. The first area we are going to talk to you about is wearables.

About three months ago, we launched our preview of Android Wear. We announced a developer SDK, and the reception has been very, very positive. To give you a further update, I'm going to invite David Singleton onto the stage.

[David Singleton — Android Director of Engineering]

We're right at the beginning of a new phase in the miniaturisation of technology, which means that it's finally possible to make a powerful computer small enough to wear comfortably on your body all day long. And there's a huge opportunity to bring rich user experiences to these devices. And that's why we're building Android Wear as our platform for wearables, based on Android.

Android Wear makes it easy for developers to reach users on this new form factor using precisely the same tools we're already familiar with on Android phones and tablets. People will be wearing these small, powerful devices, so style is important. And that's why Android Wear supports both square and circular screens. And we think that there will be a wide variety of fashionable designs.

Sensors will help these devices understand your context, so they can provide useful information when you need it and help you reach your fitness goals. And as the device that you always have with you, your watch will also provide intelligent answers to spoken questions and, as Dave showed us earlier, act as your key in a multiscreen world.

Across the world, people check their Android phones an average of 125 times every day. And that's why we've designed Android Wear to quickly show you relevant information and make sure you never miss an important message, while letting you stay engaged with the people that you're actually with. We do this by working to understand the context of what you care about while enabling very brief interactions with the device.

Here's a live demo on the LG G watch. You can see that it has an always-on display, that at any given time shows you the most important thing we know for you. So, Jeff, it looks like your flight to Brazil for the World Cup is on time. I guess you do deserve a break after this big demo. And if Jeff wanted to see more, he can simply raise his watch or tap the screen to switch into vibrant, full colour that you're already seeing here.

Throughout the day, if Jeff receives a notification which buzzes his phone, his watch will vibrate on his wrist and show him what's up at a glance, so he won't miss an important message like this one. Swiping up and down navigates you through this stream of cards, which includes information from Google Now, apps running on Jeff's phone, and apps running directly on the wearable itself. And when there's a page indicator, Jeff can swipe horizontally to see more details.

You can see that we've applied material design here. The cards float above beautiful, textured backgrounds. And just like in your phone's notification shade, you can swipe a card away to remove it from your screen. Let's take a look at Jeff's phone. And that notification has disappeared.

Back at the watch face, pressing and holding lets you choose a different one. You can see that there's a broad selection of analog and digital designs in a variety of styles to suit your tastes. Okay. Now that we're acquainted with the overall UI model, let's see how Android Wear can work for you.

Imagine that Jeff has just got up in the morning. He swipes up and sees the weather forecast for the day. His commute's not looking too bad. And, oh, look, Jeff, I guess you need that package for your trip to Brazil. You better not forget to pick it up.

[Jeff]

Okay, Google, remind me to check my mailbox when I get home.

[David Singleton]

If we can see Jeff's phone at the same time, you'll see this is immediately synced across. And in this case his watch was smart enough to know where home is. A little later on, as Jeff is arriving at the office, his watch vibrates again with a chat message from one of the team. He can see who it's from and what he's saying, all without having to fumble around and get out his phone. Your watch and your phone stay in sync. When you swipe away a notification on the watch it disappears from the phone as Jeff is showing now.

It's super convenient, and apps will stay in sync too. While Jeff is making his lunch, he finds it really hard to get the last of the peanut butter out of the jar, and he comes up with an idea, possibly a very brilliant one, that he wants to remember.

[Jeff]

Take a note, double sided peanut butter jar.

[David Singleton]

Watch his phone. We'll try that one more time.

[Jeff]

Take a note, double sided peanut butter jar.

[David Singleton]

I guess the universe doesn't want there to be a double sided peanut butter jar after all.

So, moving along, that note would have been saved immediately to his phone and he can get on with his lunch. In the evening, Jeff is having dinner with a friend at a restaurant. If he's unfamiliar with one of the ingredients on the menu he can just say…

[Jeff]

What is limburger?

[David Singleton]

So it looks like limburger is a type of cheese. Jeff is lactose intolerant, so he'd better order something different or this dinner could go wrong. And when Jeff receives a phone call his watch will vibrate and he can see who is calling at a glance. It's another one of Jeff's co-workers. Now Jeff could get out his phone to answer, but since he's busy he can either swipe to reject the call from his wrist or swipe up to choose from one of these SMS replies. His phone sends the SMS and he's done.

Sometimes you're enjoying dinner so much that you want to avoid any more interruptions. And for that you can set do-not-disturb with a single downward swipe from the top of the screen.

And now Jeff's watch won't buzz again until he wants it to. Later that night Jeff arrives home. Oh, that's right, your package is here. Now that he's at home, the reminder that Jeff created this morning has triggered.

You can also use Android wearables to control other devices around you. Let's listen to a bit of music.

[Jeff]

Play some music.

[David Singleton]

Now you will see that Jeff has music controls on his watch. He can see what song is playing, he can pause the music or skip to the next track. And while it's playing the album art is beautifully displayed right there on his wrist.

Finally at the end of the day it's time for bed.

[Jeff]

Set an alarm for seven AM.

[David Singleton]

With glanceable notifications and quick voice actions, Android Wear gives you what you need, right when you need it. Let's take a closer look at some of the contextual information that Android Wear provides when you're traveling. So Jeff's about to leave on that big trip to the World Cup. It's the morning of his flight, so his watch is already displaying relevant information for his trip. He can see his flight status, and even show his boarding pass. His hotel address will be there when he needs it, and he knows whether or not he will need to pack an umbrella. It does look like it's going to rain in Brazil on Friday.

And once he's in Brazil, Android Wear continues to give him useful, timely information at a glance, whether it's his restaurant reservation, the time back at home so he knows when to call his family or the local bus schedule. And while he's walking around the city, Jeff can see how many steps he's taken today along with a step count history for the week. On devices that support it he can even check his heart rate after a jog.

So we've shown you what Android Wear can do out of the box. We're even more excited to see what developers build on top of this platform. The Android Wear SDK enables you to build glanceable, contextual apps for this new category of device. Let's talk through the capabilities it gives to developers and then we'll show you some examples.

Right off the bat, Android Wear automatically bridges notifications from your Android phone or tablet directly to your watch. Now, Android's notification APIs already allow you to build beautiful user interfaces with big pictures, actions and more. And there are hundreds of thousands of apps delivering billions of these notifications every day. And now they're available on your wrist.

Back in March, we released a developer preview enabling apps running on the phone to add things like voice replies, have several pages and group notifications in bundles. With these additions you can begin to provide a tailored experience for wearables, and we've used these features to add Wear support to Google apps like Hangouts and Gmail, and there's been a huge response from developers.
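
As an illustrative sketch of those additions using the support library's WearableExtender (the message text, icons and pending intents are placeholders):

```java
import android.app.Notification;
import android.app.PendingIntent;
import android.content.Context;
import android.support.v4.app.NotificationCompat;
import android.support.v4.app.NotificationManagerCompat;
import android.support.v4.app.RemoteInput;

public final class WearReplyNotification {

    private static final String EXTRA_VOICE_REPLY = "extra_voice_reply";

    // Posts a chat notification that is bridged to the watch with a voice-reply
    // action and an extra page of detail.
    public static void notifyMessage(Context context, PendingIntent replyIntent) {
        RemoteInput remoteInput = new RemoteInput.Builder(EXTRA_VOICE_REPLY)
                .setLabel("Reply")
                .build();

        NotificationCompat.Action replyAction = new NotificationCompat.Action.Builder(
                android.R.drawable.ic_menu_send, "Reply", replyIntent)
                .addRemoteInput(remoteInput)
                .build();

        Notification detailPage = new NotificationCompat.Builder(context)
                .setContentTitle("Full message")
                .setContentText("See you at the rehearsal at 3pm?") // placeholder
                .build();

        Notification notification = new NotificationCompat.Builder(context)
                .setSmallIcon(android.R.drawable.stat_notify_chat)
                .setContentTitle("New message from Marcelo")        // placeholder
                .setContentText("See you at the rehearsal?")
                .extend(new NotificationCompat.WearableExtender()
                        .addAction(replyAction)
                        .addPage(detailPage))
                .build();

        NotificationManagerCompat.from(context).notify(1, notification);
    }

    private WearReplyNotification() {}
}
```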

The very best wearable apps respond to the user's context, put glanceable cards on the screen and allow the user to take direct actions in just a few seconds. Here is one of my favorite examples. With Pinterest you can follow other people's pins. The Pinterest app will let you know when you're near a place that has been pinned by someone you follow. So Jeff's friend Susie loves Korean barbecue, and she's somewhat of an authority on the best restaurants in San Francisco. So when Jeff is in the city, Pinterest can notify him that he's near one of Susie's pinned restaurants. The notification will appear on his wrist just like this, and it uses pages, allowing him to quickly glance at the details, then swipe to see a map. And if he likes it he can start navigation right from his wrist.

This is using Google Maps for mobile, which gives you turn-by-turn navigation on your watch. And it's particularly useful when you are walking, and it works with all Android Wear devices. In addition to what's possible with notifications bridged from the phone, today we're making a full Android Wear SDK available which enables you to write code — it's pretty great. It enables you to write code that runs directly on the wearable itself, and almost all the APIs that you're already familiar with on Android are available here. That means that you can present fully customised UI, read sensors directly and much, much more.

We're also introducing a new set of APIs in Google Play services that makes it easy for your app to send data between a phone or tablet and a wearable. And we've road tested these APIs with some developers over the past few weeks. Let's take a look at examples of what they built.
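
A hedged sketch of pushing a small piece of state from the phone to a connected watch with those Data Layer APIs; the path, key and synchronous connection are purely illustrative:

```java
import android.content.Context;
import com.google.android.gms.common.api.GoogleApiClient;
import com.google.android.gms.wearable.PutDataMapRequest;
import com.google.android.gms.wearable.PutDataRequest;
import com.google.android.gms.wearable.Wearable;

public final class RecipeSync {

    // Syncs the current recipe step to any connected wearable. Call off the
    // main thread, since blockingConnect() would otherwise block the UI.
    public static void sendCurrentStep(Context context, String stepText) {
        GoogleApiClient client = new GoogleApiClient.Builder(context)
                .addApi(Wearable.API)
                .build();
        client.blockingConnect();

        PutDataMapRequest mapRequest = PutDataMapRequest.create("/recipe/current_step");
        mapRequest.getDataMap().putString("step_text", stepText);

        PutDataRequest request = mapRequest.asPutDataRequest();
        Wearable.DataApi.putDataItem(client, request);

        client.disconnect();
    }

    private RecipeSync() {}
}
```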

Eat24 is an app that makes food ordering both fun and easy. Watch this. Hopefully I'm going to order a pizza in 20 seconds. When it comes to takeout I'm a creature of habit, and Eat24 has recognised this and takes advantage of that contextual stream. Around the same time I made an order last week, it posted a notification suggesting I order again. I can tap on the notification and launch into their full-screen UI. And here I'm presented with a beautiful interface that lets me confirm the kind of food I'd like today. Let's stick with pizza. And then I can quickly swipe to see and repeat my last order. Just one more tap to pay. And the pizza is on its way.

I think that clocked in under 20 seconds. Now, you might be wondering how this app got to my watch. Well, all I had to do was install the Eat24 app from the Play Store on my phone. When a watch is connected, the wearable portion of the app is automatically installed and kept up to date on that watch.

I mentioned the new wearable APIs for easy communication between phone and watch. All The Cooks is a social recipes app which has made really great use of these APIs. I don't know about you, but I find it really hard to follow recipes especially when it gets to the tricky bits where everything is happening at the same time. So wouldn't it be more convenient if I could just look down at my watch and see what to do next?

With the All The Cooks app I can choose a recipe — let's go into my favorites and choose beef brisket chili. The recipe will immediately appear on my watch so it's always right there with me. Let's get started.

I've got all the ingredients, so let's start following the steps and now watch the phone carefully. As I move from step to step the phone stays in sync too. And if you're wondering whether or not it's safe to wear your watch while cooking, it's great to know that all the devices we're talking about today are water resistant.

And with All The Cooks, whenever a recipe calls for a timer like this four hours in the oven, I can do that right away on my wrist. So no more burnt dinner.

We saw some great examples of voice actions earlier today. And we believe voice actions will be most useful when they can invoke the best service in any app. We're just getting started with this, but we're making voice available for some key actions on the wearable today and we'll be adding more over the coming months.

Lyft is a transportation service and ride-sharing app that allows you to request a car to pick you up at your exact location. Lyft has implemented our "call me a car" intent, so it's really easy to just walk outside and say –

[Jeff]

Okay, Google, call me a car.

[David Singleton]

You'll see that Lyft is able to determine Jeff's exact location from his phone and presents this confirmation screen so he can verify his address. The app has also made great use of notifications in the stream. You can see when your car has arrived, keep up to date throughout the journey, and even rate your driver right from your wrist when you're at your destination. Thanks to all our developers.

Now, we showed a preview of a couple of watches we were working on with our partners back in March. The LG G watch will be available to order later today on the Play Store.

In addition, you might have caught a glimpse of a new device during the demos. We're very happy that Samsung is joining the Android Wear family with the Samsung Gear Live. And the Samsung Gear Live is also available to order later today.

The Moto 360 is the first watch to adopt the round Android Wear UI and it will be available for sale later this summer. Those are just the first three watches. There are many more on the way and we're thrilled to enable developers across the world to build apps for what we believe will be a revolutionary new form-factor.

And now I'd like to invite Patrick Brady on stage to tell you about how we're bringing Android to the car.

[Patrick Brady – Android Director of Engineering]

Thank you, David. Isn't that great? Android Wear creates a seamless experience by connecting your Android smartphone to a wearable device. And the result is truly amazing. Wouldn't it be great if all of your devices were this connected? For many of us cars are an integral and essential part of life. They bring us to the grocery store and take us on weekend trips. They bring us to work and take us home.

In fact, in the United States the average commuter spends over one hour in the car every day. In many ways our cars keep us connected to the physical world around us, but they remain disconnected from our other devices and our digital lives.

So what have drivers done to bridge this divide? Well, even though it's unsafe and in many cases illegal, people use their phones while driving. And reports show that 25% of accidents in the US are caused by people fumbling with gadgets behind the wheel. There's got to be a better way.

So back in January, we announced the Open Automotive Alliance to address this problem and make the connected car a reality. We'd like to show you what we've all been working on. And today we're happy to announce Android Auto.

We've redesigned the Android platform for automotive, making it easier and safer to use the connected apps and services drivers want in the car. We looked at what people do with their phones in the car today, and these things stood out to us: navigation, communication, music, and other forms of streaming media. Android Auto puts them front and center so you don't have to go hunting through the grid of icons to find the apps that are most important to you when you're in the car.

Android Auto is contextually aware to give you the right information right when you need it. And most importantly Android Auto is completely voice enabled so that you can keep your hands on the wheel and your eyes on the road. You know, we really wanted to drive a car up here on stage and show you this live in action, but apparently there are these regulations and logistics that make driving a vehicle in a room packed with 6,000 people a very hard thing to do. So we set one of our engineers on the problem and apparently this is what happens when engineers have access to a blowtorch.

So we're down one test car, but we have a great demo cockpit to show you. And now I am happy to introduce Andrew Brenner, our product manager who will literally drive this demo.

So to start, Andy connects his Android phone to the car and the phone casts the Android Auto experience to the car's screen. Andy can now put his phone down and use the familiar car controls, steering wheel buttons, console dials, and touch screens, to control Android Auto. It looks and feels like it's part of the car. But all the apps we see here are running on Andy's phone, which means the experience gets better when Andy updates his apps or gets a newer, faster phone. This also means that Andy has a personalised experience that he can bring with him into any compatible car.

The first thing Andy sees is the overview screen, which shows personal and contextually relevant destinations, reminders, contacts, and music from Google Now and other apps. One tap and he is navigating or listening to his favorite road trip mix. Andy, why don't you play us some music.

Let's look for a second at Play Music. It has been adapted to have simple, glanceable controls for the car. Andy has access to all of his curated playlists, radio stations, albums, and artists, and to all the key features in Google Play Music. He can also use voice or the steering wheel controls to control the music in the car, keeping his hands on the wheel. Fantastic. Of course, Android Auto needs great maps and navigation. So let's show you Google Maps.

We all love Google Maps because it's fast, accurate, updated, and it seems to know where everything is. In Android Auto, drivers have access to all of their favorite maps features: great local search, personalised suggestions, live traffic, and, of course, turn-by-turn navigation. And Google Maps for Android Auto is even more powerful because it is completely voice-enabled.

Andy, why don't you take us for a drive?

[Andy Brenner]

How late is the De Young museum open today?

[Google]

De Young museum is open from 9:30 AM to 5:15 PM on Wednesday.

[Andy Brenner]

Good, I can go there. Navigate there.

[Google]

Navigating to De Young museum…

Head northeast on Main Street toward Fourth Street. In 600 feet, use any lane to turn right onto Fourth Street.

[Patrick Brady]

So Andy was able to start navigation without ever entering an address or taking his hands off the steering wheel. During navigation, instructions are spoken, as you heard, and displayed on the screen in a material card that floats above the map. Great. So that's music and navigation.

What's next? Let's show you voice-enabled messaging.

[Google]

New message from Hiroshi Lockheimer. Here it is: Andy, are we there yet?

[Patrick Brady]

As you can see, incoming messages show up as heads-up notifications. So Andy can still see the upcoming turn in Maps. When he's ready, he can just use the steering wheel voice button to reply.

[Andy Brenner]

Reply.

[Google]

What's the message?

[Andy Brenner]

I have no wheels.

[Google]

Here's your message to Hiroshi Lockheimer: I have no wheels. Do you want to send it?

[Andy Brenner]

Sure.

[Google]

Sending message.

[Patrick Brady]

So we're really excited to bring these great experiences into the car. But we also want you, our developers, to come along for the ride. We know it's not easy to build apps for cars today. There are dozens of different car platforms, input controls, and user interfaces. There is no central way to distribute your app or keep it updated.

Wouldn't it be great if building an app for the car was just like building an app for your smartphone or tablet? Well, we have good news for you. The road ahead is brighter, and today we're announcing the Android Auto SDK so that you — we thought you'd like that — can just focus on making great apps for the car.

We're starting with a full set of APIs for audio and messaging applications. First, let's talk about audio. We worked with a great set of developers on a prerelease version of the Android Auto SDK to develop some great audio streaming apps that let you listen to music, internet radio, news, sports, and podcasts on the go. You can try these apps out live in our demo cars right outside.
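The keynote doesn't spell out the audio API surface, so the following is only a sketch of how such an audio app could plug in, assuming the media-browsing and media-session classes that ship with the public "L" release: the service exposes a browsable catalog for the car's UI and a session that the car's controls drive. The catalog contents and class names are placeholders.

import android.media.MediaDescription;
import android.media.browse.MediaBrowser;
import android.media.session.MediaSession;
import android.os.Bundle;
import android.service.media.MediaBrowserService;
import java.util.ArrayList;
import java.util.List;

// Sketch of an audio app's entry point for the car. The service would be declared
// in the manifest with an intent filter for "android.media.browse.MediaBrowserService".
public class PodcastService extends MediaBrowserService {
    private MediaSession session;

    @Override
    public void onCreate() {
        super.onCreate();
        // The car's play/pause/skip controls drive playback through this session.
        session = new MediaSession(this, "PodcastService");
        session.setCallback(new MediaSession.Callback() {
            @Override
            public void onPlayFromMediaId(String mediaId, Bundle extras) {
                // Start streaming the selected episode (player code omitted).
            }
        });
        session.setActive(true);
        setSessionToken(session.getSessionToken());
    }

    @Override
    public BrowserRoot onGetRoot(String clientPackageName, int clientUid, Bundle rootHints) {
        // Expose a single browsable root; a real app would also verify the caller.
        return new BrowserRoot("root", null);
    }

    @Override
    public void onLoadChildren(String parentId, Result<List<MediaBrowser.MediaItem>> result) {
        // Items returned here populate the car's browse UI under this node.
        List<MediaBrowser.MediaItem> items = new ArrayList<>();
        items.add(new MediaBrowser.MediaItem(
                new MediaDescription.Builder()
                        .setMediaId("episode-1")          // placeholder catalog entry
                        .setTitle("Episode 1")
                        .build(),
                MediaBrowser.MediaItem.FLAG_PLAYABLE));
        result.sendResult(items);
    }
}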

Next, let's talk about messaging apps. Andy showed us earlier how he can send text messages using Android Auto completely with his voice. Well, we're opening this up to your messaging apps. Using these APIs, your apps can notify users of incoming messages and allow them to respond using voice. And this is the same API we're using for notifications and remote reply on Android Wear. With just a few lines of code, you can reach users on their wrist or in their car. It's really, really powerful.
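A hedged sketch of that shared pattern, using the support-library notification classes: a RemoteInput captures the spoken reply, a WearableExtender action surfaces it on the watch, and a CarExtender unread conversation lets Android Auto read the message aloud and take the reply by voice. The broadcast actions, icons, and text are placeholders for an app's own receivers and resources.

import android.app.Notification;
import android.app.PendingIntent;
import android.content.Context;
import android.content.Intent;
import android.support.v4.app.NotificationCompat;
import android.support.v4.app.NotificationManagerCompat;
import android.support.v4.app.RemoteInput;

class MessagingNotifications {
    // Key the app reads the transcribed reply from via RemoteInput.getResultsFromIntent().
    static final String EXTRA_VOICE_REPLY = "extra_voice_reply";
    // Hypothetical broadcast actions handled by the app's own receivers.
    static final String ACTION_REPLY = "com.example.chat.ACTION_REPLY";
    static final String ACTION_MARK_READ = "com.example.chat.ACTION_MARK_READ";

    static void notifyIncomingMessage(Context context, String sender, String text) {
        RemoteInput remoteInput = new RemoteInput.Builder(EXTRA_VOICE_REPLY)
                .setLabel("Reply")
                .build();

        PendingIntent replyIntent = PendingIntent.getBroadcast(
                context, 0, new Intent(ACTION_REPLY), 0);
        PendingIntent readIntent = PendingIntent.getBroadcast(
                context, 1, new Intent(ACTION_MARK_READ), 0);

        // Wrist: a voice-reply action that Android Wear turns into "speak your reply".
        NotificationCompat.Action wearReply = new NotificationCompat.Action.Builder(
                R.drawable.ic_reply, "Reply", replyIntent)
                .addRemoteInput(remoteInput)
                .build();

        // Car: the same conversation exposed through the CarExtender so Android Auto
        // can read it aloud and take the reply by voice.
        NotificationCompat.CarExtender.UnreadConversation conversation =
                new NotificationCompat.CarExtender.UnreadConversation.Builder(sender)
                        .addMessage(text)
                        .setLatestTimestamp(System.currentTimeMillis())
                        .setReadPendingIntent(readIntent)
                        .setReplyAction(replyIntent, remoteInput)
                        .build();

        Notification notification = new NotificationCompat.Builder(context)
                .setSmallIcon(R.drawable.ic_message)   // placeholder drawable
                .setContentTitle(sender)
                .setContentText(text)
                .extend(new NotificationCompat.WearableExtender().addAction(wearReply))
                .extend(new NotificationCompat.CarExtender().setUnreadConversation(conversation))
                .build();

        NotificationManagerCompat.from(context).notify(2, notification);
    }
}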

So we're really excited about Android Auto, and we think we've found that better way. But I know what you're all thinking: when does the rubber actually meet the road? Well, we're happy to say that you won't have to wait long. The Android Auto SDK will be published soon, and the Android Auto experience will be available to users with the public "L" release later this year. And the excitement in the auto industry has really been growing.

Today, we're happy to announce that over 40 new partners have joined the Open Automotive Alliance. And over 25 car brands have signed up to ship Android Auto in the near future. What's more, the first cars with Android Auto will be rolling off dealer lots before the end of this year. So that's just a peek at Android Auto, an Android experience that's been redesigned for the car, with all of the apps drivers know and love, through an interface that's built for driving.

Now I'd like to welcome Dave Burke back on stage to tell us about Android in the living room.

[Dave Burke]

Thanks, Patrick. It's pretty cool to see what you guys are doing in autos. But some of us don't actually have a car in our living room, wheeled or not. So I'm going to talk about a different form factor, and that's TV.

So TVs are fast becoming smarter and more connected. And really they're becoming computing devices in their own right. So we see a great opportunity to bring some of the strong capabilities of Android, such as voice input, user experience, and content, to the largest screen in your house.

Now, in some ways, the TV space is not too dissimilar to the mobile space in 2006. Each TV manufacturer has a different OS with different APIs and a different programming model, often with limited developer tools. And the cost and friction of developing a service that runs across multiple TVs is too high. As a result, smart TVs are typically limited and not competitive with their mobile cousins. So we wanted to go and change that.

Today, we're announcing Android TV. So this isn't a new platform. That's kind of the point. We're simply giving TV the same level of attention as phones and tablets have traditionally enjoyed. We want you to be able to leverage your existing skills and investment in Android and extend them to TV.

There's now one Android SDK for all form factors. Now, remotes are a core part of the TV experience. And Android TV requires just a d-pad with voice input. And that can be made available as a hardware remote control, as a game controller, or even a virtual controller on a phone or tablet.
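Concretely, "just a d-pad with voice input" means a TV app sees the remote through the same SDK used on phones: the arrows and center button arrive as ordinary key events. In practice most apps let the framework's focus system handle navigation, but the sketch below shows the raw events; moveSelection and openSelectedItem are placeholder app logic, not platform APIs.

import android.app.Activity;
import android.view.KeyEvent;

public class BrowseActivity extends Activity {
    @Override
    public boolean onKeyDown(int keyCode, KeyEvent event) {
        switch (keyCode) {
            case KeyEvent.KEYCODE_DPAD_LEFT:
            case KeyEvent.KEYCODE_DPAD_RIGHT:
            case KeyEvent.KEYCODE_DPAD_UP:
            case KeyEvent.KEYCODE_DPAD_DOWN:
                moveSelection(keyCode);    // move focus within the on-screen grid
                return true;
            case KeyEvent.KEYCODE_DPAD_CENTER:
                openSelectedItem();        // "click" the focused card
                return true;
            default:
                return super.onKeyDown(keyCode, event);
        }
    }

    private void moveSelection(int keyCode) { /* app-specific focus handling */ }
    private void openSelectedItem() { /* app-specific */ }
}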
