Posted by Lukas Bergstrom, Product Manager, Android Developer Frameworks Team
Android runs on billions of devices, from high-end phones to airplane seatbacks. The Android OS manages resources aggressively to perform well on this huge range of devices, and sometimes that can make building robust apps complicated. To make it easier, we launched a preview of Architecture Components at Google I/O to provide guidance on app architecture, with libraries for common tasks like lifecycle management and data persistence. Together, these foundational components make it possible to write modular apps with less boilerplate code, so developers can focus on innovating instead of reinventing the wheel – and we hope to keep building on this foundation in the future.
Today we’re happy to announce that the Room and Lifecycle Architecture Components libraries have reached 1.0 stable. These APIs are ready for production apps and libraries, and are our recommendation for developers looking for help with app architecture and local storage (although they’re recommended, not required). Lifecycles are now also integrated with the Support Library, so you can use them with standard classes like AppCompatActivity.
Although we’re declaring them stable today, the beta components are already used in apps that, together, have billions of installs. Top developers, like Zappos, have been able to spend more time on what’s important thanks to Architecture Components:
Prior to the release of Android Architecture Components we had our own ViewModel implementation. We used Loaders and Dependency Injection to persist our ViewModel through config changes. We recently switched to the Architecture Components ViewModel implementation and all that boilerplate went away. We found that we were able to spend more time on design, business logic and testing, and less on writing boilerplate or worrying about Android lifecycle issues.
We’ve also started to use LiveData which hooks directly into the Activity lifecycle. We use it to retrieve and display network data and no longer have to concern ourselves with network call subscription management.
– David Henry, Android Software Engineer, Zappos
Architecture Components provide a simple, flexible and practical approach that frees developers from some common problems so they can focus on building great experiences. This is based on core building blocks tied together by guidance on app architecture.
Every Android developer has to deal with the operating system starting, stopping and destroying their Activities. That means managing the state of components – such as observables used to update UI – as you move through the lifecycle. Lifecycles enables the creation of lifecycle-aware components that manage their own lifecycles, reducing the possibility of leaks or crashes. The Lifecycle library is the foundation for other Architecture Components like LiveData.
LiveData is a lifecycle-aware observable that holds data and provides updates. Your UI code subscribes to changes and provides LiveData a reference to its Lifecycle. Because LiveData is lifecycle-aware, it provides updates when its Lifecycle is started or resumed, but stops providing updates when the LifecycleOwner is destroyed. LiveData is a simple way to build reactive UIs that are safer and more performant.
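To make the lifecycle-aware observer idea concrete, here is a minimal plain-Java sketch of the pattern LiveData implements. This is not the actual android.arch.lifecycle API; the class and method names (`TinyLiveData`, `Subscription`, `moveToState`) are illustrative only.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Minimal sketch of the lifecycle-aware observer pattern behind LiveData.
// NOT the real android.arch.lifecycle API; all names are illustrative.
class TinyLiveData<T> {
    enum State { CREATED, STARTED, DESTROYED }

    static class Subscription<T> {
        State ownerState = State.CREATED;
        final Consumer<T> observer;
        Subscription(Consumer<T> observer) { this.observer = observer; }
    }

    private T value;
    private final List<Subscription<T>> subs = new ArrayList<>();

    // Subscribe an observer; updates are delivered only while its
    // owner is in the STARTED state.
    Subscription<T> observe(Consumer<T> observer) {
        Subscription<T> s = new Subscription<>(observer);
        subs.add(s);
        return s;
    }

    // Simulate the observer's owner (e.g. an Activity) changing state.
    void moveToState(Subscription<T> s, State state) {
        s.ownerState = state;
        if (state == State.DESTROYED) {
            subs.remove(s);              // auto-unsubscribe: no leak
        } else if (state == State.STARTED && value != null) {
            s.observer.accept(value);    // catch up with the latest value
        }
    }

    void setValue(T v) {
        value = v;
        for (Subscription<T> s : subs) {
            if (s.ownerState == State.STARTED) s.observer.accept(v);
        }
    }
}
```

The key property the sketch illustrates: a stopped observer silently misses updates but receives the latest value when it starts again, and a destroyed observer is removed automatically, so UI code never has to unsubscribe by hand.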
ViewModel separates ownership of view data and logic from lifecycle-bound entities like Activities and Fragments. A ViewModel is retained until its associated Activity or Fragment is disposed of forever – that means view data survives events like a Fragment being recreated due to rotation. ViewModels not only eliminate common lifecycle issues, they help build UIs that are more modular and easier to test.
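The retention behavior can be sketched in a few lines: a store that outlives any single Activity or Fragment instance hands back the same object after a configuration change. Again, this is an illustrative sketch, not the real ViewModel API; `TinyViewModelStore` and its methods are hypothetical names.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Supplier;

// Sketch of the retention idea behind ViewModel: a store that outlives a
// single Activity/Fragment instance, so the same object is handed back
// after a configuration change such as rotation. Names are illustrative,
// not the real API.
class TinyViewModelStore {
    private final Map<String, Object> store = new HashMap<>();

    // Return the existing instance for this key, creating it only once.
    @SuppressWarnings("unchecked")
    <T> T get(String key, Supplier<T> factory) {
        return (T) store.computeIfAbsent(key, k -> factory.get());
    }

    // Called only when the owner is disposed of forever (not on rotation).
    void clear() { store.clear(); }
}
```

Because the store is only cleared when the owner goes away for good, view data placed in it survives rotation for free, which is exactly why ViewModels remove a whole class of lifecycle bugs.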
Nearly all apps need to store data locally. While Android has bundled SQLite with the platform since version 1, using it directly can be painful. Room is a simple object-mapping layer that provides the full power of SQLite with less boilerplate. Features like compile-time query verification and built-in migrations make it easier to build a robust persistence layer, while integration with LiveData lets Room provide database-backed, lifecycle-aware observables. Room blends simplicity, power and robustness for managing local storage, and we hope you give it a try.
Last but not least, we created a Guide to App Architecture that provides core principles applicable to all developers, and specific guidance on using Architecture Components together. Because we’ve heard from you that clear and consistent guidance is important, today we’re updating developer documentation to point to Architecture Components where appropriate. We also have a rich set of videos, codelabs and sample apps available at the Architecture Components site, with more to come.
Although the first set of Architecture Components is now stable, we know there’s more work to do. Over the last few months, we’ve listened to your feedback and made improvements. We also recently launched a new Architecture Component, PagedList, to alpha, in response to your feedback that handling large datasets with RecyclerView is too difficult. This is just the beginning – we have more major components under development that we’re looking to announce in the upcoming months.
Our hope with Architecture Components is to free developers to focus on providing unique new experiences for mobile devices. We’re glad we can finally announce them as stable for production use. We’d like to thank the community, which has given such great feedback along the way, and we look forward to continuing the discussion in the comments of this post. Finally, for those of you who’ve been waiting for this stable launch, get started today.
Our partners are incredibly important in helping our customers maximize the value of their cloud investments. Today, we’re announcing a first-of-its-kind strategic partnership with Salesforce that spans Google Cloud and Google Analytics to enable smarter, more collaborative experiences for our customers. As cloud-native companies, our partnership offers a unique opportunity to turn marketing, service and sales data into actionable insights and better business outcomes.
The new collaboration leverages the full value of Google Cloud. Salesforce has named G Suite as its preferred email and productivity provider. In addition, Salesforce plans to use Google Cloud Platform (GCP) for its core services as part of the company’s international infrastructure expansion.
Our teams are working very closely to develop new integrations that will connect Salesforce CRM with G Suite to offer the only cloud-native collaboration platform of its kind. These integrations will enable companies to surface powerful intelligence about their customers from Salesforce directly within Gmail, Sheets, Calendar, Drive, Docs and Hangouts Meet. Here’s some more on what you’ll be able to do:
Salesforce Lightning for Gmail: Surface relevant Salesforce CRM data in Gmail, as well as customer interactions from Gmail directly within Salesforce, to service your customers faster. Identify high priority emails and suggest next steps based on the email content to work with customers faster than before.
Salesforce Lightning for Google Sheets: Embed Sheets anywhere in Salesforce, and with a single click push content from Salesforce Records or Reports to a new Sheet. Data will automatically update bi-directionally to ensure everyone has the most recent information.
Quip Live Apps for Google Drive and Google Calendar: Quip Live Apps will integrate with Google Drive and Calendar, allowing you to access information in a more collaborative, open cloud environment, and embed any list of Drive files, including Google Docs, Slides and Sheets, or your Google Calendar inside Quip. This new combination of Quip and G Suite will create a modern alternative to legacy intranet content services.
Salesforce for Hangouts Meet: Access relevant customer and account details, service case histories and more from Salesforce CRM directly within the Hangouts Meet interface. This gives you powerful insights directly in the communications platform to conduct better sales conversations or efficiently resolve customer service issues.
To help take advantage of the combined Google and Salesforce experience, qualified Salesforce customers can receive G Suite for up to one year at no additional cost—restrictions apply, and more details can be found on our site.
We hope this partnership enables more companies to take advantage of the cloud and that the combined solutions will provide an unmatched experience for customers. In fact, our team at Google Cloud uses Salesforce as our preferred CRM provider to engage with our customers in meaningful ways.
As for availability, several integrations between G Suite and Salesforce are already in market, including Lightning for Gmail and integrations with Calendar and Google Drive. The deeper integrations we’ve announced are expected to start rolling out in 2018.
With portrait mode on the Pixel 2 and Pixel 2 XL, you can take pictures of people, pets and even objects like flowers that keep what’s important sharp and in focus, but softly blur out the background. Portrait mode is powered by computational photography and machine learning, which identifies what to keep in focus and what to blur out. We’ve put together some tips to help you make the most of the new feature. Check it out—you’ll be a master portraitist in no time!
Get closer. This is the most important tip for getting great portraits. The less distance between you and your subject, the more likely your photos will have beautiful blur. Compare the photos on the right, below, with those on the left, which were taken from farther away.
Increase distance between your subject and the background. The farther your subject is from the background, as on the image on the right below, the more the background will be blurred. In the left image, the background isn’t far enough away.
Tap that. For the best results, tap to focus the Pixel 2 on your subject, whether a person’s face or an object. Tapping also tells the Pixel 2 what’s most important to you in the photo and adjusts the exposure to prioritize your subject. This is especially useful when your subject has strong light (the sun or windows) behind them. Remember—you can always adjust the exposure by tapping on the screen and dragging your finger up or down.
Put the subject in the front. Seems obvious, right? But you’ll get more blur and beautiful bokeh if your main subject stands out, is prominent, and is clearly located in the foreground, like the image on the right below, not the middleground, as on the left.
Change your perspective. A unique angle adds visual interest to your photos, as in the examples below. Get low to match the eye level of a child or an animal, or shoot from above to emphasize shapes and graphic components in an object.
Remember the rule of thirds. Pixel 2 makes it easy to follow this classic photography recommendation. Tap the grid icon in the camera app to activate the 3×3 grid on your Pixel 2’s screen. Placing the subject along one of the lines or where the lines intersect can improve your composition.
Keep it simple. A photo’s success can be compromised if too many details compete for attention with your subject. You can avoid this fate by filling the frame and shooting in front of clean backgrounds. And don’t forget to check the outside edges of your photo before you press the shutter button to prevent clutter from protruding into the sides of your picture.
Get in line! This is one of our favorite tips. Look for lines, like bricks or tiles, staircases, or a building’s grid. Lines can enhance the sense of depth in photos and emphasize your subject.
Lighting is everything. You can’t always control lighting conditions, but you can find better light. Outdoors, change your location or the camera’s orientation, reposition your subjects, or shoot in the shade (especially at midday). Clouds are your friend, creating softer, more diffuse light. At night, try lighting your subject from the side with a friend’s phone in flashlight mode.
Ditch the crowds. Portrait mode works best when all your subjects are the same distance from the camera. This is easier with small groups of people.
We’d love to see how you put these tips to work. Share your Pixel 2 portraits on social media with #teampixel—we may feature them in one of our upcoming posts!
As humans, we rely on sound to guide us through our environment, help us communicate with others and connect us with what’s happening around us. Whether walking along a busy city street or attending a packed music concert, we’re able to hear hundreds of sounds coming from different directions. So when it comes to AR, VR, games and 360 video, you need rich sound to create an engaging immersive experience that makes you feel like you’re really there. Today, we’re releasing a new spatial audio software development kit (SDK) called Resonance Audio. It’s based on technology from Google’s VR Audio SDK, and it works at scale across mobile and desktop platforms.
Bringing rich, dynamic audio environments into your VR, AR, gaming, or video experiences without affecting performance can be challenging. There are often few CPU resources allocated for audio, especially on mobile, which can limit the number of simultaneous high-fidelity 3D sound sources for complex environments. The SDK uses highly optimized digital signal processing algorithms based on higher order Ambisonics to spatialize hundreds of simultaneous 3D sound sources, without compromising audio quality, even on mobile. We’re also introducing a new feature in Unity for precomputing highly realistic reverb effects that accurately match the acoustic properties of the environment, reducing CPU usage significantly during playback.
We know how important it is that audio solutions integrate seamlessly with your preferred audio middleware and sound design tools. With Resonance Audio, we’ve released cross-platform SDKs for the most popular game engines, audio engines, and digital audio workstations (DAWs) to streamline workflows, so you can focus on creating more immersive audio. The SDKs run on Android, iOS, Windows, macOS and Linux and provide integrations for Unity, Unreal Engine, FMOD, Wwise and DAWs. We also provide native APIs for C/C++, Java, Objective-C and the web. This multi-platform support enables developers to implement sound designs once and deploy them with consistent-sounding results across the top mobile and desktop platforms. Sound designers can save time by using our new DAW plugin for accurately monitoring spatial audio that’s destined for YouTube videos or apps developed with Resonance Audio SDKs. Web developers get the open source Resonance Audio Web SDK, which works in the top web browsers by using the Web Audio API.
By providing powerful tools for accurately modeling complex sound environments, Resonance Audio goes beyond basic 3D spatialization. The SDK lets developers control the direction in which acoustic waves propagate from sound sources. For example, a guitar can sound quieter when you stand behind the player than when you stand in front of them, and louder when you face the guitar than when your back is turned.
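To make the directivity idea concrete, here is a small numeric sketch: gain as a function of the angle between the source’s facing direction and the listener. The cardioid-style formula and the “sharpness” parameter are illustrative only, not Resonance Audio’s actual model.

```java
// Illustrative sketch of source directivity: gain depends on the angle
// between the source's facing direction and the listener. The cardioid
// formula and "sharpness" are assumptions, not Resonance Audio's model.
class Directivity {
    // theta: angle in radians between the source's front and the listener
    // (0 = listener directly in front, PI = directly behind).
    static double cardioidGain(double theta, double sharpness) {
        double base = (1.0 + Math.cos(theta)) / 2.0; // 1 in front, 0 behind
        return Math.pow(base, sharpness);            // sharpness narrows the beam
    }
}
```

With sharpness 1, a listener directly in front hears full gain, one directly behind hears nothing, and one at 90° hears half gain; raising the sharpness concentrates the sound into a narrower forward beam.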
Another SDK feature is automatic rendering of near-field effects when sound sources get close to a listener’s head, providing an accurate perception of distance even when sources are near the ear. The SDK also supports sound source spread: by specifying the width of a source, you can simulate anything from a tiny point in space up to a wall of sound. We’ve also released an Ambisonic recording tool to spatially capture your sound design directly within Unity, save it to a file, and use it anywhere Ambisonic soundfield playback is supported, from game engines to YouTube videos.
If you’re interested in creating rich, immersive soundscapes using cutting-edge spatial audio technology, check out the Resonance Audio documentation on our developer site. You can also experience spatial audio in our Audio Factory VR app for Daydream and SteamVR. Let us know what you think through GitHub, and show us what you build with #ResonanceAudio on social media; we’ll be resharing our favorites.