Posted by Andrew Woloszyn, Software Engineer
Developing for 3D is complicated. Whether you’re using a native graphics API or
enlisting the help of your favorite game engine, there are thousands of graphics
commands that have to come together perfectly to produce beautiful 3D images on
your phone, desktop, or VR headset.
GAPID (Graphics API
Debugger) is a new tool that helps developers diagnose rendering and
performance issues with their applications. With GAPID, you can capture a trace
of your application and step through each graphics command one-by-one. This lets
you visualize how your final image is built and isolate problematic calls, so
you spend less time debugging through trial-and-error.
GAPID supports OpenGL ES on Android, and Vulkan on Android, Windows and Linux.
GAPID not only enables you to diagnose issues with your rendering commands, but
also acts as a tool to run quick experiments and immediately see how your
changes would affect the presented frame.
Here are a few examples of how GAPID can help you isolate and fix issues with
your application.
What’s the GPU doing?
Working with a graphics API can be frustrating when you get an unexpected
result, whether it’s a blank screen, an upside-down triangle, or a missing mesh.
As an offline debugger, GAPID lets you take a trace of these applications, and
then inspect the calls afterwards. You can track down exactly which command
produced the incorrect result by looking at the framebuffer, and inspect the
state at that point to help you diagnose the issue.
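GAPID makes this search interactive, letting you scrub through the trace and watch the framebuffer update. As a rough illustration of the underlying idea, here is a minimal bisection sketch in Python; `framebuffer_ok` and the command names are hypothetical stand-ins for illustration, not GAPID's actual API:

```python
def first_bad_command(commands, framebuffer_ok):
    """Binary-search for the first command whose replay breaks the frame.

    framebuffer_ok(n) replays commands[:n] and reports whether the
    resulting framebuffer still looks correct -- the manual equivalent
    of stepping through a trace and eyeballing the image.
    """
    lo, hi = 0, len(commands)  # invariant: correct at lo, broken at hi
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if framebuffer_ok(mid):
            lo = mid
        else:
            hi = mid
    return hi - 1  # index of the offending command

# Hypothetical trace: replaying the first n commands looks correct
# only while n <= 3, i.e. commands[3] is the offending call.
commands = ["glClearColor", "glClear", "glUseProgram",
            "glDrawArrays", "glBindTexture", "eglSwapBuffers"]
bad = first_bad_command(commands, lambda n: n <= 3)  # -> 3 (glDrawArrays)
```

The point is the workflow, not the code: instead of rebuilding your app with guesses, you narrow in on the exact call whose output diverges from what you expected.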
What happens if I do X?
Even when a program is working as expected, sometimes you want to experiment.
GAPID allows you to modify API calls and shaders at will, so you can test things
like:
- What if I used a different texture on this object?
- What if I changed the calculation of bloom in this shader?
With GAPID, you can now iterate on the look and feel of your app without
recompiling it or rebuilding your assets.
Whether you’re building a stunning new desktop game with Vulkan or a beautifully
immersive VR experience on Android, we hope that GAPID will save you both time
and frustration and help you get the most out of your GPU. To get started with
GAPID and see just how powerful it is, download it, take your favorite
application, and capture a trace.
Posted by Dave Burke, VP, Android Engineering
With more than two billion active devices, Android is the largest mobile
platform in the world. And for the past nine years, we’ve worked to create a
rich set of tools, frameworks and APIs that deliver developers’ creations to
people everywhere. Today, we’re releasing a preview of a new
software development kit (SDK) called ARCore. It brings augmented reality
capabilities to existing and future Android phones. Developers can start
experimenting with it right now.
We’ve been developing the fundamental technologies that power mobile AR over the
last three years with Tango, and ARCore is built on that work. But, it works
without any additional hardware, which means it can scale across the Android
ecosystem. ARCore will run on millions of devices, starting today with the Pixel
and Samsung’s S8, running 7.0 Nougat and above. We’re targeting 100 million
devices by the end of the preview. We’re working with manufacturers like
Samsung, Huawei, LG, ASUS and others to make this possible with a consistent bar
for quality and high performance.
ARCore works with Java/OpenGL, Unity and Unreal and focuses on three things:
- Motion tracking: Using the phone’s camera to observe feature points in the
room, combined with IMU sensor data, ARCore determines both the position and
orientation (pose) of the phone as it moves, so virtual objects remain
accurately placed.
- Environmental understanding:
It’s common for AR objects to be placed on a floor or a table. ARCore can detect
horizontal surfaces using the same feature points it uses for motion tracking.
- Light estimation: ARCore
observes the ambient light in the environment and makes it possible for
developers to light virtual objects in ways that match their surroundings,
making their appearance even more realistic.
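The "pose" in motion tracking is a 6DoF transform: a rotation plus a translation that maps points from a local frame into world space. As a generic illustration of that math (not ARCore's actual API, where the SDK's pose classes do this for you), here is a small Python sketch using a unit quaternion for the rotation:

```python
import math

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z),
    using v' = v + 2w(u x v) + 2u x (u x v) with u = (x, y, z)."""
    w, x, y, z = q
    vx, vy, vz = v
    # t = 2 * (u x v)
    tx = 2 * (y * vz - z * vy)
    ty = 2 * (z * vx - x * vz)
    tz = 2 * (x * vy - y * vx)
    # v' = v + w*t + u x t
    return (
        vx + w * tx + (y * tz - z * ty),
        vy + w * ty + (z * tx - x * tz),
        vz + w * tz + (x * ty - y * tx),
    )

def pose_transform(translation, rotation_q, point):
    """Apply a 6DoF pose (rotate, then translate) to a local-space point."""
    rx, ry, rz = quat_rotate(rotation_q, point)
    tx, ty, tz = translation
    return (rx + tx, ry + ty, rz + tz)

# Example: the device has turned 90 degrees about the y (up) axis and
# moved one unit along x. A point one unit in front of the device
# (local -z) lands at the world-space origin.
half = math.radians(45)
q = (math.cos(half), 0.0, math.sin(half), 0.0)
p = pose_transform((1.0, 0.0, 0.0), q, (0.0, 0.0, -1.0))  # -> (0, 0, 0)
```

Because ARCore updates this pose every frame, anchors placed in world space stay put on screen even as the phone moves around them.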
Alongside ARCore, we’ve been investing in apps and services which will further
support developers in creating great AR experiences. We built Blocks and Tilt Brush to make it easy for anyone to
quickly create great 3D content for use in AR apps. As we mentioned
at I/O, we’re also working on Visual Positioning Service (VPS), a service
which will enable world scale AR experiences well beyond a tabletop. And we
think the Web will be a critical component of the future of AR, so we’re also
releasing prototype browsers for web developers so they can start experimenting
with AR, too. These custom browsers allow developers to create AR-enhanced
websites and run them on both Android/ARCore and iOS/ARKit.
ARCore is our next step in bringing AR to everyone, and we’ll have more to share
later this year. Let us know what you think through GitHub, and check out our new AR Experiments showcase
where you can find some fun examples of what’s possible. Show us what you build
on social media with #ARCore; we’ll be resharing some of our favorites.