Category: Virtual Reality

XR on the Bay Recap: aka the Day Roger Didn’t Get to Meet Silicon Valley’s Gilfoyle (Part 2)

Roger Sherwood August 2, 2018 advanced imaging society, AI, ar, artificial intelligence, augmented reality, Cisco Media, Deep Learning, hollywood, Martin Starr, media, media and entertainment, post production, Silicon Valley, SP360: Service Provider, Virtual Reality, VR, xr on the bay, xronthebay

XR On the Bay is a conference exploring "the tech of Hollywood," including AR, VR, AI, post-production technologies, blockchain, cloud services, and more. More than any conference before it, this is the event where Silicon Valley meets Hollywood.

Samsung Electronics Starts Producing Industry’s First 16-Gigabit GDDR6 for Advanced Graphics Systems

Samsung Newsroom January 18, 2018 16-gigabit (Gb) Graphics Double Data Rate 6, 8K Ultra HD video, graphics card, Press Release, Semiconductors, Virtual Reality

Samsung Electronics, the world leader in advanced memory technology, today announced that it has started mass production of the industry’s first 16-gigabit

Diagnose and understand your app’s GPU behavior with GAPID

Android Developers December 13, 2017 3d, gapid, Virtual Reality, VR

Posted by Andrew Woloszyn, Software Engineer

Developing for 3D is complicated. Whether you’re using a native graphics API or
enlisting the help of your favorite game engine, there are thousands of graphics
commands that have to come together perfectly to produce beautiful 3D images on
your phone, desktop, or VR headset.

GAPID (Graphics API Debugger) is a new tool that helps developers diagnose rendering and
performance issues with their applications. With GAPID, you can capture a trace
of your application and step through each graphics command one-by-one. This lets
you visualize how your final image is built and isolate problematic calls, so
you spend less time debugging through trial-and-error.

GAPID supports OpenGL ES on Android, and Vulkan on Android, Windows and Linux.

Debugging in action, one draw call at a time

GAPID not only enables you to diagnose issues with your rendering commands, but
also acts as a tool for running quick experiments and seeing immediately how
your changes would affect the presented frame.

Here are a few examples where GAPID can help you isolate and fix issues with
your application:

What’s the GPU doing?

Why isn’t my text appearing?!

Working with a graphics API can be frustrating when you get an unexpected
result, whether it’s a blank screen, an upside-down triangle, or a missing mesh.
As an offline debugger, GAPID lets you take a trace of your application, and
then inspect the calls afterwards. You can track down exactly which command
produced the incorrect result by looking at the framebuffer, and inspect the
state at that point to help you diagnose the issue.

What happens if I do X?

Using GAPID to edit shader code

Even when a program is working as expected, sometimes you want to experiment.
GAPID allows you to modify API calls and shaders at will, so you can test things
like:
  • What if I used a different texture on this object?
  • What if I changed the calculation of bloom in this shader?

With GAPID, you can now iterate on the look and feel of your app without having
to recompile your application or rebuild your assets.
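As a concrete sketch of the second experiment, here is the kind of bright-pass calculation a bloom effect typically starts with, written in Python rather than GLSL purely for illustration (the threshold and intensity values are arbitrary assumptions, not anything GAPID defines):

```python
def bloom_bright_pass(color, threshold=0.8, intensity=1.5):
    """Keep only the portion of each RGB channel above `threshold`, then
    scale it; a bloom pass blurs this result and adds it back onto the
    frame. Tweaking `threshold` or `intensity` is the sort of one-line
    shader change GAPID lets you try without rebuilding the app."""
    return [max(c - threshold, 0.0) * intensity for c in color]

# A bright highlight contributes to bloom; a mid-gray pixel does not.
highlight = bloom_bright_pass([1.0, 0.9, 0.85])
midtone = bloom_bright_pass([0.5, 0.5, 0.5])
```

Editing the equivalent line in the real shader inside GAPID and replaying the trace shows the new frame immediately.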

Whether you’re building a stunning new desktop game with Vulkan or a beautifully
immersive VR experience on Android, we hope that GAPID will save you both time
and frustration and help you get the most out of your GPU. To get started with
GAPID and see just how powerful it is, download it, take your
favorite application, and capture a trace.

Creating Smarter, Safer, More Inclusive Workplaces: Are You In?

Kate O'Keeffe November 2, 2017 analytics, artificial intelligence, Cisco Hyper Innovation Living Lab, Cisco Services, Featured, future of work, Innovation, robotics, Virtual Reality, ZZFeatured

Analytics can democratize knowledge and drive new insights—so that every worker becomes a knowledge worker. It can help keep workers safe, and make workplaces more inclusive.

Relive the Moment – Coldplay’s ‘A Head Full of Dreams Tour’ Live in VR with Samsung and Live Nation

Samsung Newsroom September 22, 2017 A Head Full Of Dreams Tour, Coldplay, Gear VR, LivePass, Mobile, Virtual Reality, VR

On August 17, fans from over 50 countries put on their Samsung Gear VR headsets to witness the spectacular sights and sounds of Coldplay’s ‘A Head Full

ARCore: Augmented reality at Android scale

Android Developers August 29, 2017 Android, ar, arcore, augmented reality, Virtual Reality, VR

Posted by Dave Burke, VP, Android Engineering

With more than two billion active devices, Android is the largest mobile
platform in the world. And for the past nine years, we’ve worked to create a
rich set of tools, frameworks and APIs that deliver developers’ creations to
people everywhere. Today, we’re releasing a preview of a new
software development kit (SDK) called ARCore. It brings augmented reality
capabilities to existing and future Android phones. Developers can start
experimenting with it right now.

We’ve been developing the fundamental technologies that power mobile AR over the
last three years with Tango, and ARCore is built on that work. But, it works
without any additional hardware, which means it can scale across the Android
ecosystem. ARCore will run on millions of devices, starting today with the Pixel
and Samsung’s S8, running 7.0 Nougat and above. We’re targeting 100 million
devices at the end of the preview. We’re working with manufacturers like
Samsung, Huawei, LG, ASUS and others to make this possible with a consistent bar
for quality and high performance.

ARCore works with Java/OpenGL, Unity and Unreal and focuses on three things:

  • Motion tracking: Using the
    phone’s camera to observe feature points in the room and IMU sensor data, ARCore
    determines both the position and orientation (pose) of the phone as it moves.
    Virtual objects remain accurately placed.
  • Environmental understanding:
    It’s common for AR objects to be placed on a floor or a table. ARCore can detect
    horizontal surfaces using the same feature points it uses for motion tracking.
  • Light estimation: ARCore
    observes the ambient light in the environment and makes it possible for
    developers to light virtual objects in ways that match their surroundings,
    making their appearance even more realistic.
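To make the motion-tracking idea concrete, here is a minimal sketch of the pose math involved, in plain Python rather than the ARCore Java API: a pose is a rotation plus a translation, and a world-fixed anchor stays put because each frame it is re-expressed in the phone's current camera space (the poses and distances below are illustrative, not ARCore output):

```python
import math

def yaw(theta):
    """3x3 rotation matrix about the vertical (y) axis."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def mat_t_vec(m, v):
    """Multiply the transpose of 3x3 matrix m by vector v."""
    return [sum(m[j][i] * v[j] for j in range(3)) for i in range(3)]

def world_to_camera(rotation, translation, point_world):
    """A pose maps camera space to world space (p_w = R p_c + t);
    inverting it gives the camera-space position of a world-fixed
    point: p_c = R^T (p_w - t)."""
    diff = [point_world[i] - translation[i] for i in range(3)]
    return mat_t_vec(rotation, diff)

# A virtual object anchored 2 m in front of the phone's starting pose
# (camera convention: -z is forward).
anchor = [0.0, 0.0, -2.0]

# Phone at the origin: the anchor is 2 m straight ahead.
ahead = world_to_camera(yaw(0.0), [0.0, 0.0, 0.0], anchor)

# Phone walks 1 m toward the anchor: it now appears only 1 m ahead.
closer = world_to_camera(yaw(0.0), [0.0, 0.0, -1.0], anchor)

# Phone turns 90 degrees in place: the anchor swings off to the side.
turned = world_to_camera(yaw(math.pi / 2), [0.0, 0.0, 0.0], anchor)
```

ARCore's job is to estimate that rotation and translation every frame from camera feature points and IMU data; the rendering side then only has to apply the inverse transform, as above.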

Alongside ARCore, we’ve been investing in apps and services which will further
support developers in creating great AR experiences. We built Blocks and Tilt Brush to make it easy for anyone to
quickly create great 3D content for use in AR apps. As we mentioned
at I/O, we’re also working on Visual Positioning Service (VPS), a service
which will enable world scale AR experiences well beyond a tabletop. And we
think the Web will be a critical component of the future of AR, so we’re also
releasing prototype browsers for web developers so they can start experimenting
with AR, too. These custom browsers allow developers to create AR-enhanced
websites and run them on both Android/ARCore and iOS/ARKit.

ARCore is our next step in bringing AR to everyone, and we’ll have more to share
later this year. Let us know what you think through GitHub, and check out our new AR Experiments showcase
where you can find some fun examples of what’s possible. Show us what you build
on social media with #ARCore; we’ll be resharing some of our favorites.
