Samsung continues global AI centre rollout with NY opening

The Big Apple is the latest city to be graced with a Samsung AI centre as part of the South Korean giant’s international push.

Samsung’s New York AI centre will be led by Sebastian Seung, Executive Vice President of Samsung Research.

Each of Samsung’s AI centres focuses on a different area of research; the New York branch will concentrate on AI in robotics.

Hyun-suk Kim, President and Head of Samsung Research, said:

“What we need now is to focus on creating new values that make people’s lives easier and more convenient by harnessing the power of AI in Samsung’s products and services.

To do this, our Global AI Centers, including the New York AI Center, must play a pivotal role.”

Considering the use of Samsung’s technologies for industries such as manufacturing, AI-powered robotics will be an important part of the company’s future business. Expect plenty of Samsung’s resources to be allocated here to see off competition.

The latest AI centre opening is part of Samsung’s grand plan to employ around 1,000 AI specialists by 2020.

Samsung has already opened AI centres in different locations around the world including Korea, the UK, France, Russia, Canada, and Silicon Valley.

Back in March, AI News reported on Samsung’s plans to open an AI centre in France.

“Great progress on artificial intelligence is happening in France,” President Macron said in a Twitter message. “Samsung chooses France to locate its new research centre on artificial intelligence, creating more than 100 jobs.”

However, some experts believe strict EU regulations will stunt the development of AI in Europe.

Our sister publication IoT News reported back in May about Samsung’s opening of a huge AI centre in the UK.

UK Prime Minister Theresa May commented:

“Samsung will create high-paying, high-skilled jobs and our modern industrial strategy will encourage further investment like this all around the country.

It is a vote of confidence in the UK as a world leader in artificial intelligence, and the new AI research centre will benefit from the world-renowned talent and academic prowess of Cambridge.”

The UK has become something of a hotbed for AI talent due to its leading universities and companies such as Google-acquired DeepMind. Significant funding has also been allocated for continued development, highlighting the importance of AI for the future economy.

Samsung’s UK centre is being led by Andrew Blake, an esteemed AI researcher and ex-director of Microsoft’s Cambridge Laboratory.

Blake comments: “Our research will help us to better understand human behaviour while exploring areas like emotion recognition, and further expand the boundaries of user-centric communication to develop AI technologies that ultimately improve people’s lives.”

All of Samsung’s AI centres around the globe will cooperate to advance the company’s goals to lead in the field.

“Samsung has a long history of pursuing innovation and we are excited to be bringing that same passion and technology leadership to AI,” said Hyun-suk Kim. “With the new AI centres and recruitment of leading experts in the field, our aim is to be a game-changer for the AI industry.”

What are your thoughts on Samsung’s new AI centres? Let us know in the comments.

This content was originally published here.

Android Messages Gets Dark Mode, Revamped Interface, and Smart Reply Feature

Android Messages has received a new update that brings a dark mode. The latest Android Messages update (version 3.5) also includes a revamped interface that has Google’s Material Theme elements as well as Google Sans font. The fresh interface matches the design of Samsung’s proprietary Messaging app that has an all-white background. Google has also brought its machine learning-based Smart Reply feature that initially arrived on Inbox by Gmail in 2015. The Android Messages app did receive the same Smart Reply feature in January this year, but that was limited to Project Fi users.

Among other new features, the updated Android Messages app includes the dark mode, which transforms the background from white to black and fonts from black to white. The feature is particularly useful if you often use Android Messages at night or under low-light conditions, as it limits the emission of unwanted light from the screen. However, as Android Police notes, the app shows inverted colours in conversations once dark mode is enabled: messages you send to a recipient always appear in a light blue bubble with a darker blue font.

Android Messages with dark mode

You can enable the dark mode on the updated Messages app by tapping the three-dot menu key in the top-right corner of the screen and then selecting the Enable dark mode option. By following the same process, you can switch back to the normal mode as well.

Apart from the dark mode, the updated Android Messages app brings a Material Theme-based interface with a completely white background and the Google Sans font. The revamped interface also replaces the original ‘+’ FAB with a larger button that shows ‘Start chat’ text by default; the text gets replaced with a new-message icon once you scroll down.

The updated Android Messages app also brings a Smart Reply feature that you can enable by going to the app settings. The feature uses your recent messages to show you relevant suggestions. However, Google assures users that it doesn’t store the messages to offer suggestions.

As we mentioned, the Smart Reply feature was initially part of Inbox by Gmail; Google brought it to Android Messages earlier this year, but only for Project Fi users.

The updated Android Messages app is rolling out to all compatible devices through Google Play. Meanwhile, you can download its APK file from APK Mirror to get the latest experience ahead of its formal rollout to your handset.


The iPhone XS camera is magical in all these ways — Quartz

The camera on the first iPhone in 2007 was nothing revolutionary, even for the time. It was 2 megapixels. You couldn’t zoom in on the photos, not even digitally. There was no flash. There was no editing of pics. And there was a single camera on the back—there was no front-facing camera for selfies, which had not yet been invented. Well, not really (paywall).

Fast forward 11 years and the iPhone XS has three cameras—two 12-megapixel ones on the back and one 7-megapixel one on the front. Where there was once no flash, the flash now comes with four LEDs. More impressive is what those cameras can do, and how far they’ve come relative to traditional photography and DSLRs.

The glass

The new flagship iPhone XS and XS Max come with dual-lens systems—one wide-angle and one telephoto. The iPhone XS wide-angle lens has an equivalent focal length of 26mm, while the telephoto has the equivalent of a 52mm lens. We say the “equivalent of” because the convention is to quote the focal length that would produce the same field of view on a frame of standard 35mm film or on the sensor of a full-frame DSLR.

But because smartphone sensors are much smaller, the actual focal length of the iPhone wide-angle lens is 4.25mm, and the telephoto’s is 6mm. Apple achieves photos that look similar to what full-sized cameras produce through a combination of improved lenses and tighter integration of hardware and software.
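The relationship between those actual and equivalent numbers is simple arithmetic: the quoted equivalent is the actual focal length multiplied by the sensor’s crop factor, the ratio of full-frame’s ~43.3mm diagonal to the sensor’s diagonal. A quick sketch using the figures above (the implied sensor diagonals are back-calculated from the article’s numbers, not published Apple specs):

```python
import math

# Diagonal of a standard 35mm film frame (36mm x 24mm), the reference
# against which "equivalent" focal lengths are quoted.
FULL_FRAME_DIAG_MM = math.hypot(36, 24)  # ~43.27mm

def crop_factor(actual_mm, equivalent_mm):
    """Multiplier relating a lens's actual focal length to its
    full-frame-equivalent focal length."""
    return equivalent_mm / actual_mm

def implied_sensor_diag_mm(actual_mm, equivalent_mm):
    """Sensor diagonal implied by an actual/equivalent focal-length pair."""
    return FULL_FRAME_DIAG_MM / crop_factor(actual_mm, equivalent_mm)

# iPhone XS wide-angle: 4.25mm actual, 26mm equivalent
print(round(crop_factor(4.25, 26), 1))             # 6.1x crop
print(round(implied_sensor_diag_mm(4.25, 26), 1))  # ~7.1mm diagonal

# Telephoto: 6mm actual, 52mm equivalent; the different crop factor
# reflects the telephoto camera's smaller sensor.
print(round(crop_factor(6, 52), 1))                # 8.7x crop
```

The same back-calculation is how reviewers infer sensor sizes that Apple never states outright.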

It was Apple blogger John Gruber, in his review of the new iPhone, who pointed out that the focal length on the iPhone XS has actually been updated from the previous model and was the equivalent of 26mm, not 28mm anymore. (The iPhone X’s wide-angle focal length was 4mm; the telephoto focal length has remained the same as with 2017’s iPhone X.) He noticed the pictures with his new XS seemed wider than with the X, but that the metadata attached to the photos showed a longer focal length.

Truly magazine-worthy photography.

He contacted Apple and they confirmed that the focal length was longer yet the camera was taking wider pictures, which suggested a much, much larger sensor was being used. “That seemed too good to be true,” Gruber wrote. “But I checked, and Apple confirmed that the iPhone XS wide-angle sensor is in fact 32% larger. That the pixels on the sensor are deeper, too, is what allows this sensor to gather 50% more light.”

The increased sensor size means that, even though the photos are still 12 megapixels (that’s six times the original), more light reaches each of the larger pixels, so each photo is much richer in information. And still, these sensors are nowhere near the size of what goes into a full-frame DSLR.
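The arithmetic behind those percentages is worth spelling out: with the pixel count unchanged, a 32% larger sensor area means 32% more area per pixel, so area alone accounts for roughly a 32% light gain, and the deeper pixels Gruber mentions would supply the rest of the quoted 50%. A rough sketch of the ratios (illustrative only, not Apple’s actual sensor dimensions):

```python
import math

AREA_GROWTH = 1.32   # sensor area up 32% (figure quoted by Gruber)
LIGHT_GAIN = 1.50    # total light-gathering gain quoted by Apple

# Same 12-megapixel count on a larger sensor: per-pixel area grows by
# the same factor as the sensor, linear pixel pitch by its square root.
pixel_area_growth = AREA_GROWTH
pixel_pitch_growth = math.sqrt(AREA_GROWTH)      # ~1.15, i.e. ~15% wider pixels

# Area alone explains a 32% gain; the remainder of the quoted 50%
# would come from the deeper pixel wells.
residual_gain = LIGHT_GAIN / pixel_area_growth   # ~1.14, i.e. ~14% extra
print(round(pixel_pitch_growth, 2), round(residual_gain, 2))
```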

That bump

Since the release of the iPhone 6 in 2014, the iPhone has had a “bump” on the back where the camera protrudes rather than sitting flush as on previous generations of the phone. The bump allows the lens to sit further from the sensor, giving you higher-quality pictures similar to what you get from a proper camera. Apple’s chief designer Jony Ive once described this (paywall), through what sounds like gritted teeth, as “a really very pragmatic optimization.”

Gruber notes that the latest iPhone XS’s bump is the same size as the X, despite the bigger sensor and a longer wide-angle lens, which involved re-architecting the innards of the phone. “Apple managed not only to put a 32% larger sensor in the iPhone XS wide-angle camera, but also moved the sensor deeper into the body of the phone, further from the lens,” he said. (Perhaps this was achieved by the shrinking of the battery.)

From the longer focal length to the larger sensor to maintaining the same-sized body, all those improvements were achieved in a year—and are being produced at scale in the hundreds of millions.

The code

The most impressive leaps in the iPhone come not through the glass on the front, but via the software inside.

Professional photography is often distinctive for its depth of field, usually with the subject in sharp focus and the background blurred. That blurry background is often referred to as bokeh, but that’s not quite correct; it’s the visual quality of the blur that is known as bokeh. Apple has focused a lot on manufacturing good bokeh that is pleasing to the eye, even if its executives sometimes have trouble pronouncing it.

Wide-angle lenses tend to put everything in focus. The iPhone XS’s dual-lens system—sold by Apple since the bigger iPhone 7 Plus was launched in 2016—combines pictures taken by both lenses and uses software to create the depth of field we’re used to from those nice older cameras. The iPhone does this by using its A12 chip and “neural engine” to perform 1 trillion “operations” per photo, like auto exposure, focus, noise reduction, face detection, and so on.
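Why a phone camera has to synthesize its blur can be seen in the hyperfocal-distance formula, H = f²/(N·c) + f: with a tiny actual focal length f, almost everything lands in focus. A sketch comparing the iPhone’s wide camera with a full-frame lens of the same field of view; the f/1.8 aperture and the circle-of-confusion values here are common reference assumptions, not figures from the article:

```python
def hyperfocal_mm(focal_mm, f_number, coc_mm):
    """Hyperfocal distance: focused here, everything from half this
    distance to infinity is acceptably sharp."""
    return focal_mm**2 / (f_number * coc_mm) + focal_mm

# Full-frame 26mm lens at f/1.8, standard 0.030mm circle of confusion:
full_frame = hyperfocal_mm(26, 1.8, 0.030)        # ~12.5 metres

# Phone-style 4.25mm lens at f/1.8; the circle of confusion scales
# down with the sensor (here by the ~6.1x crop factor):
phone = hyperfocal_mm(4.25, 1.8, 0.030 / 6.1)     # ~2.0 metres

# Focused at ~2m, the phone renders everything beyond ~1m sharp, which
# is why a convincing background blur has to be computed in software.
print(round(full_frame / 1000, 1), round(phone / 1000, 1))
```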

Since last year, the iPhone has automatically used software to adjust, after the fact, the lighting on photos taken in Portrait mode, mimicking the kind of stylized lighting you see on magazine covers… and in museums. “If you look at the Dutch Masters and compare them to the paintings that were being done in Asia, stylistically they’re different,” Johnnie Manzari, a designer on Apple’s Human Interface team, told Buzzfeed News last year. “We had some engineers trying to understand the contours of a face and how we could apply lighting to them through software, and we had other silicon engineers just working to make the process super-fast. We really did a lot of work.”

How good is the iPhone at creating beautiful photography via software? So good that many of those features—like Portrait mode and Depth Control—are coming to its cheaper XR model, despite the fact that the XR only has a single wide-angle camera—just like the original iPhone. Which means that Apple has been able to mimic some of the qualities of a dual-lens system and bring them to a single-lens camera for the first time, purely by using software. Which is quite something.


Google and Facebook Are Teaming Up on Artificial Intelligence Tech

Google and Facebook are teaming up to make each company’s artificial intelligence technologies work better together.

The two companies said Tuesday that an unspecified number of engineers are collaborating to make Facebook’s open-source PyTorch machine-learning framework work with Google’s custom machine-learning chips, dubbed Tensor Processing Units, or TPUs. The collaboration marks one of the rare instances of the technology rivals working together on joint tech projects.

“Today, we’re pleased to announce that engineers on Google’s TPU team are actively collaborating with core PyTorch developers to connect PyTorch to Cloud TPUs,” Google Cloud director of product management Rajen Sheth wrote in a blog post. “The long-term goal is to enable everyone to enjoy the simplicity and flexibility of PyTorch while benefiting from the performance, scalability, and cost-efficiency of Cloud TPUs.”

Facebook product manager for artificial intelligence Joseph Spisak said in a separate blog post that “Engineers on Google’s Cloud TPU team are in active collaboration with our PyTorch team to enable support for PyTorch 1.0 models on this custom hardware.”

Google first debuted its TPUs in 2016 at its annual developer conference, pitching them as a more efficient way for companies and researchers to power their machine-learning software projects. The search giant sells access to its TPUs via its cloud computing business rather than selling the chips to customers directly, unlike Nvidia, whose graphics processing units, or GPUs, are popular with researchers working on deep learning projects.

Artificial intelligence technologies like deep learning have grown in popularity over the years with tech giants like Google and Facebook that use the technologies to create software applications that can automatically do tasks like recognize images in photos.

As more businesses explore machine learning technology, companies like Google, Facebook, and others have created their own AI software frameworks, essentially coding tools, intended to make it easier for developers to create their own machine-learning powered software. These companies have also offered these AI frameworks for free in an open source model in order to popularize them with coders.

For the past few years, Google has been courting developers with its TensorFlow framework as the preferred coding tool for AI projects, and it developed its TPUs to work best with TensorFlow. The fact that Google is willing to update its TPUs to work with Facebook’s PyTorch software shows that the company wants to support more than its own AI framework, and potentially gain more cloud computing customers and researchers who may use competing frameworks.


“Data scientists and machine learning engineers have a wide variety of open source tools to choose from today when it comes to developing intelligent systems,” said Information Services Group principal analyst Blair Hanley Frank. “This announcement is a critical step to help ensure more people have access to the best hardware and software capabilities to create AI models.”

Frank said that he expects “more collaboration like this to crop up in the AI market.”

“Expanding framework support can help cloud providers like AWS, Google and Microsoft drive additional usage of their platforms,” Frank said. “That means it makes sense for them to support as broad a set of development tools as possible, to try and attract as many customers as they can.”

Besides Google, Facebook also said that Amazon and Microsoft are “deepening their investment” in its PyTorch software.



iPhone XS didn’t crack.

I took Apple’s new iPhone XS through my typical four-drop test, the same one that cracked last year’s iPhone X on the first fall. But while the new iPhone XS looks a lot like last year’s X, with a stainless steel frame and glass on either side, this time it may just be that glass that sets the iPhone XS apart — and above.

At last week’s launch, Phil Schiller, Apple’s marketing chief, said the iPhone XS is “covered on the front and the back with a new formulation of glass that is the most durable glass ever in a smartphone.”

This isn’t the first time we’ve heard this from Apple. In fact, Apple also said that the 2017 iPhone lineup had “the most durable glass ever built into a smartphone,” and you know what happened to our iPhone X.

I subjected a brand-new gold iPhone XS to a series of drops on the cement sidewalk outside of CNET’s San Francisco headquarters, the place where many of our phones have met their doom.

To be clear, these tests aren’t scientific, but they are real-world demonstrations of what could happen when your phone takes a tumble. The results tend to vary from drop to drop. And yes, I still plan to see just how much abuse this iPhone XS can take before it finally cracks.


Drop 1: Pocket height (3 feet), screen side down

I started off with a drop from pocket height, or about 3 feet (90 cm). This is a natural height from which people tend to drop their phones. This is also the same impact that cracked last year’s iPhone X.  

I wanted to test the most important part of the phone first, so I dropped it screen-side down. The top edge of the screen broke the fall, then the XS bounced on the bottom edge and did a little flip in the air before landing again, this time completely face down.


First test from hip height (screen side down). 

Celso Bulgatti/CNET

Our iPhone XS looked intact upon first inspection, save for a few scuffs on the metal frame. But upon closer inspection, I noticed that most of the “damage” was cement debris that just rubbed right off. The glass on the edge of the screen near the top left-hand corner had a tiny dent, but it was barely noticeable, and there were no cracks on either side.

Considering last year’s iPhone X had already cracked at this point, I would say it’s already a win for our iPhone XS — up to now. But the testing continued.

Drop 2: Pocket height (3 feet), screen side up

Next I wanted to test out the glass on the back, so I dropped the iPhone XS from the same height (3 feet), this time with the screen facing up.


Second drop: The stainless steel frame broke the fall. 

Celso Bulgatti/CNET

This time the phone changed positions in midair and landed on the top left-hand side, not on its face. After this initial impact, it bounced on the side of that stainless steel frame and then onto its back for its final landing.

Again, it was hard to pinpoint the damage. The frame looked like it had sustained a few more scrapes than before. There were tiny dents on the gold finish of the stainless steel, about the size of a grain of sand. The glass on the front and back of the phone was still intact.

With that one out of the way, I decided to go even higher.

Drop 3: Eye level (5 feet), free fall

For my next drop, I wanted to take it up to eye level, which is roughly the height at which it would fall from your hands if you’re taking a picture.

I held the phone in landscape mode with the screen facing me and let it go.


On the third drop from 5 feet, the iPhone XS landed on its corner. 

Once again the steel frame broke the iPhone XS’ fall. The first point of impact was the top-left corner of the phone, then it bounced on the bottom corner, rotated to hit the bottom edge and then slid out and landed screen side down on the edge of the sidewalk.

The tiny dents on the top left-hand corner of the frame had multiplied, but I had to inspect it closely to notice. Everything else still looked exactly the same. No major damage.

Drop 4: Eye level (5 feet), screen side down

I was running out of time to shoot our drop test, and the glass on the iPhone XS was still intact. For the last test, I decided to drop it again from 5 feet (1.5 meters), but this time starting out with the screen face down.


Fourth and final drop from 5 feet: The phone landed first on the corner where the rear-facing camera is located. 

Celso Bulgatti/CNET

Again, the phone did not land exactly how I wanted it to. Instead, it landed on the top-right corner toward the rear-facing camera, then did a couple of flips in the air before landing with the screen facing up.

This time there was a lot of cement debris on the camera, making me think it had scratched, but it wiped off easily. The edge of the camera bump had a few grain-sized dents on top, but the glass over the camera didn’t break. Everything else still looked pretty much the same as it did before this second 5-foot drop.

The breakdown

Based on how similar the iPhone XS looks to its predecessor, the iPhone X, I was expecting it to crack on the first drop or two. Clearly, I was wrong.

After four falls from up to 5 feet onto the concrete sidewalk, this iPhone XS came out almost intact. It has a few tiny dents and scrapes on the frame and the side of the camera, but the glass is nearly flawless.

Does that mean that the iPhone XS glass is stronger? That’s a tough call to make, given the nature of our tests. But I can tell you it fared significantly better than last year’s iPhone X, which ended up with cracks on both sides and tiny pieces of glass falling off the edges after only two drops from hip height.

I reached out to Apple for more information, but the company declined to give further details about the iPhone XS glass compared to that of the iPhone X. We do know that Corning has supplied glass for previous iPhones, but we don’t know whether or not the iPhone XS is covered in Corning’s latest Gorilla Glass 6. Corning also declined to comment for this story.

I would still recommend putting a case on your $1,000-plus iPhone XS and XS Max for some peace of mind. After all, it will cost you $279 to replace the XS screen and $329 for the XS Max (without AppleCare+ coverage). But maybe this means you can be a little more confident with them out and about.


Federal investigators reportedly monitored Michael Cohen’s phone calls

Federal investigators have kept records of Trump lawyer Michael Cohen’s phone calls, three senior officials told NBC News on Thursday.

Initially, NBC News reported federal investigators wiretapped Cohen’s phones, but that story was ultimately retracted and corrected to say investigators established what is known as a pen register.

A pen register is a log of calls made from a specific phone or phone lines and doesn’t constitute a wiretap, which would allow investigators to listen in on calls.

President Donald Trump has sought to distance himself from Cohen since his offices, hotel room and home were raided in early April.

Investigators who raided Cohen’s properties were said to be seeking records about a $130,000 payment he made shortly before the 2016 election to the porn star Stormy Daniels to ensure her silence about an alleged affair she says she had with Trump years ago.

On Wednesday, Trump’s lawyer Rudy Giuliani told Fox News that the president reimbursed Cohen for the payment, appearing to contradict the president’s earlier statements that he knew nothing about it.

Some have speculated about whether the payment violated campaign finance laws.

“Money from the campaign, or campaign contributions, played no roll in this transaction,” Trump said.

Trump and the White House have maintained that the president did not have an affair with Daniels.

Editor’s note: This story has been updated to reflect that federal investigators kept logs of Michael Cohen’s phone calls, creating a record of numbers dialed from phones in Cohen’s possession. Cohen’s phones were not wiretapped, as NBC News first reported.

Michael Cohen. Yana Paskova/Getty Images


Bearish analyst fears weak iPhone XS & iPhone XR sales, but still expects ASP to climb

Preorders for the iPhone XR are better than the equivalents for the iPhone XS and iPhone XS Max, according to Rosenblatt Securities’ Jun Zhang, but that hasn’t stopped the analyst from reducing expectations for the value-oriented iPhone release.

In a note received by AppleInsider, Zhang estimates iPhone XR preorders at “less than or equal to 12 million units,” based on a one-week wait time in China and no wait time in other major markets. In the firm’s September 17 note, the combined preorders for the iPhone XS and iPhone XS Max were around 10 million in the first three days.

After just one day of preorders, Zhang believes the initial sales of the iPhone XR are weaker than previous expectations, with 12 million units thought to be shipped to retail channels before October 26. Shipment estimates for the iPhone XR are also down overall for the second half of the year, reducing from 50 million units to 46 million.

It is suggested that, while the iPhone XS and iPhone XS Max preorders have been generally stronger in the United States than in China, the reverse has occurred for the iPhone XR, with tracked preorder data indicating approximately 2 million units in the first three days. The iPhone XR preorders are apparently similar to the iPhone 8 and 8 Plus preorders from last year, and higher than the 1.5 million iPhone XS and XS Max preorders.

“This is not a very exciting result,” muses Zhang, who believes iPhone XR demand will be lower than expected, prompting Apple to reduce its iPhone XR production for November and December by about 3 million to 4 million units.

Zhang is also cautious on iPhone XS sales for the second half of the year: while no production changes are anticipated ahead of the holiday season, weak sell-through data leads to expectations of a production reduction of around 4 million to 5 million units during the holiday season itself. The firm keeps its iPhone XS shipment estimate at 15 million.

The iPhone XS Max’s production is believed to have increased by 2 million to 3 million units for the fourth quarter, to build up inventory for holiday sales, though a sell-through cool-down is apparently occurring in China as consumers wait for the iPhone XR. Estimates for the second half of 2018 have increased from 15 million units shipped to 17.5 million.

For the Average Selling Price (ASP), Zhang suggests it will rise by $80 year-on-year, citing a production increase for the iPhone 7 and iPhone 7 Plus that apparently caused a lower ASP for the second half of 2017. It is also suggested the higher-capacity 512GB models will account for between 40 and 50 percent of total new model sales in 2018, further bumping the ASP upwards.

It isn’t clear why Zhang believes the iPhone 7 is gone, as it is still available in the product lineup. In the 2018 refresh, the iPhone 8 takes the slot the iPhone 7 occupied when the iPhone X launched.

iPad Pro production is believed to ramp up in October, lifting the iPad’s average selling price. A redesigned iPad Pro, anticipated to be revealed later this month at an Apple special event, is tipped to help drive a new iPad upgrade cycle.

Little is said about the Apple Watch, except that sales “continue to be strong,” which will help with fourth quarter guidance.

For Apple’s quarterly earnings report on November 1, Zhang believes revenue and earnings will be in line with forecast figures of $61 billion and $2.65 EPS, with guidance for the next quarter in line with or slightly better than the firm’s own predictions of $91.2 billion and $4.71, helped by new product releases and a higher ASP.

While Zhang’s comments are in line with some other analysts with regard to the iPhone XR’s demand, such as those made by Loup Ventures’ Gene Munster on Friday, Rosenblatt’s analysis of Apple’s supply chain and sales lie generally on the pessimistic side of the analyst pool. AppleInsider’s Daniel Eran Dilger has previously highlighted issues with Rosenblatt’s predictions relating to the iPhone, including proclamations about the iPhone X that turned out to be wrong.

Rosenblatt rates Apple’s shares as “Buy,” with a price target of $200; at the time of publication, AAPL trades at $219. Rosenblatt’s target has lagged behind Apple’s actual price for the last four years, with the analyst increasing the target over time, but never to where the stock stands at that point.
