Thursday, 11 April 2019

NVIDIA Releases DirectX Raytracing Driver for GTX Cards; Posts Trio of DXR Demos - AnandTech

Last month at GDC 2019, NVIDIA revealed that they would finally be enabling public support for DirectX Raytracing on non-RTX cards. Long baked into the DXR specification itself – which is designed to encourage ray tracing hardware development while also allowing the API to be implemented via traditional compute shaders – the addition of DXR support on cards without dedicated RT hardware is a small but important step in the deployment of the API and its underlying technology. At the time, NVIDIA said that the driver would be released in April, and this morning the company is releasing it.

As we covered in last month’s initial announcement of the driver, this has been something of a long time coming for NVIDIA. The initial development of DXR and the first DXR demos (including the Star Wars Reflections demo) were all handled on cards without hardware RT acceleration – in particular, NVIDIA’s Volta-based video cards. Microsoft offered its own fallback layer for a time, but for the public release it was going to be up to GPU manufacturers to provide support, including any fallback path of their own. So we have been expecting the release of this driver in some form for quite some time.

Of course, the elephant in the room in enabling DXR on cards without RT hardware is what it will do for performance – or perhaps the lack thereof. High-quality RT features already bog down NVIDIA’s best RTX cards that do have the hardware for the task, never mind their non-RTX cards, which are all either older (GeForce 10 series) or lower-tier (GeForce 16 series) than the flagship GeForce 20 series cards. This actually has NVIDIA a bit worried – they don’t want someone with a GTX 1060 turning on Ultra mode in Battlefield V and wondering why it’s taking seconds per frame – so the company has been on a campaign both at GDC and ahead of the driver’s launch to better explain the different types of common RT effects, and why some RT effects are more expensive than others.

The long and short of it is that simple reflections and shadows can be had without terrible performance drops on cards that lack RT hardware; however, the more rays an effect requires, the worse the performance hit gets (or perhaps, the better an RTX card looks by comparison). So particularly demanding effects like RT global illumination and accurate ambient occlusion are out, while cheap reflections (which are always a crowd pleaser) are more attainable.

This all varies with the game and the settings used, of course. NVIDIA has been working with partners to improve their DXR effect implementations – an effort that has actually been fairly successful over the last half-year, going by some of the earliest games – but it’s still a matter of tradeoffs depending on the game and video card used. Much to my own surprise, however, NVIDIA says that it isn’t expecting game developers to release patches specifically to account for DXR support on cards without RT hardware; patches aren’t strictly required, since DXR abstracts away the hardware difference, though it’s still up to developers to account for the performance difference. In this case, it sounds like game devs are satisfied that they’ve provided enough DXR quality settings that users will be able to dial things down for slower cards. But as always, the proof is in the results, which everyone will be able to see first-hand soon enough.
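As a rough illustration of that abstraction, the capability check a DXR-aware game already performs should simply start returning a positive result on supported GTX cards once the new driver is installed. Below is a minimal sketch using the standard D3D12 feature query – an illustration only, not NVIDIA's code, with device creation and error handling elided:

```cpp
// Minimal sketch (not NVIDIA's code): the standard D3D12 feature query an
// application uses to ask whether DXR is available. Device creation and
// error handling are elided.
#include <windows.h>
#include <d3d12.h>

bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;

    // With the new driver installed, supported GTX cards should report Tier 1.0
    // here even though the rays end up being traced on the shader cores rather
    // than dedicated RT hardware.
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```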

Ahead of this driver release, NVIDIA has put out some of their own performance numbers. And while they’re harmless enough, they are all done at 1440p with everything cranked up to Ultra quality, so they do present a sort of worst case scenario for cards without RT hardware. The RT quality settings GTX card owners will want to use will be much lower than what NVIDIA does here.

As a reminder, while NVIDIA’s DXR fallback layer is meant to target Pascal and Turing cards that lack RT hardware, not all of these cards are supported. Specifically, the low-end Pascal family isn’t included, so support starts with the GeForce GTX 1060 6GB, as well as NVIDIA’s (thus far) two GTX 16 series cards, the GTX 1660 and GTX 1660 Ti.

Overall the new driver is being released this morning at the same time as this news post goes up – 9am ET. And while NVIDIA hasn’t confirmed the driver build number or given the press an advance look at the driver, this driver should be the first public driver in NVIDIA’s new Release 430(?) driver branch. In which case there’s going to be a lot more going on with this driver than just adding DXR support for more cards; NVIDIA’s support schedule calls for Mobile Kepler to be moved to legacy status this month, so I’m expecting that this will be the first driver to omit support for those parts. New driver branches are some of the most interesting driver releases from NVIDIA, since these are normally the break points where they introduce new features under the hood, so I’m eager to see what they have been up to since R415/R418 was first released back in October.

DXR Tech Demo Releases: Reflections, Justice, and Atomic Heart

Along with today’s driver release, NVIDIA and its partners are also releasing a trio of previously announced/demonstrated DXR tech demos. These include the Star Wars Reflections demo, Justice, and Atomic Heart.

These demos have been screened extensively by NVIDIA, Epic, and others, so admittedly there’s nothing new to see that you wouldn’t have already seen in their respective YouTube videos. However, as an aficionado of proper public tech demo binary releases – something that’s become increasingly rare these days (Tim, I need Troll!) – it’s great to see these demos finally released to the public. After all, seeing is believing; and seeing something rendered in real time is a lot more interesting than seeing a recorded video of it.

Anyhow, all three demos are going to be released through NVIDIA today. What I’m being told is that Reflections and Justice will be hosted directly by NVIDIA, whereas Atomic Heart will be hosted off-site, for anyone keeping score. For NVIDIA, of course, it’s in their own best interests to put their best foot forward with RT, and to have something a bit more curated and forward-looking than the current crop of games; though I don’t imagine it hurts either that these demos should bring any GTX card to its knees rather quickly.



https://www.anandtech.com/show/14203/nvidia-releases-dxr-driver-for-gtx-cards

2019-04-11 13:01:52Z

NVIDIA shows how much ray-tracing sucks on older GPUs - Engadget


NVIDIA recently announced that ray-tracing is coming to older Pascal GPUs, and now it has detailed how well -- or not well, rather -- it will actually work. If you're happy with basic effects, the news isn't too bad. The RTX 2080 Ti will outperform its GTX 1080 Ti counterpart by a factor of just over two for reflections, in line with what you'd expect from a next-gen card. However, for stuff that really adds realism, like advanced shadows, global illumination and ambient occlusion, the RTX 2080 Ti outperforms the 1080 Ti by up to a factor of six.

To cite some specific examples, Port Royal will run on the RTX 2080 Ti at 53.3 fps at 2,560 x 1,440 with advanced reflections and shadows, along with DLSS anti-aliasing, turned on. The GTX 1080, on the other hand, will run at just 9.2 fps with those features enabled and won't give you any DLSS at all. That effectively makes the feature useless on those cards for that game. With basic reflections on Battlefield V, on the other hand, you'll see 30 fps on the 1080 Ti compared to 68.3 on the 2080 Ti.

[Chart: NVIDIA ray-tracing performance on Pascal cards]

Meanwhile, if you want ambient occlusion, which delivers subtle shadow and lighting effects, you'll be able to run that at 59.5 fps on the RTX 2080 Ti (in an RTX tech demo), 47.6 fps on the RTX 2080, 33.7 fps on the RTX 2070 and 31.1 fps on the RTX 2060. The GTX 1080 Ti, by contrast, will only hit an unusable 9.4 fps, with the GTX 1080, 1070 and 1060 managing 6.8, 5.2 and 3.5 fps respectively. Other realism-oriented features, like global illumination, show similar performance drops on the older Pascal cards.
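For a quick sanity check on those headline ratios, here is the arithmetic on the frame rates quoted above (nothing beyond the numbers already cited):

```cpp
// Back-of-the-envelope check on the speedups implied by the frame rates above.
#include <cstdio>

int main()
{
    // Battlefield V, basic reflections (fps)
    const double bfv_rtx2080ti = 68.3, bfv_gtx1080ti = 30.0;
    // RTX ambient occlusion tech demo (fps)
    const double ao_rtx2080ti = 59.5, ao_gtx1080ti = 9.4;

    std::printf("Reflections:       %.1fx advantage for the RTX 2080 Ti\n",
                bfv_rtx2080ti / bfv_gtx1080ti);   // ~2.3x, the "just over double"
    std::printf("Ambient occlusion: %.1fx advantage for the RTX 2080 Ti\n",
                ao_rtx2080ti / ao_gtx1080ti);     // ~6.3x, the "factor of six"
    return 0;
}
```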

What this means is that despite NVIDIA's promise to offer ray-tracing on Pascal, the features are largely unworkable for real-life gaming. This isn't terribly surprising: the newer RTX cards feature banks of cores dedicated solely to accelerating ray-tracing and DLSS anti-aliasing. NVIDIA had also already said that performance would be poor, but now we can see exactly how much that dedicated hardware accelerates those effects compared to the older cards.

So what good will the feature do you then? Well, you can always try out your favorite game and turn ray-tracing on and off to see whether you think it's worth using. That's certainly better than not having the effect at all. Mostly, it appears as if NVIDIA wanted to tamp down expectations before the features are released in the wild -- and looking at the numbers, it has succeeded in that.

[Chart: NVIDIA ray-tracing on older Pascal cards]



https://www.engadget.com/2019/04/11/nvidia-shows-how-much-ray-tracing-sucks-on-older-gpus/

2019-04-11 13:00:57Z

An Amazon employee might have listened to your Alexa recording - Engadget


Yes, someone might listen to your Alexa conversations someday. A Bloomberg report has detailed how Amazon employs thousands of full-timers and contractors from around the world to review audio clips from Echo devices. Apparently, these workers transcribe and annotate recordings, which they then feed back into the software to make Alexa smarter than before. The process helps beef up the voice AI's understanding of human speech, especially for non-English-speaking countries or for places with distinctive regional colloquialisms. In French, for instance, an Echo speaker could hear avec sa ("with his" or "with her") as "Alexa" and treat it as a wake word.

The contents of the report aren't entirely surprising. Back in January, it was revealed that Amazon-owned Ring gave a large number of employees access to users' video feeds so they could manually identify people and vehicles. The data they gather is used to improve the system's capacity to identify cars and visitors on its own. Like Facebook, which outsources its traumatizing moderation tasks to other countries, Amazon has people transcribing audio in Costa Rica, India and Romania. The project also has workers based in Boston, however.

According to the workers Bloomberg talked to, they sometimes listen to recordings that contain sensitive information, or ones that were clearly recorded in error. Two of those workers, based in Romania, said they had to listen to what could have been a sexual assault. They were apparently told that they couldn't do anything about it, because it's not Amazon's job to interfere.

Amazon has admitted to the publication that it's employing human workers to annotate Alexa voice recordings. A spokesperson defended the company's practices, however, telling Bloomberg that the e-commerce giant only listens to "an extremely small sample" and that its employees do not have access to identifying information:

"We take the security and privacy of our customer' personal information seriously. We only annotate an extremely small sample of Alexa voice recordings in order [to] improve the customer experience. For example, this information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone.

We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system. Employees do not have direct access to information that can identify the person or account as part of this workflow. All information is treated with high confidentiality and we use multi-factor authentication to restrict access, service encryption and audits of our control environment to protect it."



https://www.engadget.com/2019/04/11/amazon-alexa-voice-recording-human-review/

2019-04-11 11:15:52Z

Smart speaker recordings reviewed by humans - BBC News

Amazon, Apple and Google all employ staff who listen to customer voice recordings from their smart speakers and voice assistant apps.

News site Bloomberg highlighted the topic after speaking to Amazon staff who "reviewed" Alexa recordings.

All three companies say voice recordings are occasionally reviewed by humans to improve speech recognition.

But the reaction to the Bloomberg article suggests many customers are unaware that humans may be listening.

The news site said it had spoken to seven people who reviewed audio from Amazon Echo smart speakers and the Alexa service.

Reviewers typically transcribed and annotated voice clips to help improve Amazon's speech recognition systems.

Amazon's voice recordings are associated with an account number, the customer's first name and the serial number of the Echo device used.

Some of the reviewers told Bloomberg that they shared amusing voice clips with one another in an internal chat room.

They also described hearing distressing clips such as a potential sexual assault. However, they were told by colleagues that it was not Amazon's job to intervene.

What did Amazon say?

The terms and conditions for Amazon's Alexa service state that voice recordings are used to "answer your questions, fulfil your requests, and improve your experience and our services". Human reviewers are not explicitly mentioned.

In a statement, Amazon said it took security and privacy seriously and only annotated "an extremely small sample of Alexa voice recordings".

"This information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone," it said in a statement.

"We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system. Employees do not have direct access to information that can identify the person or account as part of this workflow."

What about Apple and Siri?

Apple also has human reviewers who make sure its voice assistant Siri is interpreting requests correctly.

Siri records voice commands given through the iPhone and HomePod smart speaker.

According to Apple's security policy, voice recordings lack personally identifiable information and are linked to a random ID number, which is reset every time Siri is switched off.

Any voice recordings kept after six months are stored without the random ID number.

Its human reviewers never receive personally identifiable information or the random ID.
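Taken together, Apple's description amounts to a simple retention scheme: a device-linked random ID that is regenerated whenever Siri is switched off and back on, and that is stripped entirely from recordings older than six months. The sketch below is purely illustrative of that policy as described; the type and field names are hypothetical, not Apple's.

```cpp
// Purely illustrative sketch of the retention policy described above; the
// names and structure are hypothetical, not Apple's actual implementation.
#include <chrono>
#include <cstdint>
#include <optional>
#include <random>
#include <string>
#include <vector>

struct SiriRecording {
    std::string audioPath;                        // the stored voice clip
    std::optional<uint64_t> randomId;             // device-linked random ID, no personal info
    std::chrono::system_clock::time_point taken;  // when the clip was recorded
};

// A fresh random ID is generated when Siri is switched off and back on,
// breaking the link to earlier recordings.
uint64_t NewRandomId() {
    static std::mt19937_64 rng{std::random_device{}()};
    return rng();
}

// Recordings kept past roughly six months lose even the random ID.
void AgeOutIds(std::vector<SiriRecording>& recordings) {
    using namespace std::chrono;
    const auto cutoff = system_clock::now() - hours{24 * 30 * 6};
    for (auto& r : recordings)
        if (r.taken < cutoff)
            r.randomId.reset();
}

int main() {
    std::vector<SiriRecording> recordings;
    recordings.push_back({"clip-0001.wav", NewRandomId(), std::chrono::system_clock::now()});
    AgeOutIds(recordings);  // this clip is new, so its random ID is kept for now
}
```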

What about Google and Assistant?

Google said human reviewers could listen to audio clips from its Assistant, which is embedded in most Android phones and the Home speaker.

It said clips were not associated with personally identifiable information and the company also distorted the audio to disguise the customer's voice.

Are smart speakers recording all my conversations?

A common fear is that smart speakers are secretly recording everything that is said in the home.

While smart speakers are technically always "hearing", they are typically not "listening" to your conversations.

All the major home assistants record and analyse short snippets of audio internally, in order to detect a wake word such as "Alexa", "Ok Google" or "Hey Siri".

If the wake word is not heard, the audio is discarded.

But if the wake word is detected, the audio is kept and recording continues so that the customer's request can be sent to the voice recognition service.

It would be easy to detect if a speaker was continuously sending entire conversations back to a remote server for analysis, and security researchers have not found evidence to suggest this is happening.
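A rough sketch of that wake-word gating might look like the following. The helper functions are hypothetical stand-ins (stubbed out so the sketch compiles), not any vendor's actual API, and real assistants run this logic in low-power firmware rather than application code:

```cpp
// Rough sketch of the wake-word gating described above. The helper functions
// are hypothetical stand-ins (stubbed out so the sketch compiles), not any
// vendor's actual API; real assistants run this in low-power firmware.
#include <cstdint>
#include <vector>

using AudioSnippet = std::vector<int16_t>;

AudioSnippet CaptureAudioSnippet()                   { return {}; }     // short buffer from the microphone
bool DetectsWakeWord(const AudioSnippet&)            { return false; }  // local, on-device check only
void StreamToRecognitionService(const AudioSnippet&) {}                 // cloud round-trip for the request

int main()
{
    for (;;) {
        AudioSnippet snippet = CaptureAudioSnippet();
        if (!DetectsWakeWord(snippet))
            continue;  // no wake word heard: the snippet is simply discarded

        // Wake word detected: keep recording and send only the request upstream
        // to the voice recognition service.
        AudioSnippet request = CaptureAudioSnippet();
        StreamToRecognitionService(request);
    }
}
```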

Can I stop human reviewers listening to my voice clips?

Amazon's Alexa privacy settings do not let you opt out of voice recording or human review, but you can stop your recordings being used to "help develop new features". You can also listen to and delete previous voice recordings.

Google lets you listen to and delete voice recordings on the My Activity page. You can also switch off "web and app history tracking" and "voice and audio activity", which Google Assistant pesters you to switch on.

Apple does not let you listen back to Siri recordings. Its privacy portal, which lets you download a copy of your personal data, says it cannot provide information "that is not personally identifiable or linked to your Apple ID".

To delete voice recordings created by Siri on an iOS device, go to the Siri & Search menu in Settings and switch Siri off. Then go to the Keyboard menu (found in the General section) and switch off Dictation.



https://www.bbc.com/news/technology-47893082

2019-04-11 10:51:19Z

Sony's popular A7 III camera now tracks your pet's eyes - Engadget


Sony is making its already top-notch A7 III and A7R III cameras better with the release of a new firmware update. It introduces a fun AI feature for pet owners called animal eye detection. When set to continuous tracking focus mode, it can focus on your dog's or cat's eyes, ensuring the eyes stay sharp rather than, say, the muzzle or fur.

It's a lot harder to detect an animal's eyes because their faces tend to have tricky, bumpy geometry and features like button noses that resemble eyes, Sony notes. It added that the technology was made possible by a combination of extensive machine learning and the cameras' fast processors. While it only works on cats and dogs for now, Sony said that "future updates will bring recognition and tracking for other wildlife including birds in flight," making it a useful option for wildlife photographers.

While animal eye detection is the attention-grabbing feature, Sony has also introduced real-time Eye AF for humans, too. It can track your subjects' eyes and re-acquire them in real time, even if they turn around or look down, without having to first lock in on their face. The result, Sony said, "is an eye detection system that's much faster than any previous system." That's saying a lot, since Sony's eye-tracking system was already the best one on the market.

Sony also introduced an interval time lapse timer that can be set for anywhere between one second and one minute for up to 9,999 shots. The new firmware updates are now available for the Sony A7 III and A7R III, with more info about the new features available here.
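For a sense of what those interval-timer limits work out to in practice, here is the back-of-the-envelope math (the 30 fps playback rate for the assembled clip is an assumption for illustration, not a Sony specification):

```cpp
// Quick arithmetic on the interval-timer limits quoted above. The 30 fps
// playback rate is an illustrative assumption, not a Sony specification.
#include <cstdio>

int main()
{
    const int maxShots = 9999;
    const int minIntervalSec = 1, maxIntervalSec = 60;
    const double playbackFps = 30.0;

    std::printf("Shortest full run: %.1f hours of shooting\n",
                maxShots * minIntervalSec / 3600.0);          // ~2.8 hours
    std::printf("Longest full run:  %.1f days of shooting\n",
                maxShots * maxIntervalSec / 86400.0);         // ~6.9 days
    std::printf("Assembled clip:    %.1f minutes at %.0f fps\n",
                maxShots / playbackFps / 60.0, playbackFps);  // ~5.6 minutes
    return 0;
}
```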



https://www.engadget.com/2019/04/11/sony-a7iii-a7riii-animal-eye-detection/

2019-04-11 09:29:54Z

Apple persuades Foxconn and TSMC to use only renewable energy when making iPhones - The Verge

Apple has persuaded 15 more of its suppliers, including Foxconn and TSMC, to manufacture Apple products using 100 percent clean energy. The additions bring the total number of suppliers in the program up to 44. Apple says it now expects to exceed its goal of using four gigawatts of renewable energy in its supply chain by 2020 by an additional gigawatt.

In April last year Apple announced that its facilities now run entirely on renewable energy, and in October the company added that it had achieved the same goal for its retail locations. But, as CNBC notes, Apple’s own facilities only account for just over a quarter of its carbon footprint. The other 74 percent comes from its factory partners.

Today’s announcement says that these suppliers have committed to the renewable goal, not that they have already achieved it. Speaking to Reuters, Apple’s vice president of Environment, Policy and Social Initiatives, Lisa Jackson, refused to comment on whether Apple would drop suppliers such as Foxconn and TSMC if they failed to honor these commitments.

“It took a while for them to come on board, and so we believe that now that they have, they’re fully committed to doing it… and obviously if they fall down on the job, we’ll be right there on their chase. I can’t tell you what will happen, but I hope it never does.”

Apple’s environmental goals have proven to be points of contention with some Apple investors. In 2014, for example, one Apple shareholder raised concerns about the company’s environmental initiatives, objecting to “affiliations that may primarily advance social or environmental causes rather than promoting shareholder value.” The concerns were rejected by the company’s shareholders as a whole, and Tim Cook said “you should get out of this stock” if they expected him to be motivated solely by money.

Earlier this year Apple said in a filing that it considers climate change to present a risk to its business operations.



https://www.theverge.com/2019/4/11/18305840/apple-foxconn-tsmc-renewable-energy-supply-chain-environment-clean-green

2019-04-11 08:25:13Z

Thousands of Amazon workers listen to recordings from Alexa: reports - Fox News

Alexa is like having your own personal assistant that never asks for a raise. The problem is she’s always listening – and so are thousands of Amazon workers, according to a report.

Teams stationed around the world listen to and transcribe recordings, then feed them back into the Echo’s software to close the gaps in Alexa’s ability to understand speech, a report from Bloomberg said.


Sometimes the workers can even hear chatter in the background while Alexa is on, but employees on the team are not authorized to speak about their work, Bloomberg reported.

The employees, who range from contract to full-time, reportedly sign nondisclosure agreements and listen to up to 1,000 audio clips per nine-hour shift.

Amazon reportedly has procedures in place for when potential criminal conduct is heard, but two workers in Romania told Bloomberg that they were told it isn’t Amazon’s job to interfere. In other cases, the workers said they sometimes use internal chatrooms to share recordings they find amusing.

When workers come across a background conversation containing personal information, such as bank details, the worker is supposed to mark the audio file as “critical data” and move on, according to Bloomberg.

“We take the security and privacy of our customers’ personal information seriously,” an Amazon spokesman told Bloomberg in a written statement. “We only annotate an extremely small sample of Alexa voice recordings in order [to] improve the customer experience. For example, this information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone.

Amazon did not immediately respond to an email from Fox News.

Alexa users can opt out of having their voice recordings used to develop new features.

Apple’s Siri and Google Assistant also have human workers who listen to snippets of audio, but both companies told Bloomberg that the recordings aren't linked to personally identifying information.


“We have strict technical and operational safeguards, and have a zero tolerance policy for the abuse of our system. Employees do not have direct access to information that can identify the person or account as part of this workflow. All information is treated with high confidentiality and we use multi-factor authentication to restrict access, service encryption and audits of our control environment to protect it,” Amazon told Bloomberg.



https://www.foxnews.com/tech/thousands-of-amazon-workers-listening-to-alexa-recordings-hear-personal-information-even-potential-crimes-report

2019-04-11 07:37:58Z