Sunday, 23 February 2020

This Alexa kitchen playset is clever — and also really creepy - Tom's Guide

NEW YORK CITY — Amazon’s Alexa is making its debut in the children’s toy room later this year, and I’m not too sure how I feel about it. KidKraft is betting parents will be less hesitant once they get a look at the 2-in-1 Alexa Kitchen and Market, a 360-degree playset powered by a voice assistant.

The $299 2-in-1 Alexa Kitchen and Market debuted at Toy Fair 2020 and is powered by RFID sensors and, of course, Alexa. When kids play with the collection of accessories, Alexa walks them through recipes, helps make shopping lists, plays guessing games and provides a healthy dose of puns.

By bringing the play pattern of kitchen sets into a voice-driven toy, KidKraft is offering a bridge between children and AI. 

On one hand, it’s a screen-free, interactive experience. Kids will enjoy the set’s details, and the two-sided setup fosters playtime with friends or siblings. Plus your child won’t be able to control smart home devices, ask Alexa the weather, or do anything else with their voice while the kitchen set’s skill is in use. 

But a toy of this new breed raises a few concerns. I can’t help but question the social implications of making Alexa a child’s on-demand playmate. Speaking to Alexa often requires a curt, commanding tone and could condition kids to expect to get whatever they want the moment they ask.  

When I was a kid I had a second-hand kitchen set with broken parts and chipped paint and loved it dearly. I used my imagination for recipes, and it’s all I needed. 

The 2-in-1 Alexa Kitchen and Market is a massive deviation from the idyllic, uncomplicated playtime of the past, but it also seems inevitable. As Alexa becomes a household name, it's expected that every member of a home will know how to use it.

KidKraft’s 2-in-1 Alexa Kitchen and Market is coming in late 2020 and will cost $299. That price doesn’t include an Echo speaker, but I’d recommend pairing it with an Echo Dot Kids Edition.

https://news.google.com/__i/rss/rd/articles/CBMiWmh0dHBzOi8vd3d3LnRvbXNndWlkZS5jb20vbmV3cy90aGlzLWFsZXhhLWtpdGNoZW4tcGxheXNldC1pcy1jbGV2ZXItYW5kLWFsc28tcmVhbGx5LWNyZWVwedIBXmh0dHBzOi8vd3d3LnRvbXNndWlkZS5jb20vYW1wL25ld3MvdGhpcy1hbGV4YS1raXRjaGVuLXBsYXlzZXQtaXMtY2xldmVyLWFuZC1hbHNvLXJlYWxseS1jcmVlcHk?oc=5

2020-02-23 15:44:00Z

Ten rules for placing your Wi-Fi access points - Ars Technica

The top floor of our test house is relatively straightforward—although like many houses, it suffers from terrible router placement nowhere near its center.
Jim Salter
Here at Ars, we've spent a lot of time covering how Wi-Fi works, which kits perform the best, and how upcoming standards will affect you. Today, we're going to go a little more basic: we're going to teach you how to figure out how many Wi-Fi access points (APs) you need, and where to put them.

These rules apply whether we're talking about a single Wi-Fi router, a mesh kit like Eero, Plume, or Orbi, or a set of wire-backhauled access points like Ubiquiti's UAP-AC line or TP-Link's EAPs. Unfortunately, these "rules" are necessarily closer to "guidelines" as there are a lot of variables it's impossible to fully account for from an armchair a few thousand miles away. But if you become familiar with these rules, you should at least walk away with a better practical understanding of what to expect—and not expect—from your Wi-Fi gear and how to get the most out of it.

Before we get started

Let's go over one bit of RF (radio-frequency) theory before we get started on our ten rules—some of them will make much better sense if you understand how RF signal strength is measured and how it attenuates over distance and through obstacles.

Note: some RF engineers recommend -65dBm as the lowest signal level for maximum performance.
Jim Salter

The above graph gives us some simple free space loss curves for Wi-Fi frequencies. The most important thing to understand here is what the units actually mean: dBm converts directly to milliwatts, but on a logarithmic base-ten scale. For each 10dBm drop, the actual signal strength in milliwatts drops by a factor of ten: -10dBm is 0.1mW, -20dBm is 0.01mW, and so forth.
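
To make the unit concrete, here is a minimal Python sketch of the dBm/milliwatt relationship described above; the printed values are simply the ones quoted in this section.

```python
import math

def dbm_to_mw(dbm):
    """Convert a dBm figure to milliwatts (0 dBm = 1 mW, base-ten logarithmic scale)."""
    return 10 ** (dbm / 10)

def mw_to_dbm(mw):
    """Convert milliwatts back to dBm."""
    return 10 * math.log10(mw)

# Each 10 dBm step is a factor of ten in milliwatts, as described above.
print(dbm_to_mw(-10))               # 0.1 mW
print(dbm_to_mw(-20))               # 0.01 mW
print(round(mw_to_dbm(0.0001), 1))  # -40.0 dBm, the 1 m free-space figure on the 2.4GHz curve
```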

The logarithmic scale makes it possible to measure signal loss additively, rather than multiplicatively. Each doubling of distance drops the signal by 6dB, as we can clearly see when we look at the bold red 2.4GHz curve: at 1m distance, the signal is -40dBm; at 2m, it's -46dBm; and at 4m, it's down to -52dBm.

Walls and other obstructions—including but not limited to human bodies, cabinets and furniture, and appliances—will attenuate the signal further. A good rule of thumb is -3dB for each additional wall or other significant obstruction, which we'll talk more about later. You can see additional curves plotted above in finer lines for the same distances including one or two additional walls (or other obstacles).
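
Those two rules of thumb (6dB per doubling of distance, roughly 3dB per wall) can be rolled into a rough back-of-the-envelope calculator. This is a sketch rather than anything from the article itself; the -40dBm reference at 1m is read off the 2.4GHz curve above.

```python
import math

def estimated_signal_dbm(distance_m, walls=0, ref_dbm_at_1m=-40.0):
    """Rule-of-thumb received signal: 6 dB lost per doubling of distance
    (i.e. 20*log10(distance)) plus roughly 3 dB per interior wall or similar obstacle."""
    free_space_loss_db = 20 * math.log10(distance_m)  # 6 dB per doubling
    wall_loss_db = 3 * walls
    return ref_dbm_at_1m - free_space_loss_db - wall_loss_db

# The 2.4GHz points quoted above: -40 dBm at 1 m, -46 dBm at 2 m, -52 dBm at 4 m.
for d in (1, 2, 4):
    print(d, "m:", round(estimated_signal_dbm(d), 1), "dBm")
```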

While you should ideally have signal levels no lower than -67dBm, you shouldn't fret about trying to get them much higher than that—typically, there's no real performance difference between a blazing-hot -40dBm and a considerably-cooler -65dBm, as far away from one another on a chart as they may seem. There's a lot more going on with Wi-Fi than just raw signal strength; as long as you exceed that minimum, it doesn't really matter how much you exceed it by.

In fact, too hot of a signal can be as much of a problem as too cold—many a forum user has complained for pages about low speed test results, until finally some wise head asks "did you put your device right next to the access point? Move it a meter or two away, and try again." Sure enough, the "problem" resolves itself.

Rule 1: No more than two rooms and two walls

Our first rule for access point placement is no more than two rooms and two interior walls between access points and devices, if possible. This is a pretty fudge-y rule, because different rooms are shaped and sized differently, and different houses have different wall structures—but it's a good starting point, and it will serve you well in typically-sized houses and apartments with standard, reasonably modern sheet rock interior wall construction.

"Typically-sized," at least in most of the USA, means bedrooms about three or four meters per side and larger living areas up to five or six meters per side. If we take nine meters as the average linear distance covering "two rooms" in a straight line, and add in two interior walls at -3dBM apiece, our RF loss curve shows us that 2.4GHz signals are doing fantastic at -65dBM. 5GHz, not so much—if we need a full nine meters and two full walls, we're down to -72dBM at 5GHz. This is certainly enough to get a connection, but it's not great. In real life, a device at -72dBM on 5GHz will likely see around the same raw throughput as one at -65dBM on 2.4GHz—but the technically slower 2.4GHz connection will tend to be more reliable and exhibit consistently lower latency.

Of course, this all assumes that distance and attenuation are the only problems we face. Rural users—and suburban users with large yards—will likely have already noticed this difference and internalized the rule-of-thumb "2.4GHz is great, but man, 5GHz sucks." Urban users—or suburban folks in housing developments with postage-stamp yards—tend to have a different experience entirely, which we'll cover in Rule 2.


When Ars approaches mesh networking, we come prepared. (L to R: Google WiFi, Plume pods, and AmpliFi pods)
Jim Salter

Rule 2: Too much transmit power is a bug

The great thing about 2.4GHz Wi-Fi is the long range and effective penetration. The bad thing about 2.4GHz Wi-Fi is... the long range and effective penetration.

If two Wi-Fi devices within "earshot" of one another transmit on the same frequency at the same time, they accomplish nothing: the devices they were transmitting to have no way of unscrambling the signal and figuring out which bits were meant for them. Contrary to popular belief, this has nothing to do with whether a device is on your network or not—Wi-Fi network name and even password have no bearing here.

In order to (mostly) avoid this problem, any Wi-Fi device has to listen before transmitting—and if any other device is currently transmitting on the same frequency range, yours has to shut up and wait for it to finish. This still doesn't entirely alleviate the problem; if two devices both decide to transmit simultaneously, they'll "collide"—and each has to pick a random amount of time to back off and wait before trying to transmit again. The device that picks the lower random number gets to go first—unless they both picked the same random number, or some other device notices the clean air and decides to transmit before either of them.
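
A toy illustration of that listen-then-back-off behavior, deliberately simplified (this is not the real 802.11 CSMA/CA state machine, just the random-number race described above):

```python
import random

def contend(stations, backoff_slots=16):
    """Each waiting station picks a random backoff slot; the lowest pick transmits first.
    A tie means a collision, so everyone backs off and picks again."""
    attempts = 0
    while True:
        attempts += 1
        picks = {name: random.randrange(backoff_slots) for name in stations}
        lowest = min(picks.values())
        winners = [name for name, slot in picks.items() if slot == lowest]
        if len(winners) == 1:
            return winners[0], attempts

winner, attempts = contend(["laptop", "phone", "smart TV"])
print(winner, "transmits first after", attempts, "round(s) of contention")
```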

This is called "congestion," and for most modern Wi-Fi users, it's at least as big a problem as attenuation. The more devices you have, the more congested your network is. And if they're using the same Wi-Fi channel, the more devices your neighbors have, the more congested both of your networks are—your devices and your neighbors' devices still contend with one another for airtime, and still have to respect the same rules.

If your own router or access points support it, turning your transmission strength down can actually improve performance and roaming significantly—especially if you've got a mesh kit or other multiple-AP setup. 5GHz typically doesn't need to be detuned this way, since that spectrum already attenuates pretty rapidly—but it can work wonders for 2.4GHz.

A final note for those tempted to try "long-range" access points: a long-range AP can certainly pump its own signal hotter than a typical AP, and blast that signal a greater distance. But what it can't do is make your phone or laptop boost its signal to match. With this kind of imbalanced connection scenario, individual pieces of a website might load rapidly—but the whole experience feels "glitchy," because your phone or laptop struggles to upload the tens or hundreds of individual HTTP/S requests necessary to load each single webpage in the first place.

Rule 3: Use spectrum wisely

In Rule 2, we covered the fact that any device on the same channel competes with your devices for airtime, whether on your network or not. Most people won't have good enough relationships with their neighbors to convince them to turn their transmission strength down—if their router even supports that feature—but you can, hopefully, figure out what channels neighboring networks use and avoid them.

This is usually not going to be an issue with 5GHz, but for 2.4GHz it can be a pretty big deal. For that reason, we recommend that most people avoid 2.4GHz as much as possible. Where you can't avoid it, though, use an app like inSSIDer to take a look at your RF environment every now and then, and try to avoid re-using the busiest spectrum as seen in your house.

This is, unfortunately, trickier than it looks—it doesn't necessarily matter how many SSIDs you can see on a given channel; what matters is how much actual airtime is in use, and you can't get that from either SSID count or raw signal strength in the visible SSIDs. InSSIDer lets you go a step further, and look at the actual airtime utilization on each channel.

This inSSIDer chart shows you how busy each visible Wi-Fi channel is. The entire 2.4GHz spectrum is pretty much eaten alive, here.

In the above inSSIDer chart, the whole 2.4GHz spectrum is pretty much useless. Don't get excited by those "empty" channels 2-5 and 7-10, by the way: 2.4GHz Wi-Fi gear defaults to 20MHz bandwidth, which means a network actually uses five channels (20MHz plus a half-channel margin on each side), not one. Networks on "Channel 1" actually extend from a hypothetical "Channel negative one" to Channel 3. Networks on Channel 6 really extend from Channel 4 through Channel 8, and networks set to Channel 11 actually occupy Channel 9 through Channel 13.
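
The arithmetic behind that overlap is easy to check: 2.4GHz channel centers sit 5MHz apart, so a 20MHz-wide transmission centered on channel N reaches the centers of channels N-2 through N+2 (shoulders not counted). A quick sketch:

```python
def overlapped_channels(channel, width_mhz=20, spacing_mhz=5):
    """Return the 2.4GHz channel numbers whose centers fall inside a transmission
    of the given width centered on `channel` (ignoring the extra shoulder)."""
    half_span = width_mhz // (2 * spacing_mhz)  # 20 MHz -> 2 channel numbers each side
    return list(range(channel - half_span, channel + half_span + 1))

for ch in (1, 6, 11):
    print(ch, "->", overlapped_channels(ch))
# 1 -> [-1, 0, 1, 2, 3]; 6 -> [4..8]; 11 -> [9..13]
```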

Counting the "shoulder," a 20MHz wide 2.4GHz spectrum "channel" actually occupies a little more than four actual 5MHz channels.

Congestion is a much smaller issue with 5GHz networks, because the much lower range and penetration means fewer devices to congest with. You'll frequently hear claims that there are also more 5GHz channels to work with, but in practice that bit isn't really true unless you're engineering Wi-Fi for an enterprise campus with no competing networks. Residential 5GHz Wi-Fi routers and access points are generally configured for either 40MHz or 80MHz bandwidth, which means there are effectively only two non-overlapping channels: the low band, consisting of 5MHz channels 36-64, and the high band, consisting of 5MHz channels 149-165.

Each 40MHz wide 5GHz network actually occupies a bit more than 8 real 5MHz channels. In this chart, each small "bump" represents four 5MHz channels.

We fully expect to see a bunch of contention over this in the comments: technically, you can fit four 40MHz wide networks or two 80MHz wide networks on the lower 5GHz band. Practically, consumer gear tends to be extremely sloppy about using overlapping channels (eg, an 80MHz channel centered on 48 or 52), making it difficult or impossible to actually pull off that degree of efficient spectrum use in realistic residential settings.
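
As a quick check of the arithmetic in that claim (channel numbers as listed in the previous paragraph for the US lower band):

```python
# US lower 5GHz band as described above: 20MHz channels 36 through 64, stepping by 4.
low_band_channels = list(range(36, 65, 4))   # 36, 40, 44, 48, 52, 56, 60, 64
total_mhz = 20 * len(low_band_channels)      # 160 MHz of spectrum
print(total_mhz // 40, "non-overlapping 40MHz networks, in theory")
print(total_mhz // 80, "non-overlapping 80MHz networks, in theory")
```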

There are also DFS (Dynamic Frequency Selection) channels in between the two standard US consumer bands, but those must be shared with devices such as commercial and military radar systems. Many consumer devices refuse to even attempt to use DFS channels. Even if you have a router or access point willing to use DFS spectrum, it must adhere to stringent requirements to avoid interfering with any detected radar systems. Users "in the middle of nowhere" may be able to use DFS frequencies to great effect—but those users are less likely to have congestion problems in the first place.

If you live near an airport, military base, or coastal docking facility, DFS spectrum is likely not going to be a good fit for you—and if you live outside the US, your exact spectrum availability (both DFS and non-DFS) will be somewhat different than what's pictured here, depending on your local government's regulations.

Rule 4: Central placement is best

The difference between "router at the end of the house" and "access point in the middle of the house" can be night-and-day.
Jim Salter

Moving back to the "attenuation" side of things, the ideal place to put any Wi-Fi access point is in the center of the space it needs to cover. If you've got a living space that's 30 meters end-to-end, a router in the middle only needs to cover 15m on each side, whereas one on the far end (where ISP installers like to drop the coax or DSL line) would need to cover the full 30m.
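
Using the same distance rule of thumb from earlier (and ignoring walls entirely), the difference is already visible; the -40dBm reference at 1m is again taken from the 2.4GHz curve near the top, so treat this as a rough sketch rather than a site survey.

```python
import math

def estimated_signal_dbm(distance_m, ref_dbm_at_1m=-40.0):
    # 6 dB per doubling of distance, no walls counted here.
    return ref_dbm_at_1m - 20 * math.log10(distance_m)

print(round(estimated_signal_dbm(15)))  # centered router: about -64 dBm at the far walls
print(round(estimated_signal_dbm(30)))  # router at one end: about -70 dBm at the far end
```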

This also applies in smaller spaces with more access points. Remember, Wi-Fi signals attenuate fast. Six meters—the full distance across a single, reasonably large living room—can be enough to attenuate a 5GHz signal below the optimal level, if you include a couple of obstacles such as furniture or human bodies along the way. Which leads us into our next rule...

Rule 5: Above head height, please

Ceiling mount is technically the best option—but if that's too much to ask, just sitting an AP on top of a tall bookshelf can work wonders.
Jim Salter

The higher you can mount your access points, the better. A single human body provides roughly as much signal attenuation as an interior wall—which is part of the reason you might notice Wi-Fi at your house getting frustratingly slower or flakier than usual when many friends are over for a party.

Mounting access points—or a single router—above head height means you can avoid the need to transmit through all those pesky, signal-attenuating meat sacks. It also avoids most large furniture and appliances such as couches, tables, stoves, and bookcases.

The absolute ideal mounting is in the dead center of the room, on the ceiling. But if you can't manage that, don't worry—on top of a tall bookshelf is nearly as good, particularly if you expect the access point in question to service both the room it's in, and the room on the other side of the wall its bookshelf or cabinet is placed against.

Rule 6: Cut distances in halves

Let's say you've got some devices that are too far away from the nearest access point to get a good connection. You're lucky enough to have purchased an expandable system—or you're setting up a new multiple-access point mesh kit, and still have one left—so where do you put it?

We've seen people dither over this, and wonder if they should put an extra access point closer to the first access point (which it has to get data from) or closer to the farthest devices (which it has to get data to). The answer, generally, is neither: you put the new AP dead in the middle between its nearest upstream AP, and the farthest clients you expect it to service.

The key here is that you're trying to conserve airtime, by having the best possible connection both between your far-away devices and the new AP, and between the new AP and the closest one to it upstream. Typically, you don't want to favor either side. However, don't forget Rule 1: two rooms, two walls. If you can't split the difference evenly between the farthest clients and the upstream AP without violating Rule 1, then just place it as far away as Rule 1 allows.
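
A minimal one-dimensional sketch of that placement rule, assuming everything sits along a single line and taking Rule 1's nine meters as the rough maximum reach (real floor plans are obviously messier than this):

```python
def place_new_ap(upstream_ap_m, farthest_client_m, max_reach_m=9.0):
    """Put the new AP halfway between its upstream AP and the farthest client,
    but never farther from the upstream than Rule 1's rough reach allows."""
    midpoint = (upstream_ap_m + farthest_client_m) / 2
    if abs(midpoint - upstream_ap_m) > max_reach_m:
        # Halfway would violate Rule 1: place it as far out as Rule 1 allows instead.
        direction = 1 if farthest_client_m >= upstream_ap_m else -1
        midpoint = upstream_ap_m + direction * max_reach_m
    return midpoint

print(place_new_ap(0.0, 14.0))  # 7.0 m: splitting the difference evenly
print(place_new_ap(0.0, 24.0))  # 9.0 m: halfway would be 12 m, clamped by Rule 1
```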

If this all seems too logical and straightforward, don't worry, there's another irritating "unless-if" to consider: some higher-end mesh kits, such as Netgear's Orbi RBK-50/RBK-53 or Plume's Superpods, have an extremely high-bandwidth 4x4 backhaul connection. Because this connection is much faster than the 2x2 or 3x3 connections client devices can utilize, it might be worth settling for lower signal quality between these units, with a degraded throughput that's still close to the best your client devices can manage.

If your mesh kit offers these very fast backhaul connections, and you absolutely cannot introduce any more APs to the mix, you might actually end up better off putting your last AP closer to the clients than to its upstream. But you'll need to experiment, and pay attention to your results.

Wi-Fi is fun, isn't it?

Rule 7: Route around obstacles

A tightly packed bookshelf is a significant RF obstacle—worth a couple of walls in its own right—even when traversed perpendicularly. Penetrating its length is an absolute no-go.
Jim Salter

If you've got a really pesky space to work with, there may be areas that you just plain can't penetrate directly. For example, our test house has a concrete slab and several feet of packed earth obstructing the line-of-sight between the router closet and the downstairs floor. We've seen small businesses similarly frustrated at the inability to get Wi-Fi in the front of the office when the back was fine—which turned out to be due to a bookshelf full of engineering tomes lining a hallway, resulting in several linear meters of tightly-packed pulped wood attenuating the signal.

In each of these cases, the answer is to route around the obstruction with multiple access points. If you've got a Wi-Fi mesh kit, use it to your advantage to bounce signals around obstructions: get a clear line of sight to one side of your obstacle, and place an access point there which can relay from another angle that reaches behind the obstacle without needing to go directly through it.

With enough APs and careful enough placement, this may even tame early-1900s construction chickenwire-and-lath walls—we've seen people successfully place access points with clear lines of sight to one another through doorways and down halls, when penetrating the walls themselves is a job better suited to a hammer-drill than a Wi-Fi device.

If you've got too many obstacles to successfully route around, over, or under... see rule eight.

Clever AP placement can allow you to route around obstacles you can't punch straight through.
Jim Salter

Rule 8: It's all about the backhaul

Most consumers choose pure Wi-Fi mesh, because it's convenient: you don't have to run any wires, you just plug in a bunch of access points and let them work out the magic between them, no fuss, no muss.

As convenient as this sounds, it's pretty much a worst-of-breed solution. Remember how we talked about congestion in Rules 2 and 3? It's still a problem here. If your client device has to talk to one access point which then has to relay that data to another access point, you're now using slightly more than double the airtime.

Now, this isn't entirely fair—you'd only be using double the airtime if your client device were sitting right where the satellite access point is; and since you followed Rule 6—cut distances in halves—the access point's connections upstream and to the client are much higher quality than the one the client would make directly to the upstream. So even in the absolute worst-case scenario—an access point which has to talk to its client on the same channel it talks to its upstream—this two-way relay can result in less airtime consumption than the client making one, much longer-range, lower-quality connection directly upstream.
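
Putting rough numbers on that trade-off makes the airtime argument clearer. The link rates below are hypothetical, chosen only to illustrate the point, not measurements from this article.

```python
def airtime_seconds(payload_megabytes, link_mbps):
    """Airtime consumed moving a payload across one link, ignoring protocol overhead."""
    return (payload_megabytes * 8) / link_mbps

payload_mb = 100  # hypothetical transfer

direct_poor = airtime_seconds(payload_mb, 20)   # one long, low-quality link at ~20 Mbps
relayed = 2 * airtime_seconds(payload_mb, 200)  # two short, high-quality hops at ~200 Mbps each

print(round(direct_poor), "s of airtime going straight upstream on the bad link")
print(round(relayed), "s total via the relay, even though the data crosses the air twice")
```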

However, it's much, much better to avoid the problem by talking downstream and upstream on separate bands entirely. Dual-band access points can do this by connecting to clients on the 2.4GHz radio and making the upstream (backhaul) connection on the 5GHz radio, or vice versa. In the real world, stubborn client devices (and stubborn users) frequently want to connect in sub-optimal ways, and you end up with clients on both 2.4GHz and 5GHz, so there's no one "clean" channel to backhaul on.

Really smart kits like Eero can work around that by dynamically routing backhaul, minimizing congestion by transmitting on different bands than they are receiving on, even when those bands change. More powerful tri-band kits like Orbi RBK-50/53 or Plume Superpods can avoid the problem by the use of a second 5GHz radio; this allows them to connect to clients on either 2.4GHz or 5GHz, while still having a clean 5GHz backhaul. (In Orbi, the backhaul radio is fixed and dedicated; Plume makes allocation decisions according to what its cloud optimizer decides is the best way to use airtime in that particular environment.)

The best answer, though, is not to use Wi-Fi backhaul at all. If you can run Ethernet cable, you should—not only is it faster than Wi-Fi, it doesn't suffer from Wi-Fi's congestion problems. Under heavy network load, cheap wired access points like Ubiquiti UAP-AC-Lites or TP-Link EAP-225v3s absolutely smoke even the best mesh kits, if the mesh kits are limited to Wi-Fi backhaul only. Wired backhaul can also conveniently overcome RF-opaque obstacles—if you can't punch a signal through it or relay around it, running a cable through it works wonders!

For users who aren't having much luck with mesh Wi-Fi and can't run Ethernet cables, modern powerline gear is also worth a look. Results absolutely vary depending on the quality of the house wiring and even the types of appliances connected, but in most cases, good AV2 (AV1000 or higher) or g.hn powerline gear is extremely reliable, with low latency nearly on par with Ethernet. The actual throughput is sharply limited—realistically, expect no more than 40-80Mbps for most real-world, across-the-house links—but if your killer app is gaming, or just web browsing that feels snappy, powerline can be a much better bet than Wi-Fi.

If you do go the powerline route, though, make certain you read the manual, and take the necessary steps to encrypt your connection. The first time we tested powerline gear, we accidentally bridged our powerline adapters with a neighbor's, and reconfigured his router—which was a similar model to the one on our test network, and had a default password—before realizing our error. "Hi, I hacked your router, sorry about that" is a crappy way to introduce yourself; we don't recommend it.

Rule 9: It's (usually) not about throughput, it's about latency

The great thing about throughput is, it's one great big shiny number that you can get in really easy ways—either by connecting to a speed test site like DSLreports, or by using a tool like iperf3 to connect to a local server.

The crappy thing about throughput is, it's a terrible way to measure either user experience, or the way a Wi-Fi network performs when actually under real load. Most people become unhappy with their Wi-Fi either when web browsing, or when gaming—not when downloading a big file. In both cases, the problem isn't "how many megabits per second can this pipe handle"—it's "how many milliseconds does it take for this action to complete."

Although it's possible to see a busy network's performance degrading by looking at "speed" numbers falling all around the network, it's a lot more confusing, complicated—and unrelated to the real world—than looking at application latency, which is a function of both raw speed and how efficiently the network manages its traffic and airtime.
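
A minimal sketch of measuring application latency rather than raw throughput; the URL is a placeholder, and a test like the one described below would also keep competing traffic running in the background while these timings are taken.

```python
import statistics
import time
import urllib.request

def fetch_ms(url):
    """Time one full page fetch, in milliseconds."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as response:
        response.read()
    return (time.monotonic() - start) * 1000

URL = "http://example.com/"  # placeholder: point this at the page you actually care about
samples = [fetch_ms(URL) for _ in range(20)]

print("median page-load ms:", round(statistics.median(samples)))
print("worst page-load ms: ", round(max(samples)))
```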

When we test Wi-Fi networks, our killer metric is application latency, as measured by how long it takes to load a simulated, fairly complex web page. More importantly, it's how long it takes to load those web pages with lots of other things going on at the same time. Remember how we covered congestion in Rules 2 and 3? A "really fast" network with a single device active can turn into a bog-slow nightmare with many devices active—or, in many cases, with even one really poorly-connected device active. Which leads us nicely into our final rule.

The corollary to Rule 9 is, the AC speed rating is garbage—you should trust thorough, technically competent reviews far more than you trust a manufacturer's AC speed rating on a box.

Rule 10: Your Wi-Fi network is only as fast as its slowest connected device

One device with a crappy connection can kill the quality of the network for all connected devices—not just itself.
Jim Salter

Unfortunately, one person struggling to watch a YouTube video in "that one bedroom with the crappy Wi-Fi" isn't just having a bad experience themselves—their bad time is bringing everybody down. All by itself, a phone in the same room with its associated access point might only need 2.5 percent of the available airtime to stream a 1080P YouTube video at 5Mbps. But a phone in "the bad bedroom," struggling with buffering and slowdowns, can consume 100 percent of the network's airtime trying—and failing!—to watch the same video.
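
The airtime arithmetic behind those figures is simple; the 200Mbps usable-link figure for the nearby phone is an assumed, illustrative number consistent with the 2.5 percent quoted above.

```python
def airtime_fraction(stream_mbps, usable_link_mbps):
    """Share of total airtime a stream needs, capped at all the airtime there is."""
    return min(stream_mbps / usable_link_mbps, 1.0)

print(airtime_fraction(5, 200))  # nearby phone on a fast link: 0.025, i.e. 2.5% of airtime
print(airtime_fraction(5, 4))    # "bad bedroom" phone on a ~4 Mbps link: 1.0, all of it, and still buffering
```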

Of course, streaming is very download-intensive, and routers or access points will typically refuse to transmit 100 percent of the time. An AP with lots of data to send will generally leave a little bit of airtime available for other devices to "speak up" and request their own data, after which it splits up download airtime between the nearby device and "the bad bedroom" in order to try to fulfill both their requests. Still, that adds tens or hundreds of milliseconds of waiting before the access point leaves those other devices a window—and they still have to compete with one another when such a window opens.

It gets even worse if the user in "the bad bedroom" tries to upload a video, send an email, or post a big photo to social media. The router tries to leave some airtime open for other devices to speak up—but the user's phone is under no such restraint, and it will cheerfully eat up every bit of airtime it can. Worse, the phone has no idea how much data other users may have requested in any brief windows they had to make requests. The router knows how much data needs to be delivered to each client individually, so it can allocate airtime for downloading data appropriately—but all the phone knows is that it wants to get this stuff uploaded, so everybody's experience is awful while it does. So if you walk away from all this with just one rule in mind, this should probably be it.

https://news.google.com/__i/rss/rd/articles/CBMib2h0dHBzOi8vYXJzdGVjaG5pY2EuY29tL2dhZGdldHMvMjAyMC8wMi90aGUtYXJzLXRlY2huaWNhLXNlbWktc2NpZW50aWZpYy1ndWlkZS10by13aS1maS1hY2Nlc3MtcG9pbnQtcGxhY2VtZW50L9IBAA?oc=5

2020-02-23 14:30:00Z

Samsung Galaxy S20 Ultra: 120Hz vs 60Hz Battery Life Comparison - PhoneArena

The Galaxy S20 Ultra comes with a special screen that can refresh twice as fast as a traditional smartphone display: it supports a 120-Hertz refresh rate (meaning it refreshes 120 times each second, compared to 60 times for traditional screens). But what effect does 120 Hertz have on battery life?

In this Galaxy S20 Ultra 120Hz vs 60Hz battery drain test, we compare the effect of the higher refresh rate on that massive, 5,000mAh battery inside Samsung's flagship for 2020.

We have already run a number of tests, and one thing is clear: 120 Hertz does make a difference that you notice even when just navigating around the phone and browsing the web. The higher refresh rate will also be a huge benefit for gamers, allowing faster reactions, but we tested half a dozen games and none supported the new 120Hz option yet (support is promised to come soon, though). The one area where you will not notice any difference with this new technology is video: YouTube and most other platforms support video recorded at up to 60 frames per second (most of it is shot at 30fps or 24fps), which looks perfectly smooth on a traditional 60 Hertz screen, so the new 120 Hertz option will not make a difference there.

With all of this in mind, to measure the battery life difference between a 120Hz experience vs a 60Hz one, we turn to our browsing battery test. Before we tell you the results, though, let us mention that we have the Exynos 990 version of the S20 Ultra and we are using the phone in the "Optimized" battery mode and the 1080p screen resolution (120Hz is not supported yet in the maximum 1440p resolution).

S20 Ultra 120Hz vs 60Hz Battery Drain Test Comparison

  • Samsung Galaxy S20 Ultra 60Hz Battery Drain Test result: 12 hours 23 minutes
  • Samsung Galaxy S20 Ultra 120Hz Battery Drain Test: 10 hours 2 minutes
  • Samsung Galaxy Note 10 Plus: 11 hours 37 minutes
  • Samsung Galaxy S10 Plus: 10 hours 33 minutes
  • Apple iPhone 11 Pro Max: 12 hours 53 minutes

These results answer the question pretty unequivocally: yes, there is a BIG difference between 120Hz and 60Hz not just on how smooth the phone performs, but also on battery life.

At 60 Hertz, the S20 Ultra is among the very best phones when it comes to battery life, but if you switch to 120 Hertz, you get a battery score that is lower by about 2 hours and 20 minutes. That means you get nearly 20% worse battery life when you switch to 120 Hertz.
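
For reference, those percentages follow directly from the two drain-test results listed above; a quick check:

```python
minutes_60hz = 12 * 60 + 23   # 743 minutes at 60Hz
minutes_120hz = 10 * 60 + 2   # 602 minutes at 120Hz

drop = minutes_60hz - minutes_120hz
print(drop, "minutes shorter")                            # 141 minutes, about 2 hours 20 minutes
print(round(100 * drop / minutes_60hz), "percent worse")  # about 19%, i.e. "nearly 20%"
```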

That is a considerable difference. At 60 Hertz, the S20 Ultra ranks among the longest lasting phones we have ever tested, but when you switch to 120 Hertz battery life falls so much that it will rank slightly above the middle of the charts. It's up to you to decide whether this trade off is worth it.

What this test also shows, however, is that at 60 Hertz, Samsung has managed to create a phone that is an absolute battery beast, delivering longer battery life than the Note 10+ and the S10+, but it falls just slightly short of the iPhone 11 Pro Max on the browsing test.

We will be testing the battery life on the S20 Ultra in a lot more detail and expect to see a proper battery comparison against other phones coming very soon!

https://news.google.com/__i/rss/rd/articles/CBMicmh0dHBzOi8vd3d3LnBob25lYXJlbmEuY29tL25ld3MvU2Ftc3VuZy1HYWxheHktUzIwLVVsdHJhLTEyMEh6LXZzLTYwSHotQmF0dGVyeS1MaWZlLURyYWluLVRlc3QtQ29tcGFyaXNvbl9pZDEyMjQ2NdIBAA?oc=5

2020-02-23 09:47:00Z

Ninja's Twitter account was hijacked - Engadget

Jorge Lemus/NurPhoto via Getty Images

Tech giants and sports organizations aren't the only ones wrestling with high-profile Twitter account hijacks. An intruder compromised the account of well-known streamer Ninja (aka Tyler Blevins) midday on February 22nd, trying to use the opportunity to rack up followers, start a beef with Fortnite star Tfue and complain when an account (possibly the perpetrator's) was inevitably suspended. The attacker even tried to extort Ninja's wife and business partner, Jessica Blevins, though this clearly wasn't her first time dealing with a wannabe hacker -- she said the intruder "lasted five minutes."

Not surprisingly, Ninja wasn't fazed either. Besides deleting the tweets, he posted a video (below) blasting an "irrelevant" person for grasping in vain for popularity. "Same script every time," he said, suggesting this wasn't a particularly sophisticated hijack.

The incident wasn't the first for Ninja. In July of last year, scammers compromised his Instagram account and pushed bogus giveaways. This makes it clear that he's a high-profile target, though, and underscores how it's still relatively easy to deface accounts even when their owners likely take security seriously. Until social media accounts are airtight, you can expect similar attacks for a long while.

All products recommended by Engadget are selected by our editorial team, independent of our parent company. Some of our stories include affiliate links. If you buy something through one of these links, we may earn an affiliate commission.

https://news.google.com/__i/rss/rd/articles/CBMiQ2h0dHBzOi8vd3d3LmVuZ2FkZ2V0LmNvbS8yMDIwLzAyLzIyL25pbmphLXR3aXR0ZXItYWNjb3VudC1oaWphY2tlZC_SAUdodHRwczovL3d3dy5lbmdhZGdldC5jb20vYW1wLzIwMjAvMDIvMjIvbmluamEtdHdpdHRlci1hY2NvdW50LWhpamFja2VkLw?oc=5

2020-02-23 05:36:14Z

Saturday, 22 February 2020

What Sony's history of backward compatibility tells us about PS5 - TechRadar India

So, the PS5 is on its way, and it’s going head-to-head with the equally next-gen Xbox Series X console. And while it’s tempting to look at the hefty PS4 sales figures as a sign that Sony’s dominance will continue, there is one area that the Xbox One clearly overtook, and that’s backward compatibility.

What’s that, you ask? Backward compatibility is the ability of a console to play games published on prior platforms. Given how many games are published these days, it’s a slightly more daunting task than it used to be, and it’s telling that Sony largely wiped its hands of that kind of functionality years ago – even as Microsoft ensured its Xbox One consoles were still capable of playing hundreds of Xbox 360 titles.

There’s a clear financial incentive to not supporting backward compatibility: if a gamer can’t use an old disc on a new console, they’re likely to buy the game afresh, often paying more than before for a remastered version that’s been optimized for superior hardware.

For those of us without oodles of cash to spend, though, it can feel mean-spirited. And the issue of backward compatibility has clearly struck a chord with Sony in some way, as we know the PS5 will feature a whole load of backward compatibility for PS4 games.

That’s exciting, of course: it means you won’t be scrabbling around for PS5 games to play when you get the next-gen console into your home. Just stick in a disc or load a downloaded game from your PS4 library!

Sony’s history of backward compatibility, however, doesn’t necessarily inspire confidence that this trend is set to last – or that you’re getting quite what you might be hoping for.

PS2: technical difficulties

The PS2 remains to this day the world's bestselling console, by any manufacturer. Having launched in 2000, it went on to have unprecedented success – and it probably didn’t hurt that the original PS2 could play most of the PS1 games published on the prior console.

There were a smattering of PS1 games that didn’t make the transition seamlessly, with bugs and glitches affecting titles such as Final Fantasy Anthology, Monkey Hero, and Mortal Kombat Trilogy (via PlayStation).

But the philosophy was clear: you shouldn’t need to say goodbye to your favorite games for good, or not have a way to play them again if your old console went kaput.

The PS2 Slim, however, changed things. Ensuring old games work on newer consoles requires work, and that workload was getting bigger the longer developers were pushing out games for the console, and the more that the PlayStation platform’s architecture changed with each new machine. 

The Slim version of the console, released in 2004, had an even bigger list of titles it struggled to play, including Worms and various NHL games from the PS1, and even some PS2 titles such as Tomorrow Never Dies and Tiger Woods PGA Tour (via PlayStation).

There were plenty of new games being released, of course, but these issues paved the way for Sony’s acceptance that not every game would make its way onto a new console.

PS3: the beginning of the end

You may not remember this, but the PS3 had pretty excellent backward compatibility – for its original 20GB and 60GB models at least. 

These models played most PS1 and PS2 discs, bridging three different generations of games, along with the option to download these titles on the PlayStation Store – a first for Sony’s consoles on both counts.

However, this compatibility wasn’t cheap, and did drive up the cost of the console – requiring dedicated hardware parts to read the PS2 discs, not to mention increased time spent on development of the console.

Part of the reason the successive PS3 Slim was smaller and cheaper was the removal of this functionality, which paved the way for the current generation console’s stance on backward compatibility: don’t do it at all.

PS4: streaming service, not fan service

That’s right: the PS4 did not (and does not) support PS3 discs, or any before it.

This is partially due to Sony’s interest in game streaming, with its paid PS Now service enabling subscribers to access a library of several hundred legacy titles without having to own a disc or keep space for them on a hard drive. That’s all fine in theory, but the service hasn’t been without its problems, and it doesn’t get around the issue of gamers having to pay to replay games they’ve already owned before.

PS5: an uncertain future

What does all this mean? We know the PS5 will have backward compatibility for the majority of PS4 games, meaning your discs and downloads won’t be consigned to history… yet.

But Sony’s previous pattern suggests this might get technically harder to keep up, as well as financially inadvisable – especially if it wants to really push its PS Now streaming service in the long term.

It’s possible that a mid-cycle upgrade (say, a PS5 Slim) may drop some of this functionality, or backwards compatibility itself could be hidden behind a paywall, either packaged within PS Plus or as a standalone purchase.

This might be naysaying, as the PS5 will also be the most powerful console Sony has built, and that might mean it doesn’t run into the same problems as previous generations of hardware.

But if we take a long term view of the PlayStation console, we can’t be sure that backward compatibility will be both available and free forever on the PS5.

https://news.google.com/__i/rss/rd/articles/CBMiXmh0dHBzOi8vd3d3LnRlY2hyYWRhci5jb20vbmV3cy93aGF0LXNvbnlzLWhpc3Rvcnktb2YtYmFja3dhcmQtY29tcGF0aWJpbGl0eS10ZWxscy11cy1hYm91dC1wczXSAWJodHRwczovL3d3dy50ZWNocmFkYXIuY29tL2FtcC9uZXdzL3doYXQtc29ueXMtaGlzdG9yeS1vZi1iYWNrd2FyZC1jb21wYXRpYmlsaXR5LXRlbGxzLXVzLWFib3V0LXBzNQ?oc=5

2020-02-22 11:00:00Z

Samsung Galaxy S20 Ultra benchmarks: The new Android phone to beat - Tom's Guide

The Galaxy S20 Ultra still can't top Apple's latest iPhones for performance. But the gap between the leading Android phone and Apple's pace-setting flagships is narrower than it was before Samsung's latest phone came along.

That's our takeaway after we had a chance to benchmark the Galaxy S20 Ultra, which is now available for pre-order in advance of its arrival in stores on March 6 along with the Galaxy S20 and Galaxy S20 Plus.

The Galaxy S20 Ultra, like the other members of the S20 family, runs on a Snapdragon 865 system-on-chip from Qualcomm. The Kryo 585 CPU in this new chipset promises a 25% performance improvement over last year's Snapdragon 855 along with a 25 percent boost in power efficiency. Qualcomm says to expect a 25% improvement in graphics rendering over the previous generation from the Adreno 650 GPU included with the Snapdragon 865.

Along with the faster processor, the Galaxy S20 Ultra also benefits from 12GB of RAM. Last year's Galaxy S10 and Galaxy S10 Plus featured 8GB in their base models (though you could pay up for an S10 Plus with 12GB of memory if you wanted).

Our testing definitely shows that the Snapdragon 865-powered Galaxy S20 Ultra delivers the best performance ever in an Android phone, beating last year's pace-setting devices quite handily in most benchmarks. And while the A13 processor Apple uses in its iPhone 11 lineup still has the better numbers, the Galaxy S20 Ultra is at least in the same ballpark. (Previous Android phones were lucky to be in the parking lot outside the ballpark.)

Here's a closer look at our Galaxy S20 Ultra benchmarks.

Galaxy S20 Ultra benchmarks: Geekbench 5

Geekbench 5 is a good indicator of a phone's overall performance, and the Galaxy S20 Ultra turned in standout numbers on this test. Samsung's new phone tallied a single-core score of 805 and a multicore result of 3,076.7. Compare that to the Galaxy Note 10 Plus, which features a Snapdragon 855 processor while matching the S20 Ultra's 12GB of RAM — Samsung's older phone had a single-core score of 736 and a multicore result of 2,691. That means the Galaxy S20 Ultra improved on those numbers by 9% and 14%, respectively.

Phone (processor) | Geekbench 5 single-core score | Geekbench 5 multicore score
Galaxy S20 Ultra (Snapdragon 865) | 805 | 3,076.7
Galaxy Note 10 Plus (Snapdragon 855) | 736 | 2,691
iPhone 11 Pro Max (A13 Bionic) | 1,334 | 3,517
Galaxy Z Flip (Snapdragon 855 Plus) | 752 | 2,685
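
Those improvement figures follow directly from the scores in the table above; a quick check:

```python
s20_single, s20_multi = 805, 3076.7
note10_single, note10_multi = 736, 2691

print(round(100 * (s20_single / note10_single - 1)), "% single-core improvement")  # about 9%
print(round(100 * (s20_multi / note10_multi - 1)), "% multicore improvement")      # about 14%
```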

The performance gains are bigger when you compare the Galaxy S20 Ultra to a phone without as much memory. The Pixel 4 XL features a comparatively modest 6GB of RAM to go along with its Snapdragon 855 chipset. Google's phone produced a multicore score of 2,582, so the Galaxy S20 Ultra improved upon that result by 19%.

What the Galaxy S20 Ultra can't do is match the numbers produced by phones running on Apple's A13 Bionic processor. When we ran Geekbench 5 on the iPhone 11 Pro Max, Apple's phone produced a single-core score of 1,334, well ahead of the Galaxy S20 Ultra. The iPhone 11 Pro Max's multicore score of 3,517 is nearly 13% better than the Galaxy S20 Ultra's numbers.

The Galaxy S20 Ultra's numbers didn't match the higher scores we got when we tested a Snapdragon 865-powered reference device in December, though we tested that device in a mode that prioritized performance over battery life. That phone got within 2% of the iPhone's Geekbench 5 score. We imagine Samsung did some tweaking to the chipset so that it could deliver solid performance while still managing to keep the 6.9-inch phone powered up.

The story here, though, is how much better the Galaxy S20 Ultra compares to the iPhone relative to last year's top Android phones. The OnePlus 7T, for example, produced one of the best multicore results we saw from an Android device in 2019 at 2,759, but the iPhone 11 Pro Max still outperformed it by 27%. The Galaxy S20 shortens that lag considerably.

Galaxy S20 benchmarks: Adobe Rush 

We saw more evidence of the gains that Samsung has made in a real-world test we like to perform using Adobe Rush. In this test, we time how long it takes to transcode a 4K video to 1080p after applying an effect and transition.

Apple's phones historically smoke all comers in this test, with the iPhone 11 Pro Max taking just 45 seconds to complete the job. And that's not a number the Galaxy S20 Ultra can match, as it finished the process in 1 minute, 16 seconds.

Still, that's a solid result for the Galaxy S20 Ultra when you consider the track record of leading Android phones on our test. The Pixel 4 takes 1 minute, 31 seconds to transcode that video clip, while the Note 10's time is three seconds slower than that. So the newer processor and extra RAM in the Galaxy S20 Ultra helped it shave 15 to 18 seconds off the time of last year's flagship Android handsets.

Galaxy S20 benchmarks: Graphics tests

As in the rest of our benchmarks, the Galaxy S20 Ultra showed decent gains over last year's top Android phones, though the iPhone continues to be at the front of the pack. In GFXBench's Aztec Ruins Vulkan test (offscreen), the S20 Ultra produced 1,319 frames, or close to 21 frames per second. The iPhone 11 Pro Max was far ahead with 1,657 frames, or 25 fps.

Phone (processor) | GFXBench Aztec Ruins Vulkan
Galaxy S20 Ultra (Snapdragon 865) | 1,319 (20.7 fps)
Galaxy Note 10 Plus (Snapdragon 855) | 1,058 (15 fps)
iPhone 11 Pro Max (A13 Bionic) | 1,657 (25 fps)
Galaxy Z Flip (Snapdragon 855 Plus) | 1,124 (17 fps)
OnePlus 7T (Snapdragon 855 Plus) | 1,169 (18 fps)

But compare the Galaxy S20 Ultra's numbers to those from other Android flagships. Both the OnePlus 7T and Galaxy Z Flip use the graphics-boosting Snapdragon 855 Plus chipset, but their respective scores of 1,169 and 1,124 frames were both behind the S20 Ultra's results. And the Note 10 Plus lagged the field with 1,058 frames, or 15 fps.

Outlook

You'd expect a new flagship phone to top last year's models quite handily, and on that front, the Galaxy S20 delivers. And while Samsung's phone is a better match for the top iPhone, some might have expected a $1,399 phone to narrow the performance gap even further — especially since the A14 chipset that's likely to power this fall's iPhone 12 models will set a new standard.

Still, the gains we've seen the Galaxy S20 Ultra make in some of our real-world tests are encouraging. And given the major camera improvements Samsung has introduced to the S20 Ultra, performance is just part of the picture for justifying this phone's four-figure cost.

https://news.google.com/__i/rss/rd/articles/CBMiOmh0dHBzOi8vd3d3LnRvbXNndWlkZS5jb20vbmV3cy9nYWxheHktczIwLXVsdHJhLWJlbmNobWFya3PSAT5odHRwczovL3d3dy50b21zZ3VpZGUuY29tL2FtcC9uZXdzL2dhbGF4eS1zMjAtdWx0cmEtYmVuY2htYXJrcw?oc=5

2020-02-22 07:00:00Z

PS5 news: PlayStation 5 could get its own Amazon Alexa-style AI - Tom's Guide

If half the rumours and leaks are to be believed, the PS5 is set to come with a lot of cool stuff. From state-of-the-art controllers with haptic feedback to 8K graphics and silky-smooth framerates, the PS5 is going to be an incredible bit of hardware.

The latest tidbit of information shows Sony is working on something special for its next-generation console: a smart speaker-style AI that you can address as "PlayStation", which will not only perform basic commands, but work across games to help you when you're stuck on certain parts of any given level. Consoles are about to get a whole lot smarter.

A patent filed by Sony and published last month on Patentscope shows what Sony is calling an "in-game resource surfacing platform". The platform takes the form of an AI, which can be seen in the diagram below. Players are shown asking "PlayStation, how do I defeat the boss?" and PlayStation will respond with strategies other players have used to beat the same level or skip past this point in the challenge. 

This reveals several key features about the console. For one thing, connection will be as important to the PS5 as it was supposed to be for the PS4. In the age of live-streaming games and dedicated "share" buttons, it looks like all PS5s will be connected, and PlayStation is planning on harvesting data from all its new consoles, aggregating it and using it as a resource to help players through difficult portions of games. 

PS5 AI (Image credit: Sony Interactive Entertainment)

However, there are two very troubling words in this picture: "buy now". All this information may be aggregated, but accessing it looks like it's set to cost money. Whether it's new weapons or equipment for a character or a cache of information about the game that will help you better navigate it, having it hidden behind a paywall sets a troubling "pay to win" precedent.

Back in 2017, EA got into hot water over its Star Wars: Battlefront 2 microtransactions. EA set some of Star Wars' most famous characters, such as Darth Vader and Luke Skywalker, as unlockable playable heroes. The points gamers needed to "earn" in order to unlock them were ludicrously high, but EA had a helpful "pay to unlock" system, which charged gamers real-world money to unlock part of a game they had already paid for.

The resulting backlash, including the most downvoted comment in Reddit history, forced EA to release an update to change the game, but the damage had already been done. We've seen this trend gradually increase with the rise of downloadable content over the last 15 years of gaming, but creating a "pay to win" environment would be the absolute worst thing Sony could do with the PS5. It could destroy gamers' trust for a generation. 

The Force was not strong with EA's lazy business decisions (Image credit: EA Games)

While all we can see so far is using the smart AI to recommend information like a smart speaker, a PlayStation interface like Amazon's Alexa or Microsoft's Cortana makes perfect sense to add to the console. With around eight months to go until its prospective "holiday 2020" launch date, it's also something that could feasibly be added to the console in time. 


https://news.google.com/__i/rss/rd/articles/CBMiXWh0dHBzOi8vd3d3LnRvbXNndWlkZS5jb20vbmV3cy9wczUtbmV3cy1wbGF5c3RhdGlvbi01LWNvdWxkLWdldC1pdHMtb3duLWFtYXpvbi1hbGV4YS1zdHlsZS1hadIBYWh0dHBzOi8vd3d3LnRvbXNndWlkZS5jb20vYW1wL25ld3MvcHM1LW5ld3MtcGxheXN0YXRpb24tNS1jb3VsZC1nZXQtaXRzLW93bi1hbWF6b24tYWxleGEtc3R5bGUtYWk?oc=5

2020-02-22 06:04:00Z