I did a good amount of gaming via Parsec and AWS this summer.
When it works, it's great. I can absolutely see this being the future of gaming as home bandwidth continues to improve, and more datacenters get built near major population centers.
It is _very_ sensitive to the connection, though. This summer, I was about 11 milliseconds away from an AWS datacenter with a 500 megabit connection. Buttery smooth perfection. No video compression artifacts. Now, I am about 60 milliseconds away with a 50 megabit connection and it is unplayable. 50 megabits/60 milliseconds is a good connection by US broadband standards these days!
Anyone who's written game netcode (either as a hobby or professionally) knows that you build the game design and game engine from the ground up to tolerate latency.
For action games, most of the time it's about building a game design where you're predicting (either via physical location or other players' actions), except in the rarer cases that do time-rewinding (most fighting games, and some FPSes that combine both approaches, most notably Counter-Strike). The prediction side is usually handled by dead reckoning[1].
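To make that concrete, here's a minimal dead-reckoning sketch; the names and the blend factor are illustrative, not taken from any particular engine. The client extrapolates a remote player's position from the last snapshot's position and velocity, then eases toward the authoritative state when a fresh update arrives.

```python
# Minimal dead-reckoning sketch; names and constants are illustrative.
from dataclasses import dataclass

@dataclass
class Snapshot:
    x: float
    y: float
    vx: float
    vy: float
    timestamp: float  # server time (seconds) when this state was captured

def predict(snap: Snapshot, now: float) -> tuple[float, float]:
    """Extrapolate position assuming constant velocity since the last snapshot."""
    dt = now - snap.timestamp
    return snap.x + snap.vx * dt, snap.y + snap.vy * dt

def reconcile(rendered: tuple[float, float],
              authoritative: tuple[float, float],
              blend: float = 0.2) -> tuple[float, float]:
    """Ease the rendered position toward the server's answer instead of snapping."""
    rx, ry = rendered
    ax, ay = authoritative
    return rx + (ax - rx) * blend, ry + (ay - ry) * blend
```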
For large-scale, low-bandwidth games (AKA RTSes and the like, where the gamestate is deterministic), it's handled via lockstep[2]: the gamestate is 100% deterministic and all clients advance together with a shared set of inputs, in "lock step".
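A rough lockstep sketch, assuming hypothetical `net` and `sim` objects standing in for a real network layer and a deterministic simulation: no client advances tick N until every player's input for tick N has arrived, so identical inputs plus determinism keep everyone's gamestate in agreement.

```python
# Lockstep sketch; `net` and `sim` are hypothetical stand-ins.
def lockstep_loop(net, sim, num_players: int) -> None:
    tick = 0
    while sim.running:
        inputs = net.inputs_for_tick(tick)   # whatever has arrived so far
        if len(inputs) < num_players:
            continue                         # stall rather than let clients diverge
        sim.advance(tick, inputs)            # same inputs everywhere -> same state everywhere
        tick += 1
```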
Both of these approaches can tolerate latencies up to ~600ms (back in the ol' 28.8/56k days of '97, SubSpace[3] was doing ~300 players in one zone with a high skill curve and robust netcode). They usually mask the latency with client-side reactions that are then reconciled with the server in a robust way. If you're just pushing video frames over a network stack, none of that is available to you, no matter how fancy your FEC or other tricks are.
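The time-rewind flavor mentioned above (Counter-Strike-style lag compensation) looks roughly like this sketch, assuming a server that stores recent world snapshots; the `shot` fields and the hit-test call are hypothetical.

```python
# Lag-compensation sketch: the server keeps a short history of world snapshots
# and rewinds to the shooter's view time when validating a hit.
from collections import deque

class SnapshotHistory:
    def __init__(self, max_snapshots: int = 64):
        self._history = deque(maxlen=max_snapshots)  # (server_time, world_state)

    def record(self, server_time: float, world_state) -> None:
        self._history.append((server_time, world_state))

    def closest_to(self, view_time: float):
        """Return the stored world state nearest the time the client actually saw."""
        return min(self._history, key=lambda entry: abs(entry[0] - view_time))[1]

def validate_shot(history: SnapshotHistory, shot, rtt_s: float, now: float) -> bool:
    # The shooter saw the world roughly half an RTT (plus interpolation delay) ago.
    view_time = now - rtt_s / 2.0 - shot.interp_delay          # hypothetical field
    past_world = history.closest_to(view_time)
    return past_world.ray_hits(shot.origin, shot.direction)    # hypothetical hit test
```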
Somehow I've now got the urge to go dust off the Continuum client again and boot up SubSpace.
While you are absolutely correct, I believe the target market for Stadia is more people like my parents, who used to play casual games 10 years ago and then got too busy. They cannot justify owning a console and purchasing $60 games. But they'd be easy to sell on a $5 monthly games on demand subscription.
They will be playing with bluetooth gamepads (5ms latency) on their TV (10-20ms latency) using Wifi (5-10ms latency), so the internet streaming delay of <10ms from an edge server will be barely noticeable.
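Summing those figures (a back-of-the-envelope sketch; every number is just the estimate from the sentence above):

```python
# Rough local latency budget from the estimates above.
gamepad_ms   = 5           # Bluetooth controller
tv_ms        = (10, 20)    # TV processing
wifi_ms      = (5, 10)     # local WiFi hop
streaming_ms = 10          # edge-server streaming delay, upper bound

low  = gamepad_ms + tv_ms[0] + wifi_ms[0] + streaming_ms   # 30 ms
high = gamepad_ms + tv_ms[1] + wifi_ms[1] + streaming_ms   # 45 ms
print(f"local chain alone: roughly {low}-{high} ms")
```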
For example, Stadia is featuring "Lara Croft and the Temple of Osiris" which is a perfect game for high-latency unskilled casual play.
I thought one still has to buy/rent games on top of the $5/month. The $5 is only the fee for renting the "cloud hardware". Maybe some games are included, but it's definitely not comparable to a Netflix for games. I guess it's more like a Disney+?
Stadia is free if you're happy with 1080p, and you purchase games on top. The pro subscription is around $10/month, but adds 4k res and a few new games every month.
Oh yeah, I don't doubt there's a market for this, but I don't think you'll see it take over the same way that, say, Netflix did for VoD.
(FWIW, I heavily use Steam's streaming client, so I'm pretty familiar with most of the failure modes; it doesn't work great for everything, but it's convenient when the game style and network performance overlap.)
But what benefit is there compared to playing a game on a phone or PC?
Look at how people use JavaScript for client-side code but Go or Rust for the back end: the service provider pays for back-end resources, and they're always going to be niggardly when it comes to CPU, GPU, etc.
You'd think that "serious gamers" would obsess over the merits of games (mechanics, plot, level design, gameplay, etc), not the technical minutiae of framerates and input lag. That term always seemed odd to me.
If you're, say, playing a competitive FPS with a team (e.g. Counter-Strike), you will obsess over game mechanics and latency and input lag, because after 10 hours of proficiency you'll be able to notice when you're getting packet loss, when your spouse is using all the bandwidth, or when your crappy router with bufferbloat is adding extra latency.
Generally, I agree. But to me, input lag and performance do have an impact.
It's like watching a cam rip of a movie with choppy audio. You can still admire the film, but it's not going to be a pleasant experience compared to watching it in the cinema.
If the game is unpleasant to play because of input lag, and the player cannot react to what is happening on screen because their response arrives half a second or more too late, they'll have trouble appreciating all the other stuff.
We're not talking about input or rendering latency here; networked games are designed so that even when you have 100-300ms of round-trip latency, either the objective of the game is set up such that your success is based on "predicting" events, or the server keeps all the disparate time domains in memory and can time-rewind to resolve the authoritative game state.
Input lag causes both unresponsiveness and the need to predict. The former is obvious; a simple example of the latter: in an RTS/MOBA (whether rollback or lockstep netcode), the whole region the opponent could occupy by the next tick becomes your target. Its size depends on the latency, and if the RTT is big enough, the opponent's possible reactions become an additional factor.
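As a back-of-the-envelope for that region (the numbers are purely illustrative, not from any specific game): the opponent can be anywhere within a circle whose radius is their top speed times the round-trip time.

```python
# Illustrative: how big the "where could they be?" region gets with latency.
import math

def uncertainty_radius(max_speed: float, rtt_ms: float) -> float:
    """Radius of the circle the opponent can reach before your command lands."""
    return max_speed * (rtt_ms / 1000.0)

r = uncertainty_radius(350.0, 120.0)      # e.g. 350 units/s top speed, 120 ms RTT
print(f"radius ~{r:.0f} units, area ~{math.pi * r * r:.0f} sq units")
```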
> Latency is the most important thing. There's a reason high refresh monitors are loved by serious gamers.
The biggest improvement with 144Hz is motion clarity, but you can cheese it with tricks like strobing or Black Frame Insertion (BFI). I keep a CRT around for this reason.
> I cannot play a game with noticeable input lag whatsoever, even if it was at 4k HDR 144hz.
People are surprisingly tolerant of different latencies.
Many Vsync implementations end up with 4-5 frames of latency, which ends up around 75ms, and consoles that run at 30FPS (or 20 FPS like some N64 classics!) can be in excess of 100ms. If the game is designed around it, people will adapt (just like people adapt to 24fps movies).
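The frame-to-milliseconds arithmetic behind those figures (just the conversion, nothing game-specific):

```python
# Convert frames of latency to milliseconds at a given refresh/frame rate.
def frames_to_ms(frames: float, fps: float) -> float:
    return frames * 1000.0 / fps

print(frames_to_ms(4, 60), frames_to_ms(5, 60))   # ~66.7 and ~83.3 ms (Vsync pipeline)
print(frames_to_ms(3, 30), frames_to_ms(2, 20))   # 100 ms at 30 FPS, 100 ms at 20 FPS
```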
I agree though: if you want games to feel like your input is physically connected, 10-20ms is what most folks in the VR industry are targeting.
As a nearly daily user of Google Stadia I have to disagree. I use Stadia in two different households, one with a 50 Mbps connection (where Stadia works perfectly) and one with a connection capped at 10 Mbps down and 1 Mbps up. The 10 Mbps connection only allows for 720p gameplay and I get frequent stuttering, every 5 minutes or so, but this doesn't really ruin the experience with most story-driven games like Red Dead Redemption 2 or AC Odyssey.
At first I was surprised by the playability of Stadia on such a slow connection myself, but even though I cannot play shooters, I wouldn't go as far as to call the experience I get with the 10 Mbps connection 'unplayable'.
I honestly want a map that overlays homes and apartments with Internet connection speed and the main Internet nodes/routers/datacenters. If remote work is a big part of the future, then prioritizing optimal Internet connection parameters as part of a moving decision seems rational.
I suspect such a map doesn't exist, but if you want an optimal Internet connection, what you're looking for is:
a) in a metro area that has a major internet exchange. Hurricane Electric's list of PoPs is a decent start (http://he.net/ip_transit.html), though some of those locations are better connected than others. In the US, the prime exchanges are really LA, SF/SJ, Seattle, Reston, New York, Miami, Dallas, and Chicago. If you settle near a lesser exchange like, say, Denver, you'll probably have good connectivity to many networks, but many services don't have a datacenter in Denver. Also, some companies are trying to put their big datacenters farther north to save on cooling and energy costs, so bias north if possible; however, if you want excellent connectivity to South America, much of that goes through Miami.
If you're in Canada, Vancouver or Toronto are your best bets. In South America, Sao Paulo or Bogota (but note, connectivity is sparse between countries; historically it all went through Miami, though that's changed a bit). In Europe, Amsterdam, London, or Frankfurt? In Asia, Singapore or Taipei. In Africa, Johannesburg. In AU/NZ, reportedly it kind of all sucks, but Melbourne/Sydney/Auckland are OK-ish.
b) not on DSL; it's usually run with settings that guarantee you 20 ms of ping to the first hop. Be careful, because some DSL providers imply fiber to the home but really mean fiber to the DSLAM (or whatever it's called today).
c) fiber is better than cable, but cable is ok as long as the company isn't incompetent. Comcast is much maligned, but their network is actually pretty good; local areas could be mismanaged though.
d) you need to actually put potential addresses into the sign-up-for-service pages. Put in the one you're looking at, as well as the neighbors'; if you get prompts like 'you've already got service, would you like to manage it?', that's a good sign they probably actually service the address. Speeds listed for addresses that are current customers are more likely to be accurate as well. If there's a public utility fiber, check its fiber map, but don't expect to hear back from them with a firm yes or no on availability in time to make a decision on your prospective living arrangement.
If streaming takes a large chunk of the market that has good internet quality, that's significant revenue gone for consoles and gaming PC hardware (even if it's not 100% of the market).
I used Stadia for a while, but after a couple of months, the compression started to bum me out. It really worked great, and playing the latest Assassin's Creed on a $200 Chromebook is really something else. But when I would switch to my main TV, I started thinking about how I had given up the last uncompressed input to my TV: my game console. Dark-area blocks, color banding, it all started driving me nuts.
I agree. I actually find watching game trailers on YouTube pretty pointless, as everything turns into mush and it totally loses the "precision" of the graphics. I found these cloud streaming services similar, though not quite as extreme.
I think bitrates need to be considerably higher. 4K Blu-ray uses close to 100mbit/sec, but movies have the advantage of not requiring really low-latency encoding, which makes them far more efficient with bitrate. You're probably looking at well over 100mbit/sec to get good picture quality on these services.
50mbit should be enough for streaming games like this. However, it's very latency-sensitive (and the video encoding and decoding will add a few tens of ms on top), so you're probably closer to 100ms overall, which definitely feels very laggy. That's considerably more latency than I get on an FTTH connection from London to New York, for example.
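A rough budget for how 60 ms of network RTT turns into ~100 ms end to end; the encode/decode/display figures are assumptions for illustration, not measurements of any particular service.

```python
# Assumed end-to-end latency budget for the streaming case above.
network_rtt_ms = 60   # the connection described in the parent comment
encode_ms      = 15   # low-latency hardware encode (assumed)
decode_ms      = 10   # client-side decode (assumed)
display_ms     = 15   # roughly one 60 Hz frame of buffering/scanout (assumed)

total = network_rtt_ms + encode_ms + decode_ms + display_ms
print(f"~{total} ms before any in-game input latency")   # ~100 ms
```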
That's surprising for me to hear. I did cloud gaming a decade ago with OnLive, and some PlayStation Now a little later, and it was never as bad as the gaming gatekeepers suggested. 60ms latency on a 50mbit connection should be good; sad it isn't.
With racing games I learned to compensate and anticipate. And there are pleeeeenty of slower games to play.
I used OnLive till the bitter end. It was awesome. A little sad the way it ended, also some tears in the rain for the game library I had purchased and the friends I had made in Space Marine multiplayer.
I was an early adopter of GeForce Now and I really enjoyed it while it was in beta. As soon as it was released to the public, all the publishers pulled their games.
> 50 megabits/60 milliseconds is a good connection by US broadband standards these days!
Average US internet speed (speed paid for, not just speed available) is >120 megabits, so I wouldn't exactly call that good, considering gamers are likely to have faster-than-average connections.
Estimates I could find put it around 30-170 Mbps, but the top end (the speedtest.net average) for Canada is >150 Mbps, which isn't actually common unless you're living in or near a large city. In my experience the average is more like 30-60 Mbps.