
Testing mobile G-Sync with the Asus G751JY: Boutique gaming’s killer feature?

Last January, we previewed how mobile G-Sync might perform on an Asus G751JY laptop that wasn’t fully certified for the feature but supported it well enough to give us a taste of what G-Sync could deliver. Today, we’re revisiting the topic, armed with a fully certified Asus G751JY-DB72. This system is nearly identical to the G751JY that we tested earlier this year, but with a handful of upgrades. Specifically, the G751JY-DB72 uses a Core i7-4720HQ CPU, 24GB of DDR3, a 256GB SSD, and a backup 1TB HDD for conventional mass storage. The system still uses a GTX 980M (4GB of RAM) and a 1,920-by-1,080, 17.3-inch screen.


At $1999 from Asus, it’s not a cheap laptop, but it’s one of the nicest and best-balanced systems I’ve ever tested. Because mobile G-Sync is a big enough feature to warrant its own treatment, we’re going to discuss the laptop’s performance and capabilities in a separate piece. For now, it’s enough to say that this is one of the best boutique laptops I’ve ever tested, even if the base model debuted a year ago.

How mobile G-Sync works

Mobile and desktop G-Sync accomplish the same goal, but they achieve it in different ways. Nvidia’s desktop G-Sync displays rely on a separate, Nvidia-built scaler unit. This scaler controls the monitor’s timing and synchronizes the display’s output with the video card. In 2013, when Nvidia debuted G-Sync, its custom scaler technology was the only way to achieve this kind of synchronization in a desktop display. That’s since changed with the launch of the VESA-backed Adaptive Sync standard (AMD calls its own implementation FreeSync). Laptops, however, don’t require custom scaler hardware — the ability to synchronize refresh rates is part of the embedded DisplayPort specification that both AMD and Nvidia use.


In order to qualify for the mobile G-Sync moniker, Nvidia requires laptop manufacturers to prove that their hardware meets certain standards. We don’t know the full list of panel requirements, but we do know that panels must support variable overdrive. Nvidia has stated that it works with ODMs to ensure that the G-Sync implementation in each laptop is tuned to the specifications of the underlying panel.


As the name implies, variable overdrive allows the display to decrease pixel ghosting by anticipating what color a pixel may need to be on the next refresh cycle and adjusting voltage accordingly. Nvidia has noted that this could result in a slight decrease in color accuracy in some conditions, but the net result should still be improved color reproduction.
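The idea behind variable overdrive can be sketched in a few lines. This is a conceptual illustration only, with invented numbers and a made-up function name — actual panel controllers work on per-panel tuning data we don’t have. The key point is that the amount of overshoot has to scale with the expected frame interval, which a variable-refresh display doesn’t know in advance:

```python
# Conceptual sketch only: overdrive pushes a pixel past its target grey level
# so it settles faster. With a variable refresh rate, the overshoot must be
# scaled by the expected frame interval. All values here are invented.
def overdrive_level(current, target, expected_frame_ms, gain=0.3):
    """Overshoot the target grey level in proportion to the transition size,
    backing off when the next refresh is expected to arrive later."""
    step = target - current
    # Longer expected frame times leave the pixel more time to settle,
    # so less overshoot is needed (16.7 ms ~ a 60 Hz baseline).
    boost = gain * step * (16.7 / expected_frame_ms)
    return max(0, min(255, round(target + boost)))

print(overdrive_level(64, 192, 16.7))   # large overshoot above the 192 target
print(overdrive_level(64, 192, 33.3))   # smaller overshoot at a ~30 Hz pace
```

If the driver guesses the next frame interval wrong, the pixel lands slightly off its intended level — which is exactly the small color-accuracy trade-off Nvidia describes.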

G-Sync: A Goldilocks solution

Now that we’ve covered the basics of how mobile G-Sync works, let’s talk about its specific implementation in the Asus G751JY. This laptop uses a 75Hz panel, which is important to know, because that sets the maximum refresh rate at which G-Sync can operate. If you have a 75Hz panel and your game is kicking out a steady 200 FPS, G-Sync disables automatically and the game switches to either V-Sync on or off. By default, Nvidia switches to V-Sync on, since this is much less jarring than the sudden appearance of tearing, but if you prefer to disable V-Sync when the frame rate exceeds 75 FPS, you can specify that in the control panel.

This might seem less than ideal, since gamers are typically taught to prefer high frame rates, but the relative advantage of faster FPS is subject to diminishing marginal returns. The higher the frame rate, the less visible a missed frame is.
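The fallback behavior above the panel’s cap can be summarized in a few lines. This is my own sketch of the decision described above, not Nvidia’s driver logic; the 75Hz cap comes from the G751JY’s panel, and the function and option names are illustrative:

```python
# Hedged sketch of the above-the-cap fallback: within the panel's range,
# G-Sync tracks the GPU; above it, the driver falls back to V-Sync on
# (the default) or off (user-selectable in the control panel).
PANEL_MAX_HZ = 75  # the G751JY's panel ceiling

def sync_mode(fps, prefer_vsync_above_cap=True):
    """Pick a sync behavior for the current frame rate."""
    if fps <= PANEL_MAX_HZ:
        return "g-sync"          # within the variable-refresh window
    # Above the cap, the panel can't refresh any faster, so G-Sync bows out.
    return "v-sync on" if prefer_vsync_above_cap else "v-sync off"

print(sync_mode(60))                                  # g-sync
print(sync_mode(200))                                 # v-sync on
print(sync_mode(200, prefer_vsync_above_cap=False))   # v-sync off
```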

If the frame rate falls below a certain level, however, G-Sync can run into another problem. While it doesn’t shut off due to low FPS, the GPU will automatically re-send each frame multiple times to keep the panel within its refresh window and smooth playback. If performance is relatively steady, this is an excellent way to smooth the game without impacting playability. If the frame rate is changing significantly from moment to moment, however, some frames will end up repeated and some will not.
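The arithmetic behind frame repetition is simple. The sketch below is an illustration under assumed numbers (a 30-75Hz variable-refresh window), not actual driver code: each frame is shown enough times that the effective refresh rate stays inside the panel’s range.

```python
# Illustrative sketch of low-frame-rate frame repetition. The 30-75 Hz
# window is an assumption for this example, not a published spec.
PANEL_MIN_HZ, PANEL_MAX_HZ = 30, 75

def repeats_needed(fps):
    """How many times each frame must be shown so the panel stays in range."""
    if fps <= 0:
        raise ValueError("frame rate must be positive")
    n = 1
    while fps * n < PANEL_MIN_HZ:
        n += 1
    return n

# A steady 20 FPS game is shown at an effective 40 Hz (each frame twice);
# 12 FPS needs each frame shown three times (36 Hz).
print(repeats_needed(20))  # 2
print(repeats_needed(12))  # 3
```

When the frame rate swings around the boundary between multipliers, some frames get repeated and neighboring ones don’t — which is exactly the unevenness described above.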

PC Perspective wrote an excellent report on how FreeSync and G-Sync handle low frame rates. The graph below shows how G-Sync inserts additional frames, boosting the refresh rate as a result.


As the frame rate fluctuates, the number of frames G-Sync injects to smooth presentation can vary as well. While the end result can still be superior to not having G-Sync on at all, a variable frame rate below ~35 FPS doesn’t produce the buttery smoothness that Adaptive Sync and G-Sync provide at higher refresh rates.

This ideal window is why we call G-Sync (and Adaptive Sync) a Goldilocks solution. Both technologies work best when your frame rate is neither too high nor too low. In this case, users should target a consistent average frame rate between 40 and 60 FPS.

Testing G-Sync

One of the intrinsic problems with testing a feature like G-Sync is that it’s hard to capture the output difference without a high-speed camera. One website, Blurbusters, has built a G-Sync simulator that you can use to examine the relative impact of having G-Sync enabled vs. disabled. You can see and select various display modes to compare the output, but if you choose G-Sync, be advised that the frame rate will rise until it reaches your monitor’s maximum refresh rate, then drop and start again. You can compare the output in this mode against the various other options (V-sync enabled, disabled, frame rate drops, etc).

The best video demonstration we’ve found of G-Sync vs. V-Sync On is embedded below. I’d recommend watching it full-screen and not trying to focus too hard on any one area of the image. If you relax your eyes and focus on the green line between the two rotating outputs, you’ll see that the V-Sync output on the left has a small but noticeable stutter that the G-Sync output lacks. The relevant portion of video is at 1:10.

One problem with testing a feature like G-Sync is confirmation bias. Confirmation bias is the human tendency to look for evidence that confirms a hypothesis while ignoring or discounting evidence that could disprove it. If I know that G-Sync is enabled, I may claim that a game looks better because I expect G-Sync to deliver a marked improvement. We avoided this problem by using a single-blind A/B test.

Before each test, the laptop was configured to enable or disable G-Sync. I was then asked to judge whether G-Sync had been enabled or disabled based on how the game/benchmark ran. No frame-rate readouts or third-party tools like FRAPS that might inadvertently hint at whether G-Sync was enabled were running, and I was not allowed to alt-tab out of the game or check my results until the entire set of test runs had concluded.
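The protocol above amounts to a randomized trial with a tally of correct calls. Here’s a minimal sketch of that structure — the function names and trial counts are my own illustration, not the tooling we used; in a real run the guesser observes the game, not the setting:

```python
# Minimal sketch of a single-blind A/B protocol: an assistant randomizes
# the setting, the tester guesses, and we tally accuracy.
import random

def run_blind_trials(guess_fn, trials=10, seed=None):
    rng = random.Random(seed)
    correct = 0
    for _ in range(trials):
        actual = rng.choice([True, False])   # assistant enables/disables G-Sync
        # In practice the tester never sees `actual`; it's passed here only so
        # this toy example can model perfect or imperfect discriminators.
        if guess_fn(actual) == actual:
            correct += 1
    return correct / trials

# A tester who can always tell (as in our BioShock runs) scores 1.0;
# pure guessing converges toward 0.5 over many trials.
print(run_blind_trials(lambda actual: actual, trials=10, seed=0))  # 1.0
```

Scoring well above 0.5 across repeated trials is what separates a genuinely visible effect from confirmation bias.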

Our initial tests of BioShock Infinite failed because the game was either running well above the 75 Hz refresh rate on the Asus G751JY (and enabling V-Sync at these higher frame rates rather than using G-Sync), or running below the 30 FPS mark when we tested at 4K using Dynamic Super Resolution. We discussed the situation with Nvidia and chose IQ settings that kept the game at the 40-50 FPS mark where G-Sync’s impact is most noticeable. Once we did, I could successfully identify whether BioShock Infinite used G-Sync or not in every single test.


We also tested The Elder Scrolls: Skyrim, though in its case, we had to install additional texture mods to pull frame rates low enough for G-Sync to kick in. Again, I was able to correctly determine whether or not G-Sync was enabled in every single test. In most cases, it took just seconds — camera pans and movement are much smoother when G-Sync is enabled.


As someone who would benchmark a llama if I could find one with a PCIe slot, I’m loath to issue an opinion that comes down to “Trust me, it’s awesome.” In this case, however, that’s what’s called for. With G-Sync enabled, camera pans are much smoother. V-Sync just doesn’t deliver an equivalent experience — not unless your game is already holding a steady 120+ FPS frame rate and you own one of the handful of monitors that support a refresh rate that high.

Is G-Sync worth it?

The FreeSync vs G-Sync battle between AMD and Nvidia has mostly played out in the desktop space, where FreeSync / Adaptive Sync displays have generally been cheaper than their G-Sync counterparts. The situation is different in mobile, where multiple vendors are shipping G-Sync-enabled laptops, while FS/AS appear to be a no-show thus far. We’ve heard rumors that this could change in the next few months, but for now, mobile G-Sync is the only show in town.

It’s true that getting G-Sync up and running properly can require some fine-tuning, but we’re not talking about anything extravagant — if you’re comfortable adjusting in-game video settings, you can tune a game to work well in G-Sync. Older titles may require some additional intervention, but if you’re comfortable installing graphics mods, it’s easy to find frame rates that showcase the feature.

Sometimes, buying into a new technology when it initially rolls out means paying a premium for a less-than-ideal experience — but that doesn’t seem to be the case here. The Asus G751JY is a well-balanced system, and the GTX 980M is unmatched in mobile GPUs. True, Nvidia now offers a desktop-class GTX 980 in an ostensibly mobile form factor, but we have some significant concerns about just how that solution will actually work in the real world. The 980M, in contrast, is a proven high-performance solution.

AMD will likely counter with its own solutions — the first FreeSync demos were originally done on a mobile platform — but for now, if you want this technology, Nvidia is the only game in town. It’s a feature that makes a significant difference, and if we were in the market for a boutique gaming laptop, we’d put G-Sync high on our list of desired features.


League of Legends: Bidding war over e-sports team

A bidding war has broken out during the sale of a professional team of players of the fantasy game, League of Legends.

The UK’s Team Dignitas has two pro League of Legends teams on its books but tournament rules for the game state that they can only oversee one.

Bids for one of their teams have gone far beyond $500,000 (£323,000), a Dignitas spokesman told the BBC.

The final details of the sale and the team’s new owner will be revealed by the end of the month.

Big bids

The massively popular League of Legends game has an associated World Championships that pits the top teams against each other for large cash prizes.

The five players who were world champions in 2014 shared $1m (£650,000) in prize money.

The teams meet in a virtual arena and are tasked with destroying the heart of their rivals’ base while defending their own.

Michael O’Dell, manager of Team Dignitas, said one of its teams had been part of the League of Legends Championship Series (LCS) for some time. This year, he said, Dignitas’s second or “challenger” team has also qualified for the LCS.

“The rules state that you can only manage one, so we are in the process of selling one of the teams at the moment,” he told the BBC.

Mr O’Dell confirmed they had found a buyer but would not be drawn on which team would be sold or who had bought them.

The last few weeks had seen a series of bids for the team come in from many pro-game management firms, individuals and other organisations.

“E-sports is growing so fast at the moment,” he said. “There are millionaires and billionaires coming in buying teams and there are sports stars looking to buy teams.

“It’s really strange dealing with billionaires over this,” he said.

Pro-players could also cash in later in October at the start of the 2015 transfer season, which often sees top players garner large fees to change teams.

Tim Edwards, an editor at the PC Games N website, said the size of the deal over the LoL team reflected the growing interest in e-sports by traditional media firms, brand managers and advertisers.

“It would be hard for them to reach that gaming audience any other way,” he said.

The quarter-finals of the League of Legends World Championships are being streamed on BBC Three over three days.


Star Wars Battlefront on the PC: Impressions and performance

For the past few days, EA’s Star Wars: Battlefront has been in open beta. We spent some time in the game in all three modes — the Battle of Sullust, Walker Assault on Hoth, and the single-player missions that pit you against waves of stormtroopers and attack vehicles in a survival mode. Unlike the console players, who are stuck dealing with either 900p on the PS4 or 720p on the Xbox One, PC gamers get the full monty — as much resolution as your monitor can handle, and quality settings that truly bring Star Wars to life around you.

It’s difficult to know what to write about Star Wars: Battlefront, and for reasons that have nothing to do with the fact that this was a beta with just three game modes. The overwhelming and immediate thought when you first fire up the game is “I’m playing Star Wars!” On that front, Dice has succeeded beautifully. This game feels like a love letter to every kid who ever raced through the house clutching Han Solo’s DL-44 and making blaster noises. As you race to recover escape pods on Sullust, the capital ships overhead fire on each other (and occasionally on the planet).


I’m even willing to forgive the fact that Dice shows Imperial Star Destroyers as in-atmosphere craft over both Sullust and Hoth when they ought to have used Victory or Venator-class Star Destroyers instead.

This nostalgia is particularly strong on Hoth, whether you play as Imperial stormtroopers or the Rebel Alliance. The map is asymmetrical, meaning the two sides have vastly different goals and strategies for winning. The Rebels must activate satellite uplinks that enable Y-wing bombers to make attack runs on the advancing Walkers. The Imperials must defend the walkers against these attacks, which means keeping the satellite uplinks out of commission. As the battle progresses, you’ll fight past the iconic Kuat ion cannon and into Echo Base itself.


I know that there’ve been previous Battlefront games, but the last version came out in 2005 — long before the advent of DirectX 11, 12, or modern hardware. While that 2005 game holds up reasonably well, considering its age, it’s got nothing on the models and levels of detail Dice has brought to the table. As a nostalgia play and crazy-fun dip into first person Star Wars combat, Battlefront is a true achievement.

Simplistic design

Where Battlefront falls a bit flat is its mechanics and map design. True, we haven’t seen the entire game yet, but I played in the Battlefield 3 and BF4 betas, too. In both cases, the maps and scenarios offered to early testers were much larger and more complex than what we’ve seen these past few days. The Battle of Sullust is fun, but it’s ultimately a relatively small map with only limited use of terrain. The Battlefield series prides itself on urban environments that can be aggressively “remodeled” based on player action.

There’s none of that in any of the beta designs that were shown — even in Echo Base, hurling thermal detonators or implosion devices doesn’t damage the X-Wing sitting in the hangar below. You can similarly throw thermal detonators into the ice walls carved out of Hoth, but you won’t see any terrain deformation when you do. For a company that’s built its reputation on deformable terrain and evolving combat conditions, such omissions are surprising.


The simplicity carries over into the combat and loadout options we’ve seen so far. While there are options like a smart rocket and sniper rifle, these are unlocked with in-game credits and cannot be fired repeatedly. You get one sniper shot every ten seconds, period — while the weapon is on cooldown, you can’t even equip it. The vehicles of the BF series are gone, replaced by power-ups that you pick up and use at will. This allows for some additional flexibility, since you can launch a TIE fighter or Airspeeder on demand, but it feels less cohesive. Without squads or classes, every trooper you face is likely to be carrying the same handful of weapons or power-ups, and the various blasters are all extremely similar. The lack of recoil may be thematically accurate, but it makes the weapons feel even more interchangeable.

Also, every single air vehicle feels like someone mounted an ion engine on a sofa. This might make sense for TIE fighters, which aren’t supposed to operate in atmospheres and have enormous square wings, but the problem extends to every single craft. Airspeeders have the turning radius of a manatee with multiple sclerosis. Even the A-Wing, a Rebel fighter specifically designed for speed and agility, feels sluggish. None of the Battlefield games are known for great flying mechanics, but flying starfighters in land-based Battlefront missions isn’t much fun.

Other game decisions are equally odd. You can buy a personal shield, but you have to pay to charge it with in-game currency unless you pick up in-game power charges. You can play as Luke Skywalker (in RoTJ costume) or Darth Vader in the Hoth mission, but both characters feel more like afterthoughts, nods to the hero units of earlier Battlefronts, than fleshed-out concepts. I really enjoyed my time in-game, but I’m not convinced that this title has the staying power that EA seems to think it does. It’s missing most of the tactical underpinnings that made Battlefield interesting, and the sheer joy of playing in the Star Wars universe may not be an adequate substitute.


Guru3D has done an extensive performance workup on Battlefront under D3D11. (DirectX 12 support is planned but not currently implemented; we attempted to sneakily activate it anyway, but when Dice says the feature isn’t ready for prime time in this version of the game, they weren’t kidding.) Guru3D has quite a bit of data on the game’s current performance at various detail levels, but here’s the 10,000-foot overview:

Battlefront performance

There’s a lot of great news buried in this graph. First, the game runs beautifully on older / slower cards. The Radeon R9 370 can hit 44 FPS in 1080p mode, as can the GTX 950. The R9 370, aka the Radeon HD 7850, was a midrange card when it was introduced in Q1 2012, but it can handle Ultra detail levels at 1080p just fine. Stepping up the stack, we see the R9 290 outperforming the GTX 970 (Nvidia can’t be thrilled about that), the R9 390X edging out the GTX 980, and the Fury X and GTX 980 Ti duking it out at the top of the stack. The Fury X has an edge at 4K, but isn’t quite as fast in 1080p.

If you own a GPU built in the last three years, chances are you can play this game at high detail levels and at least 1080p. Performance is good enough that it wouldn’t surprise us if APUs and even Intel GPUs can get in on some of this action, albeit at lower resolutions and detail levels.

Pricing, preliminary verdict

The one downside to all of this is EA’s decision to announce a $50 season pass alongside a $60 retail price. It’s a tone-deaf move for a number of reasons. First and foremost, it’s a bad idea because the last game Dice launched, Battlefield 4, was an utter disaster. While the game ran reasonably well in beta, with relatively light server loading, things collapsed in the final iteration in ways that took Dice nine months to fix. You can blame EA for this if you like, since the publisher is ultimately responsible for kicking the game out the door, but there’s precious little reason to bet on Battlefront nailing everything by ponying up for a season pass.

I’ve written in defense of DLC before, and I stand by that, but as of now, EA is asking players to pony up $60 for a base title and $50 for four DLC packs along with “Pay to Win” freebies like the DL-44 blaster, ion grenade, ion torpedo, and two-week early access to each DLC if you pay for the season pass up front. After the mediocre Battlefield Hardline and the awful launch of Battlefield 4, neither Dice nor EA deserves that kind of pre-order cash. I strongly recommend waiting to see how you like the base game and whether it launches in playable condition before buying into any additional packages.

Tatooine is used in single-player missions.

Based on what I’ve seen so far, I’d say that Battlefront absolutely nails the nostalgia and evocative aspects of the game. The lack of a single-player campaign to tie things together is a huge loss — it’s been more than a decade since we got a Star Wars single-player FPS and ten years since the last Battlefront. If you want to run around playing Star Wars, Battlefront delivers. If you’re looking for a deeper, more tactical FPS, I’m not sure this is it. And I’d wait and see how the final game reviews before ponying up for any DLC package, regardless of what enticements EA tries to offer. A game you can’t play isn’t enjoyable.


Cyber-thieves hit YouTube Fifa gamers

Six of the most successful Fifa video gamers to feature on YouTube have been targeted by cyber-thieves.

The hackers stole millions of Fifa coins, the game’s virtual currency, and sold players worth thousands of pounds.

They are thought to have convinced manufacturer EA Sports to transfer their victims’ Origin accounts to email addresses the hackers controlled.

Many other well-known players who do not make videos are also believed to have been hit.

AnesonGib, W2S, Nepenthez, Nick28T, Bateson87 and matthdgamer have more than five million YouTube subscribers between them.

Matthew Craig, the man behind matthdgamer, told the BBC: “There have been about 10 or more accounts which have been hacked over the last two weeks, me included.”

In a video, Nick28T said: “Basically, someone called in pretending to be me and… got in to my account.”

An EA representative said: “We encourage all Fifa players to secure their accounts with authentication and verification steps, which we outline on our help and our product sites.

“We are consistently working through our customer experience teams to secure accounts and make sure players are educated when account compromises are made.”

Mr Craig said EA had apologised to him about the attack and had moved quickly to help him once he had reported it.

“They got my account back, added four or five more security measures, and my account has been fine since,” he said.


William Hill bets on virtual reality racing

Anyone who has been to a racecourse will know the excitement when the horses pound towards the finish line.

But this thrill of the ride is something that many customers who bet in shops or online do not experience.

Bookmaker William Hill is keen to recreate it with virtual reality technology.

The UK’s Gambling Commission said that it would be monitoring innovations such as virtual reality to ensure that they did not encourage excessive gambling.

Using virtual reality headsets, combined with GPS racetrack data, it is giving customers the chance to view the race from the jockey’s perspective.

“Currently you place a bet and not much happens between that and the outcome,” said Crispin Nieboer, William Hill’s director of innovation.

“We want to bring customers closer to the sporting action, to experience the thrill of the ride.”

To test the possibilities, the team at William Hill labs built a 3D mock-up of Kempton Park racecourse and collected live data, via GPS trackers fitted on horses, during a training race at the course.

People trying out VR racing

Combining the data created a virtual race users can view via either Google Cardboard or an Oculus Rift.

The technology is not yet available to the public but was on show at an open day at the firm’s technology laboratory in Shoreditch.

Users first choose the horse they want to race on. Accompanied by live commentary, wearers can turn and look at other horses as well as activating a data display about the horse’s heart rate, stride and race position.

William Hill plans to add more courses and live races next year.

“Currently there are some gaps in the data so the horses suddenly accelerate in a live race, but we hope to have a proof-of-concept system ready by Christmas,” said Mr Nieboer.

The plan is to launch the service as part of the William Hill app.

“Users can choose the option to watch the race as a standard video or they can be the jockey,” Mr Nieboer said.

It could also be available in some betting shops, said Mr Nieboer, while Google Cardboard headsets were likely to be given out free at racecourses.

It is estimated that in the UK about 350,000 people have a gambling addiction, with over £7bn spent annually.

The proliferation of online betting has been blamed for making it easier to gamble and some feel services such as virtual reality could add to the problem.

The market is regulated by the UK Gambling Commission, which said that it “monitors innovation in the gambling market in order to ensure operators continue to comply with the conditions of their licences”.

“Operators are required to ensure that they offer gambling in a responsible manner, which will include offering tools to allow customers to manage their gambling activity as well as having policies and procedures in place to identify potentially problematic behaviour and interact with customers who exhibit that behaviour,” a spokesman added.


Virtual reality: So near, yet so far

The biggest competition for virtual reality is something it’ll never beat – the real world.

As many readers of this blog like to point out, a virtual reality environment will never be a substitute for actually experiencing something.

No-one, even in the corridors of Oculus Connect, a conference for the virtual reality industry, would suggest otherwise.

Owned by Facebook, Oculus is credited with breathing new life into the virtual reality industry, which had faded out after an almost cringeworthy first go in the nineties. Its headset, the Oculus Rift, hits shelves next year. Anticipation is huge.

But today we had a reminder of just how far we are from enjoying anything that comes even close to producing a fully immersive world – one that can recreate common human feelings and emotions; the sense of being somewhere else, with other people, feeling different sensations.

The announcements

Oculus and Facebook made a range of announcements relating to VR today. Here are the most significant:

  • Minecraft. Veteran games maker John Carmack, now chief technology officer at Oculus, described Minecraft coming to Oculus as the biggest “win” they’ve had. It didn’t come easy – Mark Zuckerberg and Microsoft boss Satya Nadella had to sit down and iron out some issues before the deal was done.
  • Netflix, Hulu and others signed up. The on-demand giants are on board with VR – you’ll be able to watch titles on the likes of Netflix and Hulu within the headset, giving the impression of watching it on a huge screen. Many have questioned how comfortable that would be after more than about 15 minutes.
  • A $99 (£65) Gear VR. Aside from the premium Oculus Rift headset, due to be released early next year, Oculus and Samsung have created the Gear VR, a low-cost headset that uses a Samsung smartphone to power the visuals. It’s not the full VR experience, but it’s intended to be a gateway for newcomers. The next headset will be $99 and work with Samsung’s Galaxy S6 range.
  • Oculus Ready. The Oculus Ready PC programme is a stamp of approval designed to help people buy computers that will be good enough to power VR. PC makers on board include Asus, Dell and Alienware.

To hammer this home, Oculus’ chief scientist Michael Abrash took a refreshing approach to his keynote – outlining all the things Oculus could not yet do.

The problems are so great the team is not even trying to solve them – something for the next generation to tackle.

One is providing a sense of smell, a sensation so integral to experiencing, and later remembering, a new place.

Brendan Iribe, Oculus CEO

Another challenge is the ability to taste something, or hear realistically in a way that does not feel as if we’re just wearing headphones. Perhaps the biggest barrier is a sense of touch.

Haptic technology is only just beginning to recreate basic touch sensations – and in VR, it’s going to be years before you stop putting your hand through virtual tables, killing the illusion in an instant.

Tough crowd

But virtual reality enthusiasts shouldn’t feel disheartened.

Right now, VR is what Space Invaders is to Call of Duty. They’re both games, sure, but they’re worlds apart. The now-primitive blip-blip-blip of 1970s arcade games was one of the building blocks needed to get us to where we are now.

And so the feeling at Oculus Connect is that this is just the beginning, and there’s still a long way to go.

A screen from Space Invaders, from a display at London's Science Museum

Gamers and the wider public may take a while to reach the same level of excitement felt within the industry.

Mr Abrash told delegates that they’re living in the “good old days” of VR – a time that will be looked back upon as the start of something significant.

Except it’s not quite the start. We’ve been here before. Journalists in the 1990s were writing about VR as the next big thing just as I am now. But the technology wasn’t ready then.

Is it now? There are a few veterans prowling the halls here, enticed back to the action after some time away. One was Greg Panos, who has been studying virtual and augmented reality for over two decades.

I asked him if this latest wave of VR was any different to what happened in the 90s.

Yes, he said – the difference now is that VR is good enough, and cheap enough, for companies to start making some serious money. And so it starts.

Palmer Luckey, founder at Oculus

The first battleground for VR will be gaming. Therefore the best games will win – in theory – so efforts from HTC’s Vive headset could disrupt Facebook’s ambition. HTC has partnered with legendary games maker Valve, and so the games should be terrific. And the Vive goes on sale first.

Sony is jostling in, too. Its Morpheus headset has one big thing going for it: it’s tied to the already immensely successful PlayStation 4, and so will likely be bundled with the console.

Shrewd move

But with the announcements made today, Facebook is giving itself a huge head start in a new, exciting world of entertainment.

Deals with Netflix and Minecraft could give Oculus the edge, even if other competitors have better hardware, as has been the suggestion. Content, as always, is king. Vive and Morpheus will need to compete with that.

The possibilities are mind-blowingly enormous – from gaming to tele-presence, education to blockbuster movies, Facebook is trying to nurture a platform that one day could rival the mobile app ecosystem in its scale.

But – and it’s a big but – Facebook still needs to pull it off. One year on from Facebook’s purchase of Oculus VR, we’ve still not seen the technology really hit the market in any meaningful way. That means Facebook boss Mark Zuckerberg’s $2bn bet on VR is still wide open.

Will that purchase be seen as a shrewd move on par with Google’s bargain-tactic $1.65bn purchase of YouTube in 2006?

Mark Zuckerberg, CEO at Facebook

Or will Oculus be Mr Zuckerberg’s MySpace – a service with great early momentum, bought by NewsCorp for $580m, only to be later offloaded for $35m? It was a newcomer that took what MySpace started and made it much much better, killing the business in the process.

That newcomer was Facebook, of course, so Mr Zuckerberg certainly knows how this game works.

It’s presumably why he appeared, unannounced, at Thursday’s event, seemingly with one key purpose – to manage the expectations of developers, the press and the public.

“All of you are inventing the next major platform,” he told delegates. “This is going to go very slowly.

“Facebook is committed to this for the long term.”

He doesn’t expect “millions” of units to be sold – at least not for a while.

He was there to reassure developers that even if things don’t pick up and make millionaires straight away, he’s committed to sticking with it.

But he’s not the only one committed, and it could be competitors that take what Oculus has started and do things better, giving Mr Zuckerberg the MySpace treatment.


Devs Shine Light on Halo 5’s Overhauled Engine

Weeks away from the launch of Halo 5: Guardians, 343 Industries and Microsoft on Tuesday revealed some of the technical ingenuity behind what’s expected to be the biggest Xbox One game of the 2015 holiday season.

The game’s developers have been raving about their playtime with the upcoming sci-fi shooter. For starters they’re promising first-class digital tourism in Halo 5: Guardians, with galactic locales ranging from life-filled jungles to snowcapped mountains. Also promised is a compelling narrative that will slow down the action and drive forward a story that presents players with several possible futures as they Hunt for the Truth.

The Halo 5 Experience

Between the campaign’s battles, players can explore levels that range from relatively quiet quests to huge maps rendered by Xbox Live cloud compute and rigged for destruction.

The game’s levels, built on top of the game’s new engine, have been painted with colors from “seven distinct art palettes.”

That overhauled engine includes light-rendering technology that’s based on the physical makeup of meshes and models in Halo 5.

Possibly the biggest news for pro gamers was confirmation of a stable 60 frames per second, rendered via variable resolution.

For scenes with a high density of objects, all of which demand rendering resources, the game engine will scale the graphics back from its 1080p base resolution to keep the workload down and the frame rate stable at 60 fps.
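The technique described above amounts to a per-frame feedback loop. Here's a minimal sketch in Python; the 90% step size, the 810-line floor, and the hysteresis threshold are our own illustrative assumptions, not 343 Industries' actual values:

```python
# Illustrative sketch of dynamic resolution scaling to hold 60 fps.
# All numbers are hypothetical, not Halo 5's real tuning values.
TARGET_FRAME_MS = 1000.0 / 60.0  # 16.67 ms budget per frame at 60 fps
BASE_HEIGHT = 1080               # native vertical resolution
MIN_HEIGHT = 810                 # assumed floor below which we never drop

def next_render_height(last_frame_ms, height):
    """Pick the next frame's render height from the last frame's cost."""
    if last_frame_ms > TARGET_FRAME_MS and height > MIN_HEIGHT:
        # Busy scene blew the budget: render ~10% fewer lines.
        return max(MIN_HEIGHT, int(height * 0.9))
    if last_frame_ms < TARGET_FRAME_MS * 0.85 and height < BASE_HEIGHT:
        # Comfortable headroom: restore quality toward native resolution.
        return min(BASE_HEIGHT, int(height / 0.9))
    return height
```

The hysteresis (only scaling back up when comfortably under budget) keeps the resolution from oscillating every frame; a real engine would drive this from GPU timing data, but the principle is the same.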


That has significant implications in the world of e-sports, according to Jon Peddie Research’s Ted Pollak, senior analyst for the games industry.

“Using dynamic resolution adjustment is an interesting strategy to maximize the game’s performance on the Xbox One’s hardware,” he told TechNewsWorld, “and it will give competitive players a better experience.”

Along with high-fidelity cosmetics and what’s promoted as an engrossing campaign, there’s another thing Halo 5: Guardians has going for it, according to Mario R. Kroll, principal at ÜberStrategist.

Halo 5: Guardians will beat several big titles to the punch by weeks, he said.

Launching early and tactically doesn’t always ensure success, but historically it has helped, and Halo has held its own in the past.

“Rather than waiting for November, when there is huge competition from the latest Call of Duty (Black Ops III), the expertly hyped fan favorite Fallout 4 — and finally, the force-driven Star Wars: Battlefront, which has everyone [but single-player campaign fans] drooling, Microsoft decided to launch its latest Halo installment this October,” Kroll told TechNewsWorld.

Halo for the Holidays

Halo’s early start on the holiday rush will benefit from several supporting moves Microsoft has planned: the special edition Xbox One, which has a fattened 1TB hybrid drive; the recently announced Halo Wars 2; and a Windows 10 experience that’s due to deliver in November.

“I believe that between superior pricing over the PS4 and Halo 5 as platform exclusive, it will certainly help Microsoft gain some ground with the Xbox One,” Kroll said.

The list of reasons for holding out on moving from Xbox 360 to Xbox One continues to shrink, Kroll noted. Halo 5: Guardians, possibly Microsoft’s biggest game for a while, is poised to drive Xbox Ones out of the storerooms and into the living rooms.

“There are a number of impressive multiplatform titles coming out that will benefit both XBox One and PlayStation 4,” Kroll said, “but arguably, Halo 5 should have the biggest impact as a new release console exclusive game on the XBox One platform, in terms of encouraging sales of that hardware this fall/holiday season.”

With Sony shipping PlayStation 4s at roughly twice the volume of Xbox Ones, Microsoft needs more “arrows in its quiver,” said Kroll. With Halo, at least, Microsoft has proven time and time again it can generate impressive sales figures.

“There’s no question that the Halo franchise is highly successful and very important to Microsoft’s Xbox strategy,” Pollak said. “It behooves them to remind fans of any significant developments to keep the buzz going ahead of the upcoming holiday shopping season.”


You’ll soon be able to stream Android games live to YouTube

YouTube has announced that Android phones will soon be able to stream live video to the service, in a move partly designed to capitalize on the popularity of mobile gaming in Japan and elsewhere. The update doesn’t have a date set yet, but is said to be coming “soon,” along with a Japanese version of the new YouTube Gaming app.

Japan will be the first market in Asia to get YouTube Gaming following the launch last month in the US and UK; the announcement comes on the first day of Tokyo Game Show. “Japan’s mobile games define its gaming culture, far more so than in other countries,” says YouTube’s global gaming head Ryan Wyatt in a statement. “This trend shows there’s a real need for gamers to easily share what’s on their screen with the gaming community, as it happens.”

YouTube Gaming faces a considerable battle to compete with Amazon-owned Twitch, which has built a loyal following in the core video games community. But by focusing on mobile in untapped markets like Japan, as well as providing a slick interface and smart features like rewinding, YouTube could well find ways to set itself apart.


Asynchronous compute, AMD, Nvidia, and DX12: What we know so far

Ever since DirectX 12 was announced, AMD and Nvidia have jockeyed for position regarding which of them would offer better support for the new API and its various features. One capability that AMD has talked up extensively is GCN’s support for asynchronous compute. Asynchronous compute allows all GPUs based on AMD’s GCN architecture to perform graphics and compute workloads simultaneously. Last week, an Oxide Games employee reported that contrary to general belief, Nvidia hardware couldn’t perform asynchronous computing and that the performance impact of attempting to do so was disastrous on the company’s hardware.

This announcement kicked off a flurry of research into what Nvidia hardware did and did not support, as well as anecdotal claims that people would (or already did) return their GTX 980 Ti’s based on Ashes of the Singularity performance. We’ve spent the last few days in conversation with various sources working on the problem, including Mahigan and CrazyElf at Overclock.net, as well as parsing through various data sets and performance reports. Nvidia has not responded to our request for clarification as of yet, but here’s the situation as we currently understand it.

Nvidia, AMD, and asynchronous compute

When AMD and Nvidia talk about supporting asynchronous compute, they aren’t talking about the same hardware capability. The Asynchronous Command Engines in AMD’s GPUs (between two and eight, depending on which card you own) are capable of executing new workloads at latencies as low as a single cycle. A high-end AMD card has eight ACEs, and each ACE has eight queues. Maxwell, in contrast, has two pipelines, one of which is a high-priority graphics pipeline. The other has a queue depth of 31, but Nvidia can’t switch contexts anywhere near as quickly as AMD can.
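To make the asymmetry concrete, here's a rough model of the two queue configurations as described above (Python, purely illustrative; the real scheduling hardware is far more complex than a pair of dictionaries):

```python
# Rough model of the queue configurations described in this article.
# Simplified descriptions for comparison, not vendor specifications.
amd_gcn_high_end = {
    "graphics_pipelines": 1,
    "ace_units": 8,             # Asynchronous Command Engines
    "queues_per_ace": 8,
    "context_switch": "fine-grained, as low as one cycle",
}

nvidia_maxwell = {
    "graphics_pipelines": 1,    # high-priority graphics pipeline
    "compute_queue_depth": 31,  # the second pipeline's queue depth
    "context_switch": "draw call boundaries only",
}

def compute_queue_count(gpu):
    """Total compute queues exposed by the simplified model above."""
    if "ace_units" in gpu:
        return gpu["ace_units"] * gpu["queues_per_ace"]
    return gpu["compute_queue_depth"]
```

By this count, a high-end GCN part exposes 64 compute queues against Maxwell's 31, but the queue totals matter less than how cheaply each architecture can switch between graphics and compute work.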


According to a talk given at GDC 2015, there are restrictions on Nvidia’s preemption capabilities. Additional text below the slide explains that “the GPU can only switch contexts at draw call boundaries” and “On future GPUs, we’re working to enable finer-grained preemption, but that’s still a long way off.” To explore the various capabilities of Maxwell and GCN, users at Beyond3D and Overclock.net have used an asynchronous compute test that evaluates this capability on both AMD and Nvidia hardware. The benchmark has been revised multiple times over the week, so early results aren’t comparable to the data we’ve seen in later runs.

Note that this is a test of asynchronous compute latency, not performance. It doesn’t measure overall throughput (in other words, how long it takes to execute an entire workload); it’s designed to demonstrate whether asynchronous compute is occurring at all. Because this is a latency test, lower numbers (closer to the yellow “1” line) mean the results are closer to ideal.
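Normalizing the raw measurements against that ideal is simple division; this sketch shows the methodology as we understand it (the benchmark's actual code differs):

```python
def normalize_latencies(measured_ms, ideal_ms):
    """Express each measured async-compute latency as a multiple of
    the ideal latency. A value of 1.0 means switching workloads added
    no overhead; higher values mean the switch cost extra time."""
    return [round(m / ideal_ms, 3) for m in measured_ms]
```

A GPU handling asynchronous compute well produces a flat line near 1.0 even as the thread count rises; spikes above it indicate context-switching overhead.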

Radeon R9 290

Here’s the R9 290’s performance. The yellow line is perfection — that’s what we’d get if the GPU switched and executed instantaneously. The y-axis of the graph shows normalized performance to 1x, which is where we’d expect perfect asynchronous latency to be. The red line is what we are most interested in. It shows GCN performing nearly ideally in the majority of cases, holding performance steady even as thread counts rise. Now, compare this to Nvidia’s GTX 980 Ti.


Attempting to execute graphics and compute concurrently on the GTX 980 Ti causes dips and spikes in performance and little in the way of gains. Right now, there are only a few thread counts where Nvidia matches ideal performance (latency, in this case), and many cases where it doesn’t. Further investigation has indicated that Nvidia’s async pipeline appears to lean on the CPU for some of its initial steps, whereas AMD’s GCN handles the job in hardware.

Right now, the best available evidence suggests that when AMD and Nvidia talk about asynchronous compute, they are talking about two very different capabilities. “Asynchronous compute,” in fact, isn’t necessarily the best name for what’s happening here. The question is whether or not Nvidia GPUs can run graphics and compute workloads concurrently. AMD can, courtesy of its ACE units.

It’s been suggested that AMD’s approach is more like Hyper-Threading, which allows the GPU to work on disparate compute and graphics workloads simultaneously without a loss of performance, whereas Nvidia may be leaning on the CPU for some of its initial setup steps and attempting to schedule simultaneous compute + graphics workload for ideal execution. Obviously that process isn’t working well yet. Since our initial article, Oxide has since stated the following:

“We actually just chatted with Nvidia about Async Compute, indeed the driver hasn’t fully implemented it yet, but it appeared like it was. We are working closely with them as they fully implement Async Compute.”

Here’s what that likely means, given Nvidia’s own presentations at GDC and the various test benchmarks that have been assembled over the past week. Maxwell does not have a GCN-style configuration of asynchronous compute engines and it cannot switch between graphics and compute workloads as quickly as GCN. According to Beyond3D user Ext3h:

“There were claims originally, that Nvidia GPUs wouldn’t even be able to execute async compute shaders in an async fashion at all, this myth was quickly debunked. What become clear, however, is that Nvidia GPUs preferred a much lighter load than AMD cards. At small loads, Nvidia GPUs would run circles around AMD cards. At high load, well, quite the opposite, up to the point where Nvidia GPUs took such a long time to process the workload that they triggered safeguards in Windows. Which caused Windows to pull the trigger and kill the driver, assuming that it got stuck.

“Final result (for now): AMD GPUs are capable of handling a much higher load. About 10x times what Nvidia GPUs can handle. But they also need also about 4x the pressure applied before they get to play out there capabilities.”

Ext3h goes on to say that preemption in Nvidia’s case is only used when switching between graphics contexts (1x graphics + 31 compute mode) and “pure compute context,” but claims that this functionality is “utterly broken” on Nvidia cards at present. He also states that while Maxwell 2 (GTX 900 family) is capable of parallel execution, “The hardware doesn’t profit from it much though, since it has only little ‘gaps’ in the shader utilization either way. So in the end, it’s still just sequential execution for most workload, even though if you did manage to stall the pipeline in some way by constructing an unfortunate workload, you could still profit from it.”

Nvidia, meanwhile, has told Oxide that it can implement asynchronous compute, and that the capability simply wasn’t fully enabled in drivers. Like Oxide, we’re going to wait and see how the situation develops. The analysis thread at Beyond3D makes it very clear that this is an incredibly complex question, and much of what Nvidia and Maxwell may or may not be doing is unclear.

Earlier, we mentioned that AMD’s approach to asynchronous computing superficially resembled Hyper-Threading. There’s another way in which that analogy may prove accurate: When Hyper-Threading debuted, many AMD fans asked why Team Red hadn’t copied the feature to boost performance on K7 and K8. AMD’s response at the time was that the K7 and K8 processors had much shorter pipelines and very different architectures, and were intrinsically less likely to benefit from Hyper-Threading as a result. The P4, in contrast, had a long pipeline and a relatively high stall rate. If one thread stalled, HT allowed another thread to continue executing, which boosted the chip’s overall performance.

GCN-style asynchronous computing is unlikely to boost Maxwell performance, in other words, because Maxwell isn’t really designed for these kinds of workloads. Whether Nvidia can work around that limitation (or implement something even faster) remains to be seen.

What does this mean for gamers and DX12?

There’s been a significant amount of confusion over what this difference in asynchronous compute means for gamers and DirectX 12 support. Despite what some sites have implied, DirectX 12 does not require any specific implementation of asynchronous compute. That aside, it currently seems that AMD’s ACEs could give the company a leg up in future DX12 performance. Whether Nvidia can perform a different type of optimization and gain similar benefits for itself is still unknown. Regarding the usefulness of asynchronous compute (by AMD’s definition), Oxide’s Kollock notes:

“First, though we are the first D3D12 title, I wouldn’t hold us up as the prime example of this feature. There are probably better demonstrations of it. This is a pretty complex topic and to fully understand it will require significant understanding of the particular GPU in question that only an IHV can provide. I certainly wouldn’t hold Ashes up as the premier example of this feature.”

Given that AMD hardware powers both the Xbox One and PS4 (and possibly the upcoming Nintendo NX), it’s absolutely reasonable to think that AMD’s version of asynchronous compute could be important to the future of the DX12 standard. Talk of returning already-purchased NV cards in favor of AMD hardware, however, is rather extreme. Game developers optimize for both architectures, and we expect that most will take the route that Oxide did with Ashes — if they can’t get acceptable performance from using asynchronous compute on Nvidia hardware, they simply won’t use it. Game developers are not going to throw Nvidia gamers under a bus and simply stop supporting Maxwell or Kepler GPUs.

Right now, the smart thing to do is wait and see how this plays out. I stand by Ashes of the Singularity as a solid early look at DX12 performance, but it’s one game, on early drivers, in a just-released OS. Its developers readily acknowledge that it should not be treated as the be-all, end-all of DX12 performance, and I agree with them. If you’re this concerned about how DX12 will evolve, wait another 6-12 months for more games, as well as AMD and Nvidia’s next-generation cards on 14/16nm before making a major purchase.

If AMD cards have an advantage in both hardware and upcoming title collaboration, as a recent post from AMD’s Robert Hallock stated, then we’ll find that out in the not-too-distant future. If Nvidia is able to introduce a type of asynchronous computing for its own hardware and largely match AMD’s advantage, we’ll see evidence of that, too. Either way, leaping to conclusions about which company will “win” the DX12 era is extremely premature. Those looking for additional details on the differences between asynchronous compute between AMD and Nvidia may find this post from Mahigan useful as well.  If you’re fundamentally confused about what we’re talking about, this B3D post sums up the problem with a very useful analogy.


New Apple TV Is Said to Focus on Games, Challenging Traditional Consoles

Apple stumbled into the games business almost by accident not long after it released the iPhone in 2007, igniting a new multibillion-dollar mobile games industry in the process.

Could a new Apple device — one linked to the television — shake up the market for game consoles?

The idea no longer seems ridiculous to many people in the games business.

Apple is expected to make games a primary selling point of its new Apple TV product, which is scheduled to be announced on Wednesday in San Francisco, according to people briefed on Apple’s plans who spoke on the condition of anonymity.

This is a big change from Apple’s previous versions of Apple TV, a device shaped like a hockey puck that for the first eight years of its existence has mainly been used to stream videos and music.

It’s tough to know how compelling the games on Apple TV will be until the company reveals the system this week. Yet many of the components necessary for a satisfying game experience will come with the device, the people say — including more power for better graphics, a new remote that could double as a controller and, perhaps most important, an app store to buy and download games.

“I think Apple’s going to create a big new category in gaming, one that others have tried and failed to create before,” said Jan Dawson, chief analyst at the technology research firm Jackdaw Research. “What the Apple TV has the potential to do is to bring casual gaming to the living room and make it a much more social activity.”

Most game executives and analysts see little chance that Apple will be able to woo hard-core fans of the leading high-end game consoles, the Xbox One from Microsoft and the PlayStation 4 from Sony — both of which will most likely still have better graphics than the new Apple TV. Gamers who fancy big-budget games like Call of Duty and Destiny will probably not be easily persuaded to switch systems.

That still leaves a large market of casual gamers whom Apple could target with the new Apple TV: people who find traditional game controllers complicated and who enjoy lighter, less epic forms of content.

The new product is expected to have a starting price around $150, according to the people briefed on the product. While that is more than double the price of the least expensive Apple TV on sale today, it is significantly less than the latest traditional game consoles, which range in price from $300 to $500, depending on the maker and configuration.

The business opportunity for Apple could be huge. The company now takes nearly a third of the revenue from sales of any games and other software purchased in its app stores. Total revenue from console games is expected to be more than $27 billion this year, which is more than a third of the $75 billion global games business, according to estimates by PricewaterhouseCoopers.

Nintendo most successfully tapped into the casual gamer market in the mid-2000s with its Wii console, which has an intuitive, motion-sensing game controller. The Wii attracted an older audience, stay-at-home parents and others who had never before played game consoles.

Nintendo struggled to hold on to casual gamers after Apple came out with the iPhone, which enabled mobile games that far exceeded anything available on phones before. Games also became the top category of apps for the iPad, which came out in 2010, helping to create huge hits like Clash of Clans and Hearthstone: Heroes of Warcraft. Booming sales of smartphones and tablets running Android, Google’s mobile operating system, further increased the numbers of people who regularly played games.

Apple’s success in games was unexpected for a company that always showed far more interest in creating products for other forms of creativity, including music, photos and films. The Mac was ignored by hard-core gamers for years in favor of Windows PCs. Initially, Apple didn’t even plan to allow games and other apps written for the iPhone after it released the device.

The makers of game consoles like the Xbox and PlayStation have sought broader audiences by adding video streaming services and other entertainment features. But those game consoles are overkill for many casual gamers.

“These are very big, clunky devices,” said Steve Perlman, an entrepreneur who worked at Apple in the 1980s and later founded WebTV, an early set-top box start-up that was acquired by Microsoft. “They’ve got fans, big power supplies.”

“Apple TV is really a modern computing device,” Mr. Perlman said.

Representatives from Microsoft, Nintendo, Sony’s United States video game division and Apple declined to comment.

James Gwertzman, chief executive of PlayFab — a company that helps developers run the online operations behind their games — is skeptical that Apple will be able to persuade developers of the latest console games to move their titles to Apple TV, which will probably not be powerful enough to play them. At the same time, the mobile game developers that Apple has already won over create very different kinds of games.

“It’s a totally different experience,” said Mr. Gwertzman. “Xbox and PlayStation have been very successful at building those living room experiences, and Apple and Android have been very good at ‘play a game on the bus’ experiences.”

The number of companies that make game-capable video streaming devices and are vying for a spot in living rooms is multiplying. Amazon Fire TV and Shield from Nvidia, for example, allow users to play games, but these systems don’t appear to have taken sales away from traditional consoles. Another inexpensive game console, Ouya, suffered from disappointing sales.

“Time will tell what the impact is,” said Matt Wuebbling, the general manager of Shield at Nvidia, referring to streaming devices like those from Nvidia and Apple. “I think it’s going after a different, more mainstream market.”

Trip Hawkins, the founder of Electronic Arts and 3DO, said the living room remained a confusing battleground that no technology company had yet conquered.

“No company has done more for the digital man-machine interface than Apple,” Mr. Hawkins said. “They’ve warmed up to games and are a worthy candidate to win the family room in the next decade, though the competition and inertia are epic.”
