Google's DeepMind AI To Take On Humans At StarCraft


Jonathan O'Callaghan

Senior Staff Writer


Not content with beating the world’s greatest players at the ancient Chinese game of Go, Google is now training its artificial intelligence (AI) minds on the hugely popular game StarCraft II.

DeepMind, the UK branch of Google that develops its AI technology, says it will begin training its AI, presumably with the goal of eventually taking on the world’s best StarCraft II players. The AI will be able to do anything a human player can do in the game.


Mastering Go with its AlphaGo program was impressive, but StarCraft II represents a whole new challenge. The real-time strategy game – which has a huge eSports audience – involves building up resources, bases, and troops to defend against and attack opponents.

There is an all but infinite number of possible configurations in each match. New Scientist reports the figure at 10^1,685, compared with 10^170 for Go. That’s a very, very big number, highlighting the increased complexity of the game.
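To get a feel for how far apart those two figures are, the comparison can be done in log space, since numbers that large can't be grasped directly. A minimal sketch (the exponents are taken from the New Scientist estimates quoted above):

```python
# The reported state-space sizes are exponents of 10, so compare them in
# log space rather than trying to picture the full numbers.
starcraft_exp = 1685   # ~10^1685 possible configurations in StarCraft II
go_exp = 170           # ~10^170 for Go

# StarCraft II's state space is larger by a factor of 10^(1685 - 170).
ratio_exp = starcraft_exp - go_exp
print(f"StarCraft II's state space is ~10^{ratio_exp} times larger than Go's")
```

In other words, the gap between the two games is itself a number with more than 1,500 digits.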

“For example, while the objective of the game is to beat the opponent, the player must also carry out and balance a number of sub-goals, such as gathering resources or building structures,” DeepMind noted in a blog post.

“In addition, a game can take from a few minutes to one hour to complete, meaning actions taken early in the game may not pay off for a long time. Finally, the map is only partially observed, meaning agents must use a combination of memory and planning to succeed.”

Some of the challenges the AI will face in StarCraft II. DeepMind/Blizzard

To beat human players, just as in Go, DeepMind's AI will need to think like a human rather than a computer. The AI currently built into the game is paltry compared with human players of any skill level, and beating the top players will require intuition and imagination – something DeepMind's systems are getting better at.

Partnering with Blizzard Entertainment, the developer and publisher of StarCraft, DeepMind will use half a million in-game replays to teach its AI how to play. DeepMind has already started feeding this data into its learning software and begun trying to master the game.

Early results show there is still a long way to go. David Churchill, a professor at Memorial University of Newfoundland who advised DeepMind on its StarCraft tools, told Wired it might be five years before the computer wins.

Nonetheless, it’s a worthy goal. Aside from just being fun, getting AI to master a complex game like this will be an important step in machine learning.

