If you thought your APM was high, wait until you see what Google's DeepMind AI can do. The project, which started in 2016, has seen the AI go through many iterations as it learned from live games against other AIs, from matches with human players, and from viewing replays of games to judge their outcomes and the best strategies. The latest development is said to be an impressive one, with Google set to show off the results of its efforts by the end of this week.
Starcraft 2 is one of the more complicated games for an AI to master. Not only is it fast-paced, but its blend of macro- and micromanagement means that an AI needs to execute long-term and short-term plans simultaneously, something the best human players in the world do far better than any traditionally crafted AI. However, DeepMind has developed a few strategies of its own, progressing from worker rushes to cannon rushes, and now to a new approach which we're told can already beat the game's Insane-difficulty AI quite handily.
Its defense and aggression are both improving, according to PC Gamer's report, and we expect the latest iteration to pose a credible challenge to human players when it's revealed later this week.
DeepMind's Starcraft 2 agent is just one of the AI projects Google's various departments are working on. Perhaps the most famous is AlphaGo, which mastered Go, defeating world champions as well as other top-level AI players, and even teaching both groups new strategies. Even when employing them, though, those same human and AI players still couldn't best AlphaGo regularly. Its successor, AlphaZero, went on to master Chess, Shogi, and Go through self-play alone.
DeepMind will no doubt do the same with Starcraft 2, given enough time.