If you somehow haven’t heard about the latest smartphone game, Pokémon Go (Fig. 1), then you’ve probably been distracted by some other computer game that has garnered all of your time. The game has shown up in the news a fair bit owing to its popularity, as well as to where players are using it and what has happened to some of those gamers.
Before we dive into the differences between Pokémon Go and augmented reality, I wanted to provide a brief background on the game for those who have not installed it on their smartphone.
Pokémon Go is actually a form of augmented reality. It requires a smartphone with a camera and GPS, which includes most iPhones and Android phones. The object is to catch and collect virtual Pokémon critters. In a conventional computer game, you would wander around a virtual world and click on the critters with a mouse to catch them. With Pokémon Go, the critters are shown on the smartphone screen.
The difference is that the background comes from the smartphone’s camera. The placement of the critters is based on the player’s location, as determined by the smartphone’s GPS. The system also uses other positioning information, such as the phone’s compass and motion sensors, to provide a smoother view of virtual objects overlaid on the real-world view.
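One way to get that smoother view is to filter the raw position fixes before placing virtual objects, since GPS readings jitter from one sample to the next. The sketch below is purely illustrative, not code from any actual AR application; the blending factor is an assumed value.

```python
# Illustrative sketch: smooth noisy GPS fixes with an exponential moving
# average so overlaid virtual objects don't jitter on screen.
# ALPHA is an assumed tuning value, not from any real implementation.

ALPHA = 0.2  # weight of each new fix; lower = smoother but laggier

def smooth_fixes(fixes, alpha=ALPHA):
    """Return exponentially smoothed (lat, lon) positions."""
    smoothed = []
    lat, lon = fixes[0]
    for new_lat, new_lon in fixes:
        # Blend each new fix into the running estimate.
        lat += alpha * (new_lat - lat)
        lon += alpha * (new_lon - lon)
        smoothed.append((lat, lon))
    return smoothed

# Jittery fixes around one spot settle toward a stable estimate.
fixes = [(40.7128, -74.0060), (40.7131, -74.0058),
         (40.7127, -74.0061), (40.7130, -74.0059)]
print(smooth_fixes(fixes)[-1])
```

Real systems typically fuse GPS with accelerometer and gyroscope data in a more sophisticated filter, but the idea is the same: trade a little responsiveness for a steadier overlay.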
This form of augmented reality is also the source of some of the problems players have encountered with the game. Tracking and catching the critters takes concentration on the screen. Unfortunately, the camera-and-screen combination provides a more restrictive view of the real world than a person normally has. This can result in a person walking into other objects (including people), or into areas they would not normally enter. Walking off a cliff or into the path of a moving vehicle can be fatal.
Pokémon Go is not the first or only augmented reality application that runs on a smartphone, but it is probably the most popular at this point. There have been other tools, like Google Sky, that have used some of the hardware that Pokémon Go uses to present their own augmented reality solution.
Google Sky is an application for star gazing. It shows a star map whose view is based on the location of the user and the orientation of the camera with respect to the sky. The application tracks the movement of the smartphone, adjusting the map similar to how Pokémon Go does, but without overlaying the camera input.
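The core of that kind of orientation-driven display can be reduced to a simple mapping: given the phone’s compass heading and how far the camera is tilted above the horizon, pick which patch of a gridded star map to show. The sketch below is a hypothetical simplification, not Google Sky’s actual code, and the 30-degree patch size is an assumed value.

```python
# Hypothetical sketch (not Google Sky's actual code): map device
# orientation to a cell of a gridded star map.

def sky_patch(heading_deg, pitch_deg, patch_size_deg=30):
    """Return the (row, col) cell of the star map to display.

    heading_deg: compass heading, 0-360 degrees (0 = north)
    pitch_deg:   camera elevation above the horizon, 0-90 degrees
    """
    col = int(heading_deg % 360) // patch_size_deg
    row = int(max(0, min(89, pitch_deg))) // patch_size_deg
    return row, col

# Pointing roughly east and 45 degrees up:
print(sky_patch(90, 45))  # → (1, 3)
```

As the user pans the phone, the heading and pitch change continuously and the map view slides accordingly; a real implementation would interpolate between patches rather than jump cell to cell.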
Many may have thought that augmented reality, and its complement, virtual reality, requires special hardware like Google Glass or Microsoft HoloLens (Fig. 2). These glasses have the advantage of hands-free use and a wider view of the real world.
Unfortunately, wearing the viewing device does not necessarily mean a safer environment. Safety has more to do with the level of concentration needed to view the augmented information; it is simply a question of how distracted the viewer is. This is not limited to augmented reality, as anyone who has ever walked into a signpost while talking to a friend next to them knows.
As noted, augmented reality is nothing new, and there are many more common examples of its use. For example, my Ford Fusion’s backup camera is an example of augmented reality (Fig. 3). It overlays two sets of lines on the usual backup camera view. One set is fixed and color-coded to provide an estimate of how close an object in the view is: red means too close, and green means far enough away that the driver can stop easily if necessary. The curved white lines show the estimated path of the vehicle based on the orientation of the front wheels.
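The color-coded portion of such an overlay amounts to a simple mapping from estimated distance to a warning color. The sketch below is hypothetical; the thresholds are illustrative assumptions, not Ford’s actual calibration.

```python
# Hypothetical sketch of a backup-camera distance cue. The thresholds
# (in meters) are illustrative assumptions, not a real calibration.

def guide_color(distance_m):
    """Return the overlay color for an object at the given distance."""
    if distance_m < 0.5:
        return "red"     # too close: stop now
    elif distance_m < 1.5:
        return "yellow"  # caution zone
    else:
        return "green"   # enough room to stop easily

print(guide_color(0.3), guide_color(1.0), guide_color(3.0))
# → red yellow green
```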
There are a couple of things to keep in mind about augmented reality. First, it does not have to be absolutely accurate to be useful. In the backup camera view, the lines show an approximate path and tend to cover an area larger than the vehicle itself, which errs on the side of safety. For Pokémon Go, the system only needs to be accurate enough to provide local positioning relative to other players using their own smartphones. This lets them see the same critters in approximately the same location, which is why players often congregate in one area trying to catch the same critters.
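One way that kind of "good enough" shared positioning can work is to snap GPS coordinates onto a coarse grid, so that players standing near each other resolve to the same cell and are served the same virtual objects. This is an illustrative sketch, not Niantic’s actual scheme, and the cell size is an assumed value.

```python
# Illustrative sketch (not Pokémon Go's actual scheme): quantize GPS
# coordinates into grid cells so nearby players resolve to the same
# cell and therefore see the same virtual objects.

CELL_DEG = 0.0005  # roughly 50 m of latitude; an assumed cell size

def spawn_cell(lat, lon, cell=CELL_DEG):
    """Snap a coordinate to the grid cell used to place virtual objects."""
    return round(lat / cell), round(lon / cell)

# Two players a few meters apart land in the same cell:
a = spawn_cell(40.71280, -74.00600)
b = spawn_cell(40.71282, -74.00603)
print(a == b)  # → True
```

The grid does not need to be fine-grained; it only needs to be consistent across devices, which is why approximate accuracy is sufficient.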
A host of applications becomes possible as devices like smartphones gain sufficient peripherals and computational power. For example, positioning the augmented-reality information does not have to be based on something like GPS. Image-recognition software can determine the position of objects in view of the smartphone camera, so data, such as the location of parts on an engine or other device, can be overlaid on the image.
Imagine pointing your smartphone’s camera at the latest oscilloscope and asking how to hook up a new probe. In that scenario, the smartphone might handle both the recognition of the oscilloscope and the natural-language parsing of the question, or this work might be shipped up to the cloud, as is the case for many new interactive devices like Amazon’s Echo. The result might be visual feedback along with an audio explanation.
Augmented reality will only become more pervasive as time goes on. Systems like 360-deg. surround-view cameras for automotive ADAS (advanced driver-assistance systems) offer visual feedback similar to that of GPS navigation systems.