
Can AR and VR Change the Way Debugging is Done?

Feb. 21, 2018
Could wearing AR and VR glasses improve the way developers debug code?

Developing and debugging programs works best when one has a lot of screen real estate to work with. A single page may show the code being edited, but it is often useful to have many other views open at the same time. This is especially true when debugging, as there are lots of details a developer may want to examine as the program progresses or hits a breakpoint. Likewise, real-time trace information can be voluminous.

These days, I use two widescreen displays. There is a challenge in viewing all the information at one time, but that is secondary to having it available without navigating away and potentially losing sight of an active window, such as the command-line area or the source code where a breakpoint has occurred. Part of my problem is that I have plain display screens, not touchscreens, so changing focus means moving a mouse around, not just my eyeballs. That tends to be why the toolbar in the primary window carries the debugging controls, though that is more for convenience than a way of taking advantage of the screen real estate.

Whether those toolbar controls can be replicated is often a function of the tool being used. Some debugging tools are very customizable, while others are quite rigid. The rigid ones are often developed and used by people who work with, or started out on, a single display.

What I have yet to see are production debugging and development tools that take advantage of augmented reality (AR) and virtual reality (VR) platforms. The ability to surround oneself with a larger array of information without having to use real hardware is advantageous, especially if one is not always at a fixed location. Having a laptop and an AR/VR headset would allow a developer to examine lots of information anywhere.
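To make the idea concrete, here is a minimal sketch, not tied to any real product, of how a debugger front-end might arrange its views in virtual space: a handful of panels spaced along a cylindrical arc around the developer, so that source, locals, trace, and console are each only a head turn away. The Panel type and layoutPanels function are hypothetical illustrations, written here in TypeScript.

```typescript
// Hypothetical sketch: lay out N virtual debugger panels on a cylindrical
// arc around the developer. Assumes a Y-up coordinate system with -z as
// "forward," as in WebXR/three.js conventions.

interface Panel {
  name: string;   // e.g. "source", "locals", "trace"
  x: number;      // meters, to the developer's right
  y: number;      // meters, above the floor
  z: number;      // meters (negative = in front of the developer)
  yawDeg: number; // rotation about Y so the panel faces the viewer
}

function layoutPanels(
  names: string[],
  radius = 1.2,    // arm's-length-plus viewing distance
  arcDeg = 150,    // comfortable head-turn range
  eyeHeight = 1.6, // approximate seated/standing eye height
): Panel[] {
  const step = names.length > 1 ? arcDeg / (names.length - 1) : 0;
  return names.map((name, i) => {
    const angleDeg = -arcDeg / 2 + i * step; // spread symmetrically
    const a = (angleDeg * Math.PI) / 180;
    return {
      name,
      x: radius * Math.sin(a),
      y: eyeHeight,
      z: -radius * Math.cos(a), // place panels in front of the viewer
      yawDeg: -angleDeg,        // turn each panel inward toward the viewer
    };
  });
}

// Example: five debugger views within comfortable head-turn range.
console.log(layoutPanels(["trace", "locals", "source", "console", "memory"]));
```

The point of a layout like this is that none of the panels costs physical hardware; adding a memory view or a second trace window is just another entry in the list.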

The reason I bring AR into the mix is that embedded development often involves hardware that may need to be examined in real time as well.

The challenge is controlling the system as well as handling data entry, since even graphical programming systems like National Instruments' LabVIEW require a keyboard at some point, if only to enter text such as the name of an object. There may be less typing involved, but it is still a requirement. This type of input tends to be where most AR/VR systems fall down. They are great for point-and-select and other gestures, but text input is something else. I can touch-type and could actually use a keyboard without seeing it, but most developers would require at least a simulated keyboard with good feedback. There are many ways to do this, but I won't get into them here.
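As a rough illustration of that input split, the hedged sketch below routes point-and-select gestures, voice dictation, and simulated-keyboard taps to a focused text field. Every type and function name here (InputEvent, TextField, handleInput) is hypothetical; no real AR toolkit API is assumed.

```typescript
// Hypothetical sketch of an AR/VR input dispatcher: gestures handle focus,
// while text arrives either as dictated tokens or simulated key taps.

type InputEvent =
  | { kind: "select"; target: string } // gaze/pinch on a widget
  | { kind: "dictation"; text: string } // speech-to-text result
  | { kind: "keySim"; key: string };    // simulated-keyboard tap

interface TextField {
  id: string;
  value: string;
}

function handleInput(ev: InputEvent, focus: TextField | null): TextField | null {
  switch (ev.kind) {
    case "select":
      // Point-and-select works naturally in AR/VR: just move the focus.
      return { id: ev.target, value: "" };
    case "dictation":
      // Voice fills in whole tokens, e.g. naming an object in a diagram.
      return focus ? { ...focus, value: focus.value + ev.text } : focus;
    case "keySim":
      // Simulated keys are where good visual/haptic feedback matters most.
      return focus ? { ...focus, value: focus.value + ev.key } : focus;
  }
}
```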

AR and VR are really in their infancy, and applications such as development and debugging are just one of the many possibilities. The potential advantages are significant, from the ability to present more information to providing help, training tips, and advanced driver-assistance system (ADAS)-style support for developers, especially when combined with artificial intelligence (AI) and voice-activated systems like Amazon's Alexa.

For example, a developer working with a scope, logic analyzer, or spectrum analyzer may have a question about how to make a connection or set a trigger for a particular input. Asking the AI system for help might result in the AR system highlighting the relevant connections or controls on the device. Alternatively, it might present virtual controls or examples.
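A minimal sketch of how that help flow might work, assuming nothing about any real instrument or AI service: a voice query is matched against a small table of instrument topics, and the winning entry hands the AR layer a label and a front-panel location to highlight. The scopeHelp table and answerQuery function are hypothetical placeholders.

```typescript
// Hypothetical sketch: map a spoken question to an AR overlay hint that
// highlights a spot on an instrument's front panel.

interface OverlayHint {
  label: string;             // text shown next to the highlight
  panelXY: [number, number]; // normalized front-panel coordinates (0..1)
}

const scopeHelp: Record<string, OverlayHint> = {
  trigger: { label: "Trigger level knob", panelXY: [0.85, 0.4] },
  probe: { label: "Channel 1 input (use 10x probe)", panelXY: [0.2, 0.9] },
  ground: { label: "Ground clip lug", panelXY: [0.1, 0.95] },
};

function answerQuery(query: string): OverlayHint | undefined {
  const topic = Object.keys(scopeHelp).find((k) =>
    query.toLowerCase().includes(k),
  );
  return topic ? scopeHelp[topic] : undefined;
}

// "How do I set a trigger for channel 1?" -> highlight the trigger knob.
console.log(answerQuery("How do I set a trigger for channel 1?"));
```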

Much of this sounds like science fiction, in the sense that it is something we might merely wish for, yet it is quite possible now. Most of what I have presented here already exists in various forms in other AR/VR applications.

While I was originally thinking about using AR/VR for software development and debugging, the approach is applicable to all aspects of design, implementation, and support for almost any engineering endeavor.

This discussion also reminds me of the command line/editor/IDE discussions (“wars”) of the past where many developers would not be caught dead using an IDE. I have also lived through debugging programs using software listings and program dumps on reams of line printer paper. I much prefer an IDE with smart editing, a customizable data display, and source code debugging to being limited to the old command line interface and raw data dumps.

AR/VR hardware will come into play in our development environments as it becomes ubiquitous and less expensive. The big question is whether tool developers will take its advantages into account while addressing its deficiencies.
