Measurements Put WLANs To The Test

Sept. 1, 2003
With WLANs Becoming The Network Norm, Proven Testing Must Emerge To Depict The Real-Life Performance Of 802.11 Devices.

Within business IT infrastructure, wireless is becoming an accepted and often expected mode of data communications. A myriad of hardware vendors and technologies exists, however. To get maximum performance from a deployed WLAN system, a solid testing procedure is therefore essential. For the customer, such a procedure should summarize the potential performance of the device under test (DUT). It also must take into account the widely varying fading conditions that exist in the workplace.

This article is the third of a three-part series that focuses on WLAN test methodologies. The first article addressed the value of WLAN performance testing. The second feature provided information on the proper benchmarking of WLAN devices. This last article will walk through an example test setup. It also will delve into test methodologies and review some typical results.

The Centaur Lab at the University of Georgia conducted the tests discussed in this article. It measured frame loss and throughput on a representative 802.11a WLAN access point (AP). In doing so, the following equipment was used:

  • A Spirent Communications SR5500 wireless channel emulator
  • A Spirent Communications SR5500 6-GHz option (block converter unit)
  • A Spirent SMB-600 chassis with a LAN-3101A card
  • A representative 802.11a access point
  • A representative 802.11a network interface card (NIC)
  • A 5.0-to-6.0-GHz circulator
  • Three Murata-to-SMA-type shielded coax cables
  • Two N-type-to-N-type shielded coax cables
  • Three N-type-to-SMA-type shielded coax cables
  • Two N-type 10-dB attenuators
  • One N-type 2-dB attenuator
  • A Compaq Evo N610c laptop

For a valid test to be conducted, the designer must ensure the proper setup of the test environment. Figure 1 shows a sample configuration for the above-mentioned test. The SR5500 is used to simulate the radio-frequency (RF) environment in a controlled and repeatable fashion. The fading characteristics vary from a clean channel or baseline to a heavily faded environment. In this way, the true performance of the WLAN device under test will be revealed. The SMB-600 acts as the Ethernet generator/analyzer. It will produce and verify the test traffic that will cross the network.

During the test-setup process, the AP and wireless NIC must be properly shielded. The data will then pass through the cabled network via the wireless channel emulator instead of over the air. Generally, shielding is accomplished via the cabled RF environment. If there is RF leakage, however, an RF shielded enclosure can be used.

Once the test setup is complete, the wireless channel emulator must be configured for both the correct test frequency and the desired fade-model parameters. This configuration must be consistent with the frequency or channel setting on the AP. Table 1 shows the channel-number-to-frequency relationship that should be used when configuring the wireless channel emulator for 802.11a testing. If the AP is set to channel 36, for example, the corresponding frequency of 5180 MHz must be set on the SR5500. The velocity of the DUT also must be configured. For these tests, a velocity of 1.5 km/hr was selected. This speed emulates a static to slow-moving user. The tests were conducted with an approximate −15-dBm input signal into the SR5500 6-GHz option. For the AP and NIC that were used in the test configuration, an output signal level of roughly −50 dBm was supplied from the SR5500.
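Table 1's relationship follows the standard 5-GHz center-frequency mapping, which can be expressed as frequency (MHz) = 5000 + 5 × channel number. A minimal Python sketch of that mapping:

```python
# 802.11a channel-to-center-frequency mapping (5-GHz band).
def channel_to_mhz(channel: int) -> int:
    """Center frequency in MHz for a 5-GHz 802.11a channel number."""
    return 5000 + 5 * channel

# Channel 36 -> 5180 MHz, matching the example in the text.
assert channel_to_mhz(36) == 5180
```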

A personal computer (PC) must be configured to route test traffic between a wired and a wireless interface. To serve this purpose, a laptop must be reasonably fast and must provide both a wired Ethernet connection and a CardBus interface for the wireless NIC under test. In this test, a Compaq laptop met these requirements. It also supplied enough processing power to ensure that any performance issues observed during the baseline run stemmed from the wireless equipment rather than from the laptop's own forwarding performance.

Before the PC can be used as a router between the wired and wireless interfaces, however, routing must be enabled. In Windows 2000 or XP, this means setting the IPEnableRouter registry value to 1. Routing will then be enabled between the wired and wireless interfaces.
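For reference, here is a minimal sketch of making that change programmatically with Python's standard winreg module (named _winreg in the Python versions contemporary with Windows 2000/XP); administrative rights and a reboot are assumed:

```python
# Set the IPEnableRouter value so Windows forwards IP traffic between
# its wired and wireless interfaces. Requires administrator rights.
import winreg

KEY_PATH = r"SYSTEM\CurrentControlSet\Services\Tcpip\Parameters"

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                    winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "IPEnableRouter", 0, winreg.REG_DWORD, 1)

# A reboot is required before the routing change takes effect.
```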

Remember that the test-system setup does not end with the configuration of the test environment, the applicable networking information, and proper shielding. It is critical that the DUT can communicate only via the cabled "wireless" configuration through the wireless channel emulator. A simple check is to disable one of the channels with the SR5500's TestKit graphical user interface (GUI) and verify that the communication link is interrupted. If it is, the only valid communication path is through the wireless channel emulator.

When the transmission path from the AP to the NIC is blocked, the laptop should disassociate from the AP almost instantaneously. The client should no longer be receiving beacon messages from the AP. When the transmission from the NIC to the AP is blocked, the client should disassociate only after a measurable period of time (on the order of a few seconds). In this scenario, the client is still receiving beacons from the AP, but the AP is not receiving any acknowledgements from the client. If either of these conditions is not met by the test setup, the setup is likely to have RF leakage problems. Once the test setup has been configured, the performance testing can begin.

The idea behind the analysis is to conduct three separate data runs, each with a different fade model that simulates a different environment. The first model simulates a clean channel with no loss, no delay, and no fading modulation; it serves as the performance baseline. The second and third runs use exponentially decaying Rayleigh fade models based on the IEEE contribution written by Naftali Chayat. These runs reveal how the performance of the WLAN device changes in a faded environment. The goal of these tests is to show the difference in packet loss, throughput, and latency under the different fading models.

Table 2 depicts the first exponentially decaying Rayleigh fade model, which simulated an RMS delay spread of 25 ns and was used for the second run. Table 3 shows the second exponentially decaying Rayleigh fade model, which simulated an RMS delay spread of 50 ns and was used for the third run. For more information on these fade models, refer to IEEE document 802.11-98/156r2.
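For readers who want to reconstruct such a profile themselves, the model assigns Rayleigh-faded taps whose average power decays exponentially with delay. A minimal sketch, with the 10-ns tap spacing and the tap count chosen illustratively rather than taken from the tables:

```python
# Exponentially decaying power-delay profile in the spirit of
# IEEE 802.11-98/156r2 (Chayat). Tap spacing and span are assumptions.
import numpy as np

def decaying_profile(trms_ns: float, spacing_ns: float = 10.0):
    """Tap delays (ns) and normalized average tap powers (linear)."""
    n_taps = int(np.ceil(10.0 * trms_ns / spacing_ns)) + 1  # span ~10x Trms
    delays = np.arange(n_taps) * spacing_ns
    powers = np.exp(-delays / trms_ns)       # exponential decay
    return delays, powers / powers.sum()     # total power normalized to 1

# 25-ns RMS delay spread, as in the second run; print tap powers in dB.
delays, powers = decaying_profile(25.0)
print(np.round(10 * np.log10(powers), 1))
```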

To conduct the performance tests, the Spirent SmartBits 600 chassis ran Spirent SmartFlow v2.20 software. This software supports many different types of tests, including frame loss, throughput, and latency. The basic test setup was to run a battery of tests using 120-second flows at fixed frame sizes of 64, 512, and 1518 B. These RFC 2544-recommended frame sizes provide a good overall representation of a DUT's performance for small, medium, and large frames. Because all current 802.11 networks share a half-duplex transmission channel, every test was performed with unidirectional test traffic.

Note that when Ethernet frames are transmitted over an 802.11 network, the frame size increases by 16 B. In the interest of simplicity, all frame-size references within this document refer to Ethernet frame sizes.

The IEEE 802.11a standard supports signaling rates of 6, 9, 12, 18, 24, 36, 48, and 54 Mbps. Due to time constraints, the exhaustive testing of all link rates was not possible. The tests were only run with a fixed signaling rate of 54 Mbps. This configuration tested the highest signaling rate possible. It also avoided any performance variations that were caused by the DUT's rate-shifting algorithm.

Fixing the data rate during testing is recommended, because rate shifting in the device can produce erroneous and unrepeatable results. The capabilities of the rate-shifting algorithm can also be tested, although such testing is not addressed in this article. Follow the manufacturer's directions to configure a fixed signaling rate for the devices under test.

After the baseline test run has been completed, the fade-model parameters need to be modified prior to conducting the second and third test runs. The baseline run establishes the performance capability of the DUT under ideal conditions. By comparing its results against those obtained under faded conditions, it is possible to ascertain how well the device performs in different faded environments.

After the successful completion of all three test runs, the data must be analyzed. Many details and factors must be considered during the data analysis for frame loss, throughput, and latency results.

Frame Loss
Of the tests, frame loss is probably the easiest one to understand. The formula can be summed up very simply:

Frame Loss = Number of Frames Transmitted − Number of Frames Received

If the number of frames sent equals the number of frames received, there was obviously no loss. Although this formula is very simple, interpreting the frame-loss characteristics of a test can vary based on several factors. The first factor is the size of the transmitted frame and the load at which it is offered. With SmartFlow, the user can specify any frame size to send across the test network, at any given percentage of the device's line rate.
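As a concrete illustration, the loss for one flow is typically reported as the RFC 2544-style percentage. A minimal sketch (the function name here is illustrative, not SmartFlow's API):

```python
def frame_loss_rate(tx_frames: int, rx_frames: int) -> float:
    """Frame loss rate in percent: 100 * (transmitted - received) / transmitted."""
    return 100.0 * (tx_frames - rx_frames) / tx_frames

# 1,000,000 frames offered, 985,000 received -> 1.5% loss.
print(frame_loss_rate(1_000_000, 985_000))
```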

Unlike the consumer switches available at mass-market computer retail stores, wireless devices can drop a large number of frames at loads as low as half of the theoretical maximum rate. Even some enterprise-level devices exhibit this behavior, which is most easily observed with smaller frame sizes.

Perhaps the most obvious factor that can contribute to frame loss in WLAN networks is network interference (i.e., RF fading). Such interference could prevent a frame or acknowledgement from arriving at its destination. The DUT will then attempt to retransmit the frame, thereby increasing both latency and the transmission queue.

Throughput
Per RFC 2544, the throughput test looks for the maximum rate at which the device under test can forward frames without suffering frame loss. In addition, SmartFlow can report throughput at an amount of frame loss that the user has specified as acceptable.

When a designer is analyzing the data from a throughput test, he or she has to consider a few issues. Throughput can vary greatly depending on a number of different factors. As more complex traffic patterns are introduced into a test, the throughput will typically decrease. These patterns could include multiple frame sizes at random intervals and "bursty" traffic patterns. Meanwhile, framing overhead makes the amount of usable bandwidth decrease as the frame size shrinks. Due to the 802.11 protocol and the need to acknowledge every transmitted packet, WLAN devices are particularly susceptible to this overhead. As a result, the throughput for any 802.11 device is proportional to the frame size that is used to perform the testing.
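To see why small frames suffer so badly, consider a back-of-the-envelope airtime model for 802.11a at 54 Mbps. The timing constants below come from the 802.11a OFDM PHY; the average backoff term is a simplification, so the printed figures are rough upper bounds rather than measured results:

```python
import math

# 802.11a OFDM timing (microseconds expressed in seconds).
SLOT, SIFS = 9e-6, 16e-6
DIFS = SIFS + 2 * SLOT        # 34 us
PLCP = 20e-6                  # 16-us preamble + 4-us SIGNAL field
SYM = 4e-6                    # OFDM symbol duration
CW_MIN = 15                   # average backoff ~ CW_MIN/2 slots

def ppdu_airtime(mac_bytes: int, rate_mbps: float) -> float:
    """Airtime of one PPDU: PLCP header plus data symbols."""
    bits = 16 + 8 * mac_bytes + 6                     # SERVICE + data + tail
    return PLCP + math.ceil(bits / (rate_mbps * SYM * 1e6)) * SYM

def per_frame_time(eth_bytes: int, rate_mbps: float = 54.0) -> float:
    """DIFS + average backoff + data frame + SIFS + ACK (at 24 Mbps)."""
    mac_bytes = eth_bytes + 16        # 802.11 framing adds 16 B (see above)
    backoff = (CW_MIN / 2) * SLOT
    ack = ppdu_airtime(14, 24.0)
    return DIFS + backoff + ppdu_airtime(mac_bytes, rate_mbps) + SIFS + ack

for size in (64, 512, 1518):
    t = per_frame_time(size)
    print(f"{size:5d}-B frames: ~{8 * size / t / 1e6:4.1f} Mbps max goodput")
```

Running this sketch yields roughly 3 Mbps of goodput for 64-B frames versus about 31 Mbps for 1518-B frames, which is exactly the proportionality between throughput and frame size described above.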

Latency
Like the frame-loss formula, the formula for latency is easy to understand at first glance:

Latency = Receive Timestamp − Transmit Timestamp

To measure latency, SmartFlow includes a timestamp in every generated frame. When the frame is received on another port, the included transmit timestamp is subtracted from the timestamp taken at reception, and the latency is recorded. SmartFlow keeps a running total of the average latency for the test's duration. It also records the minimum and maximum packet latency observed during the test. The test equipment, however, can only report latency data on the frames that successfully made it from the source to the destination. If no frames pass during the test, no latency data will be reported.
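A minimal sketch of that bookkeeping, mirroring only the arithmetic described above (SmartFlow's internals are not public):

```python
def latency_stats(samples):
    """samples: (tx_timestamp, rx_timestamp) pairs for frames that arrived.

    Returns None when no frames passed, since no latency can be reported.
    """
    if not samples:
        return None
    latencies = [rx - tx for tx, rx in samples]
    return {
        "min": min(latencies),
        "max": max(latencies),
        "avg": sum(latencies) / len(latencies),
    }
```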

Obviously, latency provides information about the transmission delay across a network. Yet SmartFlow's latency results also can be used to determine how real-time services will be affected by the transmission medium. A large variance in latency can wreak havoc with time-sensitive data, such as streaming media and voice over IP (VoIP).
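There are several ways to quantify that variance; one simple example is the mean absolute difference between consecutive latency samples, similar in spirit to the interarrival jitter of RFC 3550 (this is an illustrative metric, not necessarily the one SmartFlow reports):

```python
def mean_jitter(latencies):
    """Average absolute change between consecutive latency samples."""
    diffs = [abs(b - a) for a, b in zip(latencies, latencies[1:])]
    return sum(diffs) / len(diffs) if diffs else 0.0
```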

Unfortunately, the methodology used for these tests requires a PC to convert the test frames from the wireless medium back into Ethernet. As a result, all latency measurements include the time taken to forward data across both the wireless network and the PC router. The PC can add a large variance in latency because of its architecture and multitasking nature. Using the same PC for all test runs helps to minimize this influence, although it is not an optimal solution.

All of these tests—frame loss, throughput, and latency—can be performed on any of the existing WLAN technologies (802.11a, 802.11b, or 802.11g). Be sure to consider the differences in the type of modulation that was used for the data. Also, take into account the frequency at which the test was conducted.

Getting The Results
To generate representative results, the tests were conducted using an 802.11a AP configured for a 54-Mbps signaling link. The frame-loss plot shows three sets of curves (Fig. 2). The frame-loss tests were conducted at three different frame sizes (1518-, 512-, and 64-B frames) for the three environmental scenarios: baseline (no fading), a 25-ns delay-spread fade model, and a 50-ns delay-spread fade model. On the X axis, the plot shows that the tests were conducted for varying traffic load rates on the 54-Mbps link, ranging from 1 to 34 Mbps in 3-Mbps steps.

Each test was configured for a particular fade model, using traffic at a specified load for a selected frame size. With the baseline fade model, for example, the three frame sizes (1518, 512, and 64 B) were run in series at data rates ranging from 1 to 34 Mbps. Once that entire run was completed, the fade model was changed. The test was then repeated with the 25-ns fade model and again with the 50-ns fade model, as sketched below.
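In pseudocode form, the run order amounts to three nested sweeps; run_flow() below is a hypothetical stand-in for one 120-second SmartFlow flow at a single operating point:

```python
FADE_MODELS = ["baseline", "25ns_rayleigh", "50ns_rayleigh"]
FRAME_SIZES = [1518, 512, 64]        # bytes
LOADS_MBPS = range(1, 35, 3)         # 1, 4, 7, ..., 34 Mbps

def run_flow(fade_model: str, frame_size: int, load_mbps: int) -> None:
    """Hypothetical placeholder for one 120-second SmartFlow run."""
    print(f"{fade_model}: {frame_size}-B frames at {load_mbps} Mbps")

for model in FADE_MODELS:            # fade model changed between full runs
    for size in FRAME_SIZES:         # frame sizes run in series
        for load in LOADS_MBPS:      # load swept in 3-Mbps steps
            run_flow(model, size, load)
```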

The recorded results produced some expected trends. As the traffic load rate increases, for example, the frame loss also increases. This holds true across all of the fading scenarios. Similarly, as the RMS delay spread increases, so does the frame loss experienced at each frame size: at a fixed load rate, the loss for a particular frame size rises with the delay spread.

Figure 3 shows the throughput-results plot. Throughput is defined here as the maximum rate at which there is 0% frame loss over the link. The goal is once again to determine that rate for the three frame sizes over the same three fade models. The load on the link is increased until a non-zero frame loss is obtained; the algorithm then enters a binary search mode to converge on the actual throughput value, as sketched below.
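A minimal sketch of that search procedure, with a fake device model standing in for the real 120-second flows:

```python
def measure_loss(load_mbps: float) -> float:
    """Stand-in for one timed flow: this fake DUT starts losing
    frames above 30 Mbps, giving the search something to find."""
    return 0.0 if load_mbps <= 30.0 else 0.02

def throughput_search(max_rate: float, resolution: float = 0.1) -> float:
    """Binary-search the highest zero-loss load, RFC 2544 style."""
    lo, hi, best = 0.0, max_rate, 0.0
    while hi - lo > resolution:
        mid = (lo + hi) / 2
        if measure_loss(mid) == 0.0:
            best, lo = mid, mid      # no loss: try a higher load
        else:
            hi = mid                 # loss seen: back off
    return best

print(throughput_search(54.0))       # converges near the 30-Mbps breakpoint
```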

As expected, the plot shows that the throughput decreases as the RMS delay spread increases for all frame sizes. Interestingly, these results track with the frame-loss plots: the measured throughput values fall close to the loads at which the corresponding frame-loss curves become non-zero.

Performance testing of 802.11 devices yields useful insight into the breakpoints of WLAN APs and similar equipment. Wireless networks can already be found in a large number of homes and businesses. Yet many of the devices employed by these networks have never been subjected to anything more than rudimentary go/no-go testing. As more businesses migrate both users and systems to a WLAN, performance metrics will become critical. The need for solid, proven testing methods will then become paramount.
