Targeted at multiprotocol packet classification and forwarding, the Ayama 10000 network search engine can perform up to 266 million searches/s. Based on a ternary content-addressable-memory architecture with on-chip storage for up to 512k entries (18 Mbits), this Cypress Semiconductor Corp. network search engine is nearly twice as fast as competitive solutions. To accelerate searches, a multisearch feature allows two tables to be searched with a single 144-bit key (or two 72-bit keys).
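To illustrate the lookup model a ternary CAM provides (the article does not describe the device's internal operation, so this is a minimal software sketch, not the Ayama 10000's actual behavior): each entry stores a value and a mask whose zero bits are "don't care," and the hardware compares a search key against all entries in parallel, returning the highest-priority match.

```python
# Minimal software sketch of ternary CAM (TCAM) matching, for illustration
# only -- real hardware compares the key against every entry in parallel.
# Each entry holds a value and a mask; mask bits set to 0 are "don't care".

class TernaryCAM:
    def __init__(self):
        self.entries = []  # (value, mask, result), highest priority first

    def add(self, value, mask, result):
        self.entries.append((value, mask, result))

    def search(self, key):
        # Scan in priority order; hardware does this comparison in parallel
        # and reports the first (highest-priority) matching entry.
        for value, mask, result in self.entries:
            if key & mask == value & mask:
                return result
        return None

# Example: longest-prefix-match style IPv4 routing lookup, with prefixes
# listed longest-first so the more specific entry wins.
cam = TernaryCAM()
cam.add(0xC0A80100, 0xFFFFFF00, "port 2")  # 192.168.1.0/24
cam.add(0xC0A80000, 0xFFFF0000, "port 1")  # 192.168.0.0/16
print(cam.search(0xC0A80105))  # 192.168.1.5 matches the /24 -> "port 2"
```

Because all entries are examined at once, a hardware TCAM returns a result in a single search cycle regardless of table size, which is what makes rates in the hundreds of millions of searches per second possible.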
The demand for new network services such as multiprotocol label switching, voice over Internet Protocol, and virtual private networks, along with the need for increased quality-of-service and cost-of-service billing, places more of a processing burden on the packet processor. That's because many of these services require multiple searches per packet. The Ayama 10000 offloads the packet processor and can handle data streams at up to 10 Gbits/s. The architecture can handle table entries up to 576 bits wide, and memory blocks can be dynamically allocated to meet changing system demands.
The three members in the family, the CYNSE10512, 10256, and 10128, provide address tables of 512k, 256k, and 128k entries (18 Mbits, 9 Mbits, and 4.5 Mbits), respectively. The search engines also include the company's Mini-Key power management, which can reduce chip power consumption by as much as 70%.
The company's Cynapse development tools, which provide device-level simulation with a cycle-accurate C model, support the search engines. Included in the development kit are application programming interfaces to ease system integration, a full test suite, and reference applications for IP routing, MAC/LPM searches, five-tuple filters, and power management. In lots of 10,000 units, the CYNSE10128, 10256, and 10512 cost $75, $135, and $275 each, respectively. Samples are immediately available. All three devices are housed in 388-contact BGA packages.
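As background on the five-tuple filter application mentioned above: a five-tuple identifies a flow by source address, destination address, source port, destination port, and protocol. The sketch below shows one hypothetical way such a tuple could be packed into a single search key; the field layout is an illustrative assumption, not the device's actual key format.

```python
# Hypothetical packing of an IPv4 five-tuple into one flat search key,
# as a five-tuple filter application might present it to a search engine.
# The 104-bit layout here is an illustrative assumption only.

def pack_five_tuple(src_ip, dst_ip, src_port, dst_port, proto):
    """Concatenate the five fields MSB-first into a single integer key."""
    key = src_ip                      # 32 bits
    key = (key << 32) | dst_ip        # 32 bits
    key = (key << 16) | src_port      # 16 bits
    key = (key << 16) | dst_port      # 16 bits
    key = (key << 8) | proto          # 8 bits  -> 104 bits total
    return key

# A TCP flow from 192.168.0.1:1234 to 8.8.8.8:80 (protocol 6 = TCP).
key = pack_five_tuple(0xC0A80001, 0x08080808, 1234, 80, 6)
assert key.bit_length() <= 104
```

A classifier would store such keys (with masks to wildcard unused fields) as table entries, so one search resolves filtering, billing, or forwarding decisions for the whole flow.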
Cypress Semiconductor Corp.