FPGA verification must address user uncertainty for prototyping, system validation

[Application Feature]

The recent expansion and diversification of the FPGA verification market bears a certain resemblance to the ASIC verification market of 20 years ago, though beset with opposite challenges, thanks to the changes wrought in 20 years by Moore’s Law.  When companies such as Quickturn Systems created large logic emulation systems to verify ASICs in the early 1990s, users had to be convinced to spend significant amounts of money while dedicating floor space equivalent to a mainframe, all to verify system ASICs.  Today, FPGA verification can be addressed in add-in boards for a workstation, or even in embedded test points within the FPGA itself.

But even as customers in 1990 were reluctant to move to logic emulation because of its price tag, today’s FPGA verification customer may show some trepidation because such systems may seem simplistic, invisible, or of questionable value.  In many cases, however, FPGA users dare not commit to multiple-FPGA systems (or to ASICs prototyped with FPGAs) without these tools.  Newer generations of FPGAs, incorporating the equivalent of millions of gates, integrate RISC CPUs, DSP blocks, lookaside co-processors, and high-speed on-chip interconnect.  Verification of such designs is a necessity, not a luxury.

On the software-only front, all major FPGA vendors, as well as three of the major EDA suite vendors, offer “Design for Verification” tools that tie behavioral simulation to system-level test and test regression analysis.  While such tools are useful, at some point they must be combined with dedicated hardware that can tie specific FPGAs to system-level requirements.
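At its core, the regression side of such a flow amounts to running each test case through simulation and comparing the results against golden reference vectors.  A minimal sketch of that loop, with all names purely illustrative rather than any vendor's actual API:

```python
# Minimal regression loop: run each test case through a simulator
# callback and compare observed outputs to golden reference vectors.
# Names here are illustrative, not tied to any vendor's tool.

def run_regression(cases, simulate):
    """Return the names of test cases whose outputs mismatch golden data."""
    failures = []
    for name, stimulus, golden in cases:
        observed = simulate(stimulus)
        if observed != golden:
            failures.append(name)
    return failures

# Toy "device under test": a 2-input XOR block, simulated in software.
xor_sim = lambda stim: [a ^ b for a, b in stim]

cases = [
    ("xor_basic", [(0, 0), (0, 1)], [0, 1]),
    ("xor_full",  [(1, 0), (1, 1)], [1, 0]),
]

print(run_regression(cases, xor_sim))  # an empty list means all cases passed
```

Commercial tools layer coverage analysis and reporting on top, but the pass/fail comparison against known-good behavior is the same.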

The notion of an add-on card is the easiest conceptual hurdle for many designers to clear.  Such a card can take the form of anything from a module that integrates an FPGA to something resembling a solid-state drive.  However, the FPGA design community still must develop a better sense of how FPGAs can aid in their own verification if it hopes to have effective innovation available soon.

Products for characterizing SoCs are combined with synthesis languages and IP libraries to give the designer an easy-to-configure platform for testing hardware ideas.  What began as a means of prototyping other silicon devices has become a way to validate the FPGA itself, an indication of how FPGA verification tools can bootstrap a next-generation FPGA from known designs.

Embedded instrumentation potentially can take designers to the next step by embedding hardware test points within their FPGAs or ASICs.  The idea has been used in the past for testing chip designs through I/O pads, a concept that gave rise to the JTAG boundary-scan standard (IEEE 1149.1).  Extended to FPGAs, such test points can verify not only individual FPGAs, but the behavior of systems employing multiple FPGAs.
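The boundary-scan mechanism underlying JTAG is simple to model: pin states are latched in parallel into a chain of scan cells, then shifted out serially, one bit per clock.  A toy software model of that behavior (purely illustrative; real chains follow the full IEEE 1149.1 TAP protocol and timing):

```python
# Toy model of a JTAG-style boundary-scan chain.  Pin states are captured
# in parallel (Capture-DR), then shifted out serially toward TDO while new
# bits enter from TDI (Shift-DR).  Illustrative only; not IEEE 1149.1-exact.

class BoundaryScanChain:
    def __init__(self, num_cells):
        self.cells = [0] * num_cells

    def capture(self, pin_states):
        # Capture-DR: latch the current pin values in parallel.
        self.cells = list(pin_states)

    def shift(self, tdi_bit):
        # Shift-DR: one TCK cycle moves the chain one cell toward TDO.
        tdo = self.cells[-1]
        self.cells = [tdi_bit] + self.cells[:-1]
        return tdo

chain = BoundaryScanChain(4)
chain.capture([1, 0, 1, 1])          # snapshot of four pin states
out = [chain.shift(0) for _ in range(4)]
print(out)  # bits emerge last cell first: [1, 1, 0, 1]
```

Embedded instrumentation generalizes this snapshot-and-shift idea from package pins to arbitrary internal signals, which is what makes it attractive for multi-FPGA system debug.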

Since the newest generations of FPGAs incorporate millions of equivalent gates, embedded instrumentation may soon be necessary.  At a minimum, however, FPGAs offering multiple asymmetric cores processing complex data sets in real time will require multiple complementary software and hardware verification tools to ensure first-pass design success.

Loring Wirbel is a technology analyst with more than 20 years' experience covering semiconductors, communications, embedded software, and any other topics that catch his fancy. In addition to his freelance work he contributes to several political and cultural journals and web sites.