Trends in FPGA testing and validation
Aiming for an at-speed hardware/software co-validation environment, developers want deep visibility into both domains – software and hardware.
FPGA vendors are feeling the effects of the economic downturn, but they appear to be in a position to weather the storm more easily than others in the semiconductor industry due to their broad appeal and compelling economic advantages. Despite adverse conditions, FPGAs continue to make inroads into new markets, and to expand in capability and capacity.
The infusion of FPGAs into the embedded systems community and the general convergence of hardware platforms is a positive development. Industry trends are pushing FPGAs, structured ASICs, DSPs, and conventional processors toward some virtual point of convergence. FPGAs started as almost purely programmable fabric but have since added embedded processor cores, memory, sophisticated I/O capabilities, hard-wired multipliers, and dedicated peripherals to increase their versatility as embedded platforms.
While the advancement of these devices is good news, it has brought some associated headaches to the systems designers who work with them. The phenomenal growth in design size and complexity has made FPGA debug and design verification a challenge. Although the capabilities and performance of FPGAs have advanced, debug and design validation techniques have not kept pace. Even the simplest RTL designs need to be debugged and validated. This demands intensive engineering effort and man-hours, and the results are neither predictable nor guaranteed to prove the design functionally flawless.
A new approach
Today’s more sophisticated FPGAs require more sophisticated solutions – in particular, tools that can provide advanced controls and views into the embedded system.
FPGA vendors have a history of using their own on-chip analysis tools, but these solutions are no longer enough. A new generation of tools is emerging that delivers a more sophisticated on-chip instrumentation scheme and off-chip development and test environment. These new tools feature enhanced capabilities beyond what conventional FPGA tools and hardware debug/logic analyzers offer, including on-chip stimulus, on-chip analysis (such as performance monitoring), hardware-software correlation, assertions, transaction analysis, and multi-FPGA visibility, with the added ability to use the same technology in simulation and emulation environments or even for ASICs, if required.
While traditional tools primarily target hardware designers performing on-chip FPGA debug, the new tools serve a wider spectrum of system, software, and hardware engineers. For example, extending visibility beyond the software domain into the hardware interactions is a constant challenge for those who work with FPGAs. New toolsets can traverse the gap between the two domains to create an at-speed hardware/software co-validation environment, as illustrated in Figure 1. These next-generation tools are also more flexible than earlier offerings, because they don’t target only certain FPGA families, but can perform validation across all FPGA devices.
Pre-silicon instrumentation and at-speed post-silicon validation tools can enhance observability and control of internal signals. They employ on-chip reconfigurable instruments that developers can insert and customize at design time for at-speed data acquisition, performance monitoring, stimulus, and fault injection functions, but are ultimately programmed post-silicon, while the system is in operation. This is a significant advantage, because the instruments can be repurposed to serve a variety of functions without requiring a re-synthesis of the design. With in-depth visibility and stimulus-injection capabilities, these next-generation tools offer validation engineers not only the ability to observe deeply embedded signals, but also the capability to control those signals.
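To make the idea of repurposing an instrument without re-synthesis concrete, the sketch below models it in Python. Everything here is illustrative and hypothetical – the class, mode names, and fault-injection behavior are invented for the example, not any vendor's actual API. It shows an instrument inserted once at design time whose function is then switched post-silicon, while the "design" keeps running.

```python
class ReconfigurableInstrument:
    """Toy model of one on-chip instrument with a run-time-selectable mode.
    All names and behaviors are hypothetical, for illustration only."""

    MODES = ("trace_capture", "perf_monitor", "fault_inject")

    def __init__(self):
        self.mode = "trace_capture"   # default function after power-up
        self.samples = []             # captured signal values (trace mode)
        self.event_count = 0          # counter (performance-monitor mode)

    def program(self, mode):
        """Repurpose the instrument in-system -- no re-synthesis of the design."""
        if mode not in self.MODES:
            raise ValueError(f"unknown mode: {mode}")
        self.mode = mode

    def clock(self, signal_value):
        """One cycle of the running design; returns the (possibly
        fault-injected) value seen by downstream logic."""
        if self.mode == "trace_capture":
            self.samples.append(signal_value)
        elif self.mode == "perf_monitor":
            self.event_count += signal_value & 1   # count asserted cycles
        elif self.mode == "fault_inject":
            return signal_value ^ 0x1              # flip bit 0: injected fault
        return signal_value


inst = ReconfigurableInstrument()
for v in (0, 1, 1, 0):
    inst.clock(v)                 # acquire four cycles of trace data
inst.program("fault_inject")      # repurposed while the system is "in operation"
corrupted = inst.clock(0)         # downstream logic now sees the injected fault
```

The point of the sketch is the `program()` call: the same embedded hardware serves acquisition, monitoring, or stimulus duties depending on how it is programmed after the design is already in silicon.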
System, software, and hardware engineers who seek to develop more complex FPGA designs without increasing their costs or time to market must begin to use debug and validation tools that meet the following requirements:
■ Multi-FPGA designs – to ensure observability and control across FPGA boundaries
■ Multi-FPGA partitioning – to mitigate, rather than exacerbate, design partitioning challenges
■ Multicore/multichip visibility and control – to ensure maximum insight and access into increasingly complex SoC designs
■ FPGA-agnostic – usable on any FPGA, regardless of manufacturer
■ Flexible – work on multiple platforms, including FPGAs, simulators, emulators, and ASICs
■ Comprehensive instrumentation – with a fabric of multiplexers and transaction engines embedded in the user design, anywhere in the hierarchy
■ Configurable – users can customize instrumentation at design time
■ In-system programmable – the FPGA design does not have to be re-synthesized to change instrument function
■ On-chip stimulus – including transaction stimulus, fault insertion, and stress testing, to accelerate delivery of robust, high-quality embedded software
■ Comprehensive capabilities – go beyond a logic analyzer to include performance monitoring, assertions, and transaction stimulus
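One of the capabilities listed above, hardware-software correlation, comes down to placing software events and hardware events on one shared timeline. The toy Python sketch below illustrates the idea only; the event format and the cycles-since-reset time base are invented for the example, whereas real tools derive a common time reference from on-chip capture.

```python
def correlate(sw_events, hw_events):
    """Merge software-domain and hardware-domain event lists into a single
    timeline, ordered by a shared timestamp (here: cycles since reset).
    Hypothetical format: each event is a (timestamp, description) pair."""
    merged = [("SW", t, desc) for t, desc in sw_events]
    merged += [("HW", t, desc) for t, desc in hw_events]
    return sorted(merged, key=lambda event: event[1])


# Invented example data: a DMA transfer seen from both domains.
sw = [(120, "driver: start DMA"), (480, "driver: DMA-done IRQ")]
hw = [(130, "bus: DMA burst begins"), (470, "bus: DMA burst ends")]
timeline = correlate(sw, hw)
# The merged view interleaves the domains, so a developer can see that the
# hardware burst started 10 cycles after the driver call and ended 10
# cycles before the completion interrupt was serviced.
```

This interleaving is what lets a software engineer trace a driver call down to the bus transactions it causes, and back – the gap-traversal described above.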
It is clear that a new approach to FPGA debugging and validation is needed to reduce costs and time to market. As the market for FPGAs continues to evolve, the tools that support their cost-effective development must evolve as well.
Paul Bradley is Chief Technical Officer of DAFCA, Inc. Paul has over 20 years' experience in electronics and systems design, and specializes in product development and engineering leadership in emerging technology markets. Before joining DAFCA, he held numerous engineering and technical leadership positions at Motorola, Nortel, CrossComm, Sonoma Systems, and Internet Photonics.