Functional verification is a major bottleneck in the chip design cycle. Any misstep in closing the functional correctness of a digital system can cost millions of dollars in redesign, additional testing, and silicon respins. One can argue at length about the actual figures, but the industry generally agrees that functional verification takes between 40 and 70% of a project’s labor, and about 50% of its total cost. The recent announcement by Synopsys and Freescale that they are broadening their collaboration to cut IC verification costs says it all: the two partners intend to manage “the ever-increasing cost of verification, which can encompass up to 75 percent of the total cost of product development”.

Getting actual figures about the size of the functional verification market proves elusive, both because the products are tied to synthesis license deals and because EDA lacks independent analysts. Still, the simulation and emulation market for digital systems can be estimated to be at least five times larger than today’s formal verification market. Simulation can only take you so far, though, so one wonders why formal verification does not have a larger share. Is it because the technology is limited, or because the market is not ready?

Equivalence checking

Equivalence checking (EC) consists of verifying that a netlist implements the behavior specified by an RTL description, or that two netlists are equivalent. Historically, EC was the first industrial formal verification tool brought to the ASIC world. Cadence’s Conformal is still the reference (about 60% of the market), with Synopsys’ Formality coming second.

EC’s technology is very mature, but this does not mean no further progress is needed. Flip-flop matching, the preliminary step of determining the pairs of flip-flops that need to be compared, is expected to be done quickly and automatically, with no manual guidance. Datapath verification remains a major challenge, and automatically proving the correctness of merged arithmetic is still an open problem. Last but not least, debugging is a very complicated task; incremental verification and rectification techniques can be quite useful to help pinpoint the functional issue.
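The core idea behind combinational EC can be illustrated with a miter: corresponding outputs of the two designs are XORed together, and the tool proves that the XOR can never evaluate to 1. The Python sketch below is a toy illustration only — industrial tools use SAT and BDD engines rather than exhaustive simulation, and the two adder functions are hypothetical stand-ins for an RTL specification and a synthesized netlist.

```python
from itertools import product

# Hypothetical "specification": a 2-bit adder, sum truncated to 2 bits.
def spec(a, b):
    return (a + b) & 0b11

# Hypothetical "netlist": the same function restructured as gate-level logic.
def impl(a, b):
    s0 = (a ^ b) & 1                     # LSB: half-adder sum
    c = a & b & 1                        # carry out of the LSB
    s1 = ((a >> 1) ^ (b >> 1) ^ c) & 1   # MSB with carry in
    return (s1 << 1) | s0

def miter_equivalent(f, g, width=2):
    """Exhaustive miter check: f and g are equivalent iff
    f(x) XOR g(x) is 0 for every input assignment."""
    for a, b in product(range(1 << width), repeat=2):
        if f(a, b) ^ g(a, b):
            return False, (a, b)         # counterexample input
    return True, None

ok, cex = miter_equivalent(spec, impl)
print(ok)  # True: the gate-level version matches the spec
```

When the two designs differ, the miter yields a concrete counterexample input, which is exactly the kind of debugging aid the incremental verification and rectification techniques above build upon.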

Model checking and property verification

Model checking and property verification are still a fraction of the formal verification market, with many players in the field. There are two obstacles to wider adoption of the approach. The first is that it can be complicated to write an FSM or property that captures a particular behavior. SVA (SystemVerilog Assertions), OVL (Open Verification Library), and PSL (Property Specification Language) help in that regard, but they need to be used more systematically in the design community. The second obstacle is that model checking techniques can only solve relatively small problem instances. This is why some go with hybrid verification techniques (read: possibly incomplete), like Magellan or 0-in, while others stick with complete formal methods, like Jasper and OneSpin.
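The capacity problem comes from the state-space exploration at the heart of model checking. As an illustration of the principle only (real tools work symbolically on designs with far too many states to enumerate), here is a minimal explicit-state safety check in Python on a hypothetical two-register arbiter:

```python
from collections import deque

# Hypothetical arbiter: a request is latched into `pending`,
# and `grant` is raised one cycle later.
def step(state, req):
    pending, grant = state
    new_grant = pending                   # grant follows the latched request
    new_pending = req and not new_grant   # latch a new request unless granting
    return (new_pending, new_grant)

def check_safety(init, prop, inputs):
    """BFS over reachable states; fail on the first state violating `prop`."""
    seen, frontier = {init}, deque([init])
    while frontier:
        s = frontier.popleft()
        if not prop(s):
            return False, s               # reachable counterexample state
        for i in inputs:
            t = step(s, i)
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return True, None                     # property holds on all reachable states

# Safety property: the arbiter never latches a new request while granting.
ok, bad = check_safety(init=(False, False),
                       prop=lambda s: not (s[0] and s[1]),
                       inputs=[False, True])
print(ok)  # True
```

The state count grows exponentially with the number of flip-flops, which is exactly why complete tools hit capacity limits and why hybrid approaches trade completeness for reach.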

Because writing properties can be so complicated, specialized branches have grown to address specific needs, as shown below.

  • IP verification. With SoCs using IPs from many different sources, verifying the compliance of these IPs with standard interfaces (e.g., PCI or USB) in the context of the application is crucial. Conformal, with its verification IP portfolio, is in a good position to address the problem. OneSpin is also known to have interesting technology in that space, even though they are not pushing it at the moment.
  • Timing verification. Incorrect timing constraints can lead to missing a target clock cycle or, worse, to a chip failure. Verifying timing exceptions (false paths and multi-cycle paths), as well as CDC (clock-domain crossing), has become a center of attention. It is still unclear how big the market is, but several discussions with IC design companies led me to believe that verifying a set of timing exceptions (usually on the order of 10,000 SDC constraints) saves about one engineer-month of work. Automation and speed are key here. Atrenta, Real Intent, and 0-in propose interesting solutions in that space.
  • Power verification. When doing power gating, one needs to verify that the design powers back up properly. Integration with UPF or CPF provides the required automation. Conformal, together with CPF, has an edge in that field.
  • Sequential clock gating verification. Traditional (combinational) clock gating is well supported by EC tools. Sequential clock gating exploits sequential dependencies to derive additional gating conditions, which can be used to save more dynamic power. It has been made popular by Calypto; Envis is also proposing a similar technique at the netlist level. Sequential clock gating correctness cannot easily be expressed with SVA or OVL without making the verification task extremely complex, which explains why specialized verification techniques have been developed.
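The last point deserves a small illustration. In a hypothetical two-stage pipeline where stage B consumes stage A’s output only on a valid transfer, stage A need not be clocked on the cycles whose result is never observed — a gating condition that spans two clock cycles. Verifying it means showing the gated and ungated designs produce the same observable outputs, which is exactly the sequential reasoning plain combinational EC cannot do. A toy Python simulation of this hypothetical design:

```python
def simulate(data, enable, gated):
    """Two-stage pipeline: register A feeds register B; B captures A
    one cycle after `enable` is high. When `gated`, A is clocked only
    while enable is high -- the sequential clock gating condition."""
    a = b = 0
    en_d = False                 # enable, delayed one cycle (B's capture strobe)
    trace = []
    for d, en in zip(data, enable):
        b_next = a if en_d else b
        # Ungated: A captures every cycle. Gated: only when the value
        # will actually be consumed by B on the next cycle.
        a_next = d if (not gated or en) else a
        a, b, en_d = a_next, b_next, en
        trace.append(b)          # B is the only observable output
    return trace

data = [3, 1, 4, 1, 5, 9, 2, 6]
enable = [True, False, True, True, False, False, True, False]
print(simulate(data, enable, gated=False) ==
      simulate(data, enable, gated=True))  # True: observably equivalent
```

The equivalence holds only because of a cross-cycle argument (B samples A exactly one cycle after enable was high), which is why this class of checks needs dedicated sequential verification rather than an SVA property per gated register.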

Where formal verification will grow

Formal verification is no longer limited to ASICs: complex systems (SoCs, FPGAs, and HW/SW co-designs) will benefit dramatically from better formal verification techniques if they are deployed adequately.

With the ever-growing size of FPGAs (Altera’s Stratix IV packs 820k logic elements, and Xilinx’ Virtex-6 has up to 750k logic cells), it is clear that simulation alone will no longer be sufficient to validate the correctness of programmable logic devices. The need for FPGA EC is real, and it requires complete automation and full support for retiming; OneSpin’s 360 EC FPGA offers a competitive solution in that space. Note that IP verification and timing verification apply to FPGA designs too. The real question is whether FPGA designers are willing to pay for formal verification tools.

IP verification, and verifying the correctness of a SoC built from IPs, is certainly a very strong driver for more sophisticated formal verification solutions. Power verification will become part of the ASIC design flow, just as EC is part of the synthesis flow. Timing verification is still looking for its footing in the design flow; one open question is the debug environment, which is still relatively limited, e.g., to showing waveforms.

Looking forward, formal verification techniques can be used (and have been used) in fields other than circuit design. Any critical digital system can benefit from formal verification: transportation, medical equipment, security and privacy applications. The automotive industry is one of the most obvious targets. Cars are ubiquitous, they contain more and more electronics (representing about 30% of the end price today), and a functional bug can have very costly consequences. Cars rely on digital systems for everything from optimizing the engine’s efficiency to navigation, entertainment, and on-board diagnostics. Soon intra-vehicle, vehicle-to-vehicle, and vehicle-to-roadside networking will fuel innovative products, driving the need for fast development and the highest possible level of correctness. The EDA industry is taking notice, and Mentor has certainly taken the lead there. Whether they provide the adequate functional verification framework is another matter.

Formal verification will extend its reach by addressing the hard problems of EC (datapath verification, and retiming for FPGAs), by being seamlessly integrated into the synthesis flow (power and timing exception verification), and by providing practical solutions to IP and hybrid HW/SW design verification.


4 Comments on The formal verification market is still untapped

  1. nick says:

    Hi Olivier,

    Nice blog.

    -Nick

  2. Olivier, welcome to the Blogosphere. This is a good analysis of formal verification and its prospects. A few additional points:

    –You don’t necessarily have to write complex properties to use formal tools. Some have automated property generation capabilities, or use “canned” checks like CDC.
    –Model checking has capacity limits, but it’s as much about the type of block as the number of transistors or gates.
    –Software validation is a large potential growth area for formal techniques. Post-silicon debug is a new area that’s attracting interest.

    Keep up the good work!

  3. Lopamudra says:

    Hi Olivier,
    Thanks for the article, which gives a good overall perspective on formal verification.

    I just wanted to add a few points on top of that:
    1) Like Jasper, Cadence’s IFV (Incisive Formal Verifier) is also a good model checking formal tool. They also have IEV for hybrid verification.
    2) Nowadays, VIPs are playing a key role in protocol verification, like Cadence’s OCP VIP, AXI Formal-VIP, etc.
    3) Though model checking tools have capacity issues, these can be partly addressed by the “assume-guarantee” approach or other FV techniques.

    Regards,
    Lopa

  4. Hi Richard,

    Yes, you are correct: you don’t have to write complex properties to use formal tools. This is why there are the specialized domains I describe in the post, like IP verification or timing exception verification (note that I included CDC under timing exception verification, even though it should be more under a “timing verification” umbrella). The problem with model checking is that, compared to EC, there really is a jump in computational complexity (NP-complete vs. PSPACE-complete). Even though a lot of progress has been made in that field, it still requires methods tailored to the class of problems to complete a proof.

    I agree software verification is an opening for formal verification. For example, a company like Coverity (which provides static code analysis solutions) uses basic SAT techniques to go deeper in the branching-path analysis of C/C++. There are also other techniques (like HOL or predicate logic) used here and there for some very specific applications, but they are not about to go mainstream any time soon.

    Post-silicon debug is a bit of a side topic; it uses techniques coming more from DFT than from formal verification. Some successes (e.g., DAFCA) show there is a market to be addressed.

    Going back to software verification and the potential expansion of formal verification techniques, I believe there is a core of software verification that could find a lot of traction. A huge, untapped market is the verification of simple web applications (Java, JavaScript), where the size of the code is relatively small but the problems posed by security, privacy, and deadlocks are extremely costly (think of your typical banking or online shopping web applications, as well as the many applications, like games and polls, in social networks).
