We all know that functional verification is the costliest and most time-consuming aspect of ASIC design: about 50% of the total cost, and from 40% to 70% of the total project duration. And we all know that simulation is by far the most prevalent verification method, even though it is inherently incomplete, since the input space is far too large to be enumerated. So formal verification, which aims for completeness, should be a thriving field, given the impact it could have on the overall cost and schedule of ASIC designs.

There is certainly no lack of competition in formal verification. The big three public EDA companies, Synopsys, Cadence, and Mentor Graphics, all have their own formal verification offerings (Formality, Conformal, 0-in), and there are a number of startups: Jasper, Atrenta, Real Intent, OneSpin, and Blue Pearl Software, to name a few. Formal verification products cover a wide range of applications: SystemVerilog Assertions (SVA) and property checking; RTL static checks; equivalence checking (EC); some limited IP verification; clock-domain crossing (CDC) verification; and timing exception verification (false paths and multi-cycle paths).

Looking at the DAC submissions this year though, I am puzzled by the overwhelming number of papers focused on increasing simulation speed and coverage, as opposed to the handful of papers discussing formal techniques. And this year is no different from last year. Or the year before that. Does that mean there is a lack of innovation in core formal verification techniques?

Improving simulation with formal techniques (higher coverage, fewer patterns, more automation) is a very active field, both in academia and in industry. Some approaches inject faults into the RTL to identify the most discriminating patterns (e.g., Certess). Others use SAT and integer constraint solvers to reduce the number of patterns, or to automatically generate patterns for hard-to-cover code branches (e.g., Nusym). But success is all relative: Certess was quickly acquired last year, while Nusym is actively looking for a buyer. There are also semi-formal tools that mix simulation and state exploration techniques (e.g., Magellan), but they have limited usage.
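
To make the constraint-solving idea concrete, here is a minimal sketch of SAT/SMT-based stimulus generation for a hard-to-cover branch. It is not how Nusym or any other product actually works; the branch condition, the variable names, and the use of the Z3 solver's Python API are my own assumptions for illustration.

```python
# Toy sketch of constraint-based stimulus generation (hypothetical example,
# not any vendor's algorithm): instead of waiting for random simulation to
# hit a rare branch, ask a solver for inputs that satisfy its guard.
# Requires the Z3 SMT solver's Python bindings (pip install z3-solver).
from z3 import BitVec, Solver, And, sat

# Symbolic 16-bit inputs of a hypothetical DUT.
a = BitVec("a", 16)
b = BitVec("b", 16)

# Branch guard lifted from the (hypothetical) RTL:
#   if ((a * 3 + b) == 16'h1234 && a[3:0] == 4'hA) begin ... end
# Random stimulus hits this with probability about 2^-20 per vector.
branch = And((a * 3 + b) == 0x1234, (a & 0xF) == 0xA)

s = Solver()
s.add(branch)

if s.check() == sat:
    m = s.model()
    print("directed vector: a=0x%04x, b=0x%04x"
          % (m[a].as_long(), m[b].as_long()))
else:
    print("branch is unreachable (dead code)")
```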

What about the more fundamental formal verification technologies? The 80’s were dominated by the development of rigorous semantic models (e.g., multi-valued logic, Verilog and VHDL operational semantics for synthesis and simulation, temporal logics, and synchronous languages like Esterel and Lustre) and the introduction of BDDs. The 90’s saw EC tools spreading in the industry and the rise of model checking. The 00’s were all about SAT and model abstraction, to push the capacity of EC and bring property checking to the end user, as well as static code analysis, CDC, and timing verification. What are we going to see in this decade?

Verification has a lot of challenging problems with incomplete solutions, or no solution at all. Here is my list:

  • Merged arithmetic. There are robust methods to verify adders and multipliers of practically any size, but no one can verify merged arithmetic as small as 32 bits (a toy sketch of the problem follows this list).
  • Low power. This leads to complex properties capturing the correctness of sequential clock gating and power gating. The former is becoming more common, and there are techniques to address most of it (e.g., Calypto and Conformal); a toy version of this kind of check is also sketched after the list. But the latter is still waiting for a comprehensive and automated solution.
  • RTL debugging. There are a number of static code checkers, but debugging is still very poor.
  • HW/SW verification. Can we leverage deductive methods (predicate logic, HOL, rewriting system) to close the gap between software and RTL?
  • Mixed-signal (analog/digital) devices. This is a very young area of research, but it should see a lot of focus given the increasing ubiquity of mixed-signal designs.
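
For the merged arithmetic item above, here is a toy illustration of why the problem is hard (my own construction, not any vendor's flow): a bit-level carry-save multiply-accumulate is checked against the word-level spec a*b + c with the Z3 solver. At 8 bits the proof is quick; in my experience, pushing the width toward 32 is exactly where plain SAT/BDD-based equivalence engines tend to give up.

```python
# Toy merged-arithmetic equivalence query (hypothetical datapath, for
# illustration only): prove a carry-save multiply-accumulate, built from
# bit-level operations, equal to the word-level spec a*b + c (mod 2^W).
from z3 import BitVec, Solver, Extract, SignExt, unsat

W = 8  # quick at 8 bits; pushing W toward 32 is where solvers struggle

a, b, c = BitVec("a", W), BitVec("b", W), BitVec("c", W)

def replicate(x, i):
    # Spread bit i of x across W bits (all-zeros or all-ones).
    return SignExt(W - 1, Extract(i, i, x))

def csa(x, y, z):
    # 3:2 compressor: sum vector and shifted carry vector.
    return x ^ y ^ z, ((x & y) | (x & z) | (y & z)) << 1

# Partial products of a*b, plus the accumulate operand c.
terms = [(a & replicate(b, i)) << i for i in range(W)] + [c]

# Wallace-style reduction down to two vectors, then a final adder.
while len(terms) > 2:
    s_vec, c_vec = csa(terms[0], terms[1], terms[2])
    terms = terms[3:] + [s_vec, c_vec]
impl = terms[0] + terms[1]

spec = a * b + c  # word-level reference (mod 2^W)

solver = Solver()
solver.add(impl != spec)  # search for any mismatch
print("equivalent" if solver.check() == unsat else "MISMATCH", "at", W, "bits")
```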
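
For the sequential clock gating item, here is a bounded toy check (again a hypothetical example of mine, far simpler than what sequential equivalence checkers actually prove unboundedly): a data register is clock-gated in cycles where its value cannot be observed downstream, and the gated and ungated designs are unrolled for N cycles to show that the observable outputs never diverge.

```python
# Toy bounded check of a sequential clock gating transformation
# (hypothetical 2-register design; real tools prove this unboundedly).
# The data register is gated off when its value will not be observed,
# because the output is masked by the registered enable.
from z3 import BitVec, BitVecVal, Bool, If, Or, Solver, unsat

W, N = 8, 12  # data width, number of unrolled cycles

# Shared symbolic initial state for both models.
q_ref = q_imp = BitVec("q_init", W)
en_q = Bool("en_q_init")

mismatch = []
for t in range(N):
    d = BitVec("d_%d" % t, W)   # shared data input, cycle t
    en = Bool("en_%d" % t)      # shared enable input, cycle t

    # Output as seen by downstream logic in cycle t (masked by en_q).
    out_ref = If(en_q, q_ref, BitVecVal(0, W))
    out_imp = If(en_q, q_imp, BitVecVal(0, W))
    mismatch.append(out_ref != out_imp)

    # Reference: the data register is clocked every cycle.
    # Implementation: its clock is gated when en is low, anticipating
    # that the value will not be observed in the next cycle.
    q_ref, q_imp, en_q = d, If(en, d, q_imp), en

s = Solver()
s.add(Or(*mismatch))  # does any cycle expose a difference?
if s.check() == unsat:
    print("clock gating is observably equivalent for %d cycles" % N)
else:
    print("counterexample: gating changes observable behavior")
```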

If formal verification core technology is to evolve, it will be through original solutions to the problems listed above. What do you think should be added to this list? And which techniques will emerge as the most promising?


UPDATE: I had enough interesting comments and feedback about this post to motivate a follow-up post.


11 Comments on Has formal verification technology stalled?

  1. Social comments and analytics for this post…

    This post was mentioned on Twitter by ocoudert: New post (RT please): Has formal verification technology stalled? http://is.gd/6YVUs

  2. Tom Anderson says:

    Olivier, interesting post, but your list of “(Formality, Conformal, 0-in)” is misleading. Formality and Conformal are equivalence checkers, whereas the 0-In products are property checkers. These two types of products solve different problems, so the distinction is important. Note that Cadence offers Incisive Formal Verifier (IFV) and Incisive Enterprise Verifier (IEV) for property checking.

    As to your main thesis, formal verification hasn’t stalled from my perspective. Equivalence checking is a mainstream technique that is still growing in usage. While it’s fair to say that property checking hasn’t become as pervasive as its advocates would like, it is very much a growing business within EDA that is already starting to attack some of your suggested problems, especially low power.

    tomacadence

  3. Hi Tom,

    You are correct in saying that Formality and Conformal are EC tools, while 0-in is a property checker. I was only listing a few tools, as well as a few startups, that together cover a pretty wide range of verification tasks.

    It is true that EC is widely used, but I would argue that even in EC there are a couple of problems that still await practical or better solutions: merged arithmetic verification; fast, fully automated register matching; and debugging, which is still a tedious task.

    Property checking is slowly gaining ground. It is slow because the technology still needs to improve, and because customers are slow to accept the methodology (SVA and timing diagrams did help make property checking easier to express and manage). As for low power, yes, it should be a focus of formal verification: there is a strong need for automated solutions, and low power is certainly a growing segment. Cadence is definitely a player here (CPF+Conformal is a proven flow), as is Mentor with UPF, and a few other companies have interesting insights there, notably Atrenta. I wrote about low power verification in the past, and I am a strong believer that this is a great opportunity for EDA products to shine (for both synthesis and verification).

  4. Hi Olivier,

    Good post and, I think, prompted somewhat by our conversation.

    I think one thing that confuses the issue of success/failure of formal verification is the difference between formal verification, the technology, and formal verification, the product. I think formal verification, the technology, has made, and continues to make, great progress in terms of usability and capacity. There is clearly value in the technology and, in this sense, the technology has been successful.

    However, formal verification, as a product, has not been so successful. And, yes, I am aware that there are a number of “successful” formal verification products out there, but the ROI has been extremely low. The fact is that the formal verification market is dwarfed by the simulation market and seems destined to always be that way.

    I believe formal verification is successful when it is hidden under the covers of a product that solves a problem the user cares about. EC and CDC are examples of this. This is our philosophy at Nusym also. The problem we are solving is coverage closure, and our tool has the look and feel of a standard simulator, but under the covers, formal techniques are being used. I think this use model will continue to grow, while formal verification, the product, will continue to wither. I predict that in 10-15 years there will be no formal verification products, but most, if not all, verification solutions at that time will incorporate formal technologies under the covers.

    regards,

    –chris

  5. Hi Chris,

    What prompted me to write this post was really the DAC paper submissions in the verification track :). Lots of these papers are about enhanced simulation; only a few focus on the core technologies and algorithms of formal verification. Of course, our conversation drew the parallel between the outcomes of Certess and Nusym.

    You have an excellent point: while formal verification techniques have made great progress, and continue to do so, the success of formal verification in the marketplace is a mixed bag. Yes, EC is now widely accepted and used in every ASIC design flow. But a lot of valuable products (e.g., property checking, timing verification) are struggling to find a path to growth. You are correct in pointing out that formal verification techniques do better when hidden under the hood to boost the capabilities of existing tools, most notably simulation tools. Which leads to the same question again: is it because the design community is reluctant to change its verification methodology that the formal verification market is so small?

    I hope your prediction will turn out to be wrong. In 10 years, ASICs will reach the equivalent of 1B gates, and FPGA complexity will be around 50M gate equivalents. Software will dominate the design and complexity of SoCs. I have a hard time believing that we will be able to verify such complex systems without some significant changes in the verification and design flows. For a start, verification will be embedded in the design/synthesis flow, as keeping the two flows separate is too much of a burden. Future SoCs will result from assembling IPs, RAMs, and cores: checking their correctness will thus require a strong push in the software verification field, and IP-based verification (a field where OneSpin is starting to make a dent) will become dominant over classical EC. Simulation will just not cut it, and the very technologies needed to achieve functional closure are still to be developed.

  6. Gary Dare says:

    There’s an old saying about choosing between the devil that you know and the devil that you don’t. The emphasis on improvements to simulation over formal verification that you are seeing in DAC submissions is not a surprise, but a continuation of traditional patterns in engineering. End users have to deliver a product by a date set on a roadmap by management, so they opt for methods that have worked for them in the past … and tolerate incremental advances to improve them. EDA software that takes advantage of SIMD with general-purpose GPUs, or MIMD with symmetric multicore compute engines, creates less angst when end users are faced with larger designs to analyze and verify. Moving to formal is a change in their flow, whereas hardware and/or software upgrades [for the next-generation chip or system (wireless handset, network switch)] are not.

  7. Hi Gary,

    You’re right, people in general will go with something they are familiar with (or that has been working for them so far), so improvements in simulation are easier to push to customers than a change in the verification flow. But we have seen this in the past: it takes a significant paradigm change to get to the next order of complexity (think physical verification, gate-level and RTL synthesis). I wonder which technique will emerge to address the complexity of verification. Chris thinks that formal verification as a product will be phased out, to be totally hidden in the more familiar simulation platform. He may be right, but I don’t think this will be enough (unless the hardware of 2010 can be fixed via software patches).

  8. Kurt Takara says:

    Hi Olivier and Gary,

    You bring up good points. The reality of the verification situation is that there are finite resources. Now, SV testbenches are the most popular area for improvements to verification environments, so you will not see as much investment in other verification areas.

    When Chris mentions the low ROI of formal verification products, that is a very deceptive statement. It illustrates that users first need to understand the benefit of any product or technology before making an investment:
    – How is the return defined (number or type of bugs found, testplan coverage, reduction in verification schedule or testing, increased quality, etc.)?
    – How do you correlate the result to $$$?
    – What investment (i.e. resources) can you commit?

    There are many examples of design teams realizing a high ROI using formal property checking. Chris may be referring to the fact that technologies by themselves do not have benefit to users; it is the solution or application of the technology that has benefit and value. As formal verification is not a one-size-fits-all product or technology, users need to understand the appropriate application for these products. Here Chris mentions CDC as an example of a successful formal verification product or solution. There are automatic formal solutions that require a lower investment, and therefore deliver a higher ROI.
    Here is a formal verification article, published today, that discusses different applications of formal technology:
    http://www.scdsource.com/article.php?id=382

    This also illustrates that formal verification products should not be expected to replace simulation, but can address specific verification problems more effectively than simulation.

    Thanks,
    Kurt

  9. Hi Kurt,

    I think Chris’ comment on low ROI was about the amount of $$$ poured into formal verification. The reality is that the formal verification market is small compared to the simulation market (say 10-15% of it). There are several reasons for this.

    First of all, some verification product revenue numbers are hidden in all-you-can-eat deals, typically SNPS giving away Formality licenses whenever a customer buys Design Compiler or IC Compiler licenses (same story with CDN’s Conformal). This is why there is no newcomer in the EC business: it is not because no progress is needed in EC (there are quite a few open problems!), but because SNPS/CDN have captured that market and tied it to synthesis and prototyping, making EC a freebie.

    Second of all, Gary’s point means that customers would rather go with a solution that does not require overhauling their verification flow. As long as verification TAT is reduced and coverage is increased, improved simulation is a low-risk investment with a positive return. Going with a more formal approach (property checking, verification-IP-based flows, etc.) is outside the comfort zone of many designers.

    So it is quite possible that the best avenue of growth for formal verification is to go hand-in-hand with simulation. I am still a strong believer in RTL static verification, as long as debugging capabilities are up to the challenge.

  10. […] simultaneously, Olivier Coudert wrote a thought provoking post entitled “Has formal verification technology stalled?” He was prompted to ask this question by the fact that the number of formal verification […]

  11. […] about this strategy in a response to Olivier Coudert’s blog post on formal verification, I wrote: I think this use model will continue to grow, while formal […]
