Sunday, April 29, 2012

Verification claims 70% of the chip design schedule!

Human psychology suggests that constant repetition of a statement registers it in the sub-conscious mind and we start believing it. The statement, “Verification claims 70% of the schedule,” has been floating around in articles, keynotes and discussions for almost two decades, so much so that even in the absence of any data validating it, we have believed it as a fact for a long time. However, progress in the verification domain indicates that this number might actually be a "FACT".

Twenty years back, designs were a few thousand gates and the design team verified the RTL themselves. Test benches and tests were all developed in HDLs, and a sophisticated verification environment was not even part of the discussion. It was assumed that verification accounted for roughly 50% of the effort.

Since then, design complexity has grown exponentially, and state-of-the-art, metric-rich test benches have displaced legacy verification. Instead of designers, a dedicated team of verification engineers is deployed on each project to handle this cumbersome task. Verification still remains a seemingly endless task, demanding aggressive and frequent adoption of new techniques.

A quick glance at the verification team's task list shows the following items (a small illustrative sketch of a few of them follows the list) –
- Development of metric driven verification plan based on the specifications.
- Development of HVL+Methodology based constrained random test benches.
- Development of directed test benches for verifying processor integration in SoC.
- Power aware simulations.
- Analog mixed signal simulations.
- Debugging failures and regressing the design.
- Adding tests to meet coverage goals (code, functional & assertions).
- Formal verification.
- Emulation/Hardware acceleration to speed up the turnaround time.
- Performance testing and use cases.
- Gate level simulations with different corners.
- Test vector development for post silicon validation.
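
To make a few of these items concrete, here is a minimal, illustrative SystemVerilog sketch of a constrained-random test bench fragment with a functional covergroup and a simple assertion. All names, fields and ranges are hypothetical and not taken from any particular project or methodology library.

// Illustrative only: a toy bus transaction with constrained-random stimulus,
// functional coverage and one assertion. All names are hypothetical.
class bus_txn;
  rand bit [31:0] addr;
  rand bit [7:0]  len;
  rand bit        write;

  // Keep stimulus inside a legal, interesting region of the address map
  constraint c_legal {
    addr inside {[32'h0000_0000 : 32'h0000_FFFF]};
    len  inside {[1:16]};
  }

  // Functional coverage: did we exercise reads and writes across burst lengths?
  covergroup cg;
    cp_dir    : coverpoint write;
    cp_len    : coverpoint len { bins short_b = {[1:4]}; bins long_b = {[5:16]}; }
    dir_x_len : cross cp_dir, cp_len;
  endgroup

  function new();
    cg = new();
  endfunction
endclass

module tb;
  bit clk, req, gnt;
  always #5 clk = ~clk;

  // Assertion: every request must be granted within 1 to 8 cycles
  property p_req_gnt;
    @(posedge clk) req |-> ##[1:8] gnt;
  endproperty
  a_req_gnt : assert property (p_req_gnt);

  initial begin
    bus_txn t = new();
    repeat (100) begin
      if (!t.randomize()) $error("randomization failed");
      t.cg.sample();
      // drive t onto the DUT interface here ...
    end
  end
endmodule

Even this toy example hints at why the effort adds up: the plan, the constraints, the coverage model and the assertions all have to be written, reviewed and then closed against coverage goals.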

The above list doesn't include modeling for virtual platforms, as that is still at the early-adopter stage. Along with the verification team, the design team contributes a significant number of cycles towards debugging. If we try to quantify the CPU cycles required for verification on any project, the figure would easily overshadow any other task in the ASIC design cycle.

Excerpts from the Wilson Research study (commissioned by Mentor) indicate interesting data (approximated) –
- The industry adoption of code coverage has increased to 72 percent by 2010.
- The industry adoption of assertions has increased to 72 percent by 2010.
- Functional coverage adoption grew from 40% to 72% from 2007 to 2010.
- Constrained-random simulation techniques grew from 41% in 2007 to 69% in 2010.
- The industry adoption of formal property checking has increased by 53% from 2007 to 2010.
- Adoption of HW assisted acceleration/emulation increased by 75% from 2007 to 2010.
- The mean share of a designer's time spent on verification increased from 46% in 2007 to 50% in 2010.
- Average verification team size grew by a whopping 58% during this period.
- 52% of chip failures were still due to functional problems.
- 66% of projects continue to be behind schedule. 45% of chips require two silicon passes and 25% require more than two passes.

While the biggies of the EDA industry are evolving their tools incessantly, a brigade of startups has surfaced, each trying to tame this exorbitant verification problem. The solutions attack the problem from multiple angles: some try to shorten the regression cycle, some move tasks from engineers to tools, some provide data mining, while others provide guidance to reduce the overall effort.

The semiconductor industry is continuously defining ways to control the volume of verification, not only by adding new tools and techniques but also by redefining the ecosystem and collaborating at various levels. The steep rise in the usage of IPs (e.g. ARM’s market valuation reaching $12 billion, and Semico reporting that the third-party IP market grew by close to 22 percent) and VIPs (read posts 1, 2, 3) is a clear indication of this.

So much has been added to the arsenal of verification teams, and to their ownership of the ASIC design cycle, that one can safely assume the verification effort has moved from 50% in the early 90s to 70% now. And since the process is still ON, it will be interesting to see whether this magic figure of 70% persists or moves up further!!!

10 comments:

  1. LinkedIn Group: SystemVerilog for Verification

    Nice post and an enjoyable read. I wrote a paper on the "70% myth" for SNUG back in 2008 - and covered a lot of other myths too - for example:
    - Half of all chip developments require a re-spin, three quarters due to functional bugs
    - Pseudo-random simulation has killed off directed testing
    - Formal verification will eventually replace simulation
    See http://testandverification.com/uncategorized/lies-damned-lies-and-hardware-verification/

    Posted by Mike Bartley

  2. LinkedIn Group: Verification Management
    Discussion: Verification claims 70% of the chip design schedule!

    If the "70%" claim is correct, it is interesting to note that this 70% is spent on finding incongruence between definition creation, aka specification writing (3 to 5%, say), and definition implementation, or RTL design (15%, say; I keep the remaining 10-12% for PD and other activities).

    So, wouldn't it be nice if we had a flow that eliminates the basic problem of translating a badly written/non-existent specification into RTL Verilog code?
    Posted by Swapnajit Mitra

  3. Hi Swapnajit,

    You are bringing out a fine point. Yes, that is where the bottleneck is felt. In fact, here is an alternative highlighted at one of the recent conferences -
    "If we ambitioned to create bug-free designs, generating them in a correct-by-construction fashion from high-level specifications, the verification task would obviously be made a lot easier. Confirming this trend, 68% of respondents to a recent industry survey ranked “reducing the verification effort” their #1 motivation for adopting high-level synthesis (HLS)."

    Thanks & Regards,
    Gaurav Jalan

  4. Gaurav,

    Thanks. HLS _may_ be the answer to the problem of wasted effort on verification. But the question that needs to be asked is: HLS has been around in various forms for many decades (since the 1960s, IIRC, and it actually predates RTL synthesis). So why didn't it take off? What is prohibiting it from being the savior of the current methodology flow?

    - Swapnajit.

    Yes, I agree that it hasn't taken off, and there could be multiple reasons for it, like the need for it, tool support, etc. Even now, HLS has been heavily touted for the past 2-3 years, but adoption is slow. We still haven't reached the point where the industry embraces it openly. If we look back, we may never skip RTL verification completely, just as we haven't skipped GLS. Check this -
    http://whatisverification.blogspot.in/2011/06/gate-level-simulations-necessary-evil.html

    ~Gaurav

  6. From - Semiwiki

    Design complexity has grown linearly over the past two decades, but verification complexity has grown exponentially. Think about the fact that the design method we are using for a 10M-gate chip now is almost the same as the one we used for a 1M-gate chip 10 years ago, yet the verification method is completely different. Too many functions cross each other and challenge the verification engineers. From my personal experience, 70% of the total effort is a minimum.

    However, it is interesting that even though verification is so important and takes 70% of the chip design, all the projects I know of are run by people with a design background.
    Posted by Tao Wang

  7. Tao Wang,

    Yes, the way we used to design in the past has been carried forward today, but there has been a paradigm shift in it too. Instead of building from scratch, we now focus on reusability, and that is the reason the IP market is flourishing with strong prospects. Two reasons for this -
    1. Time to market. The products that we designed 10 years back enjoyed a longer product lifetime, which is not the case now; plus, early designs grab market share fast.
    2. Reusability has made life easier for designers to some extent. In fact, in one of my current projects, the design team is delivering integrated blocks every day, but verification will take ~2 weeks to verify each integration.

    Even verification engineers reuse a lot of stuff within projects or from past projects, but that is tricky, and debugging it is sometimes a challenge. Stay tuned for a post on this....

    The complexity has reached a level where now, instead of designers, we have program managers running the show. As per Wally Rhines (CEO, Mentor) in one of his keynotes: "The verification engineer is no longer a 2nd-class citizen. In fact, the time has possibly come that the designer will actually be subordinated to the verification guy."
    Posted by Gaurav Jalan

  8. From - Semiwiki

    "While the biggies of the EDA industry are evolving the tools incessantly"

    I'll give you mutating; I'm not sure we are getting much actual evolution. Efficiency/productivity could be improved a lot by changing a couple of things:

    1. Collapsing the digital and analog flows together - power management is an analog problem (the digital and software guys seem to be in denial about it).

    2. Get above RTL for design - putting clocks in the requirements/design spec gets in the way of the tools. We could move up to an asynchronous, more software-like methodology, which would work across more domains than logic design.

    Statistical and formal methods should help too.

    Posted by simguru

  9. LinkedIn Groups : ASIC Design & Verification
    Directed testing can never be killed, as companies always want to test the chip in the situations where the module or chip is mostly used. After directed testing, for fuller testing, we take the help of directed random testing. One disadvantage of random testing is that it may or may not hit the most wanted scenarios (if not written properly). The second is that it takes long simulation time.

    Posted by Narendra Bansal

  10. From Semiwiki -

    Maybe an improvement in verification tools is not the only answer?

    Some of the solution has to come from the architecture, microarchitecture and design flows, and not necessarily from verification.
    The actual design coding nowadays is very similar to the methods used 10-15 years ago, putting most of the workload on verification while the designer is free to create whatever he wants.
    I believe that a better definition of the microarchitecture, in terms of well-defined logical blocks that can be modeled, pre-designed and pre-verified, would result in much faster integration and lower verification effort.
    This approach is also the basis for attempts at High Level Synthesis, trying to use a higher abstraction level and prevent low-level bugs.
    To implement that, we will probably need some good libraries of generic low-level design components that will serve as a “language” for communication between design and architecture teams.

    Amnon
