Sunday, October 18, 2015

The magical chariot in verification!

As the holiday season kicked off in India, the antennas of my brain tingled to intercept what is going on around me and relate it to verification. In the past, I have attempted to correlate off-topic subjects with verification around this time every year. Dropping the mainstream sometimes helps, as it gives you a different perspective and a reason to think beyond normal, to think out of the box, to see solutions in a different context and apply them to yours – in this case, verification! The problem statement: driving verification closure amidst growing complexity and shrinking schedules.
Before I move forward, let me share the context of these holidays. This is the time in India when festivities are at their peak for a few weeks. Celebration is in the air, and the diversity of the culture makes it even more fascinating. People from all over India celebrate this season and relate it to various mythological stories while worshipping different deities. The common theme across them is that in the war of good and evil, good prevails in the end! What is interesting, though, are the different stories associated with each culture detailing these wars between good and evil. As evil grows and good evolves to fight it, both acquire different weapons to attack as well as defend. And when the arsenal at both ends is equally matched, the launch-pad becomes a critical factor in deciding the outcome. Possibly that is another reason why different deities ride different animals, and why some of these stories talk about magical chariots that made all the difference to the war.
So how does this relate to verification?
As verification engineers, our quest for bugs amidst growing complexity has made us acquire different skills. We started off with directed verification using HDLs/C/scripts and soon moved to constrained-random verification. Next we picked up different coverage metrics, i.e. functional coverage, code coverage and assertions. As we marched further, we added formal apps to take care of the housekeeping items that every project needs. A new tool or flow gets added almost every couple of years, in line with Moore's law :). And now if we look back, the definition of verification as a non-overlapping concern (functional only) in the ASIC design cycle a few decades ago is all set to cross paths with the then-perceived orthogonal concerns (clock, power, security and software). While we continue to add a new flow, tool or methodology for each of these challenges rocking the verification boat, what hasn't changed much in all these years is the platform that verification teams continue to use. Yes, new tools and techniques are required, but are these additions bringing the next leap that is needed, or are they just coping with the task at hand? Is it time to think different? Time to think beyond normal? Time to think out of the box? And if YES, what could be a potential direction?
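For readers outside the flow, here is a toy sketch of the constrained-random-plus-functional-coverage idea mentioned above. In real projects this lives in SystemVerilog/UVM; the Python below, along with the packet fields, constraints and coverage bins, is purely illustrative:

```python
import random

class PacketGen:
    """Toy constrained-random packet generator with functional coverage.

    Illustrative only: real testbenches express these constraints and
    covergroups in SystemVerilog, with a solver behind them.
    """

    def __init__(self, seed=0):
        self.rng = random.Random(seed)
        self.covered_lengths = set()  # functional coverage bins hit so far

    def randomize(self):
        # Constraint 1: length is a power of two between 64 and 1024 bytes
        length = 2 ** self.rng.randint(6, 10)
        # Constraint 2: priority packets never use the longest length
        priority = self.rng.choice([True, False]) if length < 1024 else False
        self.covered_lengths.add(length)
        return {"length": length, "priority": priority}

    def coverage(self):
        # 5 bins: 64, 128, 256, 512, 1024
        return len(self.covered_lengths) / 5.0

gen = PacketGen(seed=42)
pkts = [gen.randomize() for _ in range(50)]
# Every generated packet obeys both constraints by construction
assert all(p["length"] in {64, 128, 256, 512, 1024} for p in pkts)
assert all(not (p["priority"] and p["length"] == 1024) for p in pkts)
print(f"functional coverage: {gen.coverage():.0%}")
```

The point of the sketch is the division of labour: randomization explores the legal stimulus space under constraints, while coverage measures how much of the intended space the random walk has actually touched – the gap between the two is what drives verification closure.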
This is where I come back to the mythological stories: when the arsenal wasn't enough, it was the magical chariot that did the trick! Yes, maybe the answer lies in bringing the change to the platform – our SIMULATORS – the workhorse of verification! Interestingly, the answers do not need to be invented. Alternate solutions are already available in the form of VIRTUAL PROTOTYPING or HARDWARE ACCELERATORS/EMULATORS for RTL simulations. Adopting these solutions would give an edge over both the bugs causing the menace and the competition! And for those who think it is costly to adopt, a lost market window for the product could be even costlier!!!
As Harry Foster mentioned in his keynote at DVCon India 2015, it's about time to bring in the paradigm shift from "increasing cycles of verification" TO "maximizing verification per cycle". He also quoted Henry Ford, the legend who founded the Ford Motor Company and revolutionized transportation and American industry.
On that note, I wish you all a Happy Dussehra!!! Happy Holidays!!!

If you liked this post you would love -
Mythology & Verification
HANUMAN of verification team!
Trishool for verification
Verification and firecrackers


  1. The Wilson Research Group 2014 functional verification study exposes many interesting trends in the techniques used and troubles seen by both designers and verification engineers. Harry Foster of Mentor Graphics has been blogging about the study for some time now.

    So if the verification challenge is growing exponentially, why is time spent on verification declining according to the study? One trend is that emulation is playing a bigger role in verification. According to the Wilson study, one-third of the industry has adopted emulation, and for verifying large designs over 80 million gates, almost 60% of companies use emulation.

    Another trend is the adoption of static technologies which don’t use simulation but a combination of structural intent analysis with automatic formal engines. The growth of these static technologies has clearly taken off, with the study showing the adoption of static verification for design projects has jumped 62% in two years.

    What is in this static category? Harry identifies a list of these applications including SoC integration connectivity checking, deadlock detection, X-semantic safety checks and coverage reachability analysis. I would add clock-domain crossing, X-optimism/pessimism analysis, timing constraint validation and lint checks to the list as well.

    Harry’s conclusion is that the 62% growth rate for formal solutions makes it one of the fastest-growing segments in functional verification. At Real Intent, we see this as an important driver for the adoption of our solutions in the marketplace, which rely on static techniques.

    To me the answer is clear. Static analysis is the next big thing in verification. It is essential for handling billion-gate designs and getting to RTL sign-off.

  2. Hi Graham,

    Thanks much for sharing your expert comments. On the relevance of what I have been writing: while I do borrow observations from the Wilson Research study over time, a lot of my posts are modulated by my own observations while leading an army of ~300 verification consultants serving projects across the world. So at times it is in sync with the study and sometimes not. It could be because, for either set of observations, the sample indicative of the trend is still limited, so the percentages may vary from the actual.
    With that, here are my 2 cents :-

    Yes, while the challenge is growing exponentially, there are multiple factors that play a role in controlling the effort. Some of them include reuse of IP/VIP, formal playing a much bigger role than ever, more design derivatives than designs from scratch and, of course, to some extent emulation. So it is not just emulation affecting this trend. The 60% here is companies using emulation and not the number of projects, which may mean the actual adoption is yet to pick up. Note that this is different from static, where it is 62% of projects, which I agree to.

    On your comment on static categories, I am completely in sync. The ones you mentioned explicitly are more prevalent than some of the others in the study. In fact, in my observation, lint, CDC and low-power static are rated highest, followed by connectivity, X-optimism and timing constraint validation. Coverage reachability is more prevalent for IPs vs. SoCs.

    So I completely agree with Harry and yourself on the adoption of formal being the key to pulling back the growing challenge of verification. However, I see formal as already mainstream, and I expect it to see trends similar to the adoption of coverage. Emulation, on the other hand, is something I don't believe is mainstream yet in terms of the associated applications like simulation acceleration, early SW development or architecture & power analysis. Cost is one important factor, given that the cloud model didn't see much appreciation.

    In summary, if someone has not adopted formal yet, the initial condition of this post, i.e. having all the weapons to tackle the war, is not met. It is only after the deployment of formal that I have spoken about emulation :)

    Over to you!