Comments on siddhakarana: "Verification claims 70% of the chip design schedule!"
From Semiwiki, 2012-05-11:

Maybe an improvement in verification tools is not the only answer?

Some of the solution has to come from the architecture, microarchitecture, and design flows, and not necessarily from verification. The actual design coding nowadays is very similar to the methods used 10-15 years ago, putting most of the workload on verification while the designer is free to create whatever he wants.

I believe that a better definition of the microarchitecture, in terms of well-defined logical blocks that can be modeled, pre-designed, and pre-verified, would result in much faster integration and lower verification effort. This approach is also the basis for attempts at High-Level Synthesis, trying to use a higher abstraction level and prevent low-level bugs.

To implement that, we will probably need good libraries of generic low-level design components that will serve as a "language" for communication between the design and architecture teams.

Posted by Amnon
LinkedIn Groups (ASIC Design & Verification), 2012-05-10:

Directed testing can never be killed, as companies always want to test the chip in the situations where the module or chip is most used. After directed testing, for fuller coverage, we take the help of directed random testing. One disadvantage of random testing is that it may or may not hit the most wanted tests (if not properly written). Another is that it takes long simulation time.

Posted by Narendra Bansal
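Narendra's point that random tests "may or may not hit the most wanted tests (if not properly written)" can be sketched with a toy experiment. Everything below is illustrative and hypothetical: the 8-bit operand space, the SUB opcode, and the underflow corner case are invented for the example, not taken from any real design.

```python
import random

SUB = 1  # hypothetical 2-bit opcode for subtraction

def directed_tests():
    # Directed testing: hand-picked vectors that hit a known
    # corner case (subtraction underflow, i.e. a < b) directly.
    return [(0, 1, SUB), (0, 255, SUB), (1, 2, SUB)]

def random_tests(n, constrained=False):
    # Unconstrained random stimulus hits the corner only by luck;
    # a simple constraint (bias op toward SUB) raises the hit rate.
    tests = []
    for _ in range(n):
        op = SUB if (constrained and random.random() < 0.5) else random.randrange(4)
        tests.append((random.randrange(256), random.randrange(256), op))
    return tests

def hits_corner(vec):
    a, b, op = vec
    return op == SUB and a < b

random.seed(0)
plain = sum(map(hits_corner, random_tests(1000)))
constr = sum(map(hits_corner, random_tests(1000, constrained=True)))
print(plain, constr)  # the constrained run hits the corner far more often
```

Every directed vector hits the corner by construction, while the unconstrained random run reaches it only a fraction of the time, which is exactly the trade-off the comment describes.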
From Semiwiki, 2012-05-09:

"While the biggies of the EDA industry are evolving the tools incessantly"

I'll give you mutating; I'm not sure we are getting much actual evolution. Efficiency and productivity could be improved a lot by changing a couple of things:

1. Collapsing the digital and analog flows together: power management is an analog problem (the digital and software guys seem to be in denial about it).
2. Getting above RTL for design: putting clocks in the requirements/design spec gets in the way of the tools. We could move up to an asynchronous, more software-like methodology, which would work across more domains than logic design.

Statistical and formal methods should help too.

Posted by simguru
2012-05-09:

Tao Wang,

Yes, the way we designed in the past has been carried forward to today, but there has been a paradigm shift in it too. Instead of building from scratch, we now focus on reusability, which is why the IP market is flourishing with strong prospects. Two reasons for this:
1. Time to market. The products we designed 10 years back enjoyed a longer product lifetime, which is not the case now; plus, early designs can grab market share fast.
2. Reusability has made life easier for designers to some extent. In fact, in one of my current experiences, the design team is delivering integrated blocks every day, but verification takes ~2 weeks to verify each integration.

Verification engineers also reuse a lot within a project or from past projects, but that is tricky, and debugging it is sometimes a challenge. Stay tuned for a post on this.

The complexity has reached a level where, instead of designers, program managers are now running the show. As per Wally Rhines (CEO, Mentor) in one of his keynotes: "The verification engineer is no longer a 2nd-class citizen. In fact, the time has possibly come that the designer will actually be subordinated to the verification guy."

Posted by Gaurav Jalan
From Semiwiki, 2012-05-09:

Design complexity has grown linearly over the past two decades, but verification complexity has grown exponentially. Consider this: the design method we use for a 10M-gate chip now is almost the same as the one we used for a 1M-gate chip 10 years ago, but the verification method is completely different. Too many functions cross each other and challenge the verification engineers. From my personal experience, 70% of the total effort is the lower bound.

However, it is interesting that even though verification is so important and claims 70% of the chip design effort, all the projects I know of are run by people with a design background.

Posted by Tao Wang

2012-05-09:

Yes, I agree that it hasn't taken off, and there could be multiple reasons for that, like the need for it, tool support, etc. Even now, HLS has been heavily touted for the past 2-3 years, but adoption is slow. We still haven't reached the point where the industry embraces it openly. If we look back, we may never skip RTL verification completely, just as we haven't skipped GLS. Check this:
http://whatisverification.blogspot.in/2011/06/gate-level-simulations-necessary-evil.html

~Gaurav
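Tao Wang's contrast between linear design growth and exponential verification growth can be made concrete with simple arithmetic: adding state bits grows the design roughly linearly, but the space an exhaustive verifier would have to cover grows as 2**n. The numbers below are a generic illustration, not figures from any particular chip.

```python
# Illustrative only: design "size" scales with the number of state
# bits, but the exhaustive verification space scales as 2**bits.
def state_space(bits):
    return 2 ** bits

for bits in (10, 20, 30):
    print(bits, state_space(bits))

# Doubling the state from 10 to 20 bits (a linear 2x in design size)
# multiplies the state space by 2**10 = 1024.
assert state_space(20) // state_space(10) == 1024
```

This is why simulation alone cannot keep up and why constrained-random, coverage-driven, and formal techniques dominate verification effort on large designs.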
2012-05-09:

Gaurav,

Thanks. HLS _may_ be the answer to the problem of wasted effort on verification. But the question that needs to be asked is this: HLS has been around in various forms for many decades (since the 1960s, IIRC, and it actually predates RTL synthesis). So why didn't it take off? What is preventing it from being the savior of the current methodology flow?

- Swapnajit
2012-05-07:

Hi Swapnajit,

You are bringing out a fine point. Yes, that is where the bottleneck is felt. In fact, here is an alternative highlighted at one of the recent conferences:
"If we ambitioned to create bug-free designs, generating them in a correct-by-construction fashion from high-level specifications, the verification task would obviously be made a lot easier. Confirming this trend, 68% of respondents to a recent industry survey ranked “reducing the verification effort” their #1 motivation for adopting high-level synthesis (HLS)."

Thanks & Regards,
Gaurav Jalan
LinkedIn Group (Verification Management), 2012-05-07:
Discussion: Verification claims 70% of the chip design schedule!

If the "70%" claim is correct, it is interesting to note that this 70% is spent on finding incongruence between definition creation, a.k.a. specification writing (3-5%, say), and definition implementation, i.e., RTL design (15%, say; I keep the remaining 10-12% for PD and other activities).

So, wouldn't it be nice if we had a flow that eliminated the basic problem of translating a badly written or non-existent specification into RTL Verilog code?

Posted by Swapnajit Mitra
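Swapnajit's rough schedule split can be sanity-checked with quick arithmetic. The exact percentages below are his ballpark figures (taking 4% for specification and 11% for PD and other activities, the midpoints of the ranges he gives), not measured data.

```python
# Swapnajit's rough split of the chip design schedule (percent).
schedule = {
    "verification": 70,
    "specification": 4,   # he says 3-5%
    "rtl_design": 15,
    "pd_and_other": 11,   # he says 10-12%
}

total = sum(schedule.values())
print(total)  # 100

# Under this split, verification consumes several times the
# effort of the RTL design it is checking.
ratio = schedule["verification"] / schedule["rtl_design"]
print(round(ratio, 2))
```

The striking part of his observation is the ratio: the bulk of the schedule goes to checking a translation step that itself takes only about 15% of the schedule.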
LinkedIn Group (SystemVerilog for Verification), 2012-05-05:

Nice post and an enjoyable read. I wrote a paper on the "70% myth" for SNUG back in 2008, which covered a lot of other myths too, for example:
- Half of all chip developments require a re-spin, three quarters of those due to functional bugs
- Pseudo-random simulation has killed off directed testing
- Formal verification will eventually replace simulation
See http://testandverification.com/uncategorized/lies-damned-lies-and-hardware-verification/

Posted by Mike Bartley