
Sunday, January 27, 2013

Evolution of the test bench - Part 2

In the last post, we looked at the directed verification approach, where the test benches were typically dumb while the tests comprised the stimuli and monitors. Progress on verification was in a linear relationship with the number of tests developed and passing. There was no concept of functional coverage, and even the usage of code coverage was limited. Apart from HDLs, programming languages like C & C++ continued to support the verification infrastructure. Managing the growing complexity and constant pressure to compress the design schedule demanded an alternate approach to verification. This gave birth to a new breed of languages – HVLs (Hardware Verification Languages).
 
HVLs
 
The first one in this category was introduced by Verisity, popularly known as the ‘e’ language. It was based on AOP (Aspect Oriented Programming) and required a separate tool (Specman) in addition to the simulator. This language spearheaded the entry of HVLs into verification and was followed by ‘Vera’, promoted by Synopsys and based on OOP (Object Oriented Programming). Along with these two languages, SystemC tried to penetrate this domain with support from multiple EDA vendors but couldn’t really gain wide acceptance. The main idea promoted by all these languages was CRV (Constrained Random Verification). The philosophy was to empower the test bench with drivers, monitors, checkers and a library of sequences/scenarios. The generation of tests was automated, with the state-space exploration guided by constraints and progress measured using functional coverage.
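The CRV idea above can be sketched in SystemVerilog (the class and field names here are illustrative, not taken from any particular methodology): a transaction class carries random fields, constraints keep the generated stimulus legal, and a covergroup records which parts of the state space were actually exercised.

```systemverilog
// Hypothetical packet transaction - all names are illustrative.
class packet;
  rand bit [7:0]  addr;
  rand bit [31:0] data;
  rand bit [3:0]  len;

  // Constraints steer random generation toward legal, interesting stimulus.
  constraint legal_len  { len inside {[1:8]}; }
  constraint addr_align { addr % 4 == 0; }

  // Functional coverage measures which parts of the state space were hit.
  covergroup cg;
    cp_len  : coverpoint len;
    cp_addr : coverpoint addr { bins low = {[0:127]}; bins high = {[128:255]}; }
  endgroup

  function new();
    cg = new();
  endfunction
endclass

module tb;
  initial begin
    packet p = new();
    repeat (100) begin
      if (!p.randomize()) $error("randomization failed");
      p.cg.sample();   // record coverage for this transaction
    end
    $display("coverage = %0.2f%%", p.cg.get_coverage());
  end
endmodule
```

The same division of labor - constraints for generation, covergroups for measurement - is what e and Vera offered in their own syntax.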
 
Methodologies
 
As adoption of these languages spread, the early adopters started building proprietary methodologies around them. To modularize development, each organization developed its own BCLs (Base Class Libraries). Maintaining local libraries and continuously improving them while ensuring simulator compatibility was not a sustainable solution. The EDA vendors came forward with methodologies for each of these languages to resolve this issue and standardize usage of the language. Verisity led the show with eRM (e Reuse Methodology), followed by RVM (Reference Verification Methodology) from Synopsys. These methodologies helped put in place a process to move from block to chip level and across projects in an organized manner, thereby laying the foundation for reuse. Though verification was progressing at a fast pace with these entrants, there were some inherent issues with these solutions that left the industry wanting more. The drawbacks include –
 
- Requirement for an additional tool license beyond the simulator
- Simulator efficiency suffered because control had to pass back & forth to this additional tool
- These solutions had limited portability across simulators
- As reusability picked up, finding VIPs written in a given HVL was difficult
- Hardware accelerators started picking up and these HVLs couldn’t complement them completely
- Ramp-up time for engineers moving across organizations was high
 
System Verilog
 
To move to the next level of standardization, Accellera decided to improve on Verilog instead of driving e or Vera as the industry standard. This led to the birth of System Verilog, which proved to be a game changer in multiple respects. The primary motivation behind driving SV was to have a common language for design & verification that addressed the issues with the other HVLs. The initial thrust for System Verilog came from Synopsys, which declared Vera open source and extended its contribution to the definition of System Verilog for verification. Further, Synopsys in association with ARM moved RVM to VMM (Verification Methodology Manual), based on System Verilog, providing a framework for early adopters. With IEEE recognizing SV as a standard (1800) in 2005, the acceptance rate increased further. By this time Cadence, after its quest to promote SystemC as a verification language, had acquired Verisity. eRM was transformed into URM (Universal Reuse Methodology), which supported e, SystemC and System Verilog. This was followed by Mentor proposing AVM (Advanced Verification Methodology), supporting System Verilog & SystemC. Though System Verilog settled the dust by claiming the maximum footprint across organizations, the availability of multiple methodologies introduced inertia to industry-wide reusability. The major issues faced include –
 
- Learning a new methodology almost every 18 months
- The methodologies had limited portability across simulators
- A verification environment developed using VIP from one vendor was not easily portable to another
- Teams confused about the road maps for these methodologies based on industry adoption
 
Road to UVM
 
To tone down this problem, Mentor and Cadence merged their methodologies and came up with OVM (Open Verification Methodology), while Synopsys continued to stick to VMM. Though the problem was reduced, there was still a need for a common methodology, and Accellera took the initiative to develop one. UVM (Universal Verification Methodology), largely based on OVM and deriving features from VMM, was finally introduced. While IEEE recognized ‘e’ as a standard (1647) in 2011, it was already too late. Functional coverage, assertion coverage and code coverage all joined together to provide the quantitative metrics to answer ‘are we done’, giving rise to CDV (Coverage Driven Verification).
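As a rough flavor of what UVM standardized, a minimal test skeleton looks like the sketch below (the user-defined class names are hypothetical): components are constructed through the common phasing mechanism and factory, and the test to run is selected by name, which is what makes environments portable across vendors.

```systemverilog
// Minimal UVM skeleton - my_env and base_test are illustrative names.
import uvm_pkg::*;
`include "uvm_macros.svh"

class my_env extends uvm_env;
  `uvm_component_utils(my_env)
  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction
endclass

class base_test extends uvm_test;
  `uvm_component_utils(base_test)
  my_env env;

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  // Standard UVM phasing builds the hierarchy before the run phase;
  // the factory create() call allows type overrides without editing code.
  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    env = my_env::type_id::create("env", this);
  endfunction
endclass

module tb;
  initial run_test("base_test");  // test selected at run time (+UVM_TESTNAME)
endmodule
```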
 
Suggested Reading - Standards War

Sunday, December 26, 2010

Standards War : A Necessary Evil

Which HVL do you use for verification? Which methodology do you follow? Which power format does your team use? Do these questions sound familiar?
No, these aren’t interview questions. These are the conversation starters of verification world!
Last few years have witnessed maximum activity in the verification space. Every EDA company was busy gearing up to propose a solution in different areas. Verification being touted as the most resource consuming activity has been a darling of the EDA community. With so much action in this space, the engineers are left in a confused state. 
The options available in various verification categories include –
HVL – e (IEEE 1647), System Verilog (IEEE 1800), Vera (Open source).
Methodology [MET] – eRM (e), RVM (Vera), AVM, URM, VMM, OVM, UVM.
Power Format [PF] – CPF, UPF (IEEE 1801).
The alternatives listed above have witnessed a full course of evolution before wide acceptance. The choice still depends on the marketing strategy, sales pitch and the final deal!!! From the launch of a solution to it becoming an industry standard, it’s a long road to survival. During this process, verification teams spoilt for choice experience multiple challenges. A few of them include –
- For SOC development, getting third-party IP is inevitable. If the verification environment accompanying the IP is in a different HVL/MET/PF, then integration becomes an issue.
- The choice of verification environment for IP vendors becomes critical too. A wrong choice in the above categories can even put them out of the race with some customers.
- The ASIC flow involves tools from multiple vendors. A power format supported by verification tools might not work with implementation tools, thereby adding inefficiencies to the flow.
- Integrating tools from multiple vendors for verification (through PLIs etc.) may lead to long iterations (delaying the schedule) in tool debug if unusual behavior is encountered.
- Given the multiple options above, hiring engineers with the required skill set, or training them as desired, further delays productivity.
If having alternatives is the root cause of these issues why do we have them?
Well, to solve the next level of engineering challenges, we need to make sure the solutions to the present ones are mature enough. As designs kept unfolding new problems, answers were made available by EDA vendors. To get a stable solution, either we brainstorm together (unusual in a competitive world) or each vendor brings up a solution and the better one is chosen when standardizing. Multiple designs, multiple engineers and multiple customers work towards making the solution robust and mature. While the EDA vendors are busy wooing customers, they continue untiring efforts to get their proposal accepted as an industry standard. When the industry moves to the next level of design complexity, the war for defining a standard becomes a need. The ‘war’ that was feeds into bringing up a standard that makes the process simpler and sets the stage to solve the next engineering challenge.
No doubt this WAR ends up in a HEALTHY outcome!

Sunday, September 26, 2010

Living the PRESENT - the current decade!

The present decade (2001-2010) witnessed the metamorphosis of the verification domain from adolescence to maturity. A lot of the initiatives from the last decade paid off well to scale verification along with the ASIC designs, while a lot of new investments were made keeping the next decade in mind. Some verification features that made their mark in this decade include -

1. System Verilog - touted as a one-stop solution for HDL+HVL, addressing the limitations of the designers using HDLs (Verilog/VHDL) and the verification engineers debating over HVLs (e/Vera). An extension of Verilog 2005 and largely based on the OpenVera language, System Verilog became a darling of both the semiconductor and EDA companies.

2. Standardization of HVLs e & SV - System Verilog was adopted as IEEE Standard 1800-2005 and then merged with Verilog IEEE 1364-2005, leading to IEEE Standard 1800-2009. The 'e' language, promoted by Verisity and bought by Cadence, went on to be standardized as IEEE 1647-2006.

3. CDV - Code coverage stagnated as a baseline for measuring verification progress & completeness. Functional coverage, supported by the HVLs, was able to put forward a different dimension and was widely accepted. Assertion coverage, though promising, wasn't able to spread its wings wide enough, partly since it was scenario focussed and partly because of ownership issues between design & verification teams. Even test planning evolved a lot, with many tools now aiding in developing a robust test plan by keeping track of the architecture document and extending means for defining comprehensive coverage goals upfront.
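As an example of the assertion coverage mentioned above, a SystemVerilog assertion can both check a protocol rule and record that the guarded scenario actually occurred (the handshake signals and timing window here are illustrative):

```systemverilog
// Illustrative handshake rule: req must be followed by ack within 4 cycles.
module handshake_checker(input logic clk, rst_n, req, ack);
  property p_req_ack;
    @(posedge clk) disable iff (!rst_n)
    req |-> ##[1:4] ack;
  endproperty

  // The assertion flags protocol violations during simulation.
  assert property (p_req_ack)
    else $error("ack did not follow req within 4 cycles");

  // The cover directive contributes to assertion coverage: it records
  // that the req-to-ack scenario was actually exercised at least once.
  cover property (@(posedge clk) req ##[1:4] ack);
endmodule
```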

4. Reusability - A jargon that still makes news every now & then. Design complexity jumping to new heights and time-to-market panic led to the evolution of IP designs and reusability. Along with design IPs, verification IPs also defined their market quite strongly. Methodologies (eRM & RVM) helped in packaging the verification environment so as to make it portable from module to chip level and between programs. SV evolution brought in OVM & VMM, complementing eRM & RVM respectively. [Note - OVM is a mix of eRM & AVM]. Finally, this decade will conclude with the standardization of a methodology - UVM.

5. AMS - The intricacies of analog always kept it behind the digital world in terms of advancements in process technologies and other methodologies. Product demands from various domains led to single-chip (packaging) and later single-die solutions. Since analog & digital designers were always alien to each other, these integrated designs demanded verification. AMS simulations addressed this space with various schemes, like gate-level representation of digital & transistor-level representation of analog, or gate-level or RTL representation of digital & behavioral modeling of the analog, etc.

6. Formal verification - Directed verification opened the gates to constrained random verification (CRV), where unforeseen scenario generation was the focus. Given time & compute resource limitations, CRV struggled to reach all possible points in the designs in a defined time. The formal approach proposes to resolve these limitations. However, slow advancement of the tools limited the adoption of this methodology for most part of the decade.

7. Low power verification - Area and performance had been defining ASIC designs until power became an important measure. Innovative design techniques reduced power consumption at the cost of overhead logic. EDA tools capable of verifying power-aware designs - MVSIM & MVRC from ArchPro (later acquired by Synopsys) and Cadence IUS & Conformal LP - did quite well to address this issue. UPF & CPF emerged as two power formats that rushed to address the representation of power-aware designs. [UPF - IEEE 1801-2009].
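As a flavor of what these power formats capture, a minimal UPF fragment might look like the sketch below (the domain, instance and net names are illustrative assumptions): power domains are declared, supply nets connected, and isolation specified so a switchable domain's outputs are clamped when it powers down.

```tcl
# Minimal UPF sketch - PD_TOP, PD_CORE, u_core and the net names are illustrative.
create_power_domain PD_TOP
create_power_domain PD_CORE -elements {u_core}

create_supply_net VDD      -domain PD_TOP
create_supply_net VSS      -domain PD_TOP
create_supply_net VDD_CORE -domain PD_CORE

set_domain_supply_net PD_CORE -primary_power_net VDD_CORE \
                              -primary_ground_net VSS

# Isolate outputs of the switchable domain when it is powered down.
set_isolation iso_core -domain PD_CORE \
    -isolation_power_net VDD -isolation_ground_net VSS \
    -clamp_value 0 -applies_to outputs
```

Verification tools read this side file against the RTL to model power-down, isolation and retention behavior without changing the design source.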
 
8. Hardware accelerators - More gates, big designs, long simulation run times = a bottleneck to taping out in time. The next generation of hardware accelerators paved the way for reducing simulation turnaround time. These boxes soon took the shape of a complete platform for verification of the whole system, and further advanced to help simulate some real-time data into the designs before tapeout with much ease.

9. HW SW co-verification - With design focus shifting from the ASIC to the SOC, system design & modelling became all the more important. Software testing became the next chokepoint for time to market. A well-defined platform where software design & testing could start in parallel to the ASIC design came to the forefront. Better HW SW partitioning, performance checks, system design issues etc. were all well addressed with this platform.

10. Courses in verification - VLSI courses became more prominent and were adopted in almost all electrical & electronics programs. Late in the decade, the need for focussed verification courses took centre stage. Many institutes delivering these courses addressed the need for a competent verification work force.

Next -
'Predictions for future - the next decade!'

Previous -
'Recap from the past - the last decade!'