Sunday, January 27, 2013

Evolution of the test bench - Part 2

In the last post, we looked into the directed verification approach, where the test benches were typically dumb and the tests comprised the stimuli and monitors. Progress on verification bore a linear relationship to the number of tests developed and passing. There was no concept of functional coverage, and even the usage of code coverage was limited. Apart from HDLs, programming languages like C & C++ continued to support the verification infrastructure. Managing the growing complexity under constant pressure to compress the design schedule demanded an alternative approach to verification. This gave birth to a new breed of languages: HVLs (Hardware Verification Languages).
 
HVLs
 
The first one in this category, introduced by Verisity, was popularly known as the ‘e’ language. It was based on AOP (Aspect Oriented Programming) and required a separate tool (Specman) in addition to the simulator. This language spearheaded the entry of HVLs into verification and was followed by ‘Vera’, a language based on OOP (Object Oriented Programming) and promoted by Synopsys. Alongside these two languages, SystemC tried to penetrate this domain with support from multiple EDA vendors but never really gained wide acceptance. The main idea promoted by all these languages was CRV (Constrained Random Verification). The philosophy was to empower the test bench with drivers, monitors, checkers and a library of sequences/scenarios. Test generation was automated, with state space exploration guided by constraints and progress measured using functional coverage.
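 
To make CRV concrete, here is a minimal SystemVerilog sketch (the bus_txn fields, constraint ranges and coverage bins are hypothetical, chosen purely for illustration): a rand class confines the constraint solver to legal stimulus, while a covergroup measures which parts of the space were actually exercised.
 
```systemverilog
// Hypothetical bus transaction; fields, ranges and bins are illustrative only
class bus_txn;
  rand bit [7:0]  addr;
  rand bit [31:0] data;
  rand bit        write;

  // Constraints confine the solver to legal, interesting stimulus
  constraint legal_addr   { addr inside {[8'h10 : 8'hEF]}; }
  constraint write_biased { write dist {1 := 7, 0 := 3}; }
endclass

module tb;
  // Functional coverage records which parts of the space were exercised
  covergroup bus_cov with function sample(bit [7:0] addr, bit write);
    cp_addr : coverpoint addr {
      bins low  = {[8'h10 : 8'h7F]};
      bins high = {[8'h80 : 8'hEF]};
    }
    cp_write : coverpoint write;
    cp_cross : cross cp_addr, cp_write;
  endgroup

  bus_cov cov = new();

  initial begin
    bus_txn t = new();
    repeat (100) begin
      void'(t.randomize());          // solver explores the constrained space
      cov.sample(t.addr, t.write);   // progress measured as coverage, not test count
    end
    $display("functional coverage = %0.2f%%", cov.get_coverage());
  end
endmodule
```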
 
Methodologies
 
As adoption of these languages spread, the early adopters started building proprietary methodologies around them. To modularize development, BCLs (Base Class Libraries) were developed by each organization. Maintaining these local libraries and continuously improving them while ensuring simulator compatibility was not a sustainable solution. The EDA vendors came forward with methodologies for each of these languages to resolve this issue and standardize usage of the language. Verisity led the show with eRM (e Reuse Methodology), followed by RVM (Reference Verification Methodology) from Synopsys. These methodologies established a process for moving from block to chip level, and across projects, in an organized manner, thereby laying the foundation for reuse. Though verification was progressing at a fast pace with these entrants, there were some inherent issues with these solutions that left the industry wanting more. The drawbacks included:
 
- Requirement for an additional tool license beyond the simulator
- Simulator efficiency took a hit as control passed back and forth between the simulator and this additional tool
- These solutions had limited portability across simulators
- As reuse picked up, finding VIPs based on a given HVL was difficult
- Hardware accelerators started picking up, and these HVLs couldn't complement them completely
- Ramp-up time for engineers moving across organizations was high
 
SystemVerilog
 
To move to the next level of standardization, Accellera decided to improve on Verilog instead of driving e or Vera as the industry standard. This led to the birth of SystemVerilog, which proved to be a game changer in multiple respects. The primary motivation behind driving SV was to have a common language for design & verification and to address the issues with the other HVLs. The initial thrust came from Synopsys, which declared Vera open source and extended its contribution to the definition of SystemVerilog for verification. Further, Synopsys, in association with ARM, evolved RVM into VMM (Verification Methodology Manual) based on SystemVerilog, providing a framework for early adopters. With IEEE ratifying SV as a standard (1800) in 2005, the acceptance rate increased further. By this time Cadence, after its quest to promote SystemC as a verification language, had acquired Verisity. eRM was transformed into URM (Universal Reuse Methodology), which supported e, SystemC and SystemVerilog. This was followed by Mentor proposing AVM (Advanced Verification Methodology), supporting SystemVerilog & SystemC. Though SystemVerilog settled the dust by claiming the maximum footprint across organizations, the availability of multiple methodologies introduced inertia into industry-wide reuse. The major issues included:
 
- Learning a new methodology almost every 18 months
- The methodologies had limited portability across simulators
- A verification environment developed using VIP from one vendor was not easily portable to another
- Teams were confused about the road maps for these methodologies, given varying industry adoption
 
Road to UVM
 
To ease this problem, Mentor and Cadence merged their methodologies into OVM (Open Verification Methodology), while Synopsys continued to stick with VMM. Though the problem was reduced, the need for a single common methodology remained, and Accellera took the initiative to develop one. UVM (Universal Verification Methodology), largely based on OVM and deriving features from VMM, was finally introduced. While IEEE recognized ‘e’ as a standard (1647) in 2011, by then it was already too late. Functional coverage, assertion coverage and code coverage joined together to provide the quantitative metrics that answer ‘are we done?’, giving rise to CDV (Coverage Driven Verification).
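 
For a flavour of how assertion coverage feeds these metrics, here is a minimal SystemVerilog sketch (the req/ack handshake and its timing rule are hypothetical): the same property both checks a protocol rule and, through a cover directive, confirms the scenario was actually exercised.
 
```systemverilog
// Hypothetical req/ack handshake; the protocol rule is illustrative only
module handshake_checker (input logic clk, req, ack);
  // Rule: every request must be acknowledged within 1 to 4 cycles
  property p_req_ack;
    @(posedge clk) req |-> ##[1:4] ack;
  endproperty

  // The assert directive flags violations of the rule ...
  a_req_ack : assert property (p_req_ack)
    else $error("ack did not follow req within 4 cycles");

  // ... while the cover directive counts toward the 'are we done' metrics
  c_req_ack : cover property (p_req_ack);
endmodule
```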
 
Suggested Reading - Standards War