
Sunday, August 17, 2014

Taking a pulse of what's going on in Verification!

Design complexity today is marching forward at an accelerated pace, and its effect on verification is growing in equal leaps and bounds. The world of verification has expanded multi-fold in the past 5 years on all fronts: what to verify, how to verify, and who is required to verify. Today, conferences hosted by EDA partners no longer focus on just a simulator, a language, a methodology or a new tool; instead they bring up discussions on a plethora of topics. To address this growth, Cadence decided to target the 2nd day of CDNLive India 2014 mostly at the DV community, hosting multiple tracks on verification. With hundreds of footfalls, a mixed bag of papers on all aspects of verification, an extended exhibitor pavilion for partners, and lunch and tea sessions busy with networking, the event was truly in sync with the theme: connect... share... inspire!

A decade back, all such events focused on sharing tool upgrades, particularly to the simulator. The technology update covered new features, added language support and improved performance. Today, no one talks about the simulator alone; instead, the discussion is about everything around it getting integrated under one hood, with features like –

  • A broad VIP catalog with added support for easy bring-up of testbenches for IPs/SoCs, and compliance test suites with ready-made coverage/assertions. A new class of accelerated VIPs interfaces easily with emulators, giving a further boost to overall productivity.
  • Updates to power aware verification involving multiple tools (formal, simulator & emulator), with support for different power formats/versions, auto-generation of relevant assertions and different aids to ease debugging of these scenarios.
  • How to model analog blocks to achieve high performance with electrical accuracy as part of the AMS support. Different languages supporting model development, coverage driven verification to validate these models and reuse of the environment when integrated with digital blocks.
  • A portfolio of formal apps that can accomplish the job of static checks related to connectivity, power, registers, X-prop, protocol compliance checks etc.
  • Improved support for IP-XACT based flows to enable register modeling, register verification and interface connectivity, promoting IP reuse with minimal issues.
  • Integrated support for coverage collection and merging across different levels (IP, sub system & SoC) of verification involving different tools or flows used to achieve verification closure.
  • Verification management featuring executable verification plans, regression management, triaging and analysis with different views based on user’s role in the project.
  • Added debug tools and support to view the transactions, filter or play around back and forth with the simulation logs.
  • Hardware accelerators with improved capacity, performance and features that enable detailed debugging, power aware support, assertions and coverage.
  • Prototyping platforms/emulators that enable booting up Android etc. much before the silicon arrives.
  • Improved Virtual platforms and models in sync with the above to enable early software development thereby shifting the whole product development cycle to the left.
  • Performance analysis of the SoC, confirming architecture stability or assisting in promptly exploring alternatives.

Yes! Verification has evolved into a GODZILLA beyond the control of any one HVL, one methodology or one simulator. The rules of the game have changed, and the industry is responding faster than ever to this change. Observing all these changes, I recollect the term EDA360, introduced to the industry by Cadence back in 2010. While the terminology might have lost its steam, the essence of the idea has been realized to a large extent since then. For those who missed reading about it, please refer to the posts below summarizing EDA360. Believe me, it's worth reading!


Do you agree that the current state is in sync with EDA360? Leave a comment!

If you missed any of these events, don't miss DVCON India 2014 on Sept 25-26. Registrations are now open.

Sunday, December 29, 2013

Sequence Layering in UVM

Another year passed by, and the verification world saw further advancements in how to better verify the rising complexity. At the core of verification there are two pillars that have been simplifying this complexity all throughout. First is REUSABILITY at various levels, i.e. within a project from block to top level, across projects within an organization, and in deploying VIPs or a methodology like UVM across the industry. Second is raising ABSTRACTION levels, i.e. from signals to transactions and from transactions to further abstraction layers, so as to deal with complexity in a better way. Sequence layering is one concept that incorporates the best of both.
 
What is layering?
 
In simple terms, layering is where one encapsulates a detailed/complex function or task and moves it to a higher level. In day to day life, layering is used everywhere. Instead of describing a human being as an entity with 2 eyes, 1 nose, 2 ears, 2 hands, 2 legs etc., we simply say "human being". When we go to a restaurant and order an item from the menu, say barbeque chicken, we don't explain the ingredients and the process to prepare it. The latest example is apps on handhelds, where with just a click on an icon everything relevant is available. In short, we hide the complexity of the implementation and its associated details while making use of a generic interface.
 
What is sequence layering?
 
Applying the concept of layering to sequences in UVM (or any methodology) improves code reusability by enabling development at a higher abstraction level. This is achieved by adding a layering agent derived from uvm_agent. The layering agent has no driver; all it has is a sequencer and a monitor. A high-level sequence item is associated with this layering sequencer, which connects to the sequencer of the lower-level protocol using the same mechanism that connects the sequencer and driver inside a uvm_agent. The lower-level sequencer runs only one sequence: a translation sequence, running as a forever thread, that takes the higher-level sequence item and translates it into lower-level sequence items. Inside this sequence we call get_next_item, similar to what a uvm_driver does. The item received from the higher-level sequencer is translated by this lower-level sequence and handed to its driver. Once done, an item_done response is passed back from the lower-level sequence to the layering sequencer, indicating that it is ready for the next item.
 
Figure 1 : Concept diagram of the layering agent
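The get_next_item/item_done handshake described above can be sketched in SystemVerilog/UVM. This is a minimal, illustrative sketch only; the item classes, field names and sequencer handle (hl_item, ll_item, hl_sqr, translate) are my assumptions, not from the original paper, and a real translation may emit several low-level items per high-level item.

```systemverilog
`include "uvm_macros.svh"
import uvm_pkg::*;

// Illustrative high-level and low-level items (names are assumptions)
class hl_item extends uvm_sequence_item;
  rand bit [31:0] addr, data;
  `uvm_object_utils(hl_item)
  function new(string name = "hl_item"); super.new(name); endfunction
endclass

class ll_item extends uvm_sequence_item;
  rand bit [7:0] beat;
  `uvm_object_utils(ll_item)
  function new(string name = "ll_item"); super.new(name); endfunction
endclass

// Translation sequence: runs forever on the lower-level sequencer,
// pulling items from the layering (higher-level) sequencer with the
// same get_next_item/item_done handshake a driver normally uses.
class translation_seq extends uvm_sequence #(ll_item);
  `uvm_object_utils(translation_seq)

  // Handle to the layering sequencer; set from the env/agent
  uvm_sequencer #(hl_item) hl_sqr;

  function new(string name = "translation_seq"); super.new(name); endfunction

  virtual task body();
    hl_item hl_tr;
    ll_item ll_tr;
    forever begin
      hl_sqr.get_next_item(hl_tr);  // block until a high-level item arrives
      ll_tr = translate(hl_tr);     // protocol-specific conversion
      start_item(ll_tr);            // send to the lower-level driver
      finish_item(ll_tr);
      hl_sqr.item_done();           // tell the layering sequencer we're done
    end
  endtask

  // Stub translation: maps one high-level item to one low-level item
  virtual function ll_item translate(hl_item tr);
    translate = ll_item::type_id::create("ll_tr");
    translate.beat = tr.data[7:0];
  endfunction
endclass
```

In the environment, the translation sequence would be started on the lower-level sequencer (e.g. trans_seq.start(ll_sqr)) after assigning hl_sqr; higher-level sequences then run on the layering sequencer as usual.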
 
On the analysis front, the same layering agent also has a monitor at the higher level. This monitor is connected to the monitor of the lower-layer protocol. Once a lower-layer packet is received, it is passed on to this higher-level monitor, where it is translated into a higher-level packet based on the configuration information. The higher-level packet is then handed to the scoreboard for comparison. As a result, we need only one scoreboard for all possible configurations of an IP; the layering agent's monitor does the job of translation.
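A matching sketch for the analysis path, again with assumed names: the layering monitor subscribes to the lower-level protocol monitor's analysis port, reassembles the low-level items, and publishes high-level packets for the single scoreboard. The 4-beats-per-packet packing rule below is a placeholder for the real configuration-driven translation.

```systemverilog
// Layering monitor: receives low-level items from the protocol
// monitor's analysis port and rebuilds high-level packets.
// Assumes the hl_item/ll_item classes sketched earlier.
class layering_monitor extends uvm_subscriber #(ll_item);
  `uvm_component_utils(layering_monitor)

  // Feeds the (single) scoreboard with high-level packets
  uvm_analysis_port #(hl_item) hl_ap;

  protected bit [7:0] beats[$];  // buffer of collected low-level beats

  function new(string name, uvm_component parent);
    super.new(name, parent);
    hl_ap = new("hl_ap", this);
  endfunction

  // Called for every low-level item observed by the protocol monitor
  virtual function void write(ll_item t);
    hl_item hl_tr;
    beats.push_back(t.beat);
    // Placeholder packing rule: 4 beats make one high-level packet
    if (beats.size() == 4) begin
      hl_tr = hl_item::type_id::create("hl_tr");
      hl_tr.data = {beats[3], beats[2], beats[1], beats[0]};
      beats.delete();
      hl_ap.write(hl_tr);  // hand off to the scoreboard
    end
  endfunction
endclass
```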
 
Example Implementation
 
I recently co-authored a paper on this subject at CDNLive 2013, Bangalore, and we received the Best Paper Award! The paper describes the application of sequence layering in verifying a highly configurable memory controller supporting multiple protocols on the processor side and a number of protocols on the DRAM front. A related blog post is here.
 
I would like to hear from you in case you implemented the above or similar concepts in any of your projects. If you would like to see any other topic covered through this blog do drop in an email to siddhakarana@gmail.com (Anonymity guaranteed if requested).
 
Wish you all a happy & healthy new year!!!
 
 

Saturday, October 30, 2010

EDA 360 : Realizing what it is?

A vision paper released by Cadence in early 2010 created a BUZZ in the EDA community, and the hot topic of this year's CDNLive 2010 is EDA360. Here is a summary of the topic in a 3-part series, concluding with what EDA360 brings to our plate as verification engineers.
Cadence has a history of creating a wave with whatever it does, so much so that it enters the DNA of the Cadence folks and the people associated with them. In the past, when Cadence released its Low Power CPF solution, my team was invited for a technical brief on the solution at the Cadence premises. By that time we had heard a lot about CPF from sales, AEs and friends working for Cadence, but we were amazed when, while completing the formalities to enter the building, the security guard asked if we were there to discuss low power! I am sure that anyone visiting this time would be asked about EDA360 :)
WHAT IS EDA360?
For many years the semiconductor industry has religiously followed Moore's Law. With only a few defined nodes left for shrinking feature size, an exponential increase in silicon cost at advanced nodes, saturation of SoC architectures for a given application, time-to-market challenges and a shrinking market window for any product, there is a need to CHANGE the way every entity in this industry operates. With software claiming the majority of product cost, innovation and differentiation revolve more around end-user applications (apps). This demands a SHIFT from the 'traditional approach' of developing HW, OS, SW and apps serially to an 'application-driven model' where the definition of a complete platform enables HW and SW development in parallel. The recent success of the iPhone (a hardware-dependent platform) and Android (a hardware-independent platform) over conventional players demonstrates the changing dynamics of the market. This calls for a HOLISTIC ADJUSTMENT among the partners in this ecosystem.
EDA360 is a potential answer to this CHANGE, this SHIFT and this HOLISTIC ADJUSTMENT. It addresses the needs of all stakeholders in the ecosystem with the goal of improving the 3Ps (Productivity, Predictability and Profitability) for the 4th P, i.e. the Product. It is an ambitious attempt to connect the product development chain by understanding the customer's customer (the end customer). Focusing on the 'consumer', EDA360 preaches a top-down, i.e. application-driven, approach to product development, highlighting the grey areas in the present model and showing how the new model can weed out the inefficiencies of the ecosystem, thereby improving the 3Ps.
Who does what?
It starts with a re-classification of developers into –
1.  Creators – Organizations that keep innovating in silicon with the goal of better performance, lower power and smaller die size. These teams constantly feed silicon capacity, following Moore's law. With minimal contribution towards end-product differentiation, only a handful of such companies can survive productively and profitably.
2.  Integrators – Organizations that optimally integrate the products from creators to realize the end product. They stick to mature process nodes and define application-focused platforms (software stacks) to serve application needs, with the goal of meeting quality, cost and time-to-market targets.
How it will be done?
To achieve the end goal, EDA360 brings forward 3 important capabilities –
-  System Realization
-  SoC Realization
-  Silicon Realization
The semiconductor industry is at a juncture where certain critical decisions become inevitable. As Charles Darwin pointed out in the Theory of Evolution: "Beneficial genetic mutations are part of random mutations occurring within an organism and are preserved because they aid survival. These mutations are passed on to the next generation, where the accumulated effect results in an entirely different organism adapted to the changes in the environment." Hopefully EDA360 rolls out as a series of such mutations, bringing the much desired and demanded change essential for the evolution of our industry.