Showing posts with label EDA360. Show all posts

Sunday, August 17, 2014

Taking a pulse of what's going on in Verification!

Design complexity today is marching forward at an accelerated pace, and its effect on verification is growing in equal leaps and bounds. The world of verification has grown multi-fold in the past 5 years on all fronts: what to verify, how to verify and who is required to verify. Today, conferences hosted by EDA partners are no longer focused on just a simulator, a language, a methodology or a new tool; instead they bring up discussions on a plethora of topics. To address this growth, it seems, Cadence decided to devote the 2nd day of CDNLive India 2014 mostly to the DV community, hosting multiple tracks on verification. With hundreds of footfalls, a mixed bag of papers on all aspects of verification, an extended exhibitor pavilion for partners, and lunch and tea sessions busy with networking, the event was truly in sync with the theme: connect... share... inspire!

A decade back, all such events focused on sharing the upgrades in tools, particularly the simulator. The technology update included an introduction to new features, added language support and improved performance. Today, no one talks about the simulator alone; instead, everything around it is getting integrated under one hood, with features like –

  • A broad VIP catalog with added support for easy bring-up of test benches for IP/SoC, compliance test suites and ready-made coverage/assertions. A new class of accelerated VIPs interfaces easily with the emulators, giving a further boost to overall productivity.
  • Updates to power-aware verification involving multiple tools (formal, simulator & emulator), with support for different power formats/versions, auto-generation of relevant assertions and different aids to ease debugging of these scenarios.
  • How to model analog blocks to achieve high performance with electrical accuracy as part of the AMS support. Different languages supporting model development, coverage driven verification to validate these models and reuse of the environment when integrated with digital blocks.
  • A portfolio of formal apps that can accomplish the job of static checks related to connectivity, power, registers, X-prop, protocol compliance checks etc.
  • Improved support for IPXACT based flows to enable register modelling, register verification and interface connectivity promoting IP reuse with minimum issues.
  • Integrated support for coverage collection and merging across different levels (IP, sub system & SoC) of verification involving different tools or flows used to achieve verification closure.
  • Verification management featuring executable verification plans, regression management, triaging and analysis with different views based on user’s role in the project.
  • Added debug tools and support to view the transactions, filter or play around back and forth with the simulation logs.
  • Hardware accelerators with improved capacity, performance and features that enable detailed debugging, power aware support, assertions and coverage.
  • Prototyping platforms/emulators and how they enable boot-up with Android etc. much before the silicon arrives.
  • Improved Virtual platforms and models in sync with the above to enable early software development thereby shifting the whole product development cycle to the left.
  • Performance analysis of SoC confirming architecture stability or assisting in exploring alternates promptly.
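To picture the kind of static connectivity check that such formal apps automate, here is what a hand-written equivalent looks like as a SystemVerilog assertion. This is an illustrative sketch only; the signal names (`pad_mux_sel`, `pad_tx_o`, `uart_tx_o`) and the mux encoding are hypothetical, not from any particular tool or design:

```systemverilog
// Hypothetical connectivity check: whenever the pad mux selects the
// UART function, the chip-level TX pad must follow the UART core's
// TX output. Formal connectivity apps generate and prove thousands
// of such properties automatically from a connectivity specification.
property p_uart_tx_connected;
  @(posedge clk) disable iff (!rst_n)
    (pad_mux_sel == 2'b01) |-> (pad_tx_o == uart_tx_o);
endproperty

a_uart_tx_connected: assert property (p_uart_tx_connected)
  else $error("pad_tx_o not driven by uart_tx_o in UART mux mode");
```

The value of the app is precisely that no one has to write these by hand for every pad and every mux setting.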

Yes! Verification has evolved into a GODZILLA beyond the control of one HVL, one methodology and one simulator. The rules of the game have changed, and the industry is responding faster than ever to this change. Observing all these changes, I recollect the topic EDA360, introduced by Cadence to the industry back in 2010. While the terminology might have lost its steam, the essence of the idea seems to have been realized quite a lot since then. For those who missed reading about it, please refer to the below posts summarizing EDA360. Believe me, it’s worth reading!


Do you agree that the current state is in sync with EDA360? Leave a comment!

If you have missed any of these events, don’t miss attending DVCON India 2014 on Sept 25, 26. Registrations are now open.

Monday, February 13, 2012

Verification IP : Changing landscape - Part 1

For decades, the EDA industry has been working out options to improve its offerings and ensure silicon success for the semiconductor industry. A few decades back, before the EDA giants were known, design automation was exercised individually in every organization developing a product. Gradually these tools moved out of the design houses, and the ‘make vs buy’ decision carved out a new league of EDA front runners. Talking specifically about verification, while the simulation tools were procured from outside, the in-house CAD groups were still required to develop flows for automation, self-checking, regression management and data analysis. In the last decade, all of this came as a package from EDA vendors, thereby further squeezing the CAD teams in the design houses. So, what next? Well, the recent acquisitions in this space, i.e. Cadence acquiring Denali, and Synopsys acquiring nSys and ExpertIO, indicate where we are heading. Initially the VIP providers were able to sustain themselves with a licensing and basic verification support model, but now the pressure is mounting as VIPs get commoditized. The VIP-only vendors will have to align themselves either with the EDA vendors or with design service providers, wherein the VIPs complement other offerings. Before moving ahead, let’s discuss: what is a VIP?
What is a VIP?
A Verification IP is a standalone ‘plug and play’ verification component that enables the verification engineer to verify the relevant DUT (design under test) module at block, sub-system and SoC level. Based on the need, the VIP can act as a BFM to drive DUT signals, or MONITOR the signals and VALIDATE them for correctness and data integrity. It may have a set of protocol CHECKERS and test scenarios to confirm compliance with the standards (if any), or COVER groups identifying corner cases and test completeness. VIPs are developed using HDLs or HVLs with a standard methodology, or as a set of assertions. A VIP needs to have enough APIs to provide flexibility for modifying the internal units as per the DUT. The user should be able to extend the available monitors, scoreboard and checker hooks while developing a verification environment around the VIP. Ease of developing scenarios at different levels using available functions and sequences is important in minimizing the verification turnaround cycle. Quick integration with other tools for coverage, simulation, debug and transaction analysis is critical in successfully deploying the VIP. Finally, the most crucial aspect is the support provided by the VIP vendor in responding to issues and upgrading the VIP as the standards evolve.
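The components described above map naturally onto a UVM-style agent. The sketch below is purely illustrative; all class names (`my_vip_agent`, `my_vip_driver` etc.) are hypothetical, and a real VIP wraps far more detail (sequencer, configuration object, virtual interface plumbing):

```systemverilog
// Illustrative skeleton of a VIP packaged as a UVM agent.
class my_vip_agent extends uvm_agent;
  my_vip_driver   driver;   // BFM: drives DUT signals from sequences
  my_vip_monitor  monitor;  // observes the bus, publishes transactions
  my_vip_coverage cov;      // covergroups for corner cases/completeness
  my_vip_checker  chk;      // protocol checkers for data integrity

  `uvm_component_utils(my_vip_agent)

  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction

  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    monitor = my_vip_monitor::type_id::create("monitor", this);
    cov     = my_vip_coverage::type_id::create("cov", this);
    chk     = my_vip_checker::type_id::create("chk", this);
    // In passive mode (e.g. at SoC level) only monitor/check/cover run;
    // the BFM is built only when the agent actively drives the DUT.
    if (get_is_active() == UVM_ACTIVE)
      driver = my_vip_driver::type_id::create("driver", this);
  endfunction

  function void connect_phase(uvm_phase phase);
    // The monitor broadcasts observed transactions to both the
    // coverage collector and the protocol checker.
    monitor.ap.connect(cov.analysis_export);
    monitor.ap.connect(chk.analysis_export);
  endfunction
endclass
```

The active/passive switch is what makes the same VIP reusable from block level (driving stimulus) up to SoC level (pure monitoring), which is exactly the ‘plug and play’ quality described above.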
The common VIPs available include –
- MIPI protocols like DSI, CSI, HSI, SlimBus, Unipro, DigRF & RFFE.
- Bus protocols like AXI, AHB, APB, OCP & AMBA4.
- Interfaces like PCI Express, USB 2.0, USB 3.0, Interlaken, RapidIO, JTAG, CAN, I2C, I2S, UART & SPI.
- Memory models & protocol checkers for SD/SDIO, SATA, SAS, ATAPI, DDR2/DDR3, LPDDR etc.

Companies providing the above VIPs fall into one of the below categories -
- EDA vendors
- VIP only providers (some of them offering IPs also)
- Design services companies

Although a lot of VIP development still continues within the design houses, the motivation to procure VIPs from outside is much higher for obvious reasons.

In the next post we will discuss the challenges for new entrants that raise barriers to entry, and the role of SoC integrators, EDA companies and design services organizations in this changing landscape.

Related posts -
Verification IP : Changing landscape - Part 2
Choosing the right VIP
 

Sunday, November 21, 2010

EDA360 Realizations : Verification perspective

EDA360 is all about bringing a CHANGE. A change in perception, planning and execution to bring about a change in the 3Ps that limit our Product. The approach widens the horizons and opens up new avenues for all the stakeholders associated with product development. With the involvement of the verification team at all levels of the ASIC design cycle, this new theory affects a verification engineer in multiple ways.
- Movement to a higher abstraction layer demands verification at a new layer. The verification plan now needs to address verification of models at this layer and define an approach to reuse the efforts at the following layers. The Verification IP should be portable so as to verify the IP model standalone and when integrated into an SOC. The verification environment at this level must provide enough hooks to assist in power and performance what-if analysis. It should also have enough APIs to simulate the embedded drivers on this platform.
- Verification IP needs to become an integral part of the IP deliverable. Apart from providing the required constrained random infrastructure, the VIP should also include a protocol compliance suite to exercise the IP with these vectors at different levels. It must also ensure that the drivers coming as part of the IP stack can be simulated with the required modifications to the verification environment. Extrapolate this to sub-system or SOC level and we get a list of expectations from the verification environment at different stages of design integration.
- With analog claiming more area on the chip, there is a need to define a methodology to verify analog modules. Abstract layers modeling analog blocks need to be added to ascertain their presence at all levels of the design cycle. Verification of these models at different levels, defining a reusable constrained random verification flow, planning a metric driven verification approach, and seamless integration verification of analog and digital blocks are the major challenges to be taken care of for mixed signal designs.
- Simulating real-time scenarios is needed to ensure system level functionality. Hardware acceleration is desired to drive real-time data into the design before tape out while reducing simulation turnaround time. An application driven approach requires software to be developed in parallel with the hardware. To test this software early enough, virtual prototyping and emulation are potential solutions. The verification team’s involvement is required at all these levels to reuse the infrastructure and recreate failing scenarios for detailed design debug.
- Advancement in verification platforms has been enormous. Apart from coverage driven methodology, dashboards (with different views) are available to effectively manage the resources (engineers, hardware and licenses) while tracking the progress. Deploying similar flows for mixed signal verification, embedded software verification and system level tracking would help a lot in bringing together the teams contributing to the product development.
EDA360 preaches elimination of redundant verification to reduce cost and improve productivity, profitability and predictability. It proposes sharing the innovation in verification with other areas of the system development. A complete holistic verification is demanded that will eventually lead to more work, more jobs and more $$$ for the verification engineer :)

Related posts -
EDA360 : Realizing what it is?
EDA360 : Realizing the Realizations

Sunday, November 14, 2010

EDA 360 : Realizing the Realizations

EDA360 proposes a marriage between hardware and software. Each of them evolved independently, and on integration the final product has been “good enough” but not OPTIMAL. Either the software is unaware of the features in the hardware, or the hardware is incognizant of the requirements of the software. Furthermore, when there is an issue with the system operation, ownership of debugging is a big question mark. EDA360 suggests correction at all levels through 3 realizations.
Silicon Realization – Associated with Creators and Integrators (when serving as Creators), it aims to design (an IP, a sub-system or a chip) for high performance, low power and a small form factor. With increasing design complexity, matters related to low power, mixed signal design, 3D ICs, DFM and yield are worrying the technocrats more than ever. Within the hardware arena, the tool evolution for the functional, physical and electrical domains happened in silos, and this constantly hampers efficiency, productivity and predictability. Amidst these technical challenges, the pressure of time to market leaves no room for error. To achieve the goal of the 3Ps, design teams need to control complexity at a higher abstraction level. This means that all variables of the design process should be accessible, i.e. controllable and observable, at a higher layer. There is a need for an integrated flow, along with a unified representation of intent, to drive the design from this abstract layer to silicon such that there is interoperability on functional, physical and electrical parameters at each level.
SOC Realization – Associated with Integrators, it extends expectations towards Creators. The traditional approach to SOC development is serial, i.e. SOC → [OS + SW] → APPS. Each of these steps has little interaction and knowledge sharing with the others, thereby leading to poor Productivity, i.e. sub-optimal hardware usage and increased failures in the end product. This adds to the cost & delay, affecting Predictability and Profitability. Moreover, disjoint teams result in ownership issues (HW or SW problem?) if the end product doesn’t behave as expected. SOC Realization proposes a potential solution to such problems by treating the SOC as HW that is always accompanied by device drivers. This can be achieved by deploying a top-down approach where SOC design and related (embedded) SW development happen in parallel. TLM and virtual prototyping aid in testing these SW layers well before the silicon comes back. Extrapolating this approach sets expectations for IP developers to package the whole stack, i.e. TLM, RTL, netlist, design constraints, VIP and device drivers, as part of the IP deliverable.
System Realization – It is associated with Integrators. In the conventional flow [HW → SW → APPS], Productivity suffers as HW is either over-designed to support unforeseen applications or under-designed, limiting application support. This bottom-up approach involves minimal planning and no what-if analysis, leading to poor selection of components (HW, OS, drivers etc.) for the system. HW-SW integration and APPS development happen late and with little knowledge of each other, contributing to inefficiencies and schedule delays [re-verification and ECOs], thereby affecting Profitability. System Realization suggests a top-down approach where applications drive the system requirements to overcome such issues and add differentiation as well as competitiveness to the end product. Chip planning tools assist in what-if analysis with the available set of components for a given architecture to improve Predictability. TLM, virtual prototyping and emulation tools help with early software development and testing. The latest developments in verification, when extended to multiple levels, provide a defined approach for embedded SW verification and bring up a controllable & observable dashboard to manage the complete system development lifecycle.
The changing market dynamics have ignited the debate of ‘what should be’ versus ‘what is’ available, and if it demands a complete change in the way we design our products, we had better do it now, before the next product from the competition kicks ours out of the market.

Related posts -
EDA 360 : Realizing what it is?
EDA360 Realizations : Verification perspective

Saturday, October 30, 2010

EDA 360 : Realizing what it is?

A vision paper released by Cadence in early 2010 created a BUZZ in the EDA community. The hot topic of this year’s CDNLive 2010 is EDA360. Here is a summary of this topic in a 3-part series, which concludes with what EDA360 brings to our plate as verification engineers.
Cadence has a history of creating a wave (with whatever it does), so much so that it kind of enters into the DNA of the Cadence guys and the people associated with them. In the past, when Cadence released the Low Power CPF solution, my team was invited for a technical brief on the solution at the Cadence premises. By that time we had heard a lot about CPF from the Sales, AEs and friends working for Cadence, but we were amazed when, while we were completing the formalities to enter their building, the security guard asked if we were there to discuss Low Power! I am sure this time if someone visits, the guard would be asking about EDA360 :)
WHAT IS EDA360?
For many years the semiconductor industry has been religiously following Moore’s Law. With few defined nodes left to shrink size, an exponential increase in the cost of silicon at advanced nodes, saturation in SOC architecture for a given application, time to market challenges and a shriveling market window for any product, there is a need to CHANGE the way every entity associated with this industry has been operating. With software claiming the majority of product cost, the innovation and differentiation revolve more around end user applications (apps). This demands a SHIFT from the ‘traditional approach’ of developing HW, OS, SW and APPS serially to an ‘application driven model’ where definition of a complete platform enables HW & SW development in parallel. The recent success of the iPhone (hardware dependent platform) and Android (hardware independent platform) over conventional players demonstrates the changing dynamics of the market. This calls for a HOLISTIC ADJUSTMENT among the partners in this ecosystem.
EDA360 is a potential answer to this CHANGE, this SHIFT and this HOLISTIC ADJUSTMENT. It addresses the needs of all stakeholders in the ecosystem with the goal of improving the 3Ps (Productivity, Predictability and Profitability) for the 4th P, i.e. the Product. It is an ambitious attempt to connect the product development chain by understanding the customer’s customer (the end customer). Focusing on the ‘consumer’, EDA360 preaches a top-down, i.e. application driven, approach for product development by highlighting the grey areas in the present model and how the new model can help weed out the inefficiencies of the ecosystem, thereby improving the 3Ps.
Who does what?
It starts with re-classification of the developers into –
1.  Creators – Organizations that will keep innovating silicon with the goal of better performance, low power and small die size. These teams would be constantly feeding silicon capacity and following Moore’s law. With minimal contribution towards end product differentiation, only a handful of such companies can survive while remaining productive & profitable.
2.  Integrators – Organizations that will optimally integrate the products from Creators to actualize the end product. They would stick to mature process nodes and define application focused platforms (software stacks) to serve the application needs, with the goal of meeting quality, cost and time to market targets.
How will it be done?
To achieve the end goal, EDA360 brings forward 3 important capabilities –
- System Realization
- SOC Realization
- Silicon Realization
The semiconductor industry is at a juncture where certain critical decisions become inevitable. As Charles Darwin pointed out in the Theory of Evolution – “Beneficial genetic mutations are part of random mutations occurring within an organism and are preserved because they aid survival. These mutations are passed on to the next generation where an accumulated effect results in an entirely different organism adapted to the changes in environment”. Hopefully EDA360 rolls out in a series of such mutations, bringing about the much desired and demanded change essential for the evolution of our industry.