Thursday, December 30, 2010

2010 : Bidding Adieu

2010 has been an eventful year on all fronts for everyone.
After experiencing one of its worst downturns, the semiconductor industry witnessed close to 30% YoY growth in 2010. Customers, revenues, profits and jobs kept adding up month after month.


The verification space saw a lot of activity on multiple fronts.
A few highlights include -

- IEEE 1800-2009, a revision to the existing SystemVerilog standard, was completed.
- UVM: the Early Adopter version was released.
- Enhancements were made to the UPF IEEE standard.
- Several other standardization efforts (coverage, IP, AMS) progressed.
- ESL picked up after much delay.
- Formal verification complemented simulations more than ever.
- FPGA verification emerged as a hot topic.
.....

- sid'dha-karana, a blog focused on verification, kicked off.
.....


On a closing note for the year, a few personal recommendations from 2010 -

MUST READ -

Evolution of design methodology - Part I

Evolution of design methodology - Part II

MUST WATCH -

Sand to silicon : the making of a chip

At this juncture I would like to thank you for your valuable feedback, support and readership. I look forward to the same in 2011.

Wishing you and your family a "HAPPY & PROSPEROUS 2011".

:)

Sunday, December 26, 2010

Standards War : A Necessary Evil

Which HVL do you use for verification? Which methodology do you follow? Which power format does your team use? Do these questions sound familiar?
No, these aren’t interview questions. These are the conversation starters of the verification world!
The last few years have witnessed tremendous activity in the verification space. Every EDA company has been busy gearing up to propose solutions in different areas. Verification, touted as the most resource-consuming activity, has been a darling of the EDA community. With so much action in this space, engineers are left in a confused state.
The options available in various verification categories include –
HVL – e (IEEE 1647), SystemVerilog (IEEE 1800), Vera (open sourced as OpenVera).
Methodology [MET] – eRM (e), RVM (Vera), AVM, URM, VMM, OVM, UVM.
Power Format [PF] – CPF, UPF (IEEE 1801).
The alternatives listed above have witnessed a full course of evolution before wide acceptance. The choice still depends on the marketing strategy, the sales pitch and the final deal! From the launch of a solution to its becoming an industry standard, it is a long road to survival. During this process, verification teams spoiled for choice face multiple challenges. A few of them include –
- For SOC development, getting third-party IP is inevitable. If the verification environment accompanying the IP is in a different HVL/MET/PF, then integration becomes an issue.
- The choice of verification environment is critical for IP vendors too. A wrong choice in any of the above categories can even put them out of the race with some customers.
- The ASIC flow involves tools from multiple vendors. A power format supported by the verification tools might not work with the implementation tools, thereby adding inefficiencies to the flow.
- Integrating tools from multiple vendors for verification (through PLIs etc.) may lead to long debug iterations (delaying the schedule) if unusual tool behavior is encountered.
- Given the multiple options above, hiring engineers with the required skill set, or training them as desired, further delays productivity.
If having alternatives is the root cause of these issues, why do we have them?
Well, to solve the next level of engineering challenges, we need to make sure the solutions to the present ones are mature enough. As designs keep unfolding new problems, EDA vendors make answers available. To get a stable solution, either we brainstorm together (unusual in a competitive world) or each vendor brings up a solution and the better one is chosen when standardizing. Multiple designs, multiple engineers and multiple customers work towards making the solution robust and mature. While the EDA vendors are busy wooing customers, they continue untiring efforts to get their proposals accepted as industry standards. When the industry moves to the next level of design complexity, the war over defining a standard becomes a need. The 'war' that was feeds into bringing up a standard that makes the process simpler and sets the stage for solving the next engineering challenge.
No doubt this WAR ends up in a HEALTHY outcome!

Sunday, November 21, 2010

EDA360 Realizations : Verification perspective

EDA360 is all about bringing a CHANGE. A change in perception, planning and execution to bring about a change in the 3Ps that limit our Product. The approach widens the horizons and opens up new avenues for all the stakeholders associated with product development. With the verification team involved at all levels of the ASIC design cycle, this new theory affects a verification engineer in multiple ways.
- Movement to a higher abstraction layer demands verification at a new layer. The verification plan now needs to address verification of models at this layer and define an approach to reuse the effort at the following layers. The verification IP should be portable so as to verify the IP model standalone and when integrated into an SOC. The verification environment at this level must provide enough hooks to assist in power and performance what-if analysis. It should also have enough APIs to simulate the embedded drivers on this platform.
- Verification IP needs to become an integral part of the IP deliverable. Apart from providing the required constrained-random infrastructure, the VIP should also include a protocol compliance suite to exercise the IP with these vectors at different levels. It must also ensure that the drivers coming as part of the IP stack can be simulated with the required modifications to the verification environment. Extrapolate this to the sub-system or SOC level and we get a list of expectations from the verification environment at different stages of design integration.
- With analog claiming more area on the chip, there is a need to define a methodology to verify analog modules. Abstract models of the analog blocks need to be added to ascertain their presence at all levels of the design cycle. Verification of these models at different levels, defining a reusable constrained-random verification flow, planning a metric-driven verification approach, and seamless integration verification of analog and digital blocks are major challenges to be addressed for mixed-signal designs.
- Simulating real-time scenarios is needed to ensure system-level functionality. Hardware acceleration is desired to drive real-time data into the design before tape out while reducing simulation turnaround time. The application-driven approach requires software to be developed in parallel with the hardware. To test this software early enough, virtual prototyping and emulation are potential solutions. The verification team's involvement is required at all these levels to reuse the infrastructure and recreate failing scenarios for detailed design debug.
- Advancement in verification platforms has been tremendous. Apart from coverage-driven methodology, dashboards (with different views) are available to effectively manage the resources (engineers, hardware and licenses) while tracking progress. Deploying similar flows for mixed-signal verification, embedded software verification and system-level tracking would help a lot in bringing together the teams contributing to product development.
EDA360 preaches elimination of redundant verification to reduce cost and improve productivity, profitability and predictability. It proposes sharing the innovation in verification with other areas of system development. A complete, holistic verification is demanded that will eventually lead to more work, more jobs and more $$$ for the verification engineer :)

Related posts -
EDA360 : Realizing what it is?
EDA360 : Realizing the Realizations

Sunday, November 14, 2010

EDA 360 : Realizing the Realizations

EDA360 proposes a marriage between hardware and software. Each of them evolved independently and, on integration, the final product has been “good enough” but not OPTIMAL. Either the software is unaware of the features in the hardware or the hardware is incognizant of the requirements of the software. Furthermore, when there is an issue with the system operation, ownership of debugging is a big question mark. EDA360 suggests correction at all levels through 3 realizations.
Silicon Realization – Associated with Creators and Integrators (serving as Creators also), it aims to design (an IP, a sub-system or a chip) for high performance, low power and small form factor. With increasing design complexity, matters related to low power, mixed-signal design, 3D IC, DFM and yield are worrying the technocrats more than ever. Within the hardware arena, tool evolution for the functional, physical and electrical domains happened in silos, and this constantly hampers efficiency, productivity and predictability. Amidst these technical challenges, time-to-market pressure leaves no room for error. To achieve the goal of the 3Ps, design teams need to control complexity at a higher abstraction level. This means that all variables of the design process should be accessible, i.e. controllable and observable, at a higher layer. There is a need for an integrated flow along with a unified representation of intent to drive the design from this abstract layer to silicon such that there is interoperability on functional, physical and electrical parameters at each level.
SOC Realization – Associated with Integrators, it extends expectations towards Creators. The traditional approach to SOC development is serial, i.e. SOC → [OS + SW] → APPS. Each of these steps involves minimal interaction and knowledge sharing, thereby leading to poor Productivity, i.e. sub-optimal hardware usage and increased failures in the end product. This adds to the cost & delay, affecting Predictability and Profitability. Moreover, disjoint teams result in ownership issues (is it a HW or SW problem?) if the end product doesn’t behave as expected. SOC Realization proposes a potential solution to such problems by treating the SOC as HW that is always accompanied by device drivers. This can be achieved by deploying a top-down approach where SOC design and related (embedded) SW development happen in parallel. TLM and virtual prototyping aid in testing these SW layers well in advance, before the silicon comes back. Extrapolating this approach sets expectations for IP developers to package the whole stack, i.e. TLM, RTL, netlist, design constraints, VIP and device drivers, as part of the IP deliverable.
System Realization – It is associated with Integrators. In the conventional flow [HW → SW → APPS], Productivity suffers as HW is either over-designed to support unforeseen applications or under-designed, limiting application support. This bottom-up approach involves minimal planning and no what-if analysis, leading to poor selection of components (HW, OS, drivers etc.) for the system. HW-SW integration and APPS development happen late and with little knowledge of each other, contributing to inefficiencies and schedule delay [re-verification and ECOs], thereby affecting Profitability. System Realization suggests a top-down approach where applications drive the system requirements to overcome such issues and add differentiation as well as competitiveness to the end product. Chip planning tools assist in what-if analysis with the available set of components for a given architecture to improve Predictability. TLM, virtual prototyping and emulation tools help with early software development and testing. The latest developments in verification, when extended to multiple levels, provide a defined approach for embedded SW verification and bring up a controllable & observable dashboard to manage the complete system development lifecycle.
The changing market dynamics have shifted the debate from ‘what is’ available to ‘what should be’, and if that demands a complete change in the way we design our products, we had better do it now before the next product from the competition kicks ours out of the market.

Related posts -
EDA 360 : Realizing what it is?
EDA360 Realizations : Verification perspective

Saturday, October 30, 2010

EDA 360 : Realizing what it is?

A vision paper released by Cadence in early 2010 created a BUZZ in the EDA community. The hot topic of this year’s CDNLive 2010 is EDA360. Here is a summary of the topic in a 3-part series which concludes with what EDA360 brings to our plate as verification engineers.
Cadence has a history of creating a wave with whatever it does, so much so that it kind of enters the DNA of the Cadence folks and the people associated with them. In the past, when Cadence released its Low Power CPF solution, my team was invited for a technical brief on the solution at the Cadence premises. By that time we had heard a lot about CPF from the sales team, AEs and friends working for Cadence, but we were amazed when, while we were completing the formalities to enter their building, the security guard asked if we were there to discuss Low Power! I am sure that if someone visits this time, he would be asked about EDA360 :)
WHAT IS EDA360?
For many years the semiconductor industry has religiously followed Moore’s Law. With few defined nodes left to shrink to, an exponential increase in the cost of silicon at advanced nodes, saturation in SOC architecture for a given application, time-to-market challenges and a shrinking market window for any product, there is a need to CHANGE the way every entity associated with this industry has been operating. With software claiming a majority of the product cost, innovation and differentiation revolve more around end-user applications (apps). This demands a SHIFT from the ‘traditional approach’ of developing HW, OS, SW and APPS serially to an ‘application-driven model’ where the definition of a complete platform enables HW & SW development in parallel. The recent success of the iPhone (a hardware-dependent platform) and Android (a hardware-independent platform) over conventional players demonstrates the changing dynamics of the market. This calls for a HOLISTIC ADJUSTMENT among the partners in this ecosystem.
EDA360 is a potential answer to this CHANGE, this SHIFT and this HOLISTIC ADJUSTMENT. It addresses the needs of all stakeholders in the ecosystem with the goal of improving the 3Ps (Productivity, Predictability and Profitability) for the 4th P, i.e. the Product. It is an ambitious attempt to connect the product development chain by understanding the customer’s customer (the end customer). Focusing on the ‘consumer’, EDA360 preaches a top-down, i.e. application-driven, approach to product development by highlighting the grey areas in the present model and how the new model can help weed out the inefficiencies of the ecosystem, thereby improving the 3Ps.
Who does what?
It starts with a re-classification of the developers into –
1.  Creators – Organizations that will keep innovating silicon with the goal of better performance, lower power and smaller die size. These teams would constantly be feeding silicon capacity and following Moore’s law. With minimal contribution towards end-product differentiation, only a handful of such companies can remain productive & profitable and survive.
2.  Integrators – Organizations that will optimally integrate the products from Creators to actualize the end product. They would stick to mature process nodes and define application-focused platforms (software stacks) to serve the application needs with the goal of meeting quality, cost and time to market.
How will it be done?
To achieve the end goal, EDA360 brings forward 3 important capabilities –
- System Realization
- SOC Realization
- Silicon Realization
The semiconductor industry is at a juncture where certain critical decisions become inevitable. As Charles Darwin pointed out in the Theory of Evolution – “Beneficial genetic mutations are part of random mutations occurring within an organism and are preserved because they aid survival. These mutations are passed on to the next generation where an accumulated effect results in an entirely different organism adapted to the changes in environment”. Hopefully EDA360 rolls out in a series of such mutations, bringing the much desired and demanded change essential for the evolution of our industry.

Sunday, October 10, 2010

PROOF of NO BUG... OR ...NO PROOF of BUG

To pursue a career in engineering, all of us have been through multiple courses in mathematics. All these courses involved proving equations or theorems. Given a relationship between variables, we are asked to prove that relationship. There are defined methods, and by following a step-by-step approach we are able to derive the desired proof. All this time we focus on proving LHS == RHS. Now imagine being asked to prove the equivalence (LHS == RHS) by proving that there is no way (LHS != RHS) in which the equation is non-equivalent.
Sounds like bullying?
Interestingly, the whole practice of verifying semiconductor designs is based on this puzzling statement.
'Verification', i.e. sid'dha-karana, is the noun for the verb 'verify', derived from Medieval Latin verificare: verus = true + facere = to make. So the literal meaning of the verification engineer's job is to make it true.
In an ideal world a fully verified design is one where we have a PROOF of NO BUG (PNB) in the design. But the first lesson imbibed by a verification engineer is that we cannot achieve 100% verification - an unconscious setback to the way one has been proving equality. The limitations (engineering or hardware resources, tools, schedule, to name a few) that laid the foundation of this unachievable 100% verification have tossed the regular equation and shifted our focus from PNB to NPB, i.e. NO PROOF of BUG. The verification engineer thus endeavors to pursue all means to make sure there is no proof of a bug found (it feels like a daunting task if you still relate it to proving that there is no way the equation is non-equivalent). With a set of tools, methodologies, languages and checklists, the hunt for the bugs, i.e. all possible ways to prove the non-equivalence, begins. Slowly, as we approach verification closure, the constantly passing regressions and the diminishing bug rate strengthen our assumption that no more bugs are concealed. With verification sign-off, the design is labeled BUG FREE with the assumption that, if no more bugs are discovered, NPB == PNB. Silicon validation adds more credibility to our assumption, while the team lives with anxiety during the Si bring-up. If we are fortunate enough to have customers who explore the design in the way we did in verification, the assumption becomes immortal…. Happy ending!
Of course, if we miss one of the ways of reaching a bug, the result is much more costly in cash & kind when compared to our inability to prove an equation before our Maths teacher.
Maybe that’s the risk we as verification engineers assume when we pick NPB over PNB for whatever reasons.
Happy BUG hunting! :)

Sunday, October 3, 2010

Predictions for future - The next decade!

Constraints & limitations with available resources lead to innovative thinking... to inventions... to evolution...

As we step into the next decade, the verification fraternity is all set to reap the benefits of the investments made in the present decade. Listed below are technologies (in no specific order) that will define the future course of verification. [Watch out for details on each of them in future posts.]

1. UVM - The UVM standards committee is all set to roll out the UVM 1.0 standard. This methodology sets up a new base for reusability and coverage-driven verification, supporting different HVLs and curtailing the confusion over which one is better.
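For a flavour of what a test looks like under this methodology, here is a minimal sketch assuming the UVM base class library; class and message names are purely illustrative.

import uvm_pkg::*;
`include "uvm_macros.svh"

class my_test extends uvm_test;
  `uvm_component_utils(my_test)
  function new(string name = "my_test", uvm_component parent = null);
    super.new(name, parent);
  endfunction
  task run_phase(uvm_phase phase);
    phase.raise_objection(this);                 // keep the simulation running
    `uvm_info("MY_TEST", "stimulus goes here", UVM_LOW)
    phase.drop_objection(this);                  // allow the test to end
  endtask
endclass

module top;
  initial run_test("my_test");                   // UVM picks and runs the test by name
endmodule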

2. Formal verification - Faster than simulation, exhaustive scenario generation, handling complexity - you name it and formal is there for it, hai na (isn't it)! Improved engines shall handle larger databases and provide a defined way of moving from unit to chip level with formal. The APP-based approach further brings in the required focus & results.
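As an illustration of the kind of property a formal engine can prove exhaustively, a simple SystemVerilog assertion with an environment assumption might look like the sketch below (signal names are hypothetical).

module req_ack_check (input logic clk, rst_n, req, ack);
  // Requirement: every request is acknowledged within 1 to 4 cycles
  a_req_ack: assert property (@(posedge clk) disable iff (!rst_n)
                              req |-> ##[1:4] ack);
  // Environment assumption: no new request while one is still pending
  m_no_overlap: assume property (@(posedge clk) disable iff (!rst_n)
                                 (req && !ack) |=> !req);
endmodule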

3. Hardware accelerators - Running real-time applications on the pre-silicon database with faster turnaround is the need of the hour. Hardware accelerators will be able to satisfy this demand, enabling all domains to exercise real-life data before moving to back-end implementation.

4. Unified coverage model - Slowly creeping into the mainstream, the working group is making progress on unifying the coverage database across different sources and simulators, thereby improving reusability at many levels.

5. AMS - With analog claiming a bigger chunk of the mixed-signal die, verification is no longer limited to checking correct integration; instead, measurement of analog performance while harmonics from the digital side operate in tandem will be more prevalent.

6. Low power designs - The drive to go GREEN brings in new techniques to curb power. While verification ensures correct implementation, it shall move a level up & assist in estimating power figures for different application scenarios (use cases), thereby helping define the optimum system architecture.

7. Electronic System Level (ESL) - Dealing with complexity is easier at higher abstraction levels. System modeling is gaining traction and so is system verification. Standards like TLM 2.0 opened the gates for transaction-level modeling support in different languages, assisting in both design & verification.

8. Cloud computing - The problems associated with verification aren't only in developing test benches, debugging or reusing. Efficient management of the compute infrastructure to get maximum return on the investments made in hardware & software is equally important. Cloud computing defines a new approach to deal with this chronic techno-business problem.

9. Virtual platforms - A complete platform that engages all players of the ecosystem at the same time, i.e. application developers, embedded designers, and ASIC design, verification & implementation teams, to aid faster time to market will be the key to next-generation designs. Verification holds the key to defining & developing such platforms.

10. Courses - With so much going on in verification, don't be surprised if it emerges as a separate stream of specialization in master's degrees and adds more PhDs in this domain :)

The above list is what I foresee as the key to NEXT GEN verification.
If there is any other exciting stuff that you think should unfold in the next decade, do drop in your comments.

Previous -
'Recap from the past - the last decade'
'Living the present - the current decade'

Sunday, September 26, 2010

Living the PRESENT - the current decade!

The present decade (2001-2010) witnessed the metamorphosis of the verification domain from adolescence to maturity. A lot of the initiatives from the last decade paid off well to scale verification along with the ASIC designs, while a lot of new investments were made keeping the next decade in mind. Some verification features that made their mark in this decade include -

1. SystemVerilog - touted as a one-stop solution for HDL+HVL, addressing the limitations of designers using HDLs (Verilog/VHDL) and of verification engineers debating HVLs (e/Vera). An extension of Verilog, with verification constructs largely based on the OpenVera language, SystemVerilog became a darling of everyone, the semiconductor as well as the EDA companies.
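A small, hypothetical snippet shows how the same language spans both worlds - synthesizable RTL on one side and class-based, randomizable stimulus on the other.

// Design side: plain synthesizable RTL
module adder (input  logic [7:0] a, b,
              output logic [8:0] sum);
  assign sum = a + b;
endmodule

// Verification side: object-oriented, constrained-random stimulus
class adder_txn;
  rand bit [7:0] a, b;
  constraint c_non_trivial { a != 0; b != 0; }   // skip the all-zero case
endclass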

2. Standardization of HVLs e & SV - SystemVerilog was adopted as IEEE Standard 1800-2005 and then merged with Verilog (IEEE 1364-2005), leading to IEEE Standard 1800-2009. The 'e' language, promoted by Verisity (later acquired by Cadence), went ahead and was standardized as IEEE 1647-2006.

3. CDV - Code coverage stagnated as a baseline for measuring verification progress & completeness. Functional coverage, supported by the HVLs, was able to put forward a different dimension and was widely accepted. Assertion coverage, though promising, wasn't able to spread its wings wide enough, partly since it was scenario-focused and partly because of ownership issues between design & verification teams. Even test planning evolved a lot, with many tools now aiding in developing a robust test plan by keeping track of the architecture document and extending means for defining comprehensive coverage goals upfront.
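To make the functional coverage idea concrete, here is a small sketch of a covergroup for a hypothetical packet stream; the fields, bins and cross are illustrative only.

covergroup cg_packet with function sample (bit [1:0] kind, bit [9:0] length);
  cp_kind    : coverpoint kind   { bins data = {0}; bins ctrl = {1}; bins mgmt = {2}; }
  cp_length  : coverpoint length { bins small = {[1:64]}; bins large = {[65:1023]}; }
  kind_x_len : cross cp_kind, cp_length;   // which kinds were seen at which lengths
endgroup

// In the test bench: cg_packet cg = new(); then cg.sample(pkt_kind, pkt_len) after each packet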

4. Reusability - A jargon that still makes news every now & then. Design complexity jumping to new heights and time-to-market panic led to the evolution of IP designs and reusability. Along with design IPs, verification IPs also defined their market quite strongly. Methodologies (eRM & RVM) helped in packaging the verification environment so as to make it portable from module to chip level and between programs. The SV evolution brought in OVM & VMM, complementing eRM & RVM respectively. [Note - OVM is a mix of eRM & AVM]. Finally, the coming decade will commence with the standardization of a methodology - UVM.

5. AMS - The intricacies of analog always kept it behind the digital world in terms of advancements in process technologies and methodologies. Product demands from various domains led to single-chip (package) and later single-die solutions. Since analog & digital designers were always alien to each other, these integrated designs demanded verification. AMS simulations addressed this space with various schemes like gate-level representation of digital & transistor-level representation of analog, gate-level or RTL representation of digital & behavioral modeling of analog, etc.

6. Formal verification - Directed verification opened the gates to constrained random verification (CRV), where unforeseen scenario generation was the focus. Given time & compute resource limitations, CRV struggled to reach all possible points in the design in a defined time. The formal approach proposes to resolve these limitations. However, slow advancement of the engines limited the adoption of this methodology for most of the decade.

7. Low power verification - Area and performance had been defining ASIC designs until power became an important measure. Innovative design techniques reduced power consumption at the cost of some overhead logic. EDA tools capable of verifying power-aware designs - MVSIM & MVRC from ArchPro (later acquired by Synopsys) and Cadence IUS & Conformal LP - did quite well to address this issue. UPF & CPF emerged as two power formats that rushed to address the representation of power-aware designs. [UPF - IEEE 1801-2009].
 
8. Hardware accelerators - More gates, big designs, long simulation run times = a bottleneck to taping out in time. The next generation of hardware accelerators paved the way for reducing the simulation turnaround time. These boxes soon took the shape of a complete platform for verification of the whole system and further advanced to help drive real-time data into the designs before tapeout with much ease.

9. HW SW co-verification - With the design focus shifting from the ASIC to the SOC, system design & modelling became all the more important. Software testing became the next chokepoint for time to market. A well-defined platform where software design & testing could start in parallel with the ASIC design came to the forefront. Better HW-SW partitioning, performance checks, system design issues etc. were all well addressed with this platform.

10. Courses in verification - VLSI courses became more prominent and were adopted in almost all electrical & electronics curricula. Late in the decade, a need for focused verification courses took centre stage. Many institutes delivering these courses addressed the need for a competent verification workforce.

Next -
'Predictions for future - the next decade!'

Previous -
'Recap from the past - the last decade!'

Tuesday, September 14, 2010

Recap from the PAST - the last decade!

Verification has evolved a lot in the past 2 decades and there is more to come as we step into the next decade. This 3-part series lists the highlights of the past, the present & predictions for the future.

The nineties had a crucial role in earning verification a prominent berth in ASIC design cycles, ASIC schedules and ASIC teams. It was during this decade that verification picked up full pace. Following are some milestones that helped achieve this. Some of them peaked & stagnated while others blossomed to scale even bigger in the following decade.

1. Standardization of HDLs and ownership of revised versions.
- VERILOG - IEEE 1364-1995 [Later versions: 1364-2001, 1364-2005]
- VHDL - IEEE 1076-1987 [Later versions: 1076-1993, 1076-2002]
- Open Verilog International (OVI) and VHDL International merged in 2000 to form Accellera.

2. The Simulator war.  Following are the ones that received wide acceptance -
- Verilog XL [Interpreter]
- VCS (Verilog compiled simulator) [Compiler]
- Modelsim

3. Directed verification - With low gate counts, HDL/C-based verification was the easiest path to tapeout. The test plans and test benches were primitive, targeting only the features to be verified, with no scalability & reusability.
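A directed test of that era was typically just hand-written stimulus and hand-written checks against a known answer; a hypothetical example (DUT and values are made up):

module tb;
  reg  [7:0] a, b;
  wire [8:0] sum;

  adder dut (.a(a), .b(b), .sum(sum));           // device under test

  initial begin
    a = 8'd10;  b = 8'd20;  #10;                 // one hand-picked case
    if (sum !== 9'd30)  $display("FAIL: 10+20");
    a = 8'd255; b = 8'd1;   #10;                 // one corner case
    if (sum !== 9'd256) $display("FAIL: overflow");
    $finish;
  end
endmodule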

4. Defined verification teams - With increasing design size (gate count) and complexity, there was a need to parallelize the RTL effort with verification. Other factors, like having a second pair of eyes look into the code and verification consuming around 70% of the ASIC schedule, ignited the look-out for dedicated verification teams to tame the bugs in the design.

5. Automation of verification infrastructure - The increasing complexity and number of tests were a challenge to manage. Scripting took the front seat to automate the whole verification process, including running tests in batch mode, checking for failures, regressions etc. Typically every organization had a script called runsim as an initial step to get the flow working.

6. Constrained random verification - With design advancements poised to follow Moore's law, directed verification saturated and there was a need for controlled, aka constrained, random stimuli to uncover unforeseen bugs. C++ came to the rescue, followed by hardware verification languages like Vera & e.
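The idea, shown here in SystemVerilog syntax for familiarity (Vera and e offer equivalent constructs; fields and ranges are illustrative), is to declare the legal space once and let the solver generate the individual cases.

class bus_txn;
  rand bit [31:0] addr;
  rand bit [3:0]  burst_len;
  constraint c_legal {
    addr inside {[32'h0000_1000:32'h0000_1FFF]};  // legal address window
    burst_len inside {1, 2, 4, 8};                // supported burst sizes
  }
endclass

module demo;
  initial begin
    bus_txn t = new();
    repeat (5) begin
      void'(t.randomize());                       // random but always legal values
      $display("addr=%h burst=%0d", t.addr, t.burst_len);
    end
  end
endmodule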

7. Code coverage - Increasing verification complexity led to the basic but intricate question - 'are we done?'. Covering the RTL code with the given set of stimuli has been one of the important milestones defining closure ever since.

8. Gate level simulations - With equivalence checkers still stabilizing, setting up a gate-level simulation flow was crucial to ensure the synthesized netlist was sane enough to go for tapeout.

9. Hardware accelerators - The concept of FPGA prototyping was replicated to develop boxes that could synthesize the design and run it 10x or faster, reducing the test case turnaround from days to hours.

10. ASIC design as part of the curriculum was introduced at institutes in India offering Micro-electronics/VLSI as a subject at the graduate level and as a complete stream at the post-graduate level. This helped source engineers to meet the demand of the semiconductor industry.

Tuesday, September 7, 2010

SNUG 2010, Bangalore

Although it's been a couple of months, I still thought of sharing my experiences from SNUG, one of the most touted conferences in the VLSI industry. SNUG 2010 @ Leela Palace, Bangalore was attended by 2000+ engineers.

For me it was the first time attending this event. Thanks to the recession that forced me to pack my bags in Noida and move to Bangalore, where most of the action is.

It started with the CEO, Dr. Aart de Geus, sharing a vision of the future from the Synopsys perspective with a jam-packed audience from the engineering community.

A few interesting messages from his keynote -
- Growth of the semiconductor business = f(technology + state of economy) = Technomics.
- India & China are slowly claiming their stake with 6% & 9% growth amid a shrinking economy worldwide.
- Even though technology has tried to keep pace with Moore's law, the ROI is falling behind.
 
He also listed technologies to watch out for in 2010 -
- Cloud computing
- Energy conservation
- Video applications
 
After his enlightening talk, the technical tracks picked up. Since I was keen on keeping my tryst with taming bugs, I jumped into the verification track. It started off by summarizing future trends in verification and then went into core technical sessions where Synopsys users shared their interesting experiences.
 
Interestingly, there was an EXPO planned in the evening with different partners from the VLSI ecosystem showcasing their products & services - a new feature successfully added to SNUG India.
 
The next day kicked off with a keynote from GlobalFoundries followed by an interesting session on cloud computing and verification papers.

Overall the SNUG experience was good. I got a chance to meet friends and old colleagues. On my way back after 2 days of the conference, the mood was in sync with the bag (probably inspired by the Bollywood flick 3 Idiots) gifted to the SNUG attendees by Synopsys, symbolizing ALL IS WELL!!!


What's in here?

Welcome to my blog which unveils the exciting world of verification in words!!!

sid'dha-karana aka Verification will explore the 'kal, aaj aur kal of verification', which literally translates to 'yesterday, today & tomorrow of verification' (context matters! the word 'kal' changes meaning with the context of time).

This blog is primarily targeted at ASIC/SOC & related verification and will share -
past      - legacy stuff...
present  - insight into what's going around...
future    - some predictions and trends for future...

The Hindi lingo above reflects my mother tongue (Hindi) and signifies that the discussions, though covering verification in general irrespective of geography, will still have some focus on India in particular.

The breadth of topics expected to be covered includes -
- Verification basics
- Verification flow
- Verification languages & methodologies
- The semiconductor industry in general

Apart from this I plan to spice it up with "Experiences", "Interview Questions" and of course excerpts from discussions with the who's who of the verification domain in our industry.

The above are guidelines to start off with. We can do a makeover as & when demanded; after all, the only thing constant is change :)

Enjoy reading & continuing your experiments with BUG HUNTING!

Do drop a line about what you think of the blog at siddhakarana@gmail.com

Yours Truly,
Gaurav Jalan