Today, traditional methods of verifying a design are being strangled by complexity and are in need of new leaps forward. In our last post we discussed why a new pair of eyes was added to the design flow; since then we have continued not only to increase the number of those pairs but also to improve the lenses these eyes use to weed out bugs. These lenses are nothing but the flows and methodologies we have been introducing as the design challenges continue to unfold. Today, we
have reached a point in verification where ‘one size doesn’t fit all’. Not only does the nature of the design command a customized process for verifying it; even within a given design, moving from block to sub-system (UVM centric) and on to SoC/top level (directed tests), we need to change the way we verify each scope. Besides the level, certain categories of functions are best suited to a particular style of verification (read: formal). Beyond this, modelling the
design and putting a reusable verification environment around it to accelerate
the development is another area that requires attention. With analog sitting next to
digital on the same die, verifying the two together demands a unique approach.
All in all, for the product to hit the market window at the right time, you cannot just verify the design; you need a well-defined strategy to verify it in the fastest and best possible fashion.
So what is the OODA loop? From Wikipedia, the phrase OODA
loop refers to the decision cycle of observe,
orient, decide, and act, developed by military strategist and USAF Colonel John Boyd. According to Boyd, decision making occurs in a recurring cycle of
observe-orient-decide-act. An entity (whether an individual or an organization)
that can process this cycle quickly, observing and reacting to unfolding events
more rapidly than an opponent, can thereby "get inside" the opponent's
decision cycle and gain the advantage. The OODA loop was first applied at the strategic level in military operations; since the concept is core to defining the right strategy, its application base continues to grow.
To some extent the OODA loop entered the DV cycle with the introduction of Constrained Random Verification paired with Coverage Driven verification closure. Constrained random regressions kicked off the process of observing the coverage gaps, analyzing the holes, deciding whether they need to be covered, and acting by refining the constraints so as to direct the simulations to cover the holes.
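That observe-orient-decide-act cycle can be sketched as a toy Python model (a conceptual illustration only; the bins, the regression harness and the refinement policy are all hypothetical, not any real UVM or EDA API):

```python
import random

# Toy model of coverage-driven closure. Coverage "bins" are
# corner-case value ranges; each regression draws random stimuli
# under the current constraint, then the loop observes the holes
# and re-orients the constraint toward an uncovered bin.

BINS = [(0, 3), (100, 103), (200, 203), (252, 255)]  # hypothetical corner cases

def run_regression(constraint, hits, n=50):
    """Observe: simulate n random transactions within the constraint."""
    lo, hi = constraint
    for _ in range(n):
        value = random.randint(lo, hi)
        for i, (blo, bhi) in enumerate(BINS):
            if blo <= value <= bhi:
                hits[i] += 1

def coverage_closure(seed=1):
    random.seed(seed)
    hits = [0] * len(BINS)
    constraint = (0, 255)                # start fully random
    for _ in range(10):                  # bounded number of OODA iterations
        run_regression(constraint, hits)                    # Observe
        holes = [i for i, h in enumerate(hits) if h == 0]   # Orient
        if not holes:                                       # Decide: closed
            break
        constraint = BINS[holes[0]]      # Act: steer stimulus at a hole
    # Return the fraction of bins covered
    return sum(1 for h in hits if h > 0) / len(BINS)
```

Pure random stimulus rarely hits all the narrow corner-case bins, so each pass through the loop tightens the constraint toward the first remaining hole, mirroring how a DV engineer redirects regressions until coverage closes.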
Today, the need for applying the OODA loop is at a much higher level, i.e. to strategize on the right mix of tools, flows and methodologies to realize a winning edge. The outcome depends heavily on the two O’s, Observe & Orient. To maximize returns on these Os, one must be AWARE of –
1. The current process that is followed
2. Pain points in the current process, and anticipated ones for the next design
3. Different means to address the pain points
Even to address the first two points above, one needs to know what to observe in the process and how to measure the pain points. While the EDA partners try to help in the best possible way, enabling teams with the right mix, it is important to understand what kinds of challenges are keeping others in the industry busy and how they are solving them. One of the premier forums addressing this aspect is
DVCON! Last year, DVCON extended its presence from the US to India and Europe. These events provide a unique opportunity to get involved and, in the process, connect, share and learn. Reiterating the words of Benjamin Franklin
–
Tell me and I forget.
Teach me and I remember.
Involve me and I learn.
So this is your chance to contribute to enabling the fraternity with the Os of the OODA loop!
Relevant dates -
DVCON India – September 10-11, Bangalore
DVCON Europe – November 11-12, Munich
Other posts on DVCON at siddha'karana –