Alok Jain |
All of us have heard the story of the woodcutter and the
importance of the advice to “sharpen your axe”. It applies well to everything we do,
including verification. Two decades ago, the focus of a verification engineer
was predominantly on “What to Verify”. As complexity grew, “How to Verify”
became equally important. To enable this, EDA teams rolled out multiple
technologies & methodologies. As we try to assimilate & integrate these
flows amidst first-time-silicon & cost pressures, it is important for us to
sharpen our axe through continuous learning, applying the right tool for
the right job, and applying it effectively.
Alok Jain, Senior Group Director in the Advanced
Verification Division at Cadence, will be speaking along similar lines as part
of his DV track keynote on Day 1 at DVCon India 2016. With 20+ years of
industry experience, Alok leads the Advanced Verification Division at Cadence
India. Having been associated with a range of verification technologies over the
past two decades, Alok candidly shared his views on the challenges beyond
complexity that verification teams need to focus on. Here is a curtain-raiser
for his talk "Verification of complex SoCs" –
Alok, your keynote topic focuses on challenges in
verification beyond the complexity resulting from Moore’s law. Tell us more
about it.
The keynote is going to focus on challenges and potential
solutions for the verification of complex SoCs. Verifying a complex SoC consisting
of tens of embedded cores and hundreds of IPs is a major challenge in the
industry today. One of the big challenges is performance and capacity. Given
the size and complexity of modern SoCs, tests can run for 18-24 hours or even
more. One has to figure out how to get the best verification throughput.
Another challenge is the generation of test benches and tests. The test benches have
to be developed in a way that achieves good performance in both simulation
and hardware acceleration. Tests have to be created that stress the SoC under
application use cases, low-power scenarios, and multi-core coherency
scenarios. The tests have to be reusable across pre-silicon and post-silicon
verification and validation platforms. Yet another challenge is coverage. One
has to measure verification coverage across formal, simulation, and
acceleration platforms at the SoC level to know when verification is done. The final
challenge is how to debug effectively across RTL, test bench, and embedded
software on multiple verification platforms.
In the last decade, advancements in verification were focused
primarily on unifying HVLs & methodologies. What changes do you foresee
in verification flows ‘Beyond UVM’?
UVM is very well suited for IP, sub-system, and some specific
aspects of SoC verification. However, UVM is not the best approach for general
SoC verification. UVM was essentially developed for “bottom-up” verification,
where the focus is on trying to exhaustively verify IPs and sub-systems. SoCs
require a more “top-down” verification, where the focus is on stressing the SoC
under important application use cases. There is a need to reuse SoC content
across simulation, emulation, FPGA prototyping, and post-silicon. UVM is optimized for
simulation and is too slow and heavy for high-speed platforms. Finally, there
is a need to drive software stimulus on the CPUs in coordination with the hardware
interfaces. It is difficult in UVM to drive and control software and hardware
interfaces together. All this pushes us to explore options beyond UVM. The keynote
will cover some more insights into options beyond UVM.
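To make the contrast concrete, here is a minimal, hypothetical sketch of the software-driven style of stimulus Alok refers to: a bare-metal C test that runs on one of the SoC’s embedded cores and exercises a fictitious memory-mapped DMA block. All register names, offsets, and addresses below are invented purely for illustration; the point is that a plain C test like this can, in principle, be compiled and reused across simulation, emulation, FPGA prototyping, and silicon, which a simulation-centric UVM testbench typically is not.

/* Minimal sketch of a software-driven, "top-down" SoC test: a bare-metal C
 * program that runs on an embedded core and drives a hardware block through
 * its memory-mapped registers. All addresses and registers are hypothetical. */
#include <stdint.h>

#define DMA_BASE        0x40001000u  /* hypothetical DMA controller base address */
#define DMA_SRC         (*(volatile uint32_t *)(DMA_BASE + 0x00))
#define DMA_DST         (*(volatile uint32_t *)(DMA_BASE + 0x04))
#define DMA_LEN         (*(volatile uint32_t *)(DMA_BASE + 0x08))
#define DMA_CTRL        (*(volatile uint32_t *)(DMA_BASE + 0x0C))
#define DMA_STATUS      (*(volatile uint32_t *)(DMA_BASE + 0x10))
#define DMA_CTRL_START  0x1u
#define DMA_STATUS_DONE 0x1u

#define SRC_BUF  ((volatile uint32_t *)0x20000000u)  /* hypothetical on-chip SRAM */
#define DST_BUF  ((volatile uint32_t *)0x20001000u)
#define WORDS    16u

int main(void)
{
    /* Fill the source buffer with a known data pattern. */
    for (uint32_t i = 0; i < WORDS; i++)
        SRC_BUF[i] = 0xA5A50000u | i;

    /* Program and start the DMA transfer. */
    DMA_SRC  = (uint32_t)(uintptr_t)SRC_BUF;
    DMA_DST  = (uint32_t)(uintptr_t)DST_BUF;
    DMA_LEN  = WORDS * sizeof(uint32_t);
    DMA_CTRL = DMA_CTRL_START;

    /* Poll for completion, then check the copied data. */
    while ((DMA_STATUS & DMA_STATUS_DONE) == 0)
        ;
    for (uint32_t i = 0; i < WORDS; i++)
        if (DST_BUF[i] != (0xA5A50000u | i))
            return 1;  /* fail: reported through the platform's pass/fail hook */

    return 0;  /* pass */
}

Under these assumptions, the same C source could be launched from a simulation testbench, an emulator, an FPGA prototype, or a post-silicon bring-up environment, with only the pass/fail reporting hook differing per platform.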
The rise of IoT is stretching design demands to the far
ends of the spectrum, i.e. server-class vs. edge-node devices. How do you see verification flows catering to these demands?
Several of the requirements for IoT verification are similar
to the ones for complex SoCs. But then there are some unique additional
requirements from the IoT world. The first is simply the cost of verification.
For complex SoCs, the cost of verification has been steadily rising. For IoT
applications, one has to consider alternative methods and flows that can reduce
the cost. One option is to use some form of a correct-by-construction approach,
where the design is done specifically in a way that enables a simpler form of
verification. Another approach is to put much more emphasis on reuse. This
includes horizontal reuse, which is portability across multiple platforms, and
vertical reuse, which is reuse from IP to sub-system to SoC. Another requirement
is verification throughput for designs with considerably more analog, mixed-signal,
and low-power content. Finally, one has to devise verification techniques and
flows that can cater to the security and safety requirements of modern IoT
applications.
Formal took a while to become mainstream. The rise of Apps
in Formal seems to have accelerated this adoption. What’s your view on this?
Yes, I do agree that apps have considerably accelerated the
pace of adoption of formal. Traditionally, formal tools have been developed and
used by formal PhDs and experts. The main charter and motivation of these
experts was to solve the coolest and hardest problems in formal verification.
It was only after some time that both sides (developers and users) started
realizing that formal can be used in a much more practical and usable way by
engineers to solve specific problems. This led to the development of various
formal apps, which greatly enabled the mainstream usage of formal.
This is the 3rd edition of DVCon India. What are your
expectations from the conference?
I am expecting to attend keynotes, technical papers and
panel discussions that give me an understanding of some of the latest work in the
domain of design and verification of IPs, sub-systems and SoCs. In addition, I
am looking forward to the opportunity to network with some of my peers from
industry and academia.
Thank you Alok!
Come join us in this exciting journey to contribute,
collaborate, connect & celebrate @ DVCon India 2016!
Disclaimer: “The postings on this blog are my own and do not necessarily reflect the views of Aricent”