Sunday, August 27, 2017

Quick chat with Ravi Subramanian : Keynote speaker DVCon India 2017

Dr. Ravi Subramanian
For many decades, the semiconductor industry followed Moore’s law, transforming what was once a discrete chip carrying a single function on silicon into a small IP block inside today’s SoC. As we continue to debate "beyond Moore," "more than Moore," or the stagnation of this law, and step into the world of IoT, we realize that the system is no longer a single SoC; instead, it is a conglomeration of multiple tiny and large systems working in tandem, producing interesting use cases and enhancing the user experience. But are we, as the verification engineering workforce, ready with the required skills, the right arsenal of tools, and efficient workhorses to ride through this new challenge?

Dr. Ravi Subramanian, Vice President and General Manager of Mentor’s IC Verification Solutions Division, shares a holistic view on this subject in his opening keynote on Day 2 at DVCon India 2017. The talk, titled Innovations in Computing, Networking, and Communications: Driving the Next Big Wave in Verification, dives into the convergence of different technologies and its impact on verification. A quick chat with Ravi revealed the excitement we can all look forward to in his talk, as well as the future that lies ahead for all of us. Read on!

Ravi, your keynote focuses on the drivers of the next big wave in verification. Tell us more about it.

Yes, my talk will focus on the amazing innovations our industry is developing with respect to computing, networking, and communications. These include the changing nature of computing, the dramatic changes in networking and storage, and the disruptive effect of new broadband communications. Yet, the next big wave in design is actually the convergence of these technologies, which is driving today’s internet-of-things and autonomous systems revolution. A common theme across these emerging systems is the need for low power, security, and safety—whether you are talking about devices on the edge or high-availability systems in the cloud. These new challenges have opened innovation opportunities for us to rethink the way we approach verification.

IoT is driving the convergence of different technologies. How would it affect the way we verify systems today?

To answer your question, I first want to step back in time to provide a framework for today’s challenges. In the 1990s, the concept of separation of concerns was introduced into engineering. Essentially, the idea is that verification becomes more productive if we verify orthogonal concerns or requirements of the design separately, rather than trying to verify multiple concerns combined. For example, during this period we learned that it is more efficient to verify functional concerns and physical concerns in separate simulation runs. This approach to verification worked well up to about 10 years ago. The emergence of mobile devices introduced new low-power requirements that made it difficult to separate concerns. For example, today we see that physical concerns (such as low-power management) can directly affect the functional behavior of a device. Hence, these concerns need to be verified together. Bringing together physical, electrical, and functional verification has become mandatory.

The key point is that convergence of computing, networking, and communication, which is driving IoT, has introduced new layers of verification requirements that did not exist years ago, and the interaction of these requirements has had a profound effect on the way we must verify systems today.

What are the solutions that the EDA industry is driving to enable this next big wave in verification?

One contributing factor to growing verification complexity is the emergence of new layers of verification requirements, as I previously mentioned. For example, beyond the traditional functional domain, we have added clock domains, power domains, mixed-signal domains, security domains, safety requirements, software, and then obviously, overall performance requirements. Hence, we see the next big wave in verification on multiple fronts:

Continuing introductions of focused solutions optimized for specific verification concerns. Examples of these focused solutions include formal apps focused on verifying security features within a design, or power apps used to provide complete RTL power exploration and accurate gate-level power analysis within emulation.
Emerging system-level analysis solutions, which provide new metrics and insight into the fully integrated SoC. This becomes essential for system-level performance analysis. The IoT SoC, for example, is a different beast than today’s state-of-the-art networking SoC.
Greater convergence across multiple verification engines (e.g., simulation, emulation, and FPGA prototyping), which will improve productivity. The new Accellera Portable Stimulus standard will help facilitate this convergence and foster the introduction of new verification solutions.
Do you see domain-specific solutions, such as automotive or machine learning, being enabled for verification?

Yes, in fact there are multiple opportunities to leverage big data analytics to solve many system-level analysis problems. Machine learning is only one approach used today for big data analytics; however, there are others. Now, concerning domain-specific solutions in the automotive space, formal technology is being leveraged to improve productivity related to safety fault analysis.

Do you expect all the workhorses (simulation, emulation, and formal) to play a critical role in verifying these new converged system-level designs?

Obviously, this depends on the design. A project developing sensors for an IoT edge solution has different verification requirements than a project developing an automotive SoC containing multiple CPU and GPU cores, a coherent fabric, and multiple complex interfaces. Nonetheless, with increased design integration, multiple verification engines are required today that address the growing volume of verification requirements.

This is the 4th edition of DVCon in India. What are your expectations from the conference?

DVCon, in general, is recognized as the premier conference on the application of languages, tools, methodologies, and standards for the design and verification of electronic systems and integrated circuits. DVCon India is no exception; it has continued to grow in both attendance and exhibitor participation. I expect DVCon India 2017 will continue to deliver high-quality technical content and provide valuable networking opportunities for its attendees. It is the premier venue to share state-of-the-art developments and connect the creative minds working on them.

Thank you Ravi!

Join us on Day 2 (Sep 15) of DVCon India 2017 at Leela Palace, Bangalore to attend this keynote and other exciting topics.


Disclaimer: “The postings on this blog are my own and do not necessarily reflect the views of Aricent.”
