Tuesday, November 6, 2018

Quick chat with Srini Maddali: Keynote speaker, Accellera Day India 2018

Long-distance communication has come a long way over the last century! The 64 kbps telephone line enabling real-time communication revolutionized traditional messenger-based means. Wireless communication brought in another revolution, enabling people to talk on the go! While this eased communication, it also took technology to remote places, adding millions of users to the network. With wireless & internet joining hands, a new world of possibilities opened up that continues to evolve and amaze us by the day! Qualcomm continues to lead the effort of bringing wireless technology and its enablers to the mainstream. The verification team at Qualcomm works relentlessly every day to tame the challenges posed by this accelerated rise in complexity. As we look forward to the 5G-enabled world, what are some of these challenges & the expectations from the next generation of verification tools & technologies?

Srini Maddali

Srini Maddali, Vice President, Technology at Qualcomm India Design Center, leads the Mobile SoC development engineering teams. With a focus on low-power, high-performance SoC design and on enabling a rapid ramp to volume for these designs, Srini has first-hand knowledge of these challenges, which he will share in his keynote at Accellera Day India at the Radisson Blu, Bengaluru, on November 14, 2018. A quick chat with Srini unfolded the adventurous ride that awaits the verification community as we embrace the proliferation of IoT through the 5G route. Read on!!!

Srini, you plan to speak on the topic “Challenges in Validation of Low Power, High Performance, and Complex SoCs at Optimal Cost”. Tell us more about it?

Year over year, complexity, performance, and power requirements have been increasing drastically. In addition, the time from the start of SoC development to customer deployment is shrinking. Together, these pose significant challenges to SoC development teams, both in design and in validation.

For our team, putting together a test plan that comprehends the use case requirements of each domain in these systems (systems on a monolithic die), and then validating concurrent use cases involving multiple domains operating independently while creating conditions that stress the designs, is quite challenging. On top of this, the team must validate the performance of each domain independently, validate concurrent scenarios, and simulate use cases beyond the spec of the system, ensuring a graceful exit each time.

The challenges multiply further when simulating power profiles comprising a number of power domains with a huge number of system use cases interacting dynamically.

The cost aspect goes beyond resources; it is also the time aspect of delivering the SoC to the customer on a tight schedule.

Combine all of this with multiple SoCs being developed in parallel to leverage the best across chips, and validating the entire SoC becomes a multi-dimensional challenge for any team. As part of the team, I witness this with every chip development, and with multiple developments per year. I will be covering these aspects in my talk.

Wow!!! Srini, you are throwing light on the whole iceberg! How has your team’s experience been with the philosophy that verification takes 70% of the design cycle?

As mentioned, given the design complexities and the system use cases to be validated, the verification effort spans from the architecture phase of the design until customer sampling. Engaging teams right from the architectural discussions helps define the test plans, detail the tasks, and prioritize them to cover all aspects of system validation, i.e. functional, performance, power, etc. On top of this, our teams leverage formal and emulation platforms to close any gaps in coverage.

Srini, you mentioned leveraging formal technology. Has it been offloading simulation tasks or trailblazing verification?

Formal has been an integral part of our verification methodology, and our teams leverage its capability in validating our designs/IPs and SoCs. We use formal right from providing a jumpstart to design validation, before UVM or other techniques are deployed, all the way to the last mile of coverage closure.
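To give a flavor of such a jumpstart (a minimal sketch, not Qualcomm's methodology; the arbiter interface and signal names below are hypothetical), a couple of properties bound to a block can be proven exhaustively by a formal tool before any UVM environment exists:

// Hypothetical request/grant arbiter checks, written to be
// formal-friendly: no testbench is needed, the proof is exhaustive.
module arb_props (
  input logic clk,
  input logic rst_n,
  input logic req,
  input logic gnt
);
  // A grant must never appear without a pending request.
  a_no_spurious_gnt : assert property (
    @(posedge clk) disable iff (!rst_n) gnt |-> req);

  // Every request is granted within 4 cycles (illustrative bound).
  a_gnt_latency : assert property (
    @(posedge clk) disable iff (!rst_n) req |-> ##[0:4] gnt);
endmodule

// Attached to the RTL without modifying it, e.g.:
// bind arbiter arb_props u_props (.clk(clk), .rst_n(rst_n), .req(req), .gnt(gnt));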

UVM as a methodology was proposed to solve some of these challenges at the IP level & to aid at the SoC level too. How does your team view UVM’s contribution in alleviating these challenges?

With the challenges described, creating a verification environment from scratch for every SoC is very difficult and inefficient. Having a modular verification environment that allows an IP/design environment to be ported to the SoC and reused across SoCs/designs helps improve efficiency and quality. UVM enabled us to scale this to our needs and certainly helped alleviate the challenges of handling the complexity as well as managing multiple SoC developments. UVM has served the complexities of the last several years very well. That said, with designs/SoCs becoming more and more complex, managing them with a UVM testbench is itself becoming a challenge.
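As a minimal sketch of this modular, vertical reuse (the class names are illustrative, not Qualcomm's environment), an IP-level UVM environment can be instantiated unchanged inside an SoC-level environment, typically flipped to passive mode so that it only monitors and checks:

import uvm_pkg::*;
`include "uvm_macros.svh"

// Self-contained IP-level environment: agents, scoreboard, and
// coverage for one IP live here and are verified standalone.
class ip_env extends uvm_env;
  `uvm_component_utils(ip_env)
  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction
  // build_phase would create the IP's agents/scoreboard here
endclass

// SoC-level environment: reuses ip_env unchanged; only the
// configuration differs.
class soc_env extends uvm_env;
  `uvm_component_utils(soc_env)
  ip_env m_ip_env;  // one instance per reused IP
  function new(string name, uvm_component parent);
    super.new(name, parent);
  endfunction
  function void build_phase(uvm_phase phase);
    super.build_phase(phase);
    // At SoC level the integrated RTL drives the traffic, so the
    // IP env's agents are flipped to passive (monitor/check only).
    uvm_config_db#(uvm_active_passive_enum)::set(
      this, "m_ip_env.*", "is_active", UVM_PASSIVE);
    m_ip_env = ip_env::type_id::create("m_ip_env", this);
  endfunction
endclass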

Srini, you also mentioned using emulation. How has your team’s experience been with this platform?

With the number of clock domains, power domains, and infrastructure systems, some designs can be a bit tight for emulation platforms. Based on the need and complexity, we deploy all techniques, including hybrid emulation, to cover the SoC.

A lot of these challenges could subside with an integrated, holistic approach to verification & validation. Do you believe Portable Stimulus, aiming along those lines, will provide a solution to them?

Yes. With Portable Test & Stimulus, once the vendor tools support the full feature set defined in the standard, it will enable validation at the context level, extending the ability to leverage block/IP-level validation at the system level. This will help cover system-level scenarios effectively and obtain coverage at the context and use case level.

Qualcomm is a pioneer in cellular technology, and 5G would enable a system of systems across all domains. Do you see safety & security as the next set of challenges already standing outside the door?

Safety & security are always a challenge, with constant news about vulnerabilities detected in systems. With 5G, we will have systems that operate differently based on the use case/environment: streaming a movie or video is a high-bandwidth mode, an automotive environment operates with very low latency and guaranteed service, an AI mode leverages cloud compute, and so on. Security and safety will be even more critical for systems that morph based on the need/environment. It is a very interesting and equally challenging topic for sure.

Srini, this year we are hosting Accellera Day in India for the first time. What are your expectations from the event?

It is indeed nice to see. Such events help the VLSI community come together, share their ideas and views, and learn about the latest trends in the vast verification universe.

Thank you, Srini!

Accellera Day India 2018 is being hosted at the Radisson Blu, Bengaluru, on November 14, 2018. Register now!!!

Monday, March 19, 2018

Negative Testing in functional verification!!!


Imagine someone on an important call when the mobile device suddenly reboots! The call was to inform them that the devices installed in their smart home seem to be behaving erratically, with only elderly parents & kids available to provide any further details. On booting up, the smartphone flashes that there has been a security breach and data privacy has been compromised. Amidst this chaos, the car’s cruise control doesn’t respond to the pedals!!! Whew!!!... nothing but one of the worst nightmares of the age of technology we live in! But what if some of it could come true someday? What if the user has little or no idea about the technology?

The mobile revolution has enabled the common man to access technology and use it for different applications. Data from Internet World Stats suggests that internet adoption worldwide has increased from 0.4% of the world population in 1995 to 54.4% in 2017. Related data also indicates that a sizable portion of these users are aged or barely literate. Ease of use has driven this adoption further, on the basic assumption that devices will function correctly 24x7 even when used incorrectly out of ignorance. The same assumption is seamlessly being extended to safety-critical domains such as medical & automotive, introducing several unknown risks for the user.

So how does this impact the way we verify our designs?

Traditionally, verification is assumed to mean ensuring that the RTL is an exact representation of the specification. Given that the state space spanned by the design elements is so huge, a targeted approach covering positive verification has been the practice all throughout. Here, no proof of a bug is assumed to equal proof of no bug! The only traces of anything beyond this approach include the following (a sketch of the first technique appears after the list) –

- Introducing an asynchronous reset during test execution to check that the design boots up correctly again.
- Introducing stimulus that triggers exceptions in the design.
- Simulating architecture- or design-level deadlock scenarios.
- Playing with key signals on a per-clock basis in low-power scenarios and reviewing the corresponding design response.
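
As a flavor of the first technique (a minimal sketch assuming a hypothetical dut_if virtual interface with an active-low rst_n; not a complete testbench), a mid-test asynchronous reset can be injected in parallel with regular traffic:

// Minimal sketch: inject an asynchronous reset at a random instant
// while traffic is running, then check that the design recovers.
task automatic inject_async_reset(virtual dut_if vif);
  int unsigned delay_ns = $urandom_range(500, 5000); // random mid-test point
  #(delay_ns * 1ns);
  vif.rst_n <= 1'b0;   // asynchronous assertion, mid-traffic
  #7ns;                // hold time deliberately unaligned to the clock
  vif.rst_n <= 1'b1;
endtask

// Used from the test body, racing against normal stimulus:
//   fork
//     run_normal_traffic();
//     inject_async_reset(vif);
//   join
//   run_normal_traffic();  // the design must boot up clean again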


But as we move forward, with security and safety becoming key requirements of the design, is this good enough? There is a clear need to redefine the existing approach and bring negative testing to the mainstream! Negative testing ensures that the design can gracefully handle invalid inputs, unexpected user behavior, potential security threats, or defects such as structural faults introduced while the device is operational. Amidst shrinking design schedules, negative testing really requires creative thinking coupled with focused effort.

To start with, it is important to question the assumptions used while defining the verification plan for the design. Validating those assumptions can itself yield a set of scenarios to be verified under this category. Next, review the constraints applied while generating stimulus to list out the potential illegal inputs of interest. Caution should be taken in defining this list, as the state space is large. Reviewing it in the context of the end application (Context-Aware Verification) will surely help narrow down this illegal stimulus set. Further, faults need to be injected at critical points inside the DUT using EDA tools or innovative testbench techniques. This is important for safety-critical applications, where the design needs to respond to random faults and exit gracefully while notifying about the fault, or even correcting it. And of course, appropriate coverage needs to be applied to measure the reach of this additional effort.
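
As a minimal sketch of the constraint-review step (the bus_txn transaction and its burst-length rule are hypothetical; a real illegal-input list would come from the spec review described above), a derived sequence item can override a legal-range constraint by name to generate illegal stimulus:

import uvm_pkg::*;
`include "uvm_macros.svh"

// Legal transaction, as the mainstream (positive) tests generate it.
class bus_txn extends uvm_sequence_item;
  rand bit [7:0] addr;
  rand bit [3:0] burst_len;
  // Legal burst lengths per the (hypothetical) spec.
  constraint c_burst { burst_len inside {1, 2, 4, 8}; }
  `uvm_object_utils(bus_txn)
  function new(string name = "bus_txn");
    super.new(name);
  endfunction
endclass

// Negative-testing variant: overriding c_burst by name replaces the
// legal rule, so randomize() now yields only illegal burst lengths.
// The scoreboard must expect an error response, not a data match.
class illegal_bus_txn extends bus_txn;
  constraint c_burst { !(burst_len inside {1, 2, 4, 8}); }
  `uvm_object_utils(illegal_bus_txn)
  function new(string name = "illegal_bus_txn");
    super.new(name);
  endfunction
endclass

Alternatively, calling txn.c_burst.constraint_mode(0) on a legal transaction disables the rule for a single randomize() call. Either way, the point is the same: the illegal space is generated deliberately and checked deliberately.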

As we step into an era of billions of devices empowering humans further, it is crucial that this system of systems be defect-free, especially where it touches the safety-critical parts of our lives. Negative testing is a potential way forward to ensure the reliability of designs for such applications. As is always said –

Better safe than sorry!


Sunday, March 4, 2018

Portable Stimulus: redefining the verification playing field yet again!

In the last 3+ decades, verification has come a long way! Right from quick testing by designers, to dedicated verification teams moving from directed testing to constrained random, to elements of formal apps added at times, it has been an eventful journey! Standardization of UVM enabled a common framework with which the fraternity marched forward in sync. Horizontal reuse, viz. at the IP level, experienced the maximum benefits of UVM, while vertical reuse, viz. from IP to SoC level, saw limited returns. Apart from methodology, verification has also proliferated beyond simulation and emulation into virtual prototyping, FPGA validation, post-silicon functional validation & HW-SW co-verification. Today, the reuse aspects are not limited to IP-to-SoC or across projects, but extend between platforms too. It is extremely important to realize reuse of effort at every level across these different vehicles, enabling first-silicon success. The challenge, however, is that each of these vehicles involves multiple stakeholders, like architects (chip, SW, system), SystemC modelling engineers, RTL designers, verification engineers, prototyping experts, post-silicon debuggers, and SW developers, each defining & driving a stage of the SoC design cycle. Different goals focusing on a specific stage, different approaches to solving the problems, and different means of representing solutions have made reuse across platforms a convoluted puzzle!!!

To solve this problem, Accellera formed a task force, the Portable Stimulus Working Group (PSWG), to review the concern & potential solutions. After long & regular sessions of intense activity over the last couple of years, the group has come up with a proposal in the form of a standard. A beta release of the preliminary version of the Portable Test and Stimulus Standard is now open for public review.

The basis of the solution is to take the stimulus definition across platforms up to an abstraction layer where the focus is on what is needed instead of how it shall be implemented. The idea is to understand & represent the action/outcome, i.e. the behavior of the DUT when the test runs. While representing these actions, the focus is on what the inputs and outputs are, what resources are required in the DUT, and how they relate to each other. The how, i.e. the implementation, is left to a hidden layer (read: EDA tool) that generates the required stimulus for the target platform based on the scenario definition. The actions referred to above are all declarative, not procedural or executable by themselves. A set of these static actions can be used to construct a scenario, analyzed for coverage to determine the paths to be traversed, & dumped into a test format by the hidden layer.

To represent these actions, the PSWG has proposed two formats: a Domain Specific Language (DSL) that is close to SystemVerilog, and a restricted C++ library. The two formats have equivalent, completely interchangeable semantics, such that for each construct in the DSL there is an exactly equivalent C++ library entry. If the actions of one IP are defined in the DSL and those of another IP in the C++ library, both can be read together to generate an SoC-level scenario for the target platform.

While the road from developing each testcase by hand to a testbench that generates tests has been long, it's time to take another step, in the direction of standardizing the stimulus! This would feed the testbenches for IP/SoC verification & be reused by the other workhorses of verification & validation in the SoC design cycle. Remember, reuse is one of the keys to keeping up with Moore's law!!!

The public review period for this proposal is open until Friday, March 30, 2018. Download & review NOW!!!

Sunday, February 18, 2018

Moana of Verification!!!

Dear Readers!!!! Welcome back!!!

During this lull in sharing thoughts, I realized that even blogging faces verification's dilemma of “How much is enough?”. Finally, thinking as a verification engineer should, I decided to continue as much as I can, with the hope of blogging more frequently than before – wish there were a Moore's law for bloggers too 😊!!! Since this is the first post of the year (Happy New Year, folks!!!), & since by this time of year the intent, action, and discussion around new year resolutions have mostly died down, let's start from there. While speaking to budding DV engineers about their new year resolutions around verification, I discovered that focus on the end goal is somehow missing & they seem to be heading in all directions at once. Reason? Well! Possibly many –
- The never-ending gap between industry expectations & academia.
- Missing core elements in the job description of a verification engineer.
- The overwhelming array of solutions that enable verification… etc.

Still confused on what I mean? Let me explain!

When I started my career as a verification engineer with legacy directed verification approaches, all I learnt was that to be successful you need one thing - a nose for bugs! In a way, that was because one needed to maximize the returns from each test, so you handcrafted each one to really mine for bugs. With rising silicon complexity, the verification domain has seen consistent advancements enabling us to verify designs faster & better. During this shift, the expectations from verification engineers also kept changing, i.e. demanding experience with a new flow, language, or methodology. The rise of SV & UVM further accelerated this shift, giving DV engineers a taste of elaborate & sometimes exotic code development opportunities. While this continued, reuse gained momentum in design & eventually in verification too. Code development gave way to reuse &, with the latest verification flows/methodologies, verification engineers started spending more time on derivative designs that demand debug capability over development.

Coming back to our budding engineers' discussion: learning a new flow/methodology builds expectations of development work, which may not always be needed. This trend has often led to the confusion of multiple directions, where the engineer ends up settling on the means and losing focus on the end goal. As DV engineers, our goal is to find bugs! Approaches like directed/constrained random/formal, or languages, methodologies & platforms like simulators/emulators etc., are all means to hunt bugs. While expertise in many or all of these is important, and may occasionally even lead to a career, the expectation from a verification engineer is really to catch bugs!

How does this relate to the topic of the blog?

Well! Moana is a film telling the story of a girl named Moana, the daughter of a chief of a Polynesian village. She is chosen by the ocean to reunite a mystical relic with a goddess. When a blight strikes her island, Moana sets sail in search of Maui, a legendary shapeshifting demigod, in the hope of returning the heart of Te Fiti and saving her people. On this journey, she discovers that her tribe were ‘voyagers’ who had forgotten their virtue and settled as villagers on an island that would soon die. She not only saves the island & her people but leads them back to their original selves – a journey far more exciting & enriching!

Similarly, UVM, formal apps, emulation, etc. are all means to find bugs on your verification journey. Don’t just settle for the means, which might sometimes be short-lived, but shoot for the end goal & be the Moana of Verification!!! A worthy resolution to pursue 😊!!!

What was your verification resolution this New Year???