Long-distance communication has come a long way over the last century! The 64 kbps telephone line enabling real-time communication revolutionized traditional messenger-based means. Wireless communication brought in another revolution, enabling people to talk on the go! While this eased communication, it also took technology to remote places, adding millions of users to the network. With wireless & internet joining hands, a new world of possibilities opened up that continues to evolve and amaze us by the day! Qualcomm continues to lead the effort of bringing wireless technology and its enablers to the mainstream. The verification team at Qualcomm works relentlessly every day to overcome the challenges posed by this accelerated rise in complexity. As we look forward to the 5G-enabled world, what are some of these challenges & expectations from the next generation of verification tools & technologies?
Srini Maddali, Vice President, Technology at Qualcomm India Design Center, leads the Mobile SoC development engineering teams. With a focus on low-power, high-performance SoC design and enabling a rapid ramp to volume for these designs, Srini has first-hand insight into these challenges, which he will share in his keynote on Accellera Day in India at Radisson Blu Bengaluru on November 14, 2018. A quick chat with Srini unfolded the adventurous ride that awaits the verification community as we embrace the proliferation of IoT through the 5G route. Read on!!!
Srini, you plan to speak on the topic “Challenges in Validation of Low Power, High Performance, and Complex SoCs at Optimal Cost”. Tell us more about it?
Year over year, complexity, performance, and power requirements have been increasing rather drastically. In addition, the time from the start of SoC development to customer deployment is shrinking. These pose significant challenges to SoC development teams, in both design and validation.
Putting together a test plan that comprehends the use-case requirements of each domain in these systems (systems on a monolithic die), and then validating concurrent use cases involving multiple domains operating independently while creating conditions that stress the designs, is quite challenging for our team. On top of this, the team must validate the performance of each domain independently, validate concurrent scenarios, and simulate use cases beyond the spec of the system, ensuring a graceful exit each time.
The challenges multiply further when simulating power profiles comprising a number of power domains with a huge number of system use cases interacting dynamically.
The cost aspect goes beyond resources; it is also the time aspect of delivering the SoC to the customer on a tight schedule.
Combining all of this with multiple SoCs being developed in parallel, to leverage the best across the chips, and validating the entire SoC builds up a multi-dimensional challenge for any team. As part of the team, I witness this with every chip development, and with multiple developments per year. I will be covering these aspects in my talk.
WoW!!! Srini, you are throwing light on the whole iceberg! Wondering what your team’s experience has been with the philosophy that verification takes 70% of the design cycle?
As mentioned, with the design complexities and the need to validate system use cases, the verification effort spans from the architecture phase of the design until customer sampling. Engaging teams right from the architectural discussions helps define the test plans, the task details, and the task priorities, covering all aspects of system validation, i.e., functional, performance, power, etc. On top of this, our teams leverage Formal and emulation platforms to close any gaps in coverage.
Srini, you mentioned leveraging Formal technology. Has it been offloading simulation tasks, or trailblazing verification?
Formal has been an integral part of our verification methodology, and our teams leverage its capability in validating our designs/IPs and SoCs. We use Formal right from providing a jumpstart to design validation, before deploying UVM or other techniques, all the way to the last mile of coverage closure.
UVM as a methodology was proposed to solve some of these challenges at the IP level & aid at the SoC level too. How does your team view UVM’s contribution in alleviating these challenges?
With the challenges described, creating a verification environment from scratch for every SoC is very difficult and inefficient. Having a modular verification environment that enables porting an IP/design to the SoC and reusing it across SoCs/designs improves efficiency and quality. UVM enabled us to scale this to our needs and certainly helped alleviate the challenges of handling the complexity as well as managing multiple SoC developments. UVM has served the complexities of the last several years very well. However, with designs/SoCs becoming more and more complex, managing them with a UVM testbench is itself becoming a challenge.
Srini, you also mentioned using emulation. What has your team’s experience with this platform been?
With the number of clock domains, power domains, and infra systems, some designs can be a bit tight for emulation platforms. Based on the need and complexity, we deploy all techniques, including hybrid emulation, to cover the SoC.
A lot of these challenges can be mitigated with an integrated, holistic approach to verification & validation. Do you believe Portable Stimulus, aiming along those lines, would provide a solution to them?
Yes. With Portable Test & Stimulus, once vendor tools start supporting the full feature set defined in the standard, it shall enable validation at the context level, extending the ability to leverage block/IP-level validation at the system level. This will help cover system-level scenarios effectively and obtain coverage at the context and use-case level.
Qualcomm is a pioneer in cellular technology, and 5G would enable a system of systems across all domains. Do you see Safety & Security as the next set of challenges already standing at the door?
Safety & security are always a challenge, with constant news about vulnerabilities detected in systems. With 5G, we will have systems that operate differently based on the use case/environment: e.g., streaming a movie or video is a high-bandwidth mode, vs. an automotive environment that operates with very low latency and guaranteed service, vs. an AI mode leveraging cloud compute, and so on. Security and safety will be even more critical for systems that morph based on the need/environment. It is a very interesting and equally challenging topic for sure.
Srini, this year we are holding Accellera Day in India for the first time. What are your expectations from the event?
It is indeed nice to see. Events like this help the VLSI community come together, share their ideas and views, and learn about the latest trends in the vast verification universe.
Thank you Srini!