Moore’s law, the driving force behind the evolution of the semiconductor industry, observes that the complexity on silicon doubles every two years. To sustain this outcome for the past 50+ years, however, several enablers had to evolve at the same or a much faster pace. Verification as a practice unfolded as part of this journey and entered the mainstream, maturing with every new technology node. To turn the wheel each time, various spokes in the form of languages, EDA tools, flows, methodologies, formats & platforms get introduced. This requires countless hours of contribution from individuals representing a diverse set of organizations & cultures, putting the canvas together for us to paint the picture.
Tom Fitzpatrick
At DVCon, Accellera recognizes the outstanding achievement of an individual and his or her contributions to the development of standards. This year, Tom Fitzpatrick, Vice Chair of the Portable Stimulus Working Group and member of the UVM Working Group, is the recipient of the 8th annual Accellera Technical Excellence Award. Tom, who represents Mentor at Accellera, has more than three decades of rich experience in this industry. In a quick chat with us, he shares his journey as a verification engineer, technologist & evangelist!
Many congratulations, Tom! Tell us about yourself & how you got started in the verification domain.
Thanks! I started my career as a chip
designer at Digital Equipment Corporation after graduating from MIT. During my
time there, I was the stereotypical “design engineer doing verification,” and
learned a fair amount about the EDA tools, including developing rather strong
opinions about what tools ought to be able to do and not do. After a brief
stint at a startup, I worked for a while as a verification consultant and then
moved into EDA at Cadence. It was in working on the rollout of NC-Verilog that
I really internalized the idea that verification productivity is not the same
thing as simulator performance. That idea is what has really driven me over the
years in trying to come up with new ways to make the task of verification more
efficient and comprehensive.
Great! You have witnessed verification evolving over decades. How has your experience been on this journey?
I’m really fortunate to have “grown
up” with the industry over the years, going from schematics and vectors to
where we are now. I had the good fortune to do my Master’s thesis while working
at Tektronix, being mentored by perhaps the most brilliant engineer I have ever
known. I remember the board he was working on at the time, which had both TTL
and ECL components, multiple clock domains, including a voltage-controlled
oscillator and phase-locked loop, and he got the whole thing running on the
first pass doing all of the “simulation” and timing analysis by hand on paper.
That taught me that even as we’ve moved up in abstraction in both hardware and
verification, if you lose sight of what the system is actually going to do, no
amount of debug or fancy programming is going to help you.
For me, personally, I think the
biggest evolution in my career was joining Co-Design Automation and being part
of the team that developed SUPERLOG, the language that eventually became
SystemVerilog. Not only did I learn a tremendous amount from luminaries like
Phil Moorby and Peter Flake, but the company really gave me the opportunity to
become an industry evangelist for leading-edge verification. That led to
working on VMM with Janick Bergeron at Synopsys and then becoming one of the
original developers of AVM and later OVM and UVM at Mentor. From there I’ve
moved on to Portable Test and Stimulus as well.
So, what according to you were the key changes that have impacted the verification domain the most?
I think there were several. The
biggest change was probably the introduction of constrained-random stimulus and
functional coverage in tools like Specman and Vera. Combined with concepts like
object-oriented programming, these really brought verification into the
software domain where you could model things like the user accidentally
pressing multiple buttons simultaneously and other things that the designer
didn’t originally think would happen. I think it was huge for the industry to standardize on UVM, which codified those capabilities in SystemVerilog so users were no longer tied to those proprietary solutions, and the fact that UVM is now the dominant methodology in the industry bears that out. As designs have become
so much more complex, including having so much software content, I hope that
Portable Stimulus will serve as the next catalyst to grow verification
productivity.
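To illustrate the idea Tom describes, here is a minimal, purely illustrative SystemVerilog sketch (all names are hypothetical, not from any specific environment) of constrained-random stimulus paired with functional coverage: the randomization is deliberately biased toward multi-button presses a designer might not have anticipated, and a covergroup records which press counts were actually exercised.

```systemverilog
// Illustrative sketch: constrained-random stimulus + functional coverage
// for the "multiple buttons pressed at once" scenario described above.
class button_txn;
  rand bit [3:0]    buttons;      // one bit per front-panel button
  rand int unsigned num_pressed;  // how many buttons are held simultaneously

  // Tie the count to the button vector and bias the solver toward
  // multi-press combinations.
  constraint c_count { num_pressed == $countones(buttons); }
  constraint c_bias  { num_pressed dist { 1 := 2, [2:4] := 8 }; }

  // Track which press counts the random stimulus has actually covered.
  covergroup cg;
    presses : coverpoint num_pressed { bins single = {1}; bins multi = {[2:4]}; }
  endgroup

  function new();
    cg = new();
  endfunction
endclass

module tb;
  initial begin
    button_txn txn = new();
    repeat (100) begin
      if (!txn.randomize())
        $error("randomize() failed");
      txn.cg.sample();
      // drive txn.buttons onto the DUT interface here
    end
  end
endmodule
```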
Tom, you have been associated with Accellera for a long time & have contributed to multiple standards in different capacities. How has your experience been working on standards?
My experience with standards has
been entirely self-inflicted. It started when I was at Cadence and heard about a
committee standardizing Verilog but that there were no Cadence people on the
committee. I kind of joined by default, but it’s turned out to be a huge part
of my career. Aside from meeting wonderful people like Cliff Cummings, Stu
Sutherland and Dennis Brophy, my work on standards over the years has given me
some unique insights into EDA tools too. I’ve always tried to balance my “user
side,” where I want the standard to be something I could understand and use,
with my “business side,” where I have to make sure that the standard is
supportable by my company, so I’ve had to learn a lot more than someone in my
position otherwise might about how the different simulators and other tools
actually work. On a more practical note, working on standards committees has
also helped me learn everything from object-oriented programming to Robert’s
Rules of Order.
You have been one of the key drivers behind the development of the Portable Test and Stimulus Standard (PSS). How was your experience working on this standard compared to UVM?
Good question! UVM was much more of
an exercise in turning existing technology into an industry standard, which
involved getting buy-in from other stakeholders, including ultimately VMM
advocates, but we didn’t really do a whole lot of “inventing.” That all
happened mostly between Mentor and Cadence in developing the OVM originally. We
also managed to bring most of the VMM Register Abstraction Layer (RAL) into UVM
a bit later.
Portable Stimulus has been different for
two reasons. First, I’m the vice-chair of the Working Group, so I’ve had to do
a lot more planning than I did for UVM. The other is that, since the technology
is relatively new, we had the challenge of combining the disparate capabilities
and languages used by existing tools into a new declarative language that has
different requirements from a procedural language like SystemVerilog. We spent
a lot of time debating whether the standard should include a new language or
whether we should just use a C++ library. It took some diplomacy, but we
finally agreed to the compromise of defining the new language and semantics,
and then producing a C++ library that could be used to create a model with the
same semantics. To be honest, we could have played hardball and forced a vote
to pick only one or the other, but we wanted to keep everyone on board. Since
we made that decision, the working group has done a lot of really great work.
What are the top 2-3 challenges that you observe we as an industry need to solve in the verification domain?
Remember when I said earlier that
verification productivity is about more than simulator performance? Well, with
designs as big and involved as they are today – and only going to get more so –
we’re back at the point where you need a minimum amount of performance just to
be able to simulate the designs to take advantage of things like UVM or
Portable Stimulus without it taking days. This is actually part of the value of Portable Stimulus: the engine can now be an emulator, an FPGA prototype, or even silicon, so you get the performance to produce results relatively quickly along with the productivity.
The other big challenge, I think, is going to be the increasing software content of designs. Back when I started,
“embedded software” meant setting up the hardware registers and then letting
the hardware do its thing. It made verification relatively easy because RTL
represents physical hardware, which doesn’t spontaneously appear and disappear the way software does. We’ve spent the last ten or so years learning how to use
software techniques in verification to model the messy stuff that happens in
the real world and making sure that the hardware would still operate correctly.
When you start trying to verify a system that has software that could
spontaneously spawn multiple threads to make something happen, it becomes much
harder. Trying to get a handle on that for debug and other analysis is going to
be a challenge.
But perhaps the biggest challenge is
going to be just handling the huge amounts of data and scenarios that are going
to have to be modelled. Think about an autonomous car, and all of the
electronics that are going to have to be verified in an environment that needs
to model lots of other cars, pedestrians, road hazards and tons of other stuff.
When I let myself think about that, it seems like that could be a larger leap
than we’ve made since I was still doing schematic capture and simulating with
vectors. I continue to be blessed to now work for a company like Siemens, which is actively engaging with this very problem.
Based on your vast experience, any words of wisdom for practicing & aspiring verification engineers?
I used to work with a QA engineer
who was great at finding bugs in our software. Whenever a new tool or user
interface feature came out, he would always find bugs in it. When I asked him
how he did it, he said he would try to find scenarios that the designer
probably hadn’t thought about. That’s pretty much what verification is. Most
design engineers are good enough that they can design a system to do the
specific things they think about, even specific corner cases. But they can’t
think of everything, especially with today’s (and tomorrow’s) designs.
Unfortunately, if it’s hard for the design engineer to think of, it’s probably
hard for the verification engineer to think of too. That’s why verification has
become a software problem – because that’s the only way to create those
unthought-of scenarios.
Thank you, Tom, for sharing your insights & thoughts.
Many congratulations once again!
DVCon US 2019 - February 25-28, 2019 - DoubleTree Hotel, San Jose, CA