In late March, Brian Bailey of Semiconductor Engineering published an article on standards: “Design by Architect or Committee?” It made me think of my own experience with the Accellera Unified Coverage Interoperability Standard (UCIS), an experience of which I am both proud and embarrassed. Proud, because when I was at Mentor Graphics I was the architect of the winning donation, and that’s a rare thing in any career: to contribute the design and architecture for an industry standard. Embarrassed, because I know I could have done better in a redesign. Any software engineer will tell you that the second design is always better, because you’ve learned from the first. We did some redesign as part of the standardization effort, but not to the degree I wanted.
In retrospect, the politics of Accellera UCIS were bound to be difficult, because if you think about it, the standard allows users to easily switch simulators. That’s what the “interoperable” part means. With simulation a slowly growing market, a sort of zero-sum game, one company’s gain is another’s loss. No one is going to be enthusiastic about a standard that helps them lose business. This point was also made in Brian’s article.
I also participated in the IEEE SystemVerilog standard. Say what you like about SystemVerilog: it is not just design by committee, it is design by multiple committees. But those committees really do have a lot of common ground and work well together. The atmosphere in Accellera UCIS meetings was more polarized.
The standard began with the realization inside Mentor Graphics that coverage analysis needed a public application programming interface (API). We made the crucial decision to use the same API internally for coverage creation, reporting, and analysis, and to make it usable in a standalone fashion as well. We tried to keep it simple and easy to grasp for verification engineers who were not software developers, without the complex data models and handles that would make it more like SystemVerilog VPI. This wasn’t entirely possible, but when we were done, we had something complete and functional.
It remains the favorite project of my career. In the early days of formulating the API, I had great fun brainstorming with Doug Warmke and Samiran Laha. (Samiran presented a poster on the UCIS API just this past DVCon.) We then gradually re-architected the coverage GUIs with my hands-on marketing counterpart, Darron May, and created a suite of brand-new verification management features. The work culminated in the Questa Verification Management Tracker GUI, which allowed test-traceability analysis tying together all kinds of coverage. I wrote the internal machinery of that GUI myself, and it was the ultimate validation of the API we had started a few years before.
There was quite a debate within Mentor about whether to try to make the API an industry standard. This is the rarefied domain of Mentor’s great tactician Dennis Brophy, so I don’t really know why we decided in favor of submitting it. I had heard that a customer was urging us to participate. I think we then expected backing from that customer, but it didn’t happen that way. One interesting twist is the behavior of the Big Three. With three big gorillas in the room, you get a lot of two-versus-one alignments. The push to SystemVerilog 2005 was initially a Synopsys-and-Mentor alliance versus Cadence. Perhaps just for political balance, UCIS became Cadence and Mentor versus Synopsys. We started meeting with Cadence well before the donation was approved, so the basis for the UCIS standard was really a combined effort of Cadence and Mentor.
The most vocal customers on the committee, however, were from Synopsys. This made the negotiations in the meetings difficult for us.
How we won the committee vote to accept Mentor’s donation in June 2009, I cannot say. That had much more to do with Dennis Brophy than with me, and certainly little to do with the merits of the competing donations. I will tell you, though, that the most stressful day of my 25-year career was the day I defended our donation to the committee: it had to be as perfect a performance as I could muster, and yet it didn’t really matter. It was a political exercise, not a technical one.
At the first meeting after acceptance of the donation, I produced a list of defects I wanted to correct or improve. From my point of view, this was just a standard software-engineering post-mortem; I had lived with the design for years and knew I could do better. The immediate reaction, however, was not a happy one, and I had to shut up.
I wasn’t completely ignored; some of the improvements I and others suggested were made during my remaining tenure on the committee, and more after I left Mentor and the committee. The most serious criticism of the standard, and one I agree with, is that the coverage models are not really interoperable. The API is, but the way different simulators store coverage itself is not. While I understand that users would like this, you have to ask which vendors would like it. None. A vendor would have to change its current implementation to adhere to some new way of doing things, only to increase the risk of losing its customers to another vendor. The worst problem is that coverage is rooted in particular language scopes, and language scopes themselves are not standardized. Synthesizable scopes are, but not verification scopes like those created by parameterized classes in SystemVerilog. Because the naming of those scopes depends on a company’s proprietary elaboration algorithm, it is very unlikely they will ever be standardized.
So, bottom line, UCIS was not a “win-win, a benefit for the vendors and a benefit for the users,” as Arturo Salz said in Brian’s article. I think Mentor initiated it to increase its profile and credibility as a verification vendor, and I suspect others were dragged along by the force of customers, but without a clear and universal win-win, its full promise remains unrealized.
I will always be grateful that I could participate in it, and it is a highlight of my professional career. But I do look back on it as a stressful experience. I hope UCIS will evolve and mature, and I pray it encourages an ecosystem of coverage-analysis tools to develop along with it. I am interested to see some positive signs, like Mark Litterick’s DVCon paper I blogged about last time. But now UCIS has a life of its own without me. As one of its several parents, I will follow it with natural interest and, of course, some measure of pride.