In a perfect world, every designer would create designs that perfectly match their original specifications. For the rest of us, the realities of life mean that we don't quite attain that level of perfection. There may be one or two areas we forgot about -- perhaps an interaction that causes unexpected results, or a case we didn't fully understand from the specification.
This is, of course, why we have to verify our designs. Unfortunately, the verification methodologies in use today assume that everything a designer does is erroneous. Worse than that, these methodologies have to assume that we did nothing at all. In an Agile development world this is actually quite close to the truth, because with this methodology you develop the test first. This means that -- by definition -- the test must initially fail, because there is no functionality yet to fulfill it. As the maxim goes, "If you don't verify it, then it is broken."
This is where one of my complaints about constrained random verification comes in -- everything is treated equally. Now, it is possible to define the corner cases -- and this is often essential in order to reduce the number of verification targets -- but you can't easily target particular aspects of the design. This was part of the beauty of directed tests: you knew exactly what each and every one of them did. With a constrained random methodology, the only way we know what happened is by looking at the coverage obtained from a particular run -- but this is of little help when the test goes wrong and you have to work out what the test was actually doing and ascertain whether it was even a valid test!
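To make the contrast concrete, here is a toy sketch in plain Python (not a real HDL testbench -- the adder, names, and constraints are all invented for illustration). The directed test's intent is obvious from its code; with the constrained-random loop, only the coverage record afterwards tells you which cases actually occurred.

```python
import random

def adder_dut(a, b):
    """Stand-in for the design under test: an 8-bit adder with wraparound."""
    return (a + b) & 0xFF

# Directed test: we know exactly what it exercises -- the overflow corner.
def directed_overflow_test():
    assert adder_dut(0xFF, 0x01) == 0x00, "overflow corner failed"

# Constrained-random test: the constraint biases operands toward large
# values, but only the coverage map tells us what was actually hit.
def constrained_random_test(runs=1000, seed=42):
    rng = random.Random(seed)
    coverage = {"overflow": 0, "no_overflow": 0}
    for _ in range(runs):
        a = rng.randint(0x80, 0xFF)   # constraint: large first operand
        b = rng.randint(0x00, 0xFF)
        result = adder_dut(a, b)
        assert result == (a + b) & 0xFF, "checker caught a mismatch"
        coverage["overflow" if a + b > 0xFF else "no_overflow"] += 1
    return coverage

directed_overflow_test()
cov = constrained_random_test()
```

Note that even here, inspecting `cov` after the run is the only record of what the random stimulus did -- which is exactly the debugging problem described above.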
I used to like the approach taken by Certify -- a product now owned by Synopsys. This was a much more complete verification methodology, because before you counted something as being covered, Certify made sure that a check had actually been performed that would have flagged an invalid result if that thing (function) had not worked. What, I hear you say? Constrained random doesn't do that? Nope. It merely notes that something happened inside the design and never requires that the result of that "happening" was correct. It marks the item off as covered as soon as it sees the event occur, with no expectation that the result of this "happening" will propagate to a checker. Oh well...
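The distinction drawn above can be sketched in a few lines of Python (again a toy illustration with invented names, not any real tool's behavior): "event-based" coverage marks a bin the moment the stimulus occurs, while "checked" coverage counts it only after a checker has confirmed the result was correct. With a deliberately buggy DUT, the two disagree.

```python
def buggy_dut(a, b):
    """Stand-in DUT with a deliberate bug: off-by-one on overflow."""
    s = a + b
    return s & 0xFF if s <= 0xFF else (s - 1) & 0xFF

def run(a, b):
    event_coverage = set()     # marked merely because the event occurred
    checked_coverage = set()   # marked only if a checker validated it
    result = buggy_dut(a, b)
    bin_name = "overflow" if a + b > 0xFF else "no_overflow"
    event_coverage.add(bin_name)
    if result == (a + b) & 0xFF:   # the checker
        checked_coverage.add(bin_name)
    return event_coverage, checked_coverage

ev, ck = run(0xFF, 0x01)   # the overflow case exercises the bug
# ev reports "overflow" as covered; ck stays empty, because the
# checker saw an incorrect result
```

Event-based coverage would happily report the overflow case as done, even though the design is broken there -- which is the gap the Certify-style approach closes.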
In an FPGA, coverage and debug take on a whole different dimension. When the design is loaded into a software simulator, you can see the value on each and every register, wire, or other doodad in the project. But when the design is loaded into a physical FPGA, you somehow have to get that data out of the device and into a software package for display. Alternatively, you may want to use bench test equipment such as a logic analyzer.
Consider a solution offered by Agilent whereby, for every pin on the FPGA device, you can examine up to 128 internal signals. This works for both Xilinx and Altera devices. With Xilinx, the software used to get the data out is called ChipScope Pro. I can't even scratch the surface of this tool in this column, but Xilinx offers a two-day course to bring users up to speed with this technology. However, I did find a pretty good 10-minute introductory video on YouTube describing ChipScope Pro.
In my next column, I will look at other software that can be used to examine state information inside an FPGA. Meanwhile, do you use ChipScope or similar FPGA vendor-supplied software for debug?
@Brian: I think it is fine to post things from papers just so long as the source is clearly identified.
Good point -- on the other hand, we don't want someone to post entire papers or presentations as hundreds of comments. In that case, just provide a comment saying what it is, along with a link to the main document.
Well, I don't know if this is the right place to post or not, but I have been looking at using ChipScope Pro over PCIe rather than JTAG. In my project I am using a PCIe-based development board that does not have a JTAG connection, and I am in dire need of something to look inside my FPGA, like ChipScope...
I've got a Zynq board from the kind folks at Xilinx.
I've got an Actel ProASIC development board.
And I've got a partial board from a previous employer, simply because it has a Cyclone III and lots of headers on it.
I've also got a very old 8052 emulator and a homemade AVR board.
I believe 3D ICs are basically the replacement for the PCB. In the near future, the PCB will become little more than a carrier that provides connectors and perhaps a few components that cannot be economically integrated within the chip package.