In the late 1980s and early 1990s, systems contained multiple complex devices, and processors were separate chips that had to be integrated at the board level. Models for these complex devices were hard to come by, and emulation was not an option for most companies at the time. A technique emerged whereby the real chip could be mounted on a board and connected to the simulator. It was argued that, since the real chip was being used, the model would by definition be bug-free, unlike the behavioral models others were producing.
The devices worked by replaying stimulus collected during the simulation. For example, after the device was reset, the simulator would send the first set of stimulus to the chip; it would be driven into the chip, and the outputs for the next cycle captured and fed back to the simulator. That initial stimulus was stored in a memory. When the second stimulus, corresponding to the next clock cycle, was sent to the hardware modeler, it would add it to the buffer, reset the chip, and replay all of the vectors collected so far. On the last cycle, it would capture the new response and send it back to the simulator.
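The reset-and-replay scheme can be sketched in a few lines. This is an illustrative model only, not any vendor's actual interface: `MockChip` stands in for the real silicon, and all class and method names here are hypothetical.

```python
class MockChip:
    """Stand-in for the real device: a trivial accumulator."""
    def __init__(self):
        self.state = 0

    def reset(self):
        self.state = 0

    def clock(self, vector):
        self.state += vector      # apply one cycle of stimulus
        return self.state         # output captured for the next cycle


class HardwareModeler:
    """Hypothetical sketch of the replay loop described above."""
    def __init__(self, chip):
        self.chip = chip
        self.buffer = []          # stimulus memory
        self.cycles_applied = 0   # total hardware cycles spent

    def evaluate(self, vector):
        """Called once per simulator clock cycle."""
        self.buffer.append(vector)
        self.chip.reset()
        response = None
        for v in self.buffer:     # replay the entire history so far
            response = self.chip.clock(v)
            self.cycles_applied += 1
        return response           # only the final response is new


modeler = HardwareModeler(MockChip())
results = [modeler.evaluate(v) for v in [1, 2, 3, 4]]
# results == [1, 3, 6, 10]; cycles_applied == 1 + 2 + 3 + 4 == 10
```

Note that simulating n cycles costs 1 + 2 + ... + n hardware cycles, which is the quadratic growth that made long runs slower and slower.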
Over time, as the size of the stimulus buffer grew, so did the time taken to evaluate each new vector: since every new cycle required replaying all previous cycles, the total hardware time grew quadratically with the length of the simulation. In addition, these devices were often run at slow clock speeds so that the sampling circuitry could be built cheaply, without the high-speed circuitry found in a tester. As a result, they provided no speed-up over the simulator, and for long simulations could actually slow the software simulator down.
Most simulation companies produced a hardware modeler, but few lasted for more than a short time. Logic Modeling (later acquired by Synopsys) became the independent supplier of choice.
Today, no such systems exist; they have been replaced by emulators and rapid prototyping systems.