Object-Oriented Programming Is Back

After years of promises and threats, it’s real. But beware of the steep learning curve.


Object-oriented programming is finally starting to look promising. For anyone who’s been following this technology, a statement like that is enough to evoke loud groans. Object-oriented programming, a.k.a. OOP, was first developed in the early 1960s. The goal was, and still is, to re-use components in software development—almost like Legos—by raising the level of abstraction for programmers and shielding them from many of the interfaces and protocols that can bog down coding.


It took three decades from the conceptual phase, but by the early 1990s the OOP market finally began showing signs of life in the commercial applications space. The problem in need of a solution was that application development trailed hardware development by several years, and the code being produced was incredibly sloppy. Commonly referred to as bloatware, applications contained so many bug fixes that they ate up much of the performance promised by new hardware, which at the time was a multibillion-dollar growth market for companies like IBM, HP, Intel, and a slew of others, many of which no longer exist.


Fast forward to 2008 and software still isn’t keeping up with hardware. But the problem this time is on the chip side. Because software is being developed concurrently with the hardware, the best way to keep development on pace with the hardware is with re-usable parts. OOP fits right in with that strategy.


“The main reason that object-oriented programming exists in the software world, and why it’s being applied to the hardware world, is that it gives you new tools for managing complexity,” said Mike Meredith, president of the Open SystemC Initiative (OSCI). “You put complexity in a box, wrap it up with a bow, and they can ignore some of that complexity. You build some part of it with a published interface, and people don’t have to learn the data structures inside there and the algorithms it’s using to accomplish this stuff. The move to object-oriented comes as a direct response to all the complexity that these systems convey.”


While some companies have been working with their own proprietary OOP tools, most discussions in the hardware space have been confined to academic experimentation.


“This all comes out of Moore’s Law,” Meredith said. “The processors are so fast that people can be much more ambitious about what functionality they want to put in software. Those ambitions could only be achieved with large bodies of software, which were hard to manage. Object-oriented programming was invented as a way of managing it. On the system side, we have the same thing now. The performance of those processors came out of the number of gates available. On the embedded side, it’s the same thing. The number of gates means we have bigger processors, more processors, more memories and more hardware units on each chip. Designers can be more ambitious about what they build into the chip. That means more complex stuff, and you need ways to manage the complexity.”


Not everyone in the hardware world is unfamiliar with these tools, but the level of expertise is still extremely low compared with other skills employed by hardware engineers.


“Object-oriented programming has been in the software community for the last five years, and there is a lot of usage of it in C++ or verification languages,” said Nizar Romdan, product marketing manager in ARM’s System Design Division. “Hardware engineers have to get used to it. There are no experts yet, but it’s a lot better than it was 5 to 10 years ago. There are a lot of models and engineers who can write in C++ and SystemC.”


Nevertheless, the learning curve remains undeniably steep. Lock a hardware engineer and a software engineer in a room and they still may not fully grasp what the other is saying or why certain things are priorities while others are not.


“It’s a lot easier to take a hardware engineer and give them object-oriented programming skills than to take a software engineer and give them hardware design skills,” said Glenn Perry, general manager of ESL/HDL Design Creation at Mentor Graphics. “The object-oriented world we’re moving into provides a common vernacular for all the disciplines to communicate and interact, but it doesn’t eliminate the need for a hardware engineer to put his hardware engineering work into the design. A good example of that is in high-level synthesis. You can’t just write an algorithm. You have to write it so it takes into consideration the hardware. It’s not that object-oriented programming is rocket science, but it’s not trivial, either. It’s a steep learning curve. There are tools that are being developed that deal with the complexity of object-oriented programming in the hardware world.”


But it does open the door for hardware and software engineers to really start talking about what new features can be added into systems and to understand what the impact will be of those new features. When two formerly separate worlds staffed with highly trained engineers begin trading information, the results are almost certain to be interesting.


Ed Sperling