Managing Memory With Embedded Software

Making the most of memory resources means leveraging everything hardware and software can do.


By Ann Steffora Mutschler
Memory is shaping up to be a key leverage point for embedded software because it represents such a large fraction of the silicon real estate in today’s SoCs. Memory and memory bandwidth also account for a significant share of a system’s potential bottlenecks and power dissipation. As such, anything embedded software can do to optimize memory use can have a significant effect on the reliability of the system, the cost of the silicon, the power dissipation, and how robust and scalable the system is as the workload increases.

“SoCs are really becoming systems, and systems thinking becomes much more important,” said Chris Rowen, Cadence fellow and cofounder of Tensilica. “That includes a lot more need for abstraction in the way that the software is put together on the systems. You can’t say, ‘Oh well these two guys know everything about the software and they are going to optimize the last byte out of it and the last cycle and it’s all just going to be so perfect.’ That was fine when you were designing the software that went into a garage-door opener, but when you are thinking about something that is connected to the Internet, interacting with the cloud and has downloadable applications, you’re talking about something that is so much more complex from the software standpoint. You have to think of it as a layered system, where there is some base level software, which is talking to hardware, then there is some operating system, then there is some middleware and stacks, then some application services and finally the user applications.”

All of those layers will be developed by different engineers. “The layers are going to speak to one another through formalized application programming interfaces (APIs). In each of those layers, the need to be simple and comprehensible inevitably leads to both more cycles and more memory, because you have to buffer things more and you have to assume the worst case in terms of the size of what might get transferred. You really have to pad it out in a lot of ways,” Rowen said.

That’s good news for processor makers as these dynamics create incredible demand for processing. “The grand bargain that has emerged over the last 40 years of progress in computing is that the hardware guys create ever more computational capability and the software guys throw it away on abstraction,” he said. “I don’t mean that in a negative way. In fact, abstraction of programming is really the big goal for progress in systems. It isn’t as if we need our garage door opener to run 10,000 times faster than it used to run. We need 10,000 times more functionality. Functionality really comes from ease of programming. So in a sense we take all of this memory density and all of this computing capacity and we use it up by throwing it out, providing layers of abstraction. But it is those layers of abstraction often which really are the source of consumption not only of cycles but of more and more memory.”

An increase in memory means more bits on a chip, larger RAM arrays, more bandwidth to off-chip memory, and larger off-chip DRAM and flash devices.

To handle the abstraction, mechanisms for protection and isolation are needed, because as the complexity of applications grows, different kinds of software come from different people. Sometimes it’s software you trust because it was written by the engineer in the cubicle next to yours, and sometimes it’s an application you found on the web, thought sounded cool, and downloaded, Rowen added.

On the move toward system-level management, Mike Thompson, senior manager of ARC processor marketing at Synopsys, said: “Say you have a tablet, for instance. There you’ve got to manage the bus bandwidth and the bandwidth into memory, and that is a huge deal in a lot of these systems. We are seeing a lot more uptake of networks on chip (NoCs), which is really interesting.”

Further, Marc Greenberg, product marketing director at Synopsys, added that with off-chip DRAM, some users expect the memory scheduling function to be handled by the NoC or by some element within it. “In that context we relinquish control of the memory scheduling function to the NoC, and we handle all of the maintenance and protocol functions within the protocol controller. We see both types of customers. We definitely see the NoC customers who come in, and typically the NoC wants to manage all aspects of external DRAM. We also see the customers who want the memory controller to be responsible for scheduling and reordering.”

Making it virtual
Frank Ferro, director of product marketing for system IP developer Sonics, observed that there is still a disconnect between software engineers and hardware engineers, and getting them together has been a challenge. But with SoCs getting so expensive to develop, that expense is starting to drive more cooperation. “Specifically for memory, we saw the need to be able to move platforms from one SoC to the next in a portable manner much more quickly. Because of the expense, and because of the time it takes to develop the SoC, we realized that a big chunk of that expense is in the software. If the hardware guys change the chip and you have to rewrite software every time, it’s extremely unproductive. It might be obvious, but they still do it.”

As such, Sonics developed what it calls interleave memory technology (IMT) in its interconnect fabric/on-chip network, which balances the load between the memory channels so it doesn’t matter which bank of memory is storing the data. The memory becomes virtual to the software programmer, he explained. “All he knows is that the GPU is going to go get a piece of data out of memory. Whether it is in channel 1 or channel 2 becomes transparent, because the fabric automatically balances the load between the banks. We try to interleave as much as possible to get the most efficient access to the DRAM.”

From the virtual perspective, Rich Rejmaniak, technical marketing engineer at Mentor Graphics, noted that one of the benefits of virtual memory is the protection it provides between one area of memory and another. RTOS environments outside of Linux and Windows haven’t always been able to run virtual memory because it requires a lot of operating system overhead, increasing the complexity and the size of the system. Still, the ability to protect the different units from each other is critical.

To address this, Mentor has added the ability in its Nucleus RTOS to use that protection hardware, so the different applications and tasks running on the real-time operating system can no longer interfere with the kernel or with each other, he explained. In this environment it is not primarily a security measure against malicious code, because everything is typically fairly trusted.

One of the big problems with embedded software is a scribbler: a piece of software that has a bug and writes to memory where it isn’t supposed to, thereby crashing the other modules, Rejmaniak said. “It’s running fine but taking everybody else out, and you’re concentrating all your debug efforts on everyone else when they are not really the problem. By putting this protection in each module, if it strays outside of its area it triggers an interrupt from the processor, the operating system can trap it, and you find out very quickly where the origin of the problem is. It knocks out the vast majority of these problems very early, typically in the engineering environment before you even ship.”

Managing memory with embedded software is a truly dynamic area in SoC design today, and the approaches touched on above only scratch the surface of what is happening in this space. With abstraction on the rise, how embedded software manages memory will be an interesting space to watch.