A New Breed Of Engineer

The industry needs a new breed of engineer who understands both hardware and software, and not just for firmware. Co-design has failed; we need co-engineers.

The industry loves to move in straight lines. Each generation of silicon is more or less a linear extrapolation of what came before. There are many completely justifiable reasons for this: products continue to evolve, adding new or higher-performance interfaces; risk is lower when the minimum amount is changed for any chip spin; existing software is more likely to run with only minor modification; and so on.

This behavior tends to stall the industry, locking it into ways of doing things that were cemented at its dawn. As the old saying goes, nobody ever got fired for buying IBM. The same has been true for system architectures, for basic building blocks, for programming paradigms, and more.

Consider memory. It should come as no surprise that it is tough for anyone to produce a memory that is better than DRAM. Billions of dollars have gone into optimizing these devices, while competing memories have seen a trickle of funding, a few research grants, perhaps a startup giving it a try. Even the mainstream fabs are often unwilling to lend them an ear until demand has been demonstrated, and it is difficult to create that demand when the devices cannot be economically produced. The classic Catch-22.

Moore’s Law has cemented this behavior. The cadence of new production geometries has provided companies with the ability to produce faster chips without having to try, often at lower power and lower cost. Adequate technologies are good enough when there is so much to be had for free. This in turn has created a highly lazy software industry that can expect to be given more processing power and more memory every few months. Why optimize anything?

The industry is about to reach an inflection point, because there is little to be gained from new manufacturing technologies anymore, except for a few rich companies and applications. For most of the industry, the free ride is over, and they will have to find new ways to get more from the same.

We are seeing potential opportunities everywhere. GPUs successfully forced a programming paradigm shift because, without it, it was impossible to get anywhere close to optimal performance from the hardware. We are seeing similar things happen with AI across the board, with even system companies coming in and designing their own devices because they can find nothing on the market that comes close to satisfying their needs. They are also creating new languages, programming paradigms, and tool chains to support them.
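
To make the nature of that shift concrete, here is a minimal sketch (not from the article; the vector-add kernel, buffer sizes, and launch parameters are purely illustrative choices) of the data-parallel style the GPU imposed: the computation is written per element, the work-to-thread mapping is chosen by the programmer, and a separate device memory space is managed explicitly.

// Minimal CUDA sketch of the data-parallel style GPUs imposed: the work is
// written as a per-thread kernel rather than a sequential loop, and the
// programmer manages a separate device memory space explicitly.
#include <stdio.h>
#include <stdlib.h>
#include <cuda_runtime.h>

// Each thread computes a single element; the hardware supplies the index.
__global__ void vector_add(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        c[i] = a[i] + b[i];
    }
}

int main(void) {
    const int n = 1 << 20;                 // illustrative problem size
    const size_t bytes = n * sizeof(float);

    float *ha = (float *)malloc(bytes);    // host buffers
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    float *da, *db, *dc;                   // device buffers: a separate memory space
    cudaMalloc((void **)&da, bytes);
    cudaMalloc((void **)&db, bytes);
    cudaMalloc((void **)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // The programmer, not a compiler, decides how the work maps onto threads.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    vector_add<<<blocks, threads>>>(da, db, dc, n);

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);          // expect 3.0

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}

Getting anywhere near peak performance then requires further hardware awareness, such as memory coalescing, occupancy, and shared memory, which is exactly the kind of knowledge being argued for here.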

We are seeing opportunities within all segments of the market for processing near memory. This ranges from small IoT devices to data centers. It is not difficult to make these devices and demonstrate how much better they are, but they come at the cost of requiring software changes.

We are seeing open-source hardware looking like a viable option for the first time, but the industry is perhaps not quite ready with the necessary ecosystem and tooling. In many cases no tools exist because the market for, say, processor verification was so small that it was not viable to produce tools for it. It will take time for the industry to catch up.

We are seeing systems where memory performance and interfaces get in the way of progress, because the industry sticks with a single contiguous, coherent memory system that is easier to program and requires no thought from the software.

We are seeing embedded FPGA fabrics become an essential part of AI processing. In many cases they can perform other acceleration functions much faster, and with less power, than a general-purpose CPU, but the software industry does not understand hardware and comes nowhere close to being able to write the “software” that would go into an FPGA.

We are seeing new product demands, such as security, that require an understanding not only of the hardware but also of the physics surrounding the devices, which can be tapped for side-channel data theft, and of sloppy software that can be monitored using techniques such as differential power analysis.
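
As one small, hedged illustration of what such hardware- and physics-aware coding can look like (the sketch below is not from the article; the byte values and function names are made up), consider comparing a secret value such as an authentication tag. The naive comparison exits at the first mismatch, so its timing depends on the secret; the branch-free version removes that data-dependent behavior. This addresses only the most obvious timing leak; resisting full differential power analysis takes further countermeasures such as masking.

#include <stdint.h>
#include <stdio.h>

/* Naive comparison: returns at the first mismatch, so execution time (and the
 * power/EM profile) depends on how many leading bytes of the secret match.
 * That data-dependent behavior is what side-channel attacks latch onto. */
static int leaky_equal(const uint8_t *a, const uint8_t *b, size_t len) {
    for (size_t i = 0; i < len; ++i) {
        if (a[i] != b[i]) return 0;       /* early exit leaks information */
    }
    return 1;
}

/* Hardware-aware version: always touches every byte and contains no
 * secret-dependent branches, so the timing no longer varies with the data.
 * Masking and similar countermeasures are still needed against full
 * differential power analysis; this only removes the most obvious leak. */
static int constant_time_equal(const uint8_t *a, const uint8_t *b, size_t len) {
    uint8_t diff = 0;
    for (size_t i = 0; i < len; ++i) {
        diff |= (uint8_t)(a[i] ^ b[i]);   /* accumulate differences, no branch */
    }
    return diff == 0;
}

int main(void) {
    const uint8_t secret[4] = { 0xde, 0xad, 0xbe, 0xef };   /* illustrative values */
    const uint8_t guess[4]  = { 0xde, 0xad, 0x00, 0x00 };
    printf("leaky: %d, constant-time: %d\n",
           leaky_equal(secret, guess, sizeof secret),
           constant_time_equal(secret, guess, sizeof secret));
    return 0;
}

The functional result is identical; the difference only matters to someone who understands what the hardware reveals while the code runs.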

The industry desperately needs software engineers to become more hardware-aware or, conversely, more hardware engineers who can write quality software. Software productivity is no longer just a matter of how many lines of code can be banged out; we need intelligent, more optimized software that is aware of the environment it is running on and impacting. We need a new generation of software skills.
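
As a concrete sketch of what "aware of the environment it is running on" can mean (illustrative only, not from the article; the matrix size is arbitrary), the two functions below compute the same sum over a large array, but only the second walks memory in the order the caches and prefetcher expect, which on typical hardware is several times faster.

#include <stdio.h>
#include <stdlib.h>

#define N 4096   /* illustrative matrix dimension */

/* Column-order walk over a row-major C array: successive accesses are
 * N*sizeof(int) bytes apart, so most accesses miss the cache and the
 * hardware prefetcher is defeated. */
static long long sum_column_order(const int *m) {
    long long sum = 0;
    for (size_t j = 0; j < N; ++j)
        for (size_t i = 0; i < N; ++i)
            sum += m[i * N + j];
    return sum;
}

/* Row-order walk: consecutive accesses fall in the same cache line, so the
 * caches and prefetcher are actually used. Same result, typically several
 * times faster for large N. */
static long long sum_row_order(const int *m) {
    long long sum = 0;
    for (size_t i = 0; i < N; ++i)
        for (size_t j = 0; j < N; ++j)
            sum += m[i * N + j];
    return sum;
}

int main(void) {
    int *m = (int *)calloc((size_t)N * N, sizeof *m);
    if (!m) return 1;
    m[0] = 1;   /* token data so the result is observable */
    printf("column order: %lld, row order: %lld\n",
           sum_column_order(m), sum_row_order(m));
    free(m);
    return 0;
}

Nothing changes functionally; the gain comes purely from respecting how the memory hierarchy behaves.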

If that happens, the rate of progress of the entire industry will accelerate. Not only will software stop wasting the resources that are there, but the hardware will be able to move forward much more quickly, knowing that new capabilities will not just sit there unused as they have in the past. It is time we moved not to hardware/software co-design, but to a single breed of engineer who understands both, even if they work more heavily in one domain or the other.

6 comments

Steve Hoover says:

I couldn’t agree more. Great article, Brian!

Jayaram N Mandyam (Jay) says:

Hi Mr. Brian Bailey,

True that !!! Good & Apt article !!! I am a Co-Engineer & hope to find a relevant career with a relevant company.

I hope that HR’s/relevant people from relevant companies read your article & my comment & contact me!

Thank you once again for your article.

Best Regards,
Jayaram N Mandyam (Jay)

Lewis Sternberg says:

Brian,
As ever, a great thought-provoking article. Thanks!
I agree that we are at an inflection point where the next phase will be characterized by the ability to build on the previous phase's gifts. That is, we can now create a hierarchy of interdependent systems with attendant emergent properties and vulnerabilities.
I agree that expanding engineers’ perspectives across engineering disciplines is necessary (HW/SW, Analog/Digital, Design/Verification, Silicon/Systems) — but not sufficient.
While engineers are a bright bunch, there is simply more information that needs to be mastered in a short timeframe than can be held in an individual mind.
I postulate that the additional skills that engineers need are not within the typical STEM curriculum. We need the difficult “soft” skills to be able to work and think together across the various disciplines. Further, I’d say that the typical set of “soft” skills as taught will need adaptation to the exacting work of design.

Rachit Jain says:

I’m a student currently studying both computer science and computer engineering and have a really deep interest in the exact interface between hardware and software. However, I’m having trouble finding a path to get to a position where I could do so. How would a student get this kind of position? What kind of job postings or companies should I look for?

Stan A Rothwell says:

“The industry desperately needs software engineers to start becoming more hardware-aware, or conversely more hardware engineers being able to write quality software.”

Can’t argue with that, but good luck convincing the HR types of that. Their attitude seems to be that you have to be highly specialized in one field, it takes 10+ years of experience to get there, and if your experience is too “out of the box” for them, they reject your resume for someone else who has added all the prerequisite buzzwords…

Theodore Wilson says:

Another great article Brian! Thank you!

I have spent a lot of time at system level and I think it is easy to be too busy dealing with givens to innovate. System level teams understand a lot and can see a lot of things to optimize but these teams are also critical to day to day operations and the opportunity cost of their focused attention is about as high as it gets in the engineering teams.

I suspect that CI/CD and project dashboards can break the logjams by letting large heterogeneous teams self-manage to better outcomes and identify, communicate, and track high-value targets.

An interdisciplinary tiger team, or similar, working on a late-discovered and critical issue always seems to solve the problem in a shorter time frame.

Is the problem then a lack of or lack of attention to indicators of problems in the product?

Not sure but I suspect this is the core problem.
