Role For ICs Expands In Humanoid Robots

Robots are pushing well beyond traditional factory automation, and they are starting to look and behave very differently.

Semiconductors play a crucial role in the development and functionality of humanoid robots, which are advanced machines designed to resemble humans and perform human-like tasks. The integration of semiconductors contributes to their sensory perception, processing capabilities, and overall functionality.

Robots are used in everything from security and defense to industrial and manufacturing operations, entertainment, warehousing, retail, and health care. Despite those very different applications, many of the underlying technologies are similar. Performance, precision, and productivity are critical for each of them, and all depend on semiconductors for sensory perception, processing, and overall functionality. In addition, robotic development draws on a combination of technologies, including autonomous mechanisms, embedded vision (video capture, facial recognition, and real-time image processing), sensors, and real-time data processing with AI.

All of these technologies have seen significant advancements. For example, in some countries retailers have started using mobile robots to meet and greet customers. Customers can communicate verbally with these walking robots, and in banks they can use them to make deposits as well as withdrawals. Some restaurants use robots to deliver meals to customers at their tables, relying on a combination of sensors, cameras, and automation technology to perform these functions accurately while moving freely and avoiding obstacles.

Fig. 1: Robots are being used to deliver food and clear tables in restaurants.

While much of what is happening today would have been considered science fiction a decade ago, robots have played a role in some industries for many years. Factories have long used robotic arms to assemble vehicles and assist human workers, and data storage facilities have used robots to load tapes and compact disks.

“Besides cutting labor costs, automation significantly shrinks the scope for errors, thereby maximizing yield and reducing the need for rework,” said Prakash Madhvapathy, director of product marketing for Tensilica audio/voice DSPs at Cadence. “The U.S. is compelled to resort to factory automation to compete with manufacturing efficiency giants in Asia, where labor is inexpensive, and processes have been fine-tuned over long periods of time. With smaller products, such as PCB manufacturing, intervention is a bad word. Hence, U.S.-based factories look to automate as much of the process as possible — raw material in, final assembly out. Factories can run 24 x 7 at near maximal throughput, putting the U.S. back on the global manufacturing map and obviating the need to outsource such manufacturing.”

For larger systems, such as automobile manufacturing, a hybrid approach makes more sense, judiciously intermingling robotic assembly with manual assembly so as not to get tied up in an unmanageable labyrinth of pure automation. “Automation for the sake of automation is avoided, with overall efficiency being the goal,” Madhvapathy said. “Automation can be hard to introduce in an established factory, as it requires substantial plant restructuring, which can be highly disruptive. For new factories, though, automation can be more easily built from the ground up, as no existing pipeline needs to be uprooted or displaced while maintaining production continuity.”

On the customer service front, companies are using chatbots to answer calls, and robots to deliver food and supplies on college and hospital campuses. Most of those robots are either stationary, half-height humanoid machines or machines on wheels. In recent years, however, the development of more human-like robots, or humanoids, has been gaining momentum as businesses try to further increase productivity and come up with new ways to serve customers. These humanoids are expected to be used in factories, warehouses, hospitals, and home health care, as well as serving as domestic helpers and greeting guests in hotels.

Applications of humanoids
So what technologies are required for a humanoid to operate? And where will the evolution of humanoid-supporting innovations lead us?

Robotic arms of all sizes are in widespread use in factories to assemble vehicles, electronic devices, and many consumer products. In recent years, a hybrid model has emerged in which robotic arms act as co-pilots to the human workers who control them. However, many of these robotic arms have limitations. Usually they are programmed to perform repetitive tasks, such as picking up and dropping off components or assembling vehicle chassis.

Ultimately, developers want to create more versatile, human-like robots (humanoids) to emulate what humans can do. Today, humanoids can handle many different tasks, such as simple cooking, cleaning, and some household chores. They also can move products around or lift heavy objects in a warehouse, greet customers at a hotel or a bank, provide home health care and assistance in hospitals, and much more.

Among the humanoid applications:

Warehouse assistance
Agility Robotics’ Digit humanoids stand 5’9” tall, weigh 141 lbs., and can bend their legs to pick up, carry, and put down objects of different sizes up to 35 lbs. They currently are being tested in an Amazon fulfillment center. Much like a human, the humanoid uses its arms, hands, and feet to balance itself when accidentally bumped by, say, a human worker. When it senses an object in its path, it knows how to avoid it. It also can operate 16 hours a day, and when it needs charging, it autonomously returns to a docking station. The cost of renting this type of humanoid is estimated to be between $15 and $20 an hour.

Apptronik has also demonstrated its humanoid for industrial applications. Its design allows the unit to function on any mobility platform or be fully mobile with its own legs. The unit is 5’8” tall, weighs 160 lbs., and can carry 55 lbs. Its battery pack provides four hours of operation.

“The productivity gains from robotic integration vary by industry and specific application, but they are invariably significant,” said Jonathon Wright, chief technologist for AI augmented testing and R&D at Eggplant (a Keysight company). “Robots excel in repetitive, precision-based tasks, increasing output consistency and quality while reducing human error. For instance, in manufacturing, automation can increase productivity up to the 20% to 30% range. This is not just about replacing human labor. It’s about augmenting it with robotic precision and endurance.”

Versatile assistant
Atlas, a humanoid from Boston Dynamics, is capable of jumping, turning around, and gripping and dropping off objects. It can even balance itself on one leg or do a back flip. To help navigate its environment, it uses lidar and stereo sensors to observe obstacles and terrain. Atlas is 4’11” tall and weighs 180 lbs. The demo version is equipped with a battery that lasts up to one hour. The company also has a large library of software available to program the humanoid’s routines.

Agibot also recently released a 5’7” tall humanoid weighing 117 lbs. Initially, it was designed for factory assembly line tasks, such as tightening bolts, handling tools, and conducting inspections. The company also is developing a humanoid household helper to perform chores such as preparing food and cooking, pouring water, and even providing elder care. Given the shortage of home health care personnel, this type of humanoid may have great potential.

Humanoid design requirements
Humans are versatile. For example, we have many degrees of freedom, moving left, right, forward, and backward. Our fingers are sensitive to touch. With hand-eye coordination, humans can learn about the objects they touch and the environment around them, absorbing knowledge such as not touching hot objects on the stove and staying away from a barking dog. For humanoid technology developers, everything from vision and touch sensing to mechanical and motor control, including balancing, must be considered.

The primary sensors used in humanoids are cameras and lidar. Radar is not used as much, but potentially it can add capability. Today, 2D lidar can detect distance. In contrast, 3D lidar — which uses multiple beams and depends on a sensor network — can paint a 3D picture of the real environment in which humanoids operate. The fingers and hands of humanoids will be equipped with pressure sensors so they know when objects are being touched. Many of these advancements are made possible by embedded vision technology.
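The difference between 2D and 3D lidar comes down to geometry: a 2D unit reports ranges along a single scan plane, while a 3D unit adds elevation beams and stacks the resulting planes into a point cloud. A minimal sketch of that conversion, with all scan parameters hypothetical:

```python
import math

def lidar_to_points(ranges, azimuths_deg, elevation_deg=0.0):
    """Convert lidar range readings (meters) at given azimuth angles
    (degrees) into Cartesian (x, y, z) points. A 2D scan uses a single
    elevation of 0; a 3D unit repeats this for each elevation beam."""
    elev = math.radians(elevation_deg)
    points = []
    for r, az_deg in zip(ranges, azimuths_deg):
        az = math.radians(az_deg)
        x = r * math.cos(elev) * math.cos(az)
        y = r * math.cos(elev) * math.sin(az)
        z = r * math.sin(elev)
        points.append((x, y, z))
    return points

# A 2D scan sees only one plane; stacking elevation beams yields a 3D picture.
plane = lidar_to_points([2.0, 2.0], [0.0, 90.0])
upper = lidar_to_points([2.0, 2.0], [0.0, 90.0], elevation_deg=15.0)
```

Stacked over many elevation angles, these points form the 3D picture of the environment the article describes.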

“Embedded vision technology has made remarkable strides, integrating powerful image processing capabilities directly into devices,” Wright said. “This advancement allows machines and systems to ‘see’ and interpret their surroundings, leading to applications in quality inspection, navigation, and even interactive consumer products. The ongoing miniaturization and cost reduction of these systems are making them accessible for a wider range of applications.”

Embedded vision relies on cameras, radar, and lidar. Camera technology, especially in the context of automation and robotics, has evolved beyond mere image capturing. Modern cameras now incorporate AI-driven processing capabilities, enabling them to perform complex tasks like real-time object detection, identification, and tracking. This evolution is crucial for applications such as autonomous vehicles, where safe navigation depends on real-time image analysis.
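The real-time detection loop in such cameras runs trained neural networks, but the basic pattern of comparing frames and localizing a changed region can be shown with a toy stand-in. The sketch below uses simple frame differencing, not a production detector, and all thresholds are hypothetical:

```python
import numpy as np

def detect_motion(prev_frame, frame, threshold=30):
    """Toy stand-in for camera-side detection: flag pixels whose
    brightness changed by more than `threshold` between two grayscale
    frames, and report the bounding box of the changed region (or None).
    Real embedded-vision systems run trained neural networks instead."""
    diff = np.abs(frame.astype(np.int16) - prev_frame.astype(np.int16))
    mask = diff > threshold
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return (xs.min(), ys.min(), xs.max(), ys.max())  # (x0, y0, x1, y1)

# Synthetic 8-bit frames: a bright "object" appears at rows 10-19, cols 20-29.
prev = np.zeros((64, 64), dtype=np.uint8)
curr = prev.copy()
curr[10:20, 20:30] = 200
box = detect_motion(prev, curr)  # bounding box of the new object
```

A real detector would add classification ("what is it?") and tracking across frames, but the output contract is similar: a region of interest per frame, produced fast enough for real-time navigation.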

Radar technology in automation has expanded beyond traditional uses, as well. With advancements in signal processing and AI, modern radar systems offer improved detection and measurement capabilities. They increasingly are used in conjunction with other sensors to provide more comprehensive environmental awareness, particularly in automotive and industrial applications.

Lidar technology, meanwhile, has become a cornerstone in applications requiring precise depth perception, like autonomous driving and aerial mapping. The recent advancements in solid-state lidar have led to more compact, reliable, and cost-effective solutions, greatly expanding their potential applications.

Motion control
In an industrial environment, humanoids would be able to walk at a speed of 3 mph, faster than most humans in the same setting. Most humanoids are capable of lifting a payload of 30 to 60 pounds. For larger payloads, the humanoid size would have to increase, even to the point of “giant” humanoids for payloads beyond hundreds of pounds. Humanoids meant for dancing, playing soccer, or other activities with quick movements will be equipped with multi-axis motion sensors for sensing and balancing.
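The balancing problem behind those multi-axis motion sensors is classically modeled as an inverted pendulum driven by a feedback controller. A single-axis sketch, with a linearized model and purely illustrative PID gains:

```python
def simulate_balance(theta0=0.1, dt=0.01, steps=500,
                     kp=40.0, ki=5.0, kd=10.0, g_over_l=9.81):
    """Single-axis balance sketch: a PID controller drives the tilt
    angle of a linearized inverted pendulum (theta'' = (g/L)*theta + u)
    back toward vertical. Model and gains are illustrative only."""
    theta, theta_dot, integral = theta0, 0.0, 0.0
    for _ in range(steps):
        integral += theta * dt
        u = -(kp * theta + kd * theta_dot + ki * integral)  # control torque
        theta_ddot = g_over_l * theta + u   # gravity destabilizes, u corrects
        theta_dot += theta_ddot * dt        # simple Euler integration
        theta += theta_dot * dt
    return theta

final_tilt = simulate_balance()  # tilt settles close to zero after 5 seconds
```

A humanoid runs a controller like this per axis, fed by the fused output of its motion sensors, with quick movements demanding higher loop rates and more sophisticated whole-body control.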

Fig. 2: Agility demo of robot. Source: Semiconductor Engineering/Viva Technology

While basic skills and movements can be pre-programmed, it is much more useful to apply AI in humanoids so they can learn as they go. The challenge here is that each type of environment or task requires different training models. For example, teaching a human being mathematical skills is different from teaching them to sing. In the foreseeable future, both software libraries and AI training will be required. Additionally, the trend in software development is no-code programming, or using AI to generate the code once the task is defined. As in the case of Boston Dynamics’ dancing humanoids, a large library of code will be available for developers.

“In robotic applications, many of the actions need to be deterministic,” explained David Fritz, vice president of hybrid and virtual systems at Siemens EDA. “To accomplish this would require additional hardware specially designed to do AI inferencing properly. The training of the neural networks happens offline or in the virtual domain. The results of the final neural network are captured, say, in a flash drive, and then used in the actual physical robots with deterministic behavior. You can measure its power, performance, latency, and all those sorts of things. If it’s my job to design an efficient neural network that can take training data, and always give me the right output for every possible combination of inputs, you could do that without writing the code. The code itself is embodied in the network itself. You know how many layers, how wide, and the ways different multiply and accumulate operations — things like that. What does this new hardware do? We started off with video GPUs and went to DSPs with vector capabilities. Software development without writing code is the trend.”
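The deterministic inference Fritz describes can be illustrated with a toy numpy sketch. The weights below are hypothetical stand-ins for a network trained offline; in a real robot they would be loaded from flash rather than defined inline:

```python
import numpy as np

# Hypothetical weights "captured" after offline training.
W1 = np.array([[0.5, -0.2], [0.1, 0.8], [-0.3, 0.4]])  # 3 inputs -> 2 hidden
B1 = np.array([0.1, -0.1])
W2 = np.array([[1.0], [-1.0]])                          # 2 hidden -> 1 output
B2 = np.array([0.05])

def infer(sensor_inputs):
    """Deterministic inference: with weights fixed at deployment time,
    the same inputs always produce the same output, so power, latency,
    and behavior can be measured and verified."""
    hidden = np.maximum(0.0, sensor_inputs @ W1 + B1)   # ReLU layer
    return hidden @ W2 + B2

x = np.array([0.2, -0.5, 1.0])  # e.g., normalized sensor readings
y = infer(x)
```

No application code is written for the mapping itself; as the quote puts it, the "code" is embodied in the network's layers and multiply-accumulate operations.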

Cadence’s Madhvapathy agrees that software development is widely acknowledged to be the long pole in the tent when it comes to time to market. “Application developers under constant pressure would love for tools to do more of the work wherever automation can be achieved,” he said. “This demand has created a trend that is leaning toward no-code development. How does one achieve that? Non-factory-automation domains use graphical, schematic capture-like tools such as MATLAB/Simulink. With these tools, different functions can be drag-dropped and connected to create a software flow. The graphical tool would include a code-generator that can output platform-agnostic code or optimized code for a particular platform. Further, by enabling the compiler and the signal processing elements (DSPs) in the platform to auto-vectorize, the tools can make short work of array processing to maximally utilize the DSPs. The software developer therefore has to focus less on coding optimization efforts and more on the application needs. While currently used in other domains, these new methods eventually will make their way into robotic applications.”

Humans generally have ethical concerns. Humanoids, on the other hand, are AI machines. Once they are hacked, they may be controlled by malware to cause disruption and even destruction. Imagine a future factory with hundreds of humanoid workers on the assembly line. In a distributed denial-of-service (DDoS) attack, the worst case would be humanoids that simply stop working. But if they have been taken over by malware, unpredictable actions may cause damage to property or harm to people. This makes cybersecurity critical in humanoid design.

“Robotics automation development, as well as deployment, are certainly targets for cybersecurity attacks,” said Bart Stevens, senior director of product marketing at Rambus. “Robotic functionalities can be simplified to regular computer systems with a focus on sensors and actuators. This means they face security problems comparable to what connected computer systems have experienced for many years. Robotics plays a crucial role in civilian, industrial, and military sectors, such as health care, manufacturing, and disaster relief. Like regular computers, robotics not only face security vulnerabilities around confidentiality, integrity, and availability, but also trust, safety, accuracy, and misuse. Robotics developers must take safety, security, and accuracy requirements into account during system design and development to protect against cyberattacks. Robust and proven mechanisms for secure networking, updating, authentication, confidentiality, integrity, privacy, authorization, attestation, intrusion prevention, configuration, tamper resistance, safety designs, and self-healing are key.”

Design and development of robotics hardware and software starts with a proper threat assessment and remediation analysis (TARA) unique to each deployment and use case. “TARA is not only a method for identifying and assessing cyber vulnerabilities, but also for identifying proper security countermeasures for mitigation,” Stevens said. “Putting TARA into practice results in hardware and software solutions that provide data-at-rest, data-in-motion, and data-in-use protection mechanisms. Depending on the development cycle, ASICs, FPGAs, ASSPs, or general-purpose merchant silicon can be used.”
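One concrete data-in-motion countermeasure such an analysis might call for is authenticating every command sent to a robot. A minimal sketch using Python's standard-library `hmac`, with the command format and key handling purely hypothetical (a real design would keep the key inside a hardware root of trust, as Stevens notes, not in application code):

```python
import hmac
import hashlib

# Hypothetical shared secret; in practice this lives in protected hardware.
SECRET_KEY = b"provisioned-at-manufacture"
TAG_LEN = 32  # HMAC-SHA256 digest size in bytes

def sign_command(command):
    """Append an HMAC-SHA256 tag so the robot can verify integrity
    and authenticity of an incoming command."""
    tag = hmac.new(SECRET_KEY, command, hashlib.sha256).digest()
    return command + tag

def verify_command(message):
    """Return the command if the tag checks out, else None."""
    command, tag = message[:-TAG_LEN], message[-TAG_LEN:]
    expected = hmac.new(SECRET_KEY, command, hashlib.sha256).digest()
    return command if hmac.compare_digest(tag, expected) else None

msg = sign_command(b"arm:lift:35lb")
ok = verify_command(msg)                              # accepted
bad = verify_command(b"arm:lift:99lb" + msg[-TAG_LEN:])  # tampered, rejected
```

This covers only integrity and authenticity of one message type; the confidentiality, attestation, and update mechanisms Stevens lists would layer additional protocols on top.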

He noted that developers need to select components that offer independent security-certified hardware root of trust for platform and software security and key management. “Selecting security protocol engines with built-in cryptographic accelerators ensures secure data storage and data communication capability. Depending on the product lifetime, quantum-safe cryptography should be implemented next to proven classical cryptography mechanisms.”

Further, because memories are among the key components in electronic devices, including humanoids, it is critical to protect the data they store from cyberattacks.

Looking ahead
While humanoids are at an early stage of development and deployment, major manufacturers such as BMW are eager to try them out. In January 2024, BMW and humanoid robot company Figure signed a commercial agreement to develop general-purpose robots for automotive manufacturing. The idea is to deploy autonomous humanoids to perform tasks that are difficult, tedious, or unsafe throughout the manufacturing process, supporting human workers and improving overall production efficiency and safety. Other OEMs are expected to follow.

As humanoids continue to improve, as in the case of Tesla’s generation 2 Optimus, they are expected to develop new skills beyond what humans can achieve. Current improvements include increased range of motion in the neck joint, along with improved tactile sensing on all fingers, and more delicate control, such as picking up an egg without breaking it. Other institutions are conducting similar research in the area of tactile sensing.

In the future, “super” humanoids with additional built-in sensors will sense their environments better than human beings can. Already, robots have been shown to achieve precision far beyond humans. In robotic surgery, for example, micrometer-level precision can be achieved.

The demand for humanoids will in turn drive the need for smarter, smaller, and faster chips. And with the development of AI chips and chiplets, more capabilities can be squeezed into smaller packages within the humanoids.

“Currently, the demand for automation/robotics is not high enough to justify bespoke silicon development,” Cadence’s Madhvapathy said. “Industrial Internet of Things (IIoT) SoCs, along with consumer and automotive SoCs, are being retargeted to this adjacent market to fulfill the limited demand. Many of these chips are AI-capable, as the robotic use cases cannot be satisfactorily met with traditional compute algorithms. The progress of automation in factories lags that of consumer and automotive areas. So for the near future, robotics will rely on market-ready SoCs from these fields. Such SoCs are most certainly gaining capabilities — sometimes exponentially — generation after generation. Factory automation will inherit these advances by virtue of adopting those SoCs, leading to more sophistication. As the concept of Factory 4.0 gains a strong foothold, factory automation may come into its own, eventually requiring silicon that other markets can’t provide.”

Finally, as AI becomes an integral part of robotic systems, Keysight’s Wright predicts the deployment of AI-specific chips (AI accelerators) will become more widespread. “These chips are designed to efficiently handle machine learning algorithms, and are crucial for tasks like real-time data processing, pattern recognition, and autonomous decision-making in robotics. The future of AI chips lies in their ability to facilitate on-device processing, reducing the reliance on cloud-based systems and enabling more responsive and adaptive robotic behavior.”

Further Reading
Solving The Last-Mile Delivery Problem
Autonomous roadside delivery robots may increase operational efficiency, accuracy, and customer satisfaction.
Machine Vision Plus AI/ML Adds Vast New Opportunities
But to fully realize its potential, MV must boost performance and keep pace with changing security and market needs.
