Extreme self-driving; neurohackathon; hyperspectral camera-equipped drone.
Controlling autonomous vehicles in extreme conditions
In an approach that could help make self-driving cars of the future safer under hazardous road conditions, a Georgia Institute of Technology research team devised a way to help keep a driverless vehicle under control as it maneuvers at the edge of its handling limits.
According to the team, composed of researchers from Georgia Tech’s Daniel Guggenheim School of Aerospace Engineering and the School of Interactive Computing, the technique uses advanced algorithms and onboard computing, in concert with installed sensing devices, to increase vehicular stability while maintaining performance. It was evaluated by racing, sliding, and jumping one-fifth-scale, fully autonomous auto-rally cars at the equivalent of 90 mph.
Traditional robotic-vehicle techniques use the same control approach whether a vehicle is driving normally or at the edge of roadway adhesion, the team explained. Their method, known as model predictive path integral control (MPPI), was instead developed specifically to address the non-linear dynamics involved in controlling a vehicle near its friction limits.
Aggressive driving in a robotic vehicle – maneuvering at the edge – is a unique control problem involving a highly complex system. By merging statistical physics with control theory, and utilizing leading-edge computation, the researchers created a new perspective, and a new framework, for the control of autonomous systems.
Specifically, the Georgia Tech researchers used a stochastic trajectory-optimization capability, based on a path-integral approach, to create their MPPI control algorithm. Then, using statistical methods, the team integrated large amounts of handling-related information, together with data on the dynamics of the vehicular system, to compute the most stable trajectories from myriad possibilities. Processed by the high-power GPU that the vehicle carries, the MPPI control algorithm continuously samples data coming from GPS hardware, inertial motion sensors, and other sensors. The onboard hardware-software system performs real-time analysis of a vast number of possible trajectories and relays optimal handling decisions to the vehicle moment by moment. In essence, they said, the MPPI approach combines both the planning and execution of optimized handling decisions into a single highly efficient phase, and is regarded as the first technology to carry out this computationally demanding task; in the past, optimal-control data inputs could not be processed in real time, they noted.
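The sampling-and-weighting idea behind MPPI can be illustrated in miniature. The team's real system rolls out thousands of trajectories on a GPU against a learned vehicle model; the sketch below is only a toy NumPy version on a hypothetical 1-D double-integrator "vehicle," with illustrative cost, horizon, and noise parameters of our choosing. Each iteration samples perturbed control sequences, scores the resulting trajectories, and averages the perturbations with softmax weights so that low-cost trajectories dominate the update.

```python
import numpy as np

def mppi_step(x0, U, dynamics, cost, K=300, lam=1.0, sigma=0.5):
    """One MPPI update: sample K perturbed control sequences, roll each
    out through the dynamics, and blend the perturbations with softmax
    weights based on total trajectory cost."""
    T, m = U.shape
    noise = sigma * np.random.randn(K, T, m)   # control perturbations
    costs = np.zeros(K)
    for k in range(K):
        x = x0.copy()
        for t in range(T):
            x = dynamics(x, U[t] + noise[k, t])
            costs[k] += cost(x)
    w = np.exp(-(costs - costs.min()) / lam)   # subtract min for stability
    w /= w.sum()
    # Weighted average of the sampled perturbations updates the controls.
    return U + np.einsum('k,ktm->tm', w, noise)

# Toy plant: 1-D point mass, control is acceleration.
def dynamics(x, u, dt=0.1):
    pos, vel = x
    return np.array([pos + vel * dt, vel + u[0] * dt])

# Cost: reach position 5 and stop there.
def cost(x):
    return (x[0] - 5.0) ** 2 + 0.1 * x[1] ** 2

np.random.seed(0)
U = np.zeros((30, 1))                 # nominal control sequence (horizon 30)
x = np.array([0.0, 0.0])              # start at rest at the origin
for _ in range(50):                   # receding-horizon loop
    U = mppi_step(x, U, dynamics, cost)
    x = dynamics(x, U[0])             # apply only the first control
    U = np.roll(U, -1, axis=0)        # shift the horizon forward
    U[-1] = 0.0
print(round(x[0], 2))                 # final position, should be near the target
```

The key property this shares with the real controller is that planning and execution collapse into one step: every control cycle re-samples trajectories from the current state and immediately applies the first action of the updated plan.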
Cracking the structure of neural data and the brain
To engage computer scientists with one of the hardest systems to crack – the structure of neural data and the brain – Carnegie Mellon University’s BrainHub will host its first Neurohackathon today and tomorrow.
The brain has billions of neurons and trillions of synapses, making it an excellent source of big data, and BrainHub researchers from across CMU’s Pittsburgh campus have collected vast amounts of information from the brain using techniques including MRI and electrophysiological recordings. Through this hackathon, they said they hope to develop new methods for analyzing and understanding this data and foster new collaborative relationships between neuro-, data and computer scientists.
During the hackathon, graduate students will be given data sets gathered from the labs of neuroscience researchers in the College of Engineering, Dietrich College of Humanities and Social Sciences, Mellon College of Science, and School of Computer Science, and asked to develop solutions for analyzing the data.
Advanced agricultural drone
In order to give farmers very precise information on the health of their crops, EPFL spin-off Gamaya has created a drone that combines a miniature hyperspectral camera and artificial intelligence.
The automatic aerial crop-monitoring system uses a hyperspectral camera affixed to a drone to detect a wide range of information, including seed type, stage of growth, hydration level, parasites, and diseases. The aerial views can also be used to cut costs by ensuring that herbicides, pesticides, fertilizers, and other treatments are applied sparingly, only to the areas that need them.
Gamaya developed a software program that maps the nuances of the spectrum detected by the drone onto colors visible to the human eye. Each problem has a color, which means farmers can analyze their crops in detail on their computer screen. The system also provides advice – such as how much fertilizer to use – and yield projections.
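Gamaya's software is proprietary, but the general idea of mapping spectral nuances to farmer-readable colors can be sketched. A common approach (an assumption here, not Gamaya's published method) is to compute a per-pixel vegetation index from selected bands of the hyperspectral cube and then assign each index value a color on a ramp; band positions and the red-to-green ramp below are illustrative.

```python
import numpy as np

def ndvi(cube, red_band, nir_band):
    """Normalized Difference Vegetation Index from a hyperspectral cube
    of shape (height, width, bands). Values near 1 suggest healthy canopy."""
    red = cube[..., red_band].astype(float)
    nir = cube[..., nir_band].astype(float)
    return (nir - red) / (nir + red + 1e-9)   # epsilon avoids divide-by-zero

def to_color(index):
    """Map index values in [-1, 1] onto a simple red-to-green RGB ramp,
    so stressed areas show red and healthy areas show green."""
    t = np.clip((index + 1.0) / 2.0, 0.0, 1.0)
    rgb = np.stack([1.0 - t, t, np.zeros_like(t)], axis=-1)
    return (rgb * 255).astype(np.uint8)

# Tiny synthetic cube: 2x2 pixels, 10 bands; band indices are placeholders
# for wherever the camera's red and near-infrared channels actually fall.
cube = np.random.randint(0, 255, size=(2, 2, 10))
img = to_color(ndvi(cube, red_band=3, nir_band=7))
print(img.shape)  # → (2, 2, 3): each pixel now carries a visible RGB color
```

A real pipeline would calibrate reflectance, pick bands matched to the camera, and use separate indices (or a trained model) per problem class, so that each agronomic issue gets its own color layer as the article describes.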