System Bits: July 3

VW emissions cheat; parallel processing speedup; DNA mini-machines.

VW emissions tests cheat code found
A team of researchers from UC San Diego and Ruhr University Bochum, along with an independent researcher, has uncovered the mechanism Volkswagen used to circumvent U.S. and European emissions tests for at least six years before the EPA put the company on notice in 2015 for violating the Clean Air Act.

The researchers found the code that allowed onboard vehicle computers to determine that the vehicle was undergoing an emissions test, at which point the computer activated the car’s emission-curbing systems, reducing the amount of pollutants emitted. Once the computer determined that the test was over, those systems were deactivated.
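The study does not reproduce Volkswagen’s code, which runs inside ECU firmware, but the general pattern the researchers describe can be illustrated with a deliberately simplified, hypothetical Python sketch: recognize the speed/time trace of a standardized test cycle, and only then run the emissions-curbing systems at full strength. The profile values, tolerance, and function names below are invented for illustration.

```python
# Hypothetical illustration only; not VW's actual ECU code. It shows the
# general "defeat device" pattern the researchers describe: recognize a
# standardized test cycle, and only then enable full emissions controls.

# First few (time_s, speed_kmh) points of a test cycle; values invented here.
TEST_CYCLE_PROFILE = [(0, 0), (10, 15), (20, 32), (30, 35), (40, 50)]
TOLERANCE_KMH = 3.0  # assumed matching tolerance

def looks_like_emissions_test(drive_log):
    """Return True if the drive so far tracks the test-cycle speed trace."""
    if len(drive_log) < len(TEST_CYCLE_PROFILE):
        return False
    for (_, expected), (_, actual) in zip(TEST_CYCLE_PROFILE, drive_log):
        if abs(expected - actual) > TOLERANCE_KMH:
            return False
    return True

def engine_control_step(drive_log):
    if looks_like_emissions_test(drive_log):
        return "full emissions controls"   # on the dyno: pass the limits
    return "reduced emissions controls"    # on the road: far more NOx

# A drive matching the test profile triggers the test-mode behavior.
print(engine_control_step([(0, 0), (10, 14), (20, 33), (30, 36), (40, 49)]))
```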

Startlingly, when the emissions-curbing system wasn’t running, cars emitted up to 40 times the amount of nitrogen oxides allowed under EPA regulations.

The team was led by Kirill Levchenko, a computer scientist at UC San Diego. (Source: UCSD)

Interestingly, the team said they obtained copies of the code running on Volkswagen onboard computers from the company’s own maintenance website and from forums run by car enthusiasts, for a wide range of models, including the Jetta, Golf and Passat, as well as Audi’s A and Q series.

Levchenko said, “We found evidence of the fraud right there in public view.”

The researchers have extensive experience analyzing embedded systems, such as the onboard computers in cars — known as Engine Control Units — for vulnerabilities. They said they examined 900 versions of the code and found that 400 of those included information to circumvent emissions tests.
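The researchers’ actual analysis relied on purpose-built static analysis of ECU firmware. Purely as a loose, hypothetical illustration of what triaging 900 firmware images involves, a bulk sweep for a known byte signature could be sketched as follows; the signature and directory layout are invented.

```python
# Hypothetical triage sketch, not the researchers' actual tooling (they
# performed purpose-built static analysis of ECU firmware). The byte
# signature and directory layout below are invented for illustration.
from pathlib import Path

SUSPECT_SIGNATURE = bytes.fromhex("deadbeef")  # placeholder byte pattern

def scan_firmware_images(firmware_dir):
    """Flag firmware images that contain the suspect byte pattern."""
    flagged = []
    for image in sorted(Path(firmware_dir).glob("*.bin")):
        if SUSPECT_SIGNATURE in image.read_bytes():
            flagged.append(image.name)
    return flagged

# e.g. scan_firmware_images("ecu_dumps/") might flag 400 of 900 images.
```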

“The Volkswagen defeat device is arguably the most complex in automotive history,” Levchenko added.

The team found that for both Volkswagen and Fiat, the vehicles’ Engine Control Units are manufactured by automotive component giant Robert Bosch. They noted that this highlights the regulatory challenge of verifying software-controlled systems that may try to hide their behavior, and calls for a new breed of analysis techniques that work in an adversarial setting.

Large speedups for common parallel-computing algorithms
The chips in most modern desktop computers have four cores, or processing units, which can run different computational tasks in parallel. But the chips of the future could have dozens or even hundreds of cores, and taking advantage of all that parallelism is a stiff challenge, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory pointed out.

Along these lines, to make parallel programs run more efficiently and be easier to write, the MIT researchers have developed Fractal, a system that they have shown enables 10-fold speedups over existing systems.

Specifically, the team said that in tests on a set of benchmark algorithms that are standard in the field, Fractal frequently enabled more than 10-fold speedups over existing systems that adopt the same parallelism strategy, with a maximum of 88-fold.

The Fractal system achieves up to 88-fold speedups through a parallelism strategy known as speculative execution. (Source: MIT)

They gave the example of algorithms for solving an important problem called max flow, which has proven very difficult to parallelize. They pointed out that after decades of research, the best parallel implementation of one common max-flow algorithm achieves only an eightfold speedup when run on 256 parallel processors. Fractal enabled a 322-fold improvement, with one-third as much code.

Fractal achieves those speedups through a parallelism strategy known as speculative execution.

Daniel Sanchez, an assistant professor of electrical engineering and computer science at MIT said, “In a conventional parallel program, you need to divide your work into tasks. But because these tasks are operating on shared data, you need to introduce some synchronization to ensure that the data dependencies that these tasks have are respected. From the mid-90s to the late 2000s, there were multiple waves of research in what we call speculative architectures. What these systems do is execute these different chunks in parallel, and if they detect a conflict, they abort and roll back one of them.”

Constantly aborting computations before they complete would not be a very efficient parallelization strategy, he said. But for many applications, aborted computations are rare enough that they end up squandering less time than the complicated checks and updates required to synchronize tasks in more conventional parallel schemes.
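Fractal itself is a hardware architecture, so the following is only a software analogy of the execute, detect-conflict, and roll-back cycle Sanchez describes, not the real system. In a genuinely concurrent setting the version check can fail and force a retry; the names here are hypothetical.

```python
# Toy software analogy of speculative execution: run a task optimistically
# on a snapshot of shared data, commit if nothing conflicted, otherwise
# discard (roll back) the work and retry. Not MIT's Fractal hardware.

class VersionedCell:
    """A shared datum with a version counter used for conflict detection."""
    def __init__(self, value):
        self.value = value
        self.version = 0

def run_speculatively(cell, task):
    """Run `task` on a snapshot; commit only if no conflicting write occurred."""
    while True:
        snap_value, snap_version = cell.value, cell.version
        result = task(snap_value)            # speculative work on the snapshot
        if cell.version == snap_version:     # no conflict: commit the result
            cell.value = result
            cell.version += 1
            return result
        # Conflict detected: abort the speculative work and retry.

counter = VersionedCell(0)
run_speculatively(counter, lambda v: v + 1)  # counter.value is now 1
```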

In fact, last year Sanchez’s group reported a system, called Swarm, that extended speculative parallelism to an important class of computational problems that involve searching data structures known as graphs.

However, research on speculative architectures has often run aground on the problem of “atomicity.” Like all parallel architectures, speculative architectures require the programmer to divide programs into tasks that can run simultaneously. But with speculative architectures, each such task is “atomic,” meaning that it should seem to execute as a single whole. Typically, each atomic task is assigned to a separate processing unit, where it effectively runs in isolation.

Switchable DNA mini-machines relay information
In a development that could lead to nanotech sensors or amplifiers, and potentially even to logic gates for a molecular computer, biomedical engineers at Georgia Tech, Emory University and Purdue University have built simple machines out of DNA, consisting of arrays whose units switch reversibly between two different shapes.

These ‘DNA machines’ can relay discrete bits of information through space or amplify a signal, according to Yonggang Ke, PhD, an assistant professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory. “In the field of DNA-based computing, the DNA contains the information, but the molecules are floating around in solution. What’s new here is that we are linking the parts together in a physical machine.”

The DNA arrays’ structures look like accordion-style retractable security gates. Extending or contracting one unit pushes nearby units to change shape as well, working like a domino cascade whose tiles are connected. The arrays’ inventors say they could be harnessed to make nanotech sensors or amplifiers. (Source: Emory University)

He noted that several laboratories have already made nanotech machines such as tweezers and walkers out of DNA, and his team’s work with DNA arrays sheds light on how to build structures with more complex, dynamic behaviors.

The array units get their stability from the energy gained when DNA double helices stack up. To be stable, the units’ four segments can align as pairs side by side in two different orientations. By leaving out one strand of the DNA at the edge of an array, the engineers create an external trigger: when that strand is added, it squeezes the edge unit into changing shape, setting off the domino-like cascade.
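As a rough illustrative model, not the team’s actual design, the domino-like relay can be sketched as a one-dimensional array of two-state units in which a flipped unit pushes its neighbor to flip as well:

```python
# Toy model of the cascade described above: each unit has two conformations,
# and a flipped unit pushes its neighbor to change shape too. This is an
# illustration of the relay idea, not the team's DNA design files.

def relay_bit(num_units):
    """Propagate a conformational flip from the edge unit through the array."""
    states = ["extended"] * num_units  # every unit starts in one conformation
    states[0] = "contracted"           # the trigger strand flips the edge unit
    for i in range(1, num_units):
        if states[i - 1] == "contracted":  # a flipped neighbor squeezes the next
            states[i] = "contracted"
    return states

print(relay_bit(5))  # all 'contracted': the bit has reached the far end
```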

The team has shown it can build rectangles and tubes of array units, including a cuboid that has three basic conformations, one more than the two conformations of the two-dimensional array units. The team is now working on larger, more complex machines with three-dimensional shapes, which can be made using the same basic design principles.


