System Bits: July 5

Automated data collection; toxic-gas detector; how tech evolves.


Computer vision for automated data collection
Stanford University researchers have developed a computer vision system that automates the collection of data about the elements in a building, with the aim of streamlining remodeling and refurbishment projects, which can be fraught with delays and cost overruns due to hidden problems.

Renovation projects live and die by the quality of their information. Newer buildings may have computerized blueprints and records, including details such as the number of rooms, doors and windows, and the square footage of floors, ceilings and walls, but such information often doesn't exist for older buildings, leaving the time-consuming and difficult task of collecting these details manually. The new Stanford system automates the process of gathering detailed building information.

A computer vision system developed by Stanford researchers automates the collection of data about the structural elements and furnishings in buildings. (Source: Stanford University)

The system leverages existing 3D sensing technologies that use light to measure every feature of a building's interior, room by room and floor by floor, creating a massive data file that captures the spatial geometry of the building. What is new is how the system feeds that raw sensor data into a computer vision algorithm that automatically identifies structural elements such as walls and columns, as well as desks, filing cabinets and other furnishings.
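As a rough illustration of the kind of geometric reasoning such a pipeline involves (not the Stanford algorithm itself), the sketch below uses the open-source Open3D library to peel planar surfaces, the candidate walls, floors and ceilings, out of a raw point cloud via RANSAC plane fitting; the file name and thresholds are placeholders.

```python
# Illustrative sketch only: RANSAC plane extraction with Open3D, a generic
# stand-in for the wall/column identification step described above.
import open3d as o3d

pcd = o3d.io.read_point_cloud("building_scan.ply")  # hypothetical scan file

planes = []
rest = pcd
for _ in range(4):  # peel off the four largest planar surfaces
    # Fit a plane to the remaining points with RANSAC.
    model, inliers = rest.segment_plane(distance_threshold=0.02,
                                        ransac_n=3,
                                        num_iterations=1000)
    planes.append(rest.select_by_index(inliers))       # e.g., a wall or floor
    rest = rest.select_by_index(inliers, invert=True)  # still-unexplained points

print(f"Extracted {len(planes)} planar elements; "
      f"{len(rest.points)} points remain (furniture, clutter, etc.)")
```

A system like Stanford's goes much further, splitting the cloud into rooms and categorizing individual furnishings, but the plane-fitting step above captures the basic idea of turning raw spatial geometry into named building elements.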

The process is the brainchild of Stanford doctoral student Iro Armeni, who previously worked as an architect on the Greek island of Corfu, performing custom renovations on historical buildings hundreds of years old. On those jobs, she and her colleagues used tape measures to redraw building plans, a practice that is common worldwide but time-consuming and often inaccurate. Convinced there had to be a better way, she developed a computer vision system that can analyze a point cloud for a building, distinguish the rooms, and then categorize each element in each room.

Stanford doctoral student Iro Armeni (Source: Stanford University)

The researchers plan to further develop the algorithm by giving professionals with raw point cloud data a way to upload their files and receive the automatically generated results. In the future, Armeni hopes to create an algorithm that can track the whole life cycle of a building, through design, construction, occupation and demolition.

Detecting hazardous chemical agents with low-cost, wearable sensor
MIT researchers have developed low-cost chemical sensors, made from chemically altered carbon nanotubes, that enable smartphones or other wireless devices to detect trace amounts of toxic gases.

Using the sensors, the researchers hope to design lightweight, inexpensive radio-frequency identification (RFID) badges to be used for personal safety and security. Such badges could be worn by soldiers on the battlefield to rapidly detect the presence of chemical weapons — such as nerve gas or choking agents — and by people who work around hazardous chemicals prone to leakage.  

The sensor is a circuit loaded with carbon nanotubes, which are normally highly conductive but have been wrapped in an insulating material that keeps them in a highly resistive state. When exposed to certain toxic gases, the insulating material breaks apart, and the nanotubes become significantly more conductive. This sends a signal that’s readable by a smartphone with near-field communication (NFC) technology, which allows devices to transmit data over short distances.
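As a purely illustrative sketch of that read-out principle (not MIT's actual device logic; all names and values here are hypothetical), the badge can be modeled as an NFC tag that is invisible to a reader while the nanotube circuit is highly resistive, and becomes readable once gas exposure makes it conductive:

```python
# Hypothetical illustration of the sensing principle described above:
# the tag only responds to an NFC reader once the nanotube circuit's
# resistance drops, i.e., once the insulating wrapper has broken apart.
READABLE_OHMS = 10_000.0  # hypothetical threshold for a readable tag

def tag_is_readable(resistance_ohms: float) -> bool:
    """An NFC reader only 'sees' the badge once the circuit conducts."""
    return resistance_ohms < READABLE_OHMS

# Simulated readings as the insulation degrades during a ~5s gas exposure.
readings = [1e6, 8e5, 2e5, 5e4, 9e3, 2e3]
for t, r in enumerate(readings):
    status = "GAS DETECTED (tag readable)" if tag_is_readable(r) else "no response"
    print(f"t={t}s  resistance={r:>9,.0f} ohms  -> {status}")
```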

The sensors are sensitive enough to detect target toxic gases at concentrations below 10 parts per million in about five seconds, and each costs about a nickel to make; roughly 4 million can be made from about 1 gram of the carbon nanotube materials, or roughly 0.25 micrograms of nanotube material per sensor.

How technology changes over time
To explain how technologies evolve in modern society, a UCLA-led team of researchers has borrowed a technique that biologists might use to study the evolution of plants or animals, plotting the “births” and “deaths” of every American-made car and truck model from 1896 to 2014.

The team said that in addition to uncovering interesting insights about the auto industry, the strategy — which could be used to study the evolution of phones, music, TVs or any number of other products — offers a new lens through which researchers could study cultural and technological change.

The team drew data from 3,575 car models made by 172 different manufacturers, noting the first and last year each was manufactured, much as a paleontologist records the first and last appearances of a particular fossil. The approach allowed them to identify periods when new car models were being introduced at faster- or slower-than-usual rates, and periods when cars were being discontinued in greater or lesser numbers.
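A minimal sketch of that bookkeeping (with made-up models standing in for the team's actual dataset) simply tallies, for each year, how many models first appear and how many disappear, the same counts a paleontologist would draw from first and last fossil appearances:

```python
# Sketch of the births-and-deaths tally described above, using invented
# examples; the real study covered 3,575 models from 172 manufacturers.
from collections import Counter

# (model name, first model year, last model year) -- hypothetical data
models = [
    ("Roadster A", 1903, 1931),
    ("Sedan B",    1932, 1948),
    ("Coupe C",    1933, 1940),
    ("Truck D",    1933, 1997),
]

births = Counter(first for _, first, _ in models)  # model introductions
deaths = Counter(last for _, _, last in models)    # model discontinuations

for year in sorted(set(births) | set(deaths)):
    print(f"{year}: {births[year]} introduced, {deaths[year]} discontinued")
```

Spikes in the first count mark periods of diversification, while spikes in the second mark waves of consolidation.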

This approach gives scientists a framework for testing theories about a product's evolution against actual data, a kind of hypothesis testing that is still uncommon in studies of technological change.

Based on the study, the researchers said they can project how the electric car marketplace will evolve over the next several years. The field is now in an early phase of rapid diversification: many more electric and hybrid models are likely to be introduced over the next 15 to 20 years, but increasing competition means many won't survive for long. This is expected to eventually lead to consolidation, with a small number of dominant models that thrive.

The technique could also help make sense of the bewildering array of technologies humans have created, the team added.


