System Bits: July 18

Neural network melanoma detector; ultra-high-contrast digital sensing; CRISPR takes cells to movies.


Melanoma predicted from images with a high degree of accuracy by neural network model
The poke and punch of traditional melanoma biopsies could be avoided in the near future, thanks to work by UC Santa Barbara researchers.

UCSB undergrad Abhishek Bhattacharya is using the power of artificial intelligence to help people ascertain whether that new and strange mark is, in fact, the deadly skin cancer. Bhattacharya, a biology and computer science student in UCSB’s College of Creative Studies (CCS), has so far achieved a 96 percent accuracy rating with his neural network model.

UCSB undergrad researcher Abhishek Bhattacharya (Source: UCSB)

“We’re applying computer vision to solving medical problems,” said Bhattacharya, who, along with UC San Francisco (UCSF) physician and professor Dexter Hadley, developed the melanoma project and trained the neural network model to judge images of skin irregularities and predict whether the mark, mole or lesion is cause for concern.

Several factors make melanoma — the least common of the major skin cancers — difficult to detect. While skin cancer is the most common of all cancers, according to the American Cancer Society, melanoma accounts for only about one percent of skin cancers. The disease can develop as a new mole, but it often develops from existing moles, and in some cases can be mistaken for scars and benign growths, the team noted.

Other factors, such as time constraints, the priority of other illnesses and patient embarrassment, frequently lead patients and doctors to put melanoma screening on the back burner. That can be a dangerous oversight: left unchecked, the tumor can spread to other parts of the body, including bones, organs and other tissue. Early detection is therefore key to survival.

What the team is trying to do is something humans can’t: predict under-the-skin pathology from looking at the surface. Visual inspection is subjective, even for physicians who use a dermatoscope to see superficial and subsuperficial skin features and are trained in the ABCDE (Asymmetry, Border, Color, Diameter, Evolution) protocol. And this type of cancer is dangerous enough that, when in doubt, physicians tend to err on the side of caution.

For roughly every 30 excisions of suspicious moles and lesions, it is estimated that one proves to be a melanoma.

Machine learning can offer a better way. Taking advantage of major gains in computing power and big data — and inspired by how human brains work — scientists have figured out how neural network models, such as those that exist at the UCSF Institute for Computational Health Sciences, can be trained to predict whether a mole or lesion is melanoma based on images gathered from the world wide web.

Using images from the web with labels indicating the type of skin lesions in those images, the neural network model learns what visual aspects are closely associated with melanoma diagnoses. This project uses a convolutional neural network, an architecture modeled on an animal’s visual cortex.
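As a rough illustration (not the UCSB/UCSF team's actual model), the basic operation of a convolutional network is a small filter sliding across the image, producing feature maps that later layers combine into a prediction. A minimal sketch in plain Python, using a hypothetical edge-detecting filter on a toy image patch:

```python
# Minimal sketch of a CNN's core building block: a convolution filter
# sliding over a grayscale patch, followed by ReLU and max pooling.
# Real models stack many such layers and LEARN the filter weights
# from labeled images; this filter is hand-picked for illustration.

def conv2d(image, kernel):
    """Valid 2D convolution (no padding, stride 1) on nested lists."""
    kh, kw = len(kernel), len(kernel[0])
    out_h = len(image) - kh + 1
    out_w = len(image[0]) - kw + 1
    out = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(s)
        out.append(row)
    return out

def relu(fmap):
    """Zero out negative responses."""
    return [[max(0, v) for v in row] for row in fmap]

def max_pool(fmap, size=2):
    """Downsample by keeping the strongest response in each window."""
    return [[max(fmap[i + di][j + dj]
                 for di in range(size) for dj in range(size))
             for j in range(0, len(fmap[0]) - size + 1, size)]
            for i in range(0, len(fmap) - size + 1, size)]

# A vertical-edge detector applied to a patch with a dark/light boundary.
patch = [[0, 0, 1, 1]] * 4
edge_kernel = [[-1, 1], [-1, 1]]
features = max_pool(relu(conv2d(patch, edge_kernel)))  # strong response at the edge
```

The strong activation where the dark region meets the light region is the kind of low-level cue from which deeper layers build up lesion-specific features.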

The researchers cautioned that this still needs to be proven in a clinical setting. While a head-to-head test between the neural network model and a dermatologist looking at images might prove the model the winner, its efficacy in a real-world medical scenario has yet to be demonstrated.

Cameras that can handle any light intensity
Virtually all modern information-capture devices, including cameras, audio recorders and telephones, contain analog-to-digital converters (ADCs): circuits that convert the fluctuating voltages of analog signals into strings of ones and zeroes. All commercial ADCs have voltage limits, MIT and Technical University of Munich researchers noted. If an incoming signal exceeds that limit, the ADC either cuts it off or flatlines at the maximum voltage. This phenomenon is familiar as the pops and skips of a “clipped” audio signal or as “saturation” in digital images — when, for instance, a sky that looks blue to the naked eye shows up on-camera as a sheet of white.

To deal with this, the team developed a technique, “unlimited sampling,” which can accurately digitize signals whose voltage peaks are far beyond an ADC’s voltage limit.

MIT researchers have developed a sampling scheme that is unconstrained by bandwidth, allowing analog-to-digital conversion without “clipping.” 
(Source: Jose-Luis Olivares/MIT)


The implementation of this technique could lead to cameras that capture all the gradations of color visible to the human eye, audio that doesn’t skip, and medical and environmental sensors that can handle both long periods of low activity and the sudden signal spikes that are often the events of interest, the researchers said.

However, at this time, the technique is theoretical: The researchers establish a lower bound on the rate at which an analog signal with wide voltage fluctuations should be measured, or “sampled,” in order to ensure that it can be accurately digitized.

This work extends one of the several seminal results from longtime MIT Professor Claude Shannon’s 1948 paper “A Mathematical Theory of Communication”: the so-called Nyquist-Shannon sampling theorem.
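For reference, the classical theorem (standard textbook statement, not from the article) says that a signal containing no frequencies above $B$ hertz is completely determined by samples taken at any rate $f_s \ge 2B$, and can be reconstructed from them exactly:

```latex
% Classical Nyquist-Shannon reconstruction formula: a signal x(t)
% bandlimited to B Hz is recovered exactly from its samples x(n/f_s),
% provided the sampling rate f_s is at least 2B.
x(t) = \sum_{n=-\infty}^{\infty} x\!\left(\frac{n}{f_s}\right)
       \,\mathrm{sinc}\!\left(f_s t - n\right),
\qquad f_s \ge 2B
```

The new result plays an analogous role for modulo samples: it bounds how densely a signal must be sampled so that the folded measurements still determine it.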

[Coincidentally, a new biography on Claude Shannon was published today.]

The researchers said the work was inspired by a new type of experimental ADC that captures not the voltage of a signal but its “modulo.” In the case of the new ADCs, the modulo is the remainder produced when the voltage of an analog signal is divided by the ADC’s maximum voltage.

The idea is that if a number is too big to store in computer memory, its modulo can be stored instead; taking the modulo simply means keeping the remainder.

The modulo architecture is also called the “self-reset ADC,” the researchers said: when the voltage crosses some threshold, the converter resets, which in effect implements a modulo operation. The self-reset ADC was proposed a couple of years ago, and ADCs with this capability have been prototyped.
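To make the folding idea concrete, here is a small sketch assuming a hypothetical converter with a 1 V range. The simple jump-detection pass below illustrates why dense sampling lets the resets be undone; it is not the researchers' reconstruction algorithm:

```python
# Sketch of the modulo ("self-reset") ADC idea: a signal exceeding the
# converter's range LAM is folded by the modulo operation, and, if
# consecutive samples differ by less than LAM, each fold shows up as a
# jump of about 2*LAM that a simple unwrapping pass can undo.

LAM = 1.0  # hypothetical ADC range: folded samples lie in [-LAM, LAM)

def fold(x, lam=LAM):
    """Centered modulo: map x into [-lam, lam), as a self-reset ADC would."""
    return (x + lam) % (2 * lam) - lam

def unwrap(samples, lam=LAM):
    """Undo folding by accumulating the reset offsets between neighbors."""
    out = [samples[0]]
    offset = 0.0
    for prev, cur in zip(samples, samples[1:]):
        diff = cur - prev
        if diff > lam:        # wrapped past the bottom of the range
            offset -= 2 * lam
        elif diff < -lam:     # wrapped past the top of the range
            offset += 2 * lam
        out.append(cur + offset)
    return out

# A slow ramp that climbs far beyond the ADC's 1 V limit.
signal = [3.0 * i / 20 for i in range(100)]   # peaks near 15 V >> LAM
folded = [fold(v) for v in signal]            # what the self-reset ADC records
recovered = unwrap(folded)                    # matches the original ramp
```

The key assumption mirrored here is the one the theoretical result makes precise: the samples must be close enough together that a genuine signal change is never mistaken for a reset.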

Living cells transformed into archival data storage devices
According to researchers at the Wyss Institute at Harvard University, new ways are being developed to harness DNA as a synthetic raw material for storing large amounts of digital information outside of living cells, a process that requires expensive machinery.

They asked whether living cells, such as large populations of bacteria, could be coaxed into using their own genomes as a biological hard drive that records information and can be tapped for it at any time. If so, they said, this could not only open entirely new possibilities for data storage, but the system could also be engineered further into an effective memory device, one able to chronologically record the molecular experiences cells have during their development or their exposure to stresses and pathogens.

A new CRISPR system-based technology enables the recording of digital data, such as successive frames of a movie of a galloping horse, one of the first motion pictures ever made, in a population of living bacteria. In the future, this molecular recording device could allow researchers to have cells record the key changes they undergo during development or exposure to environmental or pathogenic signals. (Source: Wyss Institute at Harvard University)

To this end, last year, a team at the Wyss Institute for Biologically Inspired Engineering and Harvard Medical School (HMS) led by Wyss Core Faculty member George Church, Ph.D., built the first molecular recorder based on the CRISPR system. It allows cells to acquire bits of chronologically provided, DNA-encoded information and generate a memory of them in the genome of bacteria, used as a cell model.

The information, stored away as an array of sequences in the CRISPR locus, can be recalled and used to reconstruct a timeline of events.
As promising as this was, the team did not know what would happen when they tried to track about a hundred sequences at once, or whether it would work at all. This was critical, since the ultimate goal was to use the system to record complex biological events.

Now, in a new study published in Nature, the same team shows in foundational proof-of-principle experiments that the further-developed CRISPR system, a first-of-its-kind approach, can encode in living cells information as complex as a digitized image of a human hand, reminiscent of some of the first paintings drawn on cave walls by early humans, and a sequence from one of the first motion pictures ever made, that of a galloping horse.

They explained that the CRISPR system helps bacteria develop immunity against the constant onslaught of viruses in their different environments. As a memory of survived infections, it captures viral DNA molecules and generates short so-called “spacer” sequences from them, which are added as new elements upstream of previous elements in a growing array located in the CRISPR locus of bacterial genomes.

The now-famous CRISPR-Cas9 protein constantly resorts to this memory to destroy the same viruses when they return. Besides Cas9, which has become a widely used genome engineering tool, other parts of the CRISPR system have so far not been exploited much technologically.

To approach complex information on much larger scales, the team resorted to still and moving images because they represent constrained and clearly defined data sets, while a movie, in addition, offers the opportunity to have bacteria acquire information frame-wise over time.

They said they designed strategies that translate the digital information contained in each pixel of an image or frame, as well as the frame number, into a DNA code that, together with additional sequences, is incorporated into spacers. Each frame thus becomes a collection of spacers.
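The pixel-to-DNA translation can be illustrated with a toy base-4 scheme (purely hypothetical, not the Wyss team's actual encoding): since DNA has four bases, each base can stand for one base-4 digit, so any integer, whether a pixel value or a frame number, maps to a short sequence:

```python
# Toy illustration of digital-to-DNA encoding (NOT the published scheme):
# each of the four bases encodes one base-4 digit, i.e. two bits, so
# pixel values and a frame index become a short synthetic "spacer" string.

BASES = "ACGT"  # hypothetical digit-to-base assignment

def to_dna(value, length):
    """Encode a non-negative integer as a fixed-length base-4 DNA string."""
    digits = []
    for _ in range(length):
        digits.append(BASES[value % 4])
        value //= 4
    return "".join(reversed(digits))

def from_dna(seq):
    """Decode a base-4 DNA string back to an integer."""
    value = 0
    for base in seq:
        value = value * 4 + BASES.index(base)
    return value

def encode_frame(frame_number, pixels):
    """One toy 'spacer': a 2-base frame index, then 4 bases per 8-bit pixel."""
    return to_dna(frame_number, 2) + "".join(to_dna(p, 4) for p in pixels)

spacer = encode_frame(3, [0, 255, 16])  # frame 3 with three pixel values
```

Reading the spacers back and sorting them by their frame index is, in this toy picture, what lets a recorded sequence of frames be reconstructed in order.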

In future work, the team said they will focus on establishing molecular recording devices in other cell types and on further engineering the system so that it can memorize biological information.