When AI Goes Awry


The race is on to develop intelligent systems that can drive cars, diagnose and treat complex medical conditions, and even train other machines. The problem is that no one is quite sure how to diagnose latent or less-obvious flaws in these systems—or, better yet, how to prevent them from occurring in the first place. While machines can do some things very well, it's still up to humans to devise...

System Bits: March 20


Design has consequences
Carnegie Mellon University design students are exploring ways to enhance interactions with new technologies and the power of artificial intelligence. Assistant Professor Dan Lockton teaches the course "Environments Studio IV: Designing Environments for Social Systems" in CMU's School of Design and leads the school's new Imaginaries Lab. “We want the designers of...

AI: The Next Big Thing


The next big thing isn't actually a thing. It's a set of finely tuned statistical models. But developing, optimizing and utilizing those models, which collectively fit under the umbrella of artificial intelligence, will require some of the most advanced semiconductors ever developed. The demand for artificial intelligence is almost ubiquitous. As with all "next big things," it is a horizonta...

System Bits: March 13


Wiring quantum computers
According to MIT researchers, when we talk about “information technology,” we generally mean the technology part, like computers, networks, and software. But they noted that the information itself, and its behavior in quantum systems, is a central focus of MIT’s interdisciplinary Quantum Engineering Group (QEG) as it seeks to develop quantum computing and oth...

System Bits: March 6


Printed graphene biosensors
According to researchers at the Fraunhofer Institute for Biomedical Engineering IBMT in St. Ingbert (in Germany’s Saarland region), cell-based biosensors can simulate the effect of various substances, such as drugs, on the human body in the laboratory. Depending on the measuring principle, however, producing them can be expensive, so they aren’t used very often...

The Week in Review: IoT


Finance
CyberX raised $18 million in Series B funding, bringing its total funding to $30 million. Norwest Venture Partners led the new round and was joined by ff Venture Capital, Flint Capital, Glilot Capital Partners, and OurCrowd. CyberX is headquartered in Framingham, Mass., with operations in Israel. The startup offers security protection for Industrial Internet of Things application...

System Bits: Feb. 27


Prepare to prevent malicious AI use
According to the University of Cambridge, 26 experts on the security implications of emerging technologies have jointly authored a groundbreaking report sounding the alarm about the potential malicious use of artificial intelligence (AI) by rogue states, criminals, and terrorists. The report forecasts rapid growth in cyber-crime and the misuse of...

System Bits: Feb. 13


Enabling individual manufacturing apps
Researchers at the Fraunhofer Institute for Computer Graphics Research IGD focused on Industrie 4.0 recognize that manufacturing is turning toward batch sizes of one and individualized production in what is sometimes referred to as ‘highly customized mass production.’ ...

Deep Learning Spreads


Deep learning is gaining traction across a broad swath of applications, providing more nuanced and complex behavior than other machine learning approaches offer today. Those attributes are particularly important for safety-critical devices, such as assisted or autonomous vehicles, as well as for natural language processing, where a machine can recognize the intent of words based upon the context of a convers...
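Purely as illustration of that last point (nothing below comes from the article; the intents, keyword lists, and overlap scoring are hypothetical stand-ins for a trained deep learning model), a minimal Python sketch shows how conversational context can disambiguate intent: the same utterance resolves differently depending on the preceding turns.

```python
# Toy sketch, not from the article: keyword overlap stands in for a
# trained model. The intent names and hint words are made up.

CONTEXT_HINTS = {
    "navigation": {"drive", "route", "traffic", "exit"},
    "media": {"song", "play", "album", "volume"},
}

def classify_intent(utterance, context):
    """Score each intent by keyword overlap with the utterance plus the
    preceding conversation turns, so context can tip an ambiguous phrase."""
    words = set(" ".join(context + [utterance]).lower().split())
    scores = {intent: len(words & hints) for intent, hints in CONTEXT_HINTS.items()}
    return max(scores, key=scores.get)

# The same words resolve differently depending on what came before.
print(classify_intent("turn it up", ["play that song again"]))              # media
print(classify_intent("turn it up", ["how is the traffic near the exit"]))  # navigation
```

A real system would replace the overlap score with a learned model over the full conversation history; the toy only makes the point that the context, not the utterance alone, carries the signal.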

System Bits: Jan. 23


Artificial synapse for “brain-on-a-chip” portable AI devices
In the emerging field of neuromorphic computing, researchers are attempting to design computer chips that work like the human brain. Instead of carrying out computations based on binary, on/off signaling the way digital chips do today, the elements of a brain-on-a-chip would work in an analog fashion, exchanging a gradient of...
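To make the binary-versus-analog distinction concrete, here is a small illustrative Python sketch (a software analogy, not the MIT device): a digital bit can only flip between 0 and 1, while an analog synapse's conductance moves in small graded steps, so it can store a continuum of values.

```python
# Illustrative sketch only, not the MIT device: contrasts a binary,
# on/off digital state with an analog synapse whose conductance moves
# in small graded steps, the behavior brain-on-a-chip elements aim for.

class AnalogSynapse:
    """Stores a continuum of values; each programming pulse nudges the
    conductance by a small step instead of flipping it all the way."""

    def __init__(self, step=0.05):
        self.conductance = 0.0
        self.step = step

    def potentiate(self):  # strengthen the connection
        self.conductance = min(1.0, self.conductance + self.step)

    def depress(self):     # weaken the connection
        self.conductance = max(0.0, self.conductance - self.step)

digital_state = 0                 # a digital bit only flips: 0 <-> 1
digital_state = 1 - digital_state

syn = AnalogSynapse()
for _ in range(7):
    syn.potentiate()              # seven pulses, seven small steps
print(f"analog conductance: {syn.conductance:.2f}")  # 0.35, not just 0 or 1
```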
