Ready For Nanoimprint?


Nanoimprint has been discussed, debated, and hyped since the term was first introduced in 1996. Now, a full 20 years later, it is being taken much more seriously in light of increasing photomask costs and delays in bringing alternatives to market. Nanoimprint lithography is essentially a room-temperature, UV-cure embossing process. The structures are patterned onto a template or mold using...

3D NAND Flash Processing


Coventor’s SEMulator3D semiconductor process modeling platform offers a wide range of capabilities for developing cutting-edge 3D NAND flash technology. 3D NAND promises high memory cell density with reduced data corruption, but it also brings processing challenges. The structural complexity and inherent 3D nature of these devices require a predict...

New Memory Approaches And Issues


New memory types and approaches are being developed and tested as DRAM and Moore's Law both run out of steam, adding greatly to the confusion about what comes next and how that will affect chip designs. What fits where in the memory hierarchy is becoming less clear as the semiconductor industry grapples with these changes. New architectures, such as fan-outs and...

How Many Cores? (Part 1)


The optimal number of processor cores in chip designs is becoming less obvious, in part because new design and architectural options make it harder to draw clear comparisons, and in part because simply throwing more cores at a problem does not guarantee better performance. This is hardly a new problem, but it does come with a sizable list of new permutations and variables: right-sized heteroge...
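
A back-of-the-envelope illustration of why more cores do not guarantee better performance is Amdahl's law, which caps speedup by the serial fraction of a workload. The sketch below is mine, not from the article, and the function name is hypothetical:

    # Amdahl's law: ideal speedup when a fraction p of the work parallelizes.
    # Illustrative only -- real workloads also pay synchronization and
    # memory-bandwidth costs that this simple model ignores.
    def amdahl_speedup(p: float, cores: int) -> float:
        return 1.0 / ((1.0 - p) + p / cores)

    # With 90% of the work parallel, 16 cores yield only ~6.4x speedup,
    # and no number of cores can ever exceed 10x.
    for n in (2, 4, 16, 64, 1024):
        print(n, round(amdahl_speedup(0.9, n), 2))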

Optimizing DDR Memory Subsystem Efficiency


The memory subsystem sits at the core of a System-on-Chip (SoC) platform and can make all the difference between a well-designed system that meets its performance requirements and one that delivers poor performance, or even fails to operate correctly. State-of-the-art DDR memory controllers use advanced arbitration and scheduling policies to optimize DDR memory efficiency. At the same time, t...
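
One widely cited policy of the kind the teaser alludes to is first-ready, first-come-first-served (FR-FCFS), which prioritizes requests that hit an already-open DRAM row over older requests that would force a precharge/activate. The sketch below is a simplified model of my own, not any specific controller's implementation:

    from collections import deque

    # Toy FR-FCFS arbiter. queue holds (bank, row) requests, oldest first;
    # open_rows maps each bank to its currently open row.
    # Assumes queue is non-empty.
    def pick_next(queue: deque, open_rows: dict):
        for req in queue:                       # scan oldest to newest
            bank, row = req
            if open_rows.get(bank) == row:      # row hit: cheapest to serve now
                queue.remove(req)
                return req
        req = queue.popleft()                   # no hits: fall back to plain FCFS
        open_rows[req[0]] = req[1]              # serving it opens its row
        return req

    q = deque([(0, 7), (1, 3), (0, 9)])
    print(pick_next(q, {0: 9}))                 # -> (0, 9): the row hit jumps the queue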

Memory Lane: Far From A Leisurely Stroll


The only semiconductor market segment that has not been taken over by the foundries and remains dominated by IDMs is the memory sector. The memory market is the last bastion of the true IDM manufacturers, who must stay ahead of changing trends in end-market applications and advanced technology development, and must still determine how much to invest in additional capacity, and when. With on...

Heterogeneous Multi-Core Headaches


Cache coherency is becoming more pervasive, and more problematic, as the number of heterogeneous cores used in designs continues to rise. Coherency is an extension of caching, which has been around since the 1970s as a way to speed up access to a computer's main memory without adding expensive new components. Cache coherency's introduction coi...
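
As a rough illustration of what a coherency protocol tracks, here is a toy model of the classic MESI states (Modified, Exclusive, Shared, Invalid) for a single cache line. This is my own simplification, not the article's example; real heterogeneous fabrics add snoop filters, directories, and many transient states:

    # Toy MESI transitions for one cache line in one cache.
    M, E, S, I = "Modified", "Exclusive", "Shared", "Invalid"

    def on_local(state, op):
        # This core reads or writes the line itself.
        if op == "read":
            return E if state == I else state       # I->E assumes no other sharer
        return M                                    # any local write -> Modified

    def on_snoop(state, op):
        # Another core's access to the same line is observed on the bus.
        if op == "read":
            return S if state in (M, E) else state  # M also writes its data back
        return I                                    # a remote write invalidates our copy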

Power/Performance Bits: Jan. 26


New switchable material
Two MIT researchers developed a thin-film material whose phase and electrical properties can be switched between metallic and semiconducting simply by applying a small voltage. The material then stays in its new configuration until switched back by another voltage. The discovery could pave the way for a new kind of nonvolatile memory. The findings involve the thin-...

Shifting Performance Bottlenecks Driving Change In Chip And System Architectures


The rise of personal computing in the 1980s, along with graphical user interfaces (GUIs) and applications ranging from office suites to databases, drove demand for faster chips capable of removing processing bottlenecks and delivering a more responsive end-user experience. Indeed, the semiconductor industry has come a long way since IBM launched its PC in 1981. ...

Smart Data Acceleration


If real progress is to be made toward the level of computing that the future demands, the industry must change the way it attacks computing problems. The von Neumann execution model has served us well and will continue to do so, but additional techniques must be brought to bear. The next logical focus area is data: how it is accessed, and how it is transformed into real information, t...
