
2018: A Remarkable Year In Semiconductor Manufacturing Part 2


In Part 1 of our annual industry review, Silicon Semiconductor technical editor Mark Andrews examined news and headlines affecting global semiconductor manufacturers from January through June of 2018. We saw 2018 begin with analysts taking stock of 2017's breakneck growth: 22 percent, which most market watchers saw as reason for modest 2018 expectations. In Q1 of last year, analysts predicted 7 to 8 percent growth. By mid-2018, analysts were revising forecasts upward, expecting as much as a 15 percent increase in chip sales. Yet amidst this renewed optimism, US President Donald Trump began a tit-for-tat campaign of tariffs aimed at select Chinese imports, many with semiconductor content. Rhetoric, accusations, threats and counter-threats became the norm as each trading partner sought an advantage. Tensions continued to escalate late in 2018 after the two nations failed to reach an accord at the G-20 Summit in Buenos Aires. Days later, on 1st December, Huawei's CFO was detained in Canada at US request, escalating tensions further. Even so, growth continued for much of the year; although chip and equipment sales slid late in the third quarter, chip sales remained on track for a robust year. Opportunities and cautions alike have marked the start of 2019, a pivotal year for semiconductor manufacturers.

July 2018

NASA Embraces Commercial Electronics

Several experiments aboard the International Space Station are testing whether the space agency can move beyond traditional rad-hard components. If all goes according to plan, a SpaceX Dragon cargo ship returning from the International Space Station this fall will deliver back to Earth, and to waiting engineers, a pair of servers that will have flown aboard the orbiting laboratory and testbed for nearly a year. The idea is to simulate a nearly year-long trip to Mars and determine whether off-the-shelf hardware can hack it in deep space.

Meanwhile, the space agency is readying an Arm processor core design as the foundation of its next generation of space electronics. It has also examined the effects of radiation on memory chips.

Under a NASA project called Spaceborne Computer, Hewlett Packard Enterprise engineers supplied a pair of two-socket servers installed on the space station. (An identical ground-based pair serves as a “control group” for the experiment.) The HPE servers were launched last August to assess whether NASA can eventually shift from expensive radiation-hardened components to commercial hardware that is proving increasingly resistant to the damaging effects of ionizing radiation.

Radiation hazards associated with a trip to Mars — at least 35 million miles at its closest approach to Earth — would likely be far greater than those in Earth's orbit. Nevertheless, HPE engineers note that current commercial electronic components far exceed current radiation hardening requirements for the space station.

The two onboard servers are meant to mirror machines at NASA's Ames Research Center that handle much of the processing required to support the space station. Mark Fernandez, an HPE engineer and co-principal investigator for the Spaceborne Computer experiment, said that one server is being run as fast as possible while the second runs slower. “If an anomaly occurs and it only occurs on the fast one, [then operators] can slow things down.”

The goal of the experiment is to determine whether the machines could survive a trip to Mars, function properly, and, if so, still provide astronauts with the right answers, explained Fernandez.

The current schedule calls for the servers to be returned to Earth in November aboard a Dragon cargo ship. HPE investigators would then conduct a failure analysis on components to determine how they “aged” after a year in space, added Fernandez.

Optical Nets Need Tunable Optics

NG-PON2, an optical networking technology starting to gain traction, needs low-cost, reliable, tunable optics to be deployed successfully. NG-PON2 lets operators converge business and residential, wireless and wired services on a single fiber network. Dynamic bandwidth (symmetrical and asymmetrical) can be delivered at rates up to 40 Gbits/second by bonding multiple 10-Gbit/s wavelength channels, potentially at lower operating costs when used with tunable optics.

Operators are showing increasing interest in the technology. In 2017, South Korea Telecom was the first to demonstrate NG-PON2, installing a software-defined Optical Line Terminal (OLT) to support XGS-PON symmetrical 10-Gbit/s transmission for IPTV and high-speed internet access to a test bed in Seoul.

Northpower Fibre demoed a live network last February, delivering 10 Gbits/s to users in Whangarei, New Zealand. For its part, Verizon conducted a trial of the technology, paving the way to commercial rollout this year.

System vendors are also showing a commitment with NG-PON2-ready OLTs for carrier networks and Optical Network Terminals (ONTs) for use at a customer's location, both available today. According to Ovum's PON equipment forecast to 2023, shipments of both systems are expected to grow significantly this year.

But there is one piece of the NG-PON2 puzzle still missing: widespread availability of low-cost, reliable, tunable optics.

NG-PON2 uses both time- and wave-division multiplexing. Wavelengths assigned to each ONT can change, for example, in a protection switching scenario. This means the ONT transceivers must be capable of quickly adapting to transmission reconfigurations. In addition, the NG-PON2 channel bonding means transceivers can receive signals on multiple wavelengths.

The NG-PON2 standard currently defines no minimum switching speed. The community has accepted 50 ms as a de facto maximum switching time, with 25 ms and 10 ms seen as potential targets. The technology is at a crucial stage for operators looking to deploy it. As it moves toward mass markets, standards will ensure the industry can move from development to deployment.
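The wavelength-retuning behaviour described above can be pictured with a deliberately small sketch. Everything here is an illustrative assumption rather than any real NG-PON2 implementation: the four-channel count reflects a typical TWDM deployment, the ONT names and scheduling logic are invented, and only the 50 ms switching budget comes from the text.

```python
# Toy model of TWDM wavelength reassignment in an NG-PON2-style system.
SWITCH_BUDGET_MS = 50  # community-accepted maximum retune time (ms)

# Four downstream wavelength channels, each holding a list of ONT names
channels = {0: [], 1: [], 2: [], 3: []}

def assign(ont, ch):
    """Attach an ONT to a wavelength channel."""
    channels[ch].append(ont)

def protection_switch(failed_ch, retune_ms):
    """Move every ONT off a failed channel onto the least-loaded survivor.

    Rejects tuners that cannot retune within the switching budget,
    mirroring the requirement that ONT transceivers adapt quickly.
    """
    if retune_ms > SWITCH_BUDGET_MS:
        raise ValueError("ONT tuner too slow for protection switching")
    survivors = [c for c in channels if c != failed_ch]
    for ont in channels[failed_ch]:
        target = min(survivors, key=lambda c: len(channels[c]))
        channels[target].append(ont)
    channels[failed_ch] = []

assign("ont-a", 0)
assign("ont-b", 0)
protection_switch(failed_ch=0, retune_ms=25)  # 25 ms is within budget
print(channels)  # ont-a and ont-b redistributed across channels 1-3
```

A tuner slower than the budget (say, `retune_ms=80`) would be rejected, which is the point of fixing a switching-time target before mass deployment.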

CEA-Leti Says ‘Memory Wall' Needs New Solution to Broaden AI Capabilities

Addressing the "memory wall" and pushing for a new architectural solution enabling highly efficient performance computing for rapidly growing artificial intelligence (AI) applications are key areas of focus for CEA-Leti, the French technology research institute of CEA Tech.

Speaking at Leti's annual innovation conference in Grenoble, France, Leti CEO Emmanuel Sabonnadière said there needs to be a highly integrated and holistic approach to moving AI from software and the cloud into an embedded chip at the edge.

“We really need something at the edge, with a different architecture that is more than just CMOS, but is structurally integrated into the system, and enable autonomy from the cloud; an example can be found with autonomous vehicles: you need independence of the cloud as much as possible,” Sabonnadière said.

He pointed to Qualcomm's bid for NXP as a key indicator of the drive toward more computing at the edge. “Why do you think Qualcomm is buying NXP? It's for the sensing, and to put digital behind the sensing.”

To address the computing architecture paradigm, Sabonnadière said that he hopes for breakthroughs from Leti's collaboration with professor Subhasish Mitra's team in Stanford University's department of electrical engineering and computer science. Mitra's work, in development for quite some time and funded by the Defense Advanced Research Projects Agency (DARPA), the National Science Foundation and other supporters, focuses on a new processing-in-memory architecture for abundant-data applications with dense interconnections.

“We have a deep conviction that this is a way forward to address ‘More-than-Moore' challenges and have asked professor Mitra to create a demonstrator,” said Sabonnadière, talking about the need to validate the concepts in actual silicon circuits.

AI Flood Drives Chips to the Edge

As SEMICON West got underway, one could easily encounter semiconductor companies working on some form of artificial intelligence. If a company makes chips, writes software, is developing novel transistor architectures, or needs AI for future product generations, chances are it has an interest in AI. The broad potential of machine learning is drawing nearly every chip vendor to explore the still-emerging technology, especially for inference processing at the edge of the network.

Deep neural networks are essentially a new way of computing. Instead of writing a program that runs on a processor and spits out data, the developer streams data through an algorithmic model that filters out results, a process often referred to as inference. The approach first saw substantial attention after the 2012 ImageNet contest, when deep-learning algorithms began decisively outperforming earlier approaches at identifying pictures. Computer vision was the first field to feel a big boost.
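That data-flows-through-weights style of computing can be shown with a deliberately tiny sketch. The network, weights, and threshold below are hand-picked assumptions for illustration; in a real system they would be learned from data rather than written by the developer.

```python
import numpy as np

def relu(x):
    """Standard rectified-linear nonlinearity."""
    return np.maximum(0.0, x)

# A toy two-layer "network" whose hand-set weights flag whether the
# sum of two inputs exceeds 1.0 -- a stand-in for a learned rule.
W1 = np.array([[1.0], [1.0]])   # two inputs feed one hidden unit
b1 = np.array([-1.0])
W2 = np.array([[1.0]])          # hidden unit feeds one output
b2 = np.array([0.0])

def infer(x):
    """Stream an input vector through the fixed weights."""
    h = relu(x @ W1 + b1)       # layer 1: weighted sum + nonlinearity
    return (h @ W2 + b2) > 0    # layer 2: threshold to a yes/no answer

print(infer(np.array([0.9, 0.8])))  # sum 1.7 exceeds 1.0
print(infer(np.array([0.2, 0.3])))  # sum 0.5 does not
```

Note there is no branching logic describing the decision itself: the "program" is the set of weights, which is exactly what makes inference workloads a target for specialized accelerators.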

Since then, web giants such as Amazon, Google, and Facebook have started applying deep learning to video, speech, and translation. Last year, more than 300 million smartphones shipped with some form of neural-networking capabilities; 800,000 AI accelerators will ship to data centers this year; and every day, 700 million people now use some form of smart personal assistant like an Amazon Echo or Apple's Siri.

As many as 50 companies are already said to be selling or preparing some form of silicon AI accelerators. Some are IP blocks for SoCs, some are chips, and a few are systems.

The core technology is still evolving. As many as 50 technical papers on AI are published daily, “and it's going up — it couldn't be a more exciting field,” said David Patterson, the veteran co-developer of RISC who worked on Google's Tensor Processing Unit.

AI Becomes the New Moore's Law

It was almost a chant heard across the exhibit floor of SEMICON West in San Francisco: ‘Moore's Law is dead, long live AI.' According to Applied Materials, this has become the semiconductor industry's new rallying cry, one repeated often at the company's daylong symposium at SEMICON West.

“The time of the ‘node train' is coming to an end. There needs to be greater collaboration (that reaches) from materials to devices; hardware, software and systems in new avenues,” said Steve Ghanayem, former head of Applied's transistor and interconnect group who is now scouting for acquisitions and alliances to take the company in directions beyond Moore's Law.

But despite what some would have you believe, Moore's Law is not dead. The race to smaller, power-sipping, faster chips continues for the world's largest silicon fabs.

In a keynote, CEO Gary Dickerson said Applied Materials will soon announce new transistor materials that will reduce leakage current by three orders of magnitude. If that promise holds, the news Dickerson shared is nearly as big for chip makers as Intel's high-k metal gate breakthrough of 2007. But today such advances are relevant only to a small group of designs and companies at the very top of the silicon chip hierarchy.

Citing figures commonly used across the industry, speakers at the symposium said it can cost $100 million to tape out a 7nm chip, with the time from tape-out to first silicon stretching to four months. “That's a check few people can write — as a startup I can't afford to write a $100 million check,” said Kurt Busch, chief executive of Syntiant, a designer of an in-memory processor for AI.

“I'm getting less enthusiastic about the latest nodes. They are good for Qualcomm, but that doesn't apply to everyone else,” said Dileep Bhandarkar, a server processor architect who recently left Qualcomm.

“I think this is what the end of Moore's Law looks like,” said Berkeley professor emeritus David Patterson, noting that transistor costs are flat at TSMC and that Intel is struggling to produce 10nm chips. “Ninety-five percent of architects think the future is about special-purpose processors,” said Patterson, who had a hand in helping Google design its TPU.

Yan Borodovsky, a veteran lithographer recently retired from Intel, blessed the passing of the torch from Moore's Law to AI as a new guiding light.

“I think something beyond today's von Neumann architectures will be helped by something more than Moore. For example, memristor crossbars may become a fundamental component for neuromorphic computing…the world beyond Moore's Law may be about how many kinds of synapses you can put in a given area and how complex they are,” he said, taking a stab at an AI law.

Debugging Software Developer Gets $14M in Funding

Embedded software is becoming ever more complex. As a result, ensuring code quality and debugging will continue to consume an outsized share of product development time, especially with the push toward more and more on-chip processing for machine learning and IoT applications. Against this backdrop, two ex-Acorn Computer engineers who founded a company around a record-and-replay technology for software debugging this week announced a (USD) $14 million Series B funding round to further develop the product and expand in the US.

Cambridge, UK-based Undo has developed a program execution capture and replay technology that allows vendors of complex Linux applications to quickly diagnose severe software failures in test or in production, enabling them to fix critical bugs that are impossible to reproduce (and, therefore, fix) by any other means. Undo's solution supports C/C++ applications today, and the company plans to use part of the funding to expand its language support to Java, Python, and others.


Uber Fatality Sends AVs Back to Safety 101

In the wake of the Uber self-driving fatality in Arizona in March (2018), safety experts are urgently advising tech and automotive companies to push the reset button.

Headlines such as “How Safe Is Driverless Car Technology, Really?” or the equally alarming, “Autonomous Cars: How Safe Is Safe Enough?” or “How safe should we expect self-driving cars to be?” have peppered the media in the past few months.

None of this is good news for the developers of autonomous vehicles. The early public euphoria over a future of “driverless cars” has fizzled. The simplified arguments that assume that “AV technologies will save people's lives” and “AVs are much safer than human driving” are under scrutiny.

The industry discussion about the “safety” of automated cars and, inevitably, safety standards is overdue. The recent fatality, near misses, and headlines about failures of Tesla's Autopilot feature that led to severe and even fatal crashes have many who embraced the promise of self-driving vehicles rethinking their earlier positions.

But how does industry proceed with due diligence while regaining public confidence? Rather than requesting that the industry stop AV development, safety experts are asking technology developers, third-party testers, and policymakers to step up with innovative ways to demonstrate safety.

The day when companies like Uber simply resorted to brute-force road testing in order to rack up AV test miles is over. The auto industry wants alternatives that include virtual testing, mathematical analysis, computer simulation, and methodologies to deal with corner cases.

AI Apps Dominate Semiconductor Funding in China

China's use of facial recognition technology for widespread state surveillance has been heavily reported by media in recent months. The latest is the UK's Financial Times, which reminded everyone of a 2015 official paper articulating the leadership's vision of a national video surveillance network by 2020 that is omnipresent, always working and fully controllable.

The technology underpinning that vision is reflected in the figures for investments in artificial intelligence (AI) and related tech companies, including semiconductor companies. One good example is Beijing-based SenseTime, which in the second quarter raised more than (USD) $1.2 billion for ongoing AI research and development. Among other things, it provides AI-powered surveillance for the Chinese police. This week, it was reported that the SoftBank Vision Fund is considering putting almost $1 billion more into SenseTime, which is already valued at over $4.5 billion, making it arguably the most valuable AI company in the world.

Chip Industry Fights Next Round of China Tariffs

US chipmakers and semiconductor equipment suppliers want the Trump Administration to remove 39 product categories from a list of some (USD) $16 billion worth of Chinese imports targeted for 25 percent tariffs.

Both the Semiconductor Industry Association (SIA) and the SEMI trade group testified at a US International Trade Commission public hearing Tuesday as part of the Administration's solicitation for public comment on the proposed tariffs. The SIA estimates that the proposed tariffs would impact about $3.6 billion worth of US semiconductor chip imports from China and another $2.7 billion in products related to the semiconductor supply chain.

In a written submission made prior to the hearing, the SIA argued that imposing tariffs on semiconductors and semiconductor-related products currently slated for tariffs would undermine US leadership in the field, handicap US-based chip firms in relation to international competitors and threaten the market share of US firms in China. The proposed tariffs would also cost US exports and jobs while raising the cost of manufactured consumer goods for US consumers, the SIA argues.

What's more, the SIA argues that imposing tariffs on imports of semiconductors and semiconductor related products would result in US companies paying tariffs on their own products while doing nothing to curb Chinese policies and practices deemed by the US to be anticompetitive.

In the submission, the SIA notes that the majority of US semiconductor imports from China are chips designed or manufactured in the US and shipped to China only for test and packaging, making import statistics for semiconductors from China a misleading metric.

SEMI made a similar argument in its written submission, saying the tariffs would unfairly penalize US chip equipment manufacturers, most of which buy components from China for their equipment. Many of these components and materials are not offered by US vendors, and many are not widely available from any non-Chinese firms, according to SEMI.

DARPA Unveils Research Partners

Chip giants IBM, Intel, Nvidia, and Qualcomm, and a little-known foundry called Skywater, were among eight companies announced as prime contractors in four research projects sponsored by the US Defense Advanced Research Projects Agency (DARPA). Many others, including Arm and Globalfoundries, will act as subcontractors in the programs officially launched at an event today.

The four projects are part of DARPA's Electronics Resurgence Initiative (ERI) that is expected to receive $1.5 billion over the next five years to drive the US electronics industry forward. The programs aim to both serve the needs of the US Department of Defense and to boost the semiconductor industry at a time of diminishing returns in the pursuit of Moore's law to enable faster, cheaper, and smaller chips.

“A 53-year-old exponential is unheard of … when it slows or does something different, it's incredibly frightening from a researcher's perspective,” said William Chappell, who heads DARPA's microsystems office that oversees ERI. “Our goal is to impact the industry in 2025 to 2030 with research beyond what companies would be looking at today.”

Other companies acting as prime contractors for the latest four projects include Applied Materials, Ferric Inc., and HRL Laboratories. In addition, researchers from Mentor Graphics and Xilinx recently joined DARPA to help manage ERI programs.

In one of the largest of the programs, Skywater Technology Foundry aims to show how it can define a monolithic 3D capability to deliver the equivalent of 7nm chips using its base 90nm process. The foundry was formed around a former Cypress fab in Minnesota.

Skywater will work with researchers from MIT and Stanford on DARPA's 3DSoC program, which aims to find ways to integrate novel materials such as resistive RAMs and carbon nanotubes on a base low-temperature 90nm process. Success will be measured in terms of yields on devices that could cut computing times by as much as 50x.

The project is one example of how DARPA aims, in part, to bolster chip-making in the US. Separately, DARPA will work with Globalfoundries on MRAM and future memories in a program called Foundations Required for Novel Compute (FRANC).

“The US has more 14nm fabs than anywhere in the world, but we don't have relationships to tap into them all … our intent is to tap into [fabs at companies such as] Micron, On, TI, Samsung in Austin, and others,” said Chappell, noting that DARPA has multiple ways to certify trusted federal suppliers, including foundries such as TSMC.

Intel Aims to Drive Chiplet Standard

Intel announced that it is weeks away from releasing a small but strategic piece of its proprietary packaging technology. It could become part of a future standard enabling a Lego-like design of SoCs out of chiplets.

The x86 giant is putting the final touches on a specification for its Advanced Interface Bus (AIB). AIB is a physical-layer block for the die-to-die connection in its dense, low-cost Embedded Multi-Die Interconnect Bridge (EMIB).

The company said it has already licensed the spec to a handful of partners in a government research program. It aims to make AIB available royalty-free to anyone interested through a consortium. If Intel can convince an existing consortium to offer AIB, it could be available within weeks. If the company needs to create a new consortium, the process could take as long as six months.

Low-cost, dense packages like EMIB are becoming increasingly important techniques for delivering high-performance chips at a time when traditional scaling is becoming more complex and costly. TSMC's InFO, a rival approach, is used by the A-series processor in Apple's iPhone.

Intel is keeping the ‘secret sauce' behind EMIB proprietary, including the equipment and methods that it uses to build simplified bridges between chips. However, it aims to make AIB a standard interface for linking chiplets using any packaging technique, supporting its hope of spawning an ecosystem of parts that it could tap for its own products.

Many others share the vision. “An Ethernet for chiplets is the most important goal for the CHIPS project” that Intel is part of, said Andreas Olofsson, program manager for the effort under the Defense Advanced Research Projects Agency (DARPA).

At least two other commercial efforts are in production with a separate but similar approach. Marvell launched its MoChi initiative initially as a proprietary capability under founder and former chief executive, Sehat Sutardja. Startup zGlue announced last year a similar effort as a commercial offering targeting SoCs for the internet of things. In addition, Globalfoundries is said to be working with packaging houses on other alternatives.

China vs. the US in Chip Funding Rounds

The US Department of Defense is pushing for a $2.2 billion program to fund a broad range of electronics efforts, in part over fear that competing nations including China may be further ahead in advanced chip architectures. The news came at an event where speakers agreed that Moore's Law is slowing but chip advances will continue thanks to alternatives to CMOS scaling being pursued internationally.

The event was a coming-out party hosted by the Defense Advanced Research Projects Agency (DARPA) for the Electronics Resurgence Initiative (ERI), an evolving set of research programs valued at $1.5 billion over five years. The effort is designed to counter two common enemies: the decline of Moore's Law and the rise of China.

“We want to align our common needs to counter China's desire to be the preeminent leader in next-generation semiconductors,” said Kristen Baldwin, acting deputy assistant secretary of defense for systems engineering, during the event. “The DoD wants to reverse trends that threaten our semiconductor ecosystem and lower barriers to semiconductor technology.”

The White House is requesting $2.2 billion to fund a five-year DoD program with four goals. It could provide more money for programs like ERI, create joint innovation centers for military and commercial users, expand government access to trusted chip supplies, and fuel military modernization programs ranging from AI processors and precision navigation and timing chips to electronic warfare, she said.

The DoD will announce its first chip innovation center in August, targeting fast, secure chip design. It is already soliciting ideas for new ways to assure and verify trusted supplies of chips, “promoting security standards as [commercial] differentiators in areas such as data services and medical electronics,” she added.

Today, the US military lacks trusted sources for 14nm process technology and 2.5D chip packaging, both widely used in high-end commercial products.

“It's the first time since Bell Labs invented the transistor that DoD does not have access to the latest technology,” said William Chappell, who manages the ERI program for the Defense Advanced Research Projects Agency (DARPA). “The mechanism is broken, and we need to bind it back together. That's one reason we are all here.”

An Intel executive said that the government puts too many restrictions on trusted sources that don't ensure security, such as requiring that all fab workers are US citizens. A Globalfoundries manager noted that the 14nm process it runs in New York still has certain hard-to-untangle business links to Samsung, which developed the process.

August 2018

Is Neuromorphic Computing Closer to Commercialization?

There has been a tremendous amount of research in recent years into brain-inspired computing, aimed at tackling the ever-increasing compute speed and memory requirements of bringing artificial intelligence and machine learning to just about everything.

That research is now starting to bear fruit, with at least one neuromorphic computing chip developer, BrainChip, planning to detail its chip architecture next month.

Earlier this year, Barbara de Salvo, CEA-Leti's chief scientist, explained that the semiconductor industry could take its cue from biology to address the power requirements that traditional computing architectures now struggle to meet. She outlined the characteristics of a brain synapse, which combines memory and computing in a single structure that can form the basis for a brain-inspired, non-von Neumann computer architecture. One recent trend in neuromorphic computing is to encode neuron values as pulses or spikes.

And then there's the European Human Brain Project's neuromorphic computing program, which has been constructing two large-scale, unique neuromorphic machines and prototyping next-generation neuromorphic chips. It recently published a paper on its first full-scale simulations of a cortical microcircuit model, comprising 80,000 neurons and 300 million synapses, run on SpiNNaker hardware to demonstrate its usability for computational neuroscience applications.

Professor Markus Diesmann, co-author of the paper and head of the computational and systems neuroscience department at the Jülich Research Center in Germany, said: “There is a huge gap between the energy consumption of the brain and today's supercomputers. Neuromorphic (brain-inspired) computing allows us to investigate how close we can get to the energy efficiency of the brain using electronics.”

“It is presently unclear which computer architecture is best-suited to study whole-brain networks efficiently. The European Human Brain Project and Jülich Research Centre have performed extensive research to identify the best strategy for this highly complex problem. Today's supercomputers require several minutes to simulate one second of real time, so studies on processes like learning, which take hours and days in real time, are currently out of reach,” he added.

TSMC Sales Hurt by Virus Outbreak

Foundry giant Taiwan Semiconductor Manufacturing Company (TSMC) said a computer virus outbreak that hit the company on 3rd August will reduce its third quarter (2018) revenue by about 3 percent.

The virus, which TSMC said was accidentally spread by a faulty software installation process for a new tool, affected a number of the company's computer systems and fab tools in Taiwan. The company said Sunday that about 80 percent of the impacted tools had been recovered and that a full recovery is expected Monday.

TSMC (Hsinchu, Taiwan) said it expects the incident to cause shipment delays and additional costs that will not only hurt its third quarter revenue, but also reduce its third quarter gross margin by about 1 percent. TSMC said it is confident shipments delayed in third quarter will be recovered in the fourth quarter and maintains its forecast of high single-digit revenue growth for 2018. Most of TSMC's customers have been notified of this event, the company said.

China Tariffs to Hit the Chip Sector

Despite intense lobbying by the US semiconductor industry, the next round of US tariffs on Chinese imports will include billions of dollars' worth of semiconductors.

The Trump administration on Tuesday finalized plans to implement a 25 percent tariff later this month on a list of Chinese products worth $16 billion annually. It will mark the second group of products to be hit with the tariff as part of the escalating trade war between the US and China; an initial $34 billion worth of Chinese products has been subject to the tariff since 6th July.

The Semiconductor Industry Association (SIA) trade group estimates that about $6.3 billion per year worth of semiconductors and related products are in one of the two tariff groups, with the majority of those on the list released Tuesday. Semiconductor related items on the latest list include equipment used in semiconductor manufacturing, as well as diodes and other types of devices.

Both the SIA and the SEMI trade group, which represents capital equipment manufacturers, EDA vendors, and other players in the electronics supply chain, lobbied hard for the removal of semiconductors and related products from the list, in both written statements and testimony at a public hearing in July. The groups argued in part that imposing the tariff on chips would handicap US-based semiconductor firms relative to international competitors and threaten the market share of US firms in China, while also hurting US exports and jobs and raising the cost of goods for US consumers.

Skyworks Buys Analog SoC Vendor

Analog chip vendor Skyworks Solutions plans to acquire analog SoC vendor Avnera Corp. for (USD) $405 million in cash.

Skyworks (Woburn, Mass.) said the deal would augment its wireless connectivity portfolio by adding ultra-low power analog devices to enable smart interfaces via acoustic signal processing, sensors and integrated software. Skyworks estimates that the acquisition will expand its addressable market by more than $5 billion.

Target applications for Avnera's technology include AI speakers/microphones, virtual assistants, intelligent gaming controllers and vehicle in-dash systems, as well as wired/wireless headsets, Skyworks said. Avnera boasts customers that include Harman, JBL, Panasonic, Philips, Pioneer, Polk, Samsung, Sennheiser, Sony, Vizio and Yamaha; the company has more than 100 issued and pending patents.

SoC Designs for Auto OEMs, Tier-Ones

ADAS developers have shifted from 'code bloat,' using thousands and even millions of lines of software code, to SoCs designed to enable autonomous driving tasks in hardware.

Automotive OEMs and Tier-1 suppliers are in a unique situation these days. Game-changing technology undertakings and hyper business growth in advanced driver assistance systems (ADAS) and autonomous cars are turning automotive design platforms upside-down. Moreover, technologies, from computer vision to 3D mapping, LIDAR to deep learning, are continuously converging and colliding along the road to making self-driving vehicles a reality. The stakes of this transformation are high from a social standpoint because lives are literally at risk.

Fatal crashes involving Tesla and Uber vehicles are a reminder that engineering self-driving cars requires extraordinary care at the system level. Automotive manufacturers are undergoing technology disruptions and consequently can no longer conduct business as usual with Tier-1 suppliers.

According to Arteris IP, OEMs, Tier-1s, and new entrants, such as Uber, should carefully review and assess design components to ensure vehicle safety and reliability.

It's about time that car OEMs and Tier-1s start looking at what's under the hood. All carmakers should get involved in choosing their technology-enabling semiconductors, the company noted. Recently, ADAS and automated driving applications have transformed from highly complex software-based products running on generic CPUs and GPUs to specialized system-on-chip (SoC) solutions employing advanced technologies, such as deep learning and neural networks via hardware accelerators. ADAS developers have shifted from “code bloat,” using thousands and even millions of lines of software code, to SoCs designed to enable autonomous driving tasks in hardware.

Taiwan to Continue Chip Foundry Industry Dominance

More than 30 years after it served as the setting for the birth of the Asian silicon semiconductor foundry industry with the formation of Taiwan Semiconductor Manufacturing Co. (TSMC) in 1987, Taiwan shows no sign of relinquishing its hold on the (USD) $62 billion global business.

TSMC remains, by far, the world's largest foundry, with 2017 revenue of $32.2 billion, more than five times that of second-ranked vendor Globalfoundries, according to market research firm IC Insights. TSMC accounted for nearly 52 percent of the foundry industry's worldwide total last year.

Taiwan is also home to the world's third-largest foundry, United Microelectronics Corp. (UMC), and the sixth-largest company in foundry sales, Powerchip Technology Corp. Combined, TSMC, UMC, and Powerchip accounted for 62 percent of all foundry sales last year.
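The share figures above follow directly from the revenue numbers. A minimal back-of-the-envelope sketch, assuming the IC Insights totals quoted in this section (the function name is purely illustrative):

```python
# Illustrative check of the foundry market-share figures cited above.
# Revenues in USD billions, per IC Insights (2017).
TOTAL_FOUNDRY_MARKET = 62.0   # global foundry sales
TSMC_REVENUE = 32.2           # TSMC revenue, more than 5x second-ranked Globalfoundries

def market_share(revenue, total):
    """Return a vendor's share of total market sales as a percentage."""
    return revenue / total * 100

tsmc_share = market_share(TSMC_REVENUE, TOTAL_FOUNDRY_MARKET)
print(f"TSMC share: {tsmc_share:.1f}%")  # ~51.9%, i.e. "nearly 52 percent"
```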

To say that the foundry business thrives in Taiwan is an understatement. As home to three of the top six foundries worldwide, Taiwan has an unrivaled network of fabs, supply-chain infrastructure and, perhaps most importantly, technical talent. Despite concern in some quarters about having so much of the foundry industry concentrated on one earthquake-prone island, and despite challenges posed by players in other regions such as China, Taiwan's dominance is firmly entrenched.

“The reality is, there are so many fabs in Taiwan between TSMC and UMC and the memory guys [such as Powerchip] and Vanguard [Vanguard International Semiconductor Corp.] that it's hard to imagine displacing Taiwan from the number-one position,” said Len Jelinek, senior director for semiconductor manufacturing at market watcher IHS Markit.

Large-Scale Chipmaker Acquisitions May Have Peaked

The era of mega-consolidation that the semiconductor industry has been engulfed in over the past few years may have reached a peak, at least in terms of the size of deals, according to market research firm IC Insights.

In light of the recent cancellation of Qualcomm's $44 billion acquisition of NXP Semiconductors, it appears that increasing government regulatory scrutiny, the complexity of high-dollar deals and the escalation of trade wars are combining to put a ceiling on the size of semiconductor merger deals for the present, IC Insights stated.

"It is becoming less likely that semiconductor acquisitions over $40 billion can be completed or even attempted in the current geopolitical environment and brewing battles over global trade," IC Insights said in a press statement.

Of the 10 largest acquisitions of chipmakers by chipmakers ever announced, eight occurred in the last three years, including the ill-fated Qualcomm-NXP deal, the researchers noted.

Qualcomm canceled the acquisition of NXP in July 2018 after failing to gain the approval of China's antitrust regulator in the nearly two years since the deal was first announced in October 2016. Earlier this year, US President Donald Trump blocked Broadcom from a $117 billion hostile takeover attempt of Qualcomm over concerns that the US could lose cellular technology leadership to Chinese companies.

IC Insights estimates that the total value of semiconductor industry mergers and acquisitions between 2015 and the middle of 2018 totaled $245 billion, including a record $107.3 billion in 2015. In the first half of this year, the firm estimates that semiconductor deals were worth $9.6 billion.


September 2018

Chip Sales Growth Slows

Semiconductor sales increased again in July, but may finally be showing signs of cooling, according to the Semiconductor Industry Association (SIA).

The three-month average of chip sales grew to (USD) $39.5 billion in July, up 17.4 percent from July 2017, according to the SIA. While the year-to-year growth rate remains healthy, July snapped a streak of 15 consecutive months of greater than 20 percent year-to-year increases.

Chip sales also increased on a sequential basis, but just barely. The three-month average of chip sales in July was up by a modest 0.4 percent, according to the SIA, which reports sales data compiled by the World Semiconductor Trade Statistics organization.

“The global semiconductor industry posted its highest-ever monthly sales in July, easily outpacing last July and narrowly ahead of last month's total,” said John Neuffer, SIA's president and CEO, in a statement. “Sales were up year-to-year across every major semiconductor product category and regional market, with the China and Americas markets leading the way with growth of greater than 20 percent.”

Sales increased on an annual basis by 29.4 percent in China, 20.7 percent in the Americas, 11.7 percent in Europe, 11.5 percent in Japan and 5.7 percent in the Asia-Pacific region, according to the SIA. On a month-to-month basis, sales were up 1.7 percent in China, 0.4 percent in the Americas and were flat in the Asia-Pacific region, with both Europe and Japan experiencing decreases, the SIA said.

Can We Ever Trust Fully Autonomous Vehicles?

A generation gap is emerging in mobility. People of a certain age like to get behind the wheel of a car and take it for a spin. Many have grown up with the concept of speed and power, and that has created a joy of driving among many over the age of 30. But talk to people in Generation Z and the millennial generation and you get a totally different attitude: “Autonomous vehicles are coming, so why do we need to drive?” is something you'll often hear from this new generation.

But how long will they have to wait for autonomous vehicles to become commonplace? Will we ever trust driverless cars? And who will be first with a mass-market autonomous vehicle offering that isn't in the luxury price range like the Tesla?

Just a few days ago, Jaguar Land Rover started to address the trust issue, announcing that it has fitted virtual eyes to intelligent pods to understand how humans come to trust self-driving vehicles. As part of an engineering project, it is working with cognitive psychologists to better understand how vehicle behavior affects human confidence in new technology and how much information should be shared with other road users.

The intelligent pods will run autonomously on a fabricated street scene in Coventry, UK while the behavior of pedestrians will be analyzed as they wait to cross the road. The “eyes” on the pods will seek out the pedestrian, appearing to “look” directly at them, signaling to road users that it has identified them and intends to take avoidance action. Engineers will record trust levels in the person before and after the pod makes “eye contact” to find out whether it generates sufficient confidence that it would stop for them. Previous studies suggest that as many as 63 percent of pedestrians and cyclists say they'd feel less safe sharing the road with a self-driving vehicle.

“It's second nature to glance at the driver of the approaching vehicle before stepping into the road," said Pete Bennett, future mobility research manager at Jaguar Land Rover. "Understanding how this translates in tomorrow's more automated world is important. We want to know if it is beneficial to provide humans with information about a vehicle's intentions or whether simply letting a pedestrian know it has been recognized is enough to improve confidence.”

But there are other issues that may challenge driverless cars. One is erratic human behavior, which the artificial intelligence (AI) in the car may not be able to detect, such as jaywalkers coming out onto a roadway unexpectedly when they are not supposed to be crossing a road. Will humans have to adapt to accommodate driverless cars?

Another issue is insurance. While the regulatory framework is still evolving, there could always be the issue of determining whether the driver or the manufacturer was negligent or who was responsible for an accident. In the UK, a bill has been passed in Parliament putting into place the regulatory framework that will allow companies issuing insurance to cover automated vehicles when they start appearing on roads from 2021 onwards. The legislation will give insurers the right to recover costs when technology failure causes an accident, meaning that drivers are not unfairly held responsible for accidents they could do nothing to prevent.

Disruption Ahead for Ethernet Chips

The network industry is decoupling software from hardware, enabling new opportunities in the Ethernet switching market. Data centers keep demanding more performance because the titans of cloud computing (Apple, Google, Facebook and Amazon) are constantly starved for faster connections, and old design methods are not keeping pace. Disaggregation is about to hit the Ethernet chip market.

Looking at the numbers, the market has experienced a substantial increase in merchant silicon and demand for higher-speed ports. Within the data center, 100 Gbps revenue surpassed 10 Gbps revenue for the first time in 2Q18; furthermore, 25 Gbps SerDes lanes are now more common in that space than any previous technology. This signals a clear future trend.

A prominent way forward for chip design is increasingly defined by the transition away from network switch ASICs, which depend mainly on TSMC's process node technology, to multi-chip or “chiplet” architectures. This evolution will shake up big silicon vendors such as Broadcom and Cisco, which have done business in much the same way for years.

The fact that the switch has already been disaggregated in the cloud for nearly a decade means it's now time for silicon to follow a similar mindset. Today's 12.8 Tbps fabrics may offer a glimpse of this, but the next-generation 25.6 Tbps fabrics will have multiple ASIC vendors trying new architectures.

The implementation of multiple chips, along with programmability, offers increased flexibility and lower costs, all while meeting performance requirements. Just as importantly, the movement will drive costs down as speeds go much higher, with 400 Gbps as the next standards benchmark and bandwidth doubling every 18 months.
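The cadence described above can be sketched as a simple exponential model. The smooth doubling curve is an illustrative assumption (real fabric generations arrive in discrete steps); the 12.8 Tbps and 25.6 Tbps figures come from the article:

```python
# Sketch of switch-fabric capacity roughly doubling every 18 months.
def fabric_capacity(tbps_now, months, doubling_period=18.0):
    """Project fabric capacity (Tbps) after `months`, doubling each period."""
    return tbps_now * 2 ** (months / doubling_period)

print(fabric_capacity(12.8, 18))  # 25.6 Tbps, the next-generation fabric
print(fabric_capacity(12.8, 36))  # 51.2 Tbps, two cadences out
```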

5G Handsets Spark Millimeter-Wave Debate

Engineers are racing to deliver the first smartphones supporting 5G networks, mainly targeting frequencies below 6 GHz. RF experts are sharply divided over whether any of the first batch of devices will support the standard's millimeter-wave bands.

All sides agree that the handsets face a fractured market, with carriers supporting frequencies spanning 600 MHz to 28 GHz, while some are also guarding details of their spectrum plans. As a result, an optimized 5G ‘world phone' could be years away.

First-generation handsets will probably target specific regions with devices that may be slightly larger or have shorter battery lives than last year's LTE models. Price tags may inch up to stay in line with bills of materials estimated to increase 10 percent to 30 percent just for RF front-end components.

The transition to LTE faced similar problems but not at the scale of 5G, the first cellular standard to support bands at 28 GHz and up. In addition, while many countries will support 5G in 100-MHz swaths in the 3.5-GHz band, that band won't be available in the US until sometime next year. Even then, the US spectrum is expected to be at much smaller bandwidths, less than ideal for 5G.

Easing Electronic and Photonic Integration

The IC industry is counting on integrated silicon photonics to enable performance gains in networking and high-performance computing (HPC) devices as early as 2021, but integrated photonics design today remains a fragmented market compared to the decades-old CMOS EDA tool and manufacturing infrastructure. Comparatively speaking, merging electronics with photonics is a costly, time-consuming and mainly manual endeavor, requiring expertise in photonics and electronics as well as test, assembly and packaging (TAP).

Mentor, along with customer Hewlett Packard Enterprise (HPE), is hoping to change that with the introduction of the LightSuite Photonic Compiler, billed as the industry's first integrated electrical/photonic layout automation tool, with a promise to help engineers do in a matter of minutes what otherwise could take weeks when performed manually.

“Right now, if you want to design a chip with silicon photonics, you need to have a Ph.D.,” said Ashkan Seyedi, a photonics research scientist at HPE. “That works, but it's not scalable.” At the same time, companies such as Synopsys and PhoeniX are moving ahead with advanced photonic-electronic design tools.

With Moore's Law running out of steam, chipmakers are eyeing a host of new technologies to enable continued scaling of semiconductors. Integrated silicon photonics is among the most highly touted. The technology has been in development for more than 20 years, and vendors have been shipping discrete photonic ICs for several years. Integration with electronic circuits in the same package has yet to go mainstream, however, and still faces competition from more mature indium phosphide technology.

“Silicon photonics has not yet hit the big time because it's always too expensive, always two to three years away,” said Seyedi.

Seyedi's ambitious goal is to help silicon photonics reach its potential by helping to enable “the Arm model of silicon photonics” — cultivating and nurturing an ecosystem that runs throughout the supply chain to provide the necessary infrastructure.

New China Tariffs Hit Chip Industry Again

US President Donald Trump again ratcheted up the US-China trade war, levying tariffs on an additional $200 billion worth of imports from China including parts and materials used in semiconductor manufacturing.

The US Trade Representative released a list of about 5,745 types of Chinese imports that will be hit with an initial 10 percent tariff beginning Sept. 24. The tariff levied on these products is set to increase to 25 percent in January.

The new tariffs announced Monday escalate a trade war that has been building for months between the world's two largest economies. The US previously imposed a 25 percent tariff on about $50 billion worth of Chinese imports in two separate tranches. Both times, China swiftly enacted reciprocal tariffs on US products exported to China.

China has vowed to match all tariffs against its products imposed by the Trump administration. In a statement Monday, Trump warned that doing so would further escalate the trade war.

"If China takes retaliatory action against our farmers or other industries, we will immediately pursue phase three, which is tariffs on approximately $267 billion of additional imports," Trump said.

At the heart of the trade war is the two sides' failure to come to terms on a comprehensive trade agreement that Trump wants in order to reduce the US trade deficit with China, estimated to be about $375 billion last year. The administration has also sought protection from what it considers theft of American intellectual property and forced transfer of American technology.

Chinese products on Monday's list include raw silicon and other products used in chip making, including items such as quartz reactor tubes and holders designed for insertion into diffusion and oxidation furnaces for semiconductor wafer production. The list also includes smart cards, as well as technology products used in data centers and networking gear.

Samsung Reportedly Plans to Cut Memory Production

South Korea's Samsung Electronics is cutting back plans for memory production increases in an effort to keep supplies tight in the face of slowing demand, according to a report by the Bloomberg news service.

The report, which cites unnamed sources said to be briefed on the matter, said that Samsung now expects DRAM bit growth of less than 20 percent this year and NAND flash bit growth of about 30 percent. Samsung had said earlier this year that it expected DRAM bit growth of about 20 percent and NAND bit growth of about 40 percent this year.

After tremendous growth over the past 18 months amid shortages, the memory chip market is softening, with industry analysts warning of a looming downturn amid oversupply.

Qualcomm Says Apple Gave Stolen Trade Secrets to Intel

Qualcomm turned up the heat in its high-profile feud with longtime customer Apple, accusing the iPhone maker of feeding proprietary information about Qualcomm chips to rival Intel.

In a court filing made Tuesday, Qualcomm accused Apple of "a multi-year campaign designed to steal Qualcomm's confidential information and trade secrets." The filing alleges that Apple took information provided by Qualcomm software development tools and used that information to improve the performance and time-to-market of "lower-quality modem chipsets," including those made by Intel.

Qualcomm had been the supplier of baseband chips to every generation of iPhones since the first iPhone debuted in 2007. But the relationship between the two companies turned sour in 2016 over a royalty payment dispute. Last year, the two companies traded lawsuits in multiple venues, including a $1 billion suit filed by Apple against Qualcomm and a separate suit brought by Qualcomm against Apple in San Diego last November.

Teardowns of the latest iPhones announced earlier this month have confirmed that they use Intel chipsets exclusively.

In the November suit, Qualcomm alleged that Apple accidentally passed its confidential information to Intel. But Tuesday's filing, which amends the November suit, takes the claim much further, claiming that its analysis of discovery documentation found that Apple has "wrongfully acquired, failed to protect, wrongfully used, wrongfully disclosed and outright stolen" Qualcomm's confidential information and trade secrets and used it to divert its chipset business to Intel.

The suit is currently scheduled to go to trial in April 2019.

October 2018

When Boomers Retire, Knowledge Goes With Them

As baby boomers retire and take knowledge with them, millennials are moving into decision-making positions, according to a survey by IEEE GlobalSpec. Millennials (born between 1981 and 1996) have moved into engineering jobs and are now moving into decision-making positions. But as baby boomers retire, are they taking their experience with them? Is their experience even relevant to millennials, and do millennials care? Based on the findings of the 2018 Pulse of Engineering survey conducted by IEEE GlobalSpec, it appears that companies do little to pick the brains of experienced engineers before they leave.

Knowledge loss was one of many aspects examined in the survey. Others included the number of projects, reasons for leaving jobs, and the constant pressure to do more with less. Knowledge loss is often overlooked until after a pivotal employee retires, when a need arises that the remaining employees don't grasp or know how to remedy but that the departed worker once handled routinely.

Fig. 1 shows that 61 percent of the 2,236 respondents say that knowledge loss is extremely or very important. Of course, “extremely” and “very” are vague at best. You can add another 24 percent who say that knowledge loss is moderately important, while only 2 percent say it's not important. But we don't know who gave those answers. Perhaps they are consultants or work at startups with just a few young employees.

Some people prefer not to transfer knowledge, especially as they get older. Why? Because some want to hold unique knowledge and see it as a way to keep their jobs. Many have witnessed the impact when a long-term employee was the only one who could support customers still using previous generations of products. This is particularly true of capital equipment with decades-long utility. When long-term employees retire, their former companies frequently have no one left to support customers using what is today considered an ‘outdated' system. Of course, there are fewer of these old machines in use every year; however, as the reemergence of 200mm process tools has shown, not every aging tool is on its way to the scrap heap. Indeed, some companies have built multi-million-dollar enterprises around servicing and rehabilitating aging process tools.

One approach that respects the expertise of older workers while actively transferring needed knowledge between generations is to plan for the fact that senior personnel will eventually leave the company, and to make knowledge transfer a compensated part of their overall employment package. That is far better than leaving these issues to chance.

Tariffs Will Depress Board Sales

The first and second quarters of 2019 look grim for sales of graphics add-in boards (AIBs), thanks to a 10 percent tariff on items imported from China that is scheduled to rise to 25 percent at the beginning of next year.

AIBs carry graphics processors built primarily in Taiwan. The GPUs will not be subject to the trade-war tax, but the AIBs are mostly built in China, so they will be subject to the tariffs. The taxes will apply to motherboards in notebook and desktop PCs, too, so those products also will be more expensive.

The latest AIBs carry Nvidia's new RTX GPUs. These boards are very powerful compared to previous generations and thus more expensive than usual, excluding the recent inflationary period created by demand from crypto-miners.

A Founders Edition GeForce RTX 2080 Ti costs $1,200 if you can find one imported before September 24; otherwise it will cost $1,320. A slightly less powerful version, the GeForce RTX 2080, sold for $800 before the tariff and $880 after the import tax.

After January, you will pay $1,500 for a 2080 Ti and $1,000 for a 2080. US consumers will be paying that tax, not the Chinese government, as some politicians would like you to believe.
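The prices above are straightforward tariff arithmetic. A minimal sketch, using the pre-tariff US street prices from the article (the helper function is purely illustrative, not a real pricing API):

```python
# Illustrative calculation of the tariff-inclusive graphics-card prices.
def with_tariff(base_price, tariff_rate):
    """Return the consumer price after applying an import tariff (fraction)."""
    return base_price * (1 + tariff_rate)

for card, base in [("GeForce RTX 2080 Ti", 1200), ("GeForce RTX 2080", 800)]:
    at_10 = with_tariff(base, 0.10)  # current 10 percent tariff
    at_25 = with_tariff(base, 0.25)  # rate scheduled for January
    print(f"{card}: ${at_10:,.0f} at 10%, ${at_25:,.0f} at 25%")
```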

The Memory Party is Winding Down

After two years of heady growth, the party appears to be winding down for DRAM and NAND flash memory chips. DRAMeXchange, a research firm that tracks memory chip pricing, is now forecasting that the average selling prices (ASPs) for both DRAM and NAND will decline sequentially in the fourth quarter. The firm also expects ASPs for both products to decline significantly in 2019 compared to this year.

NAND pricing has been soft for several months now, declining by 10 percent in the third quarter, according to DRAMeXchange, which expects NAND ASPs to decline by 10 percent to 15 percent in the fourth quarter. The firm forecasts that NAND ASPs will decline by 25 percent to 30 percent next year, thanks to sluggish demand for consumer electronics and increased 3D NAND production capacity and yields among suppliers.
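Sequential percentage declines compound rather than add, which is worth keeping in mind when reading these quarter-on-quarter forecasts. A minimal sketch using the NAND figures above; normalizing the pre-Q3 ASP to 100 is an assumption made purely for illustration:

```python
# Compound a sequence of quarterly price declines (each a fraction).
def compound_declines(start_price, declines):
    """Apply successive fractional declines to a starting price."""
    price = start_price
    for d in declines:
        price *= 1 - d
    return price

# Q3 down 10 percent, then Q4 down 10 to 15 percent (forecast range):
print(round(compound_declines(100, [0.10, 0.10]), 1))  # 81.0, not 80
print(round(compound_declines(100, [0.10, 0.15]), 1))  # 76.5, not 75
```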

Contract prices for DRAM rose only 1 percent to 2 percent in the third quarter and are forecast to decline by 5 percent or more in the fourth quarter, ending a string of nine consecutive quarters of price growth, said DRAMeXchange. The firm predicts that DRAM prices will decline by 15 percent to 20 percent in 2019, but it added that the declines may be steeper if demand weakens for servers and smartphones.

Fab Tool Sales Growth Slows Again

Global billings by North American semiconductor equipment suppliers declined on a sequential basis in September for the fourth straight month in what the SEMI trade association described as a typical third quarter lull.

The three-month average of fab tool billings in September was $2.09 billion, up 1.8 percent from September 2017 but down 6.5 percent compared to the final August billings level of $2.37 billion, SEMI said.

Since peaking in May at $2.7 billion, monthly equipment billings have declined by 23 percent. However, 2018 tool billings remain well ahead of last year's record pace.

“Quarterly global billings of North American equipment suppliers experienced their typical seasonal weakening in the most recent quarter,” said Ajit Manocha, SEMI's president and CEO, in a press statement. "Relative to the third quarter, we expect investment activity to improve for the remainder of the year.”

Fully Autonomous Tesla? Not So Fast

It's been two years since Tesla started selling the full self-driving (FSD) option. Was Elon Musk too optimistic? Or did he know that his believers were willing to go down his yellow-brick roadmap?

A week ago, Tesla quietly pulled its full self-driving option from Model 3, Model S, and Model X. Tesla announced, or more accurately, Elon Musk tweeted on 18th October that FSD is currently off the menu because it's “causing too much confusion.”

This is no small matter considering that for two years, Tesla has been charging customers $3,000 to $5,000 for a so-called FSD upgrade. Tesla has not committed to refunding any of its FSD windfall.

Until recently, Tesla's original order page promised: “All you will need to do is get in and tell your car where to go.” It added: “Your Tesla will figure out the optimal route, navigate urban streets (even without lane markings), manage complex intersections with traffic lights, stop signs, and roundabouts, and handle densely packed freeways with cars moving at high speed.”

As Phil Magney, founder and principal at VSI Labs noted, “It's been two years since Tesla started selling the FSD option, and buyers of the upgrade have nothing to show for the thousands of dollars they spent. Musk implied that FSD was a year or two away, but Tesla doesn't seem to be much closer to FSD now than they were two years ago.”

The question, then, is whether “Musk was too optimistic about how quickly full self-driving capabilities could be achieved,” noted Magney.

Have we nailed ADAS yet?

We already know that the processing power necessary for autonomous vehicles is huge. But the problem doesn't end there. Ask any sensor technology supplier. You'd be hard-pressed to find a vendor who could state flatly that the automotive industry has even nailed ADAS yet, let alone fully self-driving cars.

Chris Jacobs, vice president of autonomous transportation and automotive safety at Analog Devices Inc. (ADI), acknowledged that barriers to full autonomy are still very high. Among the biggest are an obvious lack of legislation for highly automated vehicles (HAVs) and insurance policies associated with them. Neither issue is close to resolution, he said. But equally important are sensors with adequate high resolution and better algorithms.

“Today's ADAS system can function as a warning system,” said Jacobs. “But when it comes to Level 3 actuation, we still have a long way to go.”

NASA Lunar Orbital ‘Gateway' Presents Unique Electronics Challenges

On October 3, 2018, Lockheed Martin revealed the company's crewed lunar lander concept at the International Astronautical Congress in Bremen and showed how a reusable lander aligns with NASA's lunar objectives and future Mars missions, as well as a planned ‘gateway' element that will enable future exploratory missions.

“The Lunar Orbital Platform, ‘Gateway,' will give us a strategic presence in cislunar space. It will drive our activity with commercial and international partners and help us explore the Moon and its resources,” said William Gerstenmaier, associate administrator, Human Exploration and Operations Mission Directorate, at NASA headquarters in Washington. “We will ultimately translate that experience toward human missions to Mars.”

“To carry out these directives, we will have to have an advanced high-powered propulsion system,” said Dr. Mike Barrett, NASA power and propulsion element manager. “To power the gateway, we are looking at a 50-kilowatt class spacecraft, although technically, 40 kilowatts would suffice for the power propulsion unit. This enables us to construct a power and propulsion unit that will be the gateway's primary source of power. As a craft, the gateway will contain a habitable module for US personnel, a second habitable module for international personnel, and also facilities for cargo and airlocks.”

To fulfill its role in space missions, the gateway must be able to maintain different orbits around the moon and move between them, as well as launch vehicles.

“The plan is to have an Orion crew visit the station for one month each year,” said Barrett. “During the other 11 months, science and technology experiments will be conducted from the craft.”

Key to NASA's plans are the active engagement and collaboration of the commercial electronics sector. “Over the past few years, we have been talking about missions to Mars, and about getting a crew there,” said Tim Cichan, space exploration architect at Lockheed Martin, one of the companies that has been involved with NASA. “This included the concept of a single stage reusable lander that could be fueled in orbit and make missions to the surface and back.”

In developing this technology, software plays a major role, but so do electrical components like sensors. “We have to be thinking about the deep space and lunar environments,” said Cichan. “For example, when you are conducting missions to and from the moon's surface, you are going to encounter a lot of dust during landing and takeoff. The sensor-based technology must be able to tolerate this excess dust and maintain function.”

Cichan said that dust tolerance will be addressed in design as will redundancy and failover strategies for electronics components subjected to planetary and deep space conditions such as high dust and radiation.

A second feature of the gateway power and propulsion unit is its reliance on solar power, which is abundant and unfettered in space. “A solar array must be able to supply energy to the electric power and propulsion system, as well as meeting other gateway power demands for human habitation and life sustaining systems and for technology demonstrations. We are in those discussions now—about determining how much power we will need and will be able to supply,” said Barrett.

Carmakers Envy Tesla's Whole Car OTA

Whether someone agrees with Elon Musk or not, the one thing every automotive OEM secretly admires about Tesla is its EV's ability to do over-the-air (OTA) software updates for the whole car.

Tesla doesn't merely send software updates to a telematics unit inside a vehicle to update maps, apps and other software inside in-vehicle infotainment systems. It can directly send software patches to a relevant individual ECU for safety, security or feature upgrades.

No carmaker anywhere, except Tesla, has yet been able to pull this off.

After all, there is no such thing as “bug-free software,” observed Egil Juliussen, director of research for infotainment and advanced driver assistance systems (ADAS) for automotive at IHS Markit. For that reason alone, today's software-rich vehicles should come with OTA capabilities to correct software errors. This corrective function is even more crucial for “connected vehicles,” Juliussen pointed out, in the event of an attacker exploiting software vulnerabilities to disable a car or harm passengers. When that happens, carmakers should be equipped to quickly update the vulnerable software to prevent further damage.

“Tesla is one exception. Its system architecture is built from the ground up to make OTA easier,” said Juliussen. “Most car OEMs don't have that luxury. They have to figure out a way to make incremental changes.”

TI Goes All in With IIoT

Texas Instruments unveiled on Tuesday its new generation of industrial microprocessors, Sitara AM6x, designed to further its position within Industrial Internet of Things (IIoT) applications. TI calls it “the industry's first multi-protocol gigabit time-sensitive networking (TSN)-enabled processor family.”

“If you've followed TI over the last few years, you've noticed that we've dramatically shifted our business focus to industrial and automotive markets,” said Adrian Valenzuela, TI's director of marketing for Sitara processors. “Our goal is to be the world leader in these applications.”

Indeed, in a recent Q3 earnings conference call, TI's head of investor relations, David Pahl, said, “We continue to focus our strategy on the industrial and automotive markets, where we have been allocating our capital and driving initiatives to strengthen our position. This is based on a belief that industrial and automotive will be the fastest growing semiconductor markets. They have increasing semiconductor content. And these markets provide diversity and longevity. All of this translates to a high terminal value of our portfolio.”

The upshot of this claim is that TI is rolling the dice on its new Sitara AM6x family, treating the processor as key to TI's future in the industrial market.

November 2018

Samsung Still Spending Heavily on Capex

Samsung Electronics said that it would cut its semiconductor capital spending slightly and warned that the memory market is headed for a seasonal slowdown after two years of spectacular growth.

Samsung said that it would cut its 2018 semiconductor capex to about 24.9 trillion won (about $22.6 billion USD), a decline of about 9 percent from last year's total of 27.3 trillion won. Factoring in exchange rate fluctuations, in US dollar terms, the 2018 target represents a decline of just 7 percent from Samsung's aggressive 2017 spending level, according to Bill McClean, president of market research firm IC Insights.

Despite slowing price growth for DRAM and NAND flash memory chips over the past few months, “Samsung is barely taking its foot off the gas pedal for semiconductor capital spending,” said McClean. While third-quarter spending declined about 29 percent compared to the second quarter, the company plans to spend about $6.2 billion on semiconductor capex in the fourth quarter, an increase of 55 percent compared to the third quarter.

Q3 Chip Sales Reach All-time High

Global semiconductor sales hit yet another all-time high in the third quarter, as the chip industry remains comfortably on track to pass its revenue record of (USD) $412 billion set last year.

Third quarter chip sales totaled $122.7 billion, an increase of 4.1 percent compared to the second quarter and 13.8 percent compared to the third quarter of 2017, according to the Semiconductor Industry Association (SIA).

“While year-to-year growth has tapered in recent months, September marked the global industry's highest-ever monthly sales, and Q3 was its top-grossing quarter on record,” said John Neuffer, SIA president and CEO, in a statement.

The SIA, which reports chip sales statistics compiled by the World Semiconductor Trade Statistics (WSTS) organization, said the three-month rolling average of sales hit $40.9 billion in September, up 2 percent from August and up 13.8 percent compared to September 2017.

Year-to-year sales in September were up across every major product category and regional market, with sales into China and the Americas continuing to lead the way, according to the SIA.

AI Makes Consumers Cautiously Optimistic

As AI and other emerging technologies make deeper inroads into consumers' daily lives, Intel wanted to know how Americans feel about the future, so the company asked them in a new survey. As it turns out, ambivalence abounds. American consumers have a complicated relationship with innovation. It's thrilling that technology enables seemingly superhuman things, whether traveling by air or carrying the world's accumulated knowledge in one's pocket. But they are also aware of the risks that come with these new ‘superhuman’ abilities.

The Intel Next 50 report, a study of 1,000 US consumers conducted with research firm PSB, found that consumer sentiment toward the future of technology is a mixed bag. About 40 percent of consumers surveyed said that technology would bring improvements to communities around the world and to their own lives. Overall, the report found that consumers were optimistic and excited about technology, particularly continuing improvements to smartphones and laptops.

Still, 40 percent of respondents felt that, while technology will bring improvements, it could also cause new problems. One concern that emerged strongly is social isolation. Respondents said they rely on technology for daily tasks, including staying in touch with friends and family, but the majority thought that in the future people will spend less time actually interacting with each other meaningfully.

While many people admit they rely on their smartphones, no one likes being ignored by someone with their face glued to a handset screen. People need to remember that smartphones are still a relatively new technology. It's been less than a decade since they became so prevalent and integral in our day-to-day lives. Figuring out the right balance is still a work in progress.

Automation has often been an object of ambivalence, but AI raises the stakes. We'll soon be able to not only automate repetitive tasks but also tackle those requiring complex reasoning. This offers remarkable upsides when it comes to decision making, time management and overall competitive success. But without having a crystal ball, workers can't help but wonder if, instead of simply helping them perform their jobs better, AI will make them obsolete.

Interestingly, many consumers surveyed didn't recognize the role AI already plays in their lives. Despite the proliferation of voice assistants, predictive algorithms and other common AI applications, more than one-third of respondents don't think they own any technologies that use AI. Yet many of the emerging technologies people report being most excited about, including advancements in genomic medicine, artificial materials for organ or tissue transplants and progress in renewable energy, will in large part be powered by AI technology.

Consumer sentiment about AI may be complicated, but it's not all negative. In the end, most simply want technology to help, not control people or greater society; striking that balance remains an ever-present concern for many people.

AMD Beats Intel, Nvidia to 7nm

Advanced Micro Devices launched its first 7nm CPU and GPU, aimed at the lucrative data center market. It showed working chips that delivered performance comparable to Intel's 14nm Xeon and Nvidia's 12nm Volta.

AMD has yet to reveal many details about the new chips and their performance. However, analysts are generally bullish that the company will be able to continue a significant comeback since it launched its first Zen-based chips on a 14nm process in late 2016.

“We are all about high performance … The idea is to be incredibly ambitious and recognize it's a journey,” said chief executive Lisa Su. “AMD is totally committed to the data center. This is our space and this is where we will lead.”

She demonstrated a single 7nm Epyc x86 processor narrowly beating a system with two Intel Skylake Xeons in a rendering job. Separately, AMD showed benchmarks that roughly put its 7nm Vega GPU on par with an Nvidia V100 in inference tasks.

Startup Highwai showed the 7nm Vega running its AI simulation software for autonomous vehicle navigation. The AMD chip seemed roughly comparable to Volta GPUs, said Raul Diaz, chief technologist and co-founder of the company.

“We haven't had time to do any systematic comparisons yet,” he said, noting that AI training is the app most in need of more performance.

Samsung Unrolls Foldable Display

Samsung is months away from making a foldable display that will enable a smartphone to expand into a 7.3-inch tablet. The Korean giant aims to leapfrog Apple with the new form factor running a redesigned user interface now in beta.

Attendees at the Samsung Developer Conference (Devcon) got a brief glimpse of a working prototype of the new hybrid mobile system and heard a tutorial about how to develop apps for it.

Google said that it will create extensions to the next version of Android for foldable devices. It will also release APIs to support multi-windowing interfaces that Samsung described for its foldable. Samsung and Google aim to release an emulator for the new screen sizes and their behaviors for developers before actual devices ship.

Foldable displays have been on the mobile horizon for many years. How soon they expand from a high-end niche to broader use will depend on issues including the price and power consumption of the new displays.

Samsung is expected to keep the displays proprietary to its Galaxy devices for at least the first year. The company makes higher profit margins on displays than on mobile devices, creating pressure to release the components as merchant products, said analyst Patrick Moorhead of Moor Insights & Strategy, who predicted that the first foldable phones could sell for $1,599.

The new Samsung display required innovations in materials and manufacturing processes to withstand hundreds of thousands of folds and achieve a 45 percent thinner profile, thanks in part to a thinner polarizer layer. Samsung gave few details on the internals of the display, but it did share some specs of the foldable display and a companion cover display for the device.

A Samsung engineer noted that the cover display has full functionality but will focus on simple functions. The main display can be split into up to three active displays. The cover and main display can show different or the same images.

Samsung's One UI targets use in both smartphones and foldables. It aims to simplify the clutter of icons that have spread as devices expanded their features.

Top 500 Supercomputer List Shows China, US Gains

China extended its lead in the number of supercomputers located within the PRC, but the US made gains in overall performance in the rankings released Monday. The latest Top 500 list comes as China and the US are running neck-and-neck in a race to deliver, before 2022, an exascale-class machine that would be 10 times more powerful than today's largest systems.

China now has 227 Top 500 supercomputers, 45 percent of the total, while the US total fell to 109 (22 percent), a historic low. “That's a big gap,” said Jack Dongarra, a professor of electrical engineering and computer science at the University of Tennessee and one of the organizers of the Top 500 list.

Vendors based in the People's Republic have also taken leadership in supercomputing. China's Lenovo, Inspur and Sugon are the top three vendors in the current rankings.

Together with Huawei at No. 8, Chinese vendors installed 295 of the Top 500 systems. By comparison, four US vendors in the top 10, Cray, HPE, Dell EMC and IBM, installed 120 systems.

The good news for the US is it still leads in overall performance at 38 percent compared to 31 percent for second-place China. Its position was boosted significantly when Summit and Sierra, the No. 1 and No. 2 systems, submitted updated results that boosted their performance levels to 143.5 and 94.6 petaflops, increases of more than 15 percent and 30 percent, respectively.

“The Summit system (developed by the Department of Energy in its Oak Ridge National Laboratory) is quite impressive and represents 10 percent of the total Top 500 performance,” said Dongarra.

Sensing the Future: TDK Buys In

TDK Corp. has enviable brand recognition. The problem is, it's known for the wrong products, according to some company observers.

The TDK brand is tightly associated with magnetic tapes (audio and videocassette) and magnetic hard disk drive (HDD) heads. The company was founded in 1935 in order to market the world's first manufactured magnetic material, ferrite. Given its heritage, TDK has ample reason to be proud of its ‘magnetic’ personality.

However, in the era of digital streaming and cloud computing, TDK has had to diversify or risk irrelevance. The good news is that the company has had practice reinventing itself. For decades, it has been buying its way into new technology and product markets through mergers and acquisitions. TDK's M&A maneuvers have nimbly shifted its portfolio, navigating the company through a series of treacherous market transitions without sinking its flagship lines.

TDK is now flexing its M&A muscles in the sensors segment.

In 2017, the company made its sensor ambitions clear when it spent (USD) $1.3 billion to buy InvenSense (San Jose, California), a leading supplier of motion sensors. TDK has amassed a range of sensor technologies through acquisition, but the InvenSense deal was a watershed moment.

Baidu Backs Neuromorphic IC Developer

Swiss startup aiCTX has closed a (USD) $1.5 million pre-A funding round from Baidu Ventures to develop commercial applications for its low-power neuromorphic computing and processor designs that enable what it calls “neuromorphic intelligence.” It is targeting low-power edge-computing embedded sensory processing systems.

Founded in March 2017 based on advances in neuromorphic computing hardware developed at the Institute of Neuroinformatics of the University of Zurich and ETH Zurich, aiCTX (pronounced “AI-cortex”) is developing “full-stack” custom neuromorphic processors for a variety of artificial-intelligence (AI) edge-computing applications that require ultra-low-power and ultra-low-latency features, including autonomous robots, always-on co-processors for mobile and embedded devices, wearable health-care systems, security, IoT applications, and computing at the network edge.

Dylan Muir, senior R&D engineer at aiCTX, told EE Times that the company is building end-to-end dedicated neuromorphic IP blocks, ASICs, and SoCs as full custom computing solutions that integrate neuromorphic sensors and processors. “This approach ensures minimum size and power consumption and is fundamentally different from most other neuromorphic computing approaches that propose general-purpose solutions as a plug-and-play alternative to parts of machine-learning tool chains with conventional data paths.”

Foxconn Reportedly Plans to Slash Billions in Costs

Contract manufacturing giant Hon Hai Precision, commonly known under the trade name Foxconn, plans to cut nearly (USD) $3 billion in costs in 2019 as it grapples with increased competition and iPhone production cuts by Apple, according to the Bloomberg news service.

The Bloomberg report, citing an internal company document, said Foxconn has spent about $6.7 billion over the past 12 months. The company plans to cut its iPhone business by $865 million next year and lay off 10 percent of non-technical staff, according to the report.

The revelation comes at a time of rising concern about slowing demand for the latest iPhones. Over the past two weeks, several known suppliers of components found in the iPhone have cut revenue estimates, citing uncertainty over iPhone demand.

Apple shipped 46.89 million iPhones in its fiscal fourth quarter, which closed in September, roughly flat with the same period a year earlier.

Foxconn is headquartered in Taiwan but has a large manufacturing presence in mainland China and elsewhere. In addition to iPhones, Foxconn builds other products for Apple as well as laptop computers and a variety of other devices for tech OEMs.

Slowing Memory Market Cools Chip Growth

Semiconductor sales growth has slowed substantially in recent months as the rapid price increases that fueled the market's growth for more than two years have slowed to a trickle.

The three-month rolling average for chip sales growth slowed to 13.8 percent in September, the lowest level since November 2016, according to the World Semiconductor Trade Statistics organization. Sales for the quarter were up 13.8 percent year-over-year, the lowest rate of growth since the fourth quarter of 2016. In fact, it marked the first time since the first quarter of 2017 that the growth rate slipped below 20 percent.

The slowing sales growth has corresponded directly with a decline in the growth rate of DRAM and NAND flash memory prices, which have slowed dramatically in recent months as the memory industry shifts from a prolonged period of supply shortage to oversupply.

"The softening memory market has started to become a 'headwind' on total IC market growth," said market research firm IC Insights in a recent report. The firm is forecasting that overall chip sales growth will fall to 6 percent in the fourth quarter.

IC Insights, which forecasts that the semiconductor industry has entered a "cooling period" after a prolonged period of expansion, noted that the memory chip market grew by only 8 percent in the third quarter compared to the second quarter. By contrast, the firm said, the memory chip market grew by 18 percent from the second quarter to the third quarter last year.

Samsung Extending its Sales Lead Over Intel

South Korea's Samsung Electronics, which overtook Intel last year to become the world's top chip vendor by sales, is expected to widen its lead this year, according to market research firm IC Insights.

The firm projects that Samsung will have 2018 chip sales of $83.3 billion, 19 percent more than Intel. Last year, Samsung's chip sales total bested Intel's by about 7 percent.

Cumulative chip sales for the top 15 semiconductor suppliers are projected to increase by 18 percent this year, 2 percentage points more than the chip industry as a whole, according to IC Insights.

In total, nine of the top 15 chip vendors are on track to post sales increases of greater than 10 percent, IC Insights said. Four of the top 15 are expected to have gains of more than 20 percent this year, the firm said.

Despite a slowdown in memory price increases in recent months, each of the top three memory chip makers (Samsung, SK Hynix and Micron Technology) is expected to grow sales by more than 25 percent over a very strong 2017, IC Insights said. SK Hynix is expected to be the fastest-growing chip vendor among the top 15 this year, with sales forecast to increase by 41 percent, the firm said.

The Race for a Better EV Battery

The race to dominate the electric car market hinges as much on battery technology and improved recharging infrastructure as it does on sticker price, software updates and styling. That is why Chinese companies are investing massive sums to match and surpass Tesla's industry-leading battery technology and manufacturing capacity.

Nearly all of Tesla's capacity is focused on lithium-ion technology, but other approaches are emerging that promise to change the battery technology landscape to extend the range of electric vehicles. Increasing driving range to, say, the equivalent of a tank of gas could provide the inflection point that at last accelerates electric drivetrains past the internal combustion engine.

Aluminum- and zinc-air batteries

As new battery technologies emerge, new wireless schemes are also being demonstrated that could make recharging electric vehicle batteries as fast as filling a gas tank.

With lithium-ion battery technology perhaps approaching its own Moore's Law ceiling, researchers are branching out to pursue technologies like aluminum- and zinc-air batteries that are just now entering the market. The key to those emerging technologies is boosting recharging capability while demonstrating the ability to lower energy storage cost to the baseline of roughly $100 per kilowatt-hour.

By some estimates, zinc-air batteries could hit the electric car market as early as 2019 and eventually could be cheaper, lighter, and safer than lithium-ion. Battery startups such as EnZinc and NantEnergy are promoting zinc battery technology as a cheaper, safer alternative to lithium-ion, with potential energy densities and recharging capabilities approaching those of lithium-ion.

NantEnergy claims the world's first rechargeable zinc-air cells, with proven global deployments serving as the sole source of power for 200,000 people. The company also claims that the new rechargeable cells have already broken the $100/kWh manufacturing cost barrier.

Among the drawbacks of metal-air batteries is their susceptibility to corrosion. That means that promising new approaches like aluminum-air batteries can quickly lose their stored charge. Researchers recently reported in the journal Science on a corrosion-inhibiting approach that uses oil as a buffer to reduce corrosion. The approach might also work with zinc-air batteries, boosting their shelf life.

If these and other efforts pan out, proponents assert, lightweight, compact zinc- and aluminum-air batteries could provide backup power to electric cars.

Meanwhile, Tesla continues to ramp up lithium-ion battery production at its Gigafactory in Nevada and a planned battery factory in China. Tesla and battery manufacturing partner Panasonic claim about 60 percent of global electric vehicle battery output. Chinese rivals like Contemporary Amperex Technology Ltd. (CATL) are also pouring billions into an effort to become China's Panasonic. A key to success is leveraging growing domestic demand for electric cars.

Is GM Preparing for New Mobility Models?

GM's restructuring and planned closure of multiple sedan-producing plants signals recognition of the challenges to come for traditional automakers charting roadmaps beyond 2025.

With the announcement this week that General Motors is cutting its global workforce by 8 percent, many commentators have put forward analyses of why, citing reasons ranging from tariffs to technology.

There's no doubt that technology plays a major part, with huge amounts of global resources being injected into autonomous and electric vehicle development, infrastructure and services. For example, in the last few days, the UK government announced a £25 million ($32 million) investment across three projects enabling the British public to experience self-driving vehicle services trials in London and Edinburgh by 2021. This is just part of a £90 million ($115 million) pledge for mobility services in the government's 2018 budget.

But more than technology, GM and other "traditional" automotive manufacturers will be looking at how to deal with a whole new field of potential competitors for mobility, especially with a new generation of youngsters less hooked on owning cars and more willing to buy "transport-as-a-service" (TaaS). This creates an opportunity for non-automotive companies to offer mobility services as users start consuming mobility in different ways.

As the global economy moves from personal transportation as a capital purchase to personal transportation as a service, the automobile industry could evolve in ways similar to how the PC industry evolved. The most profitable companies will be those providing the full-stack software (operating system), the central AI processors (CPU), and the cloud services. Sensors will become increasingly commoditized and more manufacturing will be outsourced (bending metal will not be a significant piece of the value). Companies positioning themselves as full stack companies for autonomous vehicles (AV) include Intel, Nvidia and Qualcomm.

At the top end of the supply chain could be the new ‘robo-taxi’ companies and car rental companies providing TaaS, potentially superseding today's automotive OEMs.

With this kind of future scenario, one can see why GM and others might be worried and need to rethink their entire strategies. Addressing this potential for disruption, in its just released Autonomous Vehicles Technology Report, investment bank Woodside Capital Partners (WCP) says the market should expect to see significant consolidation at all levels of the automotive value chain over the next five years.

The report flags parallels in history. For example, when the internal combustion engine was invented in 1886, this led to around 80 automotive startups by the mid-1890s; the market consolidated once the Ford Model-T hit production 15 years later.

The challenges facing today's legacy automotive OEMs could be analogous to those faced by Eastman Kodak between 1990 and 2010, when digital cameras moved from low-resolution nuisances to cameras that outperformed older models, and smartphones entered the market to further destroy Kodak's film-centric business model. The CEOs of today's automotive OEMs are clearly aware of the challenges and are already looking at ways in which they can change the DNA of their companies to ensure they're not displaced as Eastman Kodak was.

Woodside Capital Partners says the first nine months of 2018 saw (USD) $4.25 billion of venture capital invested across 87 companies in the AV sector including ride sharing and electric vehicle (EV) companies. This is a doubling of the $2.1 billion invested across 104 companies in the same period in 2017.

AI a Focus as US Preps Export Controls

Uncle Sam wants to restrict a few good technologies and it needs engineers to help identify them.

As part of legislation passed this summer, the US Commerce Department put out a call for input by Dec. 19 on which of 14 broad emerging technologies should face export controls. The call quickly got attention from industry veterans and groups concerned that controls could hurt US companies and worsen a growing tech trade war with China.

The call issued on Nov. 14 listed aspects of biotech, AI, quantum computing, semiconductors, robotics, drones, and advanced materials as possible candidates. It gave special attention to AI, listing 10 specific areas ranging from computer vision and natural-language processing to AI chipsets. In semiconductors, it called out even broader areas including microprocessor technology, SoCs, stacked memory on chip, and memory-centric logic.

The effort aims to determine which emerging technologies could be strategic to national security and how to identify and control them without “negatively impacting US leadership in the science, technology, engineering, and manufacturing sectors.” It did not define the range of the controls except to say that, “at a minimum, it [would] require a license for [their] export … to countries subject to a US embargo, including those subject to an arms embargo.”

A government spokesperson said that the Commerce Dept. plans to publish proposed controls on emerging technologies after reviewing comments to its call. It will take public comments on the proposed controls before making them final, but the spokesperson gave no timeline for the process.

The Commerce Dept. is expected to issue a second call early next year for guidance on what it calls fundamental or more mature technologies, including semiconductors and manufacturing equipment. The actions stem from the Foreign Investment Risk Review Modernization Act (FIRRMA), which aims to use export controls to stem a perceived leaking of sensitive technologies, especially to China.

December 2018

Comms Chips Grew Fastest in Q3

Wireless communication chip sales increased by 12.3 percent in the third quarter, the fastest growth rate of any semiconductor category, according to IHS Markit.

Wireless communication chip growth was punctuated by Intel, which saw its sales in the category grow 39 percent compared with the second quarter. IHS Markit attributed this growth largely to increasing reliance on Intel modem chips in the latest-generation iPhones. Apple has been relying heavily on Intel modems amid its ongoing feud and legal fight with Qualcomm.

Meanwhile, sales of memory chips increased sequentially for the 10th consecutive quarter, reaching $45.1 billion, IHS Markit said. Though memory chip pricing has weakened in recent months, the firm maintained that growth in memory chips was mainly driven by higher memory density in storage and the release of new mobile phones.

The data processing market also continued to show strength, growing sequentially by 6.8 percent, according to IHS Markit. Samsung has gained 3.8 percentage points of data processing market share over the past eight quarters, while Intel has lost 4.8 points, the firm said. Even so, Intel's data processing revenue still exceeds Samsung's by 58 percent. Combined, the two firms now account for 52 percent of data processing chip sales.

Overall, Samsung continues to lead the semiconductor industry with overall market share of 16.2 percent, compared with 14.5 percent for Intel, IHS said. Intel grew by 12.6 percent in the third quarter compared to the second quarter, with Samsung posting 9.3 percent sequential growth, IHS said.

WSTS Bumps Up Chip Sales Forecast

The World Semiconductor Trade Statistics (WSTS) organization revised upward its forecast for 2018 revenue after sales again grew on both a sequential and annual basis in October.

WSTS now expects chip sales to reach $478 billion this year, an increase of 15.9 percent from 2017. The organization, comprised of 42 semiconductor companies that pool sales data, predicts that chip sales will grow by a much more modest 2.6 percent in 2019.

The adjustment brings the WSTS forecast more in line with third-party market research firms, which have been estimating more aggressive growth rates for the semiconductor industry this year. The WSTS said in June that it expected chip sales to grow 12.4 percent this year, but later revised the forecast upward to 15.7 percent. IC Insights, for example, has been forecasting 15-16 percent growth for the industry this year since March.

WSTS forecasts that sales will grow across all regions and all major product categories, led by an increase of 33 percent in memory chip sales, 12 percent in discretes and 11 percent in optoelectronics.

Meanwhile, growth rates for October chip sales continued to trend toward much more modest growth than earlier this year. The three-month rolling average of chip sales hit $41.8 billion in October, up 1 percent from September and up 12.7 percent compared to October 2017, according to WSTS.

“Although strong sales of DRAM products continue to boost overall market growth, sales in all other major product categories also increased year-to-year in October, and all major regional markets posted year-to-year gains,” said John Neuffer, president and CEO of the Semiconductor Industry Association (SIA), in a press statement.

In a separate statement, the SIA also welcomed what it called a ‘de-escalation’ of trade tensions between the US and China following a meeting between US President Donald Trump and Chinese President Xi Jinping at the G20 summit in Buenos Aires. (This report was published prior to the detention of Huawei's CFO by Canada at US request on 1st December 2018.)

Neuffer said the SIA would be monitoring progress on China's trade practices that were flagged by the US government earlier this year, particularly around subsidies, intellectual property protection and forced technology transfers.

The White House said following the meeting between Trump and Xi that the two leaders agreed to begin negotiations on structural changes with respect to forced technology transfers, intellectual property protection, non-tariff trade barriers and other issues, and that the two sides would aim to reach an agreement on these issues within the next 90 days.

Trump agreed not to raise tariffs from 10 percent to 25 percent on $200 billion worth of Chinese products on Jan. 1, as had previously been planned, the White House said.

"Much is at stake as the two sides endeavor to set US-China trade relations on a more productive path," Neuffer said. "SIA remains committed to working with both governments to support their efforts to achieve a successful negotiated outcome in the short term, and towards a more constructive and open economic relationship in the long term."

Also, China said during the meeting that it would be willing to re-examine Qualcomm's proposed $44 billion acquisition of NXP, which was abandoned after the companies failed to gain the approval of China's Ministry of Commerce, according to the White House. But Qualcomm said Monday that it would not revive the transaction, which the company considers a dead issue.

Mobile Networks Shutdown: A Sign of What's to Come?

An expired software certificate appears to have caused mobile network shutdowns in the UK and Japan last week, resulting in significant disruption. Is this a sign of things to come, with hidden IoT devices everywhere?

Last week, the UK's O2 and Japan's SoftBank suffered major mobile network outages which caused significant inconvenience and disruption due to an expired certificate in the mobility management software on the networks. This is significant and has implications for anyone in the industry building devices and systems for the internet of things (IoT).

What exactly happened? Well, O2 (part of Telefonica), the second-largest mobile network operator in the UK, was unable to provide data and voice services for almost 24 hours to its more than 32 million 2G, 3G and 4G service connections in the country. Its network is also used by mobile virtual network operators such as Sky Mobile, Tesco Mobile and GiffGaff.

And in Japan, SoftBank, which has around 40 million mobile users in the country, was hit by the same connection issue for about four hours. Reports suggest many businesses and services were affected, including airlines, railways, and logistics companies.

In both the UK and Japan, the outages were blamed on Ericsson equipment and software. Ericsson issued a statement during the day, saying: “During December 6, 2018, Ericsson has identified an issue in certain nodes in the core network resulting in network disturbances for a limited number of customers in multiple countries using two specific software versions of the SGSN-MME (Serving GPRS Support Node - Mobility Management Entity).

An initial root cause analysis indicates that the main issue was an expired certificate in the software versions installed with these customers. A complete and comprehensive root cause analysis is still in progress. Our focus is now on solving the immediate issues.”

Börje Ekholm, the president and CEO of Ericsson, said the faulty software that caused the outages was being decommissioned, and apologized not only to its customers but also to their customers.
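Incidents like this are generally preventable with routine certificate-lifetime monitoring. As a minimal illustration (a hypothetical helper sketched here, not related to Ericsson's software), Python's standard library can parse a certificate's notAfter field and report how many days remain:

```python
import ssl
import time

def days_until_expiry(not_after, now=None):
    """Days remaining before a certificate's notAfter timestamp.

    `not_after` uses the OpenSSL text form, e.g. 'Dec  6 00:00:00 2018 GMT',
    as returned in ssl.SSLSocket.getpeercert()['notAfter'].
    """
    expiry = ssl.cert_time_to_seconds(not_after)
    if now is None:
        now = time.time()
    return (expiry - now) / 86400.0

# Example: checking five days before the December 6 outage date.
ref = ssl.cert_time_to_seconds("Dec  1 00:00:00 2018 GMT")
remaining = days_until_expiry("Dec  6 00:00:00 2018 GMT", now=ref)
print(remaining)  # 5.0
```

In practice a monitoring job would run such a check against every deployed certificate and alert well before the remaining days reach zero.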

Arm Releases IoT Predictions for 2019

The end of the year brings predictions galore, and Arm has jumped on this bandwagon with its view on what it thinks will happen in the internet of things (IoT) in 2019. It also carried out a consumer survey to find out what end users think about IoT, machine learning (ML), artificial intelligence (AI), and 5G.

Here are Arm's IoT predictions:

Intelligent home goes mainstream. There'll be wider availability of IoT home products from mainstream household brands, expanding past leading consumer brands and whitegoods to encompass mainstream lighting, irrigation, heating/cooling, and other household systems. This is expected to bring increased automation and efficiency to everyday tasks.

Personalized delivery. Delivery options will start to see increasing flexibility. The combination of smartphones with GPS positioning data and the increased deployment of low-cost sensors to provide visibility and tracking of assets could allow delivery to customers anywhere, not just to specified hardcoded locations like a home or office.

Better health-care service. Deployment of sensors and better connectivity in hospitals will mean that hospital personnel will have real-time visibility into the location of their equipment and orders, bringing a better quality of service to patients and reducing the time to find critical medical equipment.

Smart cities. Drivers for the development of smart cities will mature from just those efforts seeking cost reductions (e.g., LED lights or better waste management) to better citizen engagement and more revenue streams (e.g., red-light violation detection, Wi-Fi hotspot, 5G services, smart towers, crime detection/analysis, information broadcast) with the help of advanced technologies like computer vision and ML.

Smart buildings. Smart buildings will increasingly move toward space optimization, object detection for safety/security, wayfinding, and asset tracking with the help of advanced technologies like locationing, computer vision, and ML.

Fab Tool Sales Expected to Decline in 2019

After what is expected to be a second-straight record sales year in 2018, the semiconductor equipment market is projected to decline by 4 percent next year before recovering to grow by more than 20 percent in 2020, according to the SEMI trade group.

In its year-end forecast, released on Wednesday (Dec. 12) at the SEMICON Japan trade show, SEMI estimated that fab tool sales will grow 9.7 percent this year to reach a record $62.1 billion. The forecast is consistent with other forecasts released earlier this year, despite slowing sales growth in recent months.

But the forecast calls for tool sales to decline to $59.6 billion next year before rebounding to grow 20.7 percent in 2020, reaching a new record high of $71.9 billion.
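SEMI's projections chain together as straightforward growth arithmetic (an illustrative check using the stated rates; figures in $ billions, rounded to one decimal as SEMI reports them):

```python
tools_2018 = 62.1                                # record 2018 sales, $B
tools_2019 = round(tools_2018 * (1 - 0.04), 1)   # projected 4% decline in 2019
tools_2020 = round(tools_2019 * (1 + 0.207), 1)  # projected 20.7% rebound in 2020

print(tools_2019)  # 59.6
print(tools_2020)  # 71.9
```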

SEMI projects that the market for wafer processing equipment (the largest category of semiconductor production equipment) will grow 10.2 percent this year to reach $50.2 billion. The chip test equipment market is expected to grow by 15.6 percent this year to reach $5.4 billion, while the assembly and packaging equipment market is forecast to grow 1.9 percent to reach $4 billion, said SEMI.

South Korea, paced by continued record spending by Samsung Electronics, is expected to remain the largest regional market for fab tools for a second-straight year in 2018. China is expected to leapfrog Taiwan in 2018 to become the second-largest market for chip equipment, growing at a rate of 55.7 percent, said SEMI.

For 2019, SEMI projects that South Korea, China, and Taiwan will remain the top three markets for chip equipment.

Fab Tool Billings Fall for First Time in 2 Years

Billings among North American manufacturers of semiconductor production equipment posted a year-over-year decline last month for the first time in two years as the fab tool market continues to cool after two years of white-hot growth.

The three-month rolling average of billings for North American chip equipment firms slipped to $1.94 billion in November, down 4.2 percent compared to October and down 5.3 percent compared to November 2017, according to the SEMI trade association.
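The "three-month rolling average" used throughout these reports is simply the trailing mean of the most recent three months of billings. A minimal sketch (with hypothetical monthly figures, not SEMI's actual data):

```python
def rolling_avg(values, window=3):
    """Trailing moving average over the last `window` entries."""
    return [
        sum(values[i - window + 1 : i + 1]) / window
        for i in range(window - 1, len(values))
    ]

# Hypothetical monthly billings in $ billions (not SEMI's data).
monthly = [2.10, 2.05, 1.95, 1.90, 1.97]
print([round(v, 2) for v in rolling_avg(monthly)])  # [2.03, 1.97, 1.94]
```

Averaging over three months smooths out month-to-month ordering noise, which is why trade groups prefer it to raw monthly billings.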

“For the first time in over two years, billings of North American equipment manufacturers are down relative to the same month the year before,” said Ajit Manocha, president and CEO of SEMI, in a press statement. “After reaching historical revenues earlier this year, billings activity is decelerating in line with weaker growth expectations for 2019.”

November marked the first time that fab tool billings declined on a year-over-year basis since May 2016.

Despite slowing growth in recent months, global sales of semiconductor equipment are expected to set an all-time high for the second-straight year in 2018. SEMI recently forecast that global semiconductor equipment sales would grow 9.7 percent this year to reach $62.1 billion before declining by 4 percent in 2019. The trade association also projected that tool sales would bounce back to grow more than 20 percent in 2020 to reach $71.9 billion.

5G: Out of the Lab, Onto the Street

Everyone has heard about 5G for years, and now it's just about here. Or at least, the beginning is just about here. Sure, widespread use is still a few years away, but before handsets, embedded industrial devices, connected cars, and the like can take hold, some infrastructure needs to be in place.

5G consists of many parts, the most significant being what's called 5G New Radio (5G NR). 5G is coming in stages, mostly following releases of 3GPP standards. Initially, 5G will combine with LTE to form what's called the “non-standalone” implementation wherein LTE handles the control and 5G NR handles the data. Over time and as millimeter-wave (mm-Wave) frequencies come online, 5G will migrate to a standalone version without the need for LTE.

While most of 3GPP Release 15 is complete, there will be some “drops” any day now, said Analog Devices' Thomas Cameron in “5G: Where is it and where is it going?” “Don't expect to see anything else regarding base station radios in Release 15. Late drops will likely have more to do with the higher network layers and protocols.” That's because having a working 5G NR is just the beginning. Network edges and cores will need upgrades before we see the full 5G implementation.

Testing and test equipment have closely followed 5G developments. Indeed, with the inclusion of millimeter-wave frequencies, researchers and now designers have been developing phased-array antennas that, thanks to the short wavelengths, can be quite small and will fit into handsets and industrial devices. Test equipment and techniques have begun adding millimeter-wave measurements. In “5G test gears up,” we see testing moving out of the lab and into production and network tests. Two well-known ATE companies have added 5G capabilities to their equipment. Another company known for its portable RF test equipment has engineers and technicians using its equipment in early deployments.

Another test technology for 5G is emerging: An optical standard called Optical Data Interface (ODI) is poised to connect modular instruments at data rates up to 80 Gbytes/s. That's even fast enough for 5G, Larry Desjardin reports in “Optical interfaces to address 5G test.”

Today, most of the 5G effort is focused on building the infrastructure. The billions of parts that will be needed for industrial and especially consumer devices will come on its heels. But where will these parts come from? There's more to 5G than the already-hyped modems. Devices will need other active and passive components. Hailey McKeefry reports on the 5G supply chain in “Building the Early Supply Chain Path to 5G.”

Silicon Saxony Shows How Business Will Continue Despite Brexit

Britain's exit from the European Union won't shut down the electronics industry. If anyone outside the UK hadn't noticed, the country and the greater EU are in quite a confused place at the moment. We have this thing which the nation affectionately calls "Brexit." Just over half the people who voted in the UK's 2016 referendum on whether the nation should leave or stay in the EU voted to leave; just under half voted to stay.

This has ended up completely polarizing as well as paralyzing the country's political machine. There is breaking news almost daily concerning the latest twists and turns in the saga. Last week, members of Prime Minister Theresa May's own party tabled a vote of no confidence in her, but she managed to hang on, with a small majority voting to keep her in.

Government and the mainstream media talk about nothing else, and it's come to a head now that the UK prime minister needs to figure out how she'll get an unpopular compromise deal through parliament: on the one hand there's already a significant number of ministers saying they won't back it; and on the other hand the European leaders say there's no more room for negotiation — take it or leave it. An impasse, surely?

And where does that leave the tech industry? Fortunately for the country, while the British government appears to have no sense of reality, the tech sector is getting on with business, despite the uncertainty that could be caused by the UK leaving the European Union in March 2019 if there is no deal.

Businesses and entrepreneurs are getting on with trying to figure out how they will do business together despite Brexit. While there appears to be a constant stand-off between the UK's Conservative party, fighting hard for a clean break, and key leaders from Europe (especially Germany and France), trade agencies and business organizations have been busy creating positive dialog and fostering partnerships.

The state of Saxony is one example of this. Saxon Prime Minister Michael Kretschmer led a delegation of science and business representatives to London in October to ensure that his state, known for its microelectronics and automotive industry centers of competence, could continue its relationship with the UK.

Hartmut Mangold, the state secretary in the Saxon State Ministry of Economic Affairs, said, “The UK is one of Saxony's main trading partners. If we get an unorganized Brexit, there will be customs regulations we didn't have before or plan for. We'll help businesses to figure out how to deal with those regulations; we'll help re-organize the value chains.” He added, “We are sailing through a big fog,” referring to the uncertainty created by all the political gesturing and games of one-upmanship being played by political leaders around Brexit.

Mangold was hopeful, though, saying, “We are hoping to get an organized Brexit — we hope that on both sides of the channel there will be common sense.”

