
Technology Implications for 2016


Gabe Moretti, Senior Editor

Although it is logical to expect that all sectors of the EDA industry will see improvements in 2016, some sectors will be more active than others because they are more directly connected to the market forces that fuel the consumer electronics market.

Verification

Michael Sanie, Senior Director of Verification Marketing at Synopsys, points to changes in the requirements for EDA tools:

“With the rise of the Internet of Things (IoT), Web 2.0 applications and social media comes the demand for devices that are smaller, faster and consume lower power, despite being equipped with increasing amounts of software content. As a result, SoC designs have grown tremendously in complexity. In addition, advanced verification teams are now faced with the challenge of not only reducing functional bugs, but also accelerating both software bring-up and time to market. The process of finding and fixing functional bugs and performing software bring-up involves intricate verification flows including virtual platforms, static and formal verification, simulation, emulation and finally, FPGA-based prototyping.  Up until very recently, each step in the verification flow was isolated, time-consuming and tedious, in addition to requiring several disjoint technologies and methodologies.

In 2016, the industry will continue to strive towards greater levels of verification productivity and early software bring-up.  This will be achieved through the introduction of larger, more unified platform solutions that feature a continuum of technologies enabling faster engines, native integrations and unified compile, debug, coverage and verification IP.  With this continuum of technologies being integrated into a unified platform solution, each step in the verification flow is further streamlined, and little time is spent in transitioning between steps. The rise of such platforms will continue to enable further dramatic increases in SoC verification productivity and earlier software bring-up.”
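
To make the idea of a retargetable flow concrete, here is a minimal, vendor-neutral C++ sketch; the class and function names are illustrative assumptions, not any particular platform's API. It shows stimulus kept at the transaction level behind a common interface, so the same test can drive a simulator during early bring-up and an emulator or FPGA prototype later without being rewritten:

```cpp
#include <cstdint>
#include <iostream>
#include <vector>

// Hypothetical sketch: a transaction-level stimulus interface that lets the
// same test run against different verification engines.
struct BusTransaction {
    uint64_t address;
    uint32_t data;
    bool     is_write;
};

class VerificationBackend {                 // common engine abstraction
public:
    virtual ~VerificationBackend() = default;
    virtual void send(const BusTransaction& t) = 0;
    virtual const char* name() const = 0;
};

class SimulatorBackend : public VerificationBackend {
public:
    void send(const BusTransaction& t) override {
        std::cout << name() << ": " << (t.is_write ? "write" : "read")
                  << " @0x" << std::hex << t.address << std::dec << "\n";
    }
    const char* name() const override { return "RTL simulator"; }
};

class EmulatorBackend : public VerificationBackend {
public:
    void send(const BusTransaction& t) override {
        // In a real flow this call would cross a transactor into the emulator.
        std::cout << name() << ": " << (t.is_write ? "write" : "read")
                  << " @0x" << std::hex << t.address << std::dec << "\n";
    }
    const char* name() const override { return "emulator"; }
};

// The test itself is engine-agnostic: moving from simulation to emulation
// only means swapping the backend, not rewriting stimulus.
void run_smoke_test(VerificationBackend& engine) {
    std::vector<BusTransaction> test = {
        {0x1000, 0xDEADBEEF, true},
        {0x1000, 0x0,        false},
    };
    for (const auto& t : test) engine.send(t);
}

int main() {
    SimulatorBackend sim;
    EmulatorBackend  emu;
    run_smoke_test(sim);   // early RTL bring-up
    run_smoke_test(emu);   // same test reused for SoC-level runs
}
```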

Semiconductor Processes

The persistent effort by EDA companies to follow the predictions of Gordon Moore, commonly known as Moore’s Law, is continuing in spite of the ever-growing optical challenges posed by current lithography processes.

Vassilios Gerousis, Distinguished Engineer at Cadence, points out: “While few expect 10nm production, we will definitely see 10nm test chip products this year. Some will even hit production timelines and become actual product designs. At the same time, we will see more products go into production at the 14nm and 16nm process nodes. Designers are definitely migrating from 28nm, and even skipping over 20nm.”

Zhihong Liu, Executive Chairman of ProPlus Design Solutions, also thinks that advanced process nodes will be utilized this year. “In 2016, we’ll see more designs at advanced process technologies such as FinFET at 16/14nm, and even trial projects at 10nm and 7nm. It becomes necessary for the EDA community to develop one common tool platform for process development, CAD and circuit design to help designers evaluate, select and implement new processes. However, such tool platforms did not exist before. An ongoing concern within the transistor-level design community is EDA tools such as FastSPICE simulators for verification and signoff that are not as accurate or reliable as they should be. It’s becoming a critical need as project teams move to advanced nodes and larger scale designs that require higher accuracy and incur greater risk and higher costs to fabricate the chip.”

3DIC and Thermal

Michael Buehler-Garcia points to the increased use of 3D-ICs in design. “3D IC packaging has historically been the domain of packaging and OSATs. Newer offerings are driving 3D implementation from the chip design perspective. With this change, chip design techniques are being used to analyze and verify the chip stack to ensure we eliminate integration issues, especially considering that chips in a 3D stack often come from different foundries, and are verified using different processes. In 2016, we project increased ‘chip out’ physical and circuit verification that can be performed independently on each die, and at the interfacing level (die-to-die, die-to-package, etc.). In concert with this change, we are seeing a customer-driven desire for a standardized verification process – used by chip design companies and assembly houses to ensure the manufacturability and performance of IC packages – that will significantly reduce risk of package failure, while also reducing turnaround time for both the component providers and assembly houses. By implementing a repeatable, industry-wide supported process, all participants can improve both their first-time success rate and overall product quality. Initial work with customers and package assembly houses has proven the feasibility of this approach. As we all know, standards take a long time, but 2016 is the year to start the process.”
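
As a toy illustration of what an “interfacing level” check involves, the C++ fragment below cross-checks the microbump-to-net maps exported by two dies in a stack and flags missing or mismatched connections. This is a conceptual sketch under assumed data structures, not any vendor’s flow:

```cpp
#include <iostream>
#include <map>
#include <string>

// Illustrative only: a toy die-to-die interface check. Each die exports a map
// from microbump name to the logical net it carries; the check flags bumps
// that are missing or connected to differently named nets on the two dies.
using InterfaceMap = std::map<std::string, std::string>;  // bump -> net

int check_die_to_die(const InterfaceMap& die_a, const InterfaceMap& die_b) {
    int violations = 0;
    for (const auto& [bump, net_a] : die_a) {
        auto it = die_b.find(bump);
        if (it == die_b.end()) {
            std::cout << "Missing on die B: " << bump << "\n";
            ++violations;
        } else if (it->second != net_a) {
            std::cout << "Net mismatch on " << bump << ": "
                      << net_a << " vs " << it->second << "\n";
            ++violations;
        }
    }
    return violations;
}

int main() {
    InterfaceMap logic_die  = {{"BUMP_0", "ddr_clk"}, {"BUMP_1", "ddr_dq0"}};
    InterfaceMap memory_die = {{"BUMP_0", "ddr_clk"}, {"BUMP_1", "ddr_dq1"}};
    std::cout << "Violations: "
              << check_die_to_die(logic_die, memory_die) << "\n";
}
```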

Dr. John Parry, Electronics Industry Vertical Manager, Mentor Graphics Mechanical Analysis Division, adds that thermal considerations are also increasing in importance in system design. “The trend we see in the chip thermal-mechanical space is a stronger need for qualifying new materials, methods, packaging technologies and manufacturing processes. Our advanced thermal design software, thermal characterization and active power cycling hardware are helping customers to meet this expanding need.”

Emulation

One of the areas showing significant growth is hardware-based emulation. Lauro Rizzatti, a noted verification expert, stated: “In 2016 and beyond, new generations of hardware emulators will be brought to market. They will have added capabilities that are more powerful than those currently available, and will support new applications.

Hardware emulation will continue to be the foundation of a verification strategy in 2016. This isn’t so much a prediction as a fact, as standard designs routinely exceed the 100-million gate mark and processor, graphics and networking designs approach one-billion gates. Hardware emulation is the only verification tool able to take on and verify these oversized, complex designs. It’s also true as new capabilities enable emulation to be a datacenter resource, giving worldwide project teams access to this remarkable verification tool. In 2016, this will become the new normal as project teams better leverage their investment in hardware emulation and work to avoid risk.”

The importance of emulation was also underscored by Jean-Marie Brunet, Director of Marketing, Mentor Graphics Emulation Division, especially in the area of hardware/software co-verification and software debug. “Emulation is going mainstream. In 2016, its use will continue to grow faster than the overall EDA industry. Customers are starting to look beyond traditional emulation criteria – such as capacity, speed, compile time, and easy hardware and software debugging – to an expanding list of new criteria: transaction-based verification, multi-user / multi-project access, live and off-line embedded software development and validation, and data-center operation as a centrally managed resource rather than a standalone box in the testing lab. Enterprise management applications help automate emulation by maximizing uptime and supporting intelligent job queuing. This approach not only balances the workload, but also shuffles the queued workload to prioritize critical jobs.

Software applications will continue to shape emulation’s future. For example, emulation is driving a methodology shift in the way power is analyzed and measured. Now designers can boot an OS and run real software applications on SoC designs in an emulator. Real-time switching activity information generated during emulation runs is passed to power analysis tools where power issues can be evaluated. In 2016 there will be a steady stream of new emulator applications aligning with customers’ verification needs such as design for test, coverage closure and visualization.”
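
The power-analysis handoff Brunet describes amounts to turning per-net switching activity from the emulator into an energy estimate. A back-of-the-envelope C++ sketch, with made-up values and the familiar P = a·C·V²·f approximation, looks like this:

```cpp
#include <iostream>
#include <vector>

// Rough sketch of how emulation-generated switching activity feeds a power
// estimate: per net, dynamic power is approximated as P = activity * C * V^2 * f.
// All numbers below are illustrative, not measured data.
struct NetActivity {
    double toggle_rate;       // average toggles per clock cycle (from emulation run)
    double capacitance_farad; // effective switched capacitance of the net
};

double dynamic_power_watts(const std::vector<NetActivity>& nets,
                           double vdd_volts, double freq_hz) {
    double total = 0.0;
    for (const auto& n : nets)
        total += n.toggle_rate * n.capacitance_farad * vdd_volts * vdd_volts * freq_hz;
    return total;
}

int main() {
    std::vector<NetActivity> nets = {
        {0.15, 5e-15},   // lightly toggling net, 5 fF
        {0.80, 20e-15},  // busy bus bit, 20 fF
    };
    std::cout << "Estimated dynamic power: "
              << dynamic_power_watts(nets, 0.9, 1.0e9) * 1e6 << " uW\n";
}
```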

Power Management

Last year Sonics introduced a hardware-based approach to power management. Grant Pierce, the company’s CEO, believes that this year the technology should see significant acceptance. “SoC designs have hit the power wall and the time for dynamic power management solutions is here. This marks the beginning of a multi-year, SoC development trend where system architects and sub-system engineers must consider power as a primary design constraint. Reducing power consumption is now every electrical engineer’s concern–both to enable product success as well as to support reducing greenhouse gas emissions as a global issue. SoC designers can no longer afford to ignore power management without suffering serious consequences, especially in untethered applications where users expect all-day, all-month or even all-year battery life, even with increasing functionality and seemingly constant connectivity.

The majority of SoC design teams, which don’t have the luxury of employing dedicated power engineering staff, will look to purchase third-party IP solutions that deliver orders of magnitude greater energy savings than traditional software-based approaches to power management and control. While designs for mobile phones, tablets, and the application processors that operate them grow more power sensitive with each successive generation of highly capable devices, Sonics expects dynamic, hardware-based power management solutions to be extremely attractive to a broad set of designers building products for automotive, machine vision, smart TV, data center, and IoT markets.


Figure 1. Representation of an SoC partitioned into many Power Grains with a power controller synthesized to simultaneously control each grain. The zoomed view shows the local control element on each grain – highlighting the modular and distributed construction of the ICE-Grain architecture.

As our CTO Drew Wingard has stated, hardware-based power management solutions offer distinct advantages over software-based approaches in terms of speed and parallelism. SoC designers incorporating this approach can better manage the distributed nature of their heterogeneous chips and decentralize the power management to support finer granularity of power control. They can harvest shorter periods of idle time more effectively, which means increasing portions of their chips can stay turned off longer. The bottom line is that these solutions will achieve substantial energy savings to benefit both product and societal requirements.”
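
The C++ sketch below is purely conceptual and is not Sonics’ ICE-Grain implementation; it only illustrates the principle of a small per-grain controller that counts idle cycles in hardware and steps its grain into progressively deeper power states far faster than a software governor could react:

```cpp
#include <iostream>

// Conceptual per-grain power controller (illustrative thresholds and states).
enum class GrainState { Active, ClockGated, PowerGated };

class GrainController {
public:
    // Thresholds, in cycles, are hypothetical tuning parameters.
    GrainController(int gate_after, int power_off_after)
        : gate_after_(gate_after), power_off_after_(power_off_after) {}

    GrainState tick(bool grain_is_busy) {
        if (grain_is_busy) {
            idle_cycles_ = 0;
            state_ = GrainState::Active;        // wake immediately on demand
        } else {
            ++idle_cycles_;
            if (idle_cycles_ >= power_off_after_)    state_ = GrainState::PowerGated;
            else if (idle_cycles_ >= gate_after_)    state_ = GrainState::ClockGated;
        }
        return state_;
    }

private:
    int idle_cycles_ = 0;
    int gate_after_;
    int power_off_after_;
    GrainState state_ = GrainState::Active;
};

int main() {
    GrainController ctrl(/*gate_after=*/4, /*power_off_after=*/16);
    // Short idle bursts are harvested as clock gating; long ones as power gating.
    for (int cycle = 0; cycle < 20; ++cycle) {
        GrainState s = ctrl.tick(/*grain_is_busy=*/cycle < 2);
        std::cout << "cycle " << cycle << ": state "
                  << static_cast<int>(s) << "\n";
    }
}
```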

High Level Synthesis

Raik Brinkmann, president and CEO of OneSpin Solutions, noted:

“High-level synthesis will get more traction because the demand is there. As more designs get comfortable using the SystemC/C++ level, demand for EDA tools supporting the task will increase, including formal verification. Additionally, algorithmic design will be driven further in 2016 by applications on FPGAs to reduce power and increase performance over GPU. That suggests FPGA implementation and verification flows will require more automation to improve turnaround time, a viable opportunity for EDA vendors. Finally, verification challenges on the hardware/firmware interface will increase as more complex blocks are generated and need firmware to access and drive their functions.”
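
For readers less familiar with high-level synthesis, the kind of C++ such tools consume looks roughly like the tool-agnostic FIR filter below: fixed-size arrays, static loop bounds and integer arithmetic that a synthesis tool can map onto registers and unrolled datapath logic. Real flows add vendor-specific pragmas; this sketch assumes none:

```cpp
#include <array>
#include <cstdint>
#include <iostream>

// Generic, tool-agnostic illustration of HLS-friendly C++.
constexpr int kTaps = 4;

int32_t fir_filter(int16_t sample,
                   const std::array<int16_t, kTaps>& coeffs,
                   std::array<int16_t, kTaps>& shift_reg) {
    // The shift register models the tapped delay line that HLS turns into registers.
    for (int i = kTaps - 1; i > 0; --i) shift_reg[i] = shift_reg[i - 1];
    shift_reg[0] = sample;

    int32_t acc = 0;
    for (int i = 0; i < kTaps; ++i)       // fixed bound: unrollable by the tool
        acc += static_cast<int32_t>(shift_reg[i]) * coeffs[i];
    return acc;
}

int main() {
    std::array<int16_t, kTaps> coeffs     = {1, 2, 2, 1};
    std::array<int16_t, kTaps> delay_line = {0, 0, 0, 0};
    for (int x : {10, 20, 30, 40})
        std::cout << fir_filter(static_cast<int16_t>(x), coeffs, delay_line) << "\n";
}
```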

As it has done throughout its existence, the EDA industry will continue to be the engine that propels the growth of the electronics industry. The electronics industry has shown a propensity to think ahead and prepare itself to offer new products as soon as demand materializes, so I expect that the worst that can happen is a mild slowdown in the demand for EDA tools in 2016.

