Synopsys Co-CEO Aart de Geus held the keynote address at the annual Hot Chips conference, which took place virtually on August 23-24. de Geus announced that the company has taken its Reinforcement-Learning-based design technology two steps further, adding optimization for a chip's architectural structure and for end-to-end application behavior. de Geus shared data from applying this second-generation AI to real design cases, demonstrating an astonishing 28% power reduction – more than a full manufacturing technology node's worth of scaling – achieved by managing the exploration of the many choice-points a chip design team could consider across this massive search space. Design teams could opt for higher performance (frequency), lower cost (area), lower power, or a combination of all three, depending on business and market objectives. And like the first release of DSO.ai, this technology would save significant design time. de Geus envisions a future state of the art that could cut design time from many months to just weeks.

From Software-Defined to Software-Designed Hardware

Software is eating the world, but it is AI that is now eating the software. As more and more software applications become data-driven, and with neural networks already crossing the trillion-neuron mark, how is the semiconductor industry going to deliver the petaflop-months of compute required for AI, from the data center to the edge? Software-defined hardware has been proposed by industry luminaries as an elegant solution. It is based on the premise that chips could become personalized to the needs of specific applications, putting software in direct control of the instruction set architecture (ISA), the chip structure (microarchitecture), and the implementation method (silicon technology). Personalizing chips could deliver 1,000X better performance and energy efficiency, but here comes the problem: it currently takes 2-3 years to put a new idea into an actual socket. AI could be the answer. AI-driven design systems like DSO.ai have already delivered the productivity to accelerate months-long design tasks down to days. With more global, system-level optimization on its way, we could gain the ability to create new, personalized chips in just weeks.

We are just beginning to imagine a future where AI will be used to design chips, including other AI chips, that are far more efficient and powerful. More AI in the design process could indeed enable the expansion of the software-defined hardware concept into software-designed hardware, making it both possible and economically attractive to deliver many flavors of acceleration to match the needs of the most intricate data-driven applications. Google and NVIDIA are researching a similar approach, and now Samsung has announced that it has working silicon back. The impacts will be profound, and all chip designers should take note of the momentum this movement is generating. It looks to us that Synopsys has the pole position to benefit from this trend.
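To make the frequency/area/power trade-off concrete, here is a deliberately tiny sketch of design-space exploration. Everything in it is hypothetical: the knobs, the cost model, and the numbers are invented for illustration, and real tools like DSO.ai use reinforcement learning precisely because the true space is far too large to enumerate. The toy version simply exhausts a three-knob space and shows how different objective weightings land on different design points.

```python
import itertools

# Hypothetical design knobs a flow might tune (names and values invented
# for illustration; a real design space has vastly more choice-points).
KNOBS = {
    "vt_mix": [0.0, 0.25, 0.5, 0.75, 1.0],      # share of low-Vt (fast, leaky) cells
    "target_density": [0.55, 0.65, 0.75, 0.85],  # placement density target
    "clock_margin": [0.02, 0.05, 0.10],          # timing margin fraction
}

def evaluate(cfg):
    """Stand-in for a slow synthesis/place-and-route run that reports
    normalized power, frequency, and area for one configuration.
    The linear model below is made up purely for demonstration."""
    power = 1.0 + 0.8 * cfg["vt_mix"] - 0.2 * cfg["clock_margin"]
    freq = 1.0 + 0.5 * cfg["vt_mix"] - 2.0 * cfg["clock_margin"]
    area = 1.0 / cfg["target_density"]
    return {"power": power, "freq": freq, "area": area}

def score(metrics, weights):
    """Scalarize PPA: reward frequency, penalize power and area.
    The weights encode the business objective the design team picks."""
    return (weights["freq"] * metrics["freq"]
            - weights["power"] * metrics["power"]
            - weights["area"] * metrics["area"])

def search(weights):
    """Exhaustively scan the toy space for the best-scoring config."""
    best_cfg, best_score = None, float("-inf")
    for combo in itertools.product(*KNOBS.values()):
        cfg = dict(zip(KNOBS.keys(), combo))
        s = score(evaluate(cfg), weights)
        if s > best_score:
            best_cfg, best_score = cfg, s
    return best_cfg, best_score

# A power-focused objective and a frequency-focused one pick different
# points from the same design space.
low_power_cfg, _ = search({"power": 3.0, "freq": 1.0, "area": 1.0})
high_freq_cfg, _ = search({"power": 0.5, "freq": 3.0, "area": 1.0})
print("power-focused :", low_power_cfg)   # prefers vt_mix = 0.0
print("freq-focused  :", high_freq_cfg)   # prefers vt_mix = 1.0
```

Under the power-heavy weighting, the leakage penalty of low-Vt cells dominates their speed benefit, so the search avoids them; under the frequency-heavy weighting, the same knob flips the other way. Reinforcement learning replaces the exhaustive loop with a learned policy over where to sample next, which is what makes such exploration tractable at real design scale.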