MathWorks and Altera Harness AI to Drive Faster Development of 5G and 6G Wireless Systems

Collaboration Enhances Efficiency and Reduces Costs for 5G and 6G Networks Using AI-Based Autoencoders to Compress Channel State Information Data

Natick, MA - (4 Mar 2025)

MathWorks, the leading developer of mathematical computing software, and Altera, an Intel company, today announced a collaboration to accelerate wireless development for Altera FPGAs. The collaboration enables wireless systems engineers to use AI-based autoencoders to compress Channel State Information (CSI) data, significantly reducing fronthaul traffic and bandwidth requirements. Engineers working on 5G and 6G wireless communications systems can now preserve user data integrity and maintain system reliability and performance standards while reducing costs.
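As a rough illustration of the underlying idea, the sketch below builds a small fully connected autoencoder in MATLAB with Deep Learning Toolbox: the encoder compresses each CSI sample to a short code suitable for fronthaul transport, and the decoder reconstructs it. The array csiTrain, the input dimensions, and the layer sizes are hypothetical placeholders for illustration only, not part of the announced workflow.

```matlab
% Minimal CSI autoencoder sketch; dataset, dimensions, and layer sizes are illustrative.
inputSize  = [72 32 2];      % hypothetical: subcarriers x antennas x (real, imag)
codeLength = 128;            % length of the compressed CSI code sent over the fronthaul

layers = [
    imageInputLayer(inputSize, "Normalization","none")
    fullyConnectedLayer(512)
    reluLayer
    fullyConnectedLayer(codeLength)          % encoder bottleneck: compressed CSI code
    reluLayer
    fullyConnectedLayer(512)
    reluLayer
    fullyConnectedLayer(prod(inputSize))     % decoder output: reconstructed CSI
    regressionLayer];

options = trainingOptions("adam", ...
    "MaxEpochs",30, "MiniBatchSize",128, "Shuffle","every-epoch", "Verbose",false);

% csiTrain is a hypothetical 72x32x2xN array of CSI samples; the autoencoder is
% trained to reconstruct its own input, so the targets are the flattened samples.
targets = reshape(csiTrain, [], size(csiTrain,4)).';   % N x prod(inputSize)
net = trainNetwork(csiTrain, targets, layers, options);
```

In practice, only the encoder half would run on the radio-unit side to produce the compressed code, with the decoder reconstructing the CSI downstream; the split shown here is a simplification.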

“The collaboration between MathWorks and Altera enables organizations to harness the power of AI for a wide range of 5G and 6G wireless communications applications, from 5G RAN to advanced driver-assistance systems (ADAS),” said Mike Fitton, vice president and GM, Vertical Markets at Altera. “By utilizing our FPGA AI Suite and MathWorks software, developers can streamline their workflow from algorithm design to hardware implementation, ensuring their AI-based wireless systems meet the rigorous demands of modern applications.”

MathWorks offers a comprehensive tool suite that enhances AI and wireless development, particularly for Altera FPGAs. Deep Learning HDL Toolbox™ addresses the needs of engineers implementing deep learning networks on FPGA hardware. Leveraging the capabilities of HDL Coder™, the toolbox enables users to customize, build, and deploy an efficient, high-performance Deep Learning Processor IP Core. By supporting standard networks and layers, it improves both performance and flexibility in wireless applications.
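As one hedged sketch of that flow, the snippet below uses the documented dlhdl.Workflow interface from Deep Learning HDL Toolbox to compile and deploy a trained network (for example, the autoencoder sketched above) to an Intel Arria 10 SoC board using a prebuilt bitstream. The board, bitstream name, and variable names are illustrative, and the network is assumed to use only layers supported by the Deep Learning Processor IP.

```matlab
% Deployment sketch with Deep Learning HDL Toolbox (Altera/Intel target).
% Assumes 'net' is a trained network and an Arria 10 SoC development kit is connected.
hTarget = dlhdl.Target('Intel', 'Interface', 'JTAG');

hW = dlhdl.Workflow( ...
    'Network',   net, ...
    'Bitstream', 'arria10soc_single', ...   % prebuilt single-precision Arria 10 SoC bitstream
    'Target',    hTarget);

hW.compile;                     % map the network onto the Deep Learning Processor IP core
hW.deploy;                      % program the FPGA and load the network weights
score = hW.predict(csiSample);  % run inference on one (hypothetical) CSI sample from MATLAB
```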

"AI-enabled compression is a powerful technology for the telecommunications industry," said MathWorks Principal Product Manager Houman Zarrinkoub. “MathWorks software offers a robust foundation for AI and wireless development. By integrating our tools with Altera's FPGA technologies, wireless engineers can efficiently create high-performance AI applications and advanced 5G and 6G wireless systems."

FPGA AI Suite offers push-button generation of custom AI inference accelerator IP on Altera FPGAs using the OpenVINO toolkit, utilizing pre-trained AI models from popular industry frameworks. It also helps FPGA developers integrate the AI inference accelerator IP seamlessly into their FPGA designs using best-in-class Quartus® Prime Software FPGA flows. Combining Deep Learning Toolbox and the OpenVINO toolkit creates a streamlined path for developers to optimize AI inference on Altera FPGAs.
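One possible handoff from MATLAB into that flow, sketched below as an assumption rather than the announced workflow, is to export the trained network to ONNX (this requires the Deep Learning Toolbox Converter for ONNX Model Format support package); the OpenVINO toolkit can then convert the ONNX file into its intermediate representation for FPGA AI Suite to compile into accelerator IP. The file name is illustrative.

```matlab
% Hypothetical handoff to the OpenVINO / FPGA AI Suite flow: export the trained
% MATLAB network to ONNX so the OpenVINO toolkit can ingest it.
exportONNXNetwork(net, "csi_autoencoder.onnx");
```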

Wireless engineers interested in getting started with deep learning deployment on Altera FPGA devices can visit mathworks.com/help/deep-learning-hdl/inteldeeplearning/ug/get-started-with-deep-learning-fpga-deployment-on-intel-arria10-soc.html. For more information on how MathWorks software can accelerate wireless and AI development for Altera FPGAs, visit Simulink®, 5G Toolbox™, Deep Learning Toolbox, HDL Coder, and Deep Learning HDL Toolbox™.

About Altera

Altera® is a leader in field-programmable gate array (FPGA) technology, providing highly customizable solutions for applications across industries, including embedded, edge, network, enterprise, and cloud. The company's latest FPGA offerings include the Agilex™ 5 series, the industry's first FPGAs with fabric infused with AI tensor blocks for increased compute density in embedded and edge computing and accelerated AI workloads.

Altera® delivers a broad portfolio of custom logic solutions, including FPGAs, SoCs, and CPLDs, together with software tools, intellectual property (IP), and embedded processors, offering versatile and efficient solutions for accelerating AI tasks, particularly in applications where customization, determinism, low latency, tight integration, and energy efficiency are critical. Long product life cycles and reprogrammability help future-proof AI designs against fast-evolving AI trends.

FPGA AI Suite provides a complete software solution to rapidly convert and optimize trained AI models into custom AI inference accelerators and deploy them on Altera FPGAs, enabling seamless collaboration between data scientists and FPGA engineers with easy-to-use tools. For additional information, visit intel.com/fpgai.

About MathWorks

MathWorks is the leading developer of mathematical computing software. MATLAB, the language of engineers and scientists, is a programming environment for algorithm development, data analysis, visualization, and numeric computation. Simulink is a block diagram environment for simulation and Model-Based Design of multidomain and embedded engineering systems. Engineers and scientists worldwide rely on these products to accelerate the pace of discovery, innovation, and development in automotive, aerospace, communications, electronics, industrial automation, and other industries. MATLAB and Simulink are fundamental teaching and research tools in the world's top universities and learning institutions. Founded in 1984, MathWorks employs more than 6,500 people in 34 offices around the world, with headquarters in Natick, Massachusetts, USA. For additional information, visit mathworks.com.

MATLAB and Simulink are registered trademarks of The MathWorks, Inc. See mathworks.com/trademarks for a list of additional trademarks. Other product or brand names may be trademarks or registered trademarks of their respective holders.