Ambarella's New N1 SoC Supports Up to 34 Billion-Parameter, Multi-Modal Large Language Models With Low Power Consumption, Enabling Generative AI for Edge Endpoint Devices

Ambarella_N1-LLM Press Image

Ambarella is demonstrating multi-modal LLMs running on its new N1 SoC series at a fraction of the power per inference of leading GPU solutions. The company aims to bring generative AI, a transformative technology that first appeared in servers due to the large processing power it requires, to edge endpoint devices and on-premise hardware, for applications such as video security analysis, robotics, and a wide range of industrial uses.

Format: PNG

Source: Ambarella