AI Technology

Meeting the Industry Challenge for Fast Inference Hardware

  • On-premises or private data centers for applications using security video, medical images, or other sensitive data (also reducing the cost of bandwidth to the cloud)
  • “Edge analytics servers” run inference on multiple streams of data (cameras, IoT sensors, etc.)
  • Currently addressed by x86 PC/server hardware with GPU add-in cards
  • Power consumption and peak performance are less important than cost and form factor

Innovative Cutting-Edge Technology

Centaur’s microprocessor-design technology includes both a high-performance x86 core and the industry’s first integrated AI Coprocessor for x86 systems.

  • The x86 microprocessor cores deliver high instructions per clock (IPC) for server-class applications and support the latest x86 extensions, such as AVX-512 and new instructions for fast transfer of AI data.
  • The AI Coprocessor is a clean-sheet processor designed to deliver high performance and efficiency on deep-learning applications, freeing up the x86 cores for general-purpose computing.

Centaur’s AI Coprocessor can classify an image in less than 330 microseconds while providing inference throughput equivalent to 23 high-end CPU cores from other x86 vendors. Since Centaur’s technology is fully compatible with standard PCs and servers, the integrated AI Coprocessor can be augmented with off-chip GPUs or other AI accelerators for system-level scalability.
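As a rough back-of-envelope illustration (this arithmetic is ours, not a figure from the source), a per-image latency of 330 microseconds corresponds to roughly 3,000 images per second if images were processed strictly one at a time; real throughput additionally depends on batching and pipelining:

```python
# Back-of-envelope only: converts the quoted per-image latency into an
# implied serial-processing rate. Actual throughput depends on batching,
# pipelining, and the workload, none of which are specified in the source.
latency_s = 330e-6                 # per-image classification latency
images_per_sec = 1 / latency_s
print(round(images_per_sec))       # about 3030 images/second
```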

The ISC Trade Show Demo

Attendees at the ISC East trade show in NYC saw Centaur’s new technology up close for the first time. The demo showcased video analytics using Centaur’s reference system with x86-based network-video-recording (NVR) software from Qvis Labs. In addition to conventional, real-time object detection and classification, Centaur was the only vendor at the show to highlight leading-edge applications such as semantic segmentation (pixel-level image classification) and a new technique for human pose estimation (“stick figures”).
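To make the parenthetical concrete: semantic segmentation assigns a class label to every pixel rather than one label per image. The toy sketch below (our illustration; the demo's actual model and framework are not described here) shows the final step, taking an argmax over per-pixel class scores:

```python
import numpy as np

# Toy sketch of semantic segmentation as pixel-level classification:
# given per-pixel class scores of shape (H, W, C), the predicted label
# map is the argmax over the class dimension, one class ID per pixel.
# Random scores stand in for a real network's output.
H, W, C = 4, 4, 3                      # tiny "image", 3 classes
rng = np.random.default_rng(0)
scores = rng.random((H, W, C))         # stand-in for network output
label_map = scores.argmax(axis=-1)     # (H, W) array of class IDs
print(label_map.shape)                 # prints (4, 4)
```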

Voted One of the Great Places to Work of 2020
