Posted December 19, 2025
Research Note

Inference Hardware on the Edge

We recently conducted a study comparing multiple LLM benchmarking frameworks, including the STAC-AI™. The report is now available to STAC Insights subscribers.

Reports

The reports and detailed configuration information are now available to eligible subscribers at the links above. To learn more about subscription options, please contact us.


This research note examines the rapidly evolving ecosystem of inference hardware for edge deployment in capital markets.

We explore the available technologies, highlighting trends and offering a useful categorization of products. We also discuss how the different architectures meet the demands of latency (inference speed) and throughput (inference volume), as well as the constraints imposed by edge deployment.
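To make the latency/throughput tension concrete, here is a toy sketch (not from the report; the function and all numbers are hypothetical) of the trade-off that request batching creates on an inference accelerator: larger batches raise aggregate throughput but also raise the latency each individual request experiences.

```python
# Toy model of the batching trade-off on an inference accelerator.
# All timing constants are illustrative assumptions, not measured values.

def batch_tradeoff(batch_size: int,
                   base_step_ms: float = 10.0,
                   per_item_ms: float = 2.0) -> tuple[float, float]:
    """Return (per-request latency in ms, throughput in requests/sec)
    for one forward step that processes `batch_size` requests together."""
    step_ms = base_step_ms + per_item_ms * batch_size  # step time grows with batch size
    latency_ms = step_ms                               # every request waits for the full step
    throughput = batch_size / (step_ms / 1000.0)       # requests completed per second
    return latency_ms, throughput

if __name__ == "__main__":
    for b in (1, 4, 16, 64):
        lat, thr = batch_tradeoff(b)
        print(f"batch={b:3d}  latency={lat:6.1f} ms  throughput={thr:8.1f} req/s")
```

Under these assumed constants, moving from batch size 1 to 64 multiplies throughput severalfold while making each request wait an order of magnitude longer, which is why edge deployments that prioritize responsiveness often run at small batch sizes.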

The report seeks to aid decision-makers in navigating the complexities of deploying inference hardware on the edge.
