Another successful round of STAC Summits has concluded. These events in three cities covered artificial intelligence; C++ optimization; time sync measurement standards; new ways to scale up Python analytics; architectures for strategy backtesting; the realities of running high-performance workloads in public and private clouds; the future of low-latency, long-distance communication; low-latency FPGA solutions; and much more.
Each year, the capital markets get more algorithmic. Whether it's trading, investing, or managing risk and collateral, the pressures of competition and regulation are fueling a need for increasingly massive computations and increasingly huge amounts of data.
For this reason, large firms (and even some smaller ones) have datacenters packed with tens or hundreds of thousands of cores dedicated to quantitative analysis, along with petabytes of storage. Many of them are also starting to leverage cloud technologies.
Public and private cloud are high on the agendas of capital markets firms (often as explicit hybrid cloud strategies). The massive scaling and innovative pricing of public cloud platforms can substantially accelerate time to market and reduce costs. Private cloud frameworks like Kubernetes promise greater business agility, as well as cost savings through higher utilization.
STAC's working definition of artificial intelligence (AI) encompasses both neural approaches such as deep learning and non-neural machine learning approaches such as gradient boosted trees and genetic algorithms.
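Purely as an illustration of the two families that definition covers (this sketch is not from any summit material, and the dataset and parameters are arbitrary placeholders), the following scikit-learn snippet fits a non-neural learner (gradient boosted trees) and a small neural network to the same synthetic data:

```python
# Illustrative contrast between non-neural ML (gradient boosted trees)
# and a neural approach, using scikit-learn on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic classification data stands in for whatever features a firm might use.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Non-neural machine learning: an ensemble of decision trees fit by boosting.
gbt = GradientBoostingClassifier().fit(X_train, y_train)

# Neural approach: a small feed-forward network (a shallow cousin of deep learning).
nn = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=500).fit(X_train, y_train)

print("gradient boosted trees accuracy:", gbt.score(X_test, y_test))
print("neural network accuracy:", nn.score(X_test, y_test))
```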
At the 5 June 2017 STAC Summit in New York, our panelists discussed "Wall Street and the race to IT agility", a session focused on public and private cloud. The panelists were: