Today’s AI market is dominated by big tech companies. But the convergence of blockchain technologies, on-device AI, and edge computing into a new AI modelling paradigm called federated learning signals a future that might be more diverse and democratic.

Santanu Bhattacharya explains how federated learning can change AI in this report from Factor Daily:

Standard machine learning models require centralizing the training data on one machine or in a data center. For example, when an ecommerce startup wants to develop a model to understand its consumers’ propensity to purchase products, it runs the models on the data collected from its website or app. Such data may include the time spent on a particular product page, products bought together, products browsed but not purchased, etc. Typically, up to thousands of data points are collected on every user over a period of time. Such data are parsed and sent over to a centralized data center or machines for computation.

Recently, a new approach was proposed for models trained from user interactions with mobile devices: it is called Federated Learning. Federated learning distributes the machine learning process over to the edge. It enables mobile phones to collaboratively learn a shared model using the training data on each device, while keeping that data on the device. It decouples the ability to do machine learning from the need to store the data in the cloud.
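The core loop described above can be sketched in a few lines. This is a minimal federated averaging illustration, not any vendor's actual implementation: a linear model trained with plain gradient descent, where each "device" trains on its own private samples and a server only ever sees and averages the resulting weights. All names and the toy dataset are illustrative.

```python
# Minimal federated averaging (FedAvg) sketch for a 1-D linear model.
# Raw data never leaves a device; only trained weights are "uploaded".
import random

def local_update(weights, data, lr=0.1, epochs=5):
    """Train a copy of the shared weights on one device's local data."""
    w, b = weights
    for _ in range(epochs):
        for x, y in data:
            err = (w * x + b) - y      # prediction error on one sample
            w -= lr * err * x          # plain per-sample gradient step
            b -= lr * err
    return (w, b)

def federated_round(global_weights, device_datasets):
    """One round: every device trains locally; the server averages weights."""
    updates = [local_update(global_weights, d) for d in device_datasets]
    n = len(updates)
    w = sum(u[0] for u in updates) / n
    b = sum(u[1] for u in updates) / n
    return (w, b)

# Three "devices", each holding private samples of y = 2x + 1.
random.seed(0)
devices = [[(x, 2 * x + 1) for x in (random.random() for _ in range(20))]
           for _ in range(3)]

weights = (0.0, 0.0)
for _ in range(50):
    weights = federated_round(weights, devices)
print(weights)  # converges toward the true parameters (2.0, 1.0)
```

Real deployments weight each device's update by its sample count and add compression and dropout-tolerance, but the averaging step shown here is the essential idea.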

[…]

Federated learning allows for faster deployment and testing of smarter models, lower latency, and less power consumption, all while ensuring privacy. Also, in addition to providing an update to the shared model, the improved (local) model on your phone can also be used immediately, powering experiences personalized by the way you use your phone.

In the next few years, model building and computation on the edge, based on federated learning and secured with homomorphic encryption, will make significant progress. As one billion-plus smartphones equipped with AI chips and significant computing power reach the market in the next three to five years, many ML models will be able to run locally on these mobile devices. Distributing the heavy-duty analytics and computations over smartphones “on the edge”, as opposed to central computing facilities, will drastically reduce the time to develop data products such as hyper-personalized recommendation engines, e-commerce pricing engines, etc. Enterprises will embrace a distributed machine learning model building framework to take advantage of faster model deployment, respond more quickly to fast-changing consumer behavior, and vastly reduce costs.
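To make the "secured" part of that prediction concrete, here is a simplified pairwise-masking sketch of secure aggregation: the server can recover only the sum of the device updates, never any individual update. This illustrates the goal, not the report's actual mechanism; production systems use homomorphic encryption or full secure-aggregation protocols, and all names and values here are illustrative.

```python
# Pairwise-masking secure aggregation sketch. Each pair of devices (i, j)
# shares a random mask: device i adds it to its update, device j subtracts
# it. Individual uploads look like noise, but the masks cancel in the sum.
import random

def make_pairwise_masks(n_devices, seed=42):
    """Return one net mask per device; all masks sum to zero by construction."""
    rng = random.Random(seed)
    masks = [0.0] * n_devices
    for i in range(n_devices):
        for j in range(i + 1, n_devices):
            m = rng.uniform(-1e6, 1e6)
            masks[i] += m   # device i adds the shared mask
            masks[j] -= m   # device j subtracts the same mask
    return masks

updates = [0.5, -1.25, 2.0]        # private per-device model updates
masks = make_pairwise_masks(len(updates))
uploaded = [u + m for u, m in zip(updates, masks)]  # what the server sees

# Each upload is dominated by its mask, yet the masks cancel in the sum:
recovered_sum = sum(uploaded)
print(recovered_sum)  # approximately 1.25, the true sum of the updates
```

The server then divides the recovered sum by the device count to get the average update for the shared model, having learned nothing about any single phone's contribution.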