Google, which specializes in Internet-related services and products, has introduced another module for its TensorFlow Machine Learning framework. The new framework, known as TensorFlow Federated (TFF), is designed to make it easier to experiment with Machine Learning and other computations on decentralized data. The announcement follows hot on the heels of the TensorFlow 2.0 alpha.
In its statement, Google noted that it designed TFF on the basis of its experience building federated learning technology at Google, where it powers Machine Learning models for mobile keyboard predictions and on-device search. The company also said that with TFF, it is putting a flexible, open framework for locally simulating decentralized computations into the hands of all TensorFlow users. The new framework allows developers to implement an AI system and train it across data from various sources, all while keeping each of those sources separate and local. It comes with an API, the Federated Core API, which supports a wide range of computations over decentralized datasets, and which introduces a new data type that specifies both the underlying data and the location of that data on distributed clients.
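To make the idea of that new data type concrete, here is a minimal plain-Python sketch (not the real TFF API; the class and function names are hypothetical) of a federated value that pairs data with its placement, and an aggregation that moves client-placed values to the server:

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical illustration, not the TFF API: a federated value pairs the
# underlying data with its placement (where that data lives).
@dataclass
class FederatedValue:
    placement: str   # e.g. "CLIENTS" or "SERVER"
    values: list     # one entry per device at that placement

def federated_mean(fv: FederatedValue) -> FederatedValue:
    # Aggregate values placed at the clients into one server-placed value.
    assert fv.placement == "CLIENTS"
    return FederatedValue("SERVER", [mean(fv.values)])

readings = FederatedValue("CLIENTS", [68.0, 70.0, 72.0])
print(federated_mean(readings).values[0])  # 70.0
```

In the actual Federated Core API, the placement is part of the type signature, so the framework can check at build time that a computation reads and writes data only where that data is allowed to live.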
Moreover, TensorFlow Federated includes a local-machine runtime that simulates the computation being deployed across dataset-holding clients: each client computes its local contribution, and a centralized coordinator collects all of the contributions. From a developer's point of view, the search engine giant pointed out, a federated computation is basically a function whose inputs and outputs happen to live in different places.
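The client-contribution-plus-coordinator loop described above can be sketched in plain Python. This is a hypothetical simulation of federated averaging for a one-parameter model, not TFF itself; the function names and learning rate are illustrative assumptions:

```python
# Hypothetical sketch of the runtime described above: each client computes a
# local contribution (one gradient step on its own data), and a central
# coordinator averages the contributions. Not the TFF API.

def local_contribution(weight, data):
    # One gradient step for a 1-D least-squares model y = w * x,
    # computed only on this client's private data.
    lr = 0.01
    grad = sum(2 * (weight * x - y) * x for x, y in data) / len(data)
    return weight - lr * grad

def federated_round(weight, client_datasets):
    # Coordinator broadcasts the current weight, collects each client's
    # locally updated weight, and averages them. Raw data never leaves
    # the clients; only the updates travel.
    updates = [local_contribution(weight, d) for d in client_datasets]
    return sum(updates) / len(updates)

clients = [
    [(1.0, 2.0), (2.0, 4.0)],   # client A: data from y = 2x
    [(1.0, 2.2), (3.0, 6.1)],   # client B: noisy data, y ~ 2x
]
w = 0.0
for _ in range(200):
    w = federated_round(w, clients)
print(round(w, 2))  # ~2.03: close to the true slope of 2
```

The point of the simulation is the data flow, not the model: the coordinator only ever sees the averaged updates, which is exactly the "function with inputs and outputs in different places" framing Google describes.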