Franklin Templeton, one of the biggest US investment firms with over $714 billion in assets under management, has turned to data analytics to create value for its clients.

Randy Bean describes the changes made by Franklin Templeton in this article from Forbes:

To compete at the highest levels of customer performance in a highly data-driven industry sector, Franklin Templeton has invested in transforming its data management infrastructure to accelerate business value. Both current and point-in-time data are parsed and analyzed to support key business initiatives.

The result of this data management transformation is a modern, agile infrastructure that increases business efficiency and client returns. “The sources and volume of data for making investment decisions are steadily increasing, and this transformation of our data management platform will help us use data more efficiently,” said Chris Pham, SVP of data management and data science at Franklin Templeton.

Pham is responsible for the investment management data environment at Franklin Templeton. The firm has embarked on a transformation of its traditional data environment to a modern one, which Pham characterizes as a “completely different approach” to data ingestion and data sourcing. Employing a data lake infrastructure, her team has deployed the first iteration of what she describes as “an adaptive ecosystem,” featuring a hub-and-spoke model that supports the firm’s equity, fixed income, and other investment businesses.
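The hub-and-spoke idea described above can be sketched in a few lines of code: raw feeds are ingested and normalized once at a central hub (the data lake), and each business line reads a filtered "spoke" view. This is a minimal illustrative sketch under assumed field names and classes; it is not Franklin Templeton's actual platform.

```python
class DataHub:
    """Central store: ingest raw records once, serve normalized views.

    Hypothetical example of a hub-and-spoke data model; all field
    names and sources are invented for illustration.
    """

    def __init__(self):
        self._records = []

    def ingest(self, source, records):
        # Normalize at the hub so every downstream spoke sees
        # consistent field names, casing, and types.
        for r in records:
            self._records.append({
                "source": source,
                "asset_class": r["asset_class"].lower(),
                "ticker": r["ticker"].upper(),
                "price": float(r["price"]),
            })

    def spoke(self, asset_class):
        """A business-line view, e.g. 'equity' or 'fixed_income'."""
        return [r for r in self._records if r["asset_class"] == asset_class]


if __name__ == "__main__":
    hub = DataHub()
    hub.ingest("vendor_feed", [
        {"asset_class": "Equity", "ticker": "abc", "price": "101.5"},
        {"asset_class": "Fixed_Income", "ticker": "xyz", "price": "99.2"},
    ])
    # Each spoke sees only its own, already-normalized records.
    print(hub.spoke("equity"))
```

The design choice the quote hints at is that normalization happens once, at ingestion, rather than separately inside each business line.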

Franklin Templeton has achieved notable success in business process improvements such as “backtesting,” a bias-free technique in which an analyst evaluates how a proposed investment strategy would have performed under historical conditions. In the past, the difficulty and slowness of working with decades of time-series data limited this capability. A second area of success has been incorporating alternative data sources to evaluate investable assets, which, like backtesting, requires concordance and normalization of data.
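To make the backtesting concept concrete, here is a minimal sketch: a moving-average crossover rule is replayed against a historical price series, using only the prior day's signal so the simulation avoids look-ahead bias. The strategy, parameters, and synthetic prices are all hypothetical, chosen for illustration rather than drawn from Franklin Templeton's methods.

```python
def moving_average(prices, window):
    """Trailing simple moving average; None until enough history exists."""
    return [
        sum(prices[i - window + 1 : i + 1]) / window if i >= window - 1 else None
        for i in range(len(prices))
    ]


def backtest(prices, fast=3, slow=5):
    """Hold the asset whenever the fast MA sits above the slow MA.

    Returns the strategy's cumulative return over the series.
    """
    fast_ma = moving_average(prices, fast)
    slow_ma = moving_average(prices, slow)
    value = 1.0
    for i in range(1, len(prices)):
        # Decide with yesterday's signal only, to avoid look-ahead bias.
        f, s = fast_ma[i - 1], slow_ma[i - 1]
        if f is not None and s is not None and f > s:
            value *= prices[i] / prices[i - 1]
    return value - 1.0


if __name__ == "__main__":
    # Synthetic "historical" prices, for illustration only.
    prices = [100, 101, 103, 102, 105, 107, 106, 109, 111, 110]
    print(f"strategy return: {backtest(prices):+.2%}")
```

The point of the exercise is the one the article makes: the strategy rule itself is trivial, but running it honestly requires fast, clean access to long time-series histories, which is exactly what the modernized data platform is meant to provide.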