
GSAM Perspectives

The Role of Big Data in Investing

Observations and views from investment professionals across GSAM’s Quantitative Investment Strategies team on the role of big data in identifying potential investments.

  • Osman Ali, Portfolio Manager, Quantitative Investment Strategies, GSAM
  • Takashi Suwabe, Portfolio Manager, Quantitative Investment Strategies, GSAM
  • Dennis Walsh, Portfolio Manager, Quantitative Investment Strategies, GSAM

Can you explain your investment philosophy and how access to big data has impacted how you invest?

Osman Ali: We are focused on creating data-driven investment models that can objectively evaluate public companies globally through fundamentally based and economically motivated investment themes. These models have historically utilized a large set of company-specific data, like publicly available financial statements, as well as market data such as prices, returns and volumes. With the growth and availability of non-traditional data sources such as internet web traffic, patent filings and satellite imagery, we have been using more nuanced and sometimes unconventional data to help us gain an informational advantage and make more informed investment decisions.

What types of data are you analyzing and how does it differ from what you were looking at before the Data Revolution?

Takashi Suwabe: We identify strong businesses with attractive valuations, positive sentiment and a strong connection with positive themes that are trending in the markets. The types of data we analyze now are quite a bit more expansive than what we used 10 years ago. In the past, computers could only analyze structured data, or data that is easily quantifiable and organized in a set form. New technologies allow us to analyze unstructured data, or data that is not as easily quantified. These innovations enable us to interpret information from a much wider variety of sources, including language, images and speech for the first time.

Access to new types of data, along with the ability to capture and process that data quickly, has given us new ways to capture investment themes such as momentum, value, profitability and sentiment.
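To make that concrete, here is a minimal sketch of how themes such as momentum, value, profitability and sentiment might be expressed as cross-sectional scores and combined into a single ranking. The data, column names and equal weighting are hypothetical illustrations, not GSAM's actual signals or methodology.

```python
import pandas as pd

# Hypothetical month-end snapshot of a small stock universe; the column names
# are illustrative stand-ins for investment themes, not a real data schema.
universe = pd.DataFrame({
    "ticker":         ["AAA", "BBB", "CCC", "DDD"],
    "return_12m_1m":  [0.25, -0.05, 0.10, 0.40],   # momentum: trailing 12m return, skipping the last month
    "book_to_price":  [0.80, 1.20, 0.50, 0.30],    # value
    "gross_margin":   [0.35, 0.20, 0.55, 0.45],    # profitability
    "news_sentiment": [0.20, -0.40, 0.60, 0.10],   # sentiment score from text, scaled to [-1, 1]
})

# Standardize each signal cross-sectionally (z-scores) so they are comparable.
signals = ["return_12m_1m", "book_to_price", "gross_margin", "news_sentiment"]
zscores = universe[signals].apply(lambda col: (col - col.mean()) / col.std())

# Combine the themes into one composite score with equal weights; the actual
# weighting scheme would itself be a research decision.
universe["composite_score"] = zscores.mean(axis=1)
print(universe.sort_values("composite_score", ascending=False)[["ticker", "composite_score"]])
```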

The Quantitative Investment Strategies Approach to Identifying Investment Opportunities

For educational purposes only.

How has your technology and infrastructure evolved to keep up with big data?

Dennis Walsh: New data storage technologies have created the infrastructure needed to capture, analyze and make informed decisions from new forms of real-time data. For example, the growth of distributed databases, where data is stored across several platforms rather than on a single platform in a centralized database, allows for highly scalable parallel processing of vast amounts of data. This can decrease processing time by several orders of magnitude for many applications. Unstructured data storage also allows for greater flexibility in onboarding and retrieving data from non-traditional sources and in managing large amounts of text-based information.
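As an illustration of the map-and-combine pattern that such distributed storage enables, here is a minimal, single-machine sketch in Python. The partitions, tickers and per-partition computation are hypothetical stand-ins; a real system would run the workers on separate nodes of a distributed database rather than as local processes.

```python
from multiprocessing import Pool

# Hypothetical partitions of price history, standing in for data that a
# distributed database would store across several machines.
partitions = [
    [("AAA", [100.0, 101.2, 103.5]), ("BBB", [55.0, 54.3, 56.1])],
    [("CCC", [12.1, 12.9, 13.4]), ("DDD", [230.0, 228.5, 241.2])],
    [("EEE", [87.5, 88.0, 90.3]), ("FFF", [44.1, 43.0, 45.9])],
]

def process_partition(rows):
    """Compute a simple trailing return for each security in one partition."""
    return {ticker: prices[-1] / prices[0] - 1.0 for ticker, prices in rows}

if __name__ == "__main__":
    # Each partition is processed independently and in parallel (the "map" step),
    # then the partial results are combined (the "reduce" step).
    with Pool(processes=3) as pool:
        partials = pool.map(process_partition, partitions)
    returns = {ticker: ret for part in partials for ticker, ret in part.items()}
    print(returns)
```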

How do portfolio managers interact with the models that are analyzing data and making recommendations?

Takashi Suwabe: Data is the basis of our investment model, but the research and portfolio construction processes still require human judgment. Portfolio managers exercise their judgment when selecting the data and analytics that we use in investing, and also when reviewing and approving each trade in every portfolio. This is to ensure that all portfolio positions make sense—that they are economically intuitive and appropriately sized given current market conditions. We do not have a computer in the corner simply shooting out trades with no human interaction.

We are researching new factors and analytics that have an impact on stock prices, and our portfolio managers drive that research. Research success for us is not finding a new stock to invest in, but rather finding a new investment factor that can help improve the way we select stocks. Investment factors should be fundamentally based and economically motivated, and the data enables us to empirically test our investment hypotheses. We would never work in the opposite direction—observing relationships in the data that we would seek to justify or explain after the fact.

Practically speaking, portfolio managers also rely on their own practitioner experience and market knowledge to assess the future success of any investment factor. Certain market trends or risk environments may bode well for particular factors and poorly for others. This awareness allows our portfolio managers to more effectively assess risk on a real-time basis.

What kinds of boundaries are you pushing now and what do you see as the future of big data-driven investment approaches like yours?

Dennis Walsh: Active management has always been about uncovering opportunities before they are priced in by the broader market. The exponential growth in data is fueling our investment decisions and research agenda. We’re seeking to push boundaries by moving beyond conventional data sources and leveraging alternative forms of data to gain an informational edge.

Today, we’re able to process more data more quickly, in an effort to uncover insights and connections that aren’t as obvious to other investors. Given new data availability and the development of machine learning techniques to learn quickly from such data, we are only at the beginning of this Data Revolution that we believe is transforming every industry globally.

What kinds of machine learning data analysis techniques do you use?

Osman Ali: Machine learning techniques give us the flexibility to create dynamic models that adapt to the data. Quantitative techniques in the past relied on simpler rules for ranking companies based on certain pre-determined metrics—take price-to-book, for example—whereas newer machine learning techniques allow algorithms to learn and adapt from constantly changing data.
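A hedged sketch of that contrast, using synthetic data and a generic scikit-learn model rather than anything GSAM actually runs: the first ranking sorts on a single pre-determined metric, while the second lets a gradient-boosted model learn a non-linear mapping from several signals to forward returns.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Synthetic cross-section of 500 stocks with three hypothetical signals
# (think value, momentum, sentiment); none of this is real data.
X = rng.normal(size=(500, 3))

# Synthetic forward returns with a deliberately non-linear dependence on the
# signals, purely to illustrate why a flexible model can help.
y = 0.5 * X[:, 0] + np.where(X[:, 1] > 0, 0.8 * X[:, 1], 0.0) + 0.1 * rng.normal(size=500)

# Old-style rule: rank the universe on a single pre-determined metric.
rule_based_rank = np.argsort(-X[:, 0])

# Machine-learning approach: let a gradient-boosted model learn the mapping
# from all signals to forward returns, including interactions and kinks.
model = GradientBoostingRegressor(n_estimators=200, max_depth=3, random_state=0)
model.fit(X, y)
ml_rank = np.argsort(-model.predict(X))

print("Top 5 by single-metric rule:", rule_based_rank[:5])
print("Top 5 by learned model:    ", ml_rank[:5])
```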

Natural language processing, or NLP, uses computers to read and interpret vast amounts of text, enabling us to incorporate textual data in multiple languages from a variety of sources. One of the more obvious NLP applications is to gauge sentiment in the text—is the tone in the news articles or research reports being published on a company positive or negative? An extension of NLP is topic modeling—summarizing a large body of text into topics and themes that are easily understood by humans, but can also be used for systematic analysis in statistical and machine learning applications. For example, what subjects did company management focus on in their earnings call this quarter versus last quarter? 
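For illustration only, the sketch below applies two simple stand-ins for these ideas to a few invented transcript snippets: a word-count sentiment score and scikit-learn's latent Dirichlet allocation for topic modeling. Production NLP systems would be far more sophisticated than either.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Hypothetical snippets standing in for earnings-call transcripts or news;
# real inputs would be full documents, potentially in multiple languages.
documents = [
    "revenue growth was strong and margins improved this quarter",
    "we expect supply chain disruption and higher input costs",
    "our new product launch drove record customer demand",
    "guidance was lowered due to weak demand and rising costs",
]

# Toy sentiment lexicon, purely for illustration.
positive = {"strong", "improved", "record", "growth", "demand"}
negative = {"disruption", "weak", "lowered", "costs", "higher"}

def simple_sentiment(text):
    """Score a snippet by counting positive versus negative words."""
    words = text.split()
    return (sum(w in positive for w in words) - sum(w in negative for w in words)) / len(words)

for doc in documents:
    print(f"{simple_sentiment(doc):+.2f}  {doc}")

# Topic modeling: summarize the same snippets into two themes with LDA.
vectorizer = CountVectorizer(stop_words="english")
counts = vectorizer.fit_transform(documents)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)
terms = vectorizer.get_feature_names_out()

for i, topic in enumerate(lda.components_):
    top_terms = [terms[j] for j in topic.argsort()[-4:][::-1]]
    print(f"topic {i}: {', '.join(top_terms)}")
```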

NLP also allows us to pick up on subtle relationships between companies that might otherwise go unnoticed—we call this intercompany momentum. Traditional momentum focuses on the persistence of price movements for a single security, while intercompany momentum seeks to understand how the movement in price of one security might impact, albeit subtly, the movement in price of other related securities. These not-so-obvious relationships can be assembled from the clustering of companies in text-based data, appearing together in news articles, regulatory filings or research reports.
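A minimal sketch of the intercompany momentum idea, with invented company names, article mentions and returns: co-occurrence counts from text stand in for the relationship graph, and each company's "linked" momentum is the co-occurrence-weighted average of its neighbors' returns.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical inputs: which companies each news article mentions, and each
# company's own recent return. Names and numbers are purely illustrative.
article_mentions = [
    {"SupplierCo", "PhoneMaker"},
    {"PhoneMaker", "ChipDesigner", "SupplierCo"},
    {"ChipDesigner", "FoundryInc"},
    {"PhoneMaker", "FoundryInc"},
]
own_return = {"SupplierCo": 0.04, "PhoneMaker": 0.10, "ChipDesigner": -0.02, "FoundryInc": 0.06}

# Count how often each pair of companies appears together in the same article.
cooccurrence = defaultdict(int)
for mentions in article_mentions:
    for a, b in combinations(sorted(mentions), 2):
        cooccurrence[(a, b)] += 1
        cooccurrence[(b, a)] += 1

def linked_momentum(company):
    """Co-occurrence-weighted average of related companies' returns."""
    weights = {other: n for (c, other), n in cooccurrence.items() if c == company}
    total = sum(weights.values())
    return sum(n * own_return[other] for other, n in weights.items()) / total if total else 0.0

for company in own_return:
    print(f"{company:12s} own={own_return[company]:+.2f}  linked={linked_momentum(company):+.2f}")
```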

What is your approach to big data in Emerging Markets?

Dennis Walsh: We feel that the information asymmetry in emerging markets may create opportunities for data-driven investors like ourselves. A lack of available data can be a sign of mispricing and uncertainty, and investors who are diligent enough to analyze and uncover potential opportunities in this environment may be rewarded. With 4,000 companies in the emerging market universe, spanning 23 countries across 6 continents, it can be a challenge to capture and digest vast amounts of disparate information, especially since data quality and reporting governance standards in some of these countries can be lacking. Our experience and sophisticated techniques make us well-positioned to act in this space and analyze potential investments without necessarily requiring locally based analysts around the world. This centralization of data processing is more scalable and allows us to cover a wider breadth of companies than traditional methods.


Related Perspectives

GSAM Perspectives
Big Data is Fundamental

GSAM’s Fundamental Equity and Fixed Income teams discuss how big data can be a strategic advantage or disruptive force in companies and sectors.

GSAM Perspectives
The Political Gets Analytical

Big data has become a powerful force in the election process and is likely to serve an increasingly central role in future political campaigns.