Philippe Jordan, President, CFM on the application of science to investment
Biography: Philippe Jordan is president of fund manager CFM International, which specialises in systematic alpha strategies and alternative beta strategies. Over half of the firm’s employees are data scientists and its data team collects, cleans, and manages terabytes of incoming data every day.
Jordan joined CFM in 2005 and has an extensive background in the alternatives space, having worked for a variety of global financial institutions including Credit Suisse, Daiwa and Oppenheimer. He has deep knowledge and understanding of the investment community developed through his time at CFM and the variety of senior roles he has held, including capital markets hedge fund coverage, fund-of-fund management and hedge fund incubation. He also sits on the Board of Directors of the Alternative Investment Management Association (AIMA).
How would you characterise the evolution of the industry as the use of passive and systematic models increases?
It is a continuation of what we have seen since the end of the last century. We have seen the acceleration towards an ‘engineered’ market place. Safe to say, the market’s destination is to be fully engineered across 95 per cent of all liquidity pools.
In parallel there has been another revolution, which started maybe seven years ago, of processing data that is not structured. We are in the middle of an evolutionary phase, where people using technology and systems have started to work with unstructured data in a manner that is significant, as opposed to simple structured, price-based data.
That has led to the emergence of data science, as opposed to data management, in the last decade. It is not yet stable, it's still a growing field. There is a lot of noise, there is an awful lot of bad data. People are trying to figure out what kind of unstructured data they can do something with that's meaningful, as opposed to interesting. What kind of tools can they use to structure that data, like AI for instance? What kind of AI-type techniques – and I use the term broadly – can they apply to unstructured data and come up with something that is meaningful in terms of statistics?
How do you see that playing out across the development of investment strategies, from quantitative to quantamental?
The more you push that process along, the more a borderline emerges where people have to start interjecting heuristics, as opposed to simply using systems and stats, because of the limits of the data. It might be very interesting data, but if there is not much of it and it is not very structured, and you still think it is meaningful and want to use it, you are interjecting your heuristic at some point in time. Because there is simply not enough statistical surface to come up with nice, neat stats. The noise level in the stats is very high. When people are talking about quantamental, I think that's what they mean. They get to the point where they are handling pretty esoteric data with not many data points. They are inserting a heuristic in order to make up for the lack of statistical significance. Lots of folks are grappling and working with that. It has generated an immense field of work.
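The point about statistical surface can be illustrated with a back-of-the-envelope calculation (the 2bp-per-day edge and 1 per cent daily volatility below are invented numbers for illustration, not anything from CFM): the standard error of an estimated mean return shrinks only as 1/√n, so a small edge stays buried in noise unless the data history is long.

```python
import numpy as np

# Hypothetical numbers: a small daily edge against typical daily noise.
daily_vol = 0.01      # assume 1% daily return volatility (noise)
edge = 0.0002         # assume a 2bp/day signal (the "edge")

# With n observations, the standard error of the mean is vol / sqrt(n),
# so the best t-statistic you could hope for is edge / std_err.
for n in (20, 250, 5000):   # roughly a month, a year, twenty years of days
    std_err = daily_vol / np.sqrt(n)
    t_expected = edge / std_err
    print(f"n={n:5d}  std_err={std_err:.5f}  expected t-stat={t_expected:.2f}")
```

Even twenty years of daily data leaves the expected t-statistic below the conventional significance threshold of 2, which is exactly the gap a heuristic ends up filling.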
There have been concerns about limits to the size of passive investment versus systematic active management; do you think there are systemic limits to using systematic models for investment?
This is an ontological question, taking in the knowledge of the systemic risk and imbalances of the entire system. I think the only honest answer is, I don't know, and we will find out. I suspect that, by and large, investments on the long-only side have probably been more passive than people like to think. Lots of 'actively' labelled products ended up being very closely associated with the index in terms of beta, and people hugged the beta pretty tightly, because nobody wants to end up being short beta. I suspect that was true in the past. There was an awful lot of advertised activity where, by and large, people tried to stick close to the indices, because of the associated risk of missing out on a rally. I am not so sure people have become that much more passive. The message developed and pioneered by the likes of John Bogle [founder of Vanguard Group] is simply much more cost efficient and clear: do you want to own beta, and at what price point do you think beta should be owned? It should be owned, and it should be owned passively, but there is a price point and a tag associated with that, which is a fraction of active management. That's where the real rhetoric and tension has been, as opposed to how much actual beta is in everybody's portfolio.
Where do you see the most interesting new investment philosophies coming from, and are they translating into viable investment strategies at the moment?
There is a vast difference between 'interesting' and 'meaningful'. The work we do is trying to sift through what is noise and what is meaningful. I don't think the quantity of noise has ever been higher than it is today. Twenty years ago, doing this kind of thing was considered esoteric, and somewhat marginal. Therefore, the field didn't have as many people in it coming up with ideas and new kinds of data, and new engineered systems. Today systematic investment has, by and large, moved into the mainstream. That in itself is generating so many interesting things; but how many of them are meaningful? Well, that's tricky. I think sifting through the noise is more difficult than 20 years ago.
How are you building strategies that could deliver results in that more crowded environment?
With scepticism and curiosity. You can’t shut yourself down from the new and interesting forms of unstructured data that you can look at and test, but we approach it from the point of view of scientific scepticism. Until we have convinced ourselves through the work and the data, we have to take a jaundiced view of the latest new thing. That does not mean a dogmatic view, that shuts down exploration and curiosity. It is a careful balance between keeping a process-based, evidence-based discipline, in a field with many more opportunities than in the past.
What sort of resources are most important for you in doing that?
People. There is an assumption that our business is robots and algos, but you need an awful lot of people in order to run processes that are robust. You need people to address bottlenecks in this world of opportunity: in terms of ideas, in terms of data, in terms of markets that we can trade. That world is fraught with bottlenecks, and so where you invest your dollars in order to leverage an opportunity really matters. Those dollars are invested in people: engineers, scientists, middle office, the folks that design our analytics. The biggest opportunity we have, and the biggest challenge we have, is hiring good people, training good people and evolving. Evolving in our ability to tackle new markets, new products and new forms of data, while preserving the discipline. That's tricky.
Do you find people from the outside have the skills necessary, or do you need to hone them in order to make them effective within the company?
We have to bring people from the outside that have skills we don’t, and then we have to turn them into experts in our company. So it’s a two-stage process. Everything is team-based in what we do, nobody operates the P&L independently, or starts a network of connectivity independently. It’s touch football right through the entire production line. That means they have to become experts in how we tackle problems, and how we engineer a solution.
What is your opinion of machine learning as a tool for investment managers?
Just that; it's a tool. It is a tool that happens to get a lot of media attention today. It captures our imagination. Within the framework of markets, it's a very interesting tool that we have found to be most effective in the field of execution, which is a high-data-intensity field. As you move away from dense data, we think it becomes increasingly difficult to establish validity, even if the output is at times remarkable. But it is hard to validate. It is a fascinating field in general. If you are a young scientist today, you probably want to get involved in the theories of complexity that explain these outputs. Today we have outputs, but we don't have a theory of complexity to understand how we get the output. That's a fascinating scientific field. I think it has a role to play in the problem of managing money in markets under risk constraints, but I don't think it is the whole of the role.
It creates challenges for regulators, for investors, as well as for the actual asset managers themselves…
It’s the ultimate black box today. Twenty years ago people used the term black box to refer to our business, but it wasn’t a black box. We didn’t want to spill the beans, but we understood every step of the process that led to an output. In the case of machine learning, you have some thin heuristic when you are training the data set, but as you go through the iterations and how you get to the final output, well, that is a black box today because we don’t have a theory of complexity to understand how you get to that output.
As technology evolves, will automation or the use of machine learning become more effective in markets with fragmented data, such as high yield bonds, and to what extent?
Yes. Machine learning is particularly good at helping with unstructured data sets and with data that is fragmented over time. I think it will be very helpful in completing the engineered market: we now trade stocks, futures, foreign exchange and CDX indices, and the frontier today is the corporate bond market. The corporate bond market is essentially a data problem; it is not that structured and it's fragmented. That's an area where applying AI to data is very interesting.
Do you think some of the traditional theories investment management is based upon – such as Markowitz's modern portfolio theory (MPT) – will have to be reviewed as markets become more systematic?
[MPT] was a pretty decent and meaningful approach for the previous 30 to 40 years. As one's ability to look at a portfolio not only via correlations but also via principal component analysis has evolved, firms must start thinking about their portfolios not just through the prism of correlations, but through principal components. That may mean running through a very large surface of principal components, trying to understand what they are allocated to, and once they have some understanding of that, deciding whether they want to be allocated to the first two or three principal components, or allocated equally to the next 25.
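A minimal sketch of the principal-component view of a portfolio he describes, using simulated data (the single 'market' factor and all parameters below are invented for illustration): the first principal component of the return covariance typically absorbs most of the variance, which is what an allocator discovers when running through the surface of components.

```python
import numpy as np

# Hypothetical example: PCA on simulated daily returns for 25 assets.
rng = np.random.default_rng(0)
n_days, n_assets = 1000, 25

# One common "market" factor drives all assets, plus idiosyncratic noise.
market = rng.normal(0, 0.01, size=(n_days, 1))
loadings = rng.uniform(0.5, 1.5, size=(1, n_assets))
returns = market @ loadings + rng.normal(0, 0.005, size=(n_days, n_assets))

# Principal components are the eigenvectors of the covariance matrix;
# their eigenvalues give the share of portfolio variance each explains.
cov = np.cov(returns, rowvar=False)
eigvals, _ = np.linalg.eigh(cov)
eigvals = eigvals[::-1]                 # sort descending
explained = eigvals / eigvals.sum()

print(f"Variance explained by PC1:       {explained[0]:.0%}")
print(f"Variance explained by PC2..PC25: {explained[1:].sum():.0%}")
```

With this factor structure, the first component dominates, making concrete the choice between being allocated to the first few components or spreading the allocation across the remainder.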
What are CFM’s goals for the next 12 months?
The big challenges for us reside in the problem of scalability. If you start a business, how scalable is your infrastructure? How efficient is it at delivering a solution in a cost-effective manner? Managing bottlenecks is a by-product of growth, and we have a fair amount of growth across our product lines and the manner in which we offer them. Making decisions about where to apply your resources over a large product line and a large opportunity set is important. How you resource your organisation to address that is difficult to do, and that's probably going to be a challenge for us, not just for the next 12 months. As long as we are growing and we are diversifying the types of products that we are offering, we will be grappling with that.
Are there any challenges specific to investors?
The challenge with investors has evolved. You have got a promise challenge, you have a cost challenge, you now have a client services challenge, and a communication challenge. You need to be good, hopefully excellent, on some of these metrics; I don't think you can be excellent on all, but you need to be good at a minimum across all of them. One has to compete not only on performance metrics and on cost, but on services, tools, flexibility and modularity. Those are all things that are not direct performance challenges; they are organisational challenges. You have to be able to deliver services and modularity, and to do that efficiently is not easy. The business is more competitive than I think it was, in the sense that the added services and modularity are taxing on an organisation.