SuanShu 2.0

We are proud to announce the release of SuanShu 2.0! This release is the accumulation of customer feedback and the experience we gained over the last three years of coding numerical computation algorithms. SuanShu 2.0 is a redesign of the software architecture, a rewrite of many modules, additions of new modules and functionality driven by user demands and applications, numerous bug fixes, as well as performance tuning. We believe that SuanShu 2.0 is the best numerical and statistical library available on the Java platform, if not any platform. Here are highlights of the new features available since 2.0:

– ordinary and partial differential equation solvers
– optimization: Quadratic Programming, Sequential Quadratic Programming, (Mixed) Integer Linear Programming, Semi-Definite Programming
– ARIMA fitting
– LASSO and LARS
– Gaussian quadrature/integration
– interpolation methods
– trigonometric functions and physical constants
– Extreme Value Theory

Continuing our tradition, we will still provide trial licenses and academic licenses for eligible schools and research institutes. Moreover, we now provide another way to get a FREE SuanShu license – the contribution license. If you are able to contribute code to the SuanShu library, you can get a permanent license. For more information, see: http://numericalmethod.com/suanshu/ We hope that you will find the new release of SuanShu more helpful than ever in your work. If you have any comments to help us improve, please do let us know. Happy birthday to TianTians and Merry Christmas to all!... read more
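
As a flavor of one of the new numerical modules, here is what Gaussian quadrature does, sketched in plain Java. The class and method names below are illustrative only, not SuanShu's actual API.

```java
import java.util.function.DoubleUnaryOperator;

/** Minimal 3-point Gauss-Legendre quadrature on [a, b]; illustrative, not SuanShu's API. */
public class GaussLegendre3 {
    // Nodes and weights of the 3-point rule on [-1, 1]; exact for polynomials up to degree 5.
    private static final double[] X = {-Math.sqrt(3.0 / 5.0), 0.0, Math.sqrt(3.0 / 5.0)};
    private static final double[] W = {5.0 / 9.0, 8.0 / 9.0, 5.0 / 9.0};

    public static double integrate(DoubleUnaryOperator f, double a, double b) {
        double mid = 0.5 * (a + b), half = 0.5 * (b - a), sum = 0.0;
        for (int i = 0; i < X.length; i++) {
            sum += W[i] * f.applyAsDouble(mid + half * X[i]); // map node from [-1,1] to [a,b]
        }
        return half * sum;
    }

    public static void main(String[] args) {
        // Integral of x^4 over [-1, 1] is 2/5; the 3-point rule reproduces it exactly.
        System.out.println(integrate(x -> x * x * x * x, -1.0, 1.0)); // ~0.4
    }
}
```

The point of a Gaussian rule is that n evaluation points integrate polynomials of degree up to 2n − 1 exactly, which is why so few points suffice for smooth integrands.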

Trading and Investment as a Science

Here is the synopsis of my presentation at HKSFA, September 2012. The presentation can be downloaded from here.

1. Many people lose money playing the stock market. The strategies they use are nothing but superstitions. There is no scientific reason why, for example, buying on a breakout of the 250-day moving average would make money. Trading profits do not come from wishful thinking, ad-hoc decisions, gambling, and hearsay, but from diligent, systematic study.
• Moving average as a superstitious trading strategy
2. Many professionals make money playing the stock market. One approach to investment decisions or trading strategies is to treat them as a science. Before we make the first trade, we want to know how much money we expect to make. We want to know in what situations the strategy will make (or lose) money, and how much.
• Moving average as a scientific trading strategy
3. There are many mathematical tools and theories that we can use to quantify, analyse, and verify a trading strategy. We will showcase some popular ones.
• Markov chain (a trend-following strategy)
• Cointegration (a mean-reversion strategy)
• Stochastic differential equations (the best trading strategy, ever!)
• Extreme value theory (risk management, stop-loss)
• Monte Carlo simulation (what are the success factors in a trading... read more
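
As a taste of the "scientific" treatment, the Markov-chain view of trend following can be made concrete in a few lines: estimate the transition probabilities between up and down days from data, and only trade if the measured persistence justifies it. A sketch, with made-up data (it assumes both up and down moves occur in the sample):

```java
/** Two-state Markov chain fitted to the signs of daily moves: a sketch of how a
 *  trend-following edge can be measured rather than assumed. Data are hypothetical. */
public class TrendMarkov {
    /** Estimate P(next move up | last move up) and P(next move up | last move down)
     *  from a sequence of +1/-1 moves. Assumes both states occur in the sample. */
    public static double[] transitionProbs(int[] moves) {
        int upUp = 0, upTotal = 0, downUp = 0, downTotal = 0;
        for (int t = 1; t < moves.length; t++) {
            if (moves[t - 1] > 0) { upTotal++; if (moves[t] > 0) upUp++; }
            else { downTotal++; if (moves[t] > 0) downUp++; }
        }
        return new double[]{(double) upUp / upTotal, (double) downUp / downTotal};
    }

    public static void main(String[] args) {
        // Hypothetical sign sequence; real work would use actual return signs.
        int[] moves = {1, 1, 1, -1, 1, 1, -1, -1, 1, 1, 1, -1, 1};
        double[] p = transitionProbs(moves);
        System.out.printf("P(up|up)=%.2f, P(up|down)=%.2f%n", p[0], p[1]);
        // Only if P(up|up) is materially above 0.5 does buying after an up day have an edge.
    }
}
```

This is the difference between superstition and science: the same "follow the trend" idea, but with an explicit, testable probability behind the trade.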

Using SuanShu on Amazon EC2

Cloud computing is very popular nowadays, and delegating your CPU-intensive computation (or simulation) to the cloud seems a smart choice. Many of our users have asked whether SuanShu can run on Amazon’s Elastic Compute Cloud (EC2), because a SuanShu license requires a MAC address and they have no control over which machine is used when they launch an EC2 instance. Here is some good news: Amazon Web Services (AWS) now supports Elastic Network Interfaces (ENI), by which you can bind your EC2 instance to a specified network interface. Therefore, you can license SuanShu against the MAC address of the ENI, and launch an instance with the same ENI and MAC address. For details, please visit the blog here. The user guide for ENI can also be found... read more

Data Mining

Good quant trading models reveal the nature of the market; bad ones are merely statistical artifacts. One of the most popular ways to create a spurious trading model is data snooping, or data mining. Suppose we want to create a model to trade AAPL daily. We download some data, e.g., 100 days of AAPL, from Yahoo. If we work hard enough with the data, we will find a curve (model) that explains the data very well. For example, a sufficiently flexible curve, such as a high-degree polynomial through every data point, fits the prices perfectly. Of course, most of us are judicious enough to avoid such an obvious over-fitting formula. Unfortunately, some may fall into the same trap in disguise. Let’s say we want to understand what factors contribute to the AAPL price movements or returns. (We now have 99 returns.) We come up with a list of 99 possible factors, such as PE, capitalization, dividends, etc. One very popular method to find significant factors is linear regression, so we regress the 99 returns on the 99 factors. Guess how well this fits? The goodness-of-fit (R-squared) turns out to be 100% – a perfect fit! It can be shown that this regression is complete nonsense: even if we throw in random values for those 99 factors, we still end up with a perfect-fit regression, because we have as many regressors as observations. Consequently, the coefficients and t-stats mean nothing. Could we do a “smaller” regression on a small subset of factors, e.g., one factor at a time, and hope to identify the most significant factor? This step-wise regression turns out to be spurious as well. For a large enough pool of factors, there is a high probability of finding (the... read more
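
The perfect-fit phenomenon is easy to reproduce: with an intercept plus as many noise "factors" as remaining observations, least squares degenerates to solving a square linear system exactly. A minimal sketch in plain Java (hypothetical numbers, no library; scaled down to 4 observations and 3 random factors so the system stays small):

```java
import java.util.Random;

/** With as many regressors (intercept + 3 random factors) as observations (4),
 *  least squares reduces to solving X b = y exactly: R^2 = 1 by construction. */
public class SpuriousFit {
    /** Solve a square system A b = y by Gauss-Jordan elimination with partial pivoting. */
    public static double[] solve(double[][] A, double[] y) {
        int n = y.length;
        double[][] m = new double[n][n + 1];
        for (int i = 0; i < n; i++) { System.arraycopy(A[i], 0, m[i], 0, n); m[i][n] = y[i]; }
        for (int c = 0; c < n; c++) {
            int p = c; // pick the largest pivot in column c for numerical stability
            for (int r = c + 1; r < n; r++) if (Math.abs(m[r][c]) > Math.abs(m[p][c])) p = r;
            double[] tmp = m[c]; m[c] = m[p]; m[p] = tmp;
            for (int r = 0; r < n; r++) {          // eliminate column c in every other row
                if (r == c) continue;
                double f = m[r][c] / m[c][c];
                for (int k = c; k <= n; k++) m[r][k] -= f * m[c][k];
            }
        }
        double[] b = new double[n];
        for (int i = 0; i < n; i++) b[i] = m[i][n] / m[i][i];
        return b;
    }

    public static void main(String[] args) {
        Random rng = new Random(42);
        double[] returns = {0.01, -0.02, 0.015, 0.005};   // 4 hypothetical "observed" returns
        double[][] X = new double[4][4];
        for (int i = 0; i < 4; i++) {
            X[i][0] = 1.0;                                 // intercept
            for (int j = 1; j < 4; j++) X[i][j] = rng.nextGaussian(); // pure-noise "factors"
        }
        double[] b = solve(X, returns);
        double maxResidual = 0.0;
        for (int i = 0; i < 4; i++) {
            double fit = 0.0;
            for (int j = 0; j < 4; j++) fit += X[i][j] * b[j];
            maxResidual = Math.max(maxResidual, Math.abs(fit - returns[i]));
        }
        System.out.println("max residual = " + maxResidual); // essentially 0: "perfect" fit from noise
    }
}
```

The "factors" here are pure Gaussian noise, yet the residuals vanish every time: exactly the trap described above, just in miniature.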

The Role of Technology in Quantitative Trading Research

I posted my presentation titled “The Role of Technology in Quantitative Trading Research”, given at the HKU-HKUST-Stanford Conference in Quantitative Finance (Dec 9, 2011) and the Workshop on Recent Developments of Financial Mathematics (REDFIN2011) (Dec 13, 2011). You can find the PowerPoint here. Abstract: We need a technology to streamline the quantitative trading research process. Typically, going from idea generation to strategy deployment may take quants/traders weeks, if not months. This means not only lost trading opportunities, but also a lengthy, tedious, error-prone process marred with ad-hoc decisions and primitive tools. From the organization’s perspective, comparing the paper performances of different traders is like comparing apples to oranges. The success of the firm relies on hiring the right geniuses. Our solution is a technological process that standardizes and automates most of the mechanical steps in quantitative trading research. Creating a new trading strategy should be as easy and fun as playing with Legos: assembling simpler ideas together. Consequently, traders can focus their attention on what they are supposed to be best at – imagining new trading ideas/strategies. Excerpts: In reality, the research process for a quantitative trading strategy, from conceptual design to actual execution, is very time consuming, e.g., months. The backtesting step, in the broadest sense, takes the longest time. There are too many details that we can include in the backtesting code, to name just a few: data cleaning and preparation, mathematical algorithms, mock market simulation, execution and slippage assumptions, parameter calibration, sensitivity analysis, and, worst of all, debugging. In practice, most people will ignore many details and make unfortunate “approximations”. This is one major reason why real and paper... read more
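
To make the Lego analogy concrete, here is a minimal sketch of what composable strategy building blocks might look like. The `Signal` interface and the combinators are hypothetical, invented for illustration; they are not AlgoQuant's actual API.

```java
import java.util.List;

/** A sketch of "Lego-style" strategy composition; names are illustrative, not
 *  AlgoQuant's actual API. A Signal maps a price history to a target position. */
public class LegoStrategies {
    interface Signal { double position(List<Double> prices); }

    /** Combine two signals by averaging their target positions. */
    static Signal blend(Signal a, Signal b) {
        return prices -> 0.5 * (a.position(prices) + b.position(prices));
    }

    /** Long when the last price is above the window mean, short otherwise. */
    static Signal aboveMean() {
        return prices -> {
            double mean = prices.stream().mapToDouble(Double::doubleValue).average().orElse(0);
            return prices.get(prices.size() - 1) > mean ? 1.0 : -1.0;
        };
    }

    public static void main(String[] args) {
        Signal trend = aboveMean();
        Signal combo = blend(trend, prices -> 0.0); // blend with a flat "cash" signal
        List<Double> prices = List.of(100.0, 101.0, 102.0, 103.0);
        System.out.println(combo.position(prices)); // 0.5: a half-sized long position
    }
}
```

Once signals share a common interface, backtesting, blending, and swapping ideas become mechanical, which is exactly the standardization the abstract argues for.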

NUMERICAL METHOD INC Selected as a Red Herring Top 100 Asia Tech Startup

Hong Kong, China – Numerical Method Incorporation Limited has won the Top 100 Asia award. Numerical Method Inc. publishes SuanShu, a Java math library, and AlgoQuant, an algorithmic/quantitative trading strategy research platform. Red Herring announced its Top 100 Asia award in recognition of the leading private companies from Asia, celebrating these startups’ innovations and technologies across their respective industries. Red Herring’s Top 100 list has become a mark of distinction for identifying promising new companies and entrepreneurs. Red Herring editors were among the first to recognize that companies such as Facebook, Twitter, Google, Yahoo, Skype, Salesforce.com, YouTube, and eBay would change the way we live and work. “Choosing the companies with the strongest potential was by no means a small feat,” said Alex Vieux, publisher and CEO of Red Herring. “After rigorous contemplation and discussion, we narrowed our list down from hundreds of candidates from across Asia to the Top 100 Winners. We believe Numerical Method Inc. embodies the vision, drive and innovation that define a successful entrepreneurial venture. Numerical Method Inc. should be proud of its accomplishment, as the competition was very strong.” Red Herring’s editorial staff evaluated the companies on both quantitative and qualitative criteria, such as financial performance, technology innovation, management quality, strategy, and market penetration. This assessment of potential is complemented by a review of the track record and standing of startups relative to their sector peers, allowing Red Herring to see past the “buzz” and make the list a valuable instrument of discovery and advocacy for the most promising new business models in Asia.... read more

Mean Reversion vs. Trend Following

AlgoQuant 0.0.5 has just been released! This release is particularly exciting because you no longer need a license to use AlgoQuant. AlgoQuantCore is now gone forever. The source of the entire AlgoQuant project is now available: http://www.numericalmethod.com/trac/numericalmethod/browser/algoquant Maybe even more exciting is that we ship this release with two quantitative trading strategies: one mean-reversion strategy and one trend-following strategy. More information can be found here: http://www.numericalmethod.com/trac/numericalmethod/wiki/AlgoQuant#TradingStrategies The question remains: when do you do mean reversion, and when do you do trend following? I will leave this to the reader to figure out. 😀... read more
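
For readers who want a starting point on that question, here are two toy signals that caricature the two styles on the same price window. These are illustrative only, not the strategies shipped with AlgoQuant.

```java
import java.util.Arrays;

/** Two toy signals contrasting the shipped strategy styles; illustrative only,
 *  not AlgoQuant's actual implementations. */
public class TwoSignals {
    static double mean(double[] p) { return Arrays.stream(p).average().orElse(0); }

    /** Trend following: go with the deviation from the window mean. */
    static int trendSignal(double[] window) {
        return window[window.length - 1] > mean(window) ? +1 : -1;
    }

    /** Mean reversion: bet against a large z-score deviation from the mean. */
    static int reversionSignal(double[] window) {
        double m = mean(window), var = 0;
        for (double p : window) var += (p - m) * (p - m);
        double sd = Math.sqrt(var / window.length);
        double z = (window[window.length - 1] - m) / sd;
        if (z > 1.0) return -1;   // stretched above the mean: sell
        if (z < -1.0) return +1;  // stretched below the mean: buy
        return 0;                 // no edge
    }

    public static void main(String[] args) {
        double[] window = {100, 100.5, 100.2, 100.4, 103}; // last print spikes up
        System.out.println("trend: " + trendSignal(window));         // +1: ride the move
        System.out.println("reversion: " + reversionSignal(window)); // -1: fade the spike
    }
}
```

Note that the two signals disagree on the very same data, which is precisely why "when to do which" is the interesting question.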

On White’s (2000) Reality Check

I was asked the following question the other day about White’s reality check. QUESTION: I was reading White’s (2000) paper (http://www.ssc.wisc.edu/~bhansen/718/White2000.pdf). It seems to suggest that to examine the P&L characteristics of a strategy, we should bootstrap the P&L. I am wondering why this makes sense. Why scramble the P&L time series? Why not scramble the original returns series, e.g., the returns of the S&P 500, paste the segments together, and feed the “fake” returns series into the strategy again? Then you can generate as many P&L time series as you like, and use them to compute the expected P&L, Sharpe ratio, etc. Would the second approach be more realistic than the approach suggested in the paper? ANSWER: 1) In the statistical literature, there are basically two approaches to tackling the data snooping bias problem. The first and more popular approach focuses on the data. It tries to avoid re-using the same data set, which can be done by testing a particular model on randomly chosen subsamples of the original time series, or by bootstrapping the original time series (e.g., returns). However, one may argue that this sampling approach is somewhat arbitrary and may lack the desired objectivity. See Chan, Karceski, and Lakonishok (1998) and Brock, Lakonishok, and LeBaron (1992) for examples of this sampling approach. Your proposed method basically falls into this category. The second and more “formal” approach focuses on models, by considering all relevant models (trading strategies) and constructing a test with a properly controlled type I error (test size). Unfortunately, this method is not feasible when the number of strategies being tested is large. White’s paper basically follows... read more
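
To make the bootstrap idea concrete, here is a minimal sketch that resamples a daily P&L series to approximate the sampling distribution of its mean. The data are hypothetical, and note the simplification: White actually uses the stationary bootstrap, which preserves serial dependence, whereas this sketch resamples i.i.d. for brevity.

```java
import java.util.Random;

/** A minimal i.i.d. bootstrap of a daily P&L series: resample with replacement many
 *  times to approximate the sampling distribution of the mean P&L. (White (2000) uses
 *  the stationary bootstrap, which preserves serial dependence; this sketch does not.) */
public class PnlBootstrap {
    public static double[] bootstrapMeans(double[] pnl, int trials, long seed) {
        Random rng = new Random(seed);
        double[] means = new double[trials];
        for (int t = 0; t < trials; t++) {
            double sum = 0;
            for (int i = 0; i < pnl.length; i++) sum += pnl[rng.nextInt(pnl.length)]; // draw with replacement
            means[t] = sum / pnl.length;
        }
        return means;
    }

    public static void main(String[] args) {
        double[] pnl = {120, -80, 40, 200, -150, 60, 90, -30}; // hypothetical daily P&L
        double[] means = bootstrapMeans(pnl, 10_000, 7L);
        int nonPositive = 0;
        for (double m : means) if (m <= 0) nonPositive++;
        // Fraction of resamples with mean P&L <= 0: a crude p-value for "no skill".
        System.out.println("P(mean <= 0) ~ " + (double) nonPositive / means.length);
    }
}
```

Bootstrapping the P&L directly tests the observed performance record itself, which is what the reality check needs in order to control for snooping across many strategies.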

Quantitative Trading: Economist Approach vs. Mathematician Approach

Thank you, Lewis, for introducing me to the field of “Quantitative Equity Portfolio Management”. It opens my eyes to the other end of the spectrum of “Quantitative Trading”. Apparently what Lewis considers quantitative trading is very different from what I consider quantitative trading. I call the former the economist approach and the latter the mathematician approach. This blog piece makes a very brief comparison and points out some new research directions that take advantage of both. Briefly, the economist approach is a two-step approach. The first step tries to predict the exceptional excess return, alpha, by examining its relationships with macroeconomic factors, such as momentum, dividends, growth, fundamentals, etc. The second step is capital allocation. The focus in the economist approach is on identifying the “right” economic factors. The mathematics employed is relatively simple: linear regression, (constrained) quadratic programming. The trading horizon is month-on-month, quarter-on-quarter, or even years. An example is the factor model in QEPM. In contrast, the mathematician approach tries to predict short-term price movements by building sophisticated mathematical models for, e.g., the price time series. The focus is on finding the right mathematics to better describe the statistical properties of the price process, e.g., stochastic calculus, Markov chains. Macroeconomic and fundamental factors are not often used. The trading horizon is intra-day, down to seconds. An example is volatility arbitrage on different intraday time scales. One way to appreciate the differences is to look at the trading horizons. When trading at high frequency, company fundamentals certainly have little relevance because, e.g., quarterly earnings do not change second-by-second. The statistical properties of the price process dominate at these... read more

Java vs. C++ Performance

It is very unfortunate that some people are still not aware that Java performance is comparable to that of C++. This blog piece collects evidence to support this claim. The wrong perception about Java’s slowness exists by and large because Java 1 in 1995 was indeed slower than C++. Java has improved a lot since then, e.g., with HotSpot. It is now at version 6 and will soon be at version 7. Java is now a competitive technology compared to C/C++. In fact, to realistically optimize C/C++, you need to find the “right” programmer to code it. This programmer needs to be aware of all the performance issues of C/C++, profiling, and code optimizations such as loop unrolling, and may even need to write code snippets in assembly. An average Joe coding in C/C++ is probably no faster than one coding in Java. (I am in general against code optimization techniques because they make the code unreadable to humans, hence unmaintainable, such as a lot of the FORTRAN/C/C++ code found in Netlib and Statlib.) More importantly, most modern software runs on multiple cores. Code optimization techniques are dwarfed by parallel computing technologies. It is significantly easier and more efficient (and more enjoyable) to write concurrent programming code in Java than in C++. Therefore, to code high-performance software, I personally prefer to code for multi-core, multi-CPU, and cloud in Java rather than doing code optimization in C/C++. (I AM NOT SURE WHY FORTRAN SURVIVES IN 2011. HOW ARE YOU SUPPOSED TO READ THOUSANDS OF LINES OF CODE ALL IN UPPER/LOWER CASE WITH A BUNCH OF C’S AND GOTO’S EVERYWHERE?)... read more
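
As an illustration of how little ceremony Java concurrency requires, here is a self-contained sketch that spreads a computation across a thread pool using only the standard `java.util.concurrent` package (available since Java 5; the lambda syntax is newer and used here for brevity). The task, summing squares, is of course just a stand-in for real CPU-intensive work.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

/** Summing squares across a thread pool with the standard library:
 *  no manual thread or lock management required. */
public class ParallelSum {
    public static long sumOfSquares(long n, int workers) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(workers);
        List<Future<Long>> parts = new ArrayList<>();
        long chunk = n / workers;
        for (int w = 0; w < workers; w++) {
            final long lo = w * chunk + 1;
            final long hi = (w == workers - 1) ? n : lo + chunk - 1;
            parts.add(pool.submit(() -> {           // each worker sums its own range
                long s = 0;
                for (long i = lo; i <= hi; i++) s += i * i;
                return s;
            }));
        }
        long total = 0;
        for (Future<Long> f : parts) total += f.get(); // join the partial sums
        pool.shutdown();
        return total;
    }

    public static void main(String[] args) throws Exception {
        // n(n+1)(2n+1)/6 for n = 1000 is 333833500.
        System.out.println(sumOfSquares(1000, 4)); // 333833500
    }
}
```

The equivalent in C++ (before std::thread and std::future matured) meant hand-rolled threads and synchronization, which is exactly the ease-of-concurrency point being made above.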
