
The massive extinction of technical indicators

Recently Camaron posted a link to a very interesting webinar with the catchy name The Mass Extinction of Technical Indicators?

The answer to this extinction seems to be to use technical indicators in other ways. The key phrases are:

predictive turning times;

predictive Institutional buy & sell times of algorithms;

predictive Institutional support and resistance zones;

predictive algorithmic time studies.


The most common word is predictive. So the question is:

How does a technical indicator become predictive?

Technical indicators become predictive through an activity called data mining.

What they are really offering is the result of their data mining process. The technical indicators are just ways to preprocess information; this information is then fed into a data mining routine. The results may eventually be other technical indicators, or simply BUY and SELL labels.
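As a sketch of that pipeline, here is a minimal example: an indicator (a simple moving average crossover, chosen only for illustration) preprocesses a simulated price series, and a trivial "mining" rule turns the preprocessed features into BUY/SELL labels. None of this reflects the webinar's actual method.

```python
import random

random.seed(42)

# Hypothetical price series (a random walk) standing in for real market data.
prices = [100.0]
for _ in range(300):
    prices.append(prices[-1] + random.gauss(0, 1))

def sma(series, period):
    """Simple moving average: the indicator used as a preprocessing step."""
    return [None] * (period - 1) + [
        sum(series[i - period + 1:i + 1]) / period
        for i in range(period - 1, len(series))
    ]

fast, slow = sma(prices, 5), sma(prices, 20)

# "Data mining" step: turn the preprocessed features into trade labels.
labels = []
for f, s in zip(fast, slow):
    if f is None or s is None:
        labels.append("HOLD")
    else:
        labels.append("BUY" if f > s else "SELL")

print(labels[-5:])
```

In a real mining pipeline the final step would be some search or learning routine rather than a fixed crossover rule, but the shape is the same: raw data, indicator preprocessing, and then labels out the other end.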


The key idea behind this webinar is that data mining would give you an edge. I challenge this: data mining comes with data mining bias. Moreover, plenty of commercial and free software has been doing this for a long time. Even the genetic algorithm in MetaTrader is a decent data mining tool. So behind this claim there is nothing but HYPE.
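To see what data mining bias means concretely, here is a small sketch. All the data is simulated noise and the strategy pool is invented: we score many random strategies on returns that contain no edge at all, and the best one still looks profitable, purely from selection.

```python
import random

random.seed(0)

# Pure-noise daily returns: there is no real edge to find here.
returns = [random.gauss(0, 0.01) for _ in range(250)]

def random_strategy_pnl(returns, rng):
    """A 'mined' strategy: a random long/flat/short position each day."""
    return sum(rng.choice([-1, 0, 1]) * r for r in returns)

rng = random.Random(1)
results = [random_strategy_pnl(returns, rng) for _ in range(1000)]

best = max(results)
avg = sum(results) / len(results)
print(f"best mined strategy: {best:.3f}, average strategy: {avg:.3f}")
# The best of 1000 random strategies shows a 'profit' on pure noise:
# that apparent edge is data mining bias, not predictive power.
```

The average strategy hovers near zero, as it should on noise, while the best of the batch looks like a winner. Any mining process that reports only its best result carries this bias.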


The real question is how you do your data mining, and that those guys will never tell you, because it is their proprietary secret. However, the reality is that the market is getting more and more sophisticated.

The first technical analysis was designed to beat three different categories of strategies:

- The common investors who deploy simple strategies such as buy and hold. In that case you follow their trend: they provide the trend with their money and you just ride the tide.

- The common investors who panic and sell. The whole goal of technical analysis was to analyse patterns of trend reversal. This work remains valid today.

- The smart money. This is the most difficult case: a situation where a large amount of wealth is controlled by an entity with the will to manipulate the market. This is the most challenging situation a technical analyst has to face. In fact there is no true answer for it, which is why analysts prefer more liquid markets that are somewhat more immune to this behaviour.

It is difficult to solve this because the smart money draws the market: they can draw patterns and use them as a trap. Some studies like VSA (Volume Spread Analysis) were designed to counter this kind of phenomenon, but I won't bet my shirt on it.

This approach operated in classical markets. The classical markets are those markets for which the fractal market hypothesis by Edgar E. Peters is valid.

"New capital-market theory that combines fractals and other concept from chaos theory with the traditional quantitative methods to explain and predict market behavior. FMH takes into account the daily randomness of the market and anomalies such as market crashes and stampedes. It proposes that a (1) market is stable and has sufficient liquidity when it comprises of investors with different time horizons, (2) these investors stay in their 'preferred habitat' (time horizon), no matter what the market information indicates, (3) the available information may not be reflected in the market prices, and (4) the market prices trend indicates the changes in expected earnings (which mirror long-term economic trends). Proposed by Edgar E. Peters, author of the 1991 book 'Chaos and order In The Capital Markets' and the 1994 book 'Fractal Market Analysis: Applying Chaos Theory to Investment and Economics.' Also called different investment horizon theory. See also capital market theories."

Read more: http://www.businessdictionary.com/definition/fractal-market-hypothesis-FMH.html#ixzz2TMARhR3X

However, with the advent of high-frequency trading this is largely challenged. In fact most of the activity is in just one time horizon, the low-latency horizon. Even the large investors need to time their entries within this horizon, otherwise they will get butchered by the predatory algorithms. Of course this is a big approximation and simplification.

All this use of algorithms necessitates extensive data mining, and data mining with different time horizons. We can say that different investors are using data mining with different time horizons. Do you see the slight difference here? In the past the time horizons were related to buy and sell decisions. Now the time horizons are related to scheduled data mining routines.

And those combinations of routines may explain the relative steadiness of market states in the current market. 


  • jaguar1637 3124 days ago

    Yes, I agree

    a solution could be a bunch of indicators from the fractals class, as written above

    but, sure, those algorithms require extensive data mining and a lot of work tuning and optimizing parameters. In practical terms, as far as I can see, big computers with powerful CPUs and plenty of memory are mandatory.

    To perform this task, it should be done in C, C++, or C#. In my humble opinion, MQL4 is now obsolete, or it must be connected to a database and mathematical software like MATLAB

  • JohnLast 3123 days ago

    The question I ask myself:

    How could you know that all the optimization effort was not in vain?

    Below those posts are my main ideas on the topic


    It looks like some strategies produce many more positive randomly generated results than others.

    This can be used as a kind of robustness measure for a strategy. Comparing spinal implant with Asctrend, both using the logistic kernel, I found that spinal implant produces fewer randomly generated positive equity flows than Asctrend.
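A minimal sketch of that kind of check, with made-up data and a buy-and-hold stand-in for the tested strategy (neither spinal implant nor Asctrend is reproduced here): run many random-entry variants over the same returns and see how often they end positive or beat the strategy.

```python
import random

random.seed(7)

# Hypothetical returns with a mild upward drift standing in for a real market.
returns = [random.gauss(0.0005, 0.01) for _ in range(500)]

def equity(positions, returns):
    """Final equity of a long/flat position series over the returns."""
    return sum(p * r for p, r in zip(positions, returns))

# Stand-in for the tested strategy: long on every day (buy and hold).
strategy_equity = equity([1] * len(returns), returns)

# Random-entry benchmark: many runs that are long on random days only.
rng = random.Random(3)
random_equities = [
    equity([rng.choice([0, 1]) for _ in returns], returns)
    for _ in range(500)
]

# Share of random runs that end positive, and share that beat the strategy:
# the fewer random runs that match it, the less likely its edge is luck.
positive_share = sum(e > 0 for e in random_equities) / len(random_equities)
beat_share = sum(e > strategy_equity for e in random_equities) / len(random_equities)
print(positive_share, beat_share)
```

A strategy that is beaten by a large share of its own random-entry variants owes most of its equity curve to the market, not to its signals.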



  • jaguar1637 3123 days ago

    The optimization could be done with the genetic algorithm

    I just uploaded it