
Group activity

  • jaguar1637 commented on a bookmark Particle Swarm Optimization 2487 days ago
    A more logical function w/ the dimension added: static double ObjectiveFunction(double[] x, int dim) { z0 = x[0]; z1 = x[0]; y0 = xx[z0][dim]; y1 = xx[z1][dim]; return( 3.0 + (y0 * y0) + (y1 * y1)...
  • jaguar1637 commented on a bookmark Particle Swarm Optimization 2487 days ago
    Wouaaaaaahhhh, in the article the function replacing Rosenbrock, Rastrigin, etc. is... The base of the code in this article is that the dimension is just 2, so it does not explore...
  • JohnLast commented on a bookmark Particle Swarm Optimization 2487 days ago
    klot's code is so interesting because it shows how to integrate the learning algorithm with MetaTrader. Even if the genetic algorithm is different, they are close relatives. There is an algorithm for MT5, but the author says that it was just for...
  • JohnLast commented on a bookmark Particle Swarm Optimization 2488 days ago
    The whole code from the article is: using System; namespace ParticleSwarmOptimization { class Program { static Random ran = null; static void Main(string[] args) ...
  • JohnLast commented on a bookmark Particle Swarm Optimization 2488 days ago
    Is it you who wrote this code? I am asking because it is different from the code in the bookmark. I was thinking of doing exactly the same thing. However, the end would not be: static double ObjectiveFunction(double[] x) ...
  • jaguar1637 commented on a bookmark Particle Swarm Optimization 2488 days ago
    This article opens quite a new debate here, because the PSO is defined by: // The algorithm can be written as follows: // 1. Initialize each particle with a random velocity and random position. // 2. Calculate the cost for each particle. If the...
  • jaguar1637 commented on a bookmark Particle Swarm Optimization 2488 days ago
    Well, I checked and re-checked the stuff: it's quite complicated, but instead of all the complex functions for calculating the min value from Rosenbrock(a, DIMENSION, val) in 2 dimensions X & Y: double ObjectiveFunction(double x[]) { double...
  • JohnLast commented on a bookmark Particle Swarm Optimization 2488 days ago
    This article is very good because it explains how Particle Swarm Optimization works with a numeric example. It is really great to mix explanation with code. As it looks, there is a lot of reason in nature. You can find reason and...
  • jaguar1637 commented on a bookmark Particle Swarm Optimization 2488 days ago
    Hi John, I tried to create a Particle Swarm Optimization (thank you, Tovim). While working on plenty of releases, I was able to see the famous and relevant Big Bird, but in another release I lost it and was unable to retrieve the...
  • JohnLast bookmarked Particle Swarm Optimization 2488 days ago
    This is a very interesting article describing Particle Swarm Optimization. There is code available in C#. I think that if we can understand how klot (check this link for the EA) made his custom genetic optimization algorithm and bypassed the...
    Comments
    • jaguar1637 2487 days ago

      return(MathExp(gamma * ((y0 * y0) + (y1 * y1))));

      is like

      return(MathExp(gamma * (MathPow(y0, 2) + MathPow(y1, 2))));

      Look at this function.

    • JohnLast 2487 days ago

      MathExp(gamma*(-x1*x1-y1*y1));

      Produces:

      [image attached]

    • jaguar1637 2487 days ago

      Well, we should get something very interesting

      Before going forward, I would ask Vegastart to tell us what he thinks about this.
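
    The return statements quoted in the comments above are MQL-style one-liners. The short C# sketch below spells out, for the same two-dimensional case, what each objective computes; the function names, the sample point and the gamma default are illustrative assumptions, not taken from the bookmarked article.

      using System;

      class ObjectiveFunctions
      {
          // Sphere-style objective from the comments above: 3.0 + y0^2 + y1^2
          // (its minimum value is 3.0, reached at the origin).
          static double SphereObjective(double[] x)
          {
              return 3.0 + x[0] * x[0] + x[1] * x[1];
          }

          // Gaussian "bump" discussed above: exp(gamma * (-x^2 - y^2)).
          // gamma = 1.0 is an assumed placeholder; the comments do not fix its value.
          static double GaussianObjective(double[] x, double gamma = 1.0)
          {
              return Math.Exp(gamma * (-(x[0] * x[0]) - x[1] * x[1]));
          }

          static void Main()
          {
              double[] p = { 0.5, -0.25 };
              Console.WriteLine(SphereObjective(p));    // 3.3125
              Console.WriteLine(GaussianObjective(p));  // approximately 0.7316
          }
      }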
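
    For the swarm itself, here is a minimal, self-contained C# sketch of the update loop summarised in the comments above: initialize each particle with a random position and velocity, evaluate the cost, track personal and global bests, then update velocities and positions. The swarm size, iteration count, the 0.72/1.49 coefficients and the simple sphere cost are assumed defaults for illustration, not values from the bookmarked article.

      using System;

      class PsoSketch
      {
          static readonly Random Rng = new Random(0);

          // Illustrative cost: minimize 3.0 + x^2 + y^2 (minimum 3.0 at the origin).
          static double Cost(double[] x)
          {
              return 3.0 + x[0] * x[0] + x[1] * x[1];
          }

          static void Main()
          {
              const int dim = 2, particles = 20, iterations = 100;
              const double w = 0.72, c1 = 1.49, c2 = 1.49;   // assumed inertia / cognitive / social weights

              var pos = new double[particles][];
              var vel = new double[particles][];
              var bestPos = new double[particles][];
              var bestCost = new double[particles];
              double[] globalBest = null;
              double globalBestCost = double.MaxValue;

              // 1. Initialize each particle with a random position and velocity.
              for (int i = 0; i < particles; i++)
              {
                  pos[i] = new double[dim];
                  vel[i] = new double[dim];
                  for (int d = 0; d < dim; d++)
                  {
                      pos[i][d] = Rng.NextDouble() * 10.0 - 5.0;
                      vel[i][d] = Rng.NextDouble() * 2.0 - 1.0;
                  }
                  bestPos[i] = (double[])pos[i].Clone();
                  bestCost[i] = Cost(pos[i]);
                  if (bestCost[i] < globalBestCost)
                  {
                      globalBestCost = bestCost[i];
                      globalBest = (double[])pos[i].Clone();
                  }
              }

              // 2.-4. Evaluate the cost, update personal and global bests,
              // then update each particle's velocity and position.
              for (int it = 0; it < iterations; it++)
              {
                  for (int i = 0; i < particles; i++)
                  {
                      for (int d = 0; d < dim; d++)
                      {
                          vel[i][d] = w * vel[i][d]
                                    + c1 * Rng.NextDouble() * (bestPos[i][d] - pos[i][d])
                                    + c2 * Rng.NextDouble() * (globalBest[d] - pos[i][d]);
                          pos[i][d] += vel[i][d];
                      }
                      double c = Cost(pos[i]);
                      if (c < bestCost[i]) { bestCost[i] = c; bestPos[i] = (double[])pos[i].Clone(); }
                      if (c < globalBestCost) { globalBestCost = c; globalBest = (double[])pos[i].Clone(); }
                  }
              }

              Console.WriteLine("Best cost found: " + globalBestCost);   // approaches 3.0
          }
      }
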

  • JohnLast uploaded the file HFT Measurement Detection and Response 2490 days ago
    What does "Bad" HFT look like, how often does it happen, and how do we detect it? This report focuses on strategies that seek to create short-term mispricing and how to respond to those strategies.
  • Introduction To Stationary And Non-Stationary Processes. Check this article. It is interesting: according to this article, we need to use two types of indicators that transform non-stationary data into stationary data. "Using non-stationary time...
  • In this article on my blog I summarized my views on the proper way to obtain a robust solution through neuroevolution. Basically, I question the whole architecture of the neuron when you are going to use genetic optimization...
  • In this excellent article you can see an implementation of neural network code for MetaTrader 5. In the article, all the basic concepts for neural networks are explained in a user-friendly manner. At the end, code is given for a neural net...
    Comments
    • JohnLast 2484 days ago

      I was thinking that the RSI is used. Are you referring to this line:

      inputs[i]=(((iRSI_buf[i]-x_min)*(d2-d1))/(x_max-x_min))+d1;

      As for me, the main problem with this is that it uses every tick (correct me if I am wrong).

      When you use every tick you have to process a lot of information, and when you use the genetic optimizer you make things very tough for it. Not to mention the fact that the simulations for tick experts are quite unreliable.

      The second thing is the end of the article: you can't deduce from in-sample tests that a neural net EA is performing better. The truth is that using weights you can optimize (curve-fit) the past history so well that it looks like miracles.

      However, those neural nets fail miserably out of sample, and people still do this stuff because they love the concept behind it. Creating a stable artificial intelligence model is very hard.

      Here I will cite an opinion from a guy on LinkedIn; he finds that "the majority of researchers attempting to stabilise evolutionary trading models get the representation wrong (input data, filters and the fitness function) due to their lack of understanding of the market microstructure and unrealistic expectations arising from it (e.g. predicting price). Building a robust stationary model that has a quantifiable edge helps reducing the search space and defining the representation that can be subsequently employed in the adaptive layer of the system. The latter becomes the icing on the cake. :-)" (Vlad Shurupov).

      As for this code, it is a brilliant example of how the computer language is used to implement the neural net model: the use of arrays and the use of loops.



       

    • jaguar1637 2483 days ago

      Yes, John

      the RSI is used in this line:

      inputs[i]=(((iRSI_buf[i]-x_min)*(d2-d1))/(x_max-x_min))+d1;

      As for me the main problem with this is that it uses every tick (correct me if I am wrong). => Yes, it's true. A NN should run only once, when a new bar opens, with values provided from the last bars, not during the bar.

       

    • JohnLast 2483 days ago

      I think that is important for two reasons:

      -quality of the simulation

      -quality of training: for every pass the genetic optimizer would need to process every tick; on one hand this can add a lot of noise, and on the other it makes the training process very long and difficult.

       
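
    The normalization line discussed above is plain min-max scaling of the RSI buffer into a target range [d1, d2]. Below is a minimal C# sketch of that step, computed once per completed bar rather than on every tick; the sample RSI values and the [-1, 1] target range are assumptions for illustration, not taken from the article.

      using System;
      using System.Linq;

      class InputScaling
      {
          // Mirrors: inputs[i] = (((iRSI_buf[i] - x_min) * (d2 - d1)) / (x_max - x_min)) + d1;
          static double[] Rescale(double[] buffer, double d1, double d2)
          {
              double xMin = buffer.Min();
              double xMax = buffer.Max();
              if (xMax == xMin)
              {
                  // Degenerate case: a flat buffer maps to the lower bound.
                  return buffer.Select(_ => d1).ToArray();
              }
              return buffer
                  .Select(v => ((v - xMin) * (d2 - d1)) / (xMax - xMin) + d1)
                  .ToArray();
          }

          static void Main()
          {
              double[] rsi = { 28.0, 45.0, 71.0, 55.0 };    // sample RSI readings from closed bars
              double[] inputs = Rescale(rsi, -1.0, 1.0);    // scale into [-1, 1] for the network inputs
              Console.WriteLine(string.Join(", ", inputs)); // the minimum maps to -1, the maximum to 1
          }
      }
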

  • JohnLast commented on a bookmark Eureka: genetic programming software 2507 days ago
    I think that it will be possible to use the formulas generated by Eureka to modify the output of the indicator and thereby get a predictive indicator without much work. In the example from the screenshot we have several...
  • You can also check this interesting post by Jean-Philippe on his blog regarding the scaling laws.
  • by J.B. Glattfelder, A. Dupuis and R.B. Olsen. Abstract: We have discovered 17 new empirical scaling laws in foreign exchange data-series that hold for close to three orders of magnitude and across 13 currency...
  • JohnLast commented on a bookmark Eureka: genetic programming software 2508 days ago
    I think this tool Eureka is great for finding relationships between the forex market and fundamental data. In fact, this tool was built with that idea: to help scientists generate formulas that help them understand complex...
  • JohnLast commented on a bookmark Eureka: genetic programming software 2508 days ago
    This is genetic programming: the algorithm evolves basic equations in order to achieve the best fit. As a result we have equations that we can understand, not weights. In that way you can understand the results. For example, if you do the same...
  • jaguar1637 commented on a bookmark Eureka: genetic programming software 2509 days ago
    OK, was it a regression algorithm (Levenberg-Marquardt?) or a neural network that produced those results?