Maybe I'm not understanding the rising() function, but it seems to be detecting a falling pattern instead of rising.

The condition below evaluates to true when the series is actually falling:
-------------------------------------------------------
vars UL_Trend = series(LowPass(Price,20));
printf(" -- UL_Trend[%.2f]", UL_Trend[0]);
if(rising(UL_Trend+5)) {
    printf(" -- UL_Trend Rising[%.2f]", UL_Trend[0]);
}

Printf results:
-------------------------------------------------------
[1: Tue 15-08-04 19:20] (209.37) -- UL_Trend[209.52]
[2: Wed 15-08-05 19:20] (210.06) -- UL_Trend[209.62]
[3: Thu 15-08-06 19:20] (208.36) -- UL_Trend[209.65]
[4: Fri 15-08-07 19:20] (207.96) -- UL_Trend[209.41]
[5: Mon 15-08-10 19:20] (210.57) -- UL_Trend[209.29]
[6: Tue 15-08-11 19:20] (208.67) -- UL_Trend[209.28]
[7: Wed 15-08-12 19:20] (208.92) -- UL_Trend[209.02] -- UL_Trend Rising[209.02]
[8: Thu 15-08-13 19:20] (208.66) -- UL_Trend[208.81] -- UL_Trend Rising[208.81]
------------------------------

Does rising(UL_Trend+5) actually test whether the series was rising 5 bars to the "left" — that is, 5 bars before the current bar?

Also, do all 5 bars need to be rising sequentially for rising() to return true, or just "generally rising" — something like a positive slope?

Thanks!