Volume Weighted LR Standard Deviation
This indicator analyzes market character by decomposing total volatility into three distinct, interpretable components based on a Linear Regression model.
Key Features:
Three-Component Volatility Decomposition: The indicator separates volatility based on the 'Estimate Bar Statistics' option.
Standard Mode (Estimate Bar Statistics = OFF): Calculates volatility based on the selected Source (this mainly yields 'Trend' and 'Residual' volatility).
Decomposition Mode (Estimate Bar Statistics = ON): The indicator uses a statistical model ('Estimator') to calculate within-bar volatility. (Assumption: In this mode, the Source input is ignored, and an estimated mean for each bar is used instead). This separates volatility into:
Trend Volatility (Green/Red): Volatility explained by the regression's slope (Momentum).
Residual Volatility (Yellow): Volatility from price oscillating around the regression line (Mean-Reversion).
Within-Bar Volatility (Blue): Volatility from the high-low range of each bar (Noise/Choppiness).
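To make the decomposition above concrete, here is a minimal, hedged Pine Script sketch of the same idea (not the published source): it uses an unweighted regression via ta.linreg, takes hl2 as each bar's mean, and uses a quarter of the high-low range as a crude stand-in for the within-bar sigma that the script's 'Estimator' models provide.
//@version=5
indicator("LR volatility decomposition - sketch", overlay = false)
len  = input.int(50, "Length", minval = 2)
mean = ta.sma(hl2, len)                  // plain mean stands in for the weighted mean
mid  = ta.linreg(hl2, len, 0)            // regression value at the latest bar
slope = mid - ta.linreg(hl2, len, 1)     // per-bar slope of the fitted line
varTrd = 0.0                             // variance explained by the slope ("Trend")
varRes = 0.0                             // variance of residuals around the line ("Residual")
varWtn = 0.0                             // within-bar variance ("Noise")
for i = 0 to len - 1
    yHat = mid - slope * i                               // fitted value i bars back
    varTrd += math.pow(yHat - mean, 2) / len
    varRes += math.pow(hl2[i] - yHat, 2) / len
    varWtn += math.pow((high[i] - low[i]) / 4, 2) / len  // crude within-bar sigma proxy
varTot = varTrd + varRes + varWtn
plot(varTrd / varTot, "Trend ratio", color.green)
plot(varRes / varTot, "Residual ratio", color.yellow)
plot(varWtn / varTot, "Within-bar ratio", color.blue)
The three plotted ratios always sum to 1, which is essentially what the Normalized mode shows; scaling them by the total standard deviation yields the stacked Absolute view.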
Dual Display Modes: The indicator offers two modes to visualize this decomposition:
Absolute Mode: Displays the total standard deviation as a stacked area chart, partitioned by the variance ratio of the three components.
Normalized Mode: Displays the direct variance ratio (proportion) of each component relative to the total (0-1), ideal for identifying the dominant market character.
Calculation Options:
Normalization: An optional 'Normalize Volatility' setting calculates an Exponential Regression Curve (log-space), making the analysis suitable for growth assets.
Volume Weighting: An option (Volume weighted) applies volume weighting to all regression and volatility calculations.
Multi-Component Pivot Detection: Includes a pivot detector that identifies significant turning points (highs and lows) in both the Total Volatility and the Trend Volatility Ratio. (Note: These pivots are only plotted when 'Plot Mode' is set to 'Absolute').
Note on Confirmation (Lag): Pivot signals are confirmed using a lookback method. A pivot is only plotted after the Pivot Right Bars input has passed, which introduces an inherent lag.
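As a hedged illustration of that lag, the standard Pine pattern looks like this: ta.pivothigh and ta.pivotlow only return a value once the right-hand bars have printed, so the marker appears Pivot Right Bars after the actual turning point.
//@version=5
indicator("Pivot confirmation lag - sketch", overlay = true)
pivotLeft  = input.int(5, "Pivot Left Bars")
pivotRight = input.int(5, "Pivot Right Bars")
ph = ta.pivothigh(close, pivotLeft, pivotRight)  // returns a value only pivotRight bars after the high
pl = ta.pivotlow(close, pivotLeft, pivotRight)   // returns a value only pivotRight bars after the low
plotshape(not na(ph), "Pivot high (confirmed late)", shape.triangledown, location.abovebar, color.red)
plotshape(not na(pl), "Pivot low (confirmed late)", shape.triangleup, location.belowbar, color.green)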
Multi-Timeframe (MTF) Capability:
MTF Volatility Lines: The volatility lines can be calculated on a higher timeframe, with standard options to handle gaps (Fill Gaps) and prevent repainting (Wait for...).
Limitation: The Pivot detection (Calculate Pivots) is disabled if a Higher Timeframe (HTF) is selected.
Integrated Alerts: Includes 9 comprehensive alerts for:
Volatility character changes (e.g., 'Character Change from Noise to Trend').
Dominant character emerging (e.g., 'Bullish Trend Character Emerging').
Total Volatility pivot (High/Low) detection.
Trend Volatility pivot (High/Low) detection.
DISCLAIMER
For Informational/Educational Use Only: This indicator is provided for informational and educational purposes only. It does not constitute financial, investment, or trading advice, nor is it a recommendation to buy or sell any asset.
Use at Your Own Risk: All trading decisions you make based on the information or signals generated by this indicator are made solely at your own risk.
No Guarantee of Performance: Past performance is not an indicator of future results. The author makes no guarantee regarding the accuracy of the signals or future profitability.
No Liability: The author shall not be held liable for any financial losses or damages incurred directly or indirectly from the use of this indicator.
Signals Are Not Recommendations: The alerts and visual signals (e.g., crossovers) generated by this tool are not direct recommendations to buy or sell. They are technical observations for your own analysis and consideration.
Volume Weighted Linear Regression Channel
This indicator plots a dynamic channel around a Linear Regression trendline. It provides a framework for identifying the prevailing trend and assessing price extremes based on volatility.
Key Features:
Linear Regression Baseline: The channel's centerline is a (Volume-Weighted) Linear Regression line. This line represents the 'best fit' for the recent price action, serving as a responsive baseline for the trend.
Volatility Decomposition: The indicator's primary feature is its ability to decompose volatility, controlled by the 'Estimate Bar Statistics' option.
Standard Mode (Estimate Bar Statistics = OFF): Calculates a standard linear regression channel. The bands represent the standard deviation of the residuals (the error) between the Source price and the regression line.
Decomposition Mode (Estimate Bar Statistics = ON): The indicator uses a statistical model ('Estimator') to calculate within-bar volatility. (Assumption: In this mode, the Source input is ignored, and an estimated mean for each bar is used for the regression). This mode displays two sets of bands:
Inner Bands: Show only the contribution of the 'residual' (trend noise) volatility, calculated proportionally.
Outer Bands: Show the total volatility (the sum of residual and within-bar components).
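For orientation, here is a hedged, unweighted sketch of the Standard Mode construction (not the published source): the centerline comes from ta.linreg and the bands are offset by the standard deviation of the residuals around that fitted line.
//@version=5
indicator("Linear regression channel - sketch", overlay = true)
len  = input.int(100, "Length", minval = 2)
mult = input.float(2.0, "Band width (std devs)")
mid   = ta.linreg(close, len, 0)           // regression value at the latest bar
slope = mid - ta.linreg(close, len, 1)     // per-bar slope of the fitted line
sumSq = 0.0
for i = 0 to len - 1
    resid = close[i] - (mid - slope * i)   // residual of each bar versus the fitted line
    sumSq += resid * resid
resDev = math.sqrt(sumSq / len)            // standard deviation of the residuals
plot(mid, "Regression line", color.orange)
plot(mid + mult * resDev, "Upper band", color.green)
plot(mid - mult * resDev, "Lower band", color.red)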
Volume Weighting: An option (Volume weighted) allows for volume to be incorporated into the calculation of both the linear regression and the volatility decomposition, giving more influence to high-participation bars.
Trend Projection: The calculated channel is plotted as a projection, which can be extended forward (Extend Forward) and backward (Extend Backward) in time to provide a visual guide for potential support and resistance.
Integrated Alerts: Includes a full set of built-in alerts for the Source price crossing over or under the calculated upper band, lower band, and the central regression line.
DISCLAIMER
For Informational/Educational Use Only: This indicator is provided for informational and educational purposes only. It does not constitute financial, investment, or trading advice, nor is it a recommendation to buy or sell any asset.
Use at Your Own Risk: All trading decisions you make based on the information or signals generated by this indicator are made solely at your own risk.
No Guarantee of Performance: Past performance is not an indicator of future results. The author makes no guarantee regarding the accuracy of the signals or future profitability.
No Liability: The author shall not be held liable for any financial losses or damages incurred directly or indirectly from the use of this indicator.
Signals Are Not Recommendations: The alerts and visual signals (e.g., crossovers) generated by this tool are not direct recommendations to buy or sell. They are technical observations for your own analysis and consideration.
Volume Weighted Volatility Regime
The Volume-Weighted Volatility Regime (VWVR) is a market analysis tool that dissects total volatility to classify the current market 'character' or 'regime'. Using a Linear Regression model, it decomposes volatility into Trend, Residual (mean-reversion), and Within-Bar (noise) components.
Key Features:
Seven-Stage Regime Classification: The indicator's primary output is a regime value from -3 to +3, identifying the market state:
+3 (Strong Bull Trend): High directional, upward volatility.
+2 (Choppy Bull): Moderate upward trend with noise.
+1 (Quiet Bull): Low volatility, slight upward drift.
0 (Neutral): No clear directional bias.
-1 (Quiet Bear): Low volatility, slight downward drift.
-2 (Choppy Bear): Moderate downward trend with noise.
-3 (Strong Bear Trend): High directional, downward volatility.
Advanced Volatility Decomposition: The regime is derived from a three-component volatility model that separates price action into Trend (momentum), Residual (mean-reversion), and Within-Bar (noise) variance. The classification is determined by comparing the 'Trend' ratio against the user-defined 'Trend Threshold' and 'Quiet Threshold'.
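One plausible, heavily hedged mapping from a trend ratio to the seven regimes (not the script's exact rules), using the R² of a plain regression as a crude stand-in for the volume-weighted Trend ratio:
//@version=5
indicator("Regime mapping - sketch", overlay = false)
len      = input.int(50, "StdDev Length", minval = 2)
trendThr = input.float(0.6, "Trend Threshold")
quietThr = input.float(0.2, "Quiet Threshold")
trendRatio = math.pow(ta.correlation(close, bar_index, len), 2)      // R² as a stand-in for the trend-variance share
dir = ta.linreg(close, len, 0) >= ta.linreg(close, len, 1) ? 1 : -1  // sign of the regression slope
// Plausible mapping only: above Trend Threshold = strong trend, between the
// thresholds = choppy, below Quiet Threshold = quiet drift. The script's exact
// rules (including its neutral condition) may differ.
strength = trendRatio >= trendThr ? 3 : trendRatio >= quietThr ? 2 : trendRatio > 0 ? 1 : 0
regime = dir * strength
plot(regime, "Regime (-3..+3)", color.teal, 2, plot.style_stepline)
hline(0, "Neutral")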
Dual-Level Analysis: The indicator analyzes market character on two levels simultaneously:
Inter-Bar Regime (Background Color): Based on the main StdDev Length, showing the overall market character.
Intra-Bar Regime (Column Color): Based on a high-resolution analysis within each single bar ('Intra-Bar Timeframe'), showing the micro-structural character.
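A hedged sketch of how intra-bar statistics can be pulled in Pine (the script's actual inputs and estimators may differ): request.security_lower_tf returns an array with one element per intrabar, from which a within-bar dispersion can be computed.
//@version=5
indicator("Intra-bar data - sketch", overlay = false)
intraTf = input.timeframe("1", "Intra-Bar Timeframe")  // must be lower than the chart timeframe
intraMeans = request.security_lower_tf(syminfo.tickerid, intraTf, hl2)  // one hl2 value per intrabar
intraStdev = array.size(intraMeans) > 1 ? array.stdev(intraMeans) : na  // dispersion of the intrabar means
plot(intraStdev, "Within-bar std dev", color.blue)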
Calculation Options:
Statistical Model: The 'Estimate Bar Statistics' option (enabled by default) uses a statistical model ('Estimator') to perform the decomposition. (Assumption: In this mode, the Source input is ignored, and an estimated mean for each bar is used instead).
Normalization: An optional 'Normalize Volatility' setting calculates an Exponential Regression Curve (log-space).
Volume Weighting: An option (Volume weighted) applies volume weighting to all volatility calculations.
Multi-Timeframe (MTF) Capability: The entire dual-level analysis can be run on a higher timeframe (using the Timeframe input), with standard options to handle gaps (Fill Gaps) and prevent repainting (Wait for...).
Integrated Alerts: Includes 22 comprehensive alerts that trigger whenever the 'Inter-Bar Regime' or the 'Intra-Bar Regime' crosses one of the key thresholds (e.g., 'Regime crosses above Neutral Line'), or when the 'Intra-Bar Dominance' crosses the 50% mark.
Caution: Real-Time Data Behavior (Intra-Bar Repainting) This indicator uses high-resolution intra-bar data. As a result, the values on the current, unclosed bar (the real-time bar) will update dynamically as new intra-bar data arrives. This behavior is normal and necessary for this type of analysis. Signals should only be considered final after the main chart bar has closed.
DISCLAIMER
For Informational/Educational Use Only: This indicator is provided for informational and educational purposes only. It does not constitute financial, investment, or trading advice, nor is it a recommendation to buy or sell any asset.
Use at Your Own Risk: All trading decisions you make based on the information or signals generated by this indicator are made solely at your own risk.
No Guarantee of Performance: Past performance is not an indicator of future results. The author makes no guarantee regarding the accuracy of the signals or future profitability.
No Liability: The author shall not be held liable for any financial losses or damages incurred directly or indirectly from the use of this indicator.
Signals Are Not Recommendations: The alerts and visual signals (e.g., crossovers) generated by this tool are not direct recommendations to buy or sell. They are technical observations for your own analysis and consideration.
Volume Weighted Intra Bar LR Standard Deviation
This indicator analyzes market character by providing a detailed view of volatility. It applies a Linear Regression model to intra-bar price action, dissecting the total volatility of each bar into three distinct components.
Key Features:
Three-Component Volatility Decomposition: By analyzing a lower timeframe ('Intra-Bar Timeframe'), the indicator separates each bar's volatility into:
Trend Volatility (Green/Red): Volatility explained by the intra-bar linear regression slope (Momentum).
Residual Volatility (Yellow): Volatility from price oscillating around the intra-bar trendline (Mean-Reversion).
Within-Bar Volatility (Blue): Volatility derived from the range of each intra-bar candle (Noise/Choppiness).
Layered Column Visualization: The indicator plots these components as a layered column chart. The size of each colored layer visually represents the dominance of each volatility character.
Dual Display Modes: The indicator offers two modes to visualize this decomposition:
Absolute Mode: Displays the total standard deviation as the column height, showing the absolute magnitude of volatility and the contribution of each component.
Normalized Mode: Displays the components as a 100% stacked column chart (scaled from 0 to 1), focusing purely on the percentage ratio of Trend, Residual, and Noise.
Calculation Options:
Statistical Model: The 'Estimate Bar Statistics' option (enabled by default) uses a statistical model ('Estimator') to perform the decomposition. (Assumption: In this mode, the Source input is ignored, and an estimated mean for each bar is used instead).
Normalization: An optional 'Normalize Volatility' setting calculates an Exponential Regression Curve (log-space).
Volume Weighting: An option (Volume weighted) applies volume weighting to all intra-bar calculations.
Multi-Component Pivot Detection: Includes a pivot detector that identifies significant turning points (highs and lows) in both the Total Volatility and the Trend Volatility Ratio. (Note: These pivots are only plotted when 'Plot Mode' is set to 'Absolute').
Note on Confirmation (Lag): Pivot signals are confirmed using a lookback method. A pivot is only plotted after the Pivot Right Bars input has passed, which introduces an inherent lag.
Multi-Timeframe (MTF) Capability:
MTF Analysis: The entire intra-bar analysis can be run on a higher timeframe (using the Timeframe input), with standard options to handle gaps (Fill Gaps) and prevent repainting (Wait for...).
Limitation: The Pivot detection (Calculate Pivots) is disabled if a Higher Timeframe (HTF) is selected.
Integrated Alerts: Includes 9 comprehensive alerts for:
Volatility character changes (e.g., 'Character Change from Noise to Trend').
Dominant character emerging (e.g., 'Bullish Trend Character Emerging').
Total Volatility pivot (High/Low) detection.
Trend Volatility pivot (High/Low) detection.
Caution! Real-Time Data Behavior (Intra-Bar Repainting) This indicator uses high-resolution intra-bar data. As a result, the values on the current, unclosed bar (the real-time bar) will update dynamically as new intra-bar data arrives. This behavior is normal and necessary for this type of analysis. Signals should only be considered final after the main chart bar has closed.
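A hedged illustration of treating signals as final only on bar close, using Pine's bar-state built-ins; the signal series here is a placeholder, not this indicator's logic.
//@version=5
indicator("Confirmed-bar gating - sketch", overlay = true)
signal = ta.crossover(close, ta.sma(close, 20))    // placeholder signal, not this indicator's logic
confirmedSignal = signal and barstate.isconfirmed  // true only once the chart bar has closed
plotshape(confirmedSignal, "Confirmed signal", shape.circle, location.belowbar, color.green)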
DISCLAIMER
For Informational/Educational Use Only: This indicator is provided for informational and educational purposes only. It does not constitute financial, investment, or trading advice, nor is it a recommendation to buy or sell any asset.
Use at Your Own Risk: All trading decisions you make based on the information or signals generated by this indicator are made solely at your own risk.
No Guarantee of Performance: Past performance is not an indicator of future results. The author makes no guarantee regarding the accuracy of the signals or future profitability.
No Liability: The author shall not be held liable for any financial losses or damages incurred directly or indirectly from the use of this indicator.
Signals Are Not Recommendations: The alerts and visual signals (e.g., crossovers) generated by this tool are not direct recommendations to buy or sell. They are technical observations for your own analysis and consideration.
Volume Weighted Linear Regression Band
The Volume-Weighted Linear Regression Band (VWLRBd) is a volatility channel that uses a Linear Regression line as its dynamic baseline. Its primary feature is the decomposition of total volatility into two distinct components, visualized as layered bands.
Key Features:
Volatility Decomposition: The indicator separates volatility based on the 'Estimate Bar Statistics' option.
Standard Mode (Estimate Bar Statistics = OFF): The indicator functions as a standard (Volume-Weighted) Linear Regression Channel. It plots a single set of bands based on the standard deviation of the residuals (the error between the Source price and the regression line).
Decomposition Mode (Estimate Bar Statistics = ON): The indicator uses a statistical model ('Estimator') to calculate within-bar volatility. (Assumption: In this mode, the Source input is ignored, and an estimated mean for each bar is used for the regression). This mode displays two sets of bands:
Inner Bands: Show only the contribution of the 'residual' (trend noise) volatility, calculated proportionally.
Outer Bands: Show the total volatility (the sum of residual and within-bar components).
Regression Baseline (Linear / Exponential): The central line is a (Volume-Weighted) Linear Regression curve. An optional 'Normalize' mode performs all calculations in logarithmic space, transforming the baseline into an Exponential Regression Curve and the bands into constant percentage deviations, suitable for analyzing growth assets.
Volume Weighting: An option (Volume weighted) allows for volume to be incorporated into the calculation of both the regression baseline and the volatility decomposition, giving more influence to high-participation bars.
Multi-Timeframe (MTF) Engine: The indicator includes an MTF conversion block. When a Higher Timeframe (HTF) is selected, advanced options become available: Fill Gaps handles data gaps, and Wait for timeframe to close prevents repainting by ensuring the indicator only updates when the HTF bar closes.
Integrated Alerts: Includes a full set of built-in alerts for the source price crossing over or under the central regression line and the outermost calculated volatility band.
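To illustrate the 'Normalize' (log-space) option described above, here is a hedged, unweighted sketch (not the published source): fitting the regression to the logarithm of price turns the baseline into an exponential curve and the bands into constant-percentage offsets.
//@version=5
indicator("Exponential regression band - sketch", overlay = true)
len  = input.int(100, "Length", minval = 2)
mult = input.float(2.0, "Band width (std devs)")
logMid   = ta.linreg(math.log(close), len, 0)
logSlope = logMid - ta.linreg(math.log(close), len, 1)
sumSq = 0.0
for i = 0 to len - 1
    resid = math.log(close[i]) - (logMid - logSlope * i)  // residual in log space
    sumSq += resid * resid
logDev = math.sqrt(sumSq / len)
base = math.exp(logMid)                                   // exponential regression baseline
plot(base, "Baseline", color.orange)
plot(base * math.exp(mult * logDev), "Upper band", color.green)   // constant-percentage offset
plot(base * math.exp(-mult * logDev), "Lower band", color.red)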
DISCLAIMER
For Informational/Educational Use Only: This indicator is provided for informational and educational purposes only. It does not constitute financial, investment, or trading advice, nor is it a recommendation to buy or sell any asset.
Use at Your Own Risk: All trading decisions you make based on the information or signals generated by this indicator are made solely at your own risk.
No Guarantee of Performance: Past performance is not an indicator of future results. The author makes no guarantee regarding the accuracy of the signals or future profitability.
No Liability: The author shall not be held liable for any financial losses or damages incurred directly or indirectly from the use of this indicator.
Signals Are Not Recommendations: The alerts and visual signals (e.g., crossovers) generated by this tool are not direct recommendations to buy or sell. They are technical observations for your own analysis and consideration.
LibWght
Library "LibWght"
This is a library of mathematical and statistical functions
designed for quantitative analysis in Pine Script. Its core
principle is the integration of a custom weighting series
(e.g., volume) into a wide array of standard technical
analysis calculations.
Key Capabilities:
1. **Universal Weighting:** All exported functions accept a `weight`
parameter. This allows standard calculations (like moving
averages, RSI, and standard deviation) to be influenced by an
external data series, such as volume or tick count.
2. **Weighted Averages and Indicators:** Includes a comprehensive
collection of weighted functions:
- **Moving Averages:** `wSma`, `wEma`, `wWma`, `wRma` (Wilder's),
`wHma` (Hull), and `wLSma` (Least Squares / Linear Regression).
- **Oscillators & Ranges:** `wRsi`, `wAtr` (Average True Range),
`wTr` (True Range), and `wR` (High-Low Range).
3. **Volatility Decomposition:** Provides functions to decompose
total variance into distinct components for market analysis.
- **Two-Way Decomposition (`wTotVar`):** Separates variance into
**between-bar** (directional) and **within-bar** (noise)
components.
- **Three-Way Decomposition (`wLRTotVar`):** Decomposes variance
relative to a linear regression into **Trend** (explained by
the LR slope), **Residual** (mean-reversion around the
LR line), and **Within-Bar** (noise) components.
- **Local Volatility (`wLRLocTotStdDev`):** Measures the total
"noise" (within-bar + residual) around the trend line.
4. **Weighted Statistics and Regression:** Provides a robust
function for Weighted Linear Regression (`wLinReg`) and a
full suite of related statistical measures:
- **Between-Bar Stats:** `wBtwVar`, `wBtwStdDev`, `wBtwStdErr`.
- **Residual Stats:** `wResVar`, `wResStdDev`, `wResStdErr`.
5. **Fallback Mechanism:** All functions are designed for reliability.
If the total weight over the lookback period is zero (e.g., in
a no-volume period), the algorithms automatically fall back to
their unweighted, uniform-weight equivalents (e.g., `wSma`
becomes a standard `ta.sma`), preventing errors and ensuring
continuous calculation.
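A hedged, simplified sketch of that fallback pattern (not the library's actual implementation): a linear-kernel weighted mean that degrades to a plain average when the summed weight is zero.
//@version=5
indicator("Weighted mean fallback - sketch", overlay = true)
wMeanSketch(float src, float w, int len) =>
    sumWV = 0.0
    sumW  = 0.0
    sumV  = 0.0
    for i = 0 to len - 1
        wi = nz(w[i], 0)
        sumWV += wi * src[i]
        sumW  += wi
        sumV  += src[i]
    sumW > 0 ? sumWV / sumW : sumV / len   // fall back to the arithmetic mean if Σweight = 0
plot(wMeanSketch(close, volume, 20), "Weighted mean", color.orange)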
---
**DISCLAIMER**
This library is provided "AS IS" and for informational and
educational purposes only. It does not constitute financial,
investment, or trading advice.
The author assumes no liability for any errors, inaccuracies,
or omissions in the code. Using this library to build
trading indicators or strategies is entirely at your own risk.
As a developer using this library, you are solely responsible
for the rigorous testing, validation, and performance of any
scripts you create based on these functions. The author shall
not be held liable for any financial losses incurred directly
or indirectly from the use of this library or any scripts
derived from it.
wSma(source, weight, length)
Weighted Simple Moving Average (linear kernel).
Parameters:
source (float) : series float Data to average.
weight (float) : series float Weight series.
length (int) : series int Look-back length ≥ 1.
Returns: series float Linear-kernel weighted mean; falls back to
the arithmetic mean if Σweight = 0.
wEma(source, weight, length)
Weighted EMA (exponential kernel).
Parameters:
source (float) : series float Data to average.
weight (float) : series float Weight series.
length (simple int) : simple int Look-back length ≥ 1.
Returns: series float Exponential-kernel weighted mean; falls
back to classic EMA if Σweight = 0.
wWma(source, weight, length)
Weighted WMA (linear kernel).
Parameters:
source (float) : series float Data to average.
weight (float) : series float Weight series.
length (int) : series int Look-back length ≥ 1.
Returns: series float Linear-kernel weighted mean; falls back to
classic WMA if Σweight = 0.
wRma(source, weight, length)
Weighted RMA (Wilder kernel, α = 1/len).
Parameters:
source (float) : series float Data to average.
weight (float) : series float Weight series.
length (simple int) : simple int Look-back length ≥ 1.
Returns: series float Wilder-kernel weighted mean; falls back to
classic RMA if Σweight = 0.
wHma(source, weight, length)
Weighted HMA (linear kernel).
Parameters:
source (float) : series float Data to average.
weight (float) : series float Weight series.
length (int) : series int Look-back length ≥ 1.
Returns: series float Linear-kernel weighted mean; falls back to
classic HMA if Σweight = 0.
wRsi(source, weight, length)
Weighted Relative Strength Index.
Parameters:
source (float) : series float Price series.
weight (float) : series float Weight series.
length (simple int) : simple int Look-back length ≥ 1.
Returns: series float Weighted RSI; uniform if Σw = 0.
wAtr(tr, weight, length)
Weighted ATR (Average True Range).
Implemented as WRMA on *true range*.
Parameters:
tr (float) : series float True Range series.
weight (float) : series float Weight series.
length (simple int) : simple int Look-back length ≥ 1.
Returns: series float Weighted ATR; uniform weights if Σw = 0.
wTr(tr, weight, length)
Weighted True Range over a window.
Parameters:
tr (float) : series float True Range series.
weight (float) : series float Weight series.
length (int) : series int Look-back length ≥ 1.
Returns: series float Weighted mean of TR; uniform if Σw = 0.
wR(r, weight, length)
Weighted High-Low Range over a window.
Parameters:
r (float) : series float High-Low per bar.
weight (float) : series float Weight series.
length (int) : series int Look-back length ≥ 1.
Returns: series float Weighted mean of range; uniform if Σw = 0.
wBtwVar(source, weight, length, biased)
Weighted Between Variance (biased/unbiased).
Parameters:
source (float) : series float Data series.
weight (float) : series float Weight series.
length (int) : series int Look-back length ≥ 2.
biased (bool) : series bool true → population (biased); false → sample.
Returns:
variance series float The calculated between-bar variance (σ²btw), either biased or unbiased.
sumW series float The sum of weights over the lookback period (Σw).
sumW2 series float The sum of squared weights over the lookback period (Σw²).
wBtwStdDev(source, weight, length, biased)
Weighted Between Standard Deviation.
Parameters:
source (float) : series float Data series.
weight (float) : series float Weight series.
length (int) : series int Look-back length ≥ 2.
biased (bool) : series bool true → population (biased); false → sample.
Returns: series float σbtw; uniform if Σw = 0.
wBtwStdErr(source, weight, length, biased)
Weighted Between Standard Error.
Parameters:
source (float) : series float Data series.
weight (float) : series float Weight series.
length (int) : series int Look-back length ≥ 2.
biased (bool) : series bool true → population (biased); false → sample.
Returns: series float √(σ²btw / N_eff); uniform if Σw = 0.
wTotVar(mu, sigma, weight, length, biased)
Weighted Total Variance (= between-group + within-group).
Useful when each bar represents an aggregate with its own
mean* and pre-estimated σ (e.g., second-level ranges inside a
1-minute bar). Assumes the *weight* series applies to both the
group means and their σ estimates.
Parameters:
mu (float) : series float Group means (e.g., HL2 of 1-second bars).
sigma (float) : series float Pre-estimated σ of each group (same basis).
weight (float) : series float Weight series (volume, ticks, …).
length (int) : series int Look-back length ≥ 2.
biased (bool) : series bool true → population (biased); false → sample.
Returns:
varBtw series float The between-bar variance component (σ²btw).
varWtn series float The within-bar variance component (σ²wtn).
sumW series float The sum of weights over the lookback period (Σw).
sumW2 series float The sum of squared weights over the lookback period (Σw²).
wTotStdDev(mu, sigma, weight, length, biased)
Weighted Total Standard Deviation.
Parameters:
mu (float) : series float Group means (e.g., HL2 of 1-second bars).
sigma (float) : series float Pre-estimated σ of each group (same basis).
weight (float) : series float Weight series (volume, ticks, …).
length (int) : series int Look-back length ≥ 2.
biased (bool) : series bool true → population (biased); false → sample.
Returns: series float σtot.
wTotStdErr(mu, sigma, weight, length, biased)
Weighted Total Standard Error.
SE = √( total variance / N_eff ) with the same effective sample
size logic as `wster()`.
Parameters:
mu (float) : series float Group means (e.g., HL2 of 1-second bars).
sigma (float) : series float Pre-estimated σ of each group (same basis).
weight (float) : series float Weight series (volume, ticks, …).
length (int) : series int Look-back length ≥ 2.
biased (bool) : series bool true → population (biased); false → sample.
Returns: series float √(σ²tot / N_eff).
wLinReg(source, weight, length)
Weighted Linear Regression.
Parameters:
source (float) : series float Data series.
weight (float) : series float Weight series.
length (int) : series int Look-back length ≥ 2.
Returns:
mid series float The estimated value of the regression line at the most recent bar.
slope series float The slope of the regression line.
intercept series float The intercept of the regression line.
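For orientation, here is a hedged sketch of a weighted least-squares fit over the window; the library's internal conventions, such as the x-axis origin, may differ.
//@version=5
indicator("Weighted linear regression - sketch", overlay = true)
wLinRegSketch(float src, float w, int len) =>
    sumW   = 0.0
    sumWX  = 0.0
    sumWY  = 0.0
    sumWXX = 0.0
    sumWXY = 0.0
    for i = 0 to len - 1
        x  = len - 1 - i                // x = 0 at the oldest bar, len - 1 at the current bar
        wi = nz(w[i], 0)
        sumW   += wi
        sumWX  += wi * x
        sumWY  += wi * src[i]
        sumWXX += wi * x * x
        sumWXY += wi * x * src[i]
    denom     = sumW * sumWXX - sumWX * sumWX
    slope     = denom != 0 ? (sumW * sumWXY - sumWX * sumWY) / denom : 0.0
    intercept = sumW != 0 ? (sumWY - slope * sumWX) / sumW : na
    mid       = intercept + slope * (len - 1)   // regression value at the most recent bar
    [mid, slope, intercept]
[mid, slope, intercept] = wLinRegSketch(hl2, volume, 50)
plot(mid, "Weighted LR midline", color.orange)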
wResVar(source, weight, midLine, slope, length, biased)
Weighted Residual Variance around the linear regression – optionally biased (population) or unbiased (sample).
Parameters:
source (float) : series float Data series.
weight (float) : series float Weighting series (volume, etc.).
midLine (float) : series float Regression value at the last bar.
slope (float) : series float Slope per bar.
length (int) : series int Look-back length ≥ 2.
biased (bool) : series bool true → population variance (σ²_P), denominator ≈ N_eff.
false → sample variance (σ²_S), denominator ≈ N_eff - 2.
(Adjusts for 2 degrees of freedom lost to the regression).
Returns:
variance series float The calculated residual variance (σ²res), either biased or unbiased.
sumW series float The sum of weights over the lookback period (Σw).
sumW2 series float The sum of squared weights over the lookback period (Σw²).
wResStdDev(source, weight, midLine, slope, length, biased)
Weighted Residual Standard Deviation.
Parameters:
source (float) : series float Data series.
weight (float) : series float Weight series.
midLine (float) : series float Regression value at the last bar.
slope (float) : series float Slope per bar.
length (int) : series int Look-back length ≥ 2.
biased (bool) : series bool true → population (biased); false → sample.
Returns: series float σres; uniform if Σw = 0.
wResStdErr(source, weight, midLine, slope, length, biased)
Weighted Residual Standard Error.
Parameters:
source (float) : series float Data series.
weight (float) : series float Weight series.
midLine (float) : series float Regression value at the last bar.
slope (float) : series float Slope per bar.
length (int) : series int Look-back length ≥ 2.
biased (bool) : series bool true → population (biased); false → sample.
Returns: series float √(σ²res / N_eff); uniform if Σw = 0.
wLRTotVar(mu, sigma, weight, midLine, slope, length, biased)
Weighted Linear-Regression Total Variance **around the
window’s weighted mean μ**.
σ²_tot = E_w[σ_i²]   ⟶ *within-group variance*
       + Var_w(r_i)  ⟶ *residual variance*
       + Var_w(ŷ_i)  ⟶ *trend variance*
where each bar i in the look-back window contributes
m_i = *mean* (e.g. 1-sec HL2)
σ_i = *sigma* (pre-estimated intrabar σ)
w_i = *weight* (volume, ticks, …)
ŷ_i = b₀ + b₁·x (value of the weighted LR line)
r_i = m_i − ŷ_i (orthogonal residual)
Parameters:
mu (float) : series float Per-bar mean m_i.
sigma (float) : series float Pre-estimated σ_i of each bar.
weight (float) : series float Weight series w_i (≥ 0).
midLine (float) : series float Regression value at the latest bar (ŷₙ₋₁).
slope (float) : series float Slope b₁ of the regression line.
length (int) : series int Look-back length ≥ 2.
biased (bool) : series bool true → population; false → sample.
Returns:
varRes series float The residual variance component (σ²res).
varWtn series float The within-bar variance component (σ²wtn).
varTrd series float The trend variance component (σ²trd), explained by the linear regression.
sumW series float The sum of weights over the lookback period (Σw).
sumW2 series float The sum of squared weights over the lookback period (Σw²).
wLRTotStdDev(mu, sigma, weight, midLine, slope, length, biased)
Weighted Linear-Regression Total Standard Deviation.
Parameters:
mu (float) : series float Per-bar mean m_i.
sigma (float) : series float Pre-estimated σ_i of each bar.
weight (float) : series float Weight series w_i (≥ 0).
midLine (float) : series float Regression value at the latest bar (ŷₙ₋₁).
slope (float) : series float Slope b₁ of the regression line.
length (int) : series int Look-back length ≥ 2.
biased (bool) : series bool true → population; false → sample.
Returns: series float √(σ²tot).
wLRTotStdErr(mu, sigma, weight, midLine, slope, length, biased)
Weighted Linear-Regression Total Standard Error.
SE = √( σ²_tot / N_eff ) with N_eff = (Σw)² / Σw² (like in wster()).
Parameters:
mu (float) : series float Per-bar mean m_i.
sigma (float) : series float Pre-estimated σ_i of each bar.
weight (float) : series float Weight series w_i (≥ 0).
midLine (float) : series float Regression value at the latest bar (ŷₙ₋₁).
slope (float) : series float Slope b₁ of the regression line.
length (int) : series int Look-back length ≥ 2.
biased (bool) : series bool true → population; false → sample.
Returns: series float √((σ²res + σ²wtn + σ²trd) / N_eff).
wLRLocTotStdDev(mu, sigma, weight, midLine, slope, length, biased)
Weighted Linear-Regression Local Total Standard Deviation.
Measures the total "noise" (within-bar + residual) around the trend.
Parameters:
mu (float) : series float Per-bar mean m_i.
sigma (float) : series float Pre-estimated σ_i of each bar.
weight (float) : series float Weight series w_i (≥ 0).
midLine (float) : series float Regression value at the latest bar (ŷₙ₋₁).
slope (float) : series float Slope b₁ of the regression line.
length (int) : series int Look-back length ≥ 2.
biased (bool) : series bool true → population; false → sample.
Returns: series float √(σ²wtn + σ²res).
wLRLocTotStdErr(mu, sigma, weight, midLine, slope, length, biased)
Weighted Linear-Regression Local Total Standard Error.
Parameters:
mu (float) : series float Per-bar mean m_i.
sigma (float) : series float Pre-estimated σ_i of each bar.
weight (float) : series float Weight series w_i (≥ 0).
midLine (float) : series float Regression value at the latest bar (ŷₙ₋₁).
slope (float) : series float Slope b₁ of the regression line.
length (int) : series int Look-back length ≥ 2.
biased (bool) : series bool true → population; false → sample.
Returns: series float √((σ²wtn + σ²res) / N_eff).
wLSma(source, weight, length)
Weighted Least Square Moving Average.
Parameters:
source (float) : series float Data series.
weight (float) : series float Weight series.
length (int) : series int Look-back length ≥ 2.
Returns: series float Least square weighted mean. Falls back
to unweighted regression if Σw = 0.
Smooth Theil-Sen
I wanted to build a Theil-Sen estimator that could run on more than one bar and produce smoother output than the standard implementation. Theil-Sen regression is a non-parametric method that calculates the median slope between all pairs of points in your dataset, which makes it extremely robust to outliers. The problem is that median operations produce discrete jumps, especially when you're working with limited sample sizes. Every time the median shifts from one value to another, you get a step change in your regression line, which creates visual choppiness that can be distracting even though the underlying calculations are sound.
The solution I ended up going with was convolving a Gaussian kernel around the center of the sorted lists to get a more continuous median estimate. Instead of just picking the middle value or averaging the two middle values when you have an even sample size, the Gaussian kernel weights the values near the center more heavily and smoothly tapers off as you move away from the median position. This creates a weighted average that behaves like a median in terms of robustness but produces much smoother transitions as new data points arrive and the sorted list shifts.
There are variance tradeoffs with this approach since you're no longer using the pure median, but they're minimal in practice. The kernel weighting stays concentrated enough around the center that you retain most of the outlier resistance that makes Theil-Sen useful in the first place. What you gain is a regression line that updates smoothly instead of jumping discretely, which makes it easier to spot genuine trend changes versus just the statistical noise of median recalculation. The smoothness is particularly noticeable when you're running the estimator over longer lookback periods where the sorted list is large enough that small kernel adjustments have less impact on the overall center of mass.
The Gaussian kernel itself is a bell curve centered on the median position, with a standard deviation you can tune to control how much smoothing you want. Tighter kernels stay closer to the pure median behavior and give you more discrete steps. Wider kernels spread the weighting further from the center and produce smoother output at the cost of slightly reduced outlier resistance. The default settings strike a balance that keeps the estimator robust while removing most of the visual jitter.
Running Theil-Sen on multiple bars means calculating slopes between all pairs of points across your lookback window, sorting those slopes, and then applying the Gaussian kernel to find the weighted center of that sorted distribution. This is computationally more expensive than simple moving averages or even standard linear regression, but Pine Script handles it well enough for reasonable lookback lengths. The benefit is that you get a trend estimate that doesn't get thrown off by individual spikes or anomalies in your price data, which is valuable when working with noisy instruments or during volatile periods where traditional regression lines can swing wildly.
The implementation maintains sorted arrays for both the slope calculations and the final kernel weighting, which keeps everything organized and makes the Gaussian convolution straightforward. The kernel weights are precalculated based on the distance from the center position, then applied as multipliers to the sorted slope values before summing to get the final smoothed median slope. That slope gets combined with an intercept calculation to produce the regression line values you see plotted on the chart.
What this really demonstrates is that you can take classical statistical methods like Theil-Sen and adapt them with signal processing techniques like kernel convolution to get behavior that's more suited to real-time visualization. The pure mathematical definition of a median is discrete by nature, but financial charts benefit from smooth, continuous lines that make it easier to track changes over time. By introducing the Gaussian kernel weighting, you preserve the core robustness of the median-based approach while gaining the visual smoothness of methods that use weighted averages. Whether that smoothness is worth the minor variance tradeoff depends on your use case, but for most charting applications, the improved readability makes it a good compromise.
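Below is a hedged sketch of the core idea; parameter names such as sigmaIdx are illustrative rather than the script's inputs, and the line is anchored crudely at the window's median price instead of the script's intercept calculation.
//@version=5
indicator("Smooth Theil-Sen - sketch", overlay = true)
len      = input.int(20, "Lookback", minval = 3)
sigmaIdx = input.float(2.0, "Kernel width (sorted-index units)", minval = 0.1)
slopes = array.new_float()
for i = 0 to len - 2
    for j = i + 1 to len - 1
        array.push(slopes, (close[i] - close[j]) / (j - i))  // pairwise slope per bar; i is the more recent bar
array.sort(slopes, order.ascending)
n      = array.size(slopes)
center = (n - 1) / 2.0                 // median position of the sorted slopes
num = 0.0
den = 0.0
for k = 0 to n - 1
    wk = math.exp(-math.pow(k - center, 2) / (2.0 * sigmaIdx * sigmaIdx))  // Gaussian kernel weight
    num += wk * array.get(slopes, k)
    den += wk
smoothSlope = den > 0 ? num / den : na
estimate = ta.median(close, len) + smoothSlope * (len - 1) / 2.0  // crude anchor at the window's median price
plot(estimate, "Smoothed Theil-Sen estimate", color.yellow)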
Smart Money Dynamics Blocks — Pearson Matrix
A structural fusion of Prime Number Theory, Pearson Correlation, and Cumulative Delta Geometry.
1. Mathematical Foundation
This indicator is built on the intersection of Prime Number Theory and the Pearson correlation coefficient, creating a structural framework that quantifies how price and time evolve together.
Prime numbers — unique, indivisible, and irregular — are used here as nonlinear time intervals. Each prime length (2, 3, 5, 7, 11…97) represents a regression horizon where correlation is measured between price and time. The result is a multi-scale correlation lattice — a geometric matrix that captures hidden directional strength and temporal bias beyond traditional moving averages.
2. The Pearson Matrix Logic
For every prime interval p, the indicator calculates the linear correlation:
r_p = corr(price, bar_index, p)
Each r_p reflects how closely price and time move together across a prime-defined window. All r_p values are then averaged to create avgR, a single adaptive coefficient summarizing overall structural coherence.
- When avgR > 0.8 → strong positive correlation (labeled R+).
- When avgR < -0.8 → strong negative correlation (labeled R−).
This approach gives a mathematically grounded definition of trend — one that isn’t based on pattern recognition, but on measurable correlation strength.
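A hedged sketch of the avgR idea using Pine's built-in correlation, shown with a small subset of the 25 prime windows for brevity (the full script iterates every prime up to 97):
//@version=5
indicator("Prime Pearson average - sketch", overlay = true)
// Subset of the prime windows, for illustration only.
r1 = ta.correlation(close, bar_index, 5)
r2 = ta.correlation(close, bar_index, 11)
r3 = ta.correlation(close, bar_index, 23)
r4 = ta.correlation(close, bar_index, 41)
r5 = ta.correlation(close, bar_index, 67)
r6 = ta.correlation(close, bar_index, 97)
avgR = (r1 + r2 + r3 + r4 + r5 + r6) / 6
plotchar(avgR > 0.8, "R+", "▲", location.belowbar, color.orange)
plotchar(avgR < -0.8, "R−", "▼", location.abovebar, color.blue)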
3. Sequential Prime Slope and Median Pivot
Using the ordered sequence of 25 prime intervals, the model computes sequential slopes between adjacent primes. These slopes represent the rate of change of structure between two prime scales. A robust median aggregator smooths the slopes, producing a clean, stable directional vector.
The system anchors this slope to the 41-bar pivot — the median of the first 25 primes — serving as the geometric midpoint of the prime lattice. The resulting yellow line on the chart is not an ordinary regression line; it’s a dynamic prime-slope function, adapting continuously with correlation feedback.
4. Regression-Style Parallel Bands
Around this prime-slope line, the indicator constructs parallel bands using standard deviation envelopes — conceptually similar to a regression channel but recalculated through the prime–Pearson matrix.
These bands adjust dynamically to:
- Volatility, via standard deviation of residuals.
- Correlation strength, via avgR sign weighting.
Together, they visualize statistical deviation geometry, making it easier to observe symmetry, expansion, and contraction phases of price structure.
5. Volume and Cumulative Delta Peaks
Below the geometric layer, the indicator incorporates a custom lower-timeframe volume feed — by default using 15-second data (custom_tf_input_volume = “15S”). This allows precise delta computation between up-volume and down-volume even on higher timeframe charts.
From this feed, the indicator accumulates delta over a configurable period (default: 100 bars). When cumulative delta reaches a local maximum or minimum, peak and trough markers appear, showing the precise bar where buying or selling pressure statistically peaked.
This combination of geometry and order flow reveals the intersection of market structure and energy — where liquidity pressure expresses itself through mathematical form.
6. Chart Interpretation
The primary chart view represents the live execution of the indicator. It displays the relationship between structural correlation and volume behavior in real time.
Orange “R+” and blue “R−” labels indicate regions of strong positive or negative Pearson correlation across the prime matrix. The yellow median prime-slope line serves as the structural backbone of the indicator, while green and red parallel bands act as dynamic regression boundaries derived from the underlying correlation strength. Peaks and troughs in cumulative delta — displayed as numerical annotations — mark statistically significant shifts in buying and selling pressure.
The secondary visualization (Prime Regression Concept) expands on this by illustrating how regression behavior evolves across prime intervals. Each colored regression fan corresponds to a prime number window (2, 3, 5, 7, …, 97), demonstrating how multiple regression lines would appear if drawn independently. The indicator integrates these into one unified geometric model — eliminating the need to plot tens of regression lines manually. It’s a conceptual tool to help visualize the internal logic: the synthesis of many small-scale regressions into a single coherent structure.
7. Interpretive Insight
This model is not a prediction tool; it’s an instrument of mathematical observation. By translating price dynamics into a prime-structured correlation space, it reveals how coherence unfolds through time — not as a forecast, but as a measurable evolution of structure.
It unifies three analytical domains:
- Prime distribution — defines a nonlinear temporal architecture.
- Pearson correlation — quantifies statistical cohesion.
- Cumulative delta — expresses behavioral imbalance in order flow.
The synthesis creates a geometric analysis of liquidity and time — where structure meets energy, and where the invisible rhythm of market flow becomes measurable.
8. Contribution & Feedback
Share your observations in the comments:
- The time gap and alternation between R+ and R− clusters.
- How different timeframes change delta sensitivity or reveal compression/expansion.
- Prime intervals/clusters that tend to sit near turning points or liquidity shifts.
- How avgR behaves across assets or regimes (trending, ranging, high-vol).
- Notable interactions with the parallel bands (touches, breaks, mean-revert).
Your field notes help others read the model more effectively and compare contexts.
Summary
- Primes define the structure.
- Pearson quantifies coherence.
- Slope median stabilizes geometry.
- Regression bands visualize deviation.
- Cumulative delta locates imbalance.
Together, they construct a framework where mathematics meets market behavior.
Z-Score Regression Bands [BOSWaves]
Z-Score Regression Bands – Adaptive Trend and Volatility Insight
Overview
The Z-Score Regression Bands is a trend and volatility analysis framework designed to give traders a clear, structured view of price behavior. It combines Least Squares Moving Average (LSMA) regression, a statistical method to detect underlying trends, with Z-Score standardization, which measures how far price deviates from its recent average.
Traditional moving average bands, like Bollinger Bands, often lag behind trends or generate false signals in noisy markets. Z-Score Regression Bands addresses these limitations by:
Tracking trends accurately using LSMA regression
Normalizing deviations with Z-Scores to identify statistically significant price extremes
Visualizing multiple bands for normal, strong, and extreme moves
Highlighting trend shifts using diamond markers based on Z-Score crossings
This multi-layered approach allows traders to understand trend strength, detect overextensions, and identify periods of low or high volatility — all from a single, clear chart overlay. It is designed for traders of all levels and can be applied across scalping, day trading, swing trading, and longer-term strategies.
Theoretical Foundation
The Z-Score Regression Bands are grounded in statistical and trend analysis principles. Here’s the idea in plain terms:
Least Squares Moving Average (LSMA) – Unlike standard moving averages, LSMA fits a straight line to recent price data using regression. This “best-fit” line shows the underlying trend more precisely and reduces lag, helping traders see trend changes earlier.
Z-Score Standardization – A Z-Score expresses how far the LSMA is from its recent mean in standard deviation units. This shows whether price is unusually high or low, which can indicate potential reversals, pullbacks, or acceleration of a trend.
Multi-Band Structure – The three bands represent:
Band #1: Normal range of price fluctuations.
Band #2: Significant deviation from the trend.
Band #3: Extreme price levels that are statistically rare.
The distance between bands dynamically adapts to market volatility, allowing traders to visualize expansions (higher volatility) and contractions (lower volatility).
Trend Signals – When Z-Score crosses zero, diamonds appear on the chart. These markers signal potential trend initiation, continuation, or reversal, offering a simple alert for shifts in market momentum.
How It Works
The indicator calculates and plots several layers of information:
LSMA Regression (Trend Detection)
Computes a line that best fits recent price points.
The LSMA line smooths out minor fluctuations while reflecting the general direction of the market.
Z-Score Calculation (Deviation Measurement)
Standardizes the LSMA relative to its recent average.
Positive Z-Score → LSMA above average, negative → LSMA below average.
Helps identify overbought or oversold conditions relative to the trend.
Multi-Band Construction (Volatility Envelope)
Upper and lower bands are placed at configurable multiples of standard deviation.
Band #1 captures typical price movement, Band #2 signals stronger deviation, Band #3 highlights extreme moves.
Bands expand and contract with volatility, giving an intuitive visual guide to market conditions.
Trend Signals (Diamonds)
Appear when Z-Score crosses zero.
Indicates moments when momentum may shift, helping traders time entries or exits.
Visual Interpretation
Band width = volatility: wide bands indicate strong movement; narrow bands indicate calm periods.
LSMA shows underlying trend direction, while bands show how far price has strayed from that trend.
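A hedged sketch of this construction; lengths, multipliers, and colors are illustrative, not the indicator's defaults.
//@version=5
indicator("Z-Score regression bands - sketch", overlay = true)
len  = input.int(50, "LSMA length", minval = 2)
zLen = input.int(50, "Z-Score length", minval = 2)
lsma = ta.linreg(close, len, 0)            // least-squares (regression) moving average
mu   = ta.sma(lsma, zLen)
sd   = ta.stdev(lsma, zLen)
z    = sd > 0 ? (lsma - mu) / sd : 0.0     // how far the LSMA sits from its own recent mean
plot(lsma, "LSMA", color.orange)
plot(lsma + 1 * sd, "Band #1 upper", color.new(color.green, 40))
plot(lsma - 1 * sd, "Band #1 lower", color.new(color.green, 40))
plot(lsma + 2 * sd, "Band #2 upper", color.new(color.yellow, 40))
plot(lsma - 2 * sd, "Band #2 lower", color.new(color.yellow, 40))
plot(lsma + 3 * sd, "Band #3 upper", color.new(color.red, 40))
plot(lsma - 3 * sd, "Band #3 lower", color.new(color.red, 40))
plotshape(ta.cross(z, 0), "Trend shift", shape.diamond, location.belowbar, color.teal)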
Interpretation
The Z-Score Regression Bands provide a multi-dimensional view of market behavior:
Trend Analysis – LSMA line slope shows general market direction.
Momentum & Volatility – Z-Score indicates whether the trend is accelerating or losing strength; band width indicates volatility levels.
Price Extremes – Price touching Band #2 or #3 may suggest overextension and potential reversals.
Trend Shifts – Diamonds signal statistically significant changes in momentum.
Cycle Awareness – Standard deviation bands help distinguish normal market fluctuations from extreme events.
By combining these insights, traders can avoid false signals and react to meaningful structural shifts in the market.
Strategy Integration
Trend Following
Enter trades when diamonds indicate momentum aligns with LSMA direction.
Use Band #1 and #2 for stop placement and partial exits.
Breakout Trading
Watch for narrow bands (low volatility) followed by price pushing outside Band #1 or #2.
Confirm with Z-Score movement in the breakout direction.
Mean Reversion/Pullback
If price reaches Band #2 or #3 without continuation, expect a pullback toward LSMA.
Exhaustion & Reversals
Flattening Z-Score near zero while price remains at extreme bands signals trend weakening.
Tighten stops or scale out before a potential reversal.
Multi-Timeframe Confirmation
High timeframe LSMA confirms the main trend.
Lower timeframe bands provide refined entry and exit points.
Technical Implementation
LSMA Regression : Best-fit line minimizes lag and captures trend slope.
Z-Score Standardization : Normalizes deviation to allow consistent interpretation across markets.
Multi-Band Envelope : Three layers for normal, strong, and extreme deviations.
Trend Signals : Automatic diamonds for Z-Score zero-crossings.
Band Fill Options : Optional shading to visualize volatility expansions and contractions.
Optimal Application
Asset Classes:
Forex : Capture breakouts, overextensions, and trend shifts.
Crypto : High-volatility adaptation with adjustable band multipliers.
Stocks/ETFs : Identify trending sectors, reversals, and pullbacks.
Indices/Futures : Track cycles and structural trends.
Timeframes:
Scalping (1–5 min) : Focus on Band #1 and trend signals for fast entries.
Intraday (15m–1h) : Use Bands #1–2 for continuation and breakout trades.
Swing (4h–Daily) : Bands #2–3 capture trend momentum and exhaustion.
Position (Daily–Weekly) : LSMA trend dominates; Bands #3 highlight regime extremes.
Performance Characteristics
Strong Performance:
Trending markets with moderate-to-high volatility
Assets with steady liquidity and identifiable cycles
Weak Performance:
Flat or highly choppy markets
Very short timeframes (<1 min) dominated by noise
Integration Tips
Combine with support/resistance, volume, or order flow analysis for confirmation.
Use bands for stops, targets, or scaling positions.
Apply multi-timeframe analysis: higher timeframe LSMA confirms main trend, lower timeframe bands refine entries.
Disclaimer
The Z-Score Regression Bands is a trading analysis tool, not a guaranteed profit system. Its effectiveness depends on market conditions, parameter selection, and disciplined risk management. Use it as part of a broader trading strategy, not in isolation.
Polynomial Regression Heatmap
Polynomial Regression Heatmap – Advanced Trend & Volatility Visualizer
Overview
The Polynomial Regression Heatmap is a sophisticated trading tool designed for traders who require a clear and precise understanding of market trends and volatility. By applying a second-degree polynomial regression to price data, the indicator generates a smooth trend curve, augmented with adaptive volatility bands and a dynamic heatmap. This framework allows users to instantly recognize trend direction, potential reversals, and areas of market strength or weakness, translating complex price action into a visually intuitive map.
Unlike static trend indicators, the Polynomial Regression Heatmap adapts to changing market conditions. Its visual design—including color-coded candles, regression bands, optional polynomial channels, and breakout markers—ensures that price behavior is easy to interpret. This makes it suitable for scalping, swing trading, and longer-term strategies across multiple asset classes.
How It Works
The core of the indicator relies on fitting a second-degree polynomial to a defined lookback period of price data. This regression curve captures the non-linear nature of market movements, revealing the true trajectory of price beyond the distortions of noise or short-term volatility.
Adaptive upper and lower bands are constructed using ATR-based scaling, surrounding the regression line to reflect periods of high and low volatility. When price moves toward or beyond these bands, it signals areas of potential overextension or support/resistance.
The heatmap colors each candle based on its relative position within the bands. Green shades indicate proximity to the upper band, red shades indicate proximity to the lower band, and neutral tones represent mid-range positioning. This continuous gradient visualization provides immediate feedback on trend strength, market balance, and potential turning points.
Optional polynomial channels can be overlaid around the regression curve. These three-line channels are based on regression residuals and a fixed width multiplier, offering additional reference points for analyzing price deviations, trend continuation, and reversion zones.
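A hedged sketch of the heatmap coloring logic, using ta.linreg as a simplified stand-in for the second-degree polynomial curve and an ATR envelope for the bands:
//@version=5
indicator("Band-position heatmap - sketch", overlay = true)
len  = input.int(100, "Lookback", minval = 2)
mult = input.float(2.0, "ATR multiplier")
curve = ta.linreg(close, len, 0)           // stand-in for the fitted polynomial curve
band  = mult * ta.atr(14)
upper = curve + band
lower = curve - band
pos = (close - lower) / (upper - lower)    // 0 at the lower band, 1 at the upper band
barcolor(color.from_gradient(pos, 0, 1, color.red, color.green))
plot(upper, "Upper band", color.new(color.green, 60))
plot(lower, "Lower band", color.new(color.red, 60))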
Signals and Breakouts
The Polynomial Regression Heatmap includes statistical pivot-based signals to highlight actionable price movements:
Buy Signals – A triangular marker appears below the candle when a pivot low occurs below the lower regression band.
Sell Signals – A triangular marker appears above the candle when a pivot high occurs above the upper regression band.
These markers identify significant deviations from the regression curve while accounting for volatility, providing high-quality visual cues for potential entry points.
The indicator ensures clarity by spacing markers vertically using ATR-based calculations, preventing overlap during periods of high volatility. Users can rely on these signals in combination with heatmap intensity and regression slope for contextual confirmation.
Interpretation
Trend Analysis :
The slope of the polynomial regression line represents trend direction. A rising curve indicates bullish bias, a falling curve indicates bearish bias, and a flat curve indicates consolidation.
Steeper slopes suggest stronger momentum, while gradual slopes indicate more moderate trend conditions.
Volatility Assessment :
Band width provides an instant visual measure of market volatility. Narrow bands correspond to low volatility and potential consolidation, whereas wide bands indicate higher volatility and significant price swings.
Heatmap Coloring :
Candle colors visually represent price position within the bands. This allows traders to quickly identify zones of bullish or bearish pressure without performing complex calculations.
Channel Analysis (Optional) :
The polynomial channel defines zones for evaluating potential overextensions or retracements. Price interacting with these lines may suggest areas where mean-reversion or trend continuation is likely.
Breakout Signals :
Buy and Sell markers highlight pivot points relative to the regression and volatility bands. These are statistical signals, not arbitrary triggers, and should be interpreted in context with trend slope, band width, and heatmap intensity.
Strategy Integration
The Polynomial Regression Heatmap supports multiple trading approaches:
Trend Following – Enter trades in the direction of the regression slope while using the heatmap for momentum confirmation.
Pullback Entries – Use breakouts or deviations from the regression bands as low-risk entry points during trend continuation.
Mean Reversion – Price reaching outer channel boundaries can indicate potential reversal or retracement opportunities.
Multi-Timeframe Alignment – Overlay on higher and lower timeframes to filter noise and improve entry timing.
Stop-loss levels can be set just beyond the opposing regression band, while take-profit targets can be informed by the distance between the bands or the curvature of the polynomial line.
Advanced Techniques
For traders seeking greater precision:
Combine the Polynomial Regression Heatmap with volume, momentum, or volatility indicators to validate signals.
Observe the width and slope of the regression bands over time to anticipate expanding or contracting volatility.
Track sequences of breakout signals in conjunction with heatmap intensity for systematic trade management.
Adjusting regression length allows customization for different assets or timeframes, balancing responsiveness and smoothing. The combination of polynomial curve, adaptive bands, heatmap, and optional channels provides a comprehensive statistical framework for informed decision-making.
Inputs and Customization
Regression Length – Determines the number of bars used for polynomial fitting. Shorter lengths increase responsiveness; longer lengths improve smoothing.
Show Bands – Toggle visibility of the ATR-based regression bands.
Show Channel – Enable or disable the polynomial channel overlay.
Color Settings – Customize bullish, bearish, neutral, and accent colors for clarity and visual preference.
All other internal parameters are fixed to ensure consistent statistical behavior and minimize potential misconfiguration.
Why Use Polynomial Regression Heatmap
The Polynomial Regression Heatmap transforms complex price action into a clear, actionable visual framework. By combining non-linear trend mapping, adaptive volatility bands, heatmap visualization, and breakout signals, it provides a multi-dimensional perspective that is both quantitative and intuitive.
This indicator allows traders to focus on execution, interpret market structure at a glance, and evaluate trend strength, overextensions, and potential reversals in real time. Its design is compatible with scalping, swing trading, and long-term strategies, providing a robust tool for disciplined, data-driven trading.
BTC Power Law Valuation Bands
BTC Power Law Rainbow
A long-term valuation framework for Bitcoin based on Power Law growth — designed to help identify macro accumulation and distribution zones, aligned with long-term investor behavior.
🔍 What Is a Power Law?
A Power Law is a mathematical relationship where one quantity varies as a power of another. In this model:
Price ≈ a × (Time)^b
It captures the non-linear, exponentially slowing growth of Bitcoin over time. Rather than using linear or cyclical models, this approach aligns with how complex systems, such as networks or monetary adoption curves, often grow — rapidly at first, and then more slowly, but persistently.
🧠 Why Power Law for BTC?
Bitcoin:
Has finite supply and increasing adoption.
Operates as a monetary network , where Metcalfe’s Law and power laws naturally emerge.
Exhibits exponential growth over logarithmic time when viewed on a log-log chart .
This makes it uniquely well-suited for power law modeling.
🌈 How to Use the Valuation Bands
The central white line represents the modeled fair value according to the power law.
Colored bands represent deviations from the model in logarithmic space, acting as macro zones:
🔵 Lower Bands: Deep value / Accumulation zones.
🟡 Mid Bands: Fair value.
🔴 Upper Bands: Euphoria / Risk of macro tops.
📐 Smart Money Concepts (SMC) Alignment
Accumulation: Occurs when price consolidates near lower bands — often aligning with institutional positioning.
Markup: As price re-enters or ascends the bands, we often see breakout behavior and trend expansion.
Distribution: When price extends above upper bands, potential for exit liquidity creation and distribution events.
Reversion: Historically, price mean-reverts toward the model — rarely staying outside the bands for long.
This makes the model useful for:
Cycle timing
Long-term DCA strategy zones
Identifying value dislocations
Filtering short-term noise
⚠️ Disclaimer
This tool is for educational and informational purposes only . It is not financial advice. The power law model is a non-predictive, mathematical framework and does not guarantee future price movements .
Always use additional tools, risk management, and your own judgment before making trading or investment decisions.
Forecasting Quadratic Regression [UPDATED V6]
Forecasting Quadratic Regression applies a second-degree polynomial regression model to price data, offering a non-linear alternative to traditional linear regression. By fitting a quadratic curve of the form:
y = a + bx + cx²
the indicator captures both directional trend and curvature, allowing traders to detect momentum shifts earlier than with straight-line models.
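For readers who want to see the mechanics, here is a minimal Python sketch of the idea rather than the indicator's actual Pine Script: it fits y = a + bx + cx² over a lookback window and extends the curve forward. The lookback, forecast horizon, and synthetic prices are illustrative assumptions.

```python
import numpy as np

def quadratic_forecast(prices, lookback=50, horizon=10):
    """Fit y = a + b*x + c*x^2 over the last `lookback` closes and extend it `horizon` bars forward."""
    y = np.asarray(prices[-lookback:], dtype=float)
    x = np.arange(lookback, dtype=float)
    c2, c1, c0 = np.polyfit(x, y, deg=2)          # numpy returns the highest-degree coefficient first
    fitted   = c0 + c1 * x + c2 * x**2
    x_future = np.arange(lookback, lookback + horizon, dtype=float)
    forecast = c0 + c1 * x_future + c2 * x_future**2
    curvature = 2.0 * c2                          # 2nd derivative: > 0 accelerating up, < 0 decelerating
    return fitted, forecast, curvature

# toy usage with synthetic prices
prices = 100 + np.cumsum(np.random.normal(0, 1, 300))
fitted, forecast, curv = quadratic_forecast(prices)
print(f"curvature = {curv:.4f} ({'accelerating' if curv > 0 else 'decelerating'})")
```

The sign of the curvature term (2c) is what distinguishes acceleration from deceleration, which is the quantity the inflection-based reversal logic keys on.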
🔹 Core Features
Fits a quadratic regression curve to user-defined lookback periods
Extends the fitted curve forward to generate forecast projections
Calculates slope curvature to highlight trend acceleration or deceleration
Adapts dynamically as new bars are added
🔹 Trading Applications
Identify potential reversal zones when the curve inflects (2nd derivative sign change)
Forecast near-term mean reversion targets or extended trend continuations
Filter trades by measuring momentum curvature rather than linear slope
Visualize higher-order structure in price beyond standard regression lines
⚠️ Note: This model is statistical and assumes past curvature informs short-term future price paths. It should be combined with confirmation signals (volume, oscillators, support/resistance) to reduce false inflection points.
Bitcoin Expectile Model [LuxAlgo]
The Bitcoin Expectile Model is a novel approach to forecasting Bitcoin, inspired by the popular Bitcoin Quantile Model by PlanC. By fitting multiple Expectile regressions to the price, we highlight zones of corrections or accumulations throughout the Bitcoin price evolution.
While we strongly recommend using this model with the Bitcoin All Time History Index INDEX:BTCUSD on the 3-day or weekly timeframe with a logarithmic scale, it can be applied to any asset on the daily timeframe or higher.
Please note that on TradingView this model was designed solely for the Bitcoin 1W chart; however, it can be experimented with on other assets or timeframes if of interest.
🔶 USAGE
The Bitcoin Expectile Model can be applied similarly to models used for Bitcoin, highlighting lower areas of possible accumulation (support) and higher areas that allow for the anticipation of potential corrections (resistance).
By default, this model fits 7 individual Expectile Log-Log Regressions to the price, each with its respective expectile (tau) value (here multiplied by 100 for the user's convenience). Higher tau values return a fit closer to the higher highs made by the price of the asset, while lower ones return fits closer to the lower prices observed over time.
Each zone is color-coded and has a specific interpretation. The green zone is a buy zone for long-term investing, purple is an anomaly zone for market bottoms that over-extend, while red is considered the distribution zone.
The fits can be extrapolated, helping to chart a course for the possible evolution of Bitcoin prices. Users can select the end of the forecast as a date using the "Forecast End" setting.
While the model is made for Bitcoin using a log scale, other assets showing a tendency to have a trend evolving in a single direction can be used. See the chart above on QQQ weekly using a linear scale as an example.
The Start Date can also allow fitting the model more locally, rather than over a large range of prices. This can be useful to identify potential shorter-term support/resistance areas.
🔶 DETAILS
🔹 On Quantile and Expectile Regressions
Quantile and Expectile regressions are similar; both return extremities that can be used to locate and predict prices where tops/bottoms could be more likely to occur.
The main difference lies in what we are trying to minimize, which, for Quantile regression, is commonly known as Quantile loss (or pinball loss), and for Expectile regression, simply Expectile loss.
You may refer to external material to go more in-depth about these loss functions; however, while they are similar and involve weighting specific prices more than others relative to our parameter tau, Quantile regression involves minimizing a weighted mean absolute error, while Expectile regression minimizes a weighted squared error.
The squared error here allows us to compute Expectile regression more easily compared to Quantile regression, using Iteratively reweighted least squares. For Quantile regression, a more elaborate method is needed.
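As an illustration of that point, the sketch below (plain Python, not the script's internals) fits a single expectile in log-log space with iteratively reweighted least squares: positive residuals are weighted by tau and negative ones by 1 - tau, and the weighted least-squares solve is repeated until the fit settles. The tau value, iteration count, and synthetic data are assumptions for demonstration only.

```python
import numpy as np

def expectile_fit(x, y, tau=0.9, iters=20):
    """Expectile regression y ~ b0 + b1*x via iteratively reweighted least squares.
    Weights are tau for non-negative residuals and (1 - tau) for negative ones."""
    X = np.column_stack([np.ones_like(x), x])
    w = np.ones_like(y)                        # first pass is plain OLS
    beta = np.zeros(2)
    for _ in range(iters):
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
        resid = y - X @ beta
        w = np.where(resid >= 0, tau, 1.0 - tau)
    return beta                                # [intercept, slope] in log-log space

# toy usage: fit the "90" expectile of log(price) vs log(time)
t = np.arange(1, 500, dtype=float)
price = 0.05 * t**1.8 * np.exp(np.random.normal(0, 0.2, t.size))
b0, b1 = expectile_fit(np.log10(t), np.log10(price), tau=0.9)
upper_fit = 10 ** (b0 + b1 * np.log10(t))      # a curve hugging the higher prices
```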
In terms of comparison, Quantile regression is more robust and easier to interpret, with quantiles being related to specific probabilities involving the underlying cumulative distribution function of the dataset; expectiles, on the other hand, are harder to interpret.
🔹 Trimming & Alterations
It is common to observe certain models ignoring very early Bitcoin price ranges. By default, we start our fit at the date 2010-07-16 to align with existing models.
By default, the model uses the number of time units (days, weeks, etc.) elapsed since the beginning of history + 1 (to avoid NaN with log) as the independent variable. However, the Bitcoin All Time History Index INDEX:BTCUSD does not include the genesis block; users can correct for this by enabling the "Correct for Genesis block" setting, which adds the number of missed bars from the Genesis block to the start of the chart history.
🔶 SETTINGS
Start Date: Starting interval of the dataset used for the fit.
Correct for genesis block: When enabled, offset the X axis by the number of bars between the Bitcoin genesis block time and the chart starting time.
🔹 Expectiles
Toggle: Enable fit for the specified expectile. Disabling one fit will make the script faster to compute.
Expectile: Expectile (tau) value multiplied by 100 used for the fit. Higher values will produce fits that are located near price tops.
🔹 Forecast
Forecast End: Time at which the forecast stops.
🔹 Model Fit
Iterations Number: Number of iterations performed during the reweighted least squares process, with lower values leading to less accurate fits, while higher values will take more time to compute.
Squeeze Momentum Regression Clouds [SciQua]
╭──────────────────────────────────────────────╮
☁️ Squeeze Momentum Regression Clouds
╰──────────────────────────────────────────────╯
🔍 Overview
The Squeeze Momentum Regression Clouds (SMRC) indicator is a powerful visual tool for identifying price compression, trend strength, and slope momentum using multiple layers of linear regression Clouds. Designed to extend the classic squeeze framework, this indicator captures the behavior of price through dynamic slope detection, percentile-based spread analytics, and an optional UI for trend inspection — across up to four customizable regression Clouds.
────────────────────────────────────────────────────────────
╭────────────────╮
⚙️ Core Features
╰────────────────╯
Up to 4 Regression Clouds – Each Cloud is created from a top and bottom linear regression line over a configurable lookback window.
Slope Detection Engine – Identifies whether each band is rising, falling, or flat based on slope-to-ATR thresholds.
Spread Compression Heatmap – Highlights compressed zones using yellow intensity, derived from historical spread analysis.
Composite Trend Scoring – Aggregates directional signals from each Cloud using your chosen weighting model.
Color-Coded Candles – Optional candle coloring reflects the real-time composite score.
UI Table – A toggleable info table shows slopes, compression levels, percentile ranks, and direction scores for each Cloud.
Gradient Cloud Styling – Apply gradient coloring from Cloud 1 to Cloud 4 for visual slope intensity.
Weight Aggregation Options – Use equal weighting, inverse-length weighting, or max pooling across Clouds to determine composite trend strength.
────────────────────────────────────────────────────────────
╭──────────────────────────────────────────╮
🧪 How to Use the Indicator
1. Understand Trend Bias with Cloud Colors
╰──────────────────────────────────────────╯
Each Cloud changes color based on its current slope:
Green indicates a rising trend.
Red indicates a falling trend.
Gray indicates a flat slope — often seen during chop or transitions.
Cloud 1 typically reflects short-term structure, while Cloud 4 represents long-term directional bias. Watch for multi-Cloud alignment — when all Clouds are green or red, the trend is strong. Divergence among Clouds often signals a potential shift.
────────────────────────────────────────────────────────────
╭───────────────────────────────────────────────╮
2. Use Compression Heat to Anticipate Breakouts
╰───────────────────────────────────────────────╯
The space between each Cloud’s top and bottom regression lines is measured, normalized, and analyzed over time. When this spread tightens relative to its history, the script highlights the band with a yellow compression glow .
This visual cue helps identify squeeze zones before volatility expands. If you see compression paired with a changing slope color (e.g., gray to green), this may indicate an impending breakout.
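One plausible way to reproduce this behaviour outside the chart is sketched below in Python: measure the band spread, rank it against its own recent history, and flag bars whose rank falls below a floor. The window length and floor value are purely illustrative and are not taken from the script.

```python
import numpy as np

def compression_flags(top, bottom, window=100, floor=0.15):
    """Flag bars whose band spread sits in the bottom `floor` fraction of its recent history."""
    spread = np.asarray(top, dtype=float) - np.asarray(bottom, dtype=float)
    flags = np.zeros(spread.size, dtype=bool)
    for i in range(window, spread.size):
        hist = spread[i - window:i]
        rank = np.mean(hist <= spread[i])      # percentile rank of the current spread
        flags[i] = rank <= floor
    return flags

# toy usage with synthetic band lines
top = 105 + np.cumsum(np.random.normal(0, 0.2, 300))
bottom = top - np.abs(np.random.normal(3, 1, 300))
squeezed = compression_flags(top, bottom)
```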
────────────────────────────────────────────────────────────
╭─────────────────────────────────╮
3. Leverage the Optional Table UI
╰─────────────────────────────────╯
The indicator includes a dynamic, floating table that displays real-time metrics per Cloud. These include:
Slope direction and value, with historical Min/Max reference.
Top and Bottom percentile ranks, showing how price sits within the Cloud range.
Current spread width, compared to its historical norms.
Composite score, which blends trend, slope, and compression for that Cloud.
You can customize the table’s position, theme, transparency, and whether to show a combined summary score in the header.
────────────────────────────────────────────────────────────
╭─────────────────────────────────────────────╮
4. Analyze Candle Color for Composite Signals
╰─────────────────────────────────────────────╯
When enabled, the indicator colors candles based on a weighted composite score. This score factors in:
The signed slope of each Cloud (up, down, or flat)
The percentile pressure from the top and bottom bands
The degree of spread compression
Expect green candles in bullish trend phases, red candles during bearish regimes, and gray candles in mixed or low-conviction zones.
Candle coloring provides a visual shorthand for market conditions , useful for intraday scanning or historical backtesting.
────────────────────────────────────────────────────────────
╭────────────────────────╮
🧰 Configuration Guidance
╰────────────────────────╯
To tailor the indicator to your strategy:
Use Cloud lengths like 21, 34, 55, and 89 for a balanced multi-timeframe view.
Adjust the slope threshold (default 0.05) to control how sensitive the trend coloring is.
Set the spread floor (e.g., 0.15) to tune when compression is detected and visualized.
Choose your weighting style : Inverse Length (favor faster bands), Equal, or Max Pooling (most aggressive).
Set composite weights to emphasize trend slope, percentile bias, or compression—depending on your market edge.
────────────────────────────────────────────────────────────
╭────────────────╮
✅ Best Practices
╰────────────────╯
Use aligned Cloud colors across all bands to confirm trend conviction.
Combine slope direction with compression glow for early breakout entry setups.
In choppy markets, watch for Clouds 1 and 2 turning flat while Clouds 3 and 4 remain directional — a sign of potential trend exhaustion or consolidation.
Keep the table enabled during backtesting to manually evaluate how each Cloud behaved during price turns and consolidations.
────────────────────────────────────────────────────────────
╭───────────────────────╮
📌 License & Usage Terms
╰───────────────────────╯
This script is provided under the Creative Commons Attribution-NonCommercial 4.0 International License .
✅ You are allowed to:
Use this script for personal or educational purposes
Study, learn, and adapt it for your own non-commercial strategies
❌ You are not allowed to:
Resell or redistribute the script without permission
Use it inside any paid product or service
Republish without giving clear attribution to the original author
For commercial licensing , private customization, or collaborations, please contact Joshua Danford directly.
Adaptive Market Profile – Auto Detect & Dynamic Activity Zones
Adaptive Market Profile is an advanced indicator that automatically detects and displays the most relevant trend channel and market profile for any asset and timeframe. Unlike standard regression channel tools, this script uses a fully adaptive approach to identify the optimal period, providing you with the channel that best fits the current market dynamics. The calculation is based on maximizing the statistical significance of the trend using Pearson’s R coefficient, ensuring that the most relevant trend is always selected.
Within the selected channel, the indicator generates a dynamic market profile, breaking the price range into configurable zones and displaying the most active areas based on volume or the number of touches. This allows you to instantly identify high-activity price levels and potential support/resistance zones. The “most active lines” are plotted in real-time and always stay parallel to the channel, dynamically adapting to market structure.
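To make the adaptive-period idea concrete, here is a rough Python sketch (not the published Pine Script): it scans a range of candidate lookbacks, regresses price on bar index for each, and keeps the length with the strongest absolute Pearson R. The scan range and synthetic data are illustrative assumptions.

```python
import numpy as np

def best_regression_length(closes, lengths=range(50, 400, 10), use_log=True):
    """Scan candidate lookbacks and return the one whose linear fit has the highest |Pearson R|."""
    y_full = np.log(np.asarray(closes, dtype=float)) if use_log else np.asarray(closes, dtype=float)
    best_len, best_r = None, 0.0
    for n in lengths:
        if n > y_full.size:
            break
        y = y_full[-n:]
        x = np.arange(n, dtype=float)
        r = np.corrcoef(x, y)[0, 1]            # Pearson correlation of price vs time
        if abs(r) > abs(best_r):
            best_len, best_r = n, r
    return best_len, best_r

closes = 100 * np.exp(np.cumsum(np.random.normal(0.001, 0.02, 1000)))
length, r = best_regression_length(closes)
print(f"strongest trend over the last {length} bars, Pearson R = {r:.3f}")
```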
Key features:
- Automatic detection of the optimal regression period: The script scans a wide range of lengths and selects the channel that statistically represents the strongest trend.
- Dynamic market profile: Visualizes the distribution of volume or price touches inside the trend channel, with customizable section count.
- Most active zones: Highlights the most traded or touched price levels as dynamic, parallel lines for precise support/resistance reading.
- Manual override: Optionally, users can select their own channel period for full control.
- Supports both linear and logarithmic charts: Simple toggle to match your chart scaling.
Use cases:
- Trend following and channel trading strategies.
- Quick identification of dynamic support/resistance and liquidity zones.
- Objective selection of the most statistically significant trend channel, without manual guesswork.
- Suitable for all assets and timeframes (crypto, stocks, forex, futures).
Originality:
This script goes beyond basic regression channels by integrating dynamic profile analysis and fully adaptive period detection, offering a comprehensive tool for modern technical analysts. The combination of trend detection, market profile, and activity zone mapping is unique and not available in TradingView built-ins.
Instructions:
Add Adaptive Market Profile to your chart. By default, the script automatically detects the optimal channel period and displays the corresponding regression channel with dynamic profile and activity zones. If you prefer manual control, disable “Auto trend channel period” and set your preferred period. Adjust profile settings as needed for your asset and timeframe.
For questions, suggestions, or further customization, contact Julien Eche (@Julien_Eche) directly on TradingView.
Logistic Regression ICT FVG
🚀 OVERVIEW
Welcome to the Logistic Regression Fair Value Gap (FVG) System — a next-gen trading tool that blends precision gap detection with machine learning intelligence.
Unlike traditional FVG indicators, this one evolves with each bar of price action, scoring and filtering gaps based on real market behavior.
🔧 CORE FEATURES
✨ Smart Gap Detection
Automatically identifies bullish and bearish Fair Value Gaps using volatility-aware candle logic.
📊 Probability-Based Filtering
Uses logistic regression to assign each gap a confidence score (0 to 1), showing only high-probability setups.
🔁 Real-Time Retest Tracking
Continuously watches how price interacts with each gap to determine if it deserves respect.
📈 Multi-Factor Assessment
Evaluates RSI, MACD, and body size at gap formation to build a full context snapshot.
🧠 Self-Learning Engine
The logistic regression model updates on each bar using gradient descent, refining its predictions over time.
📢 Built-In Alerts
Get instant alerts when a gap forms, gets retested, or breaks.
🎨 Custom Display Options
Control the color of bullish/bearish zones, and toggle on/off probability labels for cleaner charts.
🚩 WHAT MAKES IT DIFFERENT
This isn’t just another box-drawing indicator.
While others mark every imbalance, this system thinks before it draws — using statistical modeling to filter out noise and prioritize high-impact zones.
By learning from how price behaves around gaps (not just how they form), it helps you trade only what matters — not what clutters.
⚙️ HOW IT WORKS
1️⃣ Detection
FVGs are identified using ATR-based thresholds and sharp wick imbalances.
2️⃣ Behavior Monitoring
Every gap is tracked — and if respected enough times, it becomes part of the elite training set.
3️⃣ Context Capture
Each new FVG logs RSI, MACD, and body size to provide a feature-rich context for prediction.
4️⃣ Prediction (Logistic Regression)
The model predicts how likely the gap is to be respected and assigns it a probability score.
5️⃣ Classification & Alerts
Gaps above the threshold are plotted with score labels, and alerts trigger for entry/respect/break.
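To make the learning loop concrete, here is a minimal Python sketch of an online logistic regression of the kind described above; it is not the indicator's source code. The three features, their scaling, and the 0.5 decision threshold are illustrative assumptions (the script exposes its own Prediction Threshold and Learning Rate inputs).

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class FVGModel:
    """Online logistic regression: estimates whether a fair value gap will be respected."""
    def __init__(self, n_features=3, lr=0.009):
        self.w = np.zeros(n_features)
        self.b = 0.0
        self.lr = lr

    def predict(self, x):
        return sigmoid(self.w @ x + self.b)          # probability the gap is respected

    def update(self, x, label):
        """One gradient-descent step on the logistic loss (label: 1 = respected, 0 = broken)."""
        err = self.predict(x) - label
        self.w -= self.lr * err * x
        self.b -= self.lr * err

# toy usage: hypothetical features = [rsi / 100, macd_hist, body_size / atr] at gap formation
model = FVGModel(lr=0.009)
x_new = np.array([0.62, 0.4, 1.1])
if model.predict(x_new) >= 0.5:                      # plot only gaps above the threshold
    print("high-probability FVG")
model.update(x_new, label=1)                         # later, train on the observed outcome
```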
⚙️ CONFIGURATION PANEL
🔧 System Inputs
• Max Retests – How many times a gap must be respected to train the model
• Prediction Threshold – Minimum score to show a gap on the chart
• Learning Rate – Controls how fast the model adapts (default: 0.009)
• Max FVG Lifetime – Expiration duration for unused gaps
• Show Historic Gaps – Show/hide expired or invalidated gaps
🎨 Visual Options
• Bullish/Bearish Colors – Set gap colors to fit your chart style
• Confidence Labels – Show probability scores next to FVGs
• Alert Toggles – Enable alerts for:
– New FVG detected
– FVG respected (entry)
– FVG invalidated (break)
💡 WHY LOGISTIC REGRESSION?
Traditional FVG tools rely on candle shapes.
This system relies on probability — by training on RSI, MACD, and price behavior, it predicts whether a gap will act as a true liquidity zone.
Logistic regression lets the system continuously adapt using new data, making it more accurate the longer it runs.
That means smarter signals, fewer false positives, and a clearer view of where real opportunities lie.
Ultimate Regression Channel v5.0 [WhiteStone_Ibrahim]
Ultimate Regression Channel v5.0: Comprehensive User Guide
This indicator is designed to visualize the current trend, potential support/resistance levels, and market volatility through a statistical analysis of price action. At its core, it plots a regression line (a trend line) based on prices over a specific period and adds channels based on standard deviation around this line.
1. Core Features and Settings
Length Mode:
Numerical (Manual): You define the number of bars to be used for the regression channel calculation. You can use lower values (e.g., 50-100) for short-term analysis and higher values (e.g., 200-300) to identify long-term trends.
Automatic (Based on Market Structure): This mode automatically draws the channel starting from the highest high or lowest low that has formed within the Auto Scan Period. This allows the indicator to adapt itself to significant market turning points (swing points), which is highly useful.
Regression Model:
Linear: Calculates the trend as a straight line. It generally works well in stable, short-to-medium-term trends.
Logarithmic: Calculates the trend as a curved line. It more accurately reflects price action, especially on long-term charts or for assets that experience exponential growth/decline (like cryptocurrencies or growth stocks).
Channel Widths:
These settings determine how far from the central trend line (in terms of standard deviations) the channels will be drawn.
The 0 (Inner), 1 (Middle), and 2 (Outer) channels represent the "normal" range of price movement and the "extreme" zones. Statistically, about 95% of all price action occurs within the outer channels (2nd standard deviation).
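For reference, a bare-bones Python sketch of how such a channel is typically computed is shown below: regress price on bar index, then offset the fitted line by multiples of the standard deviation of the residuals. The length, multipliers, and synthetic data are illustrative and not taken from the script.

```python
import numpy as np

def regression_channel(closes, length=200, mults=(1.0, 2.0)):
    """Linear regression over `length` bars plus channels at +/- k standard deviations of the residuals."""
    y = np.asarray(closes[-length:], dtype=float)
    x = np.arange(length, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)
    mid = intercept + slope * x
    sd = np.std(y - mid)
    channels = {k: (mid + k * sd, mid - k * sd) for k in mults}
    return mid, sd, slope, channels

closes = 100 + np.cumsum(np.random.normal(0.05, 1.0, 500))
mid, sd, slope, ch = regression_channel(closes)
upper2, lower2 = ch[2.0]
outside = closes[-1] > upper2[-1] or closes[-1] < lower2[-1]
print(f"slope = {slope:.3f}, last close outside the 2-sigma channel: {outside}")
```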
2. Visual Extras and Their Interpretation
Breakout Style:
This feature alerts you when the price closes above the uppermost channel (Channel 2) with a green arrow/background or below the lowermost channel with a red arrow/background.
This is a very important signal. A breakout can signify that the current trend is strengthening and likely to continue (a breakout/trend-following strategy) or that the market has become overextended and may be due for a reversal (an exhaustion/top-bottom signal). It is critical to confirm this signal with other indicators (e.g., RSI, Volume).
Info Label:
This provides an at-a-glance summary of the channel on the right side of the chart:
Trend Status: Identifies the trend as "Uptrend," "Downtrend," or "Sideways" based on the slope of the centerline. The Horizontal Threshold setting allows you to filter out noise by treating very small slopes as "Sideways."
Regression Model and Length: Shows your current settings.
Trend Slope: A numerical value representing how steep or weak the trend is.
Channel Width: Shows the price difference between the outermost channels. This is a measure of current volatility. A widening channel indicates increasing volatility, while a narrowing one indicates decreasing volatility.
3. What Users Should Pay Attention To & Best Practices
Define Your Strategy: Mean Reversion or Breakout?
Mean Reversion: If the market is in a ranging or gently trending phase, the price will tend to revert to the centerline after hitting the outer channels (overbought/oversold zones). In this case, the outer channels can be considered opportunities to sell (upper channel) or buy (lower channel).
Breakout: If a strong trend is in place, a price close beyond an outer channel can be a sign that the trend is accelerating. In this scenario, one might consider taking a position in the direction of the breakout. Correctly analyzing the current market state (ranging vs. trending) is key to deciding which strategy to employ.
Don't Use It in Isolation: No indicator is a holy grail. Use the Regression Channel in conjunction with other tools. Confirm signals with RSI divergences for overbought/oversold conditions, Moving Averages for the overall trend direction, or Volume indicators to confirm the strength of a breakout.
Choose the Right Model: On shorter-term charts (e.g., 1-hour, 4-hour), the Linear model is often sufficient. However, on long-term charts like the daily, weekly, or monthly, the Logarithmic model will provide much more accurate results, especially for assets with parabolic movements.
The Power of Automatic Mode: The Automatic length mode is often the most practical choice because it finds the most logical starting point for you. It saves you the trouble of adjusting settings, especially when analyzing different assets or timeframes.
Use the Alerts: If you don't want to miss the moment the price touches a key channel line, set up an alert from the Alert Settings section for your desired line (e.g., only the "Outer Channels"). This helps you catch opportunities even when you are not in front of the screen.
Bitcoin Power Law [LuxAlgo]
The Bitcoin Power Law tool is a representation of Bitcoin prices first proposed by Giovanni Santostasi, Ph.D. It plots BTCUSD daily closes on a log10-log10 scale, and fits a linear regression channel to the data.
This channel helps traders visualise when the price is historically in a zone prone to tops or located within a discounted zone subject to future growth.
🔶 USAGE
Giovanni Santostasi, Ph.D. originated the Bitcoin Power-Law Theory; this implementation places it directly on a TradingView chart. The white line shows the daily closing price, while the cyan line is the best-fit regression.
A channel is constructed from the linear fit's root mean squared error (RMSE); we can observe how price has repeatedly oscillated between the channel areas through every bull-bear cycle.
Excursions into the upper channel area can be followed by price surges that finish on a top, whereas price touching the lower channel area often coincides with a cycle low.
Users can change the channel area multipliers, helping capture moves more precisely depending on the intended usage.
This tool only works on the daily BTCUSD chart. Ticker and timeframe must match exactly for the calculations to remain valid.
🔹 Linear Scale
Users can toggle a linear scale for the time axis in order to obtain a higher resolution of the price (this will affect the linear regression channel fit, making it look poorer).
🔶 DETAILS
One of the advantages of the Power Law Theory proposed by Giovanni Santostasi is its ability to explain multiple behaviors of Bitcoin. We describe some key points below.
🔹 Power-Law Overview
A power law has the form y = A·xⁿ , and Bitcoin’s key variables follow this pattern across many orders of magnitude. Empirically, price rises roughly with t⁶, hash-rate with t¹² and the number of active addresses with t³.
When we plot these on log-log axes they appear as straight lines, revealing a scale-invariant system whose behaviour repeats proportionally as it grows.
🔹 Feedback-Loop Dynamics
Growth begins with new users, whose presence pushes the price higher via a Metcalfe-style square-law. A richer price pool funds more mining hardware; the Difficulty Adjustment immediately raises the hash-rate requirement, keeping profit margins razor-thin.
A higher hash rate secures the network, which in turn attracts the next wave of users. Because risk and Difficulty act as braking forces, user adoption advances as a power of three in time rather than an unchecked S-curve. This circular causality repeats without end, producing the familiar boom-and-bust cadence around the long-term power-law channel.
🔹 Scale Invariance & Predictions
Scale invariance means that enlarging the timeline in log-log space leaves the trajectory unchanged.
The same geometric proportions that described the first dollar of value can therefore extend to a projected million-dollar bitcoin, provided no catastrophic break occurs. Institutional ETF inflows supply fresh capital but do not bend the underlying slope; only a persistent deviation from the line would falsify the current model.
🔹 Implications
The theory assigns scarcity no direct role; iterative feedback and the Difficulty Adjustment are sufficient to govern Bitcoin’s expansion. Long-term valuation should focus on position within the power-law channel, while bubbles—sharp departures above trend that later revert—are expected punctuations of an otherwise steady climb.
Beyond about 2040, disruptive technological shifts could alter the parameters, but for the next order of magnitude the present slope remains the simplest, most robust guide.
Bitcoin behaves less like a traditional asset and more like a self-organising digital organism whose value, security, and adoption co-evolve according to immutable power-law rules.
🔶 SETTINGS
🔹 General
Start Calculation: Determine the start date used by the calculation, with any prior prices being ignored. (default - 15 Jul 2010)
Use Linear Scale for X-Axis: Convert the horizontal axis from log(time) to linear calendar time
🔹 Linear Regression
Show Regression Line: Enable/disable the central power-law trend line
Regression Line Color: Choose the colour of the regression line
Mult 1: Toggle line & fill, set multiplier (default +1), pick line colour and area fill colour
Mult 2: Toggle line & fill, set multiplier (default +0.5), pick line colour and area fill colour
Mult 3: Toggle line & fill, set multiplier (default -0.5), pick line colour and area fill colour
Mult 4: Toggle line & fill, set multiplier (default -1), pick line colour and area fill colour
🔹 Style
Price Line Color: Select the colour of the BTC price plot
Auto Color: Automatically choose the best contrast colour for the price line
Price Line Width: Set the thickness of the price line (1 – 5 px)
Show Halvings: Enable/disable dotted vertical lines at each Bitcoin halving
Halvings Color: Choose the colour of the halving lines
Support and Resistance Logistic Regression | Flux Charts
💎 GENERAL OVERVIEW
Introducing our new Logistic Regression Support / Resistance indicator! This tool leverages advanced statistical modeling "Logistic Regressions" to identify and project key price levels where the market is likely to find support or resistance. For more information about the process, please check the "HOW DOES IT WORK ?" section.
Logistic Regression Support / Resistance Features :
Intelligent S/R Identification : The indicator uses a logistic regression model to intelligently identify and plot significant support and resistance levels.
Predictive Probability : Each identified level comes with a calculated probability, indicating how likely it is to act as a true support or resistance based on historical data.
Retest & Break Labels : The indicator clearly marks on your chart when a detected support or resistance level is retested (price touches and respects the level) or broken (price decisively crosses through the level).
Alerts : Real-time alerts for support retests, resistance retests, support breaks, and resistance breaks.
Customizable : You can change support & resistance line style, width and colors.
🚩 UNIQUENESS
What makes this indicator truly unique is its application of logistic regression to the concept of support and resistance. Instead of merely identifying historical highs and lows, our indicator uses a statistical model to predict the future efficacy of these levels. It analyzes underlying market conditions (like RSI and body size at pivot formation) to assign a probability to each potential S/R zone. This predictive insight, combined with dynamic, real-time labeling of retests and breaks, provides a more robust and adaptive understanding of market structure than traditional, purely historical methods.
📌HOW DOES IT WORK ?
The Logistic Regression Support / Resistance indicator operates in several key steps:
First, it identifies significant pivot highs and lows on the chart based on a user-defined "Pivot Length." These pivots are potential areas of support or resistance.
For each detected pivot, the indicator extracts relevant market data at that specific point, including the RSI (Relative Strength Index) and the Body Size (the absolute difference between the open and close price of the candle). These serve as input features for the model.
The core of the indicator lies in its logistic regression model. This model is continuously trained on past pivot data and their subsequent behavior (i.e., whether they were "respected" as support/resistance multiple times). It learns the relationship between the extracted features (RSI, Body Size) and the likelihood of a pivot becoming a significant S/R level.
When a new pivot is identified, the model uses its learned insights to calculate a prediction value—a probability (from 0 to 1) that this specific pivot will act as a strong support or resistance.
If the calculated probability exceeds a user-defined "Probability Threshold," the pivot is designated a "Regression Pivot" and drawn on the chart as a support or resistance line. The indicator then actively tracks how price interacts with these levels, displaying "R" labels for retests when the price bounces off the level and "B" labels for breaks when the price closes beyond it.
⚙️ SETTINGS
1. General Configuration
Pivot Length: This setting defines the number of bars used to determine a significant high or low for pivot detection.
Target Respects: This input specifies how many times a level must be "respected" by price action for it to be considered a strong support or resistance level by the underlying model.
Probability Threshold: This is the minimum probability output from the logistic regression model for a detected pivot to be considered a valid support or resistance level and be plotted on the chart.
2. Style
Show Prediction Labels: Enable or disable labels that display the calculated probability of a newly identified regression S/R level.
Show Retests: Toggle the visibility of "R" labels on the chart, which mark instances where price has retested a support or resistance level.
Show Breaks: Toggle the visibility of "B" labels on the chart, which mark instances where price has broken through a support or resistance level.
Weighted Regression Bands (Zeiierman)
█ Overview
Weighted Regression Bands is a precision-engineered trend and volatility tool designed to adapt to the real market structure instead of reacting to price noise.
This indicator analyzes Weighted High/Low medians and applies user-selectable smoothing methods — including Kalman Filtering, ALMA, and custom Linear Regression — to generate a Fair Value line. Around this, it constructs dynamic standard deviation bands that adapt in real-time to market volatility.
The result is a visually clean and structurally intelligent trend framework suitable for breakout traders, mean reversion strategies, and trend-driven analysis.
█ How It Works
⚪ Structural High/Low Analysis
At the heart of this indicator is a custom high/low weighting system. Instead of using just the raw high or low values, it calculates a midline = (high + low) / 2, then applies one of three weighting methods to determine which price zones matter most.
Users can select the method using the “Weighted HL Method” setting:
Simple
Selects the single most dominant median (highest or lowest) in the lookback window. Ideal for fast, reactive signals.
Advanced
Ranks each bar based on a composite score: median × range × recency. This method highlights structurally meaningful bars that had both volatility and recency. A built-in Kalman filter is applied for extra stability.
Smooth
Blends multiple bars into a single weighted average using smoothed decay and range. This provides the softest and most stable structural response.
⚪ Smoothing Methods (ALMA / Linear Regression)
ALMA provides responsive, low-lag smoothing for fast trend reading.
Linear Regression projects the Fair Value forward, ideal for trend modeling.
⚪ Kalman Smoothing Filter
Before trend calculations, the indicator applies an optional Kalman-style smoothing filter. This helps:
Reduce choppy false shifts in trend,
Retain signal clarity during volatile periods,
Provide stability for long-term setups.
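To illustrate the flavour of filter involved, here is a small one-dimensional Kalman-style smoother in Python, assuming a constant-level model with fixed process and measurement noise; the noise values are illustrative assumptions, not the indicator's internal parameters.

```python
import numpy as np

def kalman_smooth(series, process_var=1e-4, measurement_var=1e-2):
    """1-D Kalman filter with a constant-level model: heavier measurement noise means stronger smoothing."""
    series = np.asarray(series, dtype=float)
    est, p = series[0], 1.0                    # initial state estimate and its variance
    out = np.empty_like(series)
    for i, z in enumerate(series):
        p = p + process_var                    # predict: uncertainty grows between observations
        k = p / (p + measurement_var)          # Kalman gain
        est = est + k * (z - est)              # update toward the new observation
        p = (1.0 - k) * p
        out[i] = est
    return out

raw = 100 + np.cumsum(np.random.normal(0, 1, 300))
smoothed = kalman_smooth(raw)
```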
⚪ Deviation Bands (Dynamic Volatility Envelopes)
The indicator builds ±1, ±2, and ±3 standard deviation bands around the fair value line:
Calculated from the standard deviation of price,
Bands expand and contract based on recent volatility,
Visualizes potential overbought/oversold or trending conditions.
█ How to Use
⚪ Trend Trading & Filtering
Use the Fair Value line to identify the dominant direction.
Only trade in the direction of the slope for higher probability setups.
⚪ Volatility-Based Entries
Watch for price reaching outer bands (+2σ, +3σ) for possible exhaustion.
Mean reversion entries become higher quality when far from Fair Value.
█ Settings
Length – Lookback for Weighted HL and trend smoothing
Deviation Multiplier – Controls how wide the bands are from the fair value line
Method – Choose between ALMA or Linear Regression smoothing
Smoothing – Strength of Kalman Filter (1 = none, <1 = stronger smoothing)
-----------------
Disclaimer
The content provided in my scripts, indicators, ideas, algorithms, and systems is for educational and informational purposes only. It does not constitute financial advice, investment recommendations, or a solicitation to buy or sell any financial instruments. I will not accept liability for any loss or damage, including without limitation any loss of profit, which may arise directly or indirectly from the use of or reliance on such information.
All investments involve risk, and the past performance of a security, industry, sector, market, financial product, trading strategy, backtest, or individual's trading does not guarantee future results or returns. Investors are fully responsible for any investment decisions they make. Such decisions should be based solely on an evaluation of their financial circumstances, investment objectives, risk tolerance, and liquidity needs.
Bitcoin Power Law Oscillator
This is the oscillator version of the script. The main body of the script can be found here.
Understanding the Bitcoin Power Law Model
Also called the Long-Term Bitcoin Power Law Model. The Bitcoin Power Law model tries to capture and predict Bitcoin's price growth over time. It assumes that Bitcoin's price follows an exponential growth pattern, where the price increases over time according to a mathematical relationship.
By fitting a power law to historical data, the model creates a trend line that represents this growth. It then generates additional parallel lines (support and resistance lines) to show potential price boundaries, helping to visualize where Bitcoin’s price could move within certain ranges.
In simple terms, the model helps us understand Bitcoin's general growth trajectory and provides a framework to visualize how its price could behave over the long term.
The Bitcoin Power Law has the following function:
Power Law = 10^(a + b * log10(d))
Consisting of the following parameters:
a: Power Law Intercept (default: -17.668).
b: Power Law Slope (default: 5.926).
d: Number of days since a reference point (calculated by counting bars from the reference point with an offset).
Explanation of the a and b parameters:
Roughly explained, the optimal values for the a and b parameters are determined through a process of linear regression on a log-log scale (after applying a logarithmic transformation to both the x and y axes). On this log-log scale, the power law relationship becomes linear, making it possible to apply linear regression. The best fit for the regression is then evaluated using metrics like the R-squared value, residual error analysis, and visual inspection. This process can be quite complex and is beyond the scope of this post.
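As a rough illustration of that fitting step (and of the vertical shifts described next), the Python sketch below runs an ordinary least-squares fit of log10(price) on log10(days) and evaluates the resulting power law with an optional day offset. The synthetic data stand in for real BTC daily closes, so the recovered a and b are for demonstration only.

```python
import numpy as np

def fit_power_law(days, closes):
    """OLS of log10(price) on log10(days): returns intercept a and slope b of 10**(a + b*log10(d))."""
    x = np.log10(np.asarray(days, dtype=float))
    y = np.log10(np.asarray(closes, dtype=float))
    b, a = np.polyfit(x, y, 1)                 # polyfit returns the slope first, then the intercept
    return a, b

def power_law_price(days, a, b, shift_days=0.0):
    """Evaluate the model; adding days to d produces the parallel, vertically shifted lines."""
    return 10 ** (a + b * np.log10(np.asarray(days, dtype=float) + shift_days))

# toy usage with synthetic data (the real fit uses BTC daily closes since the reference date)
d = np.arange(1, 5000, dtype=float)
price = 10 ** (-17.668 + 5.926 * np.log10(d)) * np.exp(np.random.normal(0, 0.3, d.size))
a, b = fit_power_law(d, price)
five_year_line = power_law_price(d, a, b, shift_days=5 * 365)
```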
Applying vertical shifts to generate the other lines:
Once the initial power-law is created, additional lines are generated by applying a vertical shift. This shift is achieved by adding a specific number of days (or years in case of this script) to the d-parameter. This creates new lines perfectly parallel to the initial power law with an added vertical shift, maintaining the same slope and intercept.
In the case of this script, shifts are made by adding +365 days, +2 * 365 days, +3 * 365 days, +4 * 365 days, and +5 * 365 days, effectively introducing one to five years of shifts. This results in a total of six Power Law lines, as outlined below (From lowest to highest):
Base Power Law Line (no shift)
1-year shifted line
2-year shifted line
3-year shifted line
4-year shifted line
5-year shifted line
The six power law lines:
Bitcoin Power Law Oscillator
This publication also includes the oscillator version of the Bitcoin Power Law. This version applies a logarithmic transformation to the price, Base Power Law Line, and 5-year shifted line using the formula: log10(x) .
The log-transformed price is then normalized using min-max normalization relative to the log-transformed Base Power Law Line and 5-year shifted line with the formula:
normalized price = (log(close) - log(Base Power Law Line)) / (log(5-year shifted line) - log(Base Power Law Line))
Finally, the normalized price was multiplied by 5 to map its value between 0 and 5, aligning with the shifted lines.
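Written out with explicit parentheses, the oscillator reduces to a few lines; the Python sketch below assumes the close and the two power law lines are already computed for the bar in question, with the example values being purely illustrative.

```python
import numpy as np

def power_law_oscillator(close, base_line, five_year_line):
    """Min-max normalize log10(close) between the base and 5-year shifted lines, scaled to 0..5."""
    lc, lb, lt = np.log10(close), np.log10(base_line), np.log10(five_year_line)
    return 5.0 * (lc - lb) / (lt - lb)

# example values; the result lies between 0 (base line) and 5 (5-year shifted line)
print(power_law_oscillator(close=60_000.0, base_line=30_000.0, five_year_line=180_000.0))
```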
Interpretation of the Bitcoin Power Law Model:
The shifted Power Law lines provide a framework for predicting Bitcoin's future price movements based on historical trends. These lines are created by applying a vertical shift to the initial Power Law line, with each shifted line representing a future time frame (e.g., 1 year, 2 years, 3 years, etc.).
By analyzing these shifted lines, users can make predictions about minimum price levels at specific future dates. For example, the 5-year shifted line will act as the main support level for Bitcoin’s price in 5 years, meaning that Bitcoin’s price should not fall below this line, ensuring that Bitcoin will be valued at least at this level by that time. Similarly, the 2-year shifted line will serve as the support line for Bitcoin's price in 2 years, establishing that the price should not drop below this line within that time frame.
On the other hand, the 5-year shifted line also functions as an absolute resistance , meaning Bitcoin's price will not exceed this line prior to the 5-year mark. This provides a prediction that Bitcoin cannot reach certain price levels before a specific date. For example, the price of Bitcoin is unlikely to reach $100,000 before 2021, and it will not exceed this price before the 5-year shifted line becomes relevant. After 2028, however, the price is predicted to never fall below $100,000, thanks to the support established by the shifted lines.
In essence, the shifted Power Law lines offer a way to predict both the minimum price levels that Bitcoin will hit by certain dates and the earliest dates by which certain price points will be reached. These lines help frame Bitcoin's potential future price range, offering insight into long-term price behavior and providing a guide for investors and analysts. Lets examine some examples:
Example 1:
In Example 1 it can be seen that point A on the 5-year shifted line acts as major resistance . Also it can be seen that 5 years later this price level now corresponds to the Base Power Law Line and acts as a major support at point B(Note: Vertical yearly grid lines have been added for this purpose👍).
Example 2:
In Example 2, the price level at point C on the 3-year shifted line becomes a major support three years later at point D, now aligning with the Base Power Law Line.
Finally, let's explore some future price predictions, as this script provides projections on the weekly timeframe :
Example 3:
In Example 3, the Bitcoin Power Law indicates that Bitcoin's price cannot surpass approximately $808K before 2030 as can be seen at point E, while also ensuring it will be at least $224K by then (point F).
Bitcoin Power Law
This is the main body version of the script. The Oscillator version can be found here.
Understanding the Bitcoin Power Law Model
Also called the Long-Term Bitcoin Power Law Model. The Bitcoin Power Law model tries to capture and predict Bitcoin's price growth over time. It assumes that Bitcoin's price follows an exponential growth pattern, where the price increases over time according to a mathematical relationship.
By fitting a power law to historical data, the model creates a trend line that represents this growth. It then generates additional parallel lines (support and resistance lines) to show potential price boundaries, helping to visualize where Bitcoin’s price could move within certain ranges.
In simple terms, the model helps us understand Bitcoin's general growth trajectory and provides a framework to visualize how its price could behave over the long term.
The Bitcoin Power Law has the following function:
Power Law = 10^(a + b * log10(d))
Consisting of the following parameters:
a: Power Law Intercept (default: -17.668).
b: Power Law Slope (default: 5.926).
d: Number of days since a reference point (calculated by counting bars from the reference point with an offset).
Explanation of the a and b parameters:
Roughly explained, the optimal values for the a and b parameters are determined through a process of linear regression on a log-log scale (after applying a logarithmic transformation to both the x and y axes). On this log-log scale, the power law relationship becomes linear, making it possible to apply linear regression. The best fit for the regression is then evaluated using metrics like the R-squared value, residual error analysis, and visual inspection. This process can be quite complex and is beyond the scope of this post.
Applying vertical shifts to generate the other lines:
Once the initial power-law is created, additional lines are generated by applying a vertical shift. This shift is achieved by adding a specific number of days (or years in case of this script) to the d-parameter. This creates new lines perfectly parallel to the initial power law with an added vertical shift, maintaining the same slope and intercept.
In the case of this script, shifts are made by adding +365 days, +2 * 365 days, +3 * 365 days, +4 * 365 days, and +5 * 365 days, effectively introducing one to five years of shifts. This results in a total of six Power Law lines, as outlined below (From lowest to highest):
Base Power Law Line (no shift)
1-year shifted line
2-year shifted line
3-year shifted line
4-year shifted line
5-year shifted line
The six power law lines:
Bitcoin Power Law Oscillator
This publication also includes the oscillator version of the Bitcoin Power Law. This version applies a logarithmic transformation to the price, Base Power Law Line, and 5-year shifted line using the formula: log10(x) .
The log-transformed price is then normalized using min-max normalization relative to the log-transformed Base Power Law Line and 5-year shifted line with the formula:
normalized price = (log(close) - log(Base Power Law Line)) / (log(5-year shifted line) - log(Base Power Law Line))
Finally, the normalized price was multiplied by 5 to map its value between 0 and 5, aligning with the shifted lines.
Interpretation of the Bitcoin Power Law Model:
The shifted Power Law lines provide a framework for predicting Bitcoin's future price movements based on historical trends. These lines are created by applying a vertical shift to the initial Power Law line, with each shifted line representing a future time frame (e.g., 1 year, 2 years, 3 years, etc.).
By analyzing these shifted lines, users can make predictions about minimum price levels at specific future dates. For example, the 5-year shifted line will act as the main support level for Bitcoin’s price in 5 years, meaning that Bitcoin’s price should not fall below this line, ensuring that Bitcoin will be valued at least at this level by that time. Similarly, the 2-year shifted line will serve as the support line for Bitcoin's price in 2 years, establishing that the price should not drop below this line within that time frame.
On the other hand, the 5-year shifted line also functions as an absolute resistance , meaning Bitcoin's price will not exceed this line prior to the 5-year mark. This provides a prediction that Bitcoin cannot reach certain price levels before a specific date. For example, the price of Bitcoin is unlikely to reach $100,000 before 2021, and it will not exceed this price before the 5-year shifted line becomes relevant. After 2028, however, the price is predicted to never fall below $100,000, thanks to the support established by the shifted lines.
In essence, the shifted Power Law lines offer a way to predict both the minimum price levels that Bitcoin will hit by certain dates and the earliest dates by which certain price points will be reached. These lines help frame Bitcoin's potential future price range, offering insight into long-term price behavior and providing a guide for investors and analysts. Lets examine some examples:
Example 1:
In Example 1 it can be seen that point A on the 5-year shifted line acts as major resistance . Also it can be seen that 5 years later this price level now corresponds to the Base Power Law Line and acts as a major support at point B (Note: Vertical yearly grid lines have been added for this purpose👍).
Example 2:
In Example 2, the price level at point C on the 3-year shifted line becomes a major support three years later at point D, now aligning with the Base Power Law Line.
Finally, let's explore some future price predictions, as this script provides projections on the weekly timeframe :
Example 3:
In Example 3, the Bitcoin Power Law indicates that Bitcoin's price cannot surpass approximately $808K before 2030 as can be seen at point E, while also ensuring it will be at least $224K by then (point F).
PolyBand Convergence System (PBCS)
The PolyBand Convergence System (PBCS) is an advanced technical analysis indicator that combines multiple polynomial regressions with statistical bands to identify trend strength and potential reversal zones.
Key Features
Multi-Degree Polynomial Analysis: Combines 1st, 2nd, 3rd, and 4th degree polynomial regressions into a composite regression line
Adaptive Statistical Bands: Uses percentile-based bands enhanced with standard deviation multipliers
Asymmetric Volatility Measurement: Separately calculates upside and downside volatility for more accurate band placement
Smart Trend Detection: Identifies bullish, bearish, or neutral market conditions based on price position relative to bands
How It Works
PBCS creates a composite regression line from multiple polynomial fits to better capture the underlying price structure. This line is then surrounded by adaptive bands that represent statistical thresholds for price movement. When price breaks above the upper band, a bullish trend is signaled; when it breaks below the lower band, a bearish trend is indicated.
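The exact internals are not published here, but a simplified Python sketch of the general approach might look like the following: average degree 1-4 polynomial fits into a composite line, then place asymmetric bands using percentiles of the positive and negative residuals, widened by a standard-deviation multiplier. All parameter values are illustrative assumptions.

```python
import numpy as np

def polyband(closes, length=200, upper_pct=90, lower_pct=10, sd_mult=0.5):
    """Composite of degree 1-4 polynomial fits plus percentile-based bands around it."""
    y = np.asarray(closes[-length:], dtype=float)
    x = np.arange(length, dtype=float)
    fits = [np.polyval(np.polyfit(x, y, deg), x) for deg in (1, 2, 3, 4)]
    composite = np.mean(fits, axis=0)
    resid = y - composite
    up, dn = resid[resid >= 0], resid[resid < 0]
    upper = composite + np.percentile(up, upper_pct) + sd_mult * np.std(up)   # upside volatility only
    lower = composite + np.percentile(dn, lower_pct) - sd_mult * np.std(dn)   # downside volatility only
    return composite, upper, lower

closes = 100 + np.cumsum(np.random.normal(0, 1, 400))
mid, upper, lower = polyband(closes)
trend = "bullish" if closes[-1] > upper[-1] else "bearish" if closes[-1] < lower[-1] else "neutral"
print(trend)
```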
Customization Options
Regression Settings: Adjust source data, lookback period, and smoothing parameters
Percentile Controls: Fine-tune the statistical thresholds for upper and lower bands
Volatility Sensitivity: Modify standard deviation multipliers to control band width
Visual Preferences: Choose from multiple color schemes to match your trading platform
Disclaimer
This indicator is provided for educational and informational purposes only and does not constitute investment advice. Trading involves risk and may result in financial loss. Always perform your own research and consult with a qualified financial advisor before making any trading decisions.