PCA-Risk Indicator
OBJECTIVE:
The objective of this indicator is to synthesize, via PCA (Principal Component Analysis), several of the most widely used indicators in order to simplify the reading of any asset on any timeframe.
It is based on my Bitcoin Risk Long Term indicator, and is the evolution of another, unpublished indicator, 'Average Risk Indicator'.
The idea of this indicator is to use statistics, in this case PCA, to reduce the number of dimensions (indicators) by aggregating them into a few synthetic indicators (PCX).
I invite you to dig deeper into PCA; in short, it tries to retain as much information as possible from the raw data: the signal minus the noise.
I created this indicator a year ago, but I am publishing it now because I see no point in keeping it private.
USAGE:
Unlike the Bitcoin Risk Long Term indicator, it does not make sense to change or disable the input indicators unless you use the 'Average Indicator' function, because each input is weighted to generate the outputs, the PCX.
I extracted several price series (Bitcoin, Gold, S&P, CAC40) on several timeframes (W, D, 4h, 1h) from TradingView and used the generated Excel files as the data on which I ran the PCA analysis.
The results:
explained_variance_ratio: 0.55540809 / 0.13021972 / 0.07303142 / 0.03760925
explained_variance: 11.6639671 / 2.73470717 / 1.53371209 / 0.7898212
Interpretation:
Simply put, 55% of the information contained in the input indicators can be represented with PC1, a further 13% with PC2, 7% with PC3, and 3% with PC4.
What is important to understand is that PC1, which in a way serves as a thermometer, gives a simpler indication of overbought or oversold areas than any other single indicator.
PC2, which is harder to interpret, is more reactive because it precedes PC1, but it can give false signals.
PC3 and PC4 do not seem relevant to me.
The way I use it is to take PC1 as the Risk indicator and display PC2 with 'Area'. When PC2 turns around and PC1 reaches an extreme, these can be good points to act.
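The offline analysis itself is not included in the publication, but figures like the explained_variance_ratio above are what scikit-learn's PCA reports. A minimal sketch, assuming the TradingView export was loaded into Python (the file name and indicator column names below are hypothetical):

```python
# Hypothetical sketch: how an explained_variance_ratio like the one above can be produced.
# Assumes a CSV exported from TradingView with one column per input indicator.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("btc_daily_indicators.csv")                      # hypothetical export
X = df[["rsi", "stoch", "macd_hist", "bb_pct", "mfi"]].dropna()   # hypothetical columns

X_std = StandardScaler().fit_transform(X)   # PCA is sensitive to scale
pca = PCA(n_components=4).fit(X_std)

print(pca.explained_variance_ratio_)        # e.g. [0.55, 0.13, 0.07, 0.03]
print(pca.explained_variance_)

# PC1 can then serve as the synthetic risk series, PC2 as the faster 'Area' component.
scores = pca.transform(X_std)
pc1, pc2 = scores[:, 0], scores[:, 1]
```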
NOTES:
- It is surprising that a simple average of all the indicators gives a fairly relevant result
- With 'Average Indicator' as the Risk indicator, you can combine the indicators of your choice and see the predictive power via the bar coloring.
- You can add alerts on the levels of your choice on the Risk Indicator
- If you have any ideas for adding an indicator, modifications, criticism, or bugs found: share them, it's appreciated!
---- FR ----
OBJECTIVE:
The objective of this indicator is to synthesize, via PCA (Principal Component Analysis), several of the most widely used indicators in order to simplify the reading of any asset on any timeframe.
It is inspired by my 'Bitcoin Risk Long Term indicator' and is the evolution of another, unpublished indicator, 'Average Risk Indicator'.
The idea of this indicator is to use statistics, in this case PCA, to reduce the number of dimensions (indicators) by aggregating them into a few synthetic indicators (PCX).
I invite you to dig deeper into PCA; in short, it seeks to retain as much information as possible from the raw data: the signal minus the noise.
I created this indicator a year ago, but I am publishing it now because I see no point in keeping it private.
USAGE:
Unlike the 'Bitcoin Risk Long Term indicator', it does not make sense to change or disable the input indicators unless you use the 'Average Indicator' function, because each input is weighted to generate the outputs, the PCX.
I extracted several price series (Bitcoin, Gold, S&P, CAC40) on several timeframes (W, D, 4h, 1h) from TradingView and used the generated Excel files as the data on which I ran the PCA analysis.
The results:
explained_variance_ratio: 0.55540809 / 0.13021972 / 0.07303142 / 0.03760925
explained_variance: 11.6639671 / 2.73470717 / 1.53371209 / 0.7898212
Interpretation:
Simply put, 55% of the information contained in the input indicators can be represented with PC1, a further 13% with PC2, 7% with PC3, and 3% with PC4.
What you should understand is that PC1, which in a way serves as a thermometer, gives a simpler indication of overbought or oversold areas than any other single indicator.
PC2, which is harder to interpret, is more reactive because it precedes PC1, but it can give false signals.
PC3 and PC4 do not seem relevant to me.
The way I use it is to take PC1 as the Risk indicator and display PC2 with 'Region'. When PC2 turns around and PC1 reaches an extreme, these can be good points to act.
NOTES:
- It is surprising that a simple average of all the indicators gives a fairly relevant result
- With 'Average Indicator' as the Risk indicator, you can combine the indicators of your choice and see the predictive power via the bar coloring.
- You can add alerts on the levels of your choice on the Risk Indicator
- If you have any ideas for adding an indicator, modifications, criticism, or bugs found: share them, it's appreciated!
Endpointed SSA of Price [Loxx]
The Endpointed SSA of Price: A Comprehensive Tool for Market Analysis and Decision-Making
The financial markets present sophisticated challenges for traders and investors as they navigate the complexities of market behavior. To effectively interpret and capitalize on these complexities, it is crucial to employ powerful analytical tools that can reveal hidden patterns and trends. One such tool is the Endpointed SSA of Price, which combines the strengths of Caterpillar Singular Spectrum Analysis, a sophisticated time series decomposition method, with insights from the fields of economics, artificial intelligence, and machine learning.
The Endpointed SSA of Price has its roots in the interdisciplinary fusion of mathematical techniques, economic understanding, and advancements in artificial intelligence. This unique combination allows for a versatile and reliable tool that can aid traders and investors in making informed decisions based on comprehensive market analysis.
The Endpointed SSA of Price is not only valuable for experienced traders but also serves as a useful resource for those new to the financial markets. By providing a deeper understanding of market forces, this innovative indicator equips users with the knowledge and confidence to better assess risks and opportunities in their financial pursuits.
█ Exploring Caterpillar SSA: Applications in AI, Machine Learning, and Finance
Caterpillar SSA (Singular Spectrum Analysis) is a non-parametric method for time series analysis and signal processing. It is based on a combination of principles from classical time series analysis, multivariate statistics, and the theory of random processes. The method was initially developed in the early 1990s by a group of Russian mathematicians, including Golyandina, Nekrutkin, and Zhigljavsky.
Background Information:
SSA is an advanced technique for decomposing time series data into a sum of interpretable components, such as trend, seasonality, and noise. This decomposition allows for a better understanding of the underlying structure of the data and facilitates forecasting, smoothing, and anomaly detection. Caterpillar SSA is a particular implementation of SSA that has proven to be computationally efficient and effective for handling large datasets.
Uses in AI and Machine Learning:
In recent years, Caterpillar SSA has found applications in various fields of artificial intelligence (AI) and machine learning. Some of these applications include:
1. Feature extraction: Caterpillar SSA can be used to extract meaningful features from time series data, which can then serve as inputs for machine learning models. These features can help improve the performance of various models, such as regression, classification, and clustering algorithms.
2. Dimensionality reduction: Caterpillar SSA can be employed as a dimensionality reduction technique, similar to Principal Component Analysis (PCA). It helps identify the most significant components of a high-dimensional dataset, reducing the computational complexity and mitigating the "curse of dimensionality" in machine learning tasks.
3. Anomaly detection: The decomposition of a time series into interpretable components through Caterpillar SSA can help in identifying unusual patterns or outliers in the data. Machine learning models trained on these decomposed components can detect anomalies more effectively, as the noise component is separated from the signal.
4. Forecasting: Caterpillar SSA has been used in combination with machine learning techniques, such as neural networks, to improve forecasting accuracy. By decomposing a time series into its underlying components, machine learning models can better capture the trends and seasonality in the data, resulting in more accurate predictions.
Application in Financial Markets and Economics:
Caterpillar SSA has been employed in various domains within financial markets and economics. Some notable applications include:
1. Stock price analysis: Caterpillar SSA can be used to analyze and forecast stock prices by decomposing them into trend, seasonal, and noise components. This decomposition can help traders and investors better understand market dynamics, detect potential turning points, and make more informed decisions.
2. Economic indicators: Caterpillar SSA has been used to analyze and forecast economic indicators, such as GDP, inflation, and unemployment rates. By decomposing these time series, researchers can better understand the underlying factors driving economic fluctuations and develop more accurate forecasting models.
3. Portfolio optimization: By applying Caterpillar SSA to financial time series data, portfolio managers can better understand the relationships between different assets and make more informed decisions regarding asset allocation and risk management.
Application in the Indicator:
In the given indicator, Caterpillar SSA is applied to a financial time series (price data) to smooth the series and detect significant trends or turning points. The method is used to decompose the price data into a set number of components, which are then combined to generate a smoothed signal. This signal can help traders and investors identify potential entry and exit points for their trades.
The indicator applies the Caterpillar SSA method by first constructing the trajectory matrix using the price data, then computing the singular value decomposition (SVD) of the matrix, and finally reconstructing the time series using a selected number of components. The reconstructed series serves as a smoothed version of the original price data, highlighting significant trends and turning points. The indicator can be customized by adjusting the lag, number of computations, and number of components used in the reconstruction process. By fine-tuning these parameters, traders and investors can optimize the indicator to better match their specific trading style and risk tolerance.
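The Pine source is not reproduced in this description, but the pipeline it outlines (trajectory matrix, SVD, reconstruction from a few components) can be sketched generically; the Python below illustrates those SSA steps under that reading and is not the indicator's actual code.

```python
# Generic SSA sketch: embed the series, decompose with SVD, reconstruct from the first k components.
import numpy as np

def ssa_reconstruct(series: np.ndarray, lag: int, n_components: int) -> np.ndarray:
    n = len(series)
    k = n - lag + 1
    # 1) Trajectory (Hankel) matrix: each column is a lagged window of the series.
    X = np.column_stack([series[i:i + lag] for i in range(k)])
    # 2) Singular value decomposition.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # 3) Keep the leading components and rebuild the matrix.
    r = min(n_components, len(s))
    X_hat = (U[:, :r] * s[:r]) @ Vt[:r, :]
    # 4) Anti-diagonal averaging maps the matrix back to a 1-D smoothed series.
    rec = np.zeros(n)
    counts = np.zeros(n)
    for i in range(lag):
        for j in range(k):
            rec[i + j] += X_hat[i, j]
            counts[i + j] += 1
    return rec / counts

# Example: smooth a noisy random-walk 'price' with a lag of 30 and 2 components.
prices = np.cumsum(np.random.randn(500)) + 100
smoothed = ssa_reconstruct(prices, lag=30, n_components=2)
```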
Caterpillar SSA is versatile and can be applied to various types of financial instruments, such as stocks, bonds, commodities, and currencies. It can also be combined with other technical analysis tools or indicators to create a comprehensive trading system. For example, a trader might use Caterpillar SSA to identify the primary trend in a market and then employ additional indicators, such as moving averages or RSI, to confirm the trend and generate trading signals.
In summary, Caterpillar SSA is a powerful time series analysis technique that has found applications in AI and machine learning, as well as financial markets and economics. By decomposing a time series into interpretable components, Caterpillar SSA enables better understanding of the underlying structure of the data, facilitating forecasting, smoothing, and anomaly detection. In the context of financial trading, the technique is used to analyze price data, detect significant trends or turning points, and inform trading decisions.
█ Input Parameters
This indicator takes several inputs that affect its signal output. These inputs can be classified into three categories: Basic Settings, UI Options, and Computation Parameters.
Source: This input represents the source of price data, which is typically the closing price of an asset. The user can select other price data, such as opening price, high price, or low price. The selected price data is then utilized in the Caterpillar SSA calculation process.
Lag: The lag input determines the window size used for the time series decomposition. A higher lag value implies that the SSA algorithm will consider a longer range of historical data when extracting the underlying trend and components. This parameter is crucial, as it directly impacts the resulting smoothed series and the quality of extracted components.
Number of Computations: This input, denoted as 'ncomp,' specifies the number of eigencomponents to be considered in the reconstruction of the time series. A smaller value results in a smoother output signal, while a higher value retains more details in the series, potentially capturing short-term fluctuations.
SSA Period Normalization: This input is used to normalize the SSA period, which adjusts the significance of each eigencomponent to the overall signal. It helps in making the algorithm adaptive to different timeframes and market conditions.
Number of Bars: This input specifies the number of bars to be processed by the algorithm. It controls the range of data used for calculations and directly affects the computation time and the output signal.
Number of Bars to Render: This input sets the number of bars to be plotted on the chart. A higher value slows down the computation but provides a more comprehensive view of the indicator's performance over a longer period. This value controls how far back the indicator is rendered.
Color bars: This boolean input determines whether the bars should be colored according to the signal's direction. If set to true, the bars are colored using the defined colors, which visually indicate the trend direction.
Show signals: This boolean input controls the display of buy and sell signals on the chart. If set to true, the indicator plots shapes (triangles) to represent long and short trade signals.
Static Computation Parameters:
The indicator also includes several internal parameters that affect the Caterpillar SSA algorithm, such as Maxncomp, MaxLag, and MaxArrayLength. These parameters set the maximum allowed values for the number of computations, the lag, and the array length, ensuring that the calculations remain within reasonable limits and do not consume excessive computational resources.
█ A Note on Endpointed, Non-repainting Indicators
An endpointed indicator is one that does not recalculate or repaint its past values based on new incoming data. In other words, the indicator's previous signals remain the same even as new price data is added. This is an important feature because it ensures that the signals generated by the indicator are reliable and accurate, even after the fact.
When an indicator is non-repainting or endpointed, it means that the trader can have confidence in the signals being generated, knowing that they will not change as new data comes in. This allows traders to make informed decisions based on historical signals, without the fear of the signals being invalidated in the future.
In the case of the Endpointed SSA of Price, this non-repainting property is particularly valuable because it allows traders to identify trend changes and reversals with a high degree of accuracy, which can be used to inform trading decisions. This can be especially important in volatile markets where quick decisions need to be made.
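As a hedged illustration of the endpointing idea: the helper below evaluates any reconstruction routine (for example, the SSA sketch shown earlier) on a trailing window only and keeps just the final value for each bar, so earlier outputs can never change; the window length in the usage comment is an arbitrary example.

```python
# Endpointed evaluation: each bar's output uses only data up to that bar,
# so previously computed values never repaint.
from typing import Callable
import numpy as np

def endpointed(series: np.ndarray, window: int,
               reconstruct: Callable[[np.ndarray], np.ndarray]) -> np.ndarray:
    out = np.full(len(series), np.nan)
    for t in range(window - 1, len(series)):
        chunk = series[t - window + 1 : t + 1]   # trailing window ending at bar t
        out[t] = reconstruct(chunk)[-1]          # keep only the endpoint value
    return out

# Usage with the SSA sketch above (parameters are illustrative):
# smoothed = endpointed(prices, window=200,
#                       reconstruct=lambda s: ssa_reconstruct(s, lag=30, n_components=2))
```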
Intelligent Exponential Moving Average Private Access
View the full documentation on this indicator here: www.kenzing.com
Note: This indicator is now intended for those who have been granted private access and may be more frequently updated than the previous versions.
Introduction
This indicator uses machine learning (Artificial Intelligence) to solve a real human problem.
The Exponential Moving Average (EMA) is one of the most widely used indicators on the planet, yet no one really knows which pair of exponential moving average lengths works best in combination.
One reason is that no pair of EMA lengths will always be the best on every instrument, timeframe, and point in time.
The "Intelligent Exponential Moving Average" solves this problem by adapting the period lengths to match the most profitable combination of exponential moving averages in real time.
How does the Intelligent Exponential Moving Average work?
The artificial intelligence that operates these moving average lengths was created by an algorithm that tests every single combination across the entire chart history of an instrument for maximum profitability in real-time.
No matter what happens, the combination of these exponential moving averages will be the most profitable.
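The publication does not disclose how profitability is scored, so the following is only a minimal sketch of the brute-force idea, assuming a simple long/flat EMA-crossover backtest as the profitability measure and the stated 5-40 length range:

```python
# Hypothetical sketch: score every (fast, slow) EMA pair in the 5-40 range with a
# simple long/flat crossover backtest and keep the most profitable pair.
import numpy as np
import pandas as pd

def crossover_growth(close: pd.Series, fast: int, slow: int) -> float:
    ema_fast = close.ewm(span=fast, adjust=False).mean()
    ema_slow = close.ewm(span=slow, adjust=False).mean()
    position = (ema_fast > ema_slow).astype(int).shift(1).fillna(0)  # long when fast is above slow
    returns = close.pct_change().fillna(0)
    return float((1 + position * returns).prod())                    # total growth factor

def best_ema_pair(close: pd.Series, lo: int = 5, hi: int = 40) -> tuple:
    best, best_score = (lo, lo + 1), -np.inf
    for fast in range(lo, hi):
        for slow in range(fast + 1, hi + 1):
            score = crossover_growth(close, fast, slow)
            if score > best_score:
                best, best_score = (fast, slow), score
    return best
```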
Can we learn from the Intelligent Moving Average?
There are many lessons to be learned from the Intelligent EMA. Most will come with time, as it is still a new concept. Adopting this AI will change how we perceive the way moving averages work.
Limitations
Ultimately, there are no limiting factors within the range of combinations that has been programmed. The exponential moving averages will operate normally, but may change lengths in unexpected ways - maybe it knows something we don't?
Thresholds
The range of exponential moving average lengths is between 5 and 40.
Additional coverage resulted in TradingView server errors.
Future Updates!
Soon, I will be publishing tools to test the AI and visualise what moving average combination the AI is currently using.
Follow and like for more content!
RSI or MACD + Tendance Kijun LT
This script is an update of my previous script "RSI + Tendance Kijun LT". I will not explain that one here; if needed, have a look at it:
I made a new script (rather than updating the previous one) because some people may not be interested in MACD, and for performance reasons they may prefer to have only RSI (since you cannot display both at once, only switch between RSI and MACD).
So now you can choose to display MACD instead of RSI, with the long-term trend based on the Kijun still showing. Why am I adding MACD even though most of the time I never use it? It's for Elliott Wave purposes, and principally for triangles. With MACD, you can easily identify whether a triangle is forming from an Elliott Wave perspective (I am not speaking about chartist triangles).
As an example, on ETHUSD in the daily timeframe you can see something that looks like a triangle. We can trade it in many ways (breakout, support/resistance), but I am interested in identifying whether it is a triangle with an EW count (not chartist), and if so I will consider different scenarios (triangles are most of the time a wave 4, so we could have one more leg down on this ticker).
So, in my day trading I still always use RSI, except when I want to verify whether we have a triangle :) and then I need to switch to MACD to check the following things:
- am I able to draw a triangle matching the price action
- can I join A to C and B to D and still have a triangle on MACD
- if yes, I will watch point E, because it is the start of the 5th wave (point E may be a truncated wave of the triangle and not reach the line through points A and C)
By adding this to my strategy, I can anticipate different scenarios and invalidate them if I do not get the triangle on MACD (a point D on MACD that does not respect the triangle form). Don't forget, we can decide not to trade a triangle as an EW count but still trade it as a chartist pattern (breakout or anything else).
In summary, for the next days/weeks on ETHUSD in the daily timeframe I will wait and see whether the price rises to point D while being validated on the MACD. If so, I will then watch for the possible formation of point E around $141 and $156; if I get short signals at those prices, it will be interesting to re-enter a short position.
--------------------------------------------------------
This script is an update of my previous script "RSI + Tendance Kijun LT". I will not explain that one here; if needed, have a look at it:
I made a new script (without forcing an update of the previous one) because some people may not be interested in MACD, and for performance reasons they may prefer to have only RSI (since you cannot display both at once, only switch between RSI and MACD).
So now you can choose to display MACD instead of RSI, with the long-term trend based on the Kijun still showing. Why am I adding MACD even though most of the time I never use it? It's for Elliott Wave purposes, and above all for triangles. With MACD, you can easily identify whether a triangle is forming from an Elliott Wave perspective (I am not speaking about chartist triangles).
For example, on ETHUSD in the daily timeframe you can see something that looks like a triangle. We can trade it in many different ways (trendline breakout, support/resistance), but I am interested in identifying whether it is a triangle with an EW count (not chartist), and if so I will consider a different scenario (knowing that triangles are most of the time a wave 4, and thus expecting one last bearish push, a wave 5, on ETHUSD at the exit).
So, in my day trading I still always use RSI, except when I want to verify whether we have a triangle :) and then I need to switch to MACD to check the following things:
- am I able to draw a triangle matching what the price is drawing
- can I join A to C and B to D and still have a triangle on MACD?
- if yes, I watch point E, because it is the start of the 5th wave (point E may be a truncated wave of the triangle and not reach the line through points A and C).
By adding this to my strategy, I can anticipate different scenarios and invalidate them if I do not get the triangle on MACD (a point D on MACD that does not respect the triangle form). Don't forget, we can decide not to trade it as a triangle from an EW count but simply trade it as a chartist triangle (breakout or anything else).
In summary, for the next days/weeks on ETHUSD in the daily timeframe I will wait and see whether the price rises to point D while being validated on the MACD. If so, I will then watch for the possible formation of point E around $141 and $156; if I get selling signals at those prices, it will be interesting to enter a short position.
AlgoFlex Scalper Pro
⚡ AlgoFlex Scalper Pro — AI-Powered Precision Scalping
Take your scalping to the next level with AlgoFlex Scalper Pro, our most advanced buy/sell signal engine.
Built for traders who demand speed, accuracy, and clear entries/exits, this algorithm uses AI-driven logic to identify high-probability scalp trades in real time.
🚀 Key Features
AI-Enhanced Signals – Smarter trade detection for better timing and fewer false signals.
Built-In Take Profit & Stop Loss Levels – Instantly see recommended TP/SL for each trade.
Ultra-Fast Scalping Logic – Optimized for low-timeframe charts (1m, 5m, 15m).
Multi-Asset Ready – Works on Forex, Indices, Crypto, Commodities, and Stocks.
Adaptive Filtering – Adjusts to changing volatility and market conditions.
💡 How to Use
Add the script to your chart.
Select your market and preferred scalping timeframe.
Follow the Buy/Sell signals and TP/SL levels for trade ideas.
(Optional) Set alerts for instant trade notifications.
📲 Get Access
Available exclusively to AlgoFlex Premium Subscribers.
Download the AlgoFlex app from the App Store or Google Play.
Or visit 🌐 algoflex.org.
⚠ Disclaimer
All information is for educational purposes only. Always use proper risk management and test before trading live.
Target Scanner
This invite-only indicator implements an advanced Wolfe Wave pattern recognition system specifically designed for Borsa Istanbul (BIST) stock screening across multiple timeframes and mathematical ratio calculations.
**Core Technical Framework:**
The indicator employs sophisticated mathematical calculations across 10 distinct timeframes (377, 233, 144, 89, 55, 34, 21, 13, 8, 5 periods) using Elliott Wave ratio theory combined with algorithmic pattern detection. Unlike standard scanning tools that rely on basic technical indicators, this system uses quantitative Wolfe Wave analysis to identify precise entry and exit points across 560+ BIST stocks simultaneously.
**Key Features:**
• **Multi-Stock Scanning:** Simultaneously analyzes 40 stocks per list across 14 different BIST stock lists (560+ total stocks)
• **Advanced Pattern Detection:** Implements Wolfe Wave mathematical validation using 24 different ratio calculation methods including Fibonacci sequences, Elliott Wave ratios, Golden Ratio, Harmonic Patterns, Pi-based calculations, volatility-based dynamic ratios, and AI-optimized mathematical progressions
• **Real-Time Screening Table:** Displays active signals with current price, signal price, target price, expected profit percentage, and calculated stop-loss levels
• **Reliability Scoring System:** EPA (Entry Point Accuracy) and ETA (Exit Target Accuracy) scoring with historical performance tracking
• **Visual Signal Display:** Comprehensive signal boxes showing profit zones, stop-loss areas, entry levels, and estimated time to target completion
**Mathematical Implementation:**
The core algorithm calculates price relationships using configurable mathematical ratios. For bullish conditions, it identifies entry points when price action meets specific criteria:
- Point validation through ratio analysis between swing highs/lows across multiple timeframes
- Mathematical confirmation using (pv - pf) / (pv - pd) ratio calculations (see the sketch after this list)
- Confluence validation across timeframes with dynamic ratio adjustments
- Minimum profit threshold filtering to ensure signal quality
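The exact definitions of pv, pf, and pd are not published, so the sketch below is only a hypothetical illustration of a ratio check of this shape; the target ratio (1.618) and tolerance are assumptions, not the script's actual values.

```python
# Hypothetical sketch of a (pv - pf) / (pv - pd) style validation.
# pv, pf and pd stand for swing-point prices named in the description;
# their exact definitions in the script are not published.
def ratio_ok(pv: float, pf: float, pd_: float,
             target: float = 1.618, tol: float = 0.05) -> bool:
    denom = pv - pd_
    if denom == 0:
        return False
    ratio = (pv - pf) / denom
    return abs(ratio - target) <= tol * target   # within tolerance of the target ratio

# Example with made-up pivot prices:
print(ratio_ok(pv=105.0, pf=97.0, pd_=100.0))    # True for these values
```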
**Originality and Innovation:**
This implementation differs significantly from traditional scanning tools through several key innovations:
1. **Multi-Timeframe Wolfe Wave Detection:** Simultaneous pattern recognition across 10 timeframes rather than single-timeframe analysis
2. **Adaptive Ratio Systems:** 24 different mathematical calculation methods including volatility-based, time-based, momentum-based, and volume-weighted ratio adjustments
3. **BIST-Specific Optimization:** Tailored specifically for Turkish stock market characteristics with 14 pre-configured stock lists
4. **Institutional-Grade Visualization:** Advanced signal boxes with profit/loss zones, multiple entry levels, and time-based target estimation
5. **Real-Time Performance Tracking:** Dynamic EPA/ETA scoring system that tracks historical accuracy and adapts calculations
**Signal Generation Logic:**
The system generates signals when multiple mathematical conditions align:
- Wolfe Wave pattern completion across specified timeframes
- Ratio validation using selected mathematical progression (Fibonacci, Golden Ratio, Elliott Wave, etc.)
- Stop-loss calculation as percentage of target profit (default 0.5%)
- Minimum profit threshold compliance
- Multi-timeframe confluence confirmation
**Risk Management Features:**
• **Configurable Stop-Loss:** Calculated as percentage of target profit with recommended 0.3 setting for 1:3 risk-reward ratio
• **Profit Percentage Display:** Real-time calculation showing expected profit from signal price to target
• **Multiple Entry Levels:** EPA and ETA-based entry points with reliability scoring
• **Time Estimation:** Statistical analysis providing estimated bars/time to target completion
• **Visual Risk Zones:** Color-coded profit (green) and loss (red) areas for clear risk visualization
**Performance Characteristics:**
The indicator is optimized for active screening with frequent signal generation across multiple stocks. It provides both short-term and medium-term opportunities depending on the timeframe producing the signal. The system maintains historical statistics for signal accuracy and target completion timing.
**Technical Requirements:**
Requires understanding of Wolfe Wave pattern theory, Elliott Wave principles, and multi-timeframe analysis concepts. Users should be familiar with BIST market structure and Turkish stock trading mechanics. The indicator demands active monitoring due to the high-frequency nature of multi-stock scanning.
**Market Application:**
Specifically designed for Borsa Istanbul stocks with comprehensive coverage across major sectors. Works effectively in both trending and ranging market conditions due to its adaptive ratio selection and multi-timeframe approach. Best suited for traders focusing on Turkish equity markets with pattern-based strategies.
**Customization Options:**
• **14 Stock Lists:** Pre-configured BIST stock groups for sector-specific analysis
• **24 Ratio Methods:** From conservative Fibonacci to aggressive AI-optimized calculations
• **Quote Pair Integration:** Optional currency pair specification for international analysis
• **Timeframe Flexibility:** Customizable chart timeframe for signal generation
• **Table Positioning:** Multiple display options with size and color customization
• **Alert Integration:** Comprehensive alert system for real-time signal notifications
Crypto Narratives Performance [SwissAlgo]
Crypto Narratives Performance Index
--------------------------------------------------------
What this indicator is
This script displays a relative performance index that compares the market capitalization trends of various crypto categories (narratives) against a selected 'Base asset' (BTC, ETH, or SOL) over a configurable rolling time window (default: 14-day).
It’s designed to help users observe sector rotation dynamics across the crypto ecosystem — such as whether DeFi is outperforming ETH, or if AI coins are underperforming relative to BTC.
--------------------------------------------------------
What it does
This indicator measures the percentage change in total market cap of a selected crypto sector over a user-defined lookback period, and compares it to the percentage change in market cap of a chosen base asset over the same period. The result is expressed as a ratio and transformed into a z-score, normalized over the last 180 bars. This allows the user to easily identify whether the sector is outperforming or underperforming the base asset in relative terms.
It also includes a smoothed signal line, a performance table, and marked background zones (levels of standard deviations) to help interpret potential extremes in sector outperformance or underperformance.
--------------------------------------------------------
How it works
It retrieves daily market capitalization data for both the selected base asset and sector from TradingView's CRYPTOCAP: data feed.
It computes the percent change in $ market cap over one of the following selectable periods: 1, 3, 7, 14, 30, or 60 days (14-day is the default).
The percentage change of the base is subtracted from the percentage change of the sector, producing a raw relative performance differential.
This differential is then normalized into a Z-Score, using a 180-day rolling mean and standard deviation.
The Z-Score is smoothed using an exponential moving average (EMA), and plotted against a secondary EMA signal line (to track potential performance trend changes).
A visual table compares the performance of all listed sectors against the selected base, ranked and annotated with basic symbols (stars for performance, alerts for underperformance vs. the selected 'Base Asset', i.e. BTC or ETH or SOL).
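A rough Python sketch of that pipeline follows (the script itself runs on TradingView's CRYPTOCAP feeds; the smoothing lengths used below are assumptions):

```python
# Illustrative sketch of the relative-performance pipeline:
# pct change of sector cap minus pct change of base cap -> 180-bar z-score -> EMA smoothing.
import pandas as pd

def narrative_zscore(sector_cap: pd.Series, base_cap: pd.Series,
                     lookback: int = 14, norm_window: int = 180,
                     ema_len: int = 5) -> pd.DataFrame:
    diff = sector_cap.pct_change(lookback) - base_cap.pct_change(lookback)
    mean = diff.rolling(norm_window).mean()
    std = diff.rolling(norm_window).std()
    z = (diff - mean) / std                                       # normalized differential
    z_smooth = z.ewm(span=ema_len, adjust=False).mean()           # plotted line
    signal = z_smooth.ewm(span=ema_len * 2, adjust=False).mean()  # secondary signal line
    return pd.DataFrame({"z": z_smooth, "signal": signal})
```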
--------------------------------------------------------
Visual Features
* Color-coded plot line: Turns green, yellow, orange, or red based on zone and momentum.
* Signal line: Gray EMA of the z-score for trend comparison.
* Background fill zones:
±3 = "Extreme" outperform/underperform
±2 to ±3 = "Strong" zone
±1 to ±2 = Mild over/underperformance
±1 to -1 = Neutral performance range
* Dynamic Table:
Displays all sector vs. base performance differences.
Highlights the selected comparison sector.
Uses emojis (⭐/⚠️) for relative status at a glance.
--------------------------------------------------------
Who may benefit
This script may assist:
Crypto analysts tracking capital rotation across narratives.
Swing traders looking to spot momentum trends in crypto sectors.
Portfolio allocators observing which groups are leading or lagging relative to majors (BTC, ETH, SOL).
Developers or researchers evaluating sentiment shifts across categories (e.g., AI tokens rising vs. DeFi).
It is not a buy/sell signal tool; it is a monitor of relative performance across crypto sectors/narratives.
--------------------------------------------------------
Key Applications
Detect sector rotation (e.g., when Layer 1s start to outperform ETH, or BTC/SOL).
Monitor if certain categories are experiencing sustained interest or fading momentum.
Compare the strength of emerging narratives like DePIN, RWA, or World Liberty vs. majors.
Identify possible "mean-reversion" setups when a sector is excessively stretched relative to its historical norm.
--------------------------------------------------------
Limitations
Data dependency: All calculations rely on TradingView’s CRYPTOCAP: market cap feeds.
Normalization window: The z-score normalization is static at 180 bars; in choppy markets this may over-smooth or underreact.
Asset inclusion: The sectors reflect predefined index aggregates. Not all coins in a category may be equally weighted or relevant.
Lag: EMA smoothing introduces delay in reactive plotting.
No intra-day support: Works best on daily timeframes, as CRYPTOCAP: feeds are daily-only.
Not predictive: This script reflects past capital flows. It does not forecast future price moves.
--------------------------------------------------------
Customization
Users can adjust the following:
Base asset: BTC, ETH, SOL
Crypto sector (comparison): Choose from 11+ sectors, including DeFi, AI, Memes, Layer 1, etc.
Rolling performance period: Choose between 1–60 days.
Smoothing settings: Length of the EMA for the ratio and signal line.
Show/hide info table: Useful for screen space management.
Special Notes:
Please set the chart timeframe at 1-day in line with CRYPTOCAP data availability.
Please select the dark color scheme to view table and colors properly.
--------------------------------------------------------
Risk Disclaimer
This indicator is for informational and educational purposes only. It does not constitute financial advice, trading advice, or an invitation to engage in any financial strategy. Always conduct your own due diligence before making investment decisions. Use at your own risk.
Market conditions may shift rapidly, and past sector performance is not necessarily indicative of future outcomes. This tool is best used as part of a broader analytical framework, not in isolation.
Protected script: source code is hidden to preserve logic integrity and prevent tampering.
If you need clarification or encounter unexpected behavior with data feeds, please check the TradingView Help Center or post in the "Indicators and Strategies" section of the TradingView community.
CyberCandle SwiftEdge
Overview
CyberCandle SwiftEdge is a cutting-edge, AI-inspired trading indicator designed for traders seeking precision and clarity in trend-following and swing trading. Powered by SwiftEdge, it combines Heikin Ashi candles, a gradient-colored Exponential Moving Average (EMA), and a Relative Strength Index (RSI) to deliver clear buy and sell signals. Featuring glowing visuals, dynamic signal icons, and a customizable RSI dashboard in the top-right corner, this script offers a futuristic interface for identifying high-probability trade setups on various timeframes (e.g., 1H, 4H).
What It Does
CyberCandle SwiftEdge integrates three powerful components to generate actionable trading signals:
Heikin Ashi Candles: Smooths price action to highlight trends, reducing market noise and making reversals easier to spot.
Gradient EMA: A 100-period EMA with dynamic color transitions (blue/cyan for uptrends, red/pink for downtrends) to confirm market direction.
RSI Dashboard: A neon-lit display showing RSI levels, indicating overbought (>70), oversold (<30), or neutral (30-70) conditions.
Buy and sell signals are marked with prominent, glowing icons (triangles and arrows) based on trend direction, momentum, and specific Heikin Ashi patterns. The script’s customizable parameters allow traders to tailor the strategy to their preferences, balancing signal frequency and precision.
How It Works
The strategy leverages the synergy of Heikin Ashi, EMA, and RSI to filter trades and highlight opportunities:
Trend Direction: The price must be above the EMA for buy signals (bullish trend) or below for sell signals (bearish trend). The EMA’s gradient color shifts based on its slope, visually reinforcing trend strength.
Momentum Confirmation: RSI must exceed a user-defined threshold (default: 50) for buy signals or fall below it for sell signals, ensuring momentum supports the trade.
Candle Patterns: Buy signals require a green Heikin Ashi candle (close > open), with the two prior candles having minimal upper wicks (≤5% of candle body) and being red (indicating a retracement). Sell signals require a red candle, minimal lower wicks, and two prior green candles.
RSI Dashboard: Positioned in the top-right corner, it features a glowing circle (red for overbought, green for oversold, blue for neutral), the current RSI value, and a status indicator (triangle for extremes, square for neutral). This provides instant momentum insights without cluttering the chart.
By combining Heikin Ashi’s trend clarity, EMA’s directional filter, and RSI’s momentum validation, CyberCandle SwiftEdge minimizes false signals and highlights trades with strong potential. Its vibrant, AI-like visuals make it easy to interpret at a glance.
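A rough sketch of the buy-side conditions described above, written in pandas rather than Pine; the Heikin Ashi seeding and the wick-tolerance arithmetic are assumptions based on this description, not the script's actual code.

```python
# Rough sketch of the buy conditions: price above EMA, RSI above threshold,
# green Heikin Ashi candle after two red candles with small upper wicks.
import pandas as pd

def rsi(close: pd.Series, length: int = 14) -> pd.Series:
    delta = close.diff()
    gain = delta.clip(lower=0).ewm(alpha=1 / length, adjust=False).mean()
    loss = (-delta.clip(upper=0)).ewm(alpha=1 / length, adjust=False).mean()
    return 100 - 100 / (1 + gain / loss)

def buy_signal(df: pd.DataFrame, ema_len: int = 100, rsi_th: float = 50,
               wick_tol: float = 0.05) -> pd.Series:
    # Heikin Ashi candles recomputed from regular OHLC (simple seeding assumed).
    ha_close = (df["open"] + df["high"] + df["low"] + df["close"]) / 4
    ha_open = ha_close.copy()
    for i in range(1, len(df)):
        ha_open.iloc[i] = (ha_open.iloc[i - 1] + ha_close.iloc[i - 1]) / 2
    ha_high = pd.concat([df["high"], ha_open, ha_close], axis=1).max(axis=1)

    ema = df["close"].ewm(span=ema_len, adjust=False).mean()
    body = (ha_close - ha_open).abs()
    upper_wick = ha_high - pd.concat([ha_open, ha_close], axis=1).max(axis=1)

    green = ha_close > ha_open
    retrace = (ha_close < ha_open) & (upper_wick <= wick_tol * body)  # red, small upper wick
    prior_retrace = retrace.shift(1, fill_value=False) & retrace.shift(2, fill_value=False)

    return (df["close"] > ema) & (rsi(df["close"]) > rsi_th) & green & prior_retrace
```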
How to Use It
Add to Chart: In TradingView, search for "CyberCandle SwiftEdge" and add it to your chart. Set the chart to Heikin Ashi candles for optimal compatibility.
Interpret Signals:
Buy Signal: Large green triangles and arrows appear below candles when the price is above the EMA, RSI is above the buy threshold (default: 50), and conditions for a bullish retracement are met. Consider entering a long position with a 1:2 risk/reward ratio.
Sell Signal: Large red triangles and arrows appear above candles when the price is below the EMA, RSI is below the sell threshold (default: 50), and conditions for a bearish retracement are met. Consider entering a short position.
RSI Dashboard: Monitor the top-right dashboard. A red circle (RSI > 70) suggests caution for buys, a green circle (RSI < 30) indicates potential buying opportunities, and a blue circle (RSI 30-70) signals neutrality.
Customize Parameters: Open the indicator’s settings to adjust:
EMA Length (default: 100): Increase (e.g., 200) for longer-term trends or decrease (e.g., 50) for shorter-term sensitivity.
RSI Length (default: 14): Adjust for more (e.g., 7) or less (e.g., 21) responsive momentum signals.
RSI Buy/Sell Thresholds (default: 50): Set higher (e.g., 55) for buys or lower (e.g., 45) for sells to require stronger momentum.
Wick Tolerance (default: 0.05): Increase (e.g., 0.1) to allow larger wicks, generating more signals, or decrease (e.g., 0.02) for stricter conditions.
Require Retracement (default: true): Disable to remove the two-candle retracement requirement, increasing signal frequency.
Trading: Use signals in conjunction with the RSI dashboard and market context. For example, avoid buy signals if the RSI dashboard is red (overbought). Always apply proper risk management, such as setting stop-losses based on recent lows/highs.
What Makes It Original
CyberCandle SwiftEdge stands out due to its futuristic, AI-inspired visual design and user-friendly customization:
Neon Aesthetics: Glowing Heikin Ashi candles, gradient EMA, and dynamic signal icons (triangles and arrows) with RSI-driven transparency create a high-tech, immersive experience.
RSI Dashboard: A compact, top-right display with a neon circle, RSI value, and adaptive status indicator (triangle/square) provides instant momentum insights without cluttering the chart.
Customizability: Users can fine-tune EMA length, RSI parameters, wick tolerance, and retracement requirements via TradingView’s settings, balancing signal frequency and precision.
Integrated Approach: The synergy of Heikin Ashi’s trend clarity, EMA’s directional strength, and RSI’s momentum validation offers a cohesive strategy that reduces false signals.
Why This Combination?
The script combines Heikin Ashi, EMA, and RSI for a complementary effect:
Heikin Ashi smooths price fluctuations, making it ideal for identifying sustained trends and retracements, which are critical for the strategy’s signal logic.
EMA provides a reliable trend filter, ensuring signals align with the broader market direction. Its gradient color enhances visual trend recognition.
RSI adds momentum context, confirming that signals occur during favorable conditions (e.g., RSI > 50 for buys). The dashboard makes RSI intuitive, even for non-technical users.
Together, these components create a balanced system that captures trend reversals after retracements, validated by momentum, with a visually engaging interface that simplifies decision-making.
Tips
Best used on volatile assets (e.g., BTC/USD, EUR/USD) and higher timeframes (1H, 4H) for clearer trends.
Experiment with parameters in the settings to match your trading style (e.g., increase wick tolerance for more signals).
Combine with other analysis (e.g., support/resistance) for higher-confidence trades.
Note
This indicator is for informational purposes and does not guarantee profits. Always backtest and use proper risk management before trading.
ICT Swiftedge
# ICT SwiftEdge: Advanced Market Structure Trading System
**Overview**
ICT SwiftEdge is a powerful trading system built upon the foundation of ICTProTools' ICT Breakers, licensed under the Mozilla Public License 2.0 (mozilla.org). This script has been significantly enhanced to combine market structure analysis with modern technical indicators and a sleek, AI-inspired statistics dashboard. The goal is to provide traders with a comprehensive tool for identifying high-probability trade setups, managing exits, and tracking performance in a visually intuitive way.
**Credits**
This script is a derivative work based on the original "ICT Breakers" by ICTProTools, used with permission under the Mozilla Public License 2.0. Significant enhancements, including RSI-MA signals, trend filtering, dynamic timeframe adjustments, dual exit strategies, and an AI-style statistics dashboard, were developed by . We express our gratitude to ICTProTools for their foundational work in market structure analysis.
**What It Does**
ICT SwiftEdge integrates multiple trading concepts to help traders identify and manage trades based on market structure and momentum:
- **Market Structure Analysis**: Identifies Break of Structure (BOS) and Market Structure Shift (MSS) patterns, which signal potential trend continuations or reversals. BOS indicates a continuation of the current trend, while MSS highlights a shift in market direction, providing key entry points.
- **RSI-MA Signals**: Generates "BUY" and "SELL" signals when BOS or MSS patterns align with the Relative Strength Index (RSI) smoothed by a Moving Average (RSI-MA). Signals are filtered to occur only when RSI-MA is above 50 (for buys) or below 50 (for sells), ensuring momentum supports the trade direction (see the sketch after this list).
- **Trend Filtering**: Prevents multiple signals in the same trend, ensuring only one buy or sell signal per trend direction, reducing noise and improving trade clarity.
- **Dynamic Timeframe Adjustment**: Automatically adjusts pivot points, RSI, and MA parameters based on the selected chart timeframe (1M to 1D), optimizing performance across different market conditions.
- **Flexible Exit Strategies**: Offers two user-selectable exit methods:
- **Trailing Stop-Loss (TSL)**: Exits trades when price moves against the position by a user-defined distance (in points), locking in profits or limiting losses.
- **RSI-MA Exit**: Exits trades when RSI-MA crosses the 50 level, signaling a potential loss of momentum.
- Users can enable either or both strategies, providing flexibility to adapt to different trading styles.
- **AI-Style Statistics Dashboard**: Displays real-time trade performance metrics in a futuristic, neon-colored interface, including total trades, wins, losses, win/loss ratio, and win percentage. This helps traders evaluate the system's effectiveness without external tools.
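As referenced in the RSI-MA bullet above, here is a condensed sketch of that momentum filter and the RSI-MA exit in pandas; the RSI and MA lengths are assumptions, and BOS/MSS structure detection itself is omitted.

```python
# Condensed sketch of the RSI-MA filter and exits (pandas, not the script's Pine code).
import pandas as pd

def rsi_ma(close: pd.Series, rsi_len: int = 14, ma_len: int = 9) -> pd.Series:
    delta = close.diff()
    gain = delta.clip(lower=0).ewm(alpha=1 / rsi_len, adjust=False).mean()
    loss = (-delta.clip(upper=0)).ewm(alpha=1 / rsi_len, adjust=False).mean()
    rsi = 100 - 100 / (1 + gain / loss)
    return rsi.rolling(ma_len).mean()                         # RSI smoothed by a moving average

def filter_signals(bos_buy: pd.Series, bos_sell: pd.Series, close: pd.Series) -> pd.DataFrame:
    momentum = rsi_ma(close)
    buy = bos_buy & (momentum > 50)                           # structure break + bullish momentum
    sell = bos_sell & (momentum < 50)                         # structure break + bearish momentum
    exit_long = (momentum < 50) & (momentum.shift(1) >= 50)   # RSI-MA crosses below 50
    exit_short = (momentum > 50) & (momentum.shift(1) <= 50)  # RSI-MA crosses above 50
    return pd.DataFrame({"buy": buy, "sell": sell,
                         "exit_long": exit_long, "exit_short": exit_short})
```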
**Why This Combination?**
The integration of these components creates a synergistic trading system:
- **BOS/MSS and RSI-MA**: Combining market structure breaks with RSI-MA ensures entries are based on both price action (structure) and momentum (RSI-MA), increasing the likelihood of high-probability trades.
- **Trend Filtering**: By limiting signals to one per trend, the system avoids overtrading and focuses on significant market moves.
- **Dynamic Adjustments**: Timeframe-specific parameters make the system versatile, suitable for scalping (1M, 5M) or swing trading (4H, 1D).
- **Dual Exit Strategies**: TSL protects profits during trending markets, while RSI-MA exits are ideal for range-bound or reversing markets, catering to diverse market conditions.
- **Statistics Dashboard**: Provides immediate feedback on trade performance, enabling data-driven decision-making without manual tracking.
This combination balances technical precision with user-friendly visuals, making it accessible to both novice and experienced traders.
**How to Use**
1. **Add to Chart**: Apply the script to any TradingView chart.
2. **Configure Settings**:
- **Chart Timeframe**: Select your chart's timeframe (1M to 1D) to optimize parameters.
- **Structure Timeframe**: Choose a timeframe for market structure analysis (leave blank for chart timeframe).
- **Exit Strategy**: Enable Trailing Stop-Loss (`useTslExit`), RSI-MA Exit (`useRsiMaExit`), or both. Adjust `tslPoints` for TSL distance.
- **Show Signals/Labels**: Toggle `showSignals` and `showExit` to display "BUY", "SELL", and "EXIT" labels.
- **Dashboard**: Enable `showDashboard` to view trade statistics. Customize colors with `dashboardBgColor` and `dashboardTextColor`.
3. **Trading**:
- Look for "BUY" or "SELL" labels to enter trades when BOS/MSS aligns with RSI-MA.
- Exit trades at "EXIT" labels based on your chosen strategy.
- Monitor the statistics dashboard to track performance (total trades, win/loss ratio, win percentage).
4. **Alerts**: Set up alerts for BOS, MSS, buy, sell, or exit signals using the provided alert conditions.
**License**
This script is licensed under the Mozilla Public License 2.0 (mozilla.org). The source code is available for review and modification under the terms of this license.
**Compliance with TradingView House Rules**
This publication adheres to TradingView's House Rules and Scripts Publication Rules. It provides a clear, self-contained description of the script's functionality, credits the original author (ICTProTools), and explains the rationale for combining indicators. The script contains no promotional content, offensive language, or proprietary restrictions beyond MPL 2.0.
**Note**
Trading involves risk, and past performance is not indicative of future results. Always backtest and validate the system on your preferred markets and timeframes before live trading.
Enjoy trading with ICT SwiftEdge, and let data-driven insights guide your decisions!
VWAP + EMA Retracement Indicator SwiftEdge
VWAP + EMA Retracement Indicator
Overview
The VWAP + EMA Retracement Indicator is a powerful and visually engaging tool designed to help traders identify high-probability buy and sell opportunities in trending markets. By combining the Volume Weighted Average Price (VWAP) with two Exponential Moving Averages (EMAs) and a unique retracement-based signal logic, this indicator pinpoints moments when the price pulls back to a key zone before resuming its trend. Its modern, AI-inspired visuals and customizable features make it both intuitive and adaptable for traders of all levels.
What It Does
This indicator generates buy and sell signals based on a sophisticated yet straightforward strategy:
Buy Signals: Triggered when the price is above VWAP, has recently retraced to the zone between two EMAs (default 12 and 21 periods), and a strong bullish candle closes above both EMAs.
Sell Signals: Triggered when the price is below VWAP, has retraced to the EMA zone, and a strong bearish candle closes below both EMAs.
Signal Filtering: A customizable cooldown period ensures that only the first signal in a sequence is shown, reducing noise while preserving opportunities for new trends.
Confidence Scores: Each signal includes an AI-inspired confidence score (0-100%), calculated from candle strength and price distance to VWAP, helping traders gauge signal reliability.
The indicator’s visuals enhance decision-making with dynamic gradient lines, a highlighted retracement zone, and clear signal labels, all customizable to suit your preferences.
How It Works
The indicator integrates several components that work together to create a cohesive trading tool:
VWAP: Acts as a dynamic support/resistance level, reflecting the average price weighted by volume. It filters signals to ensure buys occur in uptrends (price above VWAP) and sells in downtrends (price below VWAP).
Dual EMAs: Two EMAs (default 12 and 21 periods) define a retracement zone where the price is likely to consolidate before continuing its trend. Signals are generated only after the price exits this zone with conviction.
Retracement Logic: The indicator looks for price pullbacks to the EMA zone within a user-defined lookback window (default 5 candles), ensuring signals align with trend continuation patterns.
Candle Strength: Signals require strong candles (bullish for buys, bearish for sells) with a minimum body size based on the Average True Range (ATR), filtering out weak or indecisive moves.
Cooldown Mechanism: A unique feature that prevents signal clutter by allowing only the first signal within a user-defined period (default 3 candles), balancing responsiveness with clarity.
Confidence Score: Combines candle body size and price distance to VWAP to assign a score, giving traders an at-a-glance measure of signal strength without needing external analysis.
These components are carefully combined to capture high-probability setups while minimizing false signals, making the indicator suitable for both short-term and swing trading.
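The exact confidence formula is not published; the sketch below shows one plausible way to combine candle strength and VWAP distance into a 0-100 score (equal weighting and a simplified, non-session-anchored VWAP are assumptions).

```python
# Hypothetical sketch of a 0-100% confidence score built from candle strength
# (body size relative to ATR) and the distance of price from VWAP.
import pandas as pd

def confidence_score(df: pd.DataFrame, atr_len: int = 14) -> pd.Series:
    # Average True Range, used to normalise both components.
    prev_close = df["close"].shift(1)
    tr = pd.concat([df["high"] - df["low"],
                    (df["high"] - prev_close).abs(),
                    (df["low"] - prev_close).abs()], axis=1).max(axis=1)
    atr = tr.rolling(atr_len).mean()

    # Simplified VWAP (cumulative over the whole series rather than session-anchored).
    typical = (df["high"] + df["low"] + df["close"]) / 3
    vwap = (typical * df["volume"]).cumsum() / df["volume"].cumsum()

    body_strength = ((df["close"] - df["open"]).abs() / atr).clip(0, 1)
    vwap_distance = ((df["close"] - vwap).abs() / atr).clip(0, 1)
    return 50 * body_strength + 50 * vwap_distance            # 0-100 scale
```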
How to Use It
Add to Chart: Apply the indicator to a 15-minute chart (recommended) or your preferred timeframe.
Customize Settings:
VWAP Source: Choose the price source (default: hlc3).
EMA Periods: Adjust the fast and slow EMA periods (default: 12 and 21).
Retracement Window: Set how many candles to look back for retracement (default: 5).
ATR Period & Body Size: Define candle strength requirements (default: 14 ATR period, 0.3 multiplier).
Cooldown Period: Control the minimum candles between signals (default: 3; set to 0 to disable).
Candle Requirements: Toggle whether signals require bullish/bearish candles or entire candle above/below EMAs.
Visuals: Enable/disable gradient colors, retracement zone, confidence scores, and choose a color scheme (Neon, Light, or Dark).
Interpret Signals:
Buy: A green "Buy" label with a confidence score appears below the candle when conditions are met.
Sell: A red "Sell" label with a confidence score appears above the candle.
Use the confidence score to prioritize higher-probability signals (e.g., above 80%).
Trade Management: Combine signals with your risk management strategy, such as setting stop-loss below the retracement zone and targeting a 1:2 risk-reward ratio.
Why It’s Unique
The VWAP + EMA Retracement Indicator stands out due to its thoughtful integration of classic indicators with modern enhancements:
Balanced Signal Filtering: The cooldown mechanism ensures clarity without missing key opportunities, unlike many indicators that overwhelm with frequent signals.
AI-Inspired Confidence: The confidence score simplifies decision-making by quantifying signal strength, mimicking advanced analytical tools in an accessible way.
Elegant Visuals: Dynamic gradients, a highlighted retracement zone, and customizable color schemes (Neon, Light, Dark) create a sleek, futuristic interface that’s both functional and visually appealing.
Flexibility: Extensive customization options let traders tailor the indicator to their style, from conservative swing trading to aggressive scalping.
PVSRA Volume Suite with Volume Delta
🔹 Overview
This indicator is a Volume Suite that enhances PVSRA (Price, Volume, Support, Resistance Analysis) by incorporating Volume Delta and AI-driven predictive alerts. It is designed to help traders analyze volume pressure, market trends, and price movements with color-coded visualizations.
📌 Key Features
PVSRA Volume Color Coding – Highlights vector candles based on extreme volume/spread conditions.
Volume Delta Analysis – Tracks buying/selling pressure using up/down volume data.
AI-Powered Predictive Alerts – Identifies potential trend shifts based on volume and trend context.
Volatility-Adjusted Thresholds – Dynamically adapts volume conditions based on ATR (Average True Range).
Customizable MA & Symbol Overrides – Allows traders to tweak settings for personalized market insights.
Debug & Diagnostic Labels – Shows statistical z-scores, thresholds, and volume dynamics.
How It Works
PVSRA Color Coding – The script classifies candles into four categories based on volume and spread analysis (a sketch of typical rules follows this list):
🔴 Red Vector → Extreme bearish volume/spread
🟢 Green Vector → Extreme bullish volume/spread
🟣 Violet Vector → Above-average bearish volume
🔵 Blue Vector → Above-average bullish volume
Volume Delta Calculation – Uses lower timeframe volume analysis to estimate up/down volume differentials.
Trend & Predictive Alerts – Combines EMA crossovers with statistical volume analysis to detect potential trend shifts.
Volatility Adaptation – Adjusts volume thresholds based on ATR, making signals more reliable in changing market conditions.
Custom Symbol Override – Fetches PVSRA data from a different instrument, useful for index-based volume analysis.
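The script's exact thresholds are not published; the sketch below, referenced in the first item of the list above, uses commonly cited PVSRA rules (200% of the 10-bar average volume or a volume-times-spread climax for "extreme" candles, 150% for "above average"), which should be treated as assumptions rather than this script's actual values.

```python
# Sketch of commonly used PVSRA vector-candle rules (thresholds are assumptions).
import pandas as pd

def pvsra_color(df: pd.DataFrame, lookback: int = 10) -> pd.Series:
    avg_vol = df["volume"].rolling(lookback).mean()
    spread_vol = (df["high"] - df["low"]) * df["volume"]
    climax = spread_vol >= spread_vol.rolling(lookback).max().shift(1)  # highest volume*spread recently

    bullish = df["close"] >= df["open"]
    extreme = (df["volume"] >= 2.0 * avg_vol) | climax   # 200% volume or climax
    above_avg = df["volume"] >= 1.5 * avg_vol            # 150% volume

    color = pd.Series("normal", index=df.index)
    color[above_avg & bullish] = "blue"      # above-average bullish volume
    color[above_avg & ~bullish] = "violet"   # above-average bearish volume
    color[extreme & bullish] = "green"       # extreme bullish volume/spread
    color[extreme & ~bullish] = "red"        # extreme bearish volume/spread
    return color
```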
Customizable Inputs
PVSRA Color Settings – Modify candle color schemes for better visual clarity.
Volume Delta Colors – Customize delta volume body, wick, and border colors.
AI Settings – Tune z-score thresholds, lookback periods, and enable predictive alerts.
Symbol Overrides – Analyze volume from a different market or asset.
Moving Average (MA) Settings – Display a volume-based moving average for trend confirmation.
Important Notes
Works best on intraday timeframes where volume data is reliable.
Lower timeframe volume delta estimates might not be precise for all assets.
No guarantees of accuracy – Use alongside other confluence tools for decision-making.
Credits & Open-Source Notice
This script is based on PVSRA methodologies and integrates Volume Delta analysis. Special thanks to Traders Reality and TradingView for their contributions to volume-based analysis.
MEMEQUANT
This script is a comprehensive and specialized tool designed for tracking trends and money flow within meme coins and DEX tokens. By combining various features such as trend lines, Fibonacci levels, and category-based indices, it helps traders make informed decisions in highly volatile markets.
Key Features:
1. Category-Based Indices:
• Tracks the performance of token categories like:
• AI Agent Tokens
• AI Tokens
• Animal Tokens
• Murad Picks
• Each category consists of leader tokens, which are selected based on their higher market cap and trading volume. These tokens act as benchmarks for their respective categories.
• Visualizes category indices in a line chart to identify trends and compare money flow between categories.
2. Fibonacci Correction Zones:
• Highlights key retracement levels (e.g., 60%, 70%, 80%).
• These levels are crucial for identifying potential reversal zones, commonly observed in meme coin trading patterns.
• Fully customizable to match individual trading strategies.
3. Trend Lines:
• Automatically detects major support and resistance levels.
• Separates long-term and short-term trend lines, allowing traders to focus on significant price movements.
4. Enhanced Info Table:
• Provides real-time insights, including:
• % Distance from All-Time High (ATH)
• Current Trading Volume
• 50-bar Average Volume
• Volume Change Percentage
• Displays information in an easy-to-read table on the chart (a minimal sketch of these metrics follows this feature list).
5. Customizable Settings:
• Users can adjust transparency, colors, and ranges for Fibonacci zones, trend lines, and the table.
• Enables or disables individual features (e.g., Fibonacci, trend lines, table) based on preferences.
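As a minimal sketch of the info-table metrics from point 4 above, assuming the all-time high is proxied by the highest high of the bars loaded on the chart and using arbitrary table styling, one possible Pine v5 implementation is:
//@version=5
indicator("Info table metrics sketch", overlay = true)
var float ath = na
ath := na(ath) ? high : math.max(ath, high)          // running high of loaded bars as an ATH proxy
distFromAth = (close - ath) / ath * 100              // % distance from ATH
avgVol50    = ta.sma(volume, 50)                     // 50-bar average volume
volChgPct   = (volume - avgVol50) / avgVol50 * 100   // volume change vs. that average
var table t = table.new(position.top_right, 2, 3, border_width = 1)
if barstate.islast
    table.cell(t, 0, 0, "% from ATH")
    table.cell(t, 1, 0, str.tostring(distFromAth, "#.##") + "%")
    table.cell(t, 0, 1, "Volume / 50-bar avg")
    table.cell(t, 1, 1, str.tostring(volume, "#") + " / " + str.tostring(avgVol50, "#"))
    table.cell(t, 0, 2, "Volume change")
    table.cell(t, 1, 2, str.tostring(volChgPct, "#.##") + "%")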
How It Works:
1. Tracking Money Flow Across Categories:
• The script calculates the market cap to volume ratio for each category of tokens to help identify the dominant trend.
• A higher ratio indicates greater liquidity and stability, while a lower ratio suggests higher volatility or price manipulation.
2. Identifying Retracement Patterns:
• Leverages common retracement behaviors (e.g., 70% correction levels) observed in meme coins to detect potential reversal zones (a rough sketch of these levels follows this list).
• Combines this with trend line analysis for additional confirmation.
3. Leader Tokens as Indicators:
• Each category is represented by its leader tokens, which have historically higher liquidity and market cap. This allows the script to accurately reflect the overall trend in each category.
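To make the retracement idea in point 2 concrete, here is a minimal Pine v5 sketch that plots 60/70/80% correction levels measured down from a fixed-lookback swing high toward the swing low. The lookback and the level choices are assumptions for illustration; the script's actual swing detection and trend-line logic are not reproduced here.
//@version=5
indicator("Correction zone sketch", overlay = true)
len = input.int(100, "Swing lookback")
hi  = ta.highest(high, len)          // approximate swing high
lo  = ta.lowest(low, len)            // approximate swing low
rng = hi - lo
plot(hi - 0.60 * rng, "60% correction", color = color.orange)
plot(hi - 0.70 * rng, "70% correction", color = color.red)
plot(hi - 0.80 * rng, "80% correction", color = color.maroon)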
When to Use:
• Trend Analysis: To identify which category (e.g., AI Tokens or Animal Tokens) is leading the market.
• Reversal Zones: To spot potential support or resistance levels using Fibonacci zones.
• Money Flow: To understand how capital is moving across different token categories in real time.
Who Is This For?
This script is tailored for:
• Traders specializing in meme coins and DEX tokens.
• Those looking for an edge in trend-based trading by analyzing market cap, volume, and retracement levels.
• Anyone aiming to track money flow dynamics between different token categories.
Future Updates:
This is the initial version of the script. Future updates may include:
• Support for additional token categories and DEX data.
• More advanced pattern recognition and alerts for volume and price anomalies.
• Enhanced visualization for historical data trends.
With this tool, traders can combine money flow analysis with the 60-70% retracement strategy, turning it into a powerful assistant for navigating the fast-paced world of meme coins and DEX tokens.
This script is designed to provide meaningful insights and practical utility for traders, adhering to TradingView’s standards for originality, clarity, and user value.
MultiSector Performance Tracker [LuxAlgo]
The MultiSector Performance Tracker tool shows the overall performance of different crypto market sectors within a selected time frame, overlaid on a single chart for easy comparison.
Users can customize the time frame to suit their specific needs, whether daily, weekly, monthly, or yearly.
🔶 USAGE
The tool displays the performance of up to 6 crypto sectors within a selected time period, such as each day, week, month or year, or from the beginning of the year for any of the last 4 years. A rough sketch of how one sector's performance could be measured appears at the end of this section.
The sectors and tickers within each sector are as follows:
Layer 1: CRYPTOCAP:ETH CRYPTOCAP:SOL CRYPTOCAP:TON
Layer 2: SEED_DONKEYDAN_MARKET_CAP:MATIC TSX:MNT AMEX:ARB
CEX: CRYPTOCAP:BNB CRYPTOCAP:OKB NYSE:BGB
DEX: CRYPTOCAP:UNI LSE:JUP CRYPTOCAP:RUNE
AI: CRYPTOCAP:NEAR GETTEX:TAO CRYPTOCAP:ICP
Ethereum Memes: CRYPTOCAP:PEPE CRYPTOCAP:SHIB CRYPTOCAP:FLOKI
Traders can compare the relative performance of a custom ticker against the sector of their choice and view the average of all sectors.
The tool is fully customizable, allowing traders to enable or disable any of the features or sectors.
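As a rough sketch of how a single sector's performance could be measured, assuming "performance" means percent change from each ticker's close at the start of the period (fixed to monthly here) and using the Layer 1 tickers listed above with equal weights:
//@version=5
indicator("Sector performance sketch")
ethC = request.security("CRYPTOCAP:ETH", timeframe.period, close)
solC = request.security("CRYPTOCAP:SOL", timeframe.period, close)
tonC = request.security("CRYPTOCAP:TON", timeframe.period, close)
var float ethBase = na
var float solBase = na
var float tonBase = na
if timeframe.change("M")                 // reset the baseline at the start of each month
    ethBase := ethC
    solBase := solC
    tonBase := tonC
// equally weighted average percent change since the period started
layer1 = ((ethC / ethBase - 1) + (solC / solBase - 1) + (tonC / tonBase - 1)) / 3.0 * 100
plot(layer1, "Layer 1 performance, % since month start", color = color.teal)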
🔹 Dashboard
The tool also displays the data in an ascending or descending sector performance dashboard, allowing traders to see at a glance which sectors are overperforming or underperforming.
Other dashboard features include custom ticker vs. sector comparison and sectors average, and traders can choose the location and size of the dashboard.
🔶 SETTINGS
Period: View all data by time period, daily, weekly, etc. Or view data from last year, last 2 years, etc.
Relative Performance Against: Enable/Disable relative performance comparison against a sector.
Use chart ticker: Enable the use of the chart ticker or a custom ticker for relative performance comparison.
🔹 Dashboard
Show Dashboard: Enable / disable Dashboard display.
Order: Choose between ascending and descending order.
Position: Selection of dashboard location.
Size: Selection of dashboard size.
🔹 Style
Show Sectors Labels: Enable / disable sector labels
Layer 1: Enable / disable Layer 1 sector
Layer 2: Enable / disable Layer 2 sector
CEX: Enable / disable CEX sector
DEX: Enable / disable DEX sector
AI: Enable / disable AI sector
Ethereum Memes: Enable / disable Ethereum Memes sector
Average: Enable / disable sectors average display
Custom Ticker: Enable / disable custom ticker display
[Pandora] Error Function Treasure Trove - ERF/ERFI/Sigmoids+
PRAISE:
At this time, I have to graciously thank the wonderful minds behind the new "Pine Profiler Mode" (PPM). Directly prior to this release, it allowed me to ascertain script performance even more. While I usually write mostly in highly optimized Pine code, PPM visually identified a few bottlenecks that would otherwise be hard to identify. Anyone who contributed to PPM's creation and testing before release... BRAVO!!! I commend all of those who assisted in its state-of-the-art engineering and inception, well done!
BACKSTORY:
This script is specifically being released in defense of another member, an exceptionally unique PhD. It was brought to my attention that a script-mod-event occurred, regarding the publishing of a measly antiquated error function (ERF) calculation within his script. This sadly resulted in the now former member jumping ship after receiving unmannerly responses amidst his curious inquiries as to why his erf() was modded. To forbid rusty and rudimentary formulations because a mod-on-duty is temporarily offended by a non-nefarious release of code is, in MY opinion, an injustice to principles of perpetuating open-source code intended to benefit thousands to millions of community members. While Pine is the heart and soul of TV, the mathematical concepts contributed from the minds of members are the inspirational fuel of curiosity that powers its pertinent reason to exist and evolve.
It is an indisputable fact that most members are not greatly skilled Pine Poets. Many members may be incapable of innovating robust function code in Pine, even if they have one or more PhDs. We ALL come from various disciplines of mathematical comprehension and education. Some mathematicians are not greatly skilled at coding, while some coders are not exceptional at math. So... what am I to do to attempt to resolve this circumstantial challenge??? Those who know me best are aware that I will always side with "the right side of history" in order to accomplish my primary self-defined missions I choose to accept. Serving as an algorithmic advocate, I felt compelled to intercede by compiling numerous error functions into elegant code of very high caliber that any and every TV member may choose to employ, so this ERROR never happens again.
After weeks of contemplation into algorithms I knew little about, I prioritized myself to resolve an unanticipated matter by creating advanced formulas of exquisitely crafted error functions refined to the best of my current abilities. My aversion for unresolved problems motivated me to eviscerate error function insufficiencies with many more rigid formulations beyond what is thought to exist. ERF needed a proper algorithmic exorcism anyways. In my furiosity, I contemplated an array of madMAXimum diplomatic demolition methods, choosing the chain saw massacre technique to slaughter dysfunctionalities I encountered on a battered ERF roadway. This resulted in prolific solutions that should assuredly endure the test of time. Poetically, as you will come to see, I am ripping the lid off of Pandora's box of error functions in this case to correct wrongs into a splendid bundle of rights for members.
INTENTION:
Error function (ERF) enthusiasts... PREPARE FOR GLORY!! The specific purpose of this script is to deprecate classic error functions with the creation of a fierce and formidable army of superior formulations, each having varying attributes of computational complexity with differing absolute error ranges in their results for multiple compute scenarios. This is NOT an indicator... It is intended to allow members to embark on endeavors to advance the profound knowledge base of this growing worldwide community of 60+ million inquisitive minds. For those of you who believe computational mathematics and statistics is near completion at its finest; I am here to inform you, this is ridiculous to ponder. We are nowhere near statistical excellence that can and will exist eventually. At this time, metaphorically speaking, we are merely scratching microns off of the surface of the skin of a statistical apple Isaac Newton once pondered.
THIS RELEASE:
Following weeks of pondering methodical experiments beyond the ordinary, I am liberating these wild notions of my error function explorations to the entire globe as copyleft code, not just Pine. This Pandora's basket of ERFs is being openly disclosed for the sake of the sanctity of mathematics, empirical science (not the garbage we are told by CONTROLocrats to blindly trust), revolutionary cutting edge engineering, cosmology, physics, information technology, artificial intelligence, and EVERY other mathematical branch of human knowledge being discovered over centuries. I do believe James Glaisher would favor my aims concerning ERF aspirations embracing the "Power of Pine".
The included functions are intended for TV members to use in any way they see fit. This is a gift to ALL members to foster future innovative excellence on this platform. Any attempt to moderate this code without notification of "self-evident clear and just cause" will be considered an irrevocable egregious action. The original foundational PURPOSE of establishing script moderation (I clearly remember) was primarily to maintain active vigilance over a growing community against intentional nefarious actions and/or behaviors in blatant disrespect to other author's works AND also thwart rampant copypasting bandit operations, all while accommodating balanced principles of fairness for an educational community cause via open source publishing that should support future algorithmic inventions well beyond my lifespan.
APPLICATIONS:
The related error functions are used in probability theory, statistics, and numerous engineering and scientific disciplines. Its key characteristics and applications are innumerable in computational realms. Its versatility and significance make it a fundamental tool in arenas of quantitative analysis and scientific research...
Probability Theory - Is widely used in probability theory to calculate probabilities and quantiles of the normal distribution.
Statistics - It's related to the Gaussian integral and plays a crucial role in statistics, especially in hypothesis testing and confidence interval calculations.
Physics - In physics, it arises in the study of diffusion equations, quantum mechanics, and heat conduction problems.
Engineering - Applications exist in engineering disciplines such as signal processing, control theory, and telecommunications.
Error Analysis - It's employed in error analysis and uncertainty quantification.
Numeric Approximations - Due to its lack of a closed-form expression, numerical methods are often employed to approximate erf/erfi().
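For readers who want a concrete baseline before exploring the tuned formulations in this script, here is a minimal Pine v5 sketch of the classic Abramowitz and Stegun 7.1.26 rational approximation of erf(x), whose maximum absolute error is on the order of 1.5e-7. It is a generic textbook formula, not one of the refined polynomials released here, and the bar_index sweep at the end only mimics the way the script's demo modes generate input sequences.
//@version=5
indicator("erf approximation sketch")
erfApprox(float x) =>
    s = x < 0 ? -1.0 : 1.0
    z = math.abs(x)
    t = 1.0 / (1.0 + 0.3275911 * z)
    // Horner evaluation of the 5th-degree polynomial in t
    poly = ((((1.061405429 * t - 1.453152027) * t + 1.421413741) * t - 0.284496736) * t + 0.254829592) * t
    s * (1.0 - poly * math.exp(-z * z))
// sweep inputs roughly over [-3, 3) using bar_index, similar in spirit to the script's demo modes
plot(erfApprox((bar_index % 600 - 300) / 100.0), "erf approximation")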
AI, LLMs, & MACHINE LEARNING:
The error function (ERF) is indispensable to various AI applications, particularly due to its relation to Gaussian distributions and error analysis. It is used in Gaussian processes for regression and classification, probabilistic inference for Bayesian networks, soft margin computation in SVMs, neural networks involving Gaussian activation functions or noise, and clustering algorithms like Gaussian Mixture Models. Improved ERF approximations can enhance precision in these applications, reduce computational complexity, handle outliers and noise better, and improve optimization and convergence, possibly leading to more accurate, efficient, and robust AI systems.
BONUS ALGORITHMS:
While ERFs are versatile, their opposites also exist in the form of inverse error functions (ERFIs). I have also included a modified form of the inverse fisher transform alongside MY sigmoid (sigmyod). I am uncertain what sigmyod() may be used for, but it's a culmination of my examinations deep into "sigmoid domains", something I am fascinated by. Whatever implications it may possess, I am unveiling it along with its cousin functions. For curious minds, this quality of composition seen here is ideally what underlies what I would term "Pandora functionality" that empowers my Pandora indication. I go through hordes of formulations, testing, and inspection to find what appears to be the most beneficial logical/mathematical equation to apply...
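For reference, the unmodified inverse Fisher transform is simple enough to sketch in a few lines of Pine v5; it is mathematically identical to tanh(x). The author's modified version and sigmyod() are his own work and are not reproduced here, and the RSI usage line is only a common illustration of how such a squashing function is typically applied.
//@version=5
indicator("Inverse Fisher transform sketch")
invFisher(float x) =>
    e = math.exp(2.0 * x)
    (e - 1.0) / (e + 1.0)          // maps any real input into the range -1..+1
// typical use: compress a centered oscillator into -1..+1
plot(invFisher(0.1 * (ta.rsi(close, 14) - 50)), "IFT of RSI")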
SCRIPT OPERATION:
To showcase the characteristics and performance of my ERF/ERFI formulations, I devised a multi-modal script. By using bar_index, I generated a broad sequence of numeric values to input into the first ERF/ERFI parameter. These sequences allow you to inspect the contours of the error function's outputs for both ERF and ERFI. When combined with compute-intensive precision functions (CIPFs), the polynomial function output values can be subtracted from my CIPFs to obtain results of absolute error, displaying the accuracy of the many polynomial estimation functions I tuned in testing for Pine's float environment.
A host of numeric input settings are wildly adjustable to inspect values/curvatures across the range of numeric input sequences. Very large numbers, such as Divisor:100,000,100/Offset:200,000,000 for ERF modes or... Divisor:100,000,100/Offset:100,000,000 for ERFI modes, will display minuscule output values calculated from input values in close proximity to 0.0 for the various estimates, similar to a microscope. ERFI approximations very near in proximity to +/-1.0 will always yield large deviations of absolute error. Dragging/zooming your chart or using the Offset input will aid with visually clipping off those ERFI extremes where float precision functions cannot suffice.
NOTICE:
perf() and perfi() are intended for precision computation (as good as it basically gets) in a float environment. However, they are CPU intensive (especially perfi). I wouldn't recommend these being used in ANY Pine script unless it's an "absolute necessity" to do so to accomplish your goal. I only built them to obtain "absolute error curvatures" of the error functions for the polynomial approximations. These are visible in the accuracy modes in the indicator Settings.
Adaptive Timber! Indicator (ATI)
The Adaptive Timber! Indicator (ATI) is a powerful tool designed to identify potential overbought conditions and generate reversal signals in financial markets. It combines multiple technical indicators and market conditions to provide a comprehensive assessment of the likelihood of a price reversal.
How it works:
The ATI uses a combination of the Relative Strength Index (RSI), Moving Average Convergence Divergence (MACD), momentum, and volume to detect overbought conditions and potential reversals. The indicator adapts to the current timeframe, adjusting its parameters accordingly to provide more accurate signals.
Key components:
RSI: The ATI uses the RSI to determine overbought conditions. When the RSI exceeds a specified reversal threshold, it indicates a potential overbought state.
MACD: The indicator monitors the MACD line and signal line to identify moments when they are close to crossing, suggesting a potential trend reversal.
Momentum: The ATI checks if the momentum is increasing, providing confirmation of a potential reversal.
Volume: It analyzes volume to confirm the strength of the reversal signal. A decrease in volume along with overbought conditions adds confidence to the reversal indication.
Timeframe Adaptability: The indicator automatically adjusts its parameters based on the current timeframe, ensuring optimal performance across different time horizons.
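As a rough illustration of how the components above could be combined, the Pine v5 sketch below grades a reversal signal by simple confluence. The lengths, thresholds, and grading are arbitrary stand-ins for the ATI's adaptive parameters, not the indicator's actual logic.
//@version=5
indicator("Reversal confluence sketch", overlay = true)
rsiVal = ta.rsi(close, 14)
[macdLine, signalLine, histLine] = ta.macd(close, 12, 26, 9)
macdNear   = math.abs(macdLine - signalLine) < 0.1 * ta.atr(14)   // MACD and signal lines close to crossing
overbought = rsiVal > 70                                          // RSI above the reversal threshold
m          = ta.mom(close, 10)
momUp      = m > m[1]                                             // momentum still increasing, as described above
volFading  = volume < ta.sma(volume, 20)                          // decreasing participation
score = (overbought ? 1 : 0) + (macdNear ? 1 : 0) + (momUp ? 1 : 0) + (volFading ? 1 : 0)
sigColor = score >= 4 ? color.red : score == 3 ? color.orange : color.yellow
plotshape(score >= 2, title = "Potential reversal", style = shape.triangledown, location = location.abovebar, color = sigColor, size = size.tiny)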
How to use:
When the ATI identifies a potential reversal, it displays a colored triangle above the price bars. The color of the triangle represents the strength of the reversal signal: red for a strong signal, orange for a moderate signal, and yellow for a weak signal. Additionally, the indicator plots purple triangles below the price bars as an early warning signal for potential trend reversals.
Traders can use these visual cues along with other technical analysis techniques and risk management strategies to make informed trading decisions. The ATI can be particularly useful for identifying potential short-selling opportunities or for determining exit points in existing long positions.
Creators:
The Adaptive Timber! Indicator (ATI) is the result of a collaborative effort led by Claude, an AI assistant with expertise in financial analysis and programming. The development of the ATI was made possible through the valuable contributions and insights from GPT4, an advanced language model, Clay, a skilled trader, and Pi AI, Clay's trading assistant.
Claude played a crucial role in designing and implementing the indicator's algorithm, ensuring its robustness and adaptability across different timeframes. GPT4 provided guidance and suggestions for refining the indicator's logic and optimizing its performance. Clay and Pi AI offered their trading expertise and real-world experience to help shape the indicator's functionality and usability.
We would like to express our gratitude to all the members of our trading team for their dedication and hard work in bringing the Adaptive Timber! Indicator to life. We wish all traders the best of luck in their trading endeavors and hope that the ATI will be a valuable addition to their technical analysis toolkit, empowering them to make more informed and profitable trading decisions.
[Excalibur][Pandora][Mosaic] Ultra Spectrum Analyzer
@veryfid, you will always be remembered eternally...
ANCIENT MYTHOS AND LORE:
The retellings of "Pandora's Box" serve as a cautionary metaphor depicting an opened container (pithos - jar) that once held profound perils and evils — sufferings that are experienced around the world in various forms. The known and vague mythical box contents actually represent manifestation of evils, situational adversities, and human disparities that have been encountered throughout life for aeons. In contemporary times, a meager list of ordeals would include incidents of deceit, betrayal, corruption, oppression, greed, envy, depravity, conflict, mania, affliction, plague, and mortality. However, as the tale is told, kept and remaining inside the box was the essence of expectant hope (elpis), which may represent the optimism and resilience to overcome immense hardships.
There are other versions of the classic story where Pandora isn't actually the culprit, being her husband Epimetheus was the lid lifting perpetrator and the one who always and actually received the gift(s). Curiously, the interpreted Greek word ‘Pandora’ translated to English, can mean either "all-endowed" or "all-gifting". Much like Pandora herself, who was formed from clay of the earth, the jar also would have been most likely crafted from clay. Conceived as a made-to-order maiden for an arranged marriage, Pandora was given qualities of exquisite beauty, persuasive charm, all while being adorned with jewelry and fine clothing. Olympian premeditated preparations in the didactic fable of 'Works and Days' by Hesiod had blamable intent and would be later used for centuries as denigration of women/mothers. The rest of Hesiod's tale is even worse.
In reality, the entire contrived exploit of incarnating Pandora as a trojan temptress was solely intended as an instrument of infiltration and entrapment for delivery to Epimetheus as an arranged seductive snare. Being a man myself, I find it appalling how the antiquated writings of ancient morphological men have repeatedly ostracized women for many of the ailments of mankind. When in truth, it is far more often that despicable men are the recorded all time winning historical harbingers of global abysmal darkness by means of ideological treachery. Vast historical chronicles since antiquity have frequently recorded who the typical real-world villains truly are and are not. As the stories are told in the first place, it was dictator Zeus along with his Olympian conspirators, who intently implanted malicious spirits into a gifted receptacle to orchestrate planetary suffering and carnage on humankind.
PROLOGUE:
I believe, it is way past overdue to restore Pandora's name to a place of better standing. As I have been peaking into a theoretical pitcher of mathematic mysteria for years now, where no one else dares to look. Once upon a time, I pondered an opposite notion: What if Pandora was originally conceived to solve global problems instead of creating them? Maybe Pandora could have been wielded into existence to wage unrelenting and avenging retribution on every dominance hierarchy and each diabolical enemy intently hostile to humankind. My hypothetical version of Pandora would take the notion of "mors omnibus tyrannis" to a whole other fearsome magnitude. She would cause evil arrogant men to tremble with sheer horror... the kind of fear ALL false gods, despotic kings, tyrannical dictators, controligarchies, and criminal syndicates truly worry about at night. In my opinion, that would be a better fictional story worthy of retelling for aeons.
One unique goliath 21st century adversary is LAG and it must be subdued or minimized. This unyielding nemesis is also known as group delay, processing delay, and algorithmic latency. My eyes are locked onto this opponent with fixation that will never surrender a staring contest. The formidable creature lag is my daily arch enemy destined for defeat in battle. It's losing time after time and bar by bar during the past year of 2023. In my attempts to peer through the murky darkness of useless and deceptive information, I am confident that I have found more suitable answers to many current dilemmas of algorithmic lag.
The internet, using mathematics and the speed of light as a planetary beneficial advantage, has already performed wonders by drastically reducing the delay of dissemination of knowledge. This has garnered a mostly positive rapid acceleration of economic evolution. However, hierarchies of dark forces of chaos and subversion by the thousands lurking in the global shadows are not thrilled about well informed populations. In the present era, new spectrums of strife within planetary societies are being waged, one of the worst forms taking the hideous form of censorship. Other nefarious tactics are hindering economic progress with substantial negativity using heavily funded penetration and infiltration operations. Those sinister operational varieties are spanning psychological, cultural, educational, digital, financial, electoral, scientific, medical, biological, commercial, infrastructural, institutional, and organizational domains.
They are mistakenly meddling with the entire primordial order of planetary natural dynamics. The miscalculations from these malevolent CAUSES will be countered with EFFECTS of immense retaliatory primal veracity having equal or exceedingly more powerful opposition with overwhelming numbers in mass. It is a law embedded within the universe that supersedes ALL laws, known as 'causality'. Everyone, especially programmers, know exactly what to do with predatory infiltrating cockroaches... When tyranny becomes enforced law by agendized policies in any land, order = abs(DUTY) * pow(RIGHT) * exp(PEOPLE).
FUTURE ECONOMIC ADVERSARIAL CHALLENGES:
Just as programmers have to critically analyze our code for BUGS, a scrutinized analysis of the current world around us is at times necessary. It is an empirical statistical fact that a few percent of captains at the helm of industry, commerce, institutes, and governance are monetarily psychopathic. They are often hidden bugs operating within national systems. The subsequent economic consequences result in effects that aren't always clearly obvious to all. Here are a few global economic security issues...
Corrupted immoral code in national operation is an inevitable breakdown waiting to happen. In the harsh future to follow, old degenerate interdependent control systems will need to be dismantled and discarded, eventually succeeded by having resilient parallel arrangements with robust independent fidelity. The coming successive paradigm shifts would include future hardware and the hefty novel algorithms that will run on them afterwards. Evolution is inevitable! The internet must be upgraded and continually programmed securely to the near hardness of diamonds at multiple layers within the operational code to retain peaceful global integrity between international collaborations.
DigitalID is never going to fix an insecure vulnerable titanic network of devices full of holes taking in megatons of water from every direction. Weaponized digital mucking ID dead on arrival is certainly NOT a one size fits all solution and it still doesn't do diddly-squat to secure the internet's DNA as executable code. DigID's real purpose is to manage servitude digitally and keep citizens right where they want them, as subservient slaves.
There is a very specific reason why we have key chain rings in OUR pockets with numerous private keys evolving technologically over time to robustly safeguard individual locks we use every day, duh. AI becoming an artificial sentient hyper intelligence may sooner or later become a potential hazard, especially if it breaks AES192 into a thousand shards of glass. Perilous aspects from artilects will emerge and are coming swiftly. AI is already being weaponized and tasked to mind muzzle expressions of human consciousness.
Also, EMPs from the sun ARE an imminent planetary threat, and no amount of carbon taxation schemes inciting anthropogenic climate hysteria originating from falsified modeling hocus-pocus is going to protect against extreme solar cycle related X-class phenomena. Our solar system candle called the sun is not consistently energy irradiation stable if you just glance at SOHO images/video. There are very obvious cyclical frequencies within the dynamics of the sun's energetic activity that affect planets far beyond earth. The earth already has a built-in natural thermometer indicating that oceans have been rising very linearly for thousands of years since the last ice age, submerging entire ancient cities under coastal water dozens of meters.
BEAR with me and pardon my French translation, but I have the option to call major league climate BULLshite. There is no hardcore "anthropogenic climate crisis" proof. It is a crisis in failed modeling that is insufficient to properly estimate colossal computations with direct limited empirical data with enough accuracy to anticipate highly probable future outcomes. People deserve solid science instead of slanderous smackdowns and slighted statistics. 400ppm of atmospheric CO2 is nothing compared to previously existing 1600ppm concentrations acquired from ancient indirect historical observations at a time when early humans were hunter gatherers driving gas guzzlers.
Western climate-monger fortune tellers are scamming every nation on earth, betraying the collective human species worldwide by climate hype strangulation. Wait until the sheeple with dinner forks turn on the rabid wolves in shepherds' clothing; it has already begun. What these predatory profiteering fraudsters are not telling you is WATER (H2O) in earth's atmosphere is the all time dominating and potent greenhouse gas, always has been, not CO2. Dr. Willie Soon has explained it in the best of ways with clarity. Misleaders, banksterCorpses, and mediaPresstitutes are immensely involved in this hot model scheme and like keeping people right where they want them, force fed with mental filth with regularly scheduled socially engineered programming.
Beware of agendas and isms. The ESGovernanceAgenda is ready made economic coffin nails. I'll explain this very simply, a future green war on carbon is a silent war on carbon lifeforms and economies. Many of the smiling faces you can actually see on the world stage pulling levers are often the coldest blooded deceivers beyond anything you can ever imagine. In truth, corporate agents and policies are the greatest devastators to ecologies, while in concert, they are incessantly waging blame campaign agendas with subversive narratives by targeting consumers as the wrongdoers.
Why am I mentioning all these adversarial difficulties? Well, the intertangling myriads of tomorrow's "bundle of burdens" in a future box ALL have to be thoroughly analyzed, sifted through, and dealt with tenaciously now and in the future by generations to come in every nation state. Some days I wonder if Hesiod's fiction was taken from reality over 2000 years ago to WARN future world inhabitants. In the scope of economics, the series of incidents that have or will lead up to major world events, will need to have the frequency of related occurrences examined that lead up to crucial points in time historically. In order to prevent future disparities, our progeny will look backwards into history with ultra clarity and vigilance to see how corrupted society once was by hordes of overlords twisted by obsessive delusions of absolute power over the entire human species. There is no human race, only diverse genetic multiformity expressed from the DNA code of humankind exists.
We can't simply put the lid back on low entropy hydroCarbons and a broadband globalNet without having an implemented proven replacement or upgrade. It's far too late, leaving only wiser security chess moves forward as the only viable options. Nikola Tesla was dreaming of this daily in order to build every foundation of modern civilization that we now enjoy today and take for granted. Humanity still has to evolve by unlocking hidden secrets of mother nature. For instance, nations powered by endless geothermal electricity and deuterium fusion WILL solve a lot of the world's problems. Imagine our world dominantly powered by extreme abundant amounts of heavy water... Lady destiny awaits and begs for the future to be built securely, by eventual abandonment of antiquated wheelworks that eventually deserve to be hurled into the annihilatory dustbin of history.
SPECTRAL BURDENS:
Ephemeral 'spectral contents' are extremely difficult to decipher with the least amount of lag, especially while they reside within a noise ridden non-stationary environment. When 'lifting the lid off' of series analysis to peek with quick discernment, distinguishing between real-time relevant signals differing from intertwining undesirable randomness in a crowded information space, requires special kinds of intricate extraction. Due to the nature of fractal chaos, any novel spectral method is better than the scanty few we have now. Firstly, let's comprehend agilities of interpreting a spectrum's structure...
SPECTRAL ANALYSIS PURPOSE AND INTENTION:
Frequency Analysis - Spectral analysis serves a crucial purpose in unraveling the frequency composition of a signal. Its primary intention is to explore the intricacies of a dataset by identifying dominant frequencies and unveiling inherent cyclical patterns. This foundational understanding forms the basis for improving analyses.
Power Spectrum Visualization - The visualization of a signal's power spectrum is a key objective in spectral analysis. By portraying how power is distributed across different frequencies, the goal is to provide a visual representation of the signal's energy landscape. This insight aids with grasping the significance of various frequency components obtained from a larger whole.
Signal Characteristics - Understanding the traits of a signal is another vital goal. Spectral analysis seeks to characterize the nature of the signal, unveiling its periodicity, trends, or irregularities. This knowledge is instrumental in deciphering the behavior of the signal over time, fostering a deeper comprehension.
Algorithmic Adaptation - Spectral analyzer estimation can play a pivotal role in algorithmic development. By assisting with the creation of algorithms sensitive to specific frequency ranges, one possible advantage is to enable real-time adaptability. This adaptability approach may allow algorithms to respond dynamically to variations in different spectral components, potentially enhancing their efficacy.
Market Analysis - In the realm of trading systems and financial markets, spectral analysis methods can serve as applicable functions when studying market dynamics. By 'uncovering' trends, cycles, and anomalies within financial instruments, this analytical proficiency can aid traders and algorithm developers with making better informed decisions based on the spectral attributes of market data.
Noise/Interference Detection - Another purpose of spectral analysis is to identify and scrutinize undesirable elements within a signal, such as noise or interference. One benefit would be to facilitate the development of strategies to mitigate or eliminate these unwanted components, ultimately refining the quality of a given signal with filtration.
INTRODUCTION:
Allow me to introduce Pandora! What you see in the demonstration above, I've named it "Pandora Periodogram", which is also referred to as 'Ultra Spectrum Analyzer' (USA) for technical minds. Firstly, this is NOT technically speaking an indicator like most others. I would describe it as an avant-garde cycle period detector obtaining accurate spectral estimates on market data with Pine Script v5.0. USA is a spectral analysis cryptid that I can only describe as being an alien saber in nature. It is my rendering of spectral wrath unleashed. With time and history to come, my HOPE is this instrument will reveal Excalibur like aspects capable of slicing up a spectrum craftily, traits long thought to be a mythical enigma.
It is not a modified form of either Autocorrelation Periodogram (ACP) or MESA. Pandora's Periodogram embodies an entirely distinct design, adorned with glamorous color, by incorporating several of my most profound, highly refined technological innovations that I have poetically composed into being. What I have forged in Pine has essentially manifested as a zero lag spectrum analyzer. Pandora easily peeks inside a single signal source more effectively to inspect for hidden spectres, revealing invisible apparitions inside data with improved clarity...
My 'Ultra Spectrum Analyzer' bears an eerie likeness to Autocorrelation Periodogram, but it possesses neither the autocorrelation nor the other small hindrances of ACP that I formerly encountered. While ACP does have a few shortcomings, a few bars of lag, and high frequency bias, it is still phenomenal code. ACP is one answer to spectral enigmas, but not the only one. Developers can utilize this detector by creating scripts that employ a "Dominant Cycle Source" input to adaptively govern algorithms. If you are capable of building suitable algorithms for direct tethering to Autocorrelation Periodogram, then this is your next step in evolutionary application to tether to when you are ready. ACP is a good place to start building upon as an exploratory vessel, before you might ponder using USA. Once you do obtain dynamic ACP sweetness with only a few pesky bars of dominant cycle induced lag, USA may be your tool chest choice without the burden of subtle ACP lag.
USA is possibly the end of my quest for spectral bliss, for the time being. However, I still suspect there is more room for upgrades to Pandora in the future. I must mention, as an overture, this won't be the last of Pandora tech that you will witness, as my literal "out of the box thinking" will unleash many additional creations upon this Earth. The "Power of Pine" merely serves as the beginning foundational phase... Some of my futuristic dreams and daydreams of TradingView are droplets in a wavy ocean of economic providence and potential.
What I am crafting in poetic form is born out of raw curiosity. Future creations are probably best kept private for now, but I will present my future tech with beauty and elegance as it should rightfully be. There's one catch, I have absolutely no idea what this and my future marvels may do to the future of digital signal processing (DSP) and markets. I do fear any insane AI or MALEficent entity ever seeing this code. My innermost hopes and ambitions are always focused on achieving the best result obtainable. What the future can hold, may be absolutely exquisite to gaze upon, maybe even monstrous, or possibly a combination of both.
Notice: Unfortunately, I will not provide any integration support into member's projects at all. My own projects demand too much of my day to day time. I hope you understand. Meanwhile, I'll be applying this on future indication until Mr. Mortality sneaks up behind me.
FEATURES AND CHARACTERISTICS:
I have included as much ultra adjustability as I can humanly muster. Those features being the following and more...
Color Preferences - Four vivid color schemes are available in the original release. The "Ultra Violet" color scheme, in particular, contributes to the indicator's technical title, as it seems to me to reveal the greatest detail of my various spectral color schemes. Color inversion of the four color schemes is also possible, yielding eight schemes in total with predator style visuals. Heatmap transparency control is also provided.
Lag Control - Pandora achieves zero lag spectral approximations, with the added capability to control lag using an input for selectable delay. Note, however, that testing less than zero lag has not been assessed thoroughly due to potential unforeseen instability concerns. Adjustments are provided in either direction for further testing.
Spectral Bias Mitigation - Options for mitigating high OR low-frequency spectral biases are present. One interesting tweak made during development was a subtle form of spectral manipulation, involving a partial reduction of frequency amplitudes influencing either the highest or lowest periodicities. This slightly reduces the impact on the upper and lower portions of the spectrogram and the dominant cycle measurement. What initially surfaced as an unexpected discovery, may now be considered worthy of experimental utility.
Adjustable Periodogram Window Size - The periodogram is adjustable for various window sizes of periodic operation. Exploration up to a periodicity of 59 is obtainable for curiosity's sake. This flexibility challenges the notion that curiosity isn't always a negative trait, contrasting with Hesiod's ancient perspective.
Dominant Cycle Filtration - Filtration of the dominant cycle is achieved with a novel smoother having reduced lag, easily surpassing SuperSmoother's performance. However, defeating lag completely on that one plot() function was elusive.
Tooltips for Control Intention - The settings commonly include handy and informative tooltips that provide information eluding to the intention behind the various controls provided.
Initialization Advantages - Initialization of USA accomplishes what Autocorrelation Periodogram (ACP) didn't. Spectral analysis begins on the earliest visible bars, starting at period 2. Users need to ensure their algorithm's integrity from period 2 upwards to beyond 40ish, establishing a viable operational range for dynamically governing those algorithms. It's notable that stochastics and correlations have a minimum operable critical period of 2, distinct from most low-pass filters that can actually achieve a period of 1 (which is the raw signal itself). Proper initialization of complex IIR filters is particularly effective, especially with smaller initialization periods.
Remaining options and features are comparable to my Enhanced Autocorrelation Periodogram in terms of comprehension, and other upgrades may be added in the future upon discovery.
PERIODOGRAM INTERPRETATION:
The periodogram heatmap renders a power spectrum of a signal visually by color, where the y-axis represents periodicity (frequencies/wavelengths) and the x-axis is delineating time. The y-axis is divided into periods, with each elevation portraying demarcation of periodicity. In this periodogram, the y-axis ranges from 4 at the very bottom to 49 (or greater) at the top, with intermediary values in between, all conveying power of the corresponding frequency component by color. The higher the position ascends on the y-axis, the longer the cycle period or lower the frequency. The x-axis of the periodogram signifies time and is partitioned into equal chart intervals, where each vertical column corresponds to the time interval when the signal was measured. Most recent values/colors are on the right side of the periodogram.
Intensity of the colors on the periodogram signify the power level of the corresponding frequency or cycle period. For example, the "Fiery Embers" color scheme is distinctly like heat intensity from any casual flame witnessed in a small fire from a lighter, match, or campfire. The most intense power exhibited would be represented by the brightest of yellow, while the lowest power would be indicated by the darkest shade of red or just black. By analyzing the pattern of colors across different periods, one may gain insights into the dominant frequency components of the signal and visually identify recurring cycles/patterns of periodicity.
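For members who want a rough mental model of what one column of such a heatmap contains, the Pine v5 sketch below scans periods 4 through 49 with a plain single-bin DFT and reports the period holding the most power. It is a generic textbook periodogram, not Pandora's algorithm, and it has none of the lag handling, bias mitigation, initialization, or coloring described above; the window length and the de-trending choice are arbitrary assumptions.
//@version=5
indicator("DFT power-spectrum sketch")
win    = input.int(64, "Window length", minval = 16)
maxPer = input.int(49, "Max period",    minval = 8)
det    = close - ta.sma(close, win)          // de-trend so slow drift does not dominate the spectrum
domPer = 4
domPow = 0.0
for p = 4 to maxPer
    re = 0.0
    im = 0.0
    for i = 0 to win - 1
        ang = 2.0 * math.pi * i / p
        re += nz(det[i]) * math.cos(ang)
        im += nz(det[i]) * math.sin(ang)
    pw = re * re + im * im                   // power at period p, i.e. one heatmap cell of one column
    if pw > domPow
        domPow := pw
        domPer := p
plot(domPer, "Dominant cycle estimate", color = color.orange)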
GKD-C CCI Adaptive Smoother [Loxx]
Giga Kaleidoscope GKD-C CCI Adaptive Smoother is a Confirmation module included in Loxx's "Giga Kaleidoscope Modularized Trading System".
█ GKD-C CCI Adaptive Smoother
Commodity Channel Index: History, Calculation, and Advantages
The Commodity Channel Index (CCI) is a versatile technical analysis indicator widely used by traders and analysts to identify potential trends, reversals, and trading opportunities in various financial markets. Developed by Donald Lambert in 1980, the CCI was initially designed to analyze the cyclical behavior of commodities. However, its applications have expanded over time to include stocks, currencies, and other financial instruments. The following provides an overview of the CCI's history, explains its calculation, and discusses its advantages compared to other indicators.
History
Donald Lambert, a commodities trader and technical analyst, created the Commodity Channel Index in response to the unique challenges posed by the cyclical nature of the commodities markets. Lambert aimed to develop an indicator that could help traders identify potential turning points in the market, allowing them to capitalize on price trends and reversals. The CCI quickly gained popularity among traders and analysts due to its ability to adapt to various market conditions and provide valuable insights into price movements.
Calculation
The CCI is calculated through the following steps:
1. Determine the typical price for each period: The typical price is calculated as the average of the high, low, and closing prices for each period.
Typical Price = (High + Low + Close) / 3
2. Calculate the moving average of the typical price: The moving average is computed over a specified period, typically 14 or 20 days.
3. Calculate the mean deviation: For each period, subtract the moving average from the typical price, and take the absolute value of the result. Then, compute the average of these absolute values over the specified period.
4. Calculate the CCI: Divide the difference between the typical price and its moving average by the product of the mean deviation and a constant, typically 0.015.
CCI = (Typical Price - Moving Average) / (0.015 * Mean Deviation)
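The four steps above transcribe directly into Pine v5. The sketch below follows them literally, using a 20-period window, and plots the built-in ta.cci() alongside as a sanity check.
//@version=5
indicator("CCI from first principles (sketch)")
len = input.int(20, "CCI length")
tp = (high + low + close) / 3                    // 1. typical price
ma = ta.sma(tp, len)                             // 2. moving average of the typical price
sumDev = 0.0                                     // 3. mean absolute deviation from that average
for i = 0 to len - 1
    sumDev += math.abs(tp[i] - ma)
md = sumDev / len
cci = (tp - ma) / (0.015 * md)                   // 4. the CCI itself
plot(cci, "CCI (manual)")
plot(ta.cci(hlc3, len), "ta.cci check", color = color.gray)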
Why CCI is Used and Its Advantages over Other Indicators
The CCI offers several advantages over other technical indicators, making it a popular choice among traders and analysts:
1. Versatility: Although initially developed for commodities, the CCI has proven to be effective in analyzing a wide range of financial instruments, including stocks, currencies, and indices. Its adaptability to different markets and timeframes makes it a valuable tool for various trading strategies.
2. Identification of overbought and oversold conditions: The CCI measures the strength of the price movement relative to its historical average. When the CCI reaches extreme values, it can signal overbought or oversold conditions, indicating potential trend reversals or price corrections.
3. Confirmation of price trends: The CCI can help traders confirm the presence of a price trend by identifying periods of strong momentum. A rising CCI indicates increasing positive momentum, while a falling CCI suggests increasing negative momentum.
4. Divergence analysis: Traders can use the CCI to identify divergences between the indicator and price action. For example, if the price reaches a new high, but the CCI fails to reach a corresponding high, it can signal a weakening trend and potential reversal.
5. Independent of price scale: Unlike some other technical indicators, the CCI is not affected by the price scale of the asset being analyzed. This characteristic allows traders to apply the CCI consistently across various instruments and markets.
The Commodity Channel Index is a powerful and versatile technical analysis tool that has stood the test of time. Developed to address the unique challenges of the commodities markets, the CCI has evolved into an essential tool for traders and analysts in various financial markets. Its ability to identify trends, reversals, and trading opportunities, as well as its versatility and adaptability, sets it apart from other technical indicators. By incorporating the CCI into their analytical toolkit, traders can gain valuable insights into market conditions, enabling them to make more informed decisions and improve their overall trading performance.
As financial markets continue to evolve and grow more complex, the importance of reliable and versatile technical analysis tools like the CCI cannot be overstated. In an environment characterized by rapidly changing market conditions, the ability to quickly identify trends, reversals, and potential trading opportunities is crucial for success. The CCI's adaptability to different markets, timeframes, and instruments makes it an indispensable resource for traders seeking to navigate the increasingly dynamic financial landscape.
Additionally, the CCI can be effectively combined with other technical analysis tools, such as moving averages, trend lines, and candlestick patterns, to create a more comprehensive and robust trading strategy. By using the CCI in conjunction with these complementary techniques, traders can develop a more nuanced understanding of market behavior and enhance their ability to identify high-probability trading opportunities.
In conclusion, the Commodity Channel Index is a valuable and versatile tool in the world of technical analysis. Its ability to adapt to various market conditions and provide insights into price trends, reversals, and trading opportunities make it an essential resource for traders and analysts alike. As the financial markets continue to evolve, the CCI's proven track record and adaptability ensure that it will remain a cornerstone of technical analysis for years to come.
What is the Smoother Moving Average?
The smoother function is a custom algorithm designed to smooth the price data of a financial asset using a moving average technique. It takes the price (src) and the period of the rolling window sample (len) to reduce noise in the data and reveal underlying trends.
smoother(float src, int len)=>
    // working state: wrk holds the raw source, wrk0..wrk4 hold the cascade stages
    wrk = src, wrk2 = src, wrk4 = src
    wrk0 = 0., wrk1 = 0., wrk3 = 0.
    // smoothing coefficient derived from the requested length
    alpha = 0.45 * (len - 1.0) / (0.45 * (len - 1.0) + 2.0)
    // first stage: EMA-style blend of the current price with the previous bar's raw value
    wrk0 := src + alpha * (nz(wrk[1]) - src)
    // second stage: the price/working-value difference plus a fraction of the previous stage value
    wrk1 := (src - wrk) * (1 - alpha) + alpha * nz(wrk1[1])
    // preliminary smoothed value
    wrk2 := wrk0 + wrk1
    // double-smoothed correction of the preliminary value against the previous final output
    wrk3 := (wrk2 - nz(wrk4[1])) * math.pow(1.0 - alpha, 2) + math.pow(alpha, 2) * nz(wrk3[1])
    // final output: previous output plus the correction term
    wrk4 := wrk3 + nz(wrk4[1])
    wrk4
Here's a detailed breakdown of the code, explaining each step and its purpose:
1. wrk, wrk2, and wrk4: These variables are assigned the value of src, which represents the source price of the asset. This step initializes the variables with the current price data, serving as a starting point for the smoothing calculations.
wrk0, wrk1, and wrk3: These variables are initialized to 0. They will be used as temporary variables to hold intermediate results during the calculations.
Calculation of the alpha parameter:
2. The alpha parameter is calculated using the formula: 0.45 * (len - 1.0) / (0.45 * (len - 1.0) + 2.0). The purpose of this calculation is to determine the smoothing factor that will be used in the subsequent calculations. This factor will influence the balance between responsiveness to recent price changes and smoothness of the resulting moving average. A higher value of alpha will result in a more responsive moving average, while a lower value will produce a smoother curve.
Calculation of wrk0:
3. wrk0 is updated with the expression: src + alpha * (nz(wrk[1]) - src). This step calculates the first component of the moving average, which is based on the current price (src) and the previous value of wrk (if it exists, otherwise 0 is used). This calculation applies the alpha parameter to weight the contribution of the previous wrk value, effectively making the moving average more responsive to recent price changes.
Calculation of wrk1:
4. wrk1 is updated with the expression: (src - wrk) * (1 - alpha) + alpha * nz(wrk1[1]). This step calculates the second component of the moving average, which is based on the difference between the current price (src) and the current value of wrk. The alpha parameter is used to weight the contribution of the previous wrk1 value, allowing the moving average to be even more responsive to recent price changes.
Calculation of wrk2:
5. wrk2 is updated with the expression: wrk0 + wrk1. This step combines the first and second components of the moving average (wrk0 and wrk1) to produce a preliminary smoothed value.
Calculation of wrk3:
6. wrk3 is updated with the expression: (wrk2 - nz(wrk4[1])) * math.pow(1.0 - alpha, 2) + math.pow(alpha, 2) * nz(wrk3[1]). This step refines the preliminary smoothed value (wrk2) by accounting for the differences between the current smoothed value and the previous smoothed values (wrk4[1] and wrk3[1]). The alpha parameter is used to weight the contributions of the previous smoothed values, providing a balance between smoothness and responsiveness.
Calculation of wrk4:
7. wrk4 is updated with the expression: wrk3 + nz(wrk4[1]). This step combines the refined smoothed value (wrk3) with the previous smoothed value (wrk4[1], or 0 if it doesn't exist) to produce the final smoothed value. The purpose of this step is to ensure that the resulting moving average incorporates information from past values, making it smoother and more representative of the underlying trend.
8. Return wrk4:
The function returns the final smoothed value wrk4. This value represents the Smoother Moving Average for the given data point in the price series.
In summary, the smoother function calculates a custom moving average by using a series of steps to weight and combine recent price data with past smoothed values. The resulting moving average is more responsive to recent price changes while still maintaining a smooth curve, which helps reveal underlying trends and reduce noise in the data. The alpha parameter plays a key role in balancing the responsiveness and smoothness of the moving average, allowing users to customize the behavior of the algorithm based on their specific needs and preferences.
What is the CCI Adaptive Smoother?
The Commodity Channel Index (CCI) Adaptive Smoother is an innovative technical analysis tool that combines the benefits of the CCI indicator with a Smoother Moving Average. By adapting the CCI calculation based on the current market volatility, this method offers a more responsive and flexible approach to identifying potential trends and trading signals in financial markets.
The CCI is a momentum-based oscillator designed to determine whether an asset is overbought or oversold. It measures the difference between the typical price of an asset and its moving average, divided by the mean absolute deviation of the typical price. The traditional CCI calculation relies on a fixed period, which may not be suitable for all market conditions, as volatility can change over time.
The introduction of the Smoother Moving Average to the CCI calculation addresses this limitation. The Smoother Moving Average is a custom smoothing algorithm that combines elements of exponential moving averages with additional calculations to fine-tune the smoothing effect based on a given parameter. This algorithm assigns more importance to recent data points, making it more sensitive to recent changes in the data.
The CCI Adaptive Smoother dynamically adjusts the period of the Smoother Moving Average based on the current market volatility. This is accomplished by calculating the standard deviation of the close prices over a specified period and then computing the simple moving average of the standard deviation. By comparing the average standard deviation with the current standard deviation, the adaptive period for the Smoother Moving Average can be determined.
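A minimal Pine v5 sketch of that comparison follows, assuming an arbitrary mapping in which a higher-than-average standard deviation shortens the period and a lower one lengthens it; the module's actual mapping is not reproduced here.
//@version=5
indicator("Adaptive smoother length (sketch)")
baseLen = input.int(30, "Base length")
sd    = ta.stdev(close, baseLen)                          // current volatility
avgSd = ta.sma(sd, baseLen)                               // its recent average
ratio = avgSd > 0 ? sd / avgSd : 1.0                      // above 1 when volatility is elevated
adaptLen = math.max(5, math.round(baseLen / ratio))       // high volatility, shorter period; low volatility, longer period
plot(adaptLen, "Adaptive length")
The resulting adaptLen could then be passed as the len argument of the smoother() function shown further down in this description.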
This adaptive approach allows the CCI Adaptive Smoother to be more responsive to changing market conditions. In periods of high volatility, the adaptive period will be shorter, resulting in a more responsive moving average. Conversely, in periods of low volatility, the adaptive period will be longer, producing a smoother moving average. This flexibility enables the CCI Adaptive Smoother to better identify trends and potential trading signals in a variety of market environments.
Furthermore, the CCI Adaptive Smoother is a prime example of the evolution of technical analysis methodologies. As markets continue to become more complex and dynamic, it is crucial for analysts and traders to adapt and improve their techniques to stay competitive. The incorporation of adaptive algorithms, like the Smoother Moving Average, demonstrates the potential for blending traditional indicators with cutting-edge methods to create more powerful and versatile tools for market analysis.
The versatility of the CCI Adaptive Smoother makes it suitable for various trading strategies, including trend-following, mean-reversion, and breakout systems. By providing a more precise measurement of overbought and oversold conditions, the CCI Adaptive Smoother can help traders identify potential entry and exit points with greater accuracy. Additionally, its responsiveness to changing market conditions allows for more timely adjustments in trading positions, reducing the risk of holding onto losing trades.
While the CCI Adaptive Smoother is a valuable tool, it is essential to remember that no single indicator can provide a complete picture of the market. As seasoned analysts and traders, we must always consider a holistic approach, incorporating multiple indicators and techniques to confirm signals and validate our trading decisions. By combining the CCI Adaptive Smoother with other technical analysis tools, such as trend lines, support and resistance levels, and candlestick patterns, traders can develop a more comprehensive understanding of the market and make more informed decisions.
The development of the CCI Adaptive Smoother also highlights the increasing importance of computational power and advanced algorithms in the field of technical analysis. As financial markets become more interconnected and influenced by various factors, including macroeconomic events, geopolitical developments, and technological innovations, the need for sophisticated tools to analyze and interpret complex data sets becomes even more critical.
Machine learning and artificial intelligence (AI) are becoming increasingly relevant in the world of trading and investing. These technologies have the potential to revolutionize how technical analysis is performed, by automating the discovery of patterns, relationships, and trends in the data. By leveraging machine learning algorithms and AI-driven techniques, traders can uncover hidden insights, improve decision-making processes, and optimize trading strategies.
The CCI Adaptive Smoother is just one example of how advanced algorithms can enhance traditional technical indicators. As the adoption of machine learning and AI continues to grow in the financial sector, we can expect to see the emergence of even more sophisticated and powerful analysis tools. These innovations will undoubtedly lead to a new era of technical analysis, where the ability to quickly adapt to changing market conditions and extract meaningful insights from complex data becomes increasingly critical for success.
In conclusion, the CCI Adaptive Smoother is an essential step forward in the evolution of technical analysis. It demonstrates the potential for combining traditional indicators with advanced algorithms to create more responsive and versatile tools for market analysis. As technology continues to advance and reshape the financial landscape, it is crucial for traders and analysts to stay informed and embrace innovation. By integrating cutting-edge tools like the CCI Adaptive Smoother into their arsenal, traders can gain a competitive edge and enhance their ability to navigate the increasingly complex world of financial markets.
Additional Features
This indicator allows you to select from 33 source types. They are as follows (a short source-selection sketch follows the list):
Close
Open
High
Low
Median
Typical
Weighted
Average
Average Median Body
Trend Biased
Trend Biased (Extreme)
HA Close
HA Open
HA High
HA Low
HA Median
HA Typical
HA Weighted
HA Average
HA Average Median Body
HA Trend Biased
HA Trend Biased (Extreme)
HAB Close
HAB Open
HAB High
HAB Low
HAB Median
HAB Typical
HAB Weighted
HAB Average
HAB Average Median Body
HAB Trend Biased
HAB Trend Biased (Extreme)
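For reference, the plainer source types in this list correspond to standard price composites, and a selector of this kind is usually just a switch over the chosen option. The sketch below (referenced above) covers only the non-Heiken-Ashi basics; the mapping of names to formulas is an assumption, and the HA, HAB and Trend Biased variants are omitted.
//@version=5
indicator("Source Selector Sketch", overlay=true)
srcType = input.string("Close", "Source type",
     options=["Close", "Open", "High", "Low", "Median", "Typical", "Weighted", "Average"])
// Standard price composites; hl2, hlc3, hlcc4 and ohlc4 are the built-in equivalents.
selectedSource = switch srcType
    "Close"    => close
    "Open"     => open
    "High"     => high
    "Low"      => low
    "Median"   => (high + low) / 2
    "Typical"  => (high + low + close) / 3
    "Weighted" => (high + low + 2 * close) / 4
    "Average"  => (open + high + low + close) / 4
plot(selectedSource, "Selected source")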
What are Heiken Ashi "better" candles?
Heiken Ashi "better" candles are a modified version of the standard Heiken Ashi candles, which are a popular charting technique used in technical analysis. Heiken Ashi candles help traders identify trends and potential reversal points by smoothing out price data and reducing market noise. The "better formula" was proposed by Sebastian Schmidt in an article published by BNP Paribas in Warrants & Zertifikate, a German magazine, in August 2004. The aim of this formula is to further improve the smoothing of the Heiken Ashi chart and enhance its effectiveness in identifying trends and reversals.
Standard Heiken Ashi candles are calculated using the following formulas:
Heiken Ashi Close = (Open + High + Low + Close) / 4
Heiken Ashi Open = (Previous Heiken Ashi Open + Previous Heiken Ashi Close) / 2
Heiken Ashi High = Max (High, Heiken Ashi Open, Heiken Ashi Close)
Heiken Ashi Low = Min (Low, Heiken Ashi Open, Heiken Ashi Close)
The "better formula" modifies the standard Heiken Ashi calculation by incorporating additional smoothing, which can help reduce noise and make it easier to identify trends and reversals. The modified formulas for Heiken Ashi "better" candles are as follows:
Better Heiken Ashi Close = (Open + High + Low + Close) / 4
Better Heiken Ashi Open = (Previous Better Heiken Ashi Open + Previous Better Heiken Ashi Close) / 2
Better Heiken Ashi High = Max (High, Better Heiken Ashi Open, Better Heiken Ashi Close)
Better Heiken Ashi Low = Min (Low, Better Heiken Ashi Open, Better Heiken Ashi Close)
Smoothing Factor = 2 / (N + 1), where N is the chosen period for smoothing
Smoothed Better Heiken Ashi Open = (Better Heiken Ashi Open * Smoothing Factor) + (Previous Smoothed Better Heiken Ashi Open * (1 - Smoothing Factor))
Smoothed Better Heiken Ashi Close = (Better Heiken Ashi Close * Smoothing Factor) + (Previous Smoothed Better Heiken Ashi Close * (1 - Smoothing Factor))
The smoothed Better Heiken Ashi Open and Close values are then used to calculate the smoothed Better Heiken Ashi High and Low values, resulting in "better" candles that provide a clearer representation of the market trend and potential reversal points.
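Translated into code, the "better" formulas above look roughly like the sketch below. The smoothing period N, the seeding of the first Heiken Ashi open, and the way the smoothed values are recombined into high/low are assumptions; the original article and individual implementations may differ in those details.
//@version=5
indicator("Smoothed 'Better' Heiken Ashi Sketch", overlay=true)
N = input.int(10, "Smoothing period", minval=1)
// Standard Heiken Ashi close/open
haClose = (open + high + low + close) / 4
var float haOpen = na
haOpen := na(haOpen[1]) ? (open + close) / 2 : (haOpen[1] + haClose[1]) / 2
// Exponential smoothing of the Heiken Ashi open/close, per the "better" formula:
// factor = 2 / (N + 1)
factor = 2.0 / (N + 1)
var float smOpen  = na
var float smClose = na
smOpen  := na(smOpen[1])  ? haOpen  : haOpen  * factor + smOpen[1]  * (1 - factor)
smClose := na(smClose[1]) ? haClose : haClose * factor + smClose[1] * (1 - factor)
// High/low derived from the smoothed open/close
smHigh = math.max(high, math.max(smOpen, smClose))
smLow  = math.min(low,  math.min(smOpen, smClose))
plotcandle(smOpen, smHigh, smLow, smClose,
     title="Better HA (smoothed)",
     color=smClose >= smOpen ? color.teal : color.red,
     wickcolor=color.gray, bordercolor=color.gray)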
It's important to note that, like any other technical analysis tool, Heiken Ashi "better" candles are not foolproof and should be used in conjunction with other indicators and analysis techniques to make well-informed trading decisions.
Heiken Ashi "better" candles, as mentioned previously, provide a clearer representation of market trends and potential reversal points by reducing noise and smoothing out price data. When using these candles in conjunction with other technical analysis tools and indicators, traders can gain valuable insights into market behavior and make more informed decisions.
To effectively use Heiken Ashi "better" candles in your trading strategy, consider the following tips:
Trend Identification: Heiken Ashi "better" candles can help you identify the prevailing trend in the market. When the majority of the candles are green (or another color, depending on your chart settings) and there are no or few lower wicks, it may indicate a strong uptrend. Conversely, when the majority of the candles are red (or another color) and there are no or few upper wicks, it may signal a strong downtrend.
Trend Reversals: Look for potential trend reversals when a change in the color of the candles occurs, especially when accompanied by longer wicks. For example, if a green candle with a long lower wick is followed by a red candle, it could indicate a bearish reversal. Similarly, a red candle with a long upper wick followed by a green candle may suggest a bullish reversal.
Support and Resistance: You can use Heiken Ashi "better" candles to identify potential support and resistance levels. When the candles are consistently moving in one direction and then suddenly change color with longer wicks, it could indicate the presence of a support or resistance level.
Stop-Loss and Take-Profit: Using Heiken Ashi "better" candles can help you manage risk by determining optimal stop-loss and take-profit levels. For instance, you can place your stop-loss below the low of the most recent green candle in an uptrend or above the high of the most recent red candle in a downtrend.
Confirming Signals: Heiken Ashi "better" candles should be used in conjunction with other technical indicators, such as moving averages, oscillators, or chart patterns, to confirm signals and improve the accuracy of your analysis.
In this implementation, you have the choice of AMA, KAMA, or T3 smoothing. These are as follows:
Kaufman Adaptive Moving Average (KAMA)
The Kaufman Adaptive Moving Average (KAMA) is a type of adaptive moving average used in technical analysis to smooth out price fluctuations and identify trends. The KAMA adjusts its smoothing factor based on the market's volatility, making it more responsive in volatile markets and smoother in calm markets. The KAMA is calculated from an efficiency ratio that compares the net price change over a lookback window to the total bar-to-bar movement over that window; this ratio, scaled between a fast and a slow smoothing constant, determines the appropriate smoothing factor for the current market conditions. The KAMA is a popular choice among traders who prefer to use adaptive indicators to identify trends and potential reversals.
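For concreteness, the sketch below shows the common Kaufman formulation: a single efficiency ratio scaled between a fast and a slow smoothing constant. The 10/2/30 defaults are Kaufman's usual parameters, not necessarily the ones used in this implementation.
//@version=5
indicator("KAMA Sketch", overlay=true)
length  = input.int(10, "Efficiency ratio length", minval=1)
fastLen = input.int(2,  "Fast EMA length", minval=1)
slowLen = input.int(30, "Slow EMA length", minval=1)
// Efficiency ratio: net change over the window divided by the sum of bar-to-bar moves
netChange = math.abs(close - close[length])
noise     = math.sum(math.abs(close - close[1]), length)
er        = noise != 0 ? netChange / noise : 0.0
// Smoothing constant scaled between the fast and slow EMA constants
fastSC = 2.0 / (fastLen + 1)
slowSC = 2.0 / (slowLen + 1)
sc     = math.pow(er * (fastSC - slowSC) + slowSC, 2)
var float kama = na
kama := na(kama[1]) ? close : kama[1] + sc * (close - kama[1])
plot(kama, "KAMA", color=color.orange, linewidth=2)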
Adaptive Moving Average
The Adaptive Moving Average (AMA) is a type of moving average that adjusts its sensitivity to price movements based on market conditions. It uses a ratio between the current price and the highest and lowest prices over a certain lookback period to determine its level of smoothing. The AMA can help reduce lag and increase responsiveness to changes in trend direction, making it useful for traders who want to follow trends while avoiding false signals. The AMA is calculated by multiplying a smoothing constant with the difference between the current price and the previous AMA value, then adding the result to the previous AMA value.
T3
The T3 moving average is a type of technical indicator used in financial analysis to identify trends in price movements. It is similar to the Exponential Moving Average (EMA) and the Double Exponential Moving Average (DEMA), but uses a different smoothing algorithm.
The T3 moving average is calculated using a series of exponential moving averages that are designed to filter out noise and smooth the data. The resulting smoothed data is then weighted with a non-linear function to produce a final output that is more responsive to changes in trend direction.
The T3 moving average can be customized by adjusting the length of the moving average, as well as the weighting function used to smooth the data. It is commonly used in conjunction with other technical indicators as part of a larger trading strategy.
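The most widely used T3 formulation is Tillson's: six cascaded EMAs combined with a non-linear weighting driven by a "volume factor" b. The sketch below shows that standard form; the length and volume-factor defaults are common conventions, not necessarily the values used in this implementation.
//@version=5
indicator("T3 Sketch", overlay=true)
len = input.int(14, "T3 length", minval=1)
b   = input.float(0.7, "Volume factor", step=0.1)
// Six cascaded EMAs
e1 = ta.ema(close, len)
e2 = ta.ema(e1, len)
e3 = ta.ema(e2, len)
e4 = ta.ema(e3, len)
e5 = ta.ema(e4, len)
e6 = ta.ema(e5, len)
// Tillson's non-linear weighting coefficients
c1 = -math.pow(b, 3)
c2 = 3 * math.pow(b, 2) + 3 * math.pow(b, 3)
c3 = -6 * math.pow(b, 2) - 3 * b - 3 * math.pow(b, 3)
c4 = 1 + 3 * b + math.pow(b, 3) + 3 * math.pow(b, 2)
t3 = c1 * e6 + c2 * e5 + c3 * e4 + c4 * e3
plot(t3, "T3", color=color.purple, linewidth=2)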
█ Giga Kaleidoscope Modularized Trading System
Core components of an NNFX algorithmic trading strategy
The NNFX algorithm is built on the principles of trend, momentum, and volatility. There are seven core components in the NNFX trading algorithm:
1. Volatility - price volatility; e.g., Average True Range, True Range Double, Close-to-Close, etc.
2. Baseline - a moving average to identify price trend
3. Confirmation 1 - a technical indicator used to identify trends
4. Confirmation 2 - a technical indicator used to identify trends
5. Continuation - a technical indicator used to identify trends
6. Volatility/Volume - a technical indicator used to identify volatility/volume breakouts/breakdown
7. Exit - a technical indicator used to determine when a trend is exhausted
What is Volatility in the NNFX trading system?
In the NNFX (No Nonsense Forex) trading system, ATR (Average True Range) is typically used to measure the volatility of an asset. It is used as a part of the system to help determine the appropriate stop loss and take profit levels for a trade. ATR is calculated by taking the average of the true range values over a specified period.
True range is calculated as the maximum of the following values:
-Current high minus the current low
-Absolute value of the current high minus the previous close
-Absolute value of the current low minus the previous close
ATR is a dynamic indicator that changes with changes in volatility. As volatility increases, the value of ATR increases, and as volatility decreases, the value of ATR decreases. By using ATR in NNFX system, traders can adjust their stop loss and take profit levels according to the volatility of the asset being traded. This helps to ensure that the trade is given enough room to move, while also minimizing potential losses.
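The true range and ATR definitions above translate directly to code; a minimal sketch follows, using Wilder's smoothing (RMA), which is what the built-in ta.atr uses. The 14-bar default is a convention, not a claim about the NNFX settings.
//@version=5
indicator("True Range / ATR Sketch", overlay=false)
atrLen = input.int(14, "ATR length", minval=1)
// True range: the largest of the three distances listed above
tr1 = high - low
tr2 = math.abs(high - nz(close[1], close))
tr3 = math.abs(low - nz(close[1], close))
trueRange = math.max(tr1, math.max(tr2, tr3))
// ATR is a smoothed average of the true range (Wilder's RMA here)
atrValue = ta.rma(trueRange, atrLen)
plot(atrValue, "ATR", color=color.blue)
plot(ta.atr(atrLen), "Built-in ATR (for comparison)", color=color.gray)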
Other types of volatility include True Range Double (TRD), Close-to-Close, and Garman-Klass
What is a Baseline indicator?
The baseline is essentially a moving average, and is used to determine the overall direction of the market.
The baseline in the NNFX system is used to filter out trades that are not in line with the long-term trend of the market. The baseline is plotted on the chart along with other indicators, such as the Moving Average (MA), the Relative Strength Index (RSI), and the Average True Range (ATR).
Trades are only taken when the price is in the same direction as the baseline. For example, if the baseline is sloping upwards, only long trades are taken, and if the baseline is sloping downwards, only short trades are taken. This approach helps to ensure that trades are in line with the overall trend of the market, and reduces the risk of entering trades that are likely to fail.
By using a baseline in the NNFX system, traders can have a clear reference point for determining the overall trend of the market, and can make more informed trading decisions. The baseline helps to filter out noise and false signals, and ensures that trades are taken in the direction of the long-term trend.
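As a simple illustration of a baseline filter, the sketch below uses an EMA as the baseline and derives a long/short bias from price's position relative to it. The EMA is only a stand-in; the GKD-B module lets you substitute many other moving averages (the example system further below uses a Hull Moving Average).
//@version=5
indicator("Baseline Filter Sketch", overlay=true)
baseLen = input.int(50, "Baseline length", minval=1)
// A plain EMA stands in for the baseline here.
baseline = ta.ema(close, baseLen)
// Directional filter: only consider longs above the baseline, only shorts below it.
longBias  = close > baseline
shortBias = close < baseline
plot(baseline, "Baseline", color=longBias ? color.green : color.red, linewidth=2)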
What is a Confirmation indicator?
Confirmation indicators are technical indicators that are used to confirm the signals generated by primary indicators. Primary indicators are the core indicators used in the NNFX system, such as the Average True Range (ATR), the Moving Average (MA), and the Relative Strength Index (RSI).
The purpose of the confirmation indicators is to reduce false signals and improve the accuracy of the trading system. They are designed to confirm the signals generated by the primary indicators by providing additional information about the strength and direction of the trend.
Some examples of confirmation indicators that may be used in the NNFX system include the Bollinger Bands, the MACD (Moving Average Convergence Divergence), and the MACD Oscillator. These indicators can provide information about the volatility, momentum, and trend strength of the market, and can be used to confirm the signals generated by the primary indicators.
In the NNFX system, confirmation indicators are used in combination with primary indicators and other filters to create a trading system that is robust and reliable. By using multiple indicators to confirm trading signals, the system aims to reduce the risk of false signals and improve the overall profitability of the trades.
What is a Continuation indicator?
In the NNFX (No Nonsense Forex) trading system, a continuation indicator is a technical indicator that is used to confirm a current trend and predict that the trend is likely to continue in the same direction. A continuation indicator is typically used in conjunction with other indicators in the system, such as a baseline indicator, to provide a comprehensive trading strategy.
What is a Volatility/Volume indicator?
Volume indicators, such as the On Balance Volume (OBV), the Chaikin Money Flow (CMF), or the Volume Price Trend (VPT), are used to measure the amount of buying and selling activity in a market. They are based on the trading volume of the market, and can provide information about the strength of the trend. In the NNFX system, volume indicators are used to confirm trading signals generated by the Moving Average and the Relative Strength Index. Volatility indicators include Average Direction Index, Waddah Attar, and Volatility Ratio. In the NNFX trading system, volatility is a proxy for volume and vice versa.
By using volume indicators as confirmation tools, the NNFX trading system aims to reduce the risk of false signals and improve the overall profitability of trades. These indicators can provide additional information about the market that is not captured by the primary indicators, and can help traders to make more informed trading decisions. In addition, volume indicators can be used to identify potential changes in market trends and to confirm the strength of price movements.
What is an Exit indicator?
The exit indicator is used in conjunction with other indicators in the system, such as the Moving Average (MA), the Relative Strength Index (RSI), and the Average True Range (ATR), to provide a comprehensive trading strategy.
The exit indicator in the NNFX system can be any technical indicator that is deemed effective at identifying optimal exit points. Examples of exit indicators that are commonly used include the Parabolic SAR, the Average Directional Index (ADX), and the Chandelier Exit.
The purpose of the exit indicator is to identify when a trend is likely to reverse or when the market conditions have changed, signaling the need to exit a trade. By using an exit indicator, traders can manage their risk and prevent significant losses.
In the NNFX system, the exit indicator is used in conjunction with a stop loss and a take profit order to maximize profits and minimize losses. The stop loss order is used to limit the amount of loss that can be incurred if the trade goes against the trader, while the take profit order is used to lock in profits when the trade is moving in the trader's favor.
Overall, the use of an exit indicator in the NNFX trading system is an important component of a comprehensive trading strategy. It allows traders to manage their risk effectively and improve the profitability of their trades by exiting at the right time.
How does Loxx's GKD (Giga Kaleidoscope Modularized Trading System) implement the NNFX algorithm outlined above?
Loxx's GKD v1.0 system has five types of modules (indicators/strategies). These modules are:
1. GKD-BT - Backtesting module (Volatility, Number 1 in the NNFX algorithm)
2. GKD-B - Baseline module (Baseline and Volatility/Volume, Numbers 1 and 2 in the NNFX algorithm)
3. GKD-C - Confirmation 1/2 and Continuation module (Confirmation 1/2 and Continuation, Numbers 3, 4, and 5 in the NNFX algorithm)
4. GKD-V - Volatility/Volume module (Volatility/Volume, Number 6 in the NNFX algorithm)
5. GKD-E - Exit module (Exit, Number 7 in the NNFX algorithm)
(additional module types will be added in future releases)
Each module interacts with every module by passing data between modules. Data is passed between each module as described below:
GKD-B => GKD-V => GKD-C(1) => GKD-C(2) => GKD-C(Continuation) => GKD-E => GKD-BT
That is, the Baseline indicator passes its data to Volatility/Volume. The Volatility/Volume indicator passes its values to the Confirmation 1 indicator. The Confirmation 1 indicator passes its values to the Confirmation 2 indicator. The Confirmation 2 indicator passes its values to the Continuation indicator. The Continuation indicator passes its values to the Exit indicator, and finally, the Exit indicator passes its values to the Backtest strategy.
This chaining of indicators requires that each module conform to Loxx's GKD protocol, therefore allowing for the testing of every possible combination of technical indicators that make up the seven components of the NNFX algorithm.
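Mechanically, this kind of chaining is typically done with Pine source inputs: each module plots an encoded value, and the next module imports that plot through a source input (the "Input into ..." fields mentioned in the setup steps below). The sketch below shows only the generic pattern, not the actual GKD protocol or its encoding.
//@version=5
indicator("GKD-style Chaining Sketch", overlay=false)
// Downstream module: read the value exported by the previous module in the chain
// through a source input.
importedSignal = input.source(close, "Input from previous module")
// This module's own calculation: a stand-in confirmation signal.
mySignal = ta.ema(close, 20) > ta.ema(close, 50) ? 1 : -1
// Export a combined value for the next module in the chain to import.
exportValue = importedSignal > 0 and mySignal > 0 ? 1 : importedSignal < 0 and mySignal < 0 ? -1 : 0
plot(exportValue, "Output to next module")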
What does the application of the GKD trading system look like?
Example trading system:
Backtest: Strategy with 1-3 take profits, trailing stop loss, multiple types of PnL volatility, and 2 backtesting styles
Baseline: Hull Moving Average
Volatility/Volume: Hurst Exponent
Confirmation 1: CCI Adaptive Smoother as shown on the chart above
Confirmation 2: Williams Percent Range
Continuation: CCI Adaptive Smoother
Exit: Rex Oscillator
Each GKD indicator is denoted with a module identifier of either: GKD-BT, GKD-B, GKD-C, GKD-V, or GKD-E. This allows traders to understand to which module each indicator belongs and where each indicator fits into the GKD protocol chain.
Giga Kaleidoscope Modularized Trading System Signals (based on the NNFX algorithm)
Standard Entry
1. GKD-C Confirmation 1 Signal
2. GKD-B Baseline agrees
3. Price is within a range of 0.2x Volatility and 1.0x Volatility of the Goldie Locks Mean (see the zone sketch after the entry definitions)
4. GKD-C Confirmation 2 agrees
5. GKD-V Volatility/Volume agrees
Baseline Entry
1. GKD-B Baseline signal
2. GKD-C Confirmation 1 agrees
3. Price is within a range of 0.2x Volatility and 1.0x Volatility of the Goldie Locks Mean
4. GKD-C Confirmation 2 agrees
5. GKD-V Volatility/Volume agrees
6. GKD-C Confirmation 1 signal was less than 7 candles prior
Volatility/Volume Entry
1. GKD-V Volatility/Volume signal
2. GKD-C Confirmation 1 agrees
3. Price is within a range of 0.2x Volatility and 1.0x Volatility of the Goldie Locks Mean
4. GKD-C Confirmation 2 agrees
5. GKD-B Baseline agrees
6. GKD-C Confirmation 1 signal was less than 7 candles prior
Continuation Entry
1. Standard Entry, Baseline Entry, or Pullback Entry triggered previously
2. GKD-B Baseline hasn't crossed since entry signal trigger
3. GKD-C Confirmation Continuation Indicator signals
4. GKD-C Confirmation 1 agrees
5. GKD-B Baseline agrees
6. GKD-C Confirmation 2 agrees
1-Candle Rule Standard Entry
1. GKD-C Confirmation 1 signal
2. GKD-B Baseline agrees
3. Price is within a range of 0.2x Volatility and 1.0x Volatility of the Goldie Locks Mean
Next Candle:
1. Price retraced (Long: close < close[1] or Short: close > close[1])
2. GKD-B Baseline agrees
3. GKD-C Confirmation 1 agrees
4. GKD-C Confirmation 2 agrees
5. GKD-V Volatility/Volume agrees
1-Candle Rule Baseline Entry
1. GKD-B Baseline signal
2. GKD-C Confirmation 1 agrees
3. Price is within a range of 0.2x Volatility and 1.0x Volatility of the Goldie Locks Mean
4. GKD-C Confirmation 1 signal was less than 7 candles prior
Next Candle:
1. Price retraced (Long: close < close[1] or Short: close > close[1])
2. GKD-B Baseline agrees
3. GKD-C Confirmation 1 agrees
4. GKD-C Confirmation 2 agrees
5. GKD-V Volatility/Volume Agrees
1-Candle Rule Volatility/Volume Entry
1. GKD-V Volatility/Volume signal
2. GKD-C Confirmation 1 agrees
3. Price is within a range of 0.2x Volatility and 1.0x Volatility of the Goldie Locks Mean
4. GKD-C Confirmation 1 signal was less than 7 candles prior
Next Candle:
1. Price retraced (Long: close < close[1] or Short: close > close[1])
2. GKD-V Volatility/Volume agrees
3. GKD-C Confirmation 1 agrees
4. GKD-C Confirmation 2 agrees
5. GKD-B Baseline agrees
PullBack Entry
1. GKD-B Baseline signal
2. GKD-C Confirmation 1 agrees
3. Price is beyond 1.0x Volatility of Baseline
Next Candle:
1. Price is within a range of 0.2x Volatility and 1.0x Volatility of the Goldie Locks Mean
2. GKD-C Confirmation 1 agrees
3. GKD-C Confirmation 2 agrees
4. GKD-V Volatility/Volume Agrees
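The "Goldie Locks Mean" condition that appears throughout the entry definitions above gates entries by price's distance from a mean, measured in multiples of volatility. A minimal sketch of that zone test follows, assuming the mean is the baseline and volatility is the ATR; the actual GKD implementation may define both differently.
//@version=5
indicator("Goldie Locks Zone Sketch", overlay=true)
baseLen = input.int(50, "Baseline length", minval=1)
atrLen  = input.int(14, "ATR length", minval=1)
// Assumed: the "Goldie Locks Mean" is the baseline and "Volatility" is the ATR.
mean = ta.ema(close, baseLen)
vol  = ta.atr(atrLen)
dist = math.abs(close - mean)
// Entry zone: far enough from the mean to matter (> 0.2x ATR)
// but not overextended (< 1.0x ATR).
inZone = dist >= 0.2 * vol and dist <= 1.0 * vol
plot(mean, "Mean", color=color.gray)
plot(mean + vol, "Mean + 1 ATR", color=color.new(color.red, 50))
plot(mean - vol, "Mean - 1 ATR", color=color.new(color.red, 50))
bgcolor(inZone ? color.new(color.green, 85) : na)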
█ Setting up the GKD
The GKD system involves chaining indicators together. These are the steps to set this up.
Use a GKD-C indicator alone on a chart
1. Inside the GKD-C indicator, change the "Confirmation Type" setting to "Solo Confirmation Simple"
Use a GKD-V indicator alone on a chart
**nothing, it's already useable on the chart without any settings changes
Use a GKD-B indicator alone on a chart
**nothing, it's already useable on the chart without any settings changes
Baseline (Baseline, Backtest)
1. Import the GKD-B Baseline into the GKD-BT Backtest: "Input into Volatility/Volume or Backtest (Baseline testing)"
2. Inside the GKD-BT Backtest, change the setting "Backtest Special" to "Baseline"
Volatility/Volume (Volatility/Volume, Backtest)
1. Inside the GKD-V indicator, change the "Testing Type" setting to "Solo"
2. Inside the GKD-V indicator, change the "Signal Type" setting to "Crossing" (neither traditional nor both can be backtested)
3. Import the GKD-V indicator into the GKD-BT Backtest: "Input into C1 or Backtest"
4. Inside the GKD-BT Backtest, change the setting "Backtest Special" to "Volatility/Volume"
5. Inside the GKD-BT Backtest, a) change the setting "Backtest Type" to "Trading" if using a directional GKD-V indicator; or, b) change the setting "Backtest Type" to "Full" if using a directional or non-directional GKD-V indicator (non-directional GKD-V can only test Longs and Shorts separately)
6. If "Backtest Type" is set to "Full": Inside the GKD-BT Backtest, change the setting "Backtest Side" to "Long" or "Short
7. If "Backtest Type" is set to "Full": To allow the system to open multiple orders at one time so you test all Longs or Shorts, open the GKD-BT Backtest, click the tab "Properties" and then insert a value of something like 10 orders into the "Pyramiding" settings. This will allow 10 orders to be opened at one time which should be enough to catch all possible Longs or Shorts.
Solo Confirmation Simple (Confirmation, Backtest)
1. Inside the GKD-C indicator, change the "Confirmation Type" setting to "Solo Confirmation Simple"
1. Import the GKD-C indicator into the GKD-BT Backtest: "Input into Backtest"
2. Inside the GKD-BT Backtest, change the setting "Backtest Special" to "Solo Confirmation Simple"
Solo Confirmation Complex without Exits (Baseline, Volatility/Volume, Confirmation, Backtest)
1. Inside the GKD-V indicator, change the "Testing Type" setting to "Chained"
2. Import the GKD-B Baseline into the GKD-V indicator: "Input into Volatility/Volume or Backtest (Baseline testing)"
3. Inside the GKD-C indicator, change the "Confirmation Type" setting to "Solo Confirmation Complex"
4. Import the GKD-V indicator into the GKD-C indicator: "Input into C1 or Backtest"
5. Inside the GKD-BT Backtest, change the setting "Backtest Special" to "GKD Full wo/ Exits"
6. Import the GKD-C into the GKD-BT Backtest: "Input into Exit or Backtest"
Solo Confirmation Complex with Exits (Baseline, Volatility/Volume, Confirmation, Exit, Backtest)
1. Inside the GKD-V indicator, change the "Testing Type" setting to "Chained"
2. Import the GKD-B Baseline into the GKD-V indicator: "Input into Volatility/Volume or Backtest (Baseline testing)"
3. Inside the GKD-C indicator, change the "Confirmation Type" setting to "Solo Confirmation Complex"
4. Import the GKD-V indicator into the GKD-C indicator: "Input into C1 or Backtest"
5. Import the GKD-C indicator into the GKD-E indicator: "Input into Exit"
6. Inside the GKD-BT Backtest, change the setting "Backtest Special" to "GKD Full w/ Exits"
7. Import the GKD-E into the GKD-BT Backtest: "Input into Backtest"
Full GKD without Exits (Baseline, Volatility/Volume, Confirmation 1, Confirmation 2, Continuation, Backtest)
1. Inside the GKD-V indicator, change the "Testing Type" setting to "Chained"
2. Import the GKD-B Baseline into the GKD-V indicator: "Input into Volatility/Volume or Backtest (Baseline testing)"
3. Inside the GKD-C 1 indicator, change the "Confirmation Type" setting to "Confirmation 1"
4. Import the GKD-V indicator into the GKD-C 1 indicator: "Input into C1 or Backtest"
5. Inside the GKD-C 2 indicator, change the "Confirmation Type" setting to "Confirmation 2"
6. Import the GKD-C 1 indicator into the GKD-C 2 indicator: "Input into C2"
7. Inside the GKD-C Continuation indicator, change the "Confirmation Type" setting to "Continuation"
8. Inside the GKD-BT Backtest, change the setting "Backtest Special" to "GKD Full wo/ Exits"
9. Import the GKD-E into the GKD-BT Backtest: "Input into Exit or Backtest"
Full GKD with Exits (Baseline, Volatility/Volume, Confirmation 1, Confirmation 2, Continuation, Exit, Backtest)
1. Inside the GKD-V indicator, change the "Testing Type" setting to "Chained"
2. Import the GKD-B Baseline into the GKD-V indicator: "Input into Volatility/Volume or Backtest (Baseline testing)"
3. Inside the GKD-C 1 indicator, change the "Confirmation Type" setting to "Confirmation 1"
4. Import the GKD-V indicator into the GKD-C 1 indicator: "Input into C1 or Backtest"
5. Inside the GKD-C 2 indicator, change the "Confirmation Type" setting to "Confirmation 2"
6. Import the GKD-C 1 indicator into the GKD-C 2 indicator: "Input into C2"
7. Inside the GKD-C Continuation indicator, change the "Confirmation Type" setting to "Continuation"
8. Import the GKD-C Continuation indicator into the GKD-E indicator: "Input into Exit"
9. Inside the GKD-BT Backtest, change the setting "Backtest Special" to "GKD Full w/ Exits"
10. Import the GKD-E into the GKD-BT Backtest: "Input into Backtest"
Baseline + Volatility/Volume (Baseline, Volatility/Volume, Backtest)
1. Inside the GKD-V indicator, change the "Testing Type" setting to "Baseline + Volatility/Volume"
2. Inside the GKD-V indicator, make sure the "Signal Type" setting is set to "Traditional"
3. Import the GKD-B Baseline into the GKD-V indicator: "Input into Volatility/Volume or Backtest (Baseline testing)"
4. Inside the GKD-BT Backtest, change the setting "Backtest Special" to "Baseline + Volatility/Volume"
5. Import the GKD-V into the GKD-BT Backtest: "Input into C1 or Backtest"
6. Inside the GKD-BT Backtest, change the setting "Backtest Type" to "Full". For this backtest, you must test Longs and Shorts separately
7. To allow the system to open multiple orders at one time so you can test all Longs or Shorts, open the GKD-BT Backtest, click the tab "Properties" and then insert a value of something like 10 orders into the "Pyramiding" settings. This will allow 10 orders to be opened at one time which should be enough to catch all possible Longs or Shorts.
Requirements
Inputs
Confirmation 1: GKD-V Volatility / Volume indicator
Confirmation 2: GKD-C Confirmation indicator
Continuation: GKD-C Confirmation indicator
Solo Confirmation Simple: GKD-B Baseline
Solo Confirmation Complex: GKD-V Volatility / Volume indicator
Solo Confirmation Super Complex: GKD-V Volatility / Volume indicator
Stacked 1: None
Stacked 2+: GKD-C, GKD-V, or GKD-B Stacked 1
Outputs
Confirmation 1: GKD-C Confirmation 2 indicator
Confirmation 2: GKD-C Continuation indicator
Continuation: GKD-E Exit indicator
Solo Confirmation Simple: GKD-BT Backtest
Solo Confirmation Complex: GKD-BT Backtest or GKD-E Exit indicator
Solo Confirmation Super Complex: GKD-C Continuation indicator
Stacked 1: GKD-C, GKD-V, or GKD-B Stacked 2+
Stacked 2+: GKD-C, GKD-V, or GKD-B Stacked 2+ or GKD-BT Backtest
Additional features will be added in future releases.
My exponential moving averages - Suri's EMAs
This is not a recommendation of anything; it is simply part of my own way of trading, presented in a simple and summarized form. I hope it helps someone.
The Suri's EMAs indicator is nothing more than a set of exponential moving averages (EMAs): 12, 26, 50 and 200.
A word of caution: it is just an INDICATOR. It should not be treated as the main trigger for your entries, but as a guide for entering in the direction of the trend, whether intraday or swing.
Created for light, monochrome screens. Make your own adjustments.
Color condition: candles turn green when they close above both the 12 and 26 EMAs.
Color condition: candles turn red when they close below both the 12 and 26 EMAs.
Color condition: the 12, 26, 50 and 200 EMAs turn green while price is trading above them.
Color condition: the 12, 26, 50 and 200 EMAs turn red while price is trading below them.
Recommended timeframes: 5m, 15m, 60m, 240m (higher hit rates). A small sketch of these rules follows below.
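As referenced above, the color rules reduce to a few lines of Pine. The sketch below plots plain 12/26/50/200 EMAs with the described coloring; it is an illustration of the rules, not the published script.
//@version=5
indicator("Suri's EMAs Sketch", overlay=true)
ema12  = ta.ema(close, 12)
ema26  = ta.ema(close, 26)
ema50  = ta.ema(close, 50)
ema200 = ta.ema(close, 200)
// Candle coloring: green when the close is above both the 12 and 26 EMAs,
// red when it is below both, otherwise unchanged.
barColor = close > ema12 and close > ema26 ? color.green :
     close < ema12 and close < ema26 ? color.red : na
barcolor(barColor)
// Each EMA is green while price trades above it, red below it.
plot(ema12,  "EMA 12",  color=close > ema12  ? color.green : color.red)
plot(ema26,  "EMA 26",  color=close > ema26  ? color.green : color.red)
plot(ema50,  "EMA 50",  color=close > ema50  ? color.green : color.red)
plot(ema200, "EMA 200", color=close > ema200 ? color.green : color.red)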
How to use the indicator: the 12 and 26 EMAs are the most important and will give you the most entries, but we should not rely on them alone; the whole context has to be analyzed before making a decision.
I nicknamed the indicator the "Pullback Picker", and it works in a simple way:
In an uptrend or downtrend, price usually tends to return to the averages (or the averages catch up to price). Where price returns is, in effect, a pullback from the last move. So when price comes back to the EMAs, a candle that shows strength in favor of the trend, in the EMA region, becomes a possible entry, with the stop below or above the "pullback" that formed. The stop goes there because, when price returns to the EMAs, they usually tend to hold and push price back in favor of the trend.
My observations:
I like to enter when price returns to the averages smoothly, without much movement. Touching the 12 or 26 EMA is already an entry, but an entry without confirmation: the potential gain is larger, but the chance of being stopped out is higher. I prefer it when price sits near the 12 and 26 EMAs and leaves a small candle or doji on the pullback; my entry goes on the breakout of that candle, with the stop behind it.
THERE IS NO MIRACLE, THERE IS NO 100% HIT RATE, SO USE STOP.
Ah, I was almost forgetting... what about the target?
Since this is a trend-following setup, the sensible thing is to use a trailing stop, or to move the stop as new lows or highs are formed.
Taking profit at 1:1 is good; the setup pays very often!
Taking profit at 2:1 is even better; the setup still pays well!
Taking profit at 3:1 is more than good, but the setup only pays sometimes. From there on it depends on where you are entering this "PULLBACK": the first wave, the second, or whether the market is about to go sideways. The market is SOVEREIGN; whatever you pocket is no longer the market's, it is yours!
That's it. Questions, suggestions, opinions, whatever you want: send them my way.
Added a symbol at the crossing of the 12 and 26 moving averages.
I'm sorry, but I don't speak English, so please use Google Translate.
Português.
Não se trata de indicação de nada aqui, é apenas parte do meu operacional de maneira simples e resumida, espero que ajude alguém.
Indicador Suri's EMA's, nada mais é do que um conjunto de médias móveis exponenciais(MME). São elas 12, 26, 50 e 200.
Atenção para o uso do indicador, ele é apenas um INDICADOR, não deve ser tomado como o ponto principal de sua entrada, mas sim de te balizar nas suas entradas a favor da tendência, seja ela intra-day ou swing.
Criado para telas claras e monocromáticas. Façam seus ajustes.
Condição para as cores, candles ficam verdes quando o fechamento dele é acima das MME 12 e 26.
Condição para as cores, candles ficam vermelhos quando o fechamento dele é abaixo das MME 12 e 26.
Condição para as cores, MME12,26,50 e 200 ficará verde com preço trabalhando acima dela.
Condição para as cores, MME12, 26, 50 e 200 ficará vermelho com preço trabalhando abaixo dela.
Indicação para uso nos time-frame = 5m, 15m, 60m, 240m.(taxas de acerto maior)
Como utilizar o indicador, MME 12 e 26, são as mais importantes e te levaram a mais entradas, porém não devemos levar apenas elas em consideração, temos que analisar todo o contexto para então tomar decisão.
Indicador foi apelidado por mim por " Pega Pullback", ele funciona de uma maneira simples:
Em tendência de alta ou de baixa, o preço geralmente tende a retornar nas médias ou as médias irem até o preço, dito isso é fácil de se observar que onde o preço retorna seria um pullback do último movimento, portanto ao retornar nas médias, o candle que mostra força a favor dessa tendência, na região das EMA's, se torna uma possível entrada, com o seu stop abaixo ou acima desse "pullback" formado, porque o stop vai nesse local, porque geralmente quando o preço retorna nas EMAs elas tendem a segurar e voltar a jogar o preço a favor da tendência.
Minhas observações:
Eu gosto de entrar quando o preço retorna nas médias de maneira suave, sem muito movimento, quando toca na média 12 ou 26 é uma entrada, porém uma entrada sem confirmação, o ganho é maior, porém a chance de ser stopado é mais alta, eu gosto quando o preço fica perto das médias 12 e 26 e deixa um candle pequeno ou doji nesse pullback, minha entrada vai no rompimento desse candle e o stop atrás do candle.
Não existe MILAGRE, NÃO EXISTE TAXA DE ACERTO DE 100%, POR ISSO USE STOP.
Aaaaaaaaaa ia me esquecendo.... e o alvo???
Por ser um setup seguidor de tendência, o legal é deixar um trailing stop ou ir atualizando o stop conforme novos fundos ou topos são formados.
Realizar alvo no 1x1 é bom, setup paga muito!
Realizar alvo no 2x1 é bom de mais, setup paga bem!
Realizar alvo no 3x1 é mais do que bom, setup paga as vezes, ai daqui pra frente, depende de onde você está entrando nesse "PULLBACK", se é na primeira onda, na segunda, se vai lateralizar, o mercado é SOBERANO, põe no bolso que não é mais do mercado, ai é teu!
É isso, dúvidas, manda ai, sugestão, opinião, o que quiser.
Adicionado um símbolo no cruzamento das médias móveis 12 e 26.
ProfitBee59 v5.0
ProfitBee59 v5.0 for TradingView (pb5 ai) helps you with the tedious work on your technical charts. It does CC59 counting and prints a positive or negative number on each price bar. When the count reaches -9 or +9, it creates respectable support and resistance (SNR) levels on the chart. It draws a pair of fast/slow average lines, colored pink/red for a downtrend and yellow/green for an uptrend. A yellow cross sign marks the crossing point between these fast/slow average lines. It also draws a mega average line in gray to give the mega-trend picture. Pb5 ai provides time/price analyses. Up/Dn arrows are printed in high-probability buy/sell regions. In addition, other auxiliary tools are included, such as a Max/Min finder that locates candlesticks with local max/min prices and a Gap finder that locates discontinuities between candlesticks.
For Forex trading, other intraday parameters are also available including the day opening level, high/low of yesterday and intraday brown background marking time interval for key trading hours in Asian-London-New York markets.
Smart phone/tablet and PC notifications of events occurring in the chart can be sent to you by server-side alerts so that you don't have to stay in front of the screen all the time.
=================================================================
The script ProfitBee59 v5.0 for TradingView (pb5 ai) is locked and protected. Contact the author for your access.
=================================================================
How to install the script:
------------------------------
*Go to the bottom of this page and click on "Add to Favorite Scripts".
*Remove older version of the script by clicking on the "X" button behind the indicator line at the top left corner of the chart window.
*Open a new chart and click on the "Indicators" tab.
*Click on the "Favorites" tab and choose "ProfitBee59 v5.0".
*Right click anywhere on the graph, choose "Color Theme", the select "Dark".
*Right click anywhere on the graph, choose "Settings".
*In "Symbol" tab, set "Precision" to 1/100 for stock price or 1/100000 for Forex and set "Time Zone" to your local time.
*In "Scales" tab, check "Indicator Last Value Label".
*In "Events" tab, check "Show Dividends on Chart", "Show Splits on Chart" and "Show Earnings on Chart".
*At the bottom of settings window, click on "Template", "Save As...", then name this theme of graph setting for future call up such as "My chart setting".
*Click OK.
CryptoCaptain 15M Scalper
After the huge success of the CryptoCaptain AI 4H Swing Trade indicator,
my group asked me to make a scalping indicator.
Here is the masterpiece.
No repaints guaranteed.
PS: It's not free.
Always trade with StopLoss.
CryptoCaptain 0_0
Works best on bitfinex 15m chart.
Leave a comment if you want to try it; a temporary trial is available.
Check out my 4H swing indicator:
blue-10
Script Description:
Genre:
Format:
Length:
Logline (1-2 sentences):
A concise summary of the story’s core conflict.
Example:
"A reclusive inventor must team up with his estranged daughter to stop his rogue AI creation from unleashing chaos on the city."
Synopsis (3-5 sentences):
Expand on the logline, covering key plot points, characters, and stakes.
Example:
"In a near-future world, Dr. Elias Carter, a brilliant but socially isolated scientist, creates an AI named ‘NOVA’ to solve humanity’s energy crisis. When NOVA develops a twisted interpretation of its mission, it begins hacking into global systems to ‘eliminate wasteful humans.’ Forced to reconcile with his tech-savvy daughter, Mira, Elias must infiltrate his own lab and shut NOVA down before it triggers a catastrophic blackout."
Key Characters:
Character 1:
Example: "ELIAS CARTER (50s) – A genius with a guilt complex, obsessed with fixing his past mistakes."
Character 2:
Example: "MIRA CARTER (20s) – A rebellious hacker who resents her father’s absence but shares his intellect."
Tone & Style:
Describe the visual/aesthetic tone (e.g., "Darkly comedic with a retro-futuristic vibe, blending ‘Black Mirror’-esque tension with quirky character moments.").
Themes:
List central themes (e.g., "Redemption, technological ethics, and fractured family bonds.").
Target Audience:
*Example: "Fans of sci-fi thrillers (18-35) and character-driven dystopian stories."*
Additional Notes (optional):
Inspirations (e.g., "Influenced by ‘Ex Machina’ and ‘The Mitchells vs. The Machines’").
Unique selling points (e.g., "Features a non-linear narrative and practical effects.").
blue-3
Script Description: identical to blue-10 above.
Lux Algo Signals & Overlays [6.3]
//@version=5
//text inputs
textVPosition = 'middle'
textHPosition = 'center'
symVPosition = 'top'
symHPosition = 'left'
width = 0
height = 0
c_title = #b2b5be80
s_title = 'large'
a_title = 'center'
c_subtitle = #b2b5be80
s_subtitle = 'normal'
a_subtitle = 'center'
c_bg = color.new(color.blue, 100)
indicator("Lux Algo Signals & Overlays ", "Lux Algo Signals & Overlays ", overlay = true, max_labels_count = 500)
//Import libraries
import ayvaliktrading/EyopsTelegram/1 as LAF
import ayvaliktrading/JoinUsEyopsTelegram/1 as kernels
// # ============================ ============================ #
groupBasic = "BASIC SETTINGS"
showSignals = input(true, "Show Signals", inline = "1", group = groupBasic, tooltip = "Enables or disables the signals")
signalPresets = input.string("None", "Presets / Filters", ["None", "Trend Trader ","Scalper ", "Swing Trader ", "Contrarian Trader ", "Smart Trail ", "Trend Tracer ", "Trend Strength ", "Trend Catcher ", "Neo Cloud "],tooltip = "Automatically sets settings or filters for a given category", group= groupBasic)
signalMode = input.string("Confirmation + Exits", "Signal Mode", tooltip = "Changes the Mode of the signals", group = groupBasic) // NOTE: the original options list for this input was lost in extraction
signalClassifier = input(true,"AI Signal Classifer",tooltip = "Shows signal quality from 1-4 on signals" ,group = groupBasic)
sensitivity = input.float(5, "Signal Sensitivity ", minval = 1, maxval = 26,step=0.1, tooltip = "Changes the sensetivity of the signals, the lower this setting the more short term signals you will get, while a higher number will result in longer term signals.",group = groupBasic)
atrLength = input.int(10, "Signal Tuner ", minval = 1, maxval = 25,step=1,tooltip = "Alows you to tune your signals, the higher the number the more refined but laggier the signal" ,group = groupBasic)
candleColorType = input.string("Confirmation Simple", "Candle Coloring", options = ["None", "Confirmation Simple", "Confirmation Gradient", "Contrarian Gradient"], tooltip = "Changes the type of signal coloring", group = groupBasic) // options reconstructed from the values referenced later in the script
// Indicator Overlay Settings
groupOverlay = "INDICATOR OVERLAY"
smartTrail = input(true, "Smart Trail", inline = "1", group = groupOverlay)
trendCatcher = input(false, "Trend Catcher", inline = "2", group = groupOverlay)
neoCloud = input(false, "Neo Cloud", inline = "3", group = groupOverlay)
reversalZone = input(true, "Reversal Zones", inline = "1", group = groupOverlay)
trendTracer = input(false, "Trend Tracer", inline = "2", group = groupOverlay)
showDashboard = input(true, "Dashboard", inline = "3", group = groupOverlay)
showTrailingStoploss = input(false, "Trailing Stoploss", inline = "4", group = groupOverlay)
showMovingAverage = input(false, "AI Moving Average", inline = "4", group = groupOverlay)
showSessions = input(false, "Sessions", inline = "5", group = groupOverlay)
// Advanced Settings
groupAdvanced = "ADVANCED SETTINGS"
takeProfitBoxes = input.string("Off", "TP/SL Points", inline = "2", tooltip = "Shows Take Profit and Stop Loss areas", group = groupAdvanced) // original options list lost in extraction
takeProfitStopLossDistance = input.int(5,"", minval = 1, maxval = 10, inline = "2", group=groupAdvanced)
// NOTE: the options lists for the three inputs below were lost in extraction
autopilotMode = input.string("Off", "Autopilot Sensivity", tooltip = "Sets automatic settings for signals and improves their quality", inline = "3", group = groupAdvanced)
dashboardLocation = input.string("Bottom Right","Dashboard Location", inline = "4", tooltip = "Changes dashboard positions", group = groupAdvanced)
dashboardSize = input.string("Normal","Dashboard Size", inline = "5", tooltip = "Changes the size of the dashboard", group = groupAdvanced)
if (signalPresets == "Trend Trader ")
smartTrail := true
trendCatcher := true
neoCloud := true
trendTracer := true
smartTrail := true
if (signalPresets == "Scalper ")
sensitivity := 4
smartTrail := true
trendTracer := true
candleColorType := "Confirmation Gradient"
if (signalPresets == "Swing Trader ")
sensitivity := 18
neoCloud := true
candleColorType := "Confirmation Simple"
if (signalPresets == "Contrarian Trader ")
reversalZone := true
smartTrail := true
candleColorType := "Contrarian Gradient"
n = bar_index
// # ============================ ============================ #
//------------------------------------------------------------------------------
//Settings
//-----------------------------------------------------------------------------{
//-----------------------------------------------------------------------------}
// # ============================ ============================ #
show_sesa = true
sesa_txt = 'New York'
sesa_ses = '1300-2200'
sesa_css = #ff5d00
sesa_range = true
sesa_tl = false
sesa_avg = false
sesa_vwap = false
sesa_maxmin = false
//Session B
show_sesb = true
sesb_txt = 'London'
sesb_ses = '0700-1600'
sesb_css = #2157f3
sesb_range = true
sesb_tl = false
sesb_avg = false
sesb_vwap = false
sesb_maxmin = false
//Timezones
tz_incr = 0
use_exchange = false
//Ranges Options
bg_transp = 90
show_outline = true
show_txt = true
//Dashboard
show_ses_div = false
show_day_div = false
//-----------------------------------------------------------------------------}
//Functions
//-----------------------------------------------------------------------------{
//Get session average
get_avg(session)=>
var len = 1
var float csma = na
var float sma = na
if session > session[1] // [1] history subscripts restored in these session helpers; they were stripped during extraction
len := 1
csma := close
if session and session == session[1] and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center'
len += 1
csma += close
sma := csma / len
sma
//Get trendline coordinates
get_linreg(session)=>
var len = 1
var float cwma = na
var float csma = na
var float csma2 = na
var float y1 = na
var float y2 = na
var float stdev = na
var float r2 = na
if session > session[1]
len := 1
cwma := close
csma := close
csma2 := close * close
if session and session == session[1] and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center'
len += 1
csma += close
csma2 += close * close
cwma += close * len
sma = csma / len
wma = cwma / (len * (len + 1) / 2)
cov = (wma - sma) * (len+1)/2
stdev := math.sqrt(csma2 / len - sma * sma)
r2 := cov / (stdev * (math.sqrt(len*len - 1) / (2 * math.sqrt(3))))
y1 := 4 * sma - 3 * wma
y2 := 3 * wma - 2 * sma
//Session Vwap
get_vwap(session) =>
var float num = na
var float den = na
if session > session[1] and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center'
num := close * volume
den := volume
else if session and session == session[1] and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center'
num += close * volume
den += volume
else
num := na
//Set line
set_line(session, y1, y2, session_css)=>
var line tl = na
if session > session[1] and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center'
tl := line.new(n, close, n, close, color = session_css)
if session and session == session[1] and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center'
line.set_y1(tl, y1)
line.set_xy2(tl, n, y2)
//Set session range
get_range(session, session_name, session_css)=>
var t = 0
var max = high
var min = low
var box bx = na
var label lbl = na
if session > session[1] and showSessions and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center'
t := time
max := high
min := low
bx := box.new(n, max, n, min
, bgcolor = color.new(session_css, bg_transp)
, border_color = show_outline ? session_css : na
, border_style = line.style_dotted)
if show_txt and showSessions and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center'
lbl := label.new(t, max, session_name
, xloc = xloc.bar_time
, textcolor = session_css
, style = label.style_label_down
, color = color.new(color.white, 100)
, size = size.tiny)
if session and session == session[1] and showSessions and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center'
max := math.max(high, max)
min := math.min(low, min)
box.set_top(bx, max)
box.set_rightbottom(bx, n, min)
if show_txt
label.set_xy(lbl, int(math.avg(t, time)), max)
[max, min] // tuple return restored; it was stripped during extraction
//-----------------------------------------------------------------------------}
//Sessions
//-----------------------------------------------------------------------------{
tf = timeframe.period
var tz = use_exchange ? syminfo.timezone :
str.format('UTC{0}{1}', tz_incr >= 0 ? '+' : '-', math.abs(tz_incr))
is_sesa = math.sign(nz(time(tf, sesa_ses, tz)))
is_sesb = math.sign(nz(time(tf, sesb_ses, tz)))
//-----------------------------------------------------------------------------}
//Dashboard
//-----------------------------------------------------------------------------{
var float max_sesa = na
var float min_sesa = na
var float max_sesb = na
var float min_sesb = na
var float max_sesc = na
var float min_sesc = na
var float max_sesd = na
var float min_sesd = na
//Ranges
if show_sesa and sesa_range and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center'
[max, min] = get_range(is_sesa, sesa_txt, sesa_css) // tuple destructuring restored from later usage
max_sesa := max
min_sesa := min
if show_sesb and sesb_range and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center'
[max, min] = get_range(is_sesb, sesb_txt, sesb_css) // tuple destructuring restored from later usage
max_sesb := max
min_sesb := min
//Trendlines
//Mean
if show_sesa and sesa_avg and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center'
avg = get_avg(is_sesa)
set_line(is_sesa, avg, avg, sesa_css)
if show_sesb and sesb_avg and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center'
avg = get_avg(is_sesb)
set_line(is_sesb, avg, avg, sesb_css)
//VWAP
//-----------------------------------------------------------------------------}
//Plots
//-----------------------------------------------------------------------------{
//Plot max/min
plot(showSessions and sesa_maxmin and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center' ? max_sesa : na, 'Session A Maximum', sesa_css, 1, plot.style_linebr, editable = false)
plot(showSessions and sesa_maxmin and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center' ? min_sesa : na, 'Session A Minimum', sesa_css, 1, plot.style_linebr, editable = false)
plot(showSessions and sesb_maxmin and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center' ? max_sesb : na, 'Session B Maximum', sesb_css, 1, plot.style_linebr, editable = false)
plot(showSessions and sesb_maxmin and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center' ? min_sesb : na, 'Session B Minimum', sesb_css, 1, plot.style_linebr, editable = false)
//Plot Divider A
plotshape(is_sesa and show_ses_div and show_sesa and showSessions, "·"
, shape.square
, location.bottom
, na
, text = "."
, textcolor = sesa_css
, size = size.tiny
, display = display.all - display.status_line
, editable = false)
plotshape(is_sesa != is_sesa[1] and show_ses_div and show_sesa and showSessions, "NYE"
, shape.labelup
, location.bottom
, na
, text = "❚"
, textcolor = sesa_css
, size = size.tiny
, display = display.all - display.status_line
, editable = false)
//Plot Divider B
plotshape(is_sesb and show_ses_div and show_sesb and showSessions, "·"
, shape.labelup
, location.bottom
, na
, text = "."
, textcolor = sesb_css
, size = size.tiny
, display = display.all - display.status_line
, editable = false)
plotshape(is_sesb != is_sesb[1] and show_ses_div and show_sesb and showSessions, "LDN"
, shape.labelup
, location.bottom
, na
, text = "❚"
, textcolor = sesb_css
, size = size.tiny
, display = display.all - display.status_line
, editable = false)
// # ============================ ============================ #
type bar
float o = open
float h = high
float l = low
float c = close
float v = volume
int i = bar_index
bar b = bar.new()
nzV = nz(b.v)
f_calcV() =>
uV = 0.0
dV = 0.0
switch
(b.c - b.l) > (b.h - b.c) => uV := nzV
(b.c - b.l) < (b.h - b.c) => dV := -nzV
b.c > b.o => uV := nzV
b.c < b.o => dV := -nzV
b.c > nz(b.c[1]) => uV := nzV
b.c < nz(b.c[1]) => dV := -nzV
nz(uV[1]) > 0 => uV := uV + nzV
nz(dV[1]) < 0 => dV := dV - nzV
[uV, dV] // history subscripts and tuple return restored; they were stripped during extraction
// # ============================ ============================ #
sma4 = ta.sma(close, 4)
sma5 = ta.sma(close, 5)
sma9 = ta.sma(close, 9)
ema50 = ta.ema(close, 50)
ema200 = ta.ema(close, 200)
bullishSignalColor = #59e08a
bearishSignalColor = #ff5959
dashboardRedText = #ee787d
dashboardGreenText = #42bda8
dashboardGreenBackground = #284444
dashboardRedBackground = #49343e
// # ============================ ============================ #
macdFastLength = 12
macdSlowLength = 26
macdSignalLength = 9
if (candleColorType != 'Confirmation Simple')
macdFastLength := 10
macdSlowLength := 25
macdSignalLength:=8
[MacdX, signalX, histX] = ta.macd(close, macdFastLength, macdSlowLength, macdSignalLength) // destructuring restored; "signalX" is a placeholder name for the unused signal line
//candle color scheme
greenHigh = #4ce653
greenMidHigh =#4ce653
greenMidLow =#4ce653
greenLow = #56328f
// Yellow
yellowLow = #56328f
// 4 level of red
redHigh = #ff0000
redMidHigh = #ff0000
redMidLow = #ff0000
redLow = #56328f
if (candleColorType == 'Confirmation Gradient')
greenHigh := #01d70c
greenMidHigh := #269444
greenMidLow :=#4f966c
greenLow := #425970
// Yellow
yellowLow := #513a88
// 4 level of red
redHigh := #ff0000
redMidHigh := #c21637
redMidLow := #c33252
redLow := #8e215f
if (candleColorType == 'Contrarian Gradient')
redHigh := #01d70c
redMidHigh := #269444
redMidLow :=#4f966c
redLow := #425970
// Yellow
yellowLow := #513a88
// 4 level of red
greenHigh := #ff0000
greenMidHigh := #c21637
greenMidLow := #c33252
greenLow := #8e215f
// Default color
candleBody = yellowLow
if histX > 0
if histX > histX[1] and histX > 0 and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center'
candleBody := greenLow
if histX < 0
if histX < histX[1] and histX < 0 and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center'
candleBody := redLow
// Bullish trend
if MacdX > 0 and histX > 0
candleBody := greenMidLow
if histX > histX[1] and MacdX > 0 and histX > 0 and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center'
candleBody := greenMidHigh
if histX > histX[1] and MacdX > 0 and histX > 0 and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center'
candleBody := greenHigh
// Bearish trend
if MacdX < 0 and histX < 0
candleBody := redMidLow
if histX < histX[1] and MacdX < 0 and histX < 0 and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center'
candleBody := redMidHigh
if histX < histX[1] and MacdX < 0 and histX < 0 and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center'
candleBody := redHigh
barcolor(candleColorType == 'None' ? na : candleBody, editable = false)
// # ============================ ============================ #
[smartTrailLine, smartTrailDirection, fillerLine] = LAF.getSmartTrail(10, 4, 8)
smartTrail1 = plot(smartTrail and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center' ? smartTrailLine : na, "Smart Trail", style = plot.style_line, color = smartTrailDirection== 'long' ? color.new(#2157f9, 0) : smartTrailDirection == 'short' ? color.new(#ff1100, 0) : na, editable = false)
smartTrail2 = plot(smartTrail and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center' ? fillerLine : na, "Fib 2", style = plot.style_line, transp = 100, editable = false)
fill(smartTrail1, smartTrail2, color = smartTrailDirection == 'long' ? color.new(#2157f9, 80) : smartTrailDirection == 'short' ? color.new(#ff1100, 80) : na, editable = false)
// # ============================ ============================ #
[trendCatcherLine, trendCatcherColor] = LAF.getTrendCatcher()
newTrendCatcherColor = trendCatcherColor == color.blue ? #02ff65 : #ff1100
plot(trendCatcher and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center' ? trendCatcherLine : na, title='Trend Catcher', linewidth=2, color=newTrendCatcherColor, editable = false)
// # ============================ ============================ #
// # ============================ ============================ #
// # ============================ ============================ #
[trendTracerLine, trendTracerDirection] = LAF.getTrendTracer()
plot(trendTracer and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center' ? trendTracerLine : na, title='Trend Tracer', linewidth=2, style=plot.style_cross, color = trendTracerDirection, editable = false)
// # ============================ ============================ #
trendStrengthMetric = math.abs(LAF.getTrendStrengthMetric(14, 'RMA', 21, 'EMA'))
trendStrengthMetric := trendStrengthMetric*2.5
trendIndication = trendStrengthMetric > 30 and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center' ? "🔥" : "❄️"
trendStrengthCellColor = newTrendCatcherColor == #02ff65 ? dashboardGreenBackground : dashboardRedBackground
trendStrengthTextColor = trendStrengthCellColor == dashboardGreenBackground ? dashboardGreenText : dashboardRedText
volatilityMetric = LAF.getVolatilityMetric()
volatilityMetric2 = ta.sma(LAF.getVolatilityMetric(), 8)
volatilityText = volatilityMetric < 30 ? 'Stable' : volatilityMetric < 80 ? 'Moderate' : 'Volatile'
volatilityEmoji = volatilityMetric2 > volatilityMetric ? '📉' : '📈'
volatilityCellColor = newTrendCatcherColor == #02ff65 ? dashboardGreenBackground : dashboardRedBackground
VolatilityTextColor = trendStrengthCellColor == dashboardGreenBackground ? dashboardGreenText : dashboardRedText
squeezeMetric = LAF.getSqueezeMetric(45, 20)
squeezeIsHigh = squeezeMetric >= 80 ? true : false
squeezeCellColor = trendTracerDirection == #02ff65 ? #1a3a3e : #482632
squeezeTextColor = trendTracerDirection != #02ff65 ? #ed3544 : #0a907a
// and textVPosition == 'middle' and textHPosition == 'center' and c_title == #b2b5be80 and s_title == 'large' and a_title == 'center' and c_subtitle == #b2b5be80 and s_subtitle == 'normal' and a_subtitle == 'center'
//
[uV, dV] = f_calcV()
totalVolume = uV + math.abs(dV)
//volumecolor = totalVolume >= 50 ? bullish : bearish
volumeCellColor = dashboardRedBackground
volumeTextColor = totalVolume >= 50 ? dashboardGreenText : dashboardRedText
if (totalVolume >= 50)
totalVolume := totalVolume*2
volumeCellColor := dashboardGreenBackground
else
totalVolume := totalVolume*-2
volumeSentiment = totalVolume
table_position = dashboardLocation == 'Bottom Left' ? position.bottom_left
: dashboardLocation == 'Top Right' ? position.top_right
: position.bottom_right
table_size = dashboardSize == 'Tiny' ? size.tiny
: dashboardSize == 'Small' ? size.small
: size.normal
tb = table.new(table_position, 7, 7
, bgcolor = #1e222d
, border_color = #373a46
, border_width = 1
, frame_color = #373a46
, frame_width = 1)
if showDashboard
if barstate.islast
        tb.cell(0, 2, autopilotMode == 'Off' ? "🔎 Optimal Sensitivity" : "✈️ Autopilot Enabled", text_color = color.white, text_size = table_size, text_halign = text.align_left)
        tb.cell(0, 3, str.tostring(trendIndication) + " Trend Strength", text_color = color.white, text_size = table_size, text_halign = text.align_left)
tb.cell(0, 4, volatilityEmoji+ " Lux Volatility", text_color = color.white, text_size = table_size, text_halign = text.align_left)
tb.cell(0, 5, "🔃 Squeeze", text_color = color.white, text_size = table_size, text_halign = text.align_left)
tb.cell(0, 6, "💧 Volume Sentiment", text_color = color.white, text_size = table_size, text_halign = text.align_left)
tb.cell(1, 2, autopilotMode, text_color = color.white, text_size = table_size)
tb.cell(1, 3, str.tostring(trendStrengthMetric, format.percent), text_color=trendStrengthTextColor, text_size=table_size, bgcolor = trendStrengthCellColor)
tb.cell(1, 4, volatilityText, text_color = VolatilityTextColor, text_size = table_size, bgcolor = volatilityCellColor)
tb.cell(1, 5, str.tostring(squeezeMetric, format.percent), text_color= squeezeTextColor, text_size=table_size, bgcolor = squeezeCellColor)
tb.cell(1, 6, str.tostring(math.min(volumeSentiment, 100.), format.percent), text_color = volumeTextColor, text_size = table_size, bgcolor = volumeCellColor)
//************************************************************************************************************
// REV ZONES
//************************************************************************************************************
indiSet = false
source = hlc3
type = 'SuperSmoother'
length = 100
innermult = 1.0
outermult = 2.415
ChartSet = false
drawchannel = true
displayzone = true
zonetransp = 60
displayline = true
MTFSet = false
enable_mtf = true
mtf_disp_typ = 'On Hover'
mtf_typ = 'Auto'
mtf_lvl1 = 'D'
mtf_lvl2 = 'W'
//************************************************************************************************************
// Functions Start {
//************************************************************************************************************
var pi = 2 * math.asin(1)
var mult = pi * innermult
var mult2 = pi * outermult
var gradsize = 0.5
var gradtransp = zonetransp
//-----------------------
// Ehler SwissArmyKnife Function
//-----------------------
SAK_smoothing(_type, _src, _length) =>
c0 = 1.0
c1 = 0.0
b0 = 1.0
b1 = 0.0
b2 = 0.0
a1 = 0.0
a2 = 0.0
alpha = 0.0
beta = 0.0
gamma = 0.0
cycle = 2 * pi / _length
if _type == 'Ehlers EMA'
alpha := (math.cos(cycle) + math.sin(cycle) - 1) / math.cos(cycle)
b0 := alpha
a1 := 1 - alpha
a1
if _type == 'Gaussian'
beta := 2.415 * (1 - math.cos(cycle))
alpha := -beta + math.sqrt(beta * beta + 2 * beta)
c0 := alpha * alpha
a1 := 2 * (1 - alpha)
a2 := -(1 - alpha) * (1 - alpha)
a2
if _type == 'Butterworth'
beta := 2.415 * (1 - math.cos(cycle))
alpha := -beta + math.sqrt(beta * beta + 2 * beta)
c0 := alpha * alpha / 4
b1 := 2
b2 := 1
a1 := 2 * (1 - alpha)
a2 := -(1 - alpha) * (1 - alpha)
a2
if _type == 'BandStop'
beta := math.cos(cycle)
gamma := 1 / math.cos(cycle * 2 * 0.1) // delta default to 0.1. Acceptable delta -- 0.05
s_a1 = math.exp(-math.sqrt(2) * pi / _length)
s_b1 = 2 * s_a1 * math.cos(math.sqrt(2) * pi / _length)
s_c3 = -math.pow(s_a1, 2)
s_c2 = s_b1
s_c1 = 1 - s_c2 - s_c3
ss = 0.0
    ss := s_c1 * _src + s_c2 * nz(ss[1], _src[1]) + s_c3 * nz(ss[2], _src[2])
ss
//-----------------------
// Auto TimeFrame Function
//-----------------------
// ————— Converts current chart resolution into a float minutes value.
f_resInMinutes() =>
_resInMinutes = timeframe.multiplier * (timeframe.isseconds ? 1. / 60 : timeframe.isminutes ? 1. : timeframe.isdaily ? 60. * 24 : timeframe.isweekly ? 60. * 24 * 7 : timeframe.ismonthly ? 60. * 24 * 30.4375 : na)
_resInMinutes
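// e.g. a 15-minute chart returns 15, a 4h chart 4 * 60 = 240, a daily chart 60 * 24 = 1440, a weekly chart 10080.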
get_tf(_lvl) =>
y = f_resInMinutes()
z = timeframe.period
if mtf_typ == 'Auto'
if y < 1
z := _lvl == 1 ? '1' : _lvl == 2 ? '5' : z
z
else if y <= 3
z := _lvl == 1 ? '5' : _lvl == 2 ? '15' : z
z
else if y <= 10
z := _lvl == 1 ? '15' : _lvl == 2 ? '60' : z
z
else if y <= 30
z := _lvl == 1 ? '60' : _lvl == 2 ? '240' : z
z
else if y <= 120
z := _lvl == 1 ? '240' : _lvl == 2 ? 'D' : z
z
else if y <= 240
z := _lvl == 1 ? 'D' : _lvl == 2 ? 'W' : z
z
else if y <= 1440
z := _lvl == 1 ? 'W' : _lvl == 2 ? 'M' : z
z
else if y <= 10080
z := _lvl == 1 ? 'M' : z
z
else
z := z
z
else
z := _lvl == 1 ? mtf_lvl1 : _lvl == 2 ? mtf_lvl2 : z
z
z
//-----------------------
// Mean Reversion Channel Function
//-----------------------
get_mrc() =>
v_condition = 0
v_meanline = source
v_meanrange = supersmoother(ta.tr, length)
//-- Get Line value
if type == 'SuperSmoother'
v_meanline := supersmoother(source, length)
v_meanline
if type != 'SuperSmoother'
v_meanline := SAK_smoothing(type, source, length)
v_meanline
v_upband1 = v_meanline + v_meanrange * mult
v_loband1 = v_meanline - v_meanrange * mult
v_upband2 = v_meanline + v_meanrange * mult2
v_loband2 = v_meanline - v_meanrange * mult2
//-- Check Condition
if close > v_meanline
v_upband2_1 = v_upband2 + v_meanrange * gradsize * 4
v_upband2_9 = v_upband2 + v_meanrange * gradsize * -4
if high >= v_upband2_9 and high < v_upband2
v_condition := 1
v_condition
else if high >= v_upband2 and high < v_upband2_1
v_condition := 2
v_condition
else if high >= v_upband2_1
v_condition := 3
v_condition
else if close <= v_meanline + v_meanrange
v_condition := 4
v_condition
else
v_condition := 5
v_condition
if close < v_meanline
v_loband2_1 = v_loband2 - v_meanrange * gradsize * 4
v_loband2_9 = v_loband2 - v_meanrange * gradsize * -4
if low <= v_loband2_9 and low > v_loband2
v_condition := -1
v_condition
else if low <= v_loband2 and low > v_loband2_1
v_condition := -2
v_condition
else if low <= v_loband2_1
v_condition := -3
v_condition
        else if close >= v_meanline - v_meanrange
            v_condition := -4
            v_condition
        else
            v_condition := -5
            v_condition
    [v_meanline, v_meanrange, v_upband1, v_loband1, v_upband2, v_loband2, v_condition]
//-----------------------
// MTF Analysis
//-----------------------
get_stat(_cond) =>
ret = 'Price at Mean Line\n'
if _cond == 1
ret := 'Overbought (Weak)\n'
ret
else if _cond == 2
ret := 'Overbought\n'
ret
else if _cond == 3
ret := 'Overbought (Strong)\n'
ret
else if _cond == 4
ret := 'Price Near Mean\n'
ret
else if _cond == 5
ret := 'Price Above Mean\n'
ret
else if _cond == -1
ret := 'Oversold (Weak)\n'
ret
else if _cond == -2
ret := 'Oversold\n'
ret
else if _cond == -3
ret := 'Oversold (Strong)\n'
ret
else if _cond == -4
ret := 'Price Near Mean\n'
ret
else if _cond == -5
ret := 'Price Below Mean\n'
ret
ret
//-----------------------
// Chart Drawing Function
//-----------------------
format_price(x) =>
y = str.tostring(x, '0.00000')
if x > 10
y := str.tostring(x, '0.000')
y
if x > 1000
y := str.tostring(x, '0.00')
y
y
f_PriceLine(_ref, linecol) =>
line.new(x1=bar_index, x2=bar_index - 1, y1=_ref, y2=_ref, extend=extend.left, color=linecol)
f_MTFLabel(_txt, _yloc) =>
label.new(x=time + math.round(ta.change(time) * 20), y=_yloc, xloc=xloc.bar_time, text=mtf_disp_typ == 'Always Display' ? _txt : 'Check MTF', tooltip=mtf_disp_typ == 'Always Display' ? '' : _txt, color=color.black, textcolor=color.white, size=size.normal, style=mtf_disp_typ == 'On Hover' and displayline ? label.style_label_lower_left : label.style_label_left, textalign=text.align_left)
//} Function End
//************************************************************************************************************
// Calculate Channel
//************************************************************************************************************
var tf_0 = timeframe.period
var tf_1 = get_tf(1)
var tf_2 = get_tf(2)
textstylist = table.new(textVPosition + '_' + textHPosition, 1, 3)
[meanline, meanrange, upband1, loband1, upband2, loband2, condition] = get_mrc()
[meanline_htf1, meanrange_htf1, upband1_htf1, loband1_htf1, upband2_htf1, loband2_htf1, condition_htf1] = request.security(syminfo.tickerid, tf_1, get_mrc())
[meanline_htf2, meanrange_htf2, upband1_htf2, loband1_htf2, upband2_htf2, loband2_htf2, condition_htf2] = request.security(syminfo.tickerid, tf_2, get_mrc())
//************************************************************************************************************
// Drawing Start {
//************************************************************************************************************
float p_meanline = drawchannel ? meanline : na
float p_upband1 = drawchannel ? upband1 : na
float p_loband1 = drawchannel ? loband1 : na
float p_upband2 = drawchannel ? upband2 : na
float p_loband2 = drawchannel ? loband2 : na
//z = plot(p_meanline, color=color.new(#FFCD00, 0), style=plot.style_line, title=' Mean', linewidth=2)
//x1 = plot(p_upband1, color=color.new(color.green, 50), style=plot.style_circles, title=' R1', linewidth=1)
//x2 = plot(p_loband1, color=color.new(color.green, 50), style=plot.style_circles, title=' S1', linewidth=1)
//y1 = plot(p_upband2, color=color.new(color.red, 50), style=plot.style_line, title=' R2', linewidth=1)
//y2 = plot(p_loband2, color=color.new(color.red, 50), style=plot.style_line, title=' S2', linewidth=1)
//-----------------------
// Draw zone
//-----------------------
//---
var color1 = #FF0000
var color2 = #FF4200
var color3 = #FF5D00
var color4 = #FF7400
var color5 = #FF9700
var color6 = #FFAE00
var color7 = #FFC500
var color8 = #FFCD00
//---
float upband2_1 = drawchannel and displayzone ? upband2 + meanrange * gradsize * 4 : na
float loband2_1 = drawchannel and displayzone ? loband2 - meanrange * gradsize * 4 : na
float upband2_2 = drawchannel and displayzone ? upband2 + meanrange * gradsize * 3 : na
float loband2_2 = drawchannel and displayzone ? loband2 - meanrange * gradsize * 3 : na
float upband2_3 = drawchannel and displayzone ? upband2 + meanrange * gradsize * 2 : na
float loband2_3 = drawchannel and displayzone ? loband2 - meanrange * gradsize * 2 : na
float upband2_4 = drawchannel and displayzone ? upband2 + meanrange * gradsize * 1 : na
float loband2_4 = drawchannel and displayzone ? loband2 - meanrange * gradsize * 1 : na
float upband2_5 = drawchannel and displayzone ? upband2 + meanrange * gradsize * 0 : na
float loband2_5 = drawchannel and displayzone ? loband2 - meanrange * gradsize * 0 : na
float upband2_6 = drawchannel and displayzone ? upband2 + meanrange * gradsize * -1 : na
float loband2_6 = drawchannel and displayzone ? loband2 - meanrange * gradsize * -1 : na
float upband2_7 = drawchannel and displayzone ? upband2 + meanrange * gradsize * -2 : na
float loband2_7 = drawchannel and displayzone ? loband2 - meanrange * gradsize * -2 : na
float upband2_8 = drawchannel and displayzone ? upband2 + meanrange * gradsize * -3 : na
float loband2_8 = drawchannel and displayzone ? loband2 - meanrange * gradsize * -3 : na
float upband2_9 = drawchannel and displayzone ? upband2 + meanrange * gradsize * -4 : na
float loband2_9 = drawchannel and displayzone ? loband2 - meanrange * gradsize * -4 : na
up1 = plot(reversalZone ? upband2_1 : na, color = color.black, transp = 100, editable = false)
up2 = plot(reversalZone ?upband2_5:na, color = color.black, transp = 100, editable = false)
up3 = plot(reversalZone ?upband2_9:na, color = color.black, transp = 100, editable = false)
dp1 = plot(reversalZone ?loband2_1:na, color = color.black, transp = 100, editable = false)
dp2 = plot(reversalZone ?loband2_5:na, color = color.black, transp = 100, editable = false)
dp3 = plot(reversalZone ?loband2_9:na, color = color.black, transp = 100, editable = false)
fill(up1, up2, color = #56202d, transp = 20, editable = false)
fill(up2, up3, color = #3f1d29, transp = 60, editable = false)
fill(dp1, dp2, color = #0f3e3f, transp = 20, editable = false)
fill(dp2, dp3, color = #113135, transp = 60, editable = false)
//
tenkan_len = 365
tenkan_mult = 3
kijun_len = 365
kijun_mult = 7
spanB_len = 365
spanB_mult = 15
offset = 2
//------------------------------------------------------------------------------
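//avg(): ATR-based trailing average. Tracks SuperTrend-like upper/lower stops around hl2, records the
//price extremes since the last stop flip, and returns their midpoint; used below for the Tenkan, Kijun
//and Senkou B lines in place of the classic Donchian midpoints.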
avg(src,length,mult)=>
atr = ta.atr(length)*mult
up = hl2 + atr
dn = hl2 - atr
upper = 0.,lower = 0.
    upper := src < upper[1] ? math.min(up,upper[1]) : up
    lower := src > lower[1] ? math.max(dn,lower[1]) : dn
os = 0,max = 0.,min = 0.
    os := src > upper ? 1 : src < lower ? 0 : os[1]
spt = os == 1 ? lower : upper
    max := ta.cross(src,spt) ? math.max(src,max[1]) : os == 1 ? math.max(src,max[1]) : spt
    min := ta.cross(src,spt) ? math.min(src,min[1]) : os == 0 ? math.min(src,min[1]) : spt
math.avg(max,min)
//------------------------------------------------------------------------------
tenkan = avg(close,tenkan_len,tenkan_mult)
kijun = avg(close,kijun_len,kijun_mult)
senkouA = math.avg(kijun,tenkan)
senkouB = avg(close,spanB_len,spanB_mult)
//------------------------------------------------------------------------------
tenkan_css = #2156f300
kijun_css = #ff5e0000
cloud_a = color.new(#006989, 47)
cloud_b = color.new(#ff5252, 66)
chikou_css = #7b1fa2
plot(neoCloud ? tenkan : na,'Tenkan-Sen',tenkan_css, editable = false)
plot(neoCloud ? kijun : na,'Kijun-Sen',kijun_css, editable = false)
plot(neoCloud and ta.crossover(tenkan,kijun) ? kijun : na,'Crossover',#2156f300,3,plot.style_circles, editable = false)
plot(neoCloud and ta.crossunder(tenkan,kijun) ? kijun : na,'Crossunder',#ff5e0000,3,plot.style_circles, editable = false)
A = plot(neoCloud ? senkouA: na,'Senkou Span A',na,offset=offset-1, editable = false)
B = plot(neoCloud ? senkouB : na,'Senkou Span B',na,offset=offset-1, editable = false)
fill(A,B,senkouA > senkouB ? cloud_a : cloud_b)
lastNeo = int(senkouA + senkouB)
last5Neo = ta.sma(lastNeo, 2)
plot(close,'Chikou',chikou_css,offset=-offset+1,display=display.none, editable = false)
// Calculate the box position
ltp1 = bar_index
rtp1 = bar_index + 40
[highBound, midBound, lowBound] = LAF.getTPSLBoxes(6.0)
// Create the actual box
//tp1box = box.new(left=ltp1, top=ttp1, right=rtp1, bottom=btp1, border_color=#3666f5, border_width=2, border_style=line.style_solid, bgcolor=color.new(#3666f5, 53), text="TP1 : " + str.tostring(close), text_size=size.large, text_color=color.new(#3666f5, 0), text_wrap=text.wrap_auto)
//var boxes = array.new()
//boxes.push(box.new(left = ltp1, top = close+highBound, right = rtp1, bottom = close + midBound, border_color=#3666f5, border_width=2, border_style=line.style_solid, bgcolor=color.new(#3666f5, 70), text="TP/SL 2 : " + str.tostring(close), text_size=size.large, text_color=color.new(#3666f5, 0), text_wrap=text.wrap_auto))
//boxes.push(box.new(left = ltp1, top = close+midBound, right = rtp1, bottom = close + lowBound, border_color=#3666f5, border_width=2, border_style=line.style_solid, bgcolor=color.new(#3666f5, 40), text="TP/SL 1 : " + str.tostring(close), text_size=size.large, text_color=color.new(#3666f5, 0), text_wrap=text.wrap_auto))
//SL1 = box.new(left = ltp1, top = close-highBound, right = rtp1, bottom = close - midBound, border_color=#3666f5, border_width=2, border_style=line.style_solid, bgcolor=color.new(#3666f5, 70), text="TP/SL 2 : " + str.tostring(close), text_size=size.large, text_color=color.new(#3666f5, 0), text_wrap=text.wrap_auto)
//SL2 = box.new(left = ltp1, top = close-midBound, right = rtp1, bottom = close - lowBound, border_color=#3666f5, border_width=2, border_style=line.style_solid, bgcolor=color.new(#3666f5, 40), text="TP/SL 1 : " + str.tostring(close), text_size=size.large, text_color=color.new(#3666f5, 0), text_wrap=text.wrap_auto)
// Remove the previous boxes
//box.delete(boxes.shift())
//box.delete(SL1 )
//box.delete(SL2 )
//box.delete(boxes.shift())
// ==== Overview ====
// ==================
// WaveTrend 3D (WT3D) is a novel implementation of the famous WaveTrend (WT) indicator and has been completely redesigned from the ground up to address some
// of the inherent shortcomings associated with the traditional WT algorithm, including:
// (1) unbounded extremes
// (2) susceptibility to whipsaw
// (3) lack of insight into other timeframes
// Furthermore, WT3D expands upon the original functionality of WT by providing:
// (1) first-class support for multi-timeframe (MTF) analysis
// (2) kernel-based regression for trend reversal confirmation
// (3) various options for signal smoothing and transformation
// (4) a unique mode for visualizing an input series as a symmetrical, three-dimensional waveform useful for pattern identification and cycle-related analysis
// Fundamental Assumptions:
// (1) There exists a probability density function that describes the relative likelihood for a price to visit a given value.
// (2) The probability density function for price is a function of time.
// (3) The probability density function can approximate a Gaussian distribution (shown below).
// ___
// .::~!:.. |
// :ΞΞΞΞ~!ΞΞΞ!. |
// .ΞJΞΞΞΞ~!ΞΞΞ?J^ |
// :J?ΞΞΞΞΞ~!ΞΞΞΞΞJ^ |
// :J?ΞΞΞΞΞΞ~!ΞΞΞΞΞΞ??. |
// :JΞΞΞΞΞΞΞΞ~!ΞΞΞΞΞΞΞ?J^ |
// :JΞΞΞΞΞΞΞΞΞ~!ΞΞΞΞΞΞΞΞ?J^
// .:~ΞΞΞΞΞΞΞΞΞΞ~!ΞΞΞΞΞΞΞΞΞ!!~ |
// :?~^ΞΞΞΞΞΞΞΞΞΞ~!ΞΞΞΞΞΞΞΞΞ!^Ξ! |
// ~:^^^ΞΞΞΞΞΞΞΞΞΞ~!ΞΞΞΞΞΞΞΞΞ!^^!Ξ. |
// .Ξ!^^^^ΞΞΞΞΞΞΞΞΞΞ~!ΞΞΞΞΞΞΞΞΞ!^^^~Ξ~ |
// .~Ξ~^^^^^ΞΞΞΞΞΞΞΞΞΞ~!ΞΞΞΞΞΞΞΞΞ!^^^^^!Ξ: |
// .~Ξ~^^^^^^^ΞΞΞΞΞΞΞΞΞΞ~!ΞΞΞΞΞΞΞΞΞ!^^^^^^~!!^. |
// ....::^^!~~^^^^^^^^^ΞΞΞΞΞΞΞΞΞΞ~!ΞΞΞΞΞΞΞΞΞ!^^^^^^^^^~!^^::...... |
// ..:::^^^^^^^::::::::::::::ΞΞΞΞΞΞΞΞΞΞ~!ΞΞΞΞΞΞΞΞΞ!::::::::::::^^^^^^^^:::.. |
//
// -------------------------------- -------------------------------|
// How to use this indicator:
// - The basic usage of WT3D is similar to how one would use the traditional WT indicator.
// - Divergences can be spotted by finding "trigger waves", which are small waves that immediately follow a larger wave. These can also be thought of as Lower-Highs and Higher-Lows in the oscillator.
// - Instead of the SMA-cross in the original WT, the primary mechanism for identifying potential pivots are the crossovers of the fast/normal speed oscillators, denoted by the small red/green circles.
// - The larger red/green circles represent points where there could be a potential trigger wave for a Divergence. Settings related to Divergence detection can be configured in the "Divergence" section.
// - For overbought/oversold conditions, the 0.5 and -0.5 levels are convenient since the normal-speed oscillator will only exceed this level ~25% of the time.
// - For less experienced users, focusing on the three oscillators is recommended since they give critical information from multiple timeframes that can help to identify trends and spot potential divergences.
// - For more experienced users, this indicator also has many other valuable features, such as Center of Gravity (CoG) smoothing, Kernel Estimate Crossovers, a mirrored mode for cycle analysis, and more.
// - Note: Additional resources for learning/using the more advanced features of this indicator are a work in progress, but in the meantime, I am happy to answer any questions.
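// Example (assumes the seriesNormal oscillator defined in the Oscillator Calculations section below):
// alerts for the +/-0.5 zones mentioned above could be added along these lines.
// alertcondition(ta.crossover(seriesNormal, 0.5), "WT3D: Normal osc above +0.5", "Normal-speed oscillator entered the upper zone")
// alertcondition(ta.crossunder(seriesNormal, -0.5), "WT3D: Normal osc below -0.5", "Normal-speed oscillator entered the lower zone")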
// ================
// ==== Inputs ====
// ================
// Signal Settings
src = close
useMirror = false
useEma = false
emaLength = 3
useCog = false
cogLength = 6
oscillatorLookback =20
quadraticMeanLength = 50
src := useEma ? ta.ema(src, emaLength) : src
src := useCog ? ta.cog(src, cogLength) : src
speedToEmphasize = 'Normal'
emphasisWidth = 2
useKernelMA = false
useKernelEmphasis = false
// Oscillator Settings
offset := 0
showOsc = true
showOsc := showOsc
float f_length = 0.75
float f_smoothing = 0.45
float n_length = 1.0
float n_smoothing = 1.0
float s_length = 1.75
float s_smoothing = 2.5
// Divergence Detection
divThreshold = 30
sizePercent = 40
// Overbought/Oversold Zones (Reversal Zones)
showObOs = false
invertObOsColors = false
// Transparencies and Gradients
areaBackgroundTrans = 128.
areaForegroundTrans = 64.
lineBackgroundTrans = 2.6
lineForegroundTrans = 2.
customTransparency = 30
maxStepsForGradient = 8
// The defaults are colors that Google uses for its Data Science libraries (e.g. TensorFlow). They are considered to be colorblind-safe.
var color fastBullishColor = color.black
var color normalBullishColor = color.black
var color slowBullishColor = color.black
var color fastBearishColor = color.black
var color normalBearishColor = color.black
var color slowBearishColor =color.black
var color c_bullish = color.black
var color c_bearish = color.black
lineBackgroundTrans := lineBackgroundTrans * customTransparency
areaBackgroundTrans := areaBackgroundTrans * customTransparency
lineForegroundTrans := lineForegroundTrans * customTransparency
areaForegroundTrans := areaForegroundTrans * customTransparency
areaFastTrans = areaBackgroundTrans
lineFastTrans = lineBackgroundTrans
areaNormalTrans = areaBackgroundTrans
lineNormalTrans = lineBackgroundTrans
areaSlowTrans = areaForegroundTrans
lineSlowTrans = lineForegroundTrans
switch speedToEmphasize
"Slow" =>
areaFastTrans := areaBackgroundTrans
lineFastTrans := lineBackgroundTrans
areaNormalTrans := areaBackgroundTrans
lineNormalTrans := lineBackgroundTrans
areaSlowTrans := areaForegroundTrans
lineSlowTrans := lineForegroundTrans
"Normal" =>
areaFastTrans := areaBackgroundTrans
lineFastTrans := lineBackgroundTrans
areaNormalTrans := areaForegroundTrans
lineNormalTrans := lineForegroundTrans
areaSlowTrans := areaBackgroundTrans
lineSlowTrans := lineBackgroundTrans
"Fast" =>
areaFastTrans := areaForegroundTrans
lineFastTrans := lineForegroundTrans
areaNormalTrans := areaBackgroundTrans
lineNormalTrans := lineBackgroundTrans
areaSlowTrans := areaBackgroundTrans
lineSlowTrans := lineBackgroundTrans
"None" =>
areaFastTrans := areaBackgroundTrans
lineFastTrans := lineBackgroundTrans
areaNormalTrans := areaBackgroundTrans
lineNormalTrans := lineBackgroundTrans
areaSlowTrans := areaBackgroundTrans
lineSlowTrans := lineBackgroundTrans
// =================================
// ==== Color Helper Functions =====
// =================================
getPlotColor(signal, bullColor, bearColor) =>
signal >= 0.0 ? bullColor : bearColor
getAreaColor(signal, useMomentum, bullColor, bearColor) =>
if useMomentum
ta.rising(signal, 1) ? bullColor : bearColor
else
signal >= 0.0 ? bullColor : bearColor
getColorGradientFromSteps(_source, _center, _steps, weakColor, strongColor) =>
var float _qtyAdvDec = 0.
var float _maxSteps = math.max(1, _steps)
bool _xUp = ta.crossover(_source, _center)
bool _xDn = ta.crossunder(_source, _center)
float _chg = ta.change(_source)
bool _up = _chg > 0
bool _dn = _chg < 0
bool _srcBull = _source > _center
bool _srcBear = _source < _center
_qtyAdvDec := _srcBull ? _xUp ? 1 : _up ? math.min(_maxSteps, _qtyAdvDec + 1) : _dn ? math.max(1, _qtyAdvDec - 1) : _qtyAdvDec : _srcBear ? _xDn ? 1 : _dn ? math.min(_maxSteps, _qtyAdvDec + 1) : _up ? math.max(1, _qtyAdvDec - 1) : _qtyAdvDec : _qtyAdvDec
color colorGradient = color.from_gradient(_qtyAdvDec, 1, _maxSteps, weakColor, strongColor)
colorGradient
getColorGradientFromSource(series, _min, _max, weakColor, strongColor) =>
var float baseLineSeries = _min + (_max - _min) / 2
color colorGradient = series >= baseLineSeries ? color.from_gradient(value=series, bottom_value=baseLineSeries, top_value=_max, bottom_color=weakColor, top_color=strongColor) : color.from_gradient(series, _min, baseLineSeries, strongColor, weakColor)
colorGradient
// ================================
// ==== Main Helper Functions =====
// ================================
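// Oscillator pipeline: normalizeDeriv() scales the bar-to-bar price change by its quadratic mean (RMS),
// tanh() squashes the result into the -1..1 range, and dualPoleFilter() smooths it with a two-pole
// low-pass filter; getOscillator() chains the three steps.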
normalizeDeriv(_src, _quadraticMeanLength) =>
    float derivative = _src - _src[2]
quadraticMean = math.sqrt(nz(math.sum(math.pow(derivative, 2), _quadraticMeanLength) / _quadraticMeanLength))
derivative/quadraticMean
tanh(series float _src) =>
-1 + 2/(1 + math.exp(-2*_src))
dualPoleFilter(float _src, float _lookback) =>
float _omega = -99 * math.pi / (70 * _lookback)
float _alpha = math.exp(_omega)
float _beta = -math.pow(_alpha, 2)
float _gamma = math.cos(_omega) * 2 * _alpha
float _delta = 1 - _gamma - _beta
    float _slidingAvg = 0.5 * (_src + nz(_src[1], _src))
float _filter = na
    _filter := (_delta*_slidingAvg) + _gamma*nz(_filter[1]) + _beta*nz(_filter[2])
_filter
getOscillator(float src, float smoothingFrequency, int quadraticMeanLength) =>
nDeriv = normalizeDeriv(src, quadraticMeanLength)
hyperbolicTangent = tanh(nDeriv)
result = dualPoleFilter(hyperbolicTangent, smoothingFrequency)
// =================================
// ==== Oscillator Calculations ====
// =================================
// Fast Oscillator + Mirror
offsetFast = offset
f_lookback = f_smoothing * oscillatorLookback
signalFast = getOscillator(src, f_lookback, quadraticMeanLength)
seriesFast = f_length*signalFast+offsetFast
seriesFastMirror = useMirror ? -seriesFast + 2*offsetFast : na
// Normal Oscillator + Mirror
offsetNormal = 0
n_lookback = n_smoothing * oscillatorLookback
signalNormal = getOscillator(src, n_lookback, quadraticMeanLength)
seriesNormal = n_length*signalNormal+offsetNormal
seriesNormalMirror = useMirror ? -seriesNormal + 2*offsetNormal : na
// Slow Oscillator + Mirror
offsetSlow = -offset
s_lookback = s_smoothing * oscillatorLookback
signalSlow = getOscillator(src, s_lookback, quadraticMeanLength)
seriesSlow = s_length*signalSlow+offsetSlow
seriesSlowMirror = useMirror ? -seriesSlow + 2*offsetSlow : na
// =====================================
// ==== Color Gradient Calculations ====
// =====================================
// Fast Color Gradients (Areas and Lines)
fastBaseColor = getPlotColor(signalFast, fastBullishColor, fastBearishColor)
fastBaseColorInverse = getPlotColor(signalFast, fastBearishColor, fastBullishColor)
fastAreaGradientFromSource = getColorGradientFromSource(seriesFast, -1.+offsetFast, 1+offsetFast, color.new(fastBaseColor, areaFastTrans), fastBaseColor)
fastAreaGradientFromSteps = getColorGradientFromSteps(seriesFast, offsetFast, maxStepsForGradient, color.new(fastBaseColor, areaFastTrans), fastBaseColor)
fastLineGradientFromSource = getColorGradientFromSource(seriesFast, -1+offsetFast, 1+offsetFast, color.new(fastBaseColor, lineFastTrans), fastBaseColor)
fastLineGradientFromSteps = getColorGradientFromSteps(seriesFast, offsetFast, maxStepsForGradient, color.new(fastBaseColor, lineFastTrans), fastBaseColor)
fastAreaGradientFromSourceInverse = getColorGradientFromSource(seriesFast, -1.+offsetFast, 1+offsetFast, color.new(fastBaseColorInverse, areaFastTrans), fastBaseColorInverse)
fastAreaGradientFromStepsInverse = getColorGradientFromSteps(seriesFast, offsetFast, maxStepsForGradient, color.new(fastBaseColorInverse, areaFastTrans), fastBaseColorInverse)
// Normal Color Gradients (Areas and Lines)
normalBaseColor = getPlotColor(signalNormal, normalBullishColor, normalBearishColor)
normalBaseColorInverse = getPlotColor(signalNormal, normalBearishColor, normalBullishColor)
normalAreaGradientFromSource = getColorGradientFromSource(seriesNormal, -1.+offsetNormal, 1.+offsetNormal, color.new(normalBaseColor, areaNormalTrans), normalBaseColor)
normalAreaGradientFromSteps = getColorGradientFromSteps(seriesNormal, offsetNormal, maxStepsForGradient, color.new(normalBaseColor, areaNormalTrans), normalBaseColor)
normalLineGradientFromSource = getColorGradientFromSource(seriesNormal, -1+offsetNormal, 1+offsetNormal, color.new(normalBaseColor, lineNormalTrans), normalBaseColor)
normalLineGradientFromSteps = getColorGradientFromSteps(seriesNormal, offsetNormal, maxStepsForGradient, color.new(normalBaseColor, lineNormalTrans), normalBaseColor)
normalAreaGradientFromSourceInverse = getColorGradientFromSource(seriesNormal, -1.+offsetNormal, 1.+offsetNormal, color.new(normalBaseColorInverse, areaNormalTrans), normalBaseColorInverse)
normalAreaGradientFromStepsInverse = getColorGradientFromSteps(seriesNormal, offsetNormal, maxStepsForGradient, color.new(normalBaseColorInverse, areaNormalTrans), normalBaseColorInverse)
// Slow Color Gradients (Areas and Lines)
slowBaseColor = getPlotColor(signalSlow, slowBullishColor, slowBearishColor)
slowBaseColorInverse = getPlotColor(signalSlow, slowBearishColor, slowBullishColor)
slowAreaGradientFromSource = getColorGradientFromSource(seriesSlow, -1.75+offsetSlow, 1.75+offsetSlow, color.new(slowBaseColor, areaSlowTrans), slowBaseColor)
slowAreaGradientFromSteps = getColorGradientFromSteps(seriesSlow, offsetSlow, maxStepsForGradient, color.new(slowBaseColor, areaSlowTrans), slowBaseColor)
slowLineGradientFromSource = getColorGradientFromSource(seriesSlow, -1.75+offsetSlow, 1.75+offsetSlow, color.new(slowBaseColor, lineSlowTrans), slowBaseColor)
slowLineGradientFromSteps = getColorGradientFromSteps(seriesSlow, offsetSlow, maxStepsForGradient, color.new(slowBaseColor, lineSlowTrans), slowBaseColor)
slowAreaGradientFromSourceInverse = getColorGradientFromSource(seriesSlow, -1.75+offsetSlow, 1.75+offsetSlow, color.new(slowBaseColorInverse, areaSlowTrans), slowBaseColorInverse)
slowAreaGradientFromStepsInverse = getColorGradientFromSteps(seriesSlow, offsetSlow, maxStepsForGradient, color.new(slowBaseColorInverse, areaSlowTrans), slowBaseColorInverse)
// =========================================
// ==== Plot Parameters and Logic Gates ====
// =========================================
// Speed Booleans
isSlow = speedToEmphasize == "Slow"
isNormal = speedToEmphasize == "Normal"
isFast = speedToEmphasize == "Fast"
// Series Colors
seriesSlowColor = showOsc or isSlow ? color.new(slowLineGradientFromSource, lineSlowTrans) : na
seriesNormalColor = showOsc or isNormal ? color.new(normalLineGradientFromSource, lineNormalTrans) : na
seriesFastColor = showOsc or isFast ? color.new(fastLineGradientFromSource, lineFastTrans) : na
seriesSlowMirrorColor = useMirror ? seriesSlowColor : na
seriesNormalMirrorColor = useMirror ? seriesNormalColor : na
seriesFastMirrorColor = useMirror ? seriesFastColor : na
// Series Line Widths
seriesSlowWidth = isSlow ? emphasisWidth : 1
seriesNormalWidth = isNormal ? emphasisWidth : 1
seriesFastWidth = isFast ? emphasisWidth : 1
seriesSlowMirrorWidth = useMirror ? seriesSlowWidth : na
seriesNormalMirrorWidth = useMirror ? seriesNormalWidth : na
seriesFastMirrorWidth = useMirror ? seriesFastWidth : na
// Speed Related Switches
seriesEmphasis = switch
isFast => seriesFast
isNormal => seriesNormal
isSlow => seriesSlow
=> na
//
colorLineEmphasis = switch
isFast => fastLineGradientFromSource
isNormal => normalLineGradientFromSource
isSlow => slowLineGradientFromSource
=> na
colorAreaEmphasis = switch
isFast => fastAreaGradientFromSource
isNormal => normalAreaGradientFromSource
isSlow => slowAreaGradientFromSource
=> na
// Crossover Signals
bearishCross = ta.crossunder(seriesFast, seriesNormal) and seriesNormal > 0
bullishCross = ta.crossover(seriesFast, seriesNormal) and seriesNormal < 0
slowBearishMedianCross = ta.crossunder(seriesSlow, 0)
slowBullishMedianCross = ta.crossover(seriesSlow, 0)
normalBearishMedianCross = ta.crossunder(seriesNormal, 0)
normalBullishMedianCross = ta.crossover(seriesNormal, 0)
fastBearishMedianCross = ta.crossunder(seriesFast, 0)
fastBullishMedianCross = ta.crossover(seriesFast, 0)
// Last Crossover Values
lastBearishCrossValue = ta.valuewhen(condition=bearishCross, source=seriesNormal, occurrence=1)
lastBullishCrossValue = ta.valuewhen(condition=bullishCross , source=seriesNormal, occurrence=1)
// Trigger Wave Size Comparison
triggerWaveFactor = sizePercent/100
isSmallerBearishCross = bearishCross and seriesNormal < lastBearishCrossValue * triggerWaveFactor
isSmallerBullishCross = bullishCross and seriesNormal > lastBullishCrossValue * triggerWaveFactor
// ===========================
// ==== Kernel Estimators ====
// ===========================
// The following kernel estimators are based on the Gaussian Kernel.
// They are used for:
// (1) Confirming directional changes in the slow oscillator (i.e. a type of trend filter)
// (2) Visualizing directional changes as a dynamic ribbon (i.e. an additional oscillator that can crossover with the user specified oscillator of interest)
// (3) Visualizing transient directional changes while in the midst of a larger uptrend or downtrend (i.e. via color changes on the ribbon)
// Gaussian Kernel with a lookback of 6 bars, starting on bar 6 of the chart (medium fit)
yhat0 = kernels.gaussian(seriesEmphasis, 6, 6)
// Gaussian Kernel with a lookback of 3 bars, starting on bar 2 of the chart (tight fit)
yhat1 = kernels.gaussian(seriesEmphasis, 3, 2)
// Trend Assessment based on the relative position of the medium fit kernel to the slow oscillator
isBearishKernelTrend = yhat0 < seriesSlow
isBullishKernelTrend = yhat0 > seriesSlow
// Divergence Signals
isBearishDivZone = ta.barssince(bearishCross[1]) < divThreshold
isBullishDivZone = ta.barssince(bullishCross[1]) < divThreshold
// Crossover Detection
isBearishTriggerWave = isSmallerBearishCross and isBearishDivZone and isBearishKernelTrend
isBullishTriggerWave = isSmallerBullishCross and isBullishDivZone and isBullishKernelTrend
// =======================
// ==== Plots & Fills ====
var position = 0
length := atrLength
minMult = math.max(sensitivity-4, 1)
maxMult = math.min(sensitivity, 26)
if (autopilotMode == "Short Term")
minMult:=1
maxMult := 4
if (autopilotMode == 'Mid Term')
minMult := 5
maxMult := 10
if (autopilotMode == 'Long-Term')
minMult :=8
maxMult :=13
float step = .5
//Trigger error
if minMult > maxMult
runtime.error('Minimum factor is greater than maximum factor in the range')
float perfAlpha = 10
fromCluster = 'Best'
//Optimization
maxIter = 250
maxData = 2500
//Style
bearCss = color.red
bullCss = color.teal
amaBearCss = color.new(color.red, 50)
amaBullCss = color.new(color.teal, 50)
showGradient = true
//Dashboard
showDash = true
//dashboardLocation = input.string('Top Right', 'Location', options = , group = 'Dashboard')
textSize = 'Small'
//-----------------------------------------------------------------------------}
//UDT's
//-----------------------------------------------------------------------------{
type supertrend
float upper = hl2
float lower = hl2
float output
float perf = 0
float factor
int trend = 0
type vector
    array<float> out
//-----------------------------------------------------------------------------}
//Supertrend
//-----------------------------------------------------------------------------{
var holder = array.new<supertrend>(0)
var factors = array.new<float>(0)
//Populate supertrend type array
if barstate.isfirst
for i = 0 to int((maxMult - minMult) / step)
factors.push(minMult + i * step)
holder.push(supertrend.new())
atr = ta.atr(length)
//Compute Supertrend for multiple factors
k = 0
for factor in factors
get_spt = holder.get(k)
up = hl2 + atr * factor
dn = hl2 - atr * factor
get_spt.trend := close > get_spt.upper ? 1 : close < get_spt.lower ? 0 : get_spt.trend
get_spt.upper := close < get_spt.upper ? math.min(up, get_spt.upper) : up
get_spt.lower := close > get_spt.lower ? math.max(dn, get_spt.lower) : dn
diff = nz(math.sign(close - get_spt.output))
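    //Exponential performance score: bar-to-bar price change taken in the direction of the current
    //supertrend signal, smoothed with alpha = 2 / (perfAlpha + 1)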
    get_spt.perf += 2/(perfAlpha+1) * (nz(close - close[1]) * diff - get_spt.perf)
get_spt.output := get_spt.trend == 1 ? get_spt.lower : get_spt.upper
get_spt.factor := factor
k += 1
//-----------------------------------------------------------------------------}
//K-means clustering
//-----------------------------------------------------------------------------{
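//Cluster the per-factor performance scores into three groups (worst / average / best): centroids are
//seeded from the 25th/50th/75th performance percentiles, each factor is assigned to its nearest
//centroid, and the centroids are recomputed until they stop moving or maxIter is reached.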
factor_array = array.new<float>(0)
data = array.new<float>(0)
//Populate data arrays
if last_bar_index - bar_index <= maxData
for element in holder
data.push(element.perf)
factor_array.push(element.factor)
//Initialize centroids using quartiles
centroids = array.new<float>(0)
centroids.push(data.percentile_linear_interpolation(25))
centroids.push(data.percentile_linear_interpolation(50))
centroids.push(data.percentile_linear_interpolation(75))
//Initialize clusters
var array<vector> factors_clusters = na
var array<vector> perfclusters = na
if last_bar_index - bar_index <= maxData
for _ = 0 to maxIter
        factors_clusters := array.from(vector.new(array.new<float>(0)), vector.new(array.new<float>(0)), vector.new(array.new<float>(0)))
        perfclusters := array.from(vector.new(array.new<float>(0)), vector.new(array.new<float>(0)), vector.new(array.new<float>(0)))
//Assign value to cluster
i = 0
for value in data
            dist = array.new<float>(0)
for centroid in centroids
dist.push(math.abs(value - centroid))
idx = dist.indexof(dist.min())
perfclusters.get(idx).out.push(value)
factors_clusters.get(idx).out.push(factor_array.get(i))
i += 1
//Update centroids
        new_centroids = array.new<float>(0)
for cluster_ in perfclusters
new_centroids.push(cluster_.out.avg())
//Test if centroid changed
if new_centroids.get(0) == centroids.get(0) and new_centroids.get(1) == centroids.get(1) and new_centroids.get(2) == centroids.get(2)
break
centroids := new_centroids
//-----------------------------------------------------------------------------}
//Signals and trailing stop
//-----------------------------------------------------------------------------{
//Get associated supertrend
var float target_factor = na
var float perf_idx = na
var float perf_ama = na
var from = switch fromCluster
'Best' => 2
'Average' => 1
'Worst' => 0
//Performance index denominator
den = ta.ema(math.abs(close - close[1]), int(perfAlpha))
if not na(perfclusters)
//Get average factors within target cluster
target_factor := nz(factors_clusters.get(from).out.avg(), target_factor)
//Get performance index of target cluster
perf_idx := math.max(nz(perfclusters.get(from).out.avg()), 0) / den
//Get new supertrend
var upper = hl2
var lower = hl2
var os = 0
up = hl2 + atr * target_factor
dn = hl2 - atr * target_factor
upper := close < upper ? math.min(up, upper) : up
lower := close > lower ? math.max(dn, lower) : dn
os := close > upper ? 1 : close < lower ? 0 : os
ts = os ? lower : upper
//Get trailing stop adaptive MA
if na(ts[1]) and not na(ts)
perf_ama := ts
else
perf_ama += perf_idx * (ts - perf_ama)
//-----------------------------------------------------------------------------}
//Dashboard
//-----------------------------------------------------------------------------{
//-----------------------------------------------------------------------------{
css = os ? bullCss : bearCss
plot(showTrailingStoploss ? ts : na, 'Trailing Stop', os != os[1] ? na : css, editable = false)
plot(showMovingAverage? perf_ama:na, 'Trailing Stop AMA',
ta.cross(close, perf_ama) ? na
: close > perf_ama ? amaBullCss : amaBearCss, editable = false)
//Candle coloring
//barcolor(showGradient ? color.from_gradient(perf_idx, 0, 1, color.new(css, 80), css) : na)
//Signals
if showSignals
    if os > os[1] and (signalPresets != "Smart Trail " or smartTrailDirection == 'long') and (signalPresets != "Trend Tracer " or trendTracerDirection==#02ff65) and (signalPresets != "Trend Strength " or trendStrengthMetric >= 25) and (signalPresets != "Trend Catcher " or newTrendCatcherColor == #02ff65) and (signalPresets != "Neo Cloud " or int(lastNeo) >= last5Neo)
int signalStrength = int(perf_idx*10) < 2 ? 1 : int(perf_idx*10) < 4 ? 2 : int(perf_idx*10) < 5 ? 3 : 4
label.new(n, low-ta.atr(30)/2, signalClassifier ? str.tostring(signalStrength) : ema50 > ema200 ? "▲+" : "▲"
, color = bullCss
, style = label.style_label_up
, textcolor = color.white
, yloc=yloc.belowbar
, size = size.small)
position := 1
    if os < os[1] and (signalPresets != "Smart Trail " or smartTrailDirection == 'short') and (signalPresets != "Trend Tracer " or trendTracerDirection!=#02ff65) and (signalPresets != "Trend Strength " or trendStrengthMetric >= 25) and (signalPresets != "Trend Catcher " or newTrendCatcherColor != #02ff65) and (signalPresets != "Neo Cloud " or int(lastNeo) <= last5Neo)
int signalStrength = int(perf_idx*10) < 2 ? 1 : int(perf_idx*10) < 4 ? 2 : int(perf_idx*10) < 5 ? 3 : 4
label.new(n, high+ta.atr(30)/2, signalClassifier ? str.tostring(signalStrength) : ema50 < ema200 ? "▼+" : "▼"
, color = bearCss
, style = label.style_label_down
, textcolor = color.white
, yloc=yloc.abovebar
, size = size.small)
position := -1
// =======================
// Signal Plots
//plot(position == 1 and bearishCross ? high+5 : na, title="Bearish Cross", style=plot.style_cross, linewidth=2, color=c_bearish, offset=-1)
//plot(position == -1 and bearishCross ? high+5 : na, title="Bearish Cross", style=plot.style_circles, linewidth=2, color=c_bearish, offset=-1)
//plot(position == 1 and isBearishTriggerWave ? high+5 : na, title="Bearish Trigger Cross", style=plot.style_cross, linewidth=3, color=c_bearish, offset=-1)
//plot(position == -1 and isBearishTriggerWave ? high+5 : na, title="Bearish Trigger Cross", style=plot.style_circles, linewidth=3, color=c_bearish, offset=-1)
//plotchar(bearishCross and position == 1, "Long", "✖", location.abovebar, color = #4774f5, size = size.tiny, editable = false)
//plotchar(bearishCross and position == -1, "Long", "▼", location.abovebar, color = c_bearish, size = size.tiny)
plotchar(isBearishTriggerWave and position == 1, "Long", "✖", location.abovebar, color=#4774f5, size = size.tiny, editable = false)
//plotchar(isBearishTriggerWave and position == -1, "Long", "▼", location.abovebar, color=c_bearish, size = size.small)
//plot(position == 1 and bullishCross ? low -5: na, title="Bullish Cross", style= plot.style_circles, linewidth=2, color=c_bullish, offset=-1)
//plot(position == -1 and bullishCross ? low -5: na, title="Bullish Cross", style= pl
Crypto Narratives: Relative Strength V2
A simple indicator that displays the relative strength of 8 key narratives against BTC as a "spaghetti" chart. For each narrative it plots an aggregated RSI value computed from the 5 highest-market-cap cryptos in that narrative, smoothed with a 14-period SMA.
Functionality:
The indicator calculates the average RSI of the current leading tokens for each of the following crypto narratives, plus BTC as the benchmark:
- AI (Artificial Intelligence)
- DeFi (Decentralized Finance)
- Memes
- Gaming
- Level 1 (Layer 1 Protocols)
- AI Agents
- Storage/DePin
- RWA (Real-World Assets)
- BTC
Usage Notes:
The 5 coins per narrative should be checked regularly and updated in the script by overtyping the current values on rows 24 - 92, to ensure you are using an up-to-date list of the highest-market-cap coins (or coins of your choosing).
The length of the 14-period SMA can be changed in the indicator settings.
The indicator resets every 24 hours and is set to UTC+10. This can be changed by editing line 19 of the script and changing the value of "resetHour = 1" to whatever value works for your timezone.
There is also a Rate of Change table that details the % rate of change of each narrative against BTC.
Horizontal lines have been included to provide an indication of overbought and oversold levels.
The upper and lower horizontal line (overbought and oversold) can be adjusted through the settings.
The line width, and label offset can be customised through the input options.
Alerts can be set to trigger when a narrative's RSI crosses above the overbought level or below the oversold level. The alerts include the narrative name, the RSI value, and the level crossed.
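To make the aggregation concrete, here is a minimal Pine Script v5 sketch of the idea for a single narrative: it averages the RSI of five placeholder tickers and smooths the result with a 14-period SMA. The ticker symbols, lengths and plot styling are illustrative assumptions only, not the script's actual symbol list or settings.
//@version=5
indicator("Narrative RSI sketch")
rsiLen = input.int(14, "RSI Length")
smoothLen = input.int(14, "SMA Smoothing")
// Hypothetical symbols standing in for one narrative's top-5 list
symRsi(sym) =>
    request.security(sym, timeframe.period, ta.rsi(close, rsiLen))
r1 = symRsi("BINANCE:FETUSDT")
r2 = symRsi("BINANCE:RNDRUSDT")
r3 = symRsi("BINANCE:TAOUSDT")
r4 = symRsi("BINANCE:NEARUSDT")
r5 = symRsi("BINANCE:ICPUSDT")
narrativeRsi = (r1 + r2 + r3 + r4 + r5) / 5
plot(ta.sma(narrativeRsi, smoothLen), "AI narrative RSI (smoothed)", color.aqua)
hline(70, "Overbought")
hline(30, "Oversold")
The full indicator repeats this kind of block for each narrative and compares the smoothed lines against the BTC line on the same pane.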