Tag Archives: Github

sellbuy-o-meter

The Golden Ratio Multiplier Trading Algorithm

This post covers a trading algorithm developed in Python that uses Fibonacci numbers to set levels to trade off of. It is not a short-term strategy; it is a long-term strategy, more of an investment strategy. It is based on a post by Philip Swift on https://www.lookintobitcoin.com/charts/golden-ratio-multiplier/

He also has the original article on it on Medium:

https://medium.com/@positivecrypto/the-golden-ratio-multiplier-c2567401e12a?sk=e38c4eff4357a1131bed57a3f4b3eeae

I have code for this here:

cbpro-cli-tools

Theory

The golden ratio multiplier is a great indicator: visually easy to confirm, and it makes a lot of sense how it works. Best of all, there is no fussy back-testing with parameter tweaks; you just need your eyes. I like how Josh Olszewicz added the centerline, which is a nice addition. His video was a pleasant and positive surprise that I used to review the workings of the golden ratio multiplier. I started using this indicator in early 2020.

On top of the basics covered by Philip Swift, I have made some additions that work well in algorithmic form, allowing it to run with constraints as code. Constraints are needed so that it picks the correct points to enter and exit along the curve. The method of DCAing (Dollar Cost Averaging) in and out is to use the 20SMA Bollinger Bands (BBands) as the trigger for a DCA, using them as a filter so a trade only fires when price exceeds the lower/upper band. This tweak came from writing code around the indicator and looking for a reasonable way to decide when it should DCA, versus just triggering on a cross of the Fib # * YearlySMA (SMA, Simple Moving Average), or DCAing daily while under/above that line.

As far as amounts to DCA, with accumulate/distribute investing I tend to compound the trades in a way that accumulates more heavily as distance increases below the YearlySMA line and distributes more heavily as price moves further above the target Fib # * YearlySMA. The trading scaler used is a logistic curve, with n being the ratio of the current price to the yearly average for buys. (For sells, n is inverted by subtracting it from the target Fib level multiplier + 1, which positions the curve so that it scales up until price reaches the next higher Fib # and then back down.) Using the logistic curve formula (1+k)/(k + n^e) as a trade multiplier, with k=0.2 and e=4 for example, it DCAs in at around 2x at the 0.8 * YearlySMA point and 4x at 0.5, and never more than 6x, which it approaches as a limit as n -> 0. I am planning on publishing this code and documentation around it and will cite his video when I do.
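To make that scaling concrete, here is a minimal Python sketch of the logistic trade multiplier described above. Only the formula (1+k)/(k + n^e), the k=0.2 and e=4 example values, and the buy/sell inversion come from the description; the function name and print statements are just illustrative.

def logistic_multiplier(n, k=0.2, e=4):
    """Logistic trade scaler: (1 + k) / (k + n^e).

    For buys, n is the ratio of the current price to the yearly (350 day) SMA.
    As n -> 0 the multiplier approaches (1 + k) / k, e.g. 6x for k = 0.2.
    """
    return (1 + k) / (k + n ** e)

# Buy side examples from the text: price at 0.8x and 0.5x of the yearly SMA.
print(round(logistic_multiplier(0.8), 2))  # ~1.97, roughly a 2x DCA
print(round(logistic_multiplier(0.5), 2))  # ~4.57, roughly a 4x DCA

# Sell side: invert n against the target Fib level multiplier + 1,
# e.g. for a 2x target, n_sell = (2 + 1) - price / yearly_sma.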

Golden Ratio Multiplier Tutorial

Josh Olszewicz also has a video tutorial on YouTube about this particular indicator, using it with TradingView. It is worth taking 10 minutes or so to review it as background material, along with looking at https://www.lookintobitcoin.com/charts/golden-ratio-multiplier/ and the original Medium article. I am going to try to stick to the essence of the code and not so much the background material, which can be read or viewed in those locations.

Code

This code is lightweight and can be run on a Raspberry Pi as well as a Linux PC. It requires Python 3, as this is needed for the API.

The code is designed to use the Coinbase Pro API; however, there is a middle layer that interfaces with the API via cbpro_buy_sell.py. If someone wants to interface with another exchange, this is the only piece that needs to be modified to make the proper calls into the other exchange's API on the back end. Think of it as a thin interface layer that sits between the code and the API itself.
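As a rough illustration of that layering, here is a hedged sketch of what the thin interface functions could look like. The function names are hypothetical stand-ins, not the actual contents of cbpro_buy_sell.py, and the place_limit_order call assumes the cbpro AuthenticatedClient interface.

# Hypothetical sketch of a thin exchange-interface layer. Swapping exchanges
# would mean re-implementing only these calls; the golden ratio multiplier
# logic that sits above them stays unchanged.

def limit_buy(client, pair, price, size):
    """Place a limit buy through the exchange API client."""
    return client.place_limit_order(product_id=pair, side='buy',
                                    price=str(price), size=str(size))

def limit_sell(client, pair, price, size):
    """Place a limit sell through the exchange API client."""
    return client.place_limit_order(product_id=pair, side='sell',
                                    price=str(price), size=str(size))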

cbpro-cli-tools

Dependency: One import

This is the import cbpro line that pulls in the module for the Coinbase Pro (cbpro) API interface.

Other Internal Dependencies

There are a few dependencies in the code; thankfully they are local dependencies. The intent was to make this code as freestanding as possible, without importing various libraries beyond what is barely necessary to make it operate. The first is a file called coretamodule.py. It holds a lot of functions that are used in multiple algorithms, some of which are used in the code that implements the golden ratio multiplier. The other necessary dependency is a piece of code that periodically polls the CB Pro ticker and appends prices and volumes to a CSV file. It is run from something like cron on a regular basis (hourly, daily, every five minutes, what have you), at the same rate as the main golden ratio multiplier script, with the polling code called first to fill in the CSV file that the main script then reads in. I will try to make this data publicly available on a site where it can be pulled down, filled in up to the current day's data.
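A minimal sketch of that polling step, assuming the public ticker endpoint of the cbpro package; the CSV path and column order here are illustrative, not the exact format the main script expects.

import csv
import cbpro  # pip install cbpro

def append_ticker(pair='BTC-USD', path='BTC-USD.csv'):
    """Poll the Coinbase Pro public ticker once and append time, price, volume."""
    tick = cbpro.PublicClient().get_product_ticker(product_id=pair)
    with open(path, 'a', newline='') as f:
        csv.writer(f).writerow([tick['time'], tick['price'], tick['volume']])

if __name__ == '__main__':
    # Run from cron at the same cadence as the main script (hourly, daily, etc.)
    # so the CSV always contains data up to the current day.
    append_ticker()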

Core concepts

The idea of the code is that a 350 day moving average is calculated. This forms the basis of the ratio, the golden ratio, between the average and the price itself. When the ratio of price to the 350 day moving average is below a threshold, which is coded into a dictionary of tuples (one by default), the code will accumulate the particular currency pair. There is another number in this threshold tuple; when the ratio is above that value, the code will distribute the currency pair.

Some fine-tuning is applied when the price is above or below these thresholds. The fine-tuning is based on Bollinger Bands: when the price is touching the bottom of the 20 day Bollinger Bands and is also the minimum price for the 20 day period, a buy is initiated. Conversely, for a sale, when the price is at the upper edge of the Bollinger Bands and has also hit the maximum price for the 20 day period, a sale is initiated. This makes it dollar cost average in and out on a periodic basis. The idea is to have extra filtering that roughly finds a peak or a valley in the price movement without DCAing (Dollar Cost Averaging) in or out too much.
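A condensed sketch of that decision flow, assuming daily closes in a pandas Series; the threshold values and the 'buy'/'sell'/'hold' return convention are illustrative, not the actual code.

import pandas as pd

# Hypothetical accumulate/distribute thresholds per pair, as (low, high)
# multiples of the 350 day SMA.
THRESHOLDS = {'BTC-USD': (1.0, 2.0)}

def decide(closes, pair):
    """Return 'buy', 'sell' or 'hold' from the golden ratio + Bollinger filter."""
    sma350 = closes.rolling(350).mean().iloc[-1]
    sma20 = closes.rolling(20).mean().iloc[-1]
    std20 = closes.rolling(20).std().iloc[-1]
    lower, upper = sma20 - 2 * std20, sma20 + 2 * std20
    price = closes.iloc[-1]
    ratio = price / sma350
    low, high = THRESHOLDS[pair]

    # Accumulate: ratio below the low threshold, price touching the lower band
    # and also the 20 day minimum. Distribute: the mirror image on the upside.
    if ratio < low and price <= lower and price <= closes.iloc[-20:].min():
        return 'buy'
    if ratio > high and price >= upper and price >= closes.iloc[-20:].max():
        return 'sell'
    return 'hold'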

defines.py

There is also a dependency file called defines.py. This file has the portfolio amount in it; that amount is now dynamically allocated by using cbpro_read_accts.py. It will scale the amount of USD traded; however, BTC and ETH have to be adjusted within the main function to pick values that are comfortable and proper for the trading circumstances. This allows for configuration between all of the pairs. Also in the defines file you will have to enter the key, passphrase and base64 secret; this allows the trades to occur through the API, as these are passed on by the main code.
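As an example of the kind of thing that file holds, here is a hypothetical sketch; the variable names are made up for illustration and are not the actual contents of defines.py.

# defines.py (hypothetical sketch) -- never commit real credentials.
API_KEY = 'your-key'
API_PASSPHRASE = 'your-passphrase'
API_B64_SECRET = 'your-base64-secret'

# Portfolio sizing: USD amounts can scale from the account balances read via
# cbpro_read_accts.py, while BTC and ETH sizes are adjusted in the main code.
PORTFOLIO_USD = 1000.0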


Outputs and Logs

The output from the code is humanly interpretable, so you can understand what is going on. There are rows of prices after the initial currency pair is listed; these prices represent the various thresholds. At the end it states whether it is holding or not, and also what the threshold and target are, the threshold being the Bollinger Band edge and the target being the lowest or highest price for the 20 day period.

Additionally, there are log files created when the code runs: several are verbose and specific to an underlying currency, and there is one summary file. The verbose files contain the returned output from the API function call, which is a dictionary of values returned from CB Pro. A message will also appear within the non-verbose output; it usually relates to something that needs to be corrected, possibly a bug or something like insufficient funds. This is driven by the message key from the CB Pro API return dictionary.

 

The code makes every attempt to avoid this because it uses limit orders and it also checks the balances of both the underlying currency and the currency to be bought or sold.

Compounding Positions

There is also a logistic scaler in this code. The logistic scaler works by increasing the amount of currency bought when the price falls below the threshold of the golden ratio multiplier, which is by default one. This increases the amount purchased automatically, but it also reaches a limit as the ratio between the price and the 350 day moving average approaches zero; it goes to a limiting constant. The opposite is true for sells: there is the threshold that is coded into the dictionary of tuples, and the logistic scaler takes that value plus one and forms a curve that runs from the upper threshold up to 1 + that value, increasing the amount sold, and past this plus-one position it decreases once again toward zero. The idea is that it scales out hard as the price rises but has a limiting factor if it rises above the threshold too far. These thresholds would be adjusted manually to what is expected of the currency, so periodic maintenance might be required on the thresholds, or they might just be set and forget for some people. It all depends on how you want to invest.

 

The defaults are one and two in a tuple for every currency pair, and these can be adjusted as needed. It is also possible to adjust the logistic scaler constants, k and e: e is the exponential constant that controls the rate of rise of the curve, and k controls the multiplier effect on the amount that would be traded in the limit.
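A small numeric illustration of that sell-side inversion under the default (1, 2) thresholds, reusing the illustrative logistic_multiplier sketch from earlier; the k and e defaults are the same assumed example values.

def logistic_multiplier(n, k=0.2, e=4):
    return (1 + k) / (k + n ** e)

def sell_multiplier(price_ratio, upper_threshold=2.0, k=0.2, e=4):
    """Invert n against (upper threshold + 1): selling scales up as the
    price/SMA ratio climbs past the threshold, peaks near threshold + 1,
    then eases back down again."""
    n = (upper_threshold + 1) - price_ratio
    return logistic_multiplier(n, k, e)

for ratio in (2.0, 2.5, 3.0, 3.5):
    print(ratio, round(sell_multiplier(ratio), 2))  # 1.0, 4.57, 6.0, 4.57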

Outer Loop

The code works by looping through a list, actually a dictionary, of underlying currencies, with USD/ETH/BTC as the underlyings. Then there is a function call, and the inner currencies are looped over; these are the currencies that are actually traded against the underlying currencies. This allows many currency pairs to be traded and others to be added in the future.
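Roughly, the nesting looks like the sketch below; the dictionary contents and the process_pair name are placeholders for the per-pair logic, not the actual code.

# Hypothetical mapping of underlying currencies to the pairs traded against them.
PAIRS = {
    'USD': ['BTC-USD', 'ETH-USD'],
    'BTC': ['ETH-BTC'],
    'ETH': [],  # more pairs can be added later
}

def process_pair(underlying, pair):
    """Placeholder for the per-pair golden ratio multiplier logic."""
    print(f"checking {pair} against {underlying}")

def run_all():
    for underlying, pairs in PAIRS.items():
        for pair in pairs:
            process_pair(underlying, pair)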

CSV Data Collection using API

The caveat here is that there has to be enough data in the CSV file to go back far enough in time, and if a currency pair is to be added, it has to be added there so the ticker data gets harvested. I suppose in theory it would be possible to fit all of the prior data into the file itself, to add a currency that has been running on the ticker for a while, if someone is motivated to do that. With this code in a folder and called by cron, preferably using a script that first harvests the price and volume data and then calls the main code, it will work seamlessly as a plug and play algorithm. Because it is using the golden ratio multiplier there is no need for back testing, as this has been proven out by Philip Swift in his presentation and write up online.

Obviously this code could be modified in any way: the 350 day average could be changed to something else, along with the 20 day Bollinger Band, and the thresholds, as I said earlier, are configurable.


makemore

Makemore

The best way to predict a future is to look for it. We live in the moment where the world is a vast machine for predicting the future.

This summer, I spent a significant amount of time contemplating large language models and delving deeper into their research. My first encounter with GPT-2 was back in 2019, when I explored its code and experimented with it. During this period, I became curious about transfer learning and its applications. Additionally, I had some prior knowledge about transformers, but it wasn’t as comprehensive as my understanding of LSTMs and RNNs. I couldn’t confidently explain what they did, for example.

While researching transfer learning with smaller models like GPT-2, I stumbled upon Gwern Branwen’s website (https://gwern.net/) and, in particular, his TWDNE Project (https://gwern.net/twdne). I found it clever because it combined a generative model for both images and text. I decided to focus on the text side of the project, as the image aspect was already well-addressed by applications like Stable Diffusion….

Misato Katsuragi as a Math Teacher

I might revisit the image style transfer aspect in the future, as I had previously explored it to some extent. You can find more about this in my “How to Generate Art Demo Followup.”

Before this, I had predominantly explored machine learning with code from the ground up using Python (PMLC). I have used ML practically in the form of genetic algorithms for tuning parameters on investing models for years, non-differentiable, so no chain rule! An offshoot was a project called gen-gen-algo, a generic genetic algorithm. Now, finally after all these side quests, I was ready to tackle something more complex and cutting-edge using GPT.

I found excellent resources on GitHub and in video format from Andrej Karpathy (https://github.com/karpathy). The following repositories were particularly helpful in my learning journey. The first one, “nn-zero-to-hero,” features a series of videos that provided a solid foundation in understanding transformers.

The second repository, “makemore,” served as my warm-up exercise to get back into working with transformers and Large Language Models (LLMs) after a period of dormancy in the field. You can access these repositories here:

1. “nn-zero-to-hero”: https://github.com/karpathy/nn-zero-to-hero
2. “makemore”: https://github.com/karpathy/makemore

Fork of makemore

My experience with “makemore” went beyond the basic examples provided in the original repository, which generated new names based on a dataset of names. Initially, my goal was to apply “makemore” to various datasets other than “names.txt.” I experimented with larger and smaller datasets, including those with extensive collections of English words, numbers for addition, square roots, and a substantial dataset of quotes containing nearly 10 million entries, some of which had lines as long as 505 characters. By using scripts and modifications to “makemore.py,” I conducted a grid search to optimize hyperparameters, including constraints on model size. Output from “makemore.py” was saved to a CSV file, along with hexadecimal hash values for easy tracking and analysis during the tuning process.

To further enhance the code, I introduced a grid search optimization method using a Bash script. This allowed for exploring the hyperparameter space while maintaining a ceiling on the model size. Without such constraints, optimization typically led to increasingly larger models that resulted in the lowest loss.

I also introduced the concept of assigning a random hexadecimal tag to the output of “makemore.py.” This tagging system facilitated the easy identification of the best loss and the associated set of hyperparameters that produced it. Additionally, I incorporated an early stopping mechanism into the “makemore.py” code.
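For illustration, here is a small helper in the spirit of that tagging and logging scheme; the field names, file path and hyperparameters are made up for the sketch and are not lifted from the fork.

import csv
import secrets

def random_tag(n_bytes=4):
    """Random hexadecimal tag, e.g. '9f3ac1d2', to label one training run."""
    return secrets.token_hex(n_bytes)

def log_run(tag, hyperparams, best_loss, path='runs.csv'):
    """Append one grid-search result so the best loss can be traced back to
    the hyperparameters (and saved model) that produced it."""
    with open(path, 'a', newline='') as f:
        csv.writer(f).writerow([tag, best_loss, *hyperparams.values()])

log_run(random_tag(), {'n_layer': 4, 'n_head': 4, 'n_embd': 64}, best_loss=1.98)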

If you’re interested in exploring my fork of Andrej Karpathy’s “makemore” code, you can find it here:

https://github.com/erickclasen/makemore

For a more detailed understanding, I’ve created a comprehensive “verbose-readme.pdf” that provides additional information:

Version on this site, opens in browser:

verbose-readme

GitHub Version requires downloading:

https://github.com/erickclasen/makemore/blob/master/verbose-readme.pdf


DMAC BTC Trading Example With Tuning

I got this example, originally called simple-strat.py, from a Datacamp article. I have my changes to the code up on GitHub.

Tunable DMAC

Basically, the modifications to the original code incorporate a Dual Moving Average Crossover (DMAC) strategy that adjusts the averages to pick optimum values. DMAC is a very basic trend trading strategy. Other types of strategies include mean reversion, hedging such as pairs trading, arbitrage, and buying and selling across the bid-ask spread as a market maker, so-called earning the spread.

DMAC Basics

A dual moving average crossover is exactly what the name says. There are two moving averages, one short and one long. When the short one rises above the long one, it is a buy signal. When they cross the other way, guess what: sell. This is about as dirt simple as a strategy can get. A single moving average, where price crosses one average, would be the simplest moving average strategy. The DMAC used in the example is not fancy at all: no stops, no screening, no failure mode protections. I have found that about 10% of designing a good algorithmic strategy is in the strategy itself, another 10-20% is in the tuning, generalizing and backtesting. The rest is protecting the strategy from itself, such as screening out bad trades. The absolute worst is price shocks, which can’t be predicted.
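In code the crossover rule is only a few lines; here is a minimal pandas sketch of the signal logic, not the exact code from the Datacamp example, though the column names mirror its output.

import pandas as pd

def dmac_signals(close, short_win=29, long_win=59):
    """Dual moving average crossover: long when the short SMA is above the long SMA."""
    out = pd.DataFrame(index=close.index)
    out['short_mavg'] = close.rolling(short_win, min_periods=1).mean()
    out['long_mavg'] = close.rolling(long_win, min_periods=1).mean()
    out['signal'] = (out['short_mavg'] > out['long_mavg']).astype(float)
    out['positions'] = out['signal'].diff()  # +1.0 = buy cross, -1.0 = sell cross
    return out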

Tuning

Initially it was coded to work with stocks, but I revised it to take in BTC instead. I also took it one level up and had it automatically tune for the best combination of the short and long averages for the DMAC strategy. This is a bit better than sitting there and fiddling with the parameters by hand. The pitfall is that the model can be overfit to the backtest data and then perform poorly on new data. It is best to have a model that generalizes a bit. Sometimes this can be accomplished by actually dumbing the process down a bit or using a stochastic approach. The more parameters in the model, the more likely overfitting will occur as well.
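The tuning itself is just a brute-force sweep over (short, long) pairs, keeping whichever combination ends with the most cash. The sketch below reuses dmac_signals from above and a toy all-in/all-out backtest, not the portfolio logic from the actual script; the ranges are arbitrary.

def backtest(close, short_win, long_win, initial_cash=100_000.0):
    """Toy backtest: go all-in on an up-cross, all-out on a down-cross."""
    sig = dmac_signals(close, short_win, long_win)
    cash, coins = initial_cash, 0.0
    for price, pos in zip(close, sig['positions'].fillna(0)):
        if pos > 0 and cash > 0:      # buy signal
            coins, cash = cash / price, 0.0
        elif pos < 0 and coins > 0:   # sell signal
            cash, coins = coins * price, 0.0
    return cash + coins * close.iloc[-1]

def tune(close, short_range=range(1, 30), long_range=range(2, 100)):
    """Grid search the average lengths, printing each new best result."""
    best = (None, None, 0.0)
    for short_win in short_range:
        for long_win in long_range:
            if short_win >= long_win:
                continue
            bank = backtest(close, short_win, long_win)
            if bank > best[2]:
                best = (short_win, long_win, bank)
                print('Short AVG,Long AVG,Best Bank', *best)
    return best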

Aside on Ensemble Backtesting

One approach that I typically use when tuning an algorithmic model’s parameters is an ensemble approach. I have used this successfully with genetic algos, such as differential evolution. I have the backtest tuning take random slices of the time series from random points within the series, run a backtest on each one and then average the results together to form the ensemble. This helps to generalize, but it won’t work for every kind of model. It has to do with the nature of the local maxima and minima that the model might rise up to or fall into. With some models, averaging may just average out to the plain and not the peaks and valleys. This has to be looked at carefully. But I am getting off topic here.
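A sketch of that slicing idea, reusing the toy backtest above; the number of slices and the window length are arbitrary choices for illustration.

import random

def ensemble_score(close, short_win, long_win, n_slices=10, window=365):
    """Average the backtest result over random contiguous slices of the series."""
    scores = []
    for _ in range(n_slices):
        start = random.randrange(0, len(close) - window)
        segment = close.iloc[start:start + window]
        scores.append(backtest(segment, short_win, long_win))
    return sum(scores) / len(scores)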

 DMAC versus Buy and Hold

The idea here is to do some Monday morning quarterbacking. When I was in industry this was always a thing on Mondays. No matter what the sport, everyone threw around the coulda, woulda, shouldas. Probably more so if there was money riding on the game. Well, if you are a trader or investor, there is real money riding on the game, so sometimes it is worth looking back and seeing how a trading strategy would have worked against the classic long-term buy and hold, a.k.a. HODL, hold on for dear life.

If any of the terminology or details above seem fuzzy, check out the Datacamp article.

Fantasy Account

If we bought BTC on 1/1/2019 and sold on 12/31/2019 and had a magic wand to make the perfect DMAC tuned model, would we have done better than buy and hold?

We start with $100K on 1/1/2019 and cash out on 12/31. What do we have?

DMAC with a 29 day short and 59 day long average…

Short AVG,Long AVG,Best Bank 29 59 902908.9599609375

$900K, the clear winner.

Buy and Hold comes in at $182K

HODL Gains: 1.8258754145331861

Obviously, these are perfect trades; like a Carnot cycle engine without friction, it’s not realistic. In the real world there are fees, slippage, price shocks, badly picked parameters and other gotchas.

But even if you threw a dart at pairs of numbers for the averages, at least for 2019, odds are the DMAC would have beaten buy and hold for the year. This is not always the case; in years past a buy and hold would have beaten the DMAC. See the printout here for a 2010-2017 run.

2019 BTC Price with Average and Buy and Sell points, Up Arrow = BUY, Down Arrow = SELL
2019: Amount of cash made on paper

output from simple-strat-loop-backtest-2019.py

Get Data
Run Model
Short AVG,Long AVG,Best Bank 1 1 100000.0
Short AVG,Long AVG,Best Bank 1 2 611522.802734375
Short AVG,Long AVG,Best Bank 1 4 646585.4736328125
Short AVG,Long AVG,Best Bank 1 16 652523.6328125
Short AVG,Long AVG,Best Bank 1 17 689456.2744140625
Short AVG,Long AVG,Best Bank 3 89 690951.904296875
Short AVG,Long AVG,Best Bank 4 90 696133.9599609375
Short AVG,Long AVG,Best Bank 6 86 707596.8994140625
Short AVG,Long AVG,Best Bank 9 82 709683.1787109375
Short AVG,Long AVG,Best Bank 9 83 732913.4521484375
Short AVG,Long AVG,Best Bank 10 83 742904.150390625
Short AVG,Long AVG,Best Bank 13 17 746094.677734375
Short AVG,Long AVG,Best Bank 14 81 762164.5263671875
Short AVG,Long AVG,Best Bank 22 95 781311.2548828125
Short AVG,Long AVG,Best Bank 27 55 814494.62890625
Short AVG,Long AVG,Best Bank 29 56 820056.7626953125
Short AVG,Long AVG,Best Bank 29 57 886127.4658203125
Short AVG,Long AVG,Best Bank 29 59 902908.9599609375
signal   short_mavg    long_mavg  positions
Date
2019-01-01     0.0  3843.520020  3843.520020        NaN
2019-01-02     0.0  3893.464722  3893.464722        0.0
2019-01-03     0.0  3874.556885  3874.556885        0.0
2019-01-04     0.0  3870.347046  3870.347046        0.0
2019-01-05     0.0  3865.316553  3865.316553        0.0
2019-01-06     0.0  3900.535889  3900.535889        0.0
2019-01-07     0.0  3918.351946  3918.351946        0.0
2019-01-08     0.0  3932.413940  3932.413940        0.0
2019-01-09     0.0  3943.845323  3943.845323        0.0
2019-01-10     0.0  3917.353247  3917.353247        0.0
2019-01-11     0.0  3896.445268  3896.445268        0.0
2019-01-12     0.0  3876.849915  3876.849915        0.0
2019-01-13     0.0  3851.934777  3851.934777        0.0
2019-01-14     0.0  3841.514596  3841.514596        0.0
2019-01-15     0.0  3827.458643  3827.458643        0.0
2019-01-16     0.0  3816.680405  3816.680405        0.0
2019-01-17     0.0  3808.555908  3808.555908        0.0
2019-01-18     0.0  3800.182766  3800.182766        0.0
2019-01-19     0.0  3796.413587  3796.413587        0.0
2019-01-20     0.0  3786.643591  3786.643591        0.0
2019-01-21     0.0  3776.614490  3776.614490        0.0
2019-01-22     0.0  3768.794611  3768.794611        0.0
2019-01-23     0.0  3760.808891  3760.808891        0.0
2019-01-24     0.0  3754.144582  3754.144582        0.0
2019-01-25     0.0  3747.969434  3747.969434        0.0
2019-01-26     0.0  3742.372934  3742.372934        0.0
2019-01-27     0.0  3736.506004  3736.506004        0.0
2019-01-28     0.0  3727.004020  3727.004020        0.0
2019-01-29     0.0  3717.387224  3717.387224        0.0
2019-01-30     0.0  3705.065211  3709.680371        0.0
…            …          …          …        …
2019-12-03     0.0  8139.303778  8350.672074        0.0
2019-12-04     0.0  8067.217824  8338.195445        0.0
2019-12-05     0.0  8001.267056  8324.681624        0.0
2019-12-06     0.0  7941.937231  8313.125919        0.0
2019-12-07     0.0  7898.880573  8295.507233        0.0
2019-12-08     0.0  7855.803442  8278.183023        0.0
2019-12-09     0.0  7798.747340  8262.575270        0.0
2019-12-10     0.0  7747.724289  8244.635676        0.0
2019-12-11     0.0  7692.612742  8225.930954        0.0
2019-12-12     0.0  7638.642797  8206.752102        0.0
2019-12-13     0.0  7589.042447  8190.893042        0.0
2019-12-14     0.0  7541.893538  8175.251465        0.0
2019-12-15     0.0  7493.670814  8159.122492        0.0
2019-12-16     0.0  7436.929603  8141.483051        0.0
2019-12-17     0.0  7379.385776  8118.634824        0.0
2019-12-18     0.0  7347.339473  8102.613207        0.0
2019-12-19     0.0  7318.911065  8084.971233        0.0
2019-12-20     0.0  7304.292666  8070.405356        0.0
2019-12-21     0.0  7300.657530  8064.922082        0.0
2019-12-22     0.0  7304.581391  8065.228863        0.0
2019-12-23     0.0  7315.192130  8043.109003        0.0
2019-12-24     0.0  7321.274835  8010.525266        0.0
2019-12-25     0.0  7323.232927  7971.939519        0.0
2019-12-26     0.0  7313.139935  7937.750000        0.0
2019-12-27     0.0  7307.173811  7901.519506        0.0
2019-12-28     0.0  7291.889211  7869.523975        0.0
2019-12-29     0.0  7286.821037  7839.406482        0.0
2019-12-30     0.0  7282.293541  7806.048696        0.0
2019-12-31     0.0  7277.866329  7769.928041        0.0
2020-01-01     0.0  7273.729391  7735.433461        0.0

[366 rows x 4 columns]
BTC-USD  holdings      cash     total  returns
Date
2019-01-01      0.0       0.0  100000.0  100000.0      NaN
2019-01-02      0.0       0.0  100000.0  100000.0      0.0
2019-01-03      0.0       0.0  100000.0  100000.0      0.0
2019-01-04      0.0       0.0  100000.0  100000.0      0.0
2019-01-05      0.0       0.0  100000.0  100000.0      0.0
BTC-USD  holdings           cash          total  returns
Date
2019-12-28      0.0       0.0  902908.959961  902908.959961      0.0
2019-12-29      0.0       0.0  902908.959961  902908.959961      0.0
2019-12-30      0.0       0.0  902908.959961  902908.959961      0.0
2019-12-31      0.0       0.0  902908.959961  902908.959961      0.0
2020-01-01      0.0       0.0  902908.959961  902908.959961      0.0
Portfolio Plot
Sharpe Ratio 2.2316038135122973
Compound Annual Growth Rate (CAGR) 0.8258754145331861
HODL Gains: 1.8258754145331861

 

Maximum Drawdown for the Period 1/1/2019-12/31/2019

2018

As a followup, I was curious about the rough 2018 year for BTC. 2018 was quite bearish, with only a few small rallies, and it had awful buy and hold performance.

HODL Gains: 0.2565408133534354

Short AVG,Long AVG,Best Bank 1 4 498855.46875

Buy and hold would have been about a 4x loss, while the DMAC would have traded in and out quite often, racking up some fees and slippage of course, but it made about a 5x gain.

2018 BTC Price with Average and Buy and Sell points, Up Arrow = BUY, Down Arrow = SELL
2018: Amount of cash made on paper
2018 Drawdown

2018 Results Details on Github

Github repo

https://github.com/erickclasen/DMAC-BTC-Trading-Example-With-Tuning

Most Starred Python Projects on Github

I have been working through the Python Crash Course book by Eric Matthes. I got to Chapter 17, Working with APIs, and ran the example code python_repos_updated.py, which generates an SVG graph of the most starred GitHub Python projects. The thing that stands out to me is the number of machine learning related resources that show up on this hot list. The interactive graph is generated using Pygal, which works well and is easy to use.

Interactive SVG Graph of Most Starred Github Python Projects

Out of curiosity I looked at the most starred C projects on Github as well with a simple modification of the code.

Interactive SVG Graph of the Most Starred Github C Projects
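The modification is essentially swapping the language term in the search query; a minimal sketch along the lines of the book’s example is below, with the output filename and chart styling chosen arbitrarily.

import requests
import pygal

# Query the GitHub search API for the most starred C repositories.
url = ('https://api.github.com/search/repositories'
       '?q=language:c+sort:stars+stars:>10000')
headers = {'Accept': 'application/vnd.github.v3+json'}
repos = requests.get(url, headers=headers).json()['items']

names = [repo['name'] for repo in repos]
stars = [repo['stargazers_count'] for repo in repos]

chart = pygal.Bar(x_label_rotation=45, show_legend=False)
chart.title = 'Most-Starred C Projects on GitHub'
chart.x_labels = names
chart.add('', stars)
chart.render_to_file('c_repos.svg')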

Python Crash Course

The book has been a good resource for developing programming skills in Python. My background is primarily in C, and this book was an excellent starting point to officially learn Python. What I mean by that is that I had been using Python by modding other people’s code whenever I needed some piece of code for some application. For a few years I was learning it that way on an as-needed basis (plus using the Internet to look up things that I might get stuck on), which works to a point. When I started to get a bit deeper into machine learning, I could see that having a more formal introduction to Python would be helpful. Having a book that goes through the process formally helped to quickly build skills, in my opinion. In a matter of a few months I have gone through a decent amount of the book and have been able to branch off some of the examples and create freestanding code that was interesting to learn from.