@zooko: This data seems very important. While it’s been very kind of @garethtdavies to volunteer their time to maintain it, might it fall within ECC’s funding to keep it going? It is not currently possible for anyone to request ecosystem funding through the existing ZF Grants platform to do so.
Hey, do you have the CSVs for the data somewhere? I’m happy to mess around with different graphs.
I keep coming back to this thread to track shielded adoption, so it’s the least I can do.
Sounds great! Here’s the raw data used. It’s at the tx level, so it’s ~7 million rows and ~850 MB. The script used to determine transaction types is in the original post.
https://keybase.pub/garethtdavies/zcash-all-tx.csv
I’ll also continue to publish the monthly data as I have been doing, just dropping the real-time dashboard until it can be better implemented as per the suggestions above.
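If it helps anyone who wants to mess around with graphs, here is a rough sketch of how that CSV could be aggregated into monthly counts per transaction type with pandas. The "time" and "type" column names are assumptions, so check the CSV header and adjust:

# Rough sketch: aggregate the tx-level CSV into monthly counts per tx type.
# The "time" and "type" column names are assumptions -- check the CSV header.
import pandas as pd

df = pd.read_csv("zcash-all-tx.csv", parse_dates=["time"])

monthly = (
    df.groupby([df["time"].dt.to_period("M"), "type"])
      .size()
      .unstack(fill_value=0)
)

# Share of each type per month, e.g. fully transparent vs z2z.
shares = monthly.div(monthly.sum(axis=1), axis=0)
print(shares.tail())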
Updated stats for August. While monthly transactions were the highest since August last year, 90% of those were fully transparent. A mere 1.5% of transactions in August were z2z.
I’ve separated out the shielded coinbase stats and will later show them as a % of all coinbase transactions once that ratio becomes more meaningful. In August there were 24 of them.
There was 1 fully shielded Sprout tx and 0 migration transactions in August. The latter is interesting given there is still ~115k in the Sprout pool.
How many total coinbase outputs were generated in August, to compare to the 24 shielded ones?
There were 35,513 transparent coinbase tx in August.
Thanks! Much appreciated.
So shielded adoption clearly needs massive improvement. But I don’t think it makes sense to focus on shielded coinbase separately.
Shielded coinbase was hyped a bit too much. In and of itself, it has nothing to do with privacy: the source, amount, and recipient of a “shielded” coinbase are, I’m told, public, which is the same level of privacy you get from a transparent coinbase. So why bother? Well, now the money is in the shielded pool, and as a mining pool operator you can distribute your payouts privately without extra work.
Mining pools clearly aren’t doing this, but that shouldn’t be surprising:
- Their pool management software needs to be updated. That takes time and there needs to be demand to do it.
- And, well, there needs to be demand for shielded usage in the first place.
So, we’re back to shielded usage needing to go up. Which it does. But shielded coinbase is a sideshow.
Shielded coinbase is a necessary prerequisite to moving to a fully shielded protocol. Might as well have done that part early.
BTW, I see this currently for the dashboard link:
Whoops, we couldn’t load the dashboard
The dashboard does not exist, or you don’t have permission to view it
Updated for September. 18,716 fully shielded transactions - that’s a new high and 11% of all transactions
For those measuring shielded coinbase adoption, there were 128 shielded out of 34,399 total coinbase transactions. There were also 7 migration transactions.
this is awesome
tested ACK.
Also posted in a separate thread, but I wanted to follow up here as well: I’ve just launched (with Gareth’s advice and assistance) zdash.info - a metrics dashboard that now offers live feeds for much of the shielded adoption data covered in these reports. Updated hourly, and offers interactive charts by the day, week, month and year:
Oh man, that screenshot looks cool. zdash.info appears to not be up any more?
What’s the latest on measuring shielded adoption?
The ECC metrics page is the only one that’s up and running, I think. Their data is updated monthly. It’s sad for me to look at the graphs, though; more and more $ZEC leaves the shielded pool.
Thanks, yeah; I’m familiar with that page. I would really like to be able to drill down to months, weeks, days, hours. Is the code behind these graphs open source in a repo somewhere? It’s early, but I think of shielded adoption as a sort of KPI for https://free2z.cash, which only barely went live in early March.
Don’t despair! Transparent liquidity on exchanges and low prices in this phase can still be good in the long run as it gives more people the ability to get in. I think cases like ZDA/UDA, free2z and general increased interest in privacy could easily converge to turn around this dip in the shielded pool.
I haven’t read this thread thoroughly, but are there any prevailing hypotheses on the two big spikes (and then dips) in 2020 and 2021?
EDIT (Also, +~10,000 ZEC between Feb and March isn’t too bad if I’m reading this right: zcash-metrics - Google Sheets)
Hello,
For now we have been using BlockSci. The initial setup takes quite some time and is not straightforward. Other folks have just used BigQuery, depending on their requirements.
Below is a fairly standard approach and the configs you will need:
- Building Blocksci:
sudo add-apt-repository ppa:ubuntu-toolchain-r/test -y
sudo apt-get update
sudo apt install cmake libtool autoconf libboost-filesystem-dev libboost-iostreams-dev \
libboost-serialization-dev libboost-thread-dev libboost-test-dev libssl-dev libjsoncpp-dev \
libcurl4-openssl-dev libjsonrpccpp-dev libsnappy-dev zlib1g-dev libbz2-dev \
liblz4-dev libzstd-dev libjemalloc-dev libsparsehash-dev python3-dev python3-pip
git clone git@github.com:cryptolu/BlockSci.git
cd BlockSci
mkdir release
cd release
CC=gcc-7 CXX=g++-7 cmake -DCMAKE_BUILD_TYPE=Release ..
make
sudo make install
cd ..
CC=gcc-7 CXX=g++-7 sudo -H pip3 install -e blockscipy
This setup will likely vary depending on your system. This was tested on Ubuntu 18.04 and 20.04.
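A quick smoke test that the blockscipy bindings actually built and installed (before parsing any chain data) is just importing the module:

# Smoke test: confirm the blockscipy bindings import cleanly after the build.
import blocksci
print(blocksci.__file__)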
- Ensure the given zcash.conf has at least these two params:
txindex=1
insightexplorer=1
Important note: this will increase the expected chain size on disk compared to a default zcash.conf, and this chain needs to be run only with nodes matching the above config to avoid reindexing.
- Start zcashd and ensure the node is at the current height of the given chain
- While zcashd is still running, issue the following in another terminal, using the values from your zcash.conf:
blocksci_parser --output-directory zcash-data update --max-block -100 rpc --username your-rpc-username --password your-rpc-password --address 127.0.0.1 --port 8232
This will take a long time (8+ hours) to create the folder called zcash-data, but once you have that archive for the chain you can re-run blocksci_parser as needed to quickly update zcash-data.
- The code examples above using the blocksci module can query zcash-data as needed to get metrics (a minimal sketch also follows the link below):
Outputs aggregated metrics from the Zcash blockchain · GitHub
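For anyone who just wants the shape of such a query, here is a minimal sketch against the parsed archive. It only uses generic BlockSci fields (tx counts, the coinbase flag, transparent input/output counts); the Zcash-specific shielded accessors exposed by the cryptolu fork may be named differently, so treat this as a starting point rather than a full classifier:

# Minimal sketch: query the parsed zcash-data archive with the blocksci module.
# Only generic fields are used here; the fork's shielded-specific accessors
# (JoinSplit/Sapling data) would be needed for a full type breakdown.
import blocksci

chain = blocksci.Blockchain("zcash-data")

total = coinbase = no_transparent_io = 0
for block in chain[-1000:]:          # most recent 1000 parsed blocks
    for tx in block.txes:
        total += 1
        if tx.is_coinbase:
            coinbase += 1
        # No transparent inputs or outputs => the value flow is shielded.
        if tx.input_count == 0 and tx.output_count == 0:
            no_transparent_io += 1

print(f"{total} txes, {coinbase} coinbase, "
      f"{no_transparent_io} with no transparent inputs/outputs")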
Depending on your requirements, this tool can be ideal for off-chain metrics rather than on-chain metrics.
Forgot to explain a couple of important details in the above blocksci_parser command: update --max-block -100.
The update subcommand updates the newly created zcash-data dir containing the metrics; it does not create a new metrics dir each time the command is issued. That directory (--output-directory zcash-data) will be created relative to wherever you issue the command from. It is NOT the standard zcash datadir used on the command line to operate a full node (e.g. src/zcash-cli -datadir=~/.zcash/zcash/).
I’ve used --max-block -100 as a safe window to pull metrics from a running node: it gathers metrics up to 100 blocks behind the current height. In my experience this helps avoid corrupting the --output-directory zcash-data metrics dir and sidesteps certain file locks, as well as chains that often reorg, like testnet. If you are gathering metrics on a chain with different confirmation windows or consensus rules, you can likely lower 100 to something closer to the current height.
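One quick sanity check after an update run: with --max-block -100 the parsed archive should trail the node tip by roughly 100 blocks. Something like this (assuming blockscipy is installed and zcash-cli is on your PATH with the same zcash.conf) makes that easy to confirm:

# Sanity check: the parsed archive should lag the running node by ~100 blocks
# when the parser was run with --max-block -100.
import subprocess
import blocksci

chain = blocksci.Blockchain("zcash-data")
parsed_tip = chain[-1].height

node_tip = int(subprocess.check_output(["zcash-cli", "getblockcount"]))

print(f"node height: {node_tip}, parsed height: {parsed_tip}, "
      f"lag: {node_tip - parsed_tip} blocks")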