# Timescale NFT Starter Kit
The Timescale NFT Starter Kit is a step-by-step guide to get up and running with collecting, storing, analyzing and visualizing NFT data from OpenSea, using PostgreSQL and TimescaleDB.
The NFT Starter Kit will give you a foundation for analyzing NFT trends so that you can bring some data to your purchasing decisions, or just learn about the NFT space from a data-driven perspective. It also serves as a solid foundation for your more complex NFT analysis projects in the future.
We recommend following along with the NFT Starter Kit tutorial to get familiar with the contents of this repository.
For more information about the NFT Starter Kit, see the announcement blog post.
- Data ingestion script, which collects historical data from OpenSea and ingests it into TimescaleDB
- Sample dataset, to get started quickly, if you'd prefer not to ingest live data
- Schema for storing NFT sales, assets, collections, and accounts
- Local TimescaleDB instance, pre-loaded with sample NFT data
- Pre-built dashboards and charts in Apache Superset and Grafana for visualizing your data analysis
- Queries to use as a starting point for your own analysis
## Earn a Time Travel Tiger NFT
Time Travel Tigers is a collection of 20 hand-crafted NFTs featuring Timescale’s mascot: Eon the friendly tiger, as they travel through space and time, spreading the word about time-series data wearing various disguises to blend in. The first 20 people to complete the NFT Starter Kit tutorial can earn a limited edition NFT from the collection, for free! Simply download the NFT Starter Kit, complete the tutorial and fill out this form, and we’ll send one of the limited-edition Eon NFTs to your ETH address (at no cost to you!).
    git clone https://github.com/timescale/nft-starter-kit.git
    cd nft-starter-kit
## Setting up the pre-built Superset dashboards
This part of the project is fully Dockerized. TimescaleDB and the Superset dashboard are built automatically using docker-compose. After completing the steps below, you will have local TimescaleDB and Superset instances running in containers, containing 500K+ NFT transactions from OpenSea.
The Docker services use port 8088 (for Superset) and 6543 (for TimescaleDB), so make sure no other services are using those ports before starting the installation process.
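If you want to verify the ports are available before bringing the containers up, here is a minimal sketch (not part of the kit; `port_is_free` is a hypothetical helper):

```python
import socket

def port_is_free(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if nothing is listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.settimeout(0.5)
        # connect_ex returns 0 only when a listener accepted the connection
        return s.connect_ex((host, port)) != 0

for port in (8088, 6543):  # Superset, TimescaleDB
    status = "free" if port_is_free(port) else "IN USE"
    print(f"port {port}: {status}")
```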
Run `docker-compose up --build` in the `pre-built-dashboards` folder:

    cd pre-built-dashboards
    docker-compose up --build
Wait until the process is done (it can take a couple of minutes); you'll see:

    timescaledb_1 | PostgreSQL init process complete; ready for start up.
Go to http://0.0.0.0:8088/ in your browser and log in with these credentials:

    user: admin
    password: admin
Go to the Databases page inside Superset (http://0.0.0.0:8088/databaseview/list/). You will see exactly one item there, called NFT Starter Kit. Click the edit button (pencil icon) on the right side of the table (under "Actions"). Don't change anything in the popup window; just click Finish. This makes sure the database can be reached from Superset.
Go check out your NFT dashboards!
Collections dashboard: http://0.0.0.0:8088/superset/dashboard/1
Assets dashboard: http://0.0.0.0:8088/superset/dashboard/2
## Running the data ingestion script
If you'd like to ingest data into your database (be it a local TimescaleDB, or in Timescale Cloud) straight from the OpenSea API, follow these steps to configure the ingestion script:
- Go to the root folder of the project:
- Create a new Python virtual environment and install the requirements:
    virtualenv env && source env/bin/activate
    pip install -r requirements.txt
- Replace the parameters in the configuration file:
    DB_NAME="tsdb"
    HOST="YOUR_HOST_URL"
    USER="tsdbadmin"
    PASS="YOUR_PASSWORD_HERE"
    PORT="PORT_NUMBER"
    OPENSEA_START_DATE="2021-10-01T00:00:00"  # example start date (UTC)
    OPENSEA_END_DATE="2021-10-06T23:59:59"    # example end date (UTC)
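The connection parameters above map directly onto a PostgreSQL connection URI. A minimal sketch of how they might be combined (the actual ingestion script may assemble its connection differently; `build_db_uri` is a hypothetical helper):

```python
from urllib.parse import quote_plus

def build_db_uri(user: str, password: str, host: str, port: str, dbname: str) -> str:
    """Assemble a libpq-style connection URI; the password is percent-encoded
    so special characters survive in the URI."""
    return (f"postgres://{user}:{quote_plus(password)}"
            f"@{host}:{port}/{dbname}?sslmode=require")

# Example with the placeholder values from the config above
uri = build_db_uri("tsdbadmin", "YOUR_PASSWORD_HERE", "YOUR_HOST_URL", "5432", "tsdb")
print(uri)
```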
- Run the Python script:
    Start ingesting data between 2021-10-01 00:00:00+00:00 and 2021-10-06 23:59:59+00:00
    ---
    Fetching transactions from OpenSea...
    Data loaded into temp table!
    Data ingested!
    Data has been backfilled until this time: 2021-10-06 23:51:31.140126+00:00
    ---
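One way an ingestion loop can walk the configured start/end window is in fixed-size chunks, which is consistent with the "backfilled until" progress line shown above. A hedged sketch only; the actual script's batching logic may differ:

```python
from datetime import datetime, timedelta, timezone

def time_windows(start: datetime, end: datetime, step: timedelta):
    """Yield (window_start, window_end) pairs covering [start, end)."""
    cursor = start
    while cursor < end:
        window_end = min(cursor + step, end)  # clamp the last partial window
        yield cursor, window_end
        cursor = window_end

# The example OPENSEA_START_DATE / OPENSEA_END_DATE from the config above
start = datetime(2021, 10, 1, tzinfo=timezone.utc)
end = datetime(2021, 10, 6, 23, 59, 59, tzinfo=timezone.utc)
windows = list(time_windows(start, end, timedelta(hours=6)))
print(len(windows), windows[0])
```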
## Ingest the sample data
If you don't want to wait for a decent amount of data to be ingested, you can use our sample dataset, which contains 500K+ sale transactions from OpenSea. (This sample was also used for the Superset dashboards.)
- Go to the folder containing the sample CSV files (you can also download them from here):
- Connect to your database with PSQL:
    psql -x "postgres://host:port/tsdb?sslmode=require"
If you're using Timescale Cloud, the instructions under How to Connect provide a customized command to run to connect directly to your database.
- Import the CSV files in this order (it can take a few minutes in total):
    \copy accounts FROM 001_accounts.csv CSV HEADER;
    \copy collections FROM 002_collections.csv CSV HEADER;
    \copy assets FROM 003_assets.csv CSV HEADER;
    \copy nft_sales FROM 004_nft_sales.csv CSV HEADER;
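The import order above follows the tables' foreign-key dependencies: a table must be loaded after every table it references. A sketch of deriving such an order with a topological sort (the exact foreign keys here are assumptions based on the schema described earlier):

```python
from graphlib import TopologicalSorter

# Assumed dependencies: each table maps to the tables it references.
deps = {
    "collections": {"accounts"},          # e.g. collections reference an owner account
    "assets": {"collections"},            # assets belong to a collection
    "nft_sales": {"assets", "accounts"},  # sales reference assets and accounts
}

# static_order() yields tables so every table comes after its dependencies
order = list(TopologicalSorter(deps).static_order())
print(order)
```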
- Try running some queries on your database:
    SELECT count(*), MIN(time) AS min_date, MAX(time) AS max_date FROM nft_sales;
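The same count/min/max summary can be sanity-checked in Python against the raw CSV before loading. A minimal sketch assuming the sales rows carry a `time` column (the column name comes from the query above; the `total_price` column and the inline sample data are illustrative):

```python
import csv
from datetime import datetime
from io import StringIO

def summarize_sales(csv_text: str):
    """Return (row_count, min_time, max_time) for a CSV with a 'time' column."""
    times = [datetime.fromisoformat(row["time"])
             for row in csv.DictReader(StringIO(csv_text))]
    return len(times), min(times), max(times)

sample = """time,total_price
2021-10-01T00:05:00,1.2
2021-10-03T12:00:00,0.4
2021-10-06T23:51:31,2.0
"""
count, min_date, max_date = summarize_sales(sample)
print(count, min_date, max_date)
```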