Nonce crawls the decentralized web. It collects and analyzes data from decentralized storage so that it can later be retrieved through human-readable queries. It is also designed to do this collaboratively: volunteer nodes can take part in a bidding round to execute the scraping themselves.
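The bidding round could look something like the sketch below, where volunteer nodes submit offers and the cheapest one wins the scraping job. The shapes and names here (`pickWinner`, the bid objects) are illustrative assumptions, not Nonce's actual interface.

```javascript
// Hypothetical sketch of a bidding round: volunteer nodes submit bids,
// and the lowest-priced bid wins the right to execute the scraping job.
// This is an assumed model, not Nonce's real bidding protocol.

function pickWinner(bids) {
  if (bids.length === 0) return null;
  // Lowest price wins; ties are broken by submission order.
  return bids.reduce((best, bid) => (bid.price < best.price ? bid : best));
}

const bids = [
  { node: "peer-a", price: 12 },
  { node: "peer-b", price: 7 },
  { node: "peer-c", price: 9 },
];

console.log(pickWinner(bids).node); // "peer-b"
```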
How It's Made
Nonce first collects content hashes from ENS, then uses them to fetch the corresponding data from IPFS. The crawled data is persisted and later analyzed for display in the web application. To execute crawling in a decentralized manner, Nonce uses a subsystem called "Rand" that randomly selects a peer node and delegates the work to it. The system is implemented with Node.js, ENS, IPFS, Ethereum, and Infura.