# Nexus Cognitron
## Prerequisites
Follow the [root guide](../../README.md) to install Docker, IPFS and, optionally, Bazel.
## Guide
#### 1. Download data dumps
```shell script
# IPFS CID of the collection data dump
export COLLECTION=bafykbzacebzohi352bddfunaub5rgqv5b324nejk5v6fltjh45be5ykw5jsjg
# Download the collection and pin it so your local IPFS node keeps a copy
ipfs get $COLLECTION && ipfs pin add $COLLECTION
# Absolute path to the downloaded collection, used in the following steps
export COLLECTION_PATH=$(realpath $COLLECTION)
```
#### 2. Launch Nexus Cognitron
Create a [`docker-compose.yml`](docker-compose.yml) file to set up Nexus Cognitron.
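The linked file is the authoritative configuration; the sketch below only illustrates the general shape such a file takes. The service name, image, port mapping and volume path here are hypothetical placeholders, so copy the real values from the linked file.

```yaml
# Hypothetical sketch only - service and image names are placeholders,
# not the actual Nexus Cognitron configuration.
version: "3"
services:
  nexus-cognitron-web:
    image: example/nexus-cognitron-web:latest
    ports:
      - "3000:3000"                      # web UI served at http://localhost:3000
    volumes:
      - ${COLLECTION_PATH}:/collection   # mount the downloaded data dump
```

Then pull the images and launch the services: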
```shell script
docker-compose pull && docker-compose up
```
Then open [http://localhost:3000](http://localhost:3000) in your browser.
#### 3. (Optional) Deploy data dumps into your database
There is a `work` function in the traversal script [`iterate.py`](installer/scripts/iterate.py) that you can reimplement to iterate over the whole dataset in parallel and insert it into your own database, or process it in any other way. By default, the script just prints the documents it encounters; run it with:
```shell script
bazel run -c opt installer -- iterate \
  --data-filepath $COLLECTION_PATH/index/scitech \
  --schema-filepath schema/scitech.yaml
```
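To load documents into your own storage instead of printing them, reimplement `work`. The following is only a minimal sketch, assuming `work` receives a single deserialized document as a `dict`; check the actual signature and calling convention in [`iterate.py`](installer/scripts/iterate.py) before reusing it.

```python
import json
import sqlite3


def work(document):
    # Hypothetical replacement for `work`: store each document in SQLite.
    # A connection per call keeps this safe if the script invokes `work`
    # from several worker processes; batch per worker for real workloads.
    connection = sqlite3.connect('documents.db')
    try:
        connection.execute('CREATE TABLE IF NOT EXISTS documents (doi TEXT, raw TEXT)')
        connection.execute(
            'INSERT INTO documents (doi, raw) VALUES (?, ?)',
            (document.get('doi'), json.dumps(document)),
        )
        connection.commit()
    finally:
        connection.close()
```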