Additional services

Additional services to work with RDF data.

BioThings#


The BioThings SDK is a Python package to build and deploy annotated Smart APIs from flat data files. Multiple BioThings APIs can be built using the BioThings Hub, and exposed using BioThings Web.

See the BioThings API Specifications.

BioThings Studio#

BioThings Studio lets you deploy a Docker container with all the dependencies required to build BioThings APIs. See the BioThings Studio documentation.

docker run -d --rm --name studio \
-p 8001:8080 -p 8000:8000 -p 9000:9000 \
-p 7022:7022 -p 7080:7080 -p 9200:9200 -p 27017:27017 \
-v $(pwd)/workspace/biothings:/data \
biothings/biothings-studio:0.2a

Access the BioThings Studio web UI at http://localhost:8001

Access the BioThings API at http://localhost:7080

Use the BioThings SDK#

Available on PyPI. The BioThings SDK provides a Python-based toolkit to build high-performance data APIs (or web services) from a single data source or multiple data sources. It has a particular focus on building data APIs for biomedical entities, a.k.a. "BioThings" (such as genes, genetic variants, drugs, chemicals, diseases, etc.).

pip install biothings

Documentation about BioThings SDK can be found at http://docs.biothings.io
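
Once deployed, a BioThings API is queried over plain HTTP. As a minimal sketch, here is a query against the public mygene.info API (built with the BioThings SDK) using Python requests; the gene symbol and fields shown are only illustrative:

import requests

# Query a public BioThings API (mygene.info) for a gene symbol
response = requests.get(
    "https://mygene.info/v3/query",
    params={"q": "symbol:CDK2", "species": "human"},
)
response.raise_for_status()
for hit in response.json().get("hits", []):
    print(hit.get("_id"), hit.get("symbol"), hit.get("name"))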

Use the BioThings Explorer#

BioThings APIs can then be queried with the BioThings Explorer: https://biothings.io/explorer

An SDK is also available on PyPI to use the BioThings Explorer:

pip install biothings-explorer

Blue Brain Nexus#

Blue Brain Nexus

Quickly build, manage, and leverage Knowledge Graphs using a web application, a Python framework, and web services with Blue Brain Nexus. It uses Blazegraph for RDF storage.

Log in with GitHub or ORCID. Manage data access through fine-grained Access Control Lists with organizations and users.

It provides a data-management-oriented SHACL version of the W3C PROV-O ontology, as well as a SHACL version of a subset of schema.org schemas that are commonly used in Blue Brain Nexus.

Install Nexus Python SDK and CLI:

pip install nexus-sdk git+https://github.com/BlueBrain/nexus-cli
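
A minimal sketch of using the Python SDK, following the nexus-sdk documentation; the deployment URL and token below are placeholders for your own Nexus instance:

import nexussdk as nexus

# Point the SDK to a Nexus deployment and authenticate (placeholder values)
nexus.config.set_environment("https://sandbox.bluebrainnexus.io/v1")
nexus.config.set_token("MY_ACCESS_TOKEN")
# Fetch the permissions defined on the deployment as a connectivity check
print(nexus.permissions.fetch())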

Install the Nexus JavaScript SDK and React components (the SDK is written in TypeScript, so type declarations for all operations are included in the package):

npm install @bbp/nexus-sdk @bbp/react-nexus

Nexus Fusion#

Enabling Collaborative Data and Knowledge Discovery

An extensible, open-source web interface that thrives on your data. With workspaces, plugins, and an admin interface available out-of-the-box, you can start working with your ingested data immediately.

Fusion is our extensible web application. It hosts different apps to accommodate various use cases. It comes by default with Studios (where you work with data), Admin (for managing the Nexus instance), and will soon support Workflows to organise your data activities. It runs on top of the Delta web services, and integrates neatly with our Forge Python framework.

Nexus Forge#

Building and Using Knowledge Graphs Made Easy

Blue Brain Nexus Forge is a domain-agnostic, generic and extensible Python framework enabling non-expert users to create and manage Knowledge Graphs.

Knowledge Graphs are often built from heterogeneous data and knowledge (e.g. data models such as ontologies and schemas) coming from different sources, often in different formats (structured or unstructured). Nexus Forge enables data scientists, data engineers, and knowledge engineers to address these challenges by combining, under a consistent and generic Python framework, all the components necessary to build and search a Knowledge Graph.

Check the example notebook to build a knowledge graph from tabular data, using pandas and nexus-forge in Python.
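
A minimal sketch of that workflow, assuming a Forge configuration file named config.yml pointing to your knowledge graph store (the dataframe content is only illustrative):

import pandas as pd
from kgforge.core import KnowledgeGraphForge

# Initialize the forge from a configuration file (store, model, resolvers)
forge = KnowledgeGraphForge("config.yml")
# Convert tabular data to resources and register them in the knowledge graph
df = pd.DataFrame([{"type": "Person", "name": "Alice"}])
resources = forge.from_dataframe(df)
forge.register(resources)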

Nexus Delta#

Managing the Data and Knowledge Graph Lifecycle

A secure and scalable service that allows you to organize your data into a Knowledge Graph. Its API lets you store your data, describe it with metadata, enforce formats using schemas with automatic validation, capture provenance, and access revisions, so it integrates easily with your software stack. Its advanced indexing capabilities automatically build views from your metadata.
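
As an example, a resource can be created in a project through the Delta HTTP API; a minimal sketch with Python requests, where the deployment URL, organization, project, and token are placeholders:

import requests

# Create a resource in the "myorg/myproject" Nexus Delta project (placeholder values)
base_url = "https://nexus.example.org/v1"
headers = {"Authorization": "Bearer MY_ACCESS_TOKEN"}
payload = {"@type": "Person", "name": "Alice"}
response = requests.post(f"{base_url}/resources/myorg/myproject", json=payload, headers=headers)
response.raise_for_status()
print(response.json()["@id"])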


LinkedDataHub#

LinkedDataHub is an Open Source Knowledge Graph management system. You can use it to manage data, create visualizations and build apps on RDF Knowledge Graphs.

Clone the repository and prepare the environment file:

git clone https://github.com/AtomGraph/LinkedDataHub.git
cd LinkedDataHub
cp .env_sample .env

Start LinkedDataHub:

docker-compose up -d

Access LinkedDataHub web UI at https://localhost:4443

Accept the risk

You will need to accept the security warning in your browser, since LinkedDataHub uses a self-signed certificate.

You can now follow the web UI instructions to create an account and log in to your LinkedDataHub instance.
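
If you query the local instance programmatically, certificate verification also needs to be skipped; a minimal sketch with Python requests, assuming the default port 4443 and RDF content negotiation:

import requests

# Request the LinkedDataHub homepage, asking for Turtle through content negotiation
# verify=False skips validation of the self-signed certificate (development only)
response = requests.get(
    "https://localhost:4443/",
    headers={"Accept": "text/turtle"},
    verify=False,
)
print(response.status_code, response.headers.get("Content-Type"))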

To stop LinkedDataHub, run from the LinkedDataHub folder:

docker-compose down

LinkedPipes#

LinkedPipes is a suite of Linked Data tools, providing ETL, visualization services, and applications.

Try the ETL web UI to define data transformation pipelines to RDF:

git clone https://github.com/linkedpipes/etl linkedpipes-etl
cd linkedpipes-etl
LP_ETL_PORT=8091 docker-compose up -d

Access at http://localhost:8091

LinkedPipes also offers various visualization services.

To stop the LinkedPipes ETL, run from the linkedpipes-etl folder:

docker-compose down

YASGUI#

GitHub

The popular Yet Another SPARQL Graphical User Interface.

docker run -it --rm --name yasgui -p 8088:80 \
-e "DEFAULT_SPARQL_ENDPOINT=http://dbpedia.org/sparql" \
-e "ENABLE_ENDPOINT_SELECTOR=true" \
erikap/yasgui

Access at http://localhost:8088

CORS requests

YASGUI requires the SPARQL endpoint to allow Cross-Origin Resource Sharing (CORS) requests.
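
A quick way to check this is to look for the Access-Control-Allow-Origin header in the endpoint response; a minimal sketch with Python requests against the default DBpedia endpoint:

import requests

# Send a cross-origin SPARQL query and inspect the CORS response header
response = requests.get(
    "https://dbpedia.org/sparql",
    params={"query": "SELECT * WHERE { ?s ?p ?o } LIMIT 1", "format": "json"},
    headers={"Origin": "http://localhost:8088"},
)
print(response.status_code)
print(response.headers.get("Access-Control-Allow-Origin"))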


LODEstar#

GitHub

LODEstar provides SPARQL querying and URI resolution, and is available on DockerHub.

docker run --rm -d --name lodestar -p 8082:8080 \
-e ENDPOINT_URL=https://graphdb.dumontierlab.com/repositories/ncats-red-kg \
-e TOP_RELATIONSHIP=http://w3id.org/biolink/vocab/id,http://w3id.org/biolink/vocab/name,http://w3id.org/biolink/vocab/description \
-e LABEL=http://w3id.org/biolink/vocab/label \
-e DESCRIPTION=http://w3id.org/biolink/vocab/description \
-e MAX_OBJECTS=10 \
-e SERVICE_BASE_URI=http://localhost:8080/ncats-red-kg netresearch/lodestar

Access at http://localhost:8082/lodestar/sparql

No graphs

Does not support graphs 🚫


Trifid#

GitHub

Linked Data server: URI dereferencing, custom HTML rendering, and a YASGUI interface to the SPARQL endpoint.

git clone https://github.com/vemonet/trifid.git
docker build -t trifid ./trifid
docker run --rm -ti --name trifid -p 8080:8080 trifid \
  --sparql-endpoint-url=https://graphdb.dumontierlab.com/repositories/test --dataset-base-url=https://w3id.org/d2s/
docker run --rm -ti --name trifid -v /home/vemonet/sandbox/trifid:/data -p 8080:8080 trifid \
  --config=/data/config-ncats-red-kg.json

Go to http://localhost:8080/dataset/huri/ to resolve https://w3id.org/d2s/dataset/huri/
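
Trifid dereferences URIs using content negotiation, so the same URL can also be requested as RDF; a minimal sketch with Python requests, assuming the docker run example above is running:

import requests

# Ask for Turtle instead of the HTML rendering of the resource
response = requests.get(
    "http://localhost:8080/dataset/huri/",
    headers={"Accept": "text/turtle"},
)
print(response.headers.get("Content-Type"))
print(response.text[:300])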

A modified version is available on GitHub.

The original project is available on DockerHub, but its configuration does not work as expected:

docker run -ti -p 8080:8080 zazuko/trifid
# Not working, provide env config file?
docker run -ti -p 8080:8080 -e TRIFID_CONFIG=config-ncats-red-kg.json zazuko/trifid
docker run -ti -p 8080:8080 \
  -e SPARQL_ENDPOINT_URL=https://graphdb.dumontierlab.com/repositories/test \
  -e DATASET_BASE_URL=https://w3id.org/d2s/ zazuko/trifid

Access the default example at http://localhost:8080/data/person/mary-cooper to resolve a URI.

No graphs

Does not support graphs 🚫


brwsr#

GitHub

Lightweight Linked Data Browser.

git clone https://github.com/Data2Semantics/brwsr.git
cd brwsr
docker-compose up

Go to http://localhost:5000.

Change the SPARQL endpoint in the docker-compose.yml.

No graphs

Does not support graphs 🚫


RhizomerEye#

RhizomerEye is a tool to expose a SPARQL endpoint as a REST API and deploy a web UI to browse the triplestore.

See the source code for the RhizomerAPI and RhizomerEye.

The web UI has been deployed publicly for a few triplestores.


TriplyDB#

See the official documentation. TriplyDB allows you to deploy various query and search services over a triplestore.

No self-hosting

TriplyDB is hosted centrally and cannot be deployed locally 🚫
