cmemc is intended for system administrators and Linked Data experts who want to automate and remotely control activities on eccenca Corporate Memory.

Main features of cmemc include:

  • List, edit and check configurations.
  • List, create, delete or inspect datasets as well as dataset resources.
  • List, import, export, delete or open graphs.
  • List, import, export, create or delete Build projects.
  • List, execute, replay or open local and remote SPARQL queries.
  • List, install, uninstall, import and open vocabularies.
  • List, execute, open or inspect workflows and workflow schedulers.
  • Import or export whole Build workspaces and graph stores.
  • List, get or inspect server metrics.
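
A minimal first session could look like the following sketch. The connection name mycmem and the graph IRI are examples only; use the names from your own configuration:

    # list the configured connections
    cmemc config list
    # list all graphs of the selected deployment
    cmemc -c mycmem graph list
    # export a graph to a local file
    cmemc -c mycmem graph export https://example.org/my-graph/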

To start working with cmemc, follow one of the installation options below.

The following pages provide documentation for specific cmemc-related topics:

Installation

  • Page:
    Installation and Configuration — cmemc can be installed using the Python package from pypi.org, the release package, or by pulling the Docker image.
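
A minimal installation sketch, assuming the package name cmem-cmemc on pypi.org (details and alternatives are documented on the page above):

    # install cmemc from pypi.org
    pip install cmem-cmemc
    # verify the installation
    cmemc --version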

Usage

  • Page:
    Command Reference — This document lists the help texts of all commands as a searchable reference.
  • Page:
    Command-line completion — If you are using bash or zsh as your shell, you can optionally enable command-line (tab) completion for cmemc.
  • Page:
    SPARQL Scripts — By prepending a shebang line to a SPARQL query file and making the file executable, the query file can be treated as an executable script (a sketch follows this list).
  • Page:
    Troubleshooting and Caveats — This page lists and documents possible issues and warnings when working with cmemc.
  • Page:
    Using the docker image — In addition to the stand-alone cmemc binaries from the distribution package and the pypi- or brew-based installations, you can use the eccenca cmemc Docker image, which is based on the official Debian slim image. This is especially useful when you want to use cmemc in orchestrations.
  • Page:
    Workflow execution and orchestration — In some cases, you need to automate a complete graph of integration workflows that depend on each other and can run in parallel or one after the other (see the orchestration sketch below).
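
To illustrate the SPARQL Scripts approach, the following is a minimal sketch of an executable query file. The file name all-graphs.sparql is an example, and the exact shebang line (here assumed to use env -S to pass the cmemc query execute command) is documented on the SPARQL Scripts page. Since # starts a comment in SPARQL, the file remains a valid query:

    #!/usr/bin/env -S cmemc query execute
    SELECT DISTINCT ?graph
    WHERE {
      GRAPH ?graph { ?s ?p ?o }
    }

After making the file executable with chmod +x all-graphs.sparql, it can be run directly as ./all-graphs.sparql.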
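
For orchestration, a simple shell script sketch could chain cmemc workflow executions. The workflow IDs below are hypothetical, and options for waiting on completion or running workflows in parallel are described on the Workflow execution and orchestration page:

    #!/usr/bin/env bash
    set -euo pipefail
    # execute the import workflows first (hypothetical IDs)
    cmemc workflow execute my-project:import-customers
    cmemc workflow execute my-project:import-products
    # then execute the workflow that depends on both imports
    cmemc workflow execute my-project:integrate-data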