Main features of cmemc include:
- Create, delete and inspect datasets, as well as upload and download dataset file resources.
- Import, export, delete or open Knowledge Graphs.
- Import, export, create or delete projects.
- Execute or open local and remote SPARQL queries.
- Install, uninstall and open vocabularies.
- Execute (with or without payload), open or inspect workflows.
- Import bootstrap data, create showcase data, and get health information.
- Import or export the workspace.
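To give a first impression of the features above, here are a few representative invocations. This is a sketch only: the project name, graph IRI, and file names are placeholders, and the exact options for each command are documented in the Command Reference.

```shell
# list all datasets of a project (project ID is a placeholder)
cmemc dataset list --project my-project
# export a Knowledge Graph to a local file (graph IRI is a placeholder)
cmemc graph export https://example.org/my-graph/
# execute a SPARQL query from a local file
cmemc query execute my-query.sparql
# execute a workflow and wait until it is finished
cmemc workflow execute my-project:my-workflow --wait
```

All of these commands require a configured connection to a running Corporate Memory deployment (see the configuration topics below).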
To start working with cmemc, follow one of the installation options.
The following pages provide documentation for specific cmemc-related topics:
Installation and Configuration — cmemc can be installed using the Python sources, the release package, or the Docker image.
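For the Python-based option, installation is typically a single pip command; the package name shown here is the one published by eccenca on PyPI, but check the installation page for your environment:

```shell
# install cmemc from PyPI
pip install cmem-cmemc
# verify the installation
cmemc --version
```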
Certificate handling and SSL verification — In a typical production deployment, all client-accessible Corporate Memory APIs are securely available as HTTPS endpoints. This page explains how to handle certificates.
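As a sketch, a deployment with an internal CA or self-signed certificate is usually handled either by pointing the client at a CA bundle, or, for testing only, by disabling verification in a connection profile. The `SSL_VERIFY` key shown here is an assumption to verify against the linked page:

```ini
[my-deployment]
CMEM_BASE_URI=https://cmem.example.org/
# for testing only - do not disable certificate verification in production
SSL_VERIFY=False
```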
Environment based Configuration — In addition to configuration files, cmemc can be extensively configured and parameterised with environment variables.
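As a sketch, a connection can be described purely via environment variables, which is convenient in CI pipelines. The variable names correspond to the configuration file keys; the client ID and secret values here are placeholders:

```shell
# describe a Corporate Memory connection via the environment
export CMEM_BASE_URI="https://cmem.example.org/"
export OAUTH_GRANT_TYPE="client_credentials"
export OAUTH_CLIENT_ID="cmem-service-account"
export OAUTH_CLIENT_SECRET="my-secret"
```

With these variables set, subsequent cmemc commands in the same shell use this connection without a configuration file.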
File based Configuration — This page documents how to configure cmemc via configuration files.
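A configuration file typically contains one section per deployment, so you can switch between environments. This is a sketch: the file location, section name, and credential values are placeholders, and the supported keys are documented on the linked page:

```ini
# a cmemc configuration file with one connection section
[my-deployment]
CMEM_BASE_URI=https://cmem.example.org/
OAUTH_GRANT_TYPE=password
OAUTH_USER=admin
OAUTH_PASSWORD=my-password
```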
Getting Credentials from external Processes — This page explains how to avoid storing passwords in configuration files by using configured credential processes or environment variables.
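As a sketch, such a setup replaces a literal secret with a key that names an executable whose output provides the credential. The `_PROCESS` key suffix and the script path shown here are assumptions to verify against the linked page:

```ini
[my-deployment]
CMEM_BASE_URI=https://cmem.example.org/
OAUTH_GRANT_TYPE=password
OAUTH_USER=admin
# fetch the password from a password manager instead of storing it here
OAUTH_PASSWORD_PROCESS=/usr/local/bin/get-cmem-password.sh
```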
Command Reference — This page lists the help texts of all commands as a searchable reference.
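The same help texts are also available on the command line; each command group and command accepts a help option:

```shell
# top-level overview of all command groups
cmemc --help
# help for a command group and for a single command
cmemc graph --help
cmemc graph export --help
```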
Troubleshooting and Caveats — This page lists and documents possible issues and warnings when working with cmemc.
Using the docker image — In addition to the stand-alone cmemc binaries from the distribution package and the PyPI- or Homebrew-based installations, you can use the eccenca cmemc Docker image, which is based on the official Debian slim image. This is especially useful when you want to use cmemc in orchestrations.
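A typical invocation mounts a local configuration directory into the container so the containerized cmemc can reuse your connection profiles. This is a sketch: the image name, tag, and mount paths are assumptions to check against the linked page and your registry:

```shell
# run a cmemc command from the Docker image,
# mounting the local cmemc config directory into the container
# (image name and tag are assumptions - check your registry)
docker run -it --rm \
  -v "$HOME/.config/cmemc:/root/.config/cmemc" \
  docker-registry.eccenca.com/eccenca-cmemc:latest \
  graph list
```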
Workflow execution and orchestration — In some cases, you need to automate a complete graph of integration workflows that depend on each other and may run in parallel or sequentially.
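A simple orchestration of this kind can be sketched with a shell script: independent workflows run in parallel as background jobs, and a dependent workflow starts only after they have finished. The project and workflow IDs are placeholders:

```shell
#!/usr/bin/env bash
# sketch: orchestrate dependent workflows, abort on first failure
set -euo pipefail

# stage 1: independent ingestion workflows, run in parallel
cmemc workflow execute --wait my-project:ingest-a &
cmemc workflow execute --wait my-project:ingest-b &
# wait for both background jobs; a failing job aborts the script
wait

# stage 2: integration workflow that depends on stage 1
cmemc workflow execute --wait my-project:integrate
```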