What are the URL Endpoints of the Regional Controller?
||URL||Description||
|https://<RC_IP_address>/api/v1/|The base URL for all API calls to the Regional Controller.|
|https://<RC_IP_address>/docs/|An online copy of the documentation for the Regional Controller. Because this documentation is built along with the codebase, it is likely to be more up to date than this wiki.|
|http://<RC_IP_address>:8080/admin/|The console for the embedded copy of Apache® Airflow that is used as the workflow engine within the Regional Controller.|
Where is the full API documented?
Because the API is still being modified and is in a state of flux, it is documented in full only in the online copy at https://<RC_IP_address>/docs/api.html
Alternatively, you can clone the Gerrit repository for the Regional Controller and look at the markup files under /src/site/docs/.
What are the Predefined Users in the Regional Controller database?
The Regional Controller is installed with the following predefined users. You are not required to keep these usernames and may remove or replace them as you wish; however, the workflow user is used internally to log workflow events, so you are well advised to leave that user alone.
||Username||Password||Capabilities||Description||
|admin|admin123|create/read/update/delete all|The "superuser" for the RC; i.e. it has all roles and capabilities.|
|readonly|admin123|read all|This user has read-only access; i.e. it can issue GET requests, but cannot change objects or create new ones.|
|workflow|admin123|read all, create podevents|This user is used internally to create POD events from within the workflows.|
|noaccess|admin123|(none)|This user has no access at all.|
How are Workflows initiated?
All workflows are started as a result of operations on PODs:
- POST - creates a POD and causes the create workflow to run
- PUT - updates a POD and causes one of several update workflows to run
- DELETE - deletes the POD and causes the delete workflow to run
The details of the workflow are described in the Blueprint for the POD.
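These POD operations are ordinary REST calls against the API. The sketch below shows a POD creation with curl; the JSON field names and the use of HTTP Basic authentication are assumptions for illustration only, so check https://<RC_IP_address>/docs/api.html for the actual schema and authentication scheme.

```shell
# Build an example POD-creation request. The JSON field names below are
# illustrative assumptions; consult the API docs for the real schema.
RC_URL="https://192.0.2.10/api/v1"      # substitute your controller's address
cat > /tmp/pod.json <<'EOF'
{
  "name": "example-pod",
  "description": "POD created via the API",
  "blueprint": "<blueprint_uuid>",
  "edgesite": "<edgesite_uuid>"
}
EOF
# POST creates the POD and starts the create workflow
# (Basic auth is an assumption; the placeholder address will not respond):
curl -sk -u admin:admin123 -H 'Content-Type: application/json' \
  -X POST -d @/tmp/pod.json "$RC_URL/pod/" || true
```

A PUT to the same POD URL would trigger an update workflow, and a DELETE would trigger the delete workflow.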
How do I look at the Workflow Logs?
The workflow logs are stored within Airflow. As such, you need to access them via Airflow. There are two ways to do this:
- Use the Airflow console (see above for the URL):
  - Click on DAGs.
  - Click on the individual DAG you are interested in; the name should contain both the workflow name and the UUID of the associated POD.
  - Click on Graph View.
  - Click on the box for maintask.
  - In the popup sub-window, click on View Log.
- Look at the logfile directly by following these steps:
  - Log in to the host where the Regional Controller is running.
  - Search within the arc-airflow-worker container for the log, and then cat it.
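The direct approach might look like the sketch below. The container name comes from the steps above, but the log directory inside the container is an assumption based on Airflow defaults, so verify the actual path on your installation.

```shell
# Locate a workflow log inside the Airflow worker container.
# LOG_DIR is an assumption (Airflow's default log location); verify it.
CONTAINER=arc-airflow-worker
LOG_DIR=/usr/local/airflow/logs
docker exec "$CONTAINER" sh -c \
  "find $LOG_DIR -name '*.log' -path '*<POD_UUID>*'" 2>/dev/null || true
# Then cat the file it prints:
#   docker exec arc-airflow-worker cat <path_from_above>
```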
How can I determine the current state of a POD?
The current state of any POD can be determined by using the rc_cli command to look at the POD. The status field in the output shows the current status of the POD: WORKFLOW means that a workflow is currently running, and ACTIVE means the POD is active with no current workflow.
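If you prefer the REST API over rc_cli, the same status can be read with a GET on the POD object. The endpoint path and Basic authentication below are assumptions; see the API documentation under /docs/api.html for the authoritative details.

```shell
RC_URL="https://192.0.2.10/api/v1"   # substitute your controller's address
POD_UUID="<pod_uuid>"                # the POD to inspect
# GET the POD object; its status field should read WORKFLOW or ACTIVE.
# (The placeholder address will not respond, hence the || true guard.)
curl -sk -u admin:admin123 "$RC_URL/pod/$POD_UUID" || true
```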
How are User passwords stored in the Database?
They are not stored in clear text; rather, a SHA-256 hash is stored. If you want to change the default passwords, or add your own users, you need to store the SHA-256 hash of the password in the PWHASH field in the database. One way to generate this hash is:
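A minimal sketch using the coreutils sha256sum tool; printf is used rather than echo so that a trailing newline is not included in the hashed input.

```shell
# Print the SHA-256 hex digest of a password (here, the default admin123).
# This 64-character hex string is what goes in the PWHASH field.
printf '%s' 'admin123' | sha256sum | cut -d' ' -f1
```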