
Integrated Edge Cloud (IEC) focuses on multi-architecture based solutions for the Edge and Cloud computing areas. In this document, we give an overview of the IEC testing done in CI/CD.

...

Testing of the IEC platform is done in CI/CD using the validation labs described here.

Akraino Test Group Information

Currently, 2 or 3 virtual hosts are deployed on the arm64 platform by the Fuel (or Compass) tools. All bare-metal servers must meet the hardware and software requirements below. In addition, Internet access is required for the CI platform.

Hardware Information:

CPU Architecture: Arm64
Memory: At least 16G
Hard disk: 500G
Network: 1Gbps (Internet essential)

Software Information:

kubectl: 1.13.0
kubeadm: 1.13.0
kubelet: 1.13.0
kubernetes-cni: 0.6.0
docker-ce: 18.06.1~ce
Calico: v3.3
OS: Ubuntu 16.04

...

Overall Test Architecture

The following picture describes the overall testing environment deployed by the Compass/Fuel tools. The environment consists of several virtual hosts deployed on a bare-metal host. One is the jumper host, used for running the K8s deployment scripts; the others are K8s nodes, on which Kubernetes with Calico is deployed. Each virtual host has two 1Gbps NICs: one for Internet access and the other for the internal connection. For specific information, please refer to:

https://jenkins.akraino.org/view/iec/

Attention: Only the Arm64 platform has been deployed in the community CI platform so far.


Test description

The validation of the IEC project in CI/CD consists of several steps:

...

Run the validation project K8S conformance tests

Info

Enabling this step is still work in progress for Release 2. Logs for this step are not yet pushed to Nexus, but the results can be seen in Jenkins at https://jenkins.akraino.org/view/iec/job/validation-enea-daily-master

...

Once Terraform has finished provisioning the Kubernetes environment, the test.sh script is run to validate the cluster setup. If the setup was successful, the script returns the list of nodes. During the terraform apply process, the config files are copied from the remote machines, both master and worker, to the local machine where the Terraform process runs. The test.sh script then moves the master node's config file to ~/.kube/config on the local machine, and 'kubectl get nodes -A' is executed to validate the cluster, as sketched below.
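
As a rough sketch of this validation step (the kubeconfig source path below is a hypothetical name used for illustration; the actual test.sh in the IEC repository may differ):

Code Block
languagebash
themeEmacs
# Use the kubeconfig fetched from the master node during 'terraform apply'
# ('./config_master' is an assumed path, not taken from the IEC repo)
mkdir -p ~/.kube
mv ./config_master ~/.kube/config

# A non-empty node list indicates the cluster came up successfully
kubectl get nodes -A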


CI/CD process

Pre-requisites:
The instructions below set up a private Jenkins lab for the CI/CD process. 

1. Terraform must be installed as a prerequisite in the Jenkins container. Execute the following steps to install Terraform.

Code Block
languagebash
themeEmacs
wget https://releases.hashicorp.com/terraform/0.14.9/terraform_0.14.9_linux_amd64.zip
unzip terraform_0.14.9_linux_amd64.zip
sudo mv terraform /usr/local/bin/

     

2. Verify the installation by executing the following command. 

Code Block
languagebash
themeEmacs
terraform --help


3. Jenkins image used: docker pull jenkins. Bring up Jenkins with the default installation (an example is sketched below). 
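
For example, a typical way to bring up Jenkins with the defaults might look like the following (the container name, port mappings and volume name are assumptions, not taken from the IEC docs):

Code Block
languagebash
themeEmacs
# Pull the Jenkins image and start it with the default web UI (8080)
# and agent (50000) ports, keeping Jenkins data in a named volume
docker pull jenkins
docker run -d --name jenkins \
  -p 8080:8080 -p 50000:50000 \
  -v jenkins_home:/var/jenkins_home \
  jenkins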

Setup:
A Jenkins freestyle job with some additional plugins is set up for the CI/CD process. The freestyle job waits for events from the Gerrit Trigger plugin and then starts the Terraform process. Terraform validates the template and provisions the infrastructure through Terraform commands. Once Terraform has finished provisioning the cluster, a post-build task pushes the build log to the Nexus server. The logs come from the terraform init, plan and apply commands and are also stored in a tf.log file. The following are the steps to set up the Jenkins job. 

  1. Add Gerrit credentials under Manage Jenkins > Manage credentials > Add new credentials. Enter your Gerrit username and password. 
  2. Install the Gerrit Trigger plugin & the Post build task plugin from Manage Jenkins > Manage plugins > Available plugins.
    To configure the Gerrit Trigger plugin, follow the instructions below: 
    1. Create a folder ~/.ssh inside the Jenkins container
    2. Generate an ssh key using ssh-keygen -m PEM 
    3. Register the id_rsa.pub key in the Gerrit account under Settings > SSH keys > Add new keys
    4. Once the key is registered with the Gerrit account, test the connectivity with the following command. 

      Code Block
      languagebash
      themeEmacs
      ssh -p 29418 <UserName>@<HostName>
                                    
  3. Create a new freestyle job and configure the job with your Gerrit account in the Source Code Management section. Enter the repository URL and select the credentials that were added in step 1. 
  4. Add the following shell script under the 'Build' section. These instructions are executed in order while building the job. The script mainly exports the TF_VAR variables needed by Terraform, installs lftools, and runs the terraform init, plan and apply steps.  

    Code Block
    languagebash
    themeEmacs
    # Terraform input variables for the AWS-based microk8s cluster
    export TF_VAR_aws_region="us-east-2"
    export TF_VAR_aws_ami="ami-026141f3d5c6d2d0c"
    export TF_VAR_aws_instance="t4g.medium"
    export TF_VAR_vpc_id=""
    export TF_VAR_aws_subnet_id=""
    export TF_VAR_access_key=""
    export TF_VAR_secret_key=""
    # Write detailed Terraform logs to ./tf.log (uploaded to Nexus in the post-build step)
    export TF_LOG="TRACE"
    export TF_LOG_PATH="./tf.log"
    # Install lftools and its build dependencies
    sudo apt-get update
    sudo apt-get -y install python3-pip
    curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh -s -- -y
    export CRYPTOGRAPHY_DONT_BUILD_RUST=1
    sudo apt-get -y install build-essential libssl-dev libffi-dev
    pip3 install lftools
    # Provision the cluster from the Terraform templates in the IEC repository
    cd /var/jenkins_home/workspace/gerrit-akraino/src/foundation/microk8s
    terraform init
    terraform plan
    terraform apply -auto-approve


  5. Add the following to the post-build task. The post-build script pushes the current build's log to the Nexus server. Configure SILO, JENKINS_HOSTNAME, JOB_NAME, etc. accordingly.  

    Code Block
    languagebash
    themeEmacs
    echo "post build tasks"
    cat /var/jenkins_home/workspace/gerrit-akraino/src/foundation/microk8s/tf.log
    echo $BUILD_NUMBER
    NEXUS_URL=https://nexus.akraino.org
    SILO=gopaddle
    JENKINS_HOSTNAME=<hostName/IP:PORT>
    JOB_NAME=gerrit-akraino
    BUILD_URL="${JENKINS_HOSTNAME}/job/${JOB_NAME}/${BUILD_NUMBER}/"
    NEXUS_PATH="${SILO}/job/${JOB_NAME}/${BUILD_NUMBER}"
    lftools deploy logs $NEXUS_URL $NEXUS_PATH $BUILD_URL
    echo "Logs uploaded to $NEXUS_URL/content/sites/logs/$NEXUS_PATH"   


  6. Once the job is configured, build it. It might take some time to complete. To check that the source code was pulled from Gerrit, look in /var/jenkins_home/workspace/<jobName>/src/foundation/microk8s, as shown below.
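
    A quick way to check from inside the Jenkins container (the job name is whatever was chosen when creating the freestyle job):

    Code Block
    languagebash
    themeEmacs
    # List the checked-out Terraform templates for this job
    ls /var/jenkins_home/workspace/<jobName>/src/foundation/microk8s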


      

At the end of each job, the environment is cleaned up by destroying the cluster and the networks that were created, as sketched below.
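
With the Terraform-based setup above, such a cleanup typically amounts to the following (a sketch; the working directory matches the build step above):

Code Block
languagebash
themeEmacs
# Tear down the cluster and any networks created by 'terraform apply'
cd /var/jenkins_home/workspace/gerrit-akraino/src/foundation/microk8s
terraform destroy -auto-approve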

Info

For the self-release process, only the first step is mandatory.

1. Automatic deploy

The deployment is done using both the Fuel@OPNFV and Compass installers.

The installer performs the following steps:

  • For virtual deploys it creates the cluster VMs (using KVM); for baremetal deploys it provisions the nodes via IPMI using PXE boot
  • Clones the IEC repo and installs the IEC platform on the nodes as described in the Installation guide
  • Performs the Nginx deployment test described below

The installation with Fuel@OPNFV is done through this installation script.

The installation with Compass is done through this installation script.

Platform tests

In the IEC project, there are two platform test cases. One is a service deployment test that verifies the basic K8s network and deployment functions end to end. The other is a K8s smoke check that verifies the K8s environment.

Deployment case

This test verifies the basic function of K8s by deploying a simple Nginx server. After the Nginx server is deployed, the test issues a request to the server and verifies the reply from Nginx. The test can also be started by the shell script nginx.sh.

The Test inputs

There should be an nginx.yaml configuration file which is used for deploying the Nginx pods.

Test Procedure

The test is performed by a script located in the IEC project. First, the script starts an Nginx server based on the nginx.yaml file. After the Nginx pods are in an OK state, it sends a "get" request to the Nginx server port to fetch the reply. If there is a reply, it is checked and stored in a local database. Finally, all Nginx-related resources are deleted with the "kubectl delete" command, as sketched below. 
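
As a rough sketch of this flow (the label selector and service name below are assumptions; the actual nginx.sh in the IEC repository may differ):

Code Block
languagebash
themeEmacs
# Deploy the Nginx server described in nginx.yaml
kubectl apply -f nginx.yaml

# Wait until the Nginx pods report Ready (assumes the pods carry the label app=nginx)
kubectl wait --for=condition=Ready pod -l app=nginx --timeout=120s

# Fetch the service IP and issue a request against it
SERVICE_IP=$(kubectl get svc nginx -o jsonpath='{.spec.clusterIP}')
wget -O /dev/null "http://${SERVICE_IP}" && echo "OK" || echo "Error"

# Clean up all Nginx-related resources
kubectl delete -f nginx.yaml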

Expected output

The script fetches content from the Nginx service IP, as follows:

wget -O /dev/null "http://serviceIP"

Test Results

If correct, it returns OK; otherwise it returns Error. 


K8S Healthcheck case (not included in CI)

The second case is the K8s healthcheck, which is used for checking the Kubernetes environment. It creates a Guestbook application that contains a Redis server, 2 Redis slave instances, a frontend application, a frontend service, a Redis master service and a Redis slave service. The test writes an entry into the Guestbook application, which stores it in the backend Redis database. The application flow must work as expected and the written data must be available to read. The test can be run directly from the iec/src/foundation/scripts/functest.sh script.

Feature Tests

The feature tests are still under development.

Test Step:

All the test cases have been integrated into the IEC project. You can start those tests with simple scripts on the K8s master node, as follows.

# source src/foundation/scripts/functest.sh <master-ip> #Functest case
# source src/foundation/scripts/nginx.sh  #Nginx case

2. Conformance Test

Info

Enabling this step is still work in progress for Release 2. Logs for this step are not yet pushed to Nexus, but the results can be seen in Jenkins at https://jenkins.akraino.org/view/iec/job/validation-enea-daily-master/

The K8S conformance tests are run using the Akraino Blueprint Validation project framework.

The Test inputs

The inputs are given via the IEC Jenkins job parameters, plus validation-specific parameters:

  • Blueprint name (iec)
  • Layer to test (k8s)
  • Version of the validation docker images to use
  • Flag to run optional tests

Test Procedure

The tests are run using the Akraino Blueprint Validation project procedure, by calling the Bluval Jenkins job.
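
Under the hood the framework drives sonobuoy against the cluster; run by hand, a roughly equivalent check looks like the following (a sketch, not the exact Bluval invocation):

Code Block
languagebash
themeEmacs
# Launch the Kubernetes conformance suite and wait for it to finish
sonobuoy run --mode=certified-conformance --wait

# Fetch the results tarball and print a pass/fail summary
results=$(sonobuoy retrieve)
sonobuoy results "$results"

# Clean up the sonobuoy namespace and its resources
sonobuoy delete --wait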

Expected output

All tests pass

Code Block
Status: Downloaded newer image for akraino/validation:k8s-latest
==============================================================================
Conformance                                                                   
==============================================================================
Conformance.Conformance :: Run k8s conformance test using sonobuoy            
==============================================================================
Run Sonobuoy Conformance Test                                         | PASS |
------------------------------------------------------------------------------
Conformance.Conformance :: Run k8s conformance test using sonobuoy    | PASS |
1 critical test, 1 passed, 0 failed
1 test total, 1 passed, 0 failed
==============================================================================
Conformance                                                           | PASS |
1 critical test, 1 passed, 0 failed
1 test total, 1 passed, 0 failed
==============================================================================
Output:  /opt/akraino/results/k8s/conformance/output.xml
Log:     /opt/akraino/results/k8s/conformance/log.html
Report:  /opt/akraino/results/k8s/conformance/report.html

3. Install SEBA use-case

The installation of the use-case is done as recommended by the upstream CORD community.

The Test inputs

The inputs are given via the Jenkins job parameters.

Test Procedure

The script found here is used to install SEBA.

Expected output

Installation in Jenkins is successful

Tested configurations

...

https://nexus.akraino.org/content/sites/logs/production/vex-yul-akraino-jenkins-prod-1/iec-type2-deploy-fuel-virtual-ubuntu1604-daily-master/

...

https://nexus.akraino.org/content/sites/logs/production/vex-yul-akraino-jenkins-prod-1/iec-type2-deploy-fuel-virtual-ubuntu1804-daily-master/

...

https://nexus.akraino.org/content/sites/logs/production/vex-yul-akraino-jenkins-prod-1/iec-type2-deploy-fuel-virtual-centos7-daily-master/

...

Test Dashboards

A single-pane view of how the test scores look for the blueprint.

...