
Weekly on Wednesdays at 8:00 AM PST / 11:00 AM EST.


Join Zoom Meeting: https://zoom.us/j/598290523

One tap mobile
+16699006833,,598290523# US (San Jose)
+16465588656,,598290523# US (New York)

Dial by your location
        +1 669 900 6833 US (San Jose)
        +1 646 558 8656 US (New York)
        +1 877 369 0926 US Toll-free
        +1 855 880 1246 US Toll-free
Meeting ID: 598 290 523
Find your local number: https://zoom.us/u/acimKOClJk


Notes

June 26, 2019

Agenda

  • Status on the build jobs
  • Status on Sonobuoy testing on ARM
  • Status on permanent lab resources for hosting the UI
  • Status on user documentation
  • Status on Jenkins job that consumes the validation containers
  • Status on pending patches
  • AoB

June 19, 2019

  • Sonobuoy testing on ARM
  • Release 2 requirements
    • What do we need for the first release?
      • Kubernetes conformance testing works
      • Docker containers are being built automatically every day
      • Need user documentation, starting from ready-made containers (see the sketch after these notes)
      • Need a Jenkins job to install
    • When can we have it?
  • Meetings for next two weeks: Andrew Wilkinson can start the meetings in Zoom, Cristina Pauna will host
  • Need permanent lab resources for hosting the UI – topic for the TSC meeting
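
A rough sketch of the "start from ready-made containers" flow for the user documentation; the image name and tag below are placeholders for illustration (the real ones live under https://hub.docker.com/r/akraino/validation):

    # Pull a daily-built validation container and run it as-is;
    # "k8s-latest" is an invented tag, not a confirmed one.
    docker pull akraino/validation:k8s-latest
    docker run --rm akraino/validation:k8s-latest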

June 12, 2019

  • New rules enforced: merging a patch requires a "+2" review, and it cannot come from the patch submitter
    • If you are a committer, you can still give a "+2" to your own patch
    • One "+2" from someone other than the submitter is needed
    • After this, anyone can merge the patch
  • Sonobuoy doesn't work on ARM, so we have two options (decision postponed to next meeting; a rough sketch of the first option follows these notes)
    • Call the tests directly on ARM, without Sonobuoy
    • Adapt Sonobuoy to work on ARM (we lean toward this option, but it seems difficult to do upstream, so we will have to figure out a clever solution)
  • Documentation strategy
    • We have projects lined up to test the Blueprint validation project
    • We need to separate user documentation and developer documentation
    • Follow Documentation Sub-Committee requirements (Architecture, User Guide and Release Notes)
  • Proposal to TSC about mandatory tests: https://docs.google.com/presentation/d/1ONU7jmeGVrhbJe2gKRtode1aHKH8teRgfN91LYAPBmo/edit?usp=sharing
  • We had a discussion about when a blueprint project can be "mature": does the maturation review happen before Release 2 or after it? Do we want to have "Mature" projects in Release 2? The assumption is yes
  • We need to define what a Blueprint Validation release is, since the different blueprints need to be able to test against a stable set of tests. It would make sense to call the first release "2", so that BluVal Release 2 would be used to test blueprints for Release 2. Most likely we will also need a BluVal Release 2.1 to fix bugs; the blueprints could then choose any 2.x version (the latest is probably the best choice, but there is no pressure to upgrade if an earlier release works)
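
On the Sonobuoy item above, a rough sketch of the first option (calling the conformance tests directly, without Sonobuoy); it assumes an arm64 build of the upstream e2e.test binary is available:

    # Run the Kubernetes conformance suite directly against the cluster,
    # bypassing Sonobuoy entirely.
    ./e2e.test --ginkgo.focus='\[Conformance\]' \
        --kubeconfig="$HOME/.kube/config" \
        --provider=skeleton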

June 5, 2019

  • etcd HA test cases
    • Issue is that there are two implementations, depending on how the k8s cluster has been deployed
    • It would be better to have a single implementation that would work with both
    • Juha and Cristina do not have access to an Airship deployment
  • Docker makefile bug
    • Fixed now
  • UI implementation
    • Discussion was about how to push jobs to validate different blueprints in different labs, since there are three kinds:
      • The public community lab that runs in UNH-IOL
      • 'Private' company labs, which can operate in a 'master/agent' Jenkins model
      • Company labs that run in a peer model, whereby the edge lab Jenkins only pulls and pushes but is not a slave to the LF Jenkins
    • All of these will push their results to the LF Nexus, but only the first two can take jobs from the LF Jenkins (and hence from the UI)
    • The UI will soon run in a VM with a public IP

May 29, 2019

  • Pass variables to Robot test cases (838)
    • As discussed earlier, we want to make a single file with all variables
    • Decision: go with the YAML version (a hypothetical sketch follows these notes)
  • Organize file structure for tests (887)
    • Proposal is to create a structure for storing the different tests
    • Additional idea is to add "layer" to the structure
    • Decision: Go ahead
  • New idea: there should be a single container per layer, using bluval to select which tests to run
  • Add container script for k8s HA (839): looks OK
  • UI discussion: the web UI can run in Docker containers, Ioakeim will make the instructions available on how to build them
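
On the YAML decision above, a hypothetical shape for the single variables file; every key and value below is invented for illustration, not the agreed format:

    # Write a variables file and hand it to Robot Framework, which
    # accepts YAML variable files directly (PyYAML must be installed).
    {
        echo 'HOST: 10.0.0.10'
        echo 'USERNAME: akraino'
    } > vars.yaml
    robot --variablefile vars.yaml tests/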

May 22, 2019


Recording: 5-22-19 Validation Framework Call.mp4


May 8, 2019

  • Automatic signed-off-by checks and linting work now: when you check in code, you need to sign off the commit, and you will get a result from an automatic code syntax check (see the example below)
  • Building containers and uploading to hub.docker.com does not yet work, LF staff is investigating
  • There will be a presentation about this project in the Akraino TSC/PTL meeting next week. Tapio Tallgren will make some slides, and Cristina Pauna will check whether Sonobuoy with Robot Framework in a container will work (deadline: Monday)
  • We also had a discussion about the UI framework. The key part is reading the log files, parsing them, and rendering the result on a web page
  • We agreed on a new committer: Ioakeim Samaras
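
For reference on the sign-off item above, the sign-off itself is plain git (the -s flag adds the Signed-off-by trailer that the check looks for):

    git commit -s -m "Describe the change"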

April 24

  • General complaints:
    • Let's start having good commit messages to make it easier for outsiders to understand what is going on
    • Let's also have signed-off-by lines in git commits
  • JJB for patch verification is now done (thanks Cristina!)
  • Container scripts are also working, will be automated soon
  • Test containers will be put on hub.docker.com under akraino/validation (https://hub.docker.com/r/akraino/validation)
  • Dashboard discussion is postponed
  • There is a new Kanban board for the project: https://jira.akraino.org/secure/RapidBoard.jspa?projectKey=VAL&rapidView=5
  • We also had a discussion about a common "environment file" which would contain all the parameters that the different tests need (examples: IP addresses, user names, the Kubernetes admin file); a hypothetical sketch follows these notes. This needs to be built from the bottom up
  • We also discussed bringing a proposal to the TSC about what the mandatory tests will be. Propose to start with the Kubernetes conformance tests and the Linux Test Project (LTP)
  • Need to make sure that all mandatory tests work on different architectures
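
On the "environment file" item above, a hypothetical example of what such a file could contain; all parameter names and values are invented, the notes only name IP addresses, user names, and the Kubernetes admin file as candidates:

    # Invented common environment file, one parameter per line.
    {
        echo 'NODE_IP=192.168.1.20'
        echo 'NODE_USER=akraino'
        echo 'K8S_ADMIN_CONF=/etc/kubernetes/admin.conf'
    } > env.txt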


April 17, 2019

  • JJB for patch verification will be made by Cristina
  • Patch for creating the k8s conformance test Docker container was merged; Miguel will try it out with the Robot test he's working on. The JJB for building the Docker images automatically will be done by Cristina; currently waiting for LF to make the Docker Hub repo available from the Jenkins slaves
  • Robot tests for Cyclictest are developed by Naga
  • Documentation and robot tests for LTP and baremetal are developed by Miguel
  • Juha is looking into Robot Framework
  • Deepak presented a proposal for a Dashboard to trigger and view test results in a user-friendly way; he gathered input from the team and will continue the discussion next time.

April 10, 2019

March 27, 2019

Agenda:

  • New committers to project
  • New meeting time. Propose Wednesdays 10-11 (an hour earlier)
  • Using IRC? Let's try the channel #akraino
  • Let's try to clarify the project scope: passing the mandatory validation tests is part of the Akraino process for a blueprint to mature, so it is very important to agree in the Akraino TSC on what the mandatory tests are. There can be tests that are supported by the framework but which are not mandatory. The logic should be:
        - If the blueprint installs Kubernetes, Kubernetes e2e conformance tests are mandatory
        - If the blueprint installs OpenStack, OpenStack Tempest and Rally are mandatory
    We also discussed making some HA tests mandatory.
  • Then we need to agree on how to move forward. Let's say you run a test; what happens next? If you run the test from the LF Jenkins, it would be great to sync the results to the LF Jenkins. This requires that your lab runs in a DMZ, or that you open a lot of ports on the firewall to run a slave Jenkins. The easier alternative is to copy the test results to the LF Nexus. To make it easy (for scripts) to understand the logs, we need to agree on a common format. For this, we propose running all tests using the Robot Framework.
    - Will BVal.yaml be an optional component to make it easy to chain together a number of tests?
    - To run the tests, I want to run something like "docker run --env-file abcd.txt --volume $(pwd)/results:/results akraino_k8s_tests". For this we need:
        - The docker container
        - Definition of the env-file format: we need to figure out what information the test will require and write it to the environment file
        - The actual code to read the environment file, launch the Robot Framework, run the tests with Robot, and copy the output to output.log
    - Once I have all this, I can make a Jenkins job in my lab that launches the tests, and I can also configure it to copy the output somewhere (a sketch of these pieces follows these notes).
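
A minimal sketch of the pieces listed above, reusing the file and image names from the example; the variable names inside abcd.txt are invented for illustration:

    # Invented env-file consumed by the test container.
    {
        echo 'CLUSTER_MASTER_IP=10.0.0.10'
        echo 'CLUSTER_SSH_USER=akraino'
    } > abcd.txt

    # Run the container; mount a host directory so the Robot output
    # (output.log, output.xml) survives after the container exits.
    docker run --rm \
        --env-file abcd.txt \
        --volume "$(pwd)/results:/results" \
        akraino_k8s_tests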


March 6, 2019

Discussion about the scope of testing + continued discussion about the role of Xtesting

Recording / chat

The plan is to use the Xtesting part of the OPNFV Functest project as the basis for the Akraino Blueprint Validation project. There will be separate containers for the different test sets, hosted on Docker Hub. The "source code" (Dockerfiles, YAML files, etc.) will be hosted in the Gerrit repository; a Jenkins job will build the images and then upload them (a rough sketch follows). We need to identify the mandatory tests that all Akraino Blueprint projects must pass. There will be a centralized database to store test results, to make it easy to follow the progress of the different projects. A web interface to the database would be nice.
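
As a rough sketch, the build-and-upload step of such a Jenkins job could look like this; the image name and tag are placeholders:

    # Build a test-set image from the Gerrit repo and upload it to Docker Hub.
    docker build -t akraino/validation:k8s-conformance .
    docker push akraino/validation:k8s-conformance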

Specific notes:

  • The Nexus repository does not work well for storing Docker images that must support different architectures, as there is no support for manifest files, so it is best to use Docker Hub for images (see the sketch after these notes)
  • Xtesting has two different modes: a "tutorial" one that installs a Jenkins instance, and a "Functest" type, where the Xtesting framework only creates the Jenkins jobs
  • There is no need to have support for voting in the Jenkins jobs
  • Trevor Bramwell has set up the OPNFV test database; we should ask him
  • Cedric is working on rewriting the web page code that shows the test results
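
For context on the manifest point above: on Docker Hub a multi-arch image is published as a manifest list, roughly as below; the tags are invented, and the docker manifest command was experimental at the time (it must be enabled in the CLI):

    # Combine per-architecture images under one multi-arch tag.
    docker manifest create akraino/validation:k8s-latest \
        akraino/validation:k8s-latest-amd64 \
        akraino/validation:k8s-latest-arm64
    docker manifest push akraino/validation:k8s-latest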

February 27, 2019

Discussion about Xtesting.

Recording / chat

February 13, 2019

Agenda/notes

  • Weekly meetings
    • Time is ok?
    • Need to make official somehow
    • Recording or not?
    • Next week?
  • Will use Jira
  • Clarification of workflow
    • Will BVF do testing or just provide the tools?
  • Next tasks
    • Collection of test tools
    • Define "MVP"
    • Get an introduction from OPNFV team
    • What are we missing with this approach?
      • ONAP VNF SDK project
