Team Info

Project Technical Lead

Bill Zvonar. Elected 1/17/19.

Committers

Committer      | Company    | Contact Info                 | Committer Bio/Picture | PTL
Bill Zvonar    | Wind River | bill.zvonar@windriver.com    |                       | Y
Bruce Jones    | Intel      | bruce.e.jones@intel.com      |                       | N
Dariush Eslimi | Wind River | Dariush.Eslimi@windriver.com |                       | N
Alex Kozyrev   | Wind River | Alex.Kozyrev@windriver.com   |                       | N

Documentation

Presentations

Blueprint Proposal

StarlingX Far Edge Distributed Cloud Blueprint Proposal

Blueprint Specification

StarlingX Far Edge Distributed Cloud Blueprint Specification

Rel 1 Documentation

Rel 1 Self-Certification Status

See StarlingX Far Edge Distributed Cloud Documentation for further Rel 1 documentation.

Demo Video (from ONS)

Meetings

Meetings will be held bi-weekly on Mondays at 3 pm EST/EDT.

Feb 3, 2020

  • haven't had a meeting in a little while & have stalled out a bit, as the key players setting up the new lab have left the project 
  • will look to have a meeting next week to start working up the plan to get this activity started up again 

Jan 6, 2020

  • attendees: Bruce, Bill, Alex 
  • first meeting for a while due to holidays 
  • Bruce let us know that the Intel GDC team has been re-assigned & will no longer be working on this blueprint - the work will be moved to another team, to be determined 
  • this is a late-breaking development
  • Bruce will follow up internally on the timing for identifying the new resources/lab to pick up the work that the GDC team was doing  

Dec 2, 2019

  • no meeting, didn't have quorum

Nov 25, 2019

  • attendees: Abraham, Erich, Bruce, Glenn, Bill 
  • Erich & Abraham summarized their progress on installing a full Distributed Cloud in the GDC environment 
    • 2 controllers + 4 subclouds
    • start with System Controller(s) - install from scratch
    • currently Subclouds are not installed from scratch, but rather just added to the Cloud (see the sketch below)
    • takes about 1.5 hours now, depending on how heavily loaded the network is
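
Since the subclouds are added to the existing System Controller rather than reinstalled, the loop below is a minimal sketch of that step using the StarlingX dcmanager CLI, driven from Python. The subcloud names, bootstrap addresses, and values files are hypothetical placeholders; the exact dcmanager options should be checked against the Distributed Cloud installation recipe.

    #!/usr/bin/env python3
    # Sketch: add subclouds to an already-installed System Controller.
    # Assumes the dcmanager CLI is available and authenticated; all names,
    # addresses, and file paths below are hypothetical placeholders.
    import subprocess

    # Hypothetical inventory matching the 2 controllers + 4 subclouds layout.
    SUBCLOUDS = [
        ("subcloud1", "10.10.10.2", "subcloud1-bootstrap-values.yaml"),
        ("subcloud2", "10.10.10.3", "subcloud2-bootstrap-values.yaml"),
        ("subcloud3", "10.10.10.4", "subcloud3-bootstrap-values.yaml"),
        ("subcloud4", "10.10.10.5", "subcloud4-bootstrap-values.yaml"),
    ]

    for name, address, values in SUBCLOUDS:
        # Subclouds are added to the running System Controller, not
        # reinstalled from scratch, per the notes above.
        subprocess.run(
            ["dcmanager", "subcloud", "add",
             "--bootstrap-address", address,
             "--bootstrap-values", values],
            check=True,
        )
        print(f"requested add of {name}")

    # Afterwards, poll "dcmanager subcloud list" until each subcloud
    # reports managed/online.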

Nov 18, 2019

  • did not have quorum for the meeting, likely due to a holiday at the GDC site

Nov 11, 2019

  • meeting cancelled due to Remembrance Day

Nov 4, 2019

  • attendees: Ada, Bill, Bruce
  • work continues on GDC lab/Jenkins setup - team to provide an update by mid-week 

Oct 28, 2019

  • attendees: Ada, Bill, Bruce, Christopher L, Abraham
  • per Christopher, they have started working on the DC install/config
  • were able to establish subnet-to-subnet communication between System Controllers
  • working on subcloud configuration now (see the reachability sketch below)
  • current forecast is to complete the install/config by Nov 15 
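
As a quick way to verify that the subnet-to-subnet path stays up while subcloud configuration proceeds, the sketch below pings a handful of peer addresses; all IPs are hypothetical placeholders, and the ping flags assume Linux.

    #!/usr/bin/env python3
    # Sketch: sanity-check reachability between System Controllers and
    # subcloud management addresses. All IPs are hypothetical placeholders.
    import subprocess

    PEERS = {
        "peer system controller": "10.10.20.1",
        "subcloud1 management": "10.10.10.2",
        "subcloud2 management": "10.10.10.3",
    }

    for label, ip in PEERS.items():
        # One ICMP echo with a 2-second timeout (Linux ping syntax).
        ok = subprocess.run(
            ["ping", "-c", "1", "-W", "2", ip],
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL,
        ).returncode == 0
        print(f"{label:24} {ip:15} {'reachable' if ok else 'UNREACHABLE'}")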

Oct 21, 2019

  • attendees: Ada, Bill, Bruce, Glenn
  • SFEDC 2.0
    • Ada's team will provide an updated estimate based on the information they've got now - by this Thursday 
      • Bill to fold the additional installation details into the overall distributed cloud documentation 
    • Bill to check if it's still possible to get into the overall 2.0 release if the blueprint 2.0 deliverables are completed some time in November 
  • RFP-Ready Kit (RRK)
    • a proof-of-concept kit to address an RFP 
    • something the IOTG group is using
    • Glenn is interested in using the SFEDC 2.0 blueprint for their RRK 
    • IOTG would take this blueprint, and would apply it to their environment 

Oct 7, 2019

  • attendees: Bill, Ada, Bruce, Alex 
  • generally speaking, getting the 2nd lab set up by month's end is at risk - we'll continue to work towards that goal; the key items are as follows... 
    • Distributed Cloud Installation Recipe
      • close, but not quite there - need an updated forecast from Yang on when this can be provided to Ada & team 
      • per Ada, this is for sure at risk now, since they haven't been able to start, and now need to assign a new resource to the activity (Hayde is no longer on the project)
    • Jenkins Information/Instructions
      • per Ada, she has reviewed the instructions and they look good - she will ask Christopher to review them as well - Ada will get a forecast from Christopher on setting this up (a sketch of triggering a job remotely follows below)
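
For reference, remotely kicking off a Jenkins job once the instance is up usually amounts to one authenticated POST against the standard Jenkins remote-access API. The sketch below is a minimal example; the URL, job name, and credentials are hypothetical placeholders, and authenticating with a user API token avoids the CSRF-crumb step on recent Jenkins versions.

    #!/usr/bin/env python3
    # Sketch: trigger a lab-setup Jenkins job via the Jenkins REST API.
    # The URL, job name, and credentials are hypothetical placeholders.
    import requests

    JENKINS_URL = "https://jenkins.example.com"  # placeholder
    JOB = "sfedc-lab-setup"                      # placeholder job name
    AUTH = ("lab-user", "api-token")             # placeholder user/API token

    # POST to /job/<name>/build queues a new run of the job.
    resp = requests.post(f"{JENKINS_URL}/job/{JOB}/build", auth=AUTH, timeout=30)
    resp.raise_for_status()

    # Jenkins answers 201 Created with a Location header that points at
    # the queue item for the newly scheduled build.
    print("queued:", resp.headers.get("Location"))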

Sep 30, 2019

  • meeting wasn't held as folks were away 

Sep 23, 2019

  • attendees: Bill, Ada (Intel), Bruce Jones (Intel), Alex Kozyrev (WR)
  • Distributed Cloud installation recipe - still in the realm of possibility to get done this week, Bill to check with Yang
  • GDC HW - per Ada, the servers listed in the Sep 11 meeting will generally be at their disposal; if more powerful servers are required, they may be able to use some, but not on an ongoing basis 
  • Jenkins information/instructions - Ada to review them 

Sep 16, 2019

  • attendees: Bill, Lionel Pelamourgues 
  • some discussion about use cases for this blueprint - Bill to introduce Lionel to Glenn Seiller for further discussion 

Sep 11, 2019

  • attendees: Bill, Ada Cabrales (Intel), Hayde Martinez Landa (Intel)
  • discussion around next steps for Ada & Hayde to get the 2nd lab up & running (on Intel's premises)
  • the following actions were identified, with some progress notes since then...
    • Ada/Hayde send their server information – we’ll review & assess how similar it is to where we’re running our Distributed Cloud (a quick spec-check sketch follows after this list)
      • Memory: 64GB
      • CPU: Intel(R) Xeon(R) CPU E3-1275 v6 @ 3.80GHz (8 logical cores)
      • Disks: 2x447GB
      • Network Cards: Intel X550T, Intel I210
    • Bill send Jenkins information/instructions
    • Bill find out how soon (roughly) we think we’ll be able to send the Distributed Cloud installation recipe
      • current plan is that this information should be available at some time during the week of Sep 23rd 
    • Bill to send the recurring meeting info to Ada/Hayde
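
To make the server comparison concrete, the sketch below checks a candidate Linux host against the reference figures listed above (64GB memory, 8 logical cores); the thresholds simply restate those figures, and disk/NIC checks are left out for brevity.

    #!/usr/bin/env python3
    # Sketch: compare a candidate lab server against the reference specs
    # noted above (64GB RAM, 8 logical CPUs). Linux-only: reads /proc/meminfo.
    import os

    MIN_LOGICAL_CPUS = 8  # reference CPU: Xeon E3-1275 v6
    MIN_MEM_GB = 64       # reference memory: 64GB

    def mem_gb():
        # MemTotal in /proc/meminfo is reported in kB.
        with open("/proc/meminfo") as f:
            for line in f:
                if line.startswith("MemTotal:"):
                    return int(line.split()[1]) / (1024 ** 2)
        raise RuntimeError("MemTotal not found")

    cpus = os.cpu_count() or 0
    mem = mem_gb()
    print(f"logical CPUs: {cpus} (reference: {MIN_LOGICAL_CPUS})")
    print(f"memory: {mem:.1f} GB (reference: {MIN_MEM_GB})")
    if cpus >= MIN_LOGICAL_CPUS and mem >= MIN_MEM_GB:
        print("comparable to the reference server")
    else:
        print("falls short of the reference server")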

Jul 22, 2019

  • attendees: Bill, Dariush, Neil Oliver
  • abbreviated meeting today, will work via email and ad-hoc discussions on Rel 2 planning and blueprint family expansion 
  • Neil Oliver from Intel joined today - Neil is part of the OpenNESS group in the Network & Custom Logic group at Intel 
  • OpenNESS is Open Network Edge Services Software
    • https://www.openness.org/
    • formerly the NEV SDK
    • inspired by the ETSI MEC platform
    • supports LTE, WiFi, and landline interfaces
    • adds the data plane – e.g. traffic steering at the edge node to the right application (e.g. a Container or a VM)
    • also includes a controller component
  • the OpenNESS controller can use underlying components (like an OpenStack controller)
  • the software is in its initial open-source release
  • many customers have experience with the NEV SDK, which has been out for 2-3 years
    • there’s a Wind River Linux (Titanium Server) and a CentOS version
  • Neil is looking at synergy with StarlingX control components – maybe the OpenNESS controller could act as a plug-in that StarlingX could use
  • OpenNESS is in a couple of other blueprints as well - one is the Tencent connected vehicle blueprint – traffic steering to a vehicle or an edge device
  • we'll continue talking to Neil about the possibilities for defining a blueprint that includes StarlingX and OpenNESS

Jul 8, 2019

  • team is currently working on defining the plan for Rel 2, and what our "infrastructure" work items are, as defined by the various sub-committees
  • working through the charts and updates from the Mini-Summit in June (per 06/17-19/2019 Technical Mini-Summit and Release 2 Planning)
  • currently, the sub-committee requirements are unclear in most cases; need to firm this up (will ask at the TSC Working meeting tomorrow)
  • missed a number of meetings leading up to Rel 1 completion - should go back and add the details here at some point

April 1, 2019

  • meeting cancelled, next one will be held on April 15 

Mar 18, 2019

  • attendees: Alex, Bill, Dariush, Glenn, Greg, Numan

  • ONS Demo Video
    • lab setup is in progress, should be ready tomorrow
    • Bill: post the video up on the SFEDC blueprint wiki
    • Dariush: put a preliminary README file up on the SFEDC Gerrit
  • ONS Booth
  • ONS TSC Face to Face
    • per Glenn, it’s on Tuesday
    • Dariush will be on vacation, not able to attend, Glenn will work a plan with Ian to cover
  • SFEDC Dedicated Hardware
    • while we're working our plan to get our own dedicated HW, we should confirm that we’re planning to validate our blueprint on the Community Lab
    • Bill to raise with the process sub-committee 
  • Release 1 Planning
    • Rel 1 pushed to end of May now (May 31st), approved at the last TSC meeting
    • it was acknowledged that all of the blueprints will remain in incubation after this release

Mar 4, 2019

  • focused on plan for ONS demo video using in-house lab facilities - target by March 15
  • no line of sight yet to lab HW, expecting update by end of week 

Feb 25, 2019

Agenda items:

  • plan for dedicated lab H/W
  • refining plan for Rel 1 - lab setup, use cases, CI/CD
  • plan for ONS demo video - subset of Rel 1 functionality 
  • day/time/cadence for regular meeting - bi-weekly on Mondays, need to create meeting on Akraino calendar

Attachments

  • Akraino_Blueprint_Far_Edge_v3.pdf (modified Nov 09, 2018 by Jim Einarsson)
  • Far Edge Akraino Blueprint v5.pdf (modified Dec 07, 2018 by Ian Jolliffe)
  • Akraino_BP_BoM.xlsx (modified Dec 07, 2018 by Cesar Berho)
  • Akraino-StarlingX-EdgeX.mp4 (modified Mar 28, 2019 by Bill Zvonar)
  • Akraino R1 StarlingX Far Edge Distributed Cloud Datasheet.docx (modified May 30, 2019 by Bill Zvonar)
  • Akraino Rel 1 Self-Certification Status - StarlingX Far Edge Distributed Cloud.pptx (modified Jun 05, 2019 by Bill Zvonar)
