...

Meetings will be held bi-weekly on Mondays at 3 pm EST/EDT.

Feb 3, 2020

  • we haven't had a meeting in a little while & have stalled out a bit, as the key players setting up the new lab have left the project
  • will look to have a meeting next week to start working up the plan to get this activity restarted

Jan 6, 2020

  • attendees: Bruce, Bill, Alex 
  • first meeting for a while due to holidays 
  • Bruce let us know that the Intel GDC team has been re-assigned & will no longer be working on this blueprint - the work will be moved to another team, to be determined 
  • this is a late-breaking development
  • Bruce will follow up internally on the timing for identifying the new resources/lab to pick up the work that the GDC team was doing  

Dec 2, 2019

  • no meeting, didn't have quorum

Nov 25, 2019

  • attendees: Abraham, Erich, Bruce, Glenn, Bill 
  • Erich & Abraham summarized their progress on installing a full Distributed Cloud in the GDC environment
    • 2 controllers + 4 subclouds
    • start with System Controller(s) - install from scratch
    • currently Subclouds are not installed from scratch, but rather just added to the Cloud 
    • takes about 1.5 hours now, depending on how heavily loaded the network is
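
The "add rather than install from scratch" flow for subclouds described above can be sketched with the StarlingX dcmanager CLI (a rough illustration only - the subcloud address and file name are hypothetical, and exact flags may vary by StarlingX release):

```shell
# On the System Controller: enroll an already-installed subcloud into the
# Distributed Cloud, rather than installing it from scratch.
dcmanager subcloud add \
    --bootstrap-address 10.10.10.2 \
    --bootstrap-values subcloud1-bootstrap-values.yaml

# Confirm the subcloud eventually shows up as managed/online.
dcmanager subcloud list
```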

Nov 18, 2019

  • did not have quorum for the meeting, likely due to holiday at GDC site

Nov 11, 2019

  • meeting cancelled due to Remembrance Day

Nov 4, 2019

  • attendees: Ada, Bill, Bruce
  • work continues on GDC lab/Jenkins setup - team to provide an update by mid-week 

Oct 28, 2019

  • attendees: Ada, Bill, Bruce, Christopher L, Abraham
  • per Christopher, they have started working on the DC install/config
  • were able to establish subnet to subnet communication between System Controllers
  • working on subcloud configuration now
  • current forecast is to complete the install/config by Nov 15 

Oct 21, 2019

  • attendees: Ada, Bill, Bruce, Glenn
  • SFEDC 2.0
    • Ada's team will provide an updated estimate based on the information they've got now - by this Thursday 
      • Bill to fold the additional installation details into the overall distributed cloud documentation 
    • Bill to check if it's still possible to get into the overall 2.0 release if the blueprint 2.0 deliverables are completed some time in November 
  • RFP-Ready Kit (RRK)
    • a proof-of-concept kit to address an RFP 
    • something the IOTG group is using
    • Glenn is interested in using the SFEDC 2.0 blueprint for the IOTG RRK
    • IOTG would take this blueprint, and would apply it to their environment 

Oct 7, 2019

  • attendees: Bill, Ada, Bruce, Alex 
  • generally speaking, the goal of getting the 2nd lab set up by month's end is at risk - we'll continue to work towards that goal; the key items are as follows...

...

    • Distributed Cloud Installation Recipe
      • close, but not quite there - need an updated forecast from Yang on when this can be provided to Ada & team 
      • per Ada, this is definitely at risk now, since they haven't been able to start, and now need to assign a new resource to the activity (Hayde is no longer on the project)
    • Jenkins Information/Instructions
      • per Ada, she has reviewed the Jenkins instructions & they look good - she will ask Christopher to review them as well - Ada will get a forecast from Christopher on setting this up

Sep 30, 2019

  • meeting wasn't held as folks were away 

Sep 23, 2019

  • attendees: Bill, Ada (Intel), Bruce Jones (Intel), Alex Kozyrev (WR)
  • Distributed Cloud installation recipe - still in the realm of possibility to get done this week, Bill to check with Yang
  • GDC HW - per Ada, the servers listed in the Sep 11 meeting will generally be at their disposal; if more powerful servers are required, they may be able to use some, but not on an ongoing basis
  • Jenkins information/instructions - Ada to review the instructions

Sep 16, 2019

  • attendees: Bill, Lionel Pelamourgues 
  • some discussion about use cases for this blueprint - Bill to introduce Lionel to Glenn Seiller for further discussion 

Sep 11, 2019

  • attendees: Bill, Ada Cabrales (Intel), Hayde Martinez Landa (Intel)
  • discussion around next steps for Ada & Hayde to get the 2nd lab up & running (on Intel's premises)
  • the following actions were identified, with some progress notes since then...
    • Ada/Hayde send their server information – we’ll review & assess how similar it is to the environment where we’re running our Distributed Cloud
      • Memory: 64GB
      • CPU: Intel(R) Xeon(R) CPU E3-1275 v6 @ 3.80GHz (8 cores)
      • Disks: 2x447GB
      • Network Cards: Intel X550T, Intel I210
    • Bill send Jenkins information/instructions
    • Bill find out how soon (roughly) we think we’ll be able to send the Distributed Cloud installation recipe
      • current plan is that this information should be available at some time during the week of Sep 23rd 
    • Bill to send the recurring meeting info to Ada/Hayde

Jul 22, 2019

  • attendees: Bill, Dariush, Neil Oliver
  • abbreviated meeting today, will work via email and ad-hoc discussions on Rel 2 planning and blueprint family expansion 
  • Neil Oliver from Intel joined today - Neil is part of the OpenNESS group in the Network & Custom Logic group at Intel
  • OpenNESS is Open Network Edge Services Software
    • https://www.openness.org/
    • formerly the NEV SDK
    • inspired by ETSI MEC platform
    • supports LTE, WiFi, and landline interfaces
    • added the data plane – e.g. traffic steering at the edge node to the right application (e.g. a Container or a VM)
    • also a controller component
  • the OpenNESS controller can use underlying components (like an OpenStack controller)
  • they're at the initial open-source release of the software
  • many customers have experience with the NEV SDK, which has been out for 2-3 years
    • there’s a Wind River Linux (Titanium Server) and a CentOS version
  • Neil's looking at synergy with StarlingX control components – maybe the OpenNESS controller could act as a plug-in that StarlingX could use
  • OpenNESS is in a couple of other blueprints as well - one is the Tencent connected vehicle blueprint – traffic steering to a vehicle or an edge device
  • we'll continue talking to Neil about the possibilities for defining a blueprint that includes StarlingX and OpenNESS

...