kube-public       Active   8s
kube-system       Active   8s
Track k8scluster init_k8s_ok: https://osm.etsi.org/InstallLog.php?&installation_id=1700556187-NchhxCXLVFgga2VP&local_ts=1700556288&event=k8scluster&operation=init_k8s_ok&value=&comment=&tags=
--2023-11-21 08:44:48--  https://raw.githubusercontent.com/coreos/flannel/master/Documentation/kube-flannel.yml
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 185.199.110.133, 185.199.111.133, 185.199.108.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|185.199.110.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 4398 (4.3K) [text/plain]
Saving to: ‘/tmp/flannel.m5e6n2/kube-flannel.yml’

     0K ....                                                  100% 28.3M=0s

2023-11-21 08:44:48 (28.3 MB/s) - ‘/tmp/flannel.m5e6n2/kube-flannel.yml’ saved [4398/4398]

namespace/kube-flannel created
clusterrole.rbac.authorization.k8s.io/flannel created
clusterrolebinding.rbac.authorization.k8s.io/flannel created
serviceaccount/flannel created
configmap/kube-flannel-cfg created
daemonset.apps/kube-flannel-ds created
node/osmtest202311210839 untainted
LAST SEEN   TYPE      REASON                    OBJECT                     MESSAGE
11s         Normal    Starting                  node/osmtest202311210839   Starting kubelet.
11s         Warning   InvalidDiskCapacity       node/osmtest202311210839   invalid capacity 0 on image filesystem
11s         Normal    NodeAllocatableEnforced   node/osmtest202311210839   Updated Node Allocatable limit across pods
11s         Normal    NodeHasSufficientMemory   node/osmtest202311210839   Node osmtest202311210839 status is now: NodeHasSufficientMemory
11s         Normal    NodeHasNoDiskPressure     node/osmtest202311210839   Node osmtest202311210839 status is now: NodeHasNoDiskPressure
11s         Normal    NodeHasSufficientPID      node/osmtest202311210839   Node osmtest202311210839 status is now: NodeHasSufficientPID
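The wget/kubectl sequence above corresponds roughly to the following step, sketched here for reference (the temp-dir naming and the taint key are assumptions; the taint key in particular depends on the Kubernetes version — older releases use `node-role.kubernetes.io/master`):

```shell
#!/bin/sh
# Sketch of the Flannel CNI step logged above; assumes kubectl already
# points at the freshly initialized cluster (network access required).
FLANNEL_URL="https://raw.githubusercontent.com/coreos/flannel/master/Documentation/kube-flannel.yml"
WORKDIR=$(mktemp -d /tmp/flannel.XXXXXX)

wget -q -O "${WORKDIR}/kube-flannel.yml" "${FLANNEL_URL}"
kubectl apply -f "${WORKDIR}/kube-flannel.yml"

# Single-node install: drop the control-plane taint so workloads can
# schedule on the only node ("node/... untainted" in the log).
kubectl taint nodes --all node-role.kubernetes.io/control-plane- || true
```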
Track k8scluster k8s_ready_before_helm: https://osm.etsi.org/InstallLog.php?&installation_id=1700556187-NchhxCXLVFgga2VP&local_ts=1700556294&event=k8scluster&operation=k8s_ready_before_helm&value=&comment=&tags=
Deleting existing namespace osm: kubectl delete ns osm
Helm 3 is not installed, installing ...
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed

100 14.7M  100 14.7M    0     0  14.8M      0 --:--:-- --:--:-- --:--:-- 14.8M
linux-amd64/
linux-amd64/LICENSE
linux-amd64/README.md
linux-amd64/helm
version.BuildInfo{Version:"v3.11.3", GitCommit:"323249351482b3bbfc9f5004f65d400aa70f9ae7", GitTreeState:"clean", GoVersion:"go1.20.3"}
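The Helm bootstrap above (curl download, `linux-amd64/` tarball listing, version check) follows the stock tarball install; a minimal sketch, assuming the standard get.helm.sh download layout and the version the log reports:

```shell
#!/bin/sh
# Install Helm 3 from the official release tarball (network and sudo required).
HELM_VERSION="v3.11.3"                    # version reported in the log
HELM_TARBALL="helm-${HELM_VERSION}-linux-amd64.tar.gz"

curl -fsSL "https://get.helm.sh/${HELM_TARBALL}" -o "/tmp/${HELM_TARBALL}"
tar -zxf "/tmp/${HELM_TARBALL}" -C /tmp   # unpacks linux-amd64/{helm,LICENSE,README.md}
sudo mv /tmp/linux-amd64/helm /usr/local/bin/helm
helm version                              # prints version.BuildInfo{...}
```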
"stable" has been added to your repositories
Hang tight while we grab the latest from your chart repositories...
...Successfully got an update from the "stable" chart repository
Update Complete. ⎈Happy Helming!⎈
Track k8scluster install_helm_ok: https://osm.etsi.org/InstallLog.php?&installation_id=1700556187-NchhxCXLVFgga2VP&local_ts=1700556298&event=k8scluster&operation=install_helm_ok&value=&comment=&tags=
Installing OpenEBS
"openebs" has been added to your repositories
Hang tight while we grab the latest from your chart repositories...
...Successfully got an update from the "openebs" chart repository
...Successfully got an update from the "stable" chart repository
Update Complete. ⎈Happy Helming!⎈
NAME: openebs
LAST DEPLOYED: Tue Nov 21 08:45:05 2023
NAMESPACE: openebs
STATUS: deployed
REVISION: 1
TEST SUITE: None
NOTES:
Successfully installed OpenEBS.

Check the status by running: kubectl get pods -n openebs

The default values will install NDM and enable OpenEBS hostpath and device
storage engines along with their default StorageClasses. Use `kubectl get sc`
to see the list of installed OpenEBS StorageClasses.

**Note**: If you are upgrading from the older helm chart that was using cStor
and Jiva (non-csi) volumes, you will have to run the following command to include
the older provisioners:

helm upgrade openebs openebs/openebs \
	--namespace openebs \
	--set legacy.enabled=true \
	--reuse-values

For other engines, you will need to perform a few additional steps to
enable and configure them (e.g. creating pools) and create
StorageClasses.

For example, cStor can be enabled using commands like:

helm upgrade openebs openebs/openebs \
	--namespace openebs \
	--set cstor.enabled=true \
	--reuse-values

For more information, 
- view the online documentation at https://openebs.io/docs or
- connect with an active community on Kubernetes slack #openebs channel.
NAME   	NAMESPACE	REVISION	UPDATED                                	STATUS  	CHART        	APP VERSION
openebs	openebs  	1       	2023-11-21 08:45:05.382564114 +0000 UTC	deployed	openebs-3.7.0	3.7.0      
Waiting for storageclass
Storageclass available
storageclass.storage.k8s.io/openebs-hostpath patched
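The "patched" line above is where `openebs-hostpath` becomes the cluster's default StorageClass; the installer's exact invocation is not shown, but the usual patch (an assumption based on the standard default-class annotation) looks like:

```shell
# Mark openebs-hostpath as the default StorageClass (cluster access required).
kubectl patch storageclass openebs-hostpath -p \
  '{"metadata": {"annotations": {"storageclass.kubernetes.io/is-default-class": "true"}}}'
kubectl get sc   # the default class is flagged "(default)"
```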
Track k8scluster k8s_storageclass_ok: https://osm.etsi.org/InstallLog.php?&installation_id=1700556187-NchhxCXLVFgga2VP&local_ts=1700556306&event=k8scluster&operation=k8s_storageclass_ok&value=&comment=&tags=
Installing MetalLB
"metallb" has been added to your repositories
Hang tight while we grab the latest from your chart repositories...
...Successfully got an update from the "metallb" chart repository
...Successfully got an update from the "openebs" chart repository
...Successfully got an update from the "stable" chart repository
Update Complete. ⎈Happy Helming!⎈
NAME: metallb
LAST DEPLOYED: Tue Nov 21 08:45:07 2023
NAMESPACE: metallb-system
STATUS: deployed
REVISION: 1
TEST SUITE: None
NOTES:
MetalLB is now running in the cluster.

Now you can configure it via its CRs. Please refer to the metallb official docs
on how to use the CRs.
Track k8scluster k8s_metallb_ok: https://osm.etsi.org/InstallLog.php?&installation_id=1700556187-NchhxCXLVFgga2VP&local_ts=1700556308&event=k8scluster&operation=k8s_metallb_ok&value=&comment=&tags=
Installing cert-manager
"jetstack" has been added to your repositories
Hang tight while we grab the latest from your chart repositories...
...Successfully got an update from the "metallb" chart repository
...Successfully got an update from the "openebs" chart repository
...Successfully got an update from the "jetstack" chart repository
...Successfully got an update from the "stable" chart repository
Update Complete. ⎈Happy Helming!⎈
NAME: cert-manager
LAST DEPLOYED: Tue Nov 21 08:45:09 2023
NAMESPACE: cert-manager
STATUS: deployed
REVISION: 1
TEST SUITE: None
NOTES:
cert-manager v1.9.1 has been deployed successfully!

In order to begin issuing certificates, you will need to set up a ClusterIssuer
or Issuer resource (for example, by creating a 'letsencrypt-staging' issuer).

More information on the different types of issuers and how to configure them
can be found in our documentation:

https://cert-manager.io/docs/configuration/

For information on how to configure cert-manager to automatically provision
Certificates for Ingress resources, take a look at the `ingress-shim`
documentation:

https://cert-manager.io/docs/usage/ingress/
Track k8scluster k8s_certmanager_ok: https://osm.etsi.org/InstallLog.php?&installation_id=1700556187-NchhxCXLVFgga2VP&local_ts=1700556351&event=k8scluster&operation=k8s_certmanager_ok&value=&comment=&tags=

Bootstrapping... 1 check of 100
MetalLB: Waiting for 1 of 2 pods to be ready:
metallb-speaker-vr8qz	2/4	

CertManager: Waiting for 1 of 1 pods to be ready:
No	resources	


Bootstrapping... 2 checks of 100
MetalLB: Waiting for 1 of 2 pods to be ready:
metallb-speaker-vr8qz	3/4	

CertManager: Waiting for 1 of 1 pods to be ready:
No	resources	


===> Successful checks: 1/10
===> Successful checks: 2/10
===> Successful checks: 3/10
===> Successful checks: 4/10
===> Successful checks: 5/10
===> Successful checks: 6/10
===> Successful checks: 7/10
===> Successful checks: 8/10
===> Successful checks: 9/10
===> Successful checks: 10/10
K8S CLUSTER IS READY
Track k8scluster k8s_ready_ok: https://osm.etsi.org/InstallLog.php?&installation_id=1700556187-NchhxCXLVFgga2VP&local_ts=1700556390&event=k8scluster&operation=k8s_ready_ok&value=&comment=&tags=
Creating IP address pool manifest: /etc/osm/metallb-ipaddrpool.yaml
apiVersion: metallb.io/v1beta1
kind: IPAddressPool
metadata:
  name: first-pool
  namespace: metallb-system
spec:
  addresses:
  - 172.21.23.5/32
Applying IP address pool manifest: kubectl apply -f /etc/osm/metallb-ipaddrpool.yaml
ipaddresspool.metallb.io/first-pool created
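MetalLB in layer-2 mode also needs an L2Advertisement that references the pool before it will answer ARP for the pool's addresses; the log only shows the IPAddressPool, so the resource below is a sketch (the name `first-pool-l2` is made up; `first-pool` matches the manifest above):

```shell
# Announce the addresses in "first-pool" via L2 (cluster access required).
cat <<'EOF' | kubectl apply -f -
apiVersion: metallb.io/v1beta1
kind: L2Advertisement
metadata:
  name: first-pool-l2
  namespace: metallb-system
spec:
  ipAddressPools:
  - first-pool
EOF
```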
Track k8scluster k8scluster_ok: https://osm.etsi.org/InstallLog.php?&installation_id=1700556187-NchhxCXLVFgga2VP&local_ts=1700556391&event=k8scluster&operation=k8scluster_ok&value=&comment=&tags=
DEBUG_INSTALL=
DEFAULT_IP=172.21.23.5
OSM_BEHIND_PROXY=
OSM_DEVOPS=/usr/share/osm-devops
HOME=/home/ubuntu
Installing juju client
juju (2.9/stable) 2.9.45 from Canonical✓ installed
Finished installation of juju client
Track juju juju_client_ok: https://osm.etsi.org/InstallLog.php?&installation_id=1700556187-NchhxCXLVFgga2VP&local_ts=1700556397&event=juju&operation=juju_client_ok&value=&comment=&tags=
Since Juju 2 is being run for the first time, it has downloaded the latest public cloud information.

k8s substrate added as cloud "k8scloud" with storage provisioned
by the existing "openebs-hostpath" storage class.
You can now bootstrap to this cloud by running 'juju bootstrap k8scloud'.
08:46:40 INFO  juju.cmd supercommand.go:56 running juju [2.9.45 afb8ee760af71d0bca8c3e4e0dc28af2dabc9b1d gc go1.20.8]
08:46:40 DEBUG juju.cmd supercommand.go:57   args: []string{"/snap/juju/24550/bin/juju", "bootstrap", "-v", "--debug", "k8scloud", "osm", "--config", "controller-service-type=loadbalancer", "--agent-version=2.9.43"}
08:46:40 DEBUG juju.cmd.juju.commands bootstrap.go:1313 authenticating with region "" and credential "k8scloud" ()
08:46:40 DEBUG juju.cmd.juju.commands bootstrap.go:1461 provider attrs: map[operator-storage: workload-storage:]
08:46:41 INFO  cmd authkeys.go:114 Adding contents of "/home/ubuntu/.local/share/juju/ssh/juju_id_rsa.pub" to authorized-keys
08:46:41 DEBUG juju.cmd.juju.commands bootstrap.go:1536 preparing controller with config: map[agent-metadata-url: agent-stream:released apt-ftp-proxy: apt-http-proxy: apt-https-proxy: apt-mirror: apt-no-proxy: authorized-keys:ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDvpDiTwlyo92AsDuy7FPekyFKdjuALJ70ygAl9NIirfhH78MTvZGuD76JagTLaPYKGg35er6ttAZ+rUYTjPQ/ns7gMypCn3fHAf5TEzoBRZoJx/7ZXoLZqKGnnuvlF2181SV2smjCJ6uzr769FZCiz70kFYJAh8xtPqCbXnX2odIFZQAgOprzJ+bGgqcJ79NkBA7xVdiDqRiN1Zi/ZpuId1B2MXw0c7wK/3A7WNSZ4cLN+W1iWRR+XbOfj59qnSpbcrb0LAHGcPg21BdI9dCaxeTQNZwFlAtS3/2iAYeICn475D2BS0GSqAE7vyBy8MzzOqcbBup+cA+wxYdx1BV8J juju-client-key
 automatically-retry-hooks:true backup-dir: charmhub-url:https://api.charmhub.io cloudinit-userdata: container-image-metadata-url: container-image-stream:released container-inherit-properties: container-networking-method: default-series: default-space: development:false disable-network-management:false disable-telemetry:false egress-subnets: enable-os-refresh-update:true enable-os-upgrade:true fan-config: firewall-mode:instance ftp-proxy: http-proxy: https-proxy: ignore-machine-addresses:false image-metadata-url: image-stream:released juju-ftp-proxy: juju-http-proxy: juju-https-proxy: juju-no-proxy:127.0.0.1,localhost,::1 logforward-enabled:false logging-config: logging-output: lxd-snap-channel:5.0/stable max-action-results-age:336h max-action-results-size:5G max-status-history-age:336h max-status-history-size:5G name:controller net-bond-reconfigure-delay:17 no-proxy:127.0.0.1,localhost,::1 num-container-provision-workers:4 num-provision-workers:16 operator-storage:openebs-hostpath provisioner-harvest-mode:destroyed proxy-ssh:false resource-tags: snap-http-proxy: snap-https-proxy: snap-store-assertions: snap-store-proxy: snap-store-proxy-url: ssl-hostname-verification:true test-mode:false transmit-vendor-metrics:true type:kubernetes update-status-hook-interval:5m uuid:1d5f2b8f-6da1-4a5c-8d8d-fc88d5b7e213 workload-storage:openebs-hostpath]
08:46:41 DEBUG juju.kubernetes.provider provider.go:140 opening model "controller".
08:46:41 INFO  cmd bootstrap.go:856 Creating Juju controller "osm" on k8scloud
08:46:41 INFO  juju.cmd.juju.commands bootstrap.go:927 combined bootstrap constraints: 
08:46:41 INFO  cmd bootstrap.go:975 Bootstrap to generic Kubernetes cluster
08:46:41 DEBUG juju.environs.simplestreams simplestreams.go:417 searching for signed metadata in datasource "gui simplestreams"
08:46:41 DEBUG juju.environs.simplestreams simplestreams.go:452 looking for data index using path streams/v1/index2.sjson
08:46:41 DEBUG juju.environs.simplestreams simplestreams.go:464 looking for data index using URL https://streams.canonical.com/juju/gui/streams/v1/index2.sjson
08:46:41 DEBUG juju.environs.simplestreams simplestreams.go:467 streams/v1/index2.sjson not accessed, actual error: [{github.com/juju/juju/environs/simplestreams.(*urlDataSource).Fetch:192: "https://streams.canonical.com/juju/gui/streams/v1/index2.sjson" not found}]
08:46:41 DEBUG juju.environs.simplestreams simplestreams.go:468 streams/v1/index2.sjson not accessed, trying legacy index path: streams/v1/index.sjson
08:46:41 DEBUG juju.environs.simplestreams simplestreams.go:487 read metadata index at "https://streams.canonical.com/juju/gui/streams/v1/index.sjson"
08:46:41 DEBUG juju.environs.simplestreams simplestreams.go:1019 finding products at path "streams/v1/com.canonical.streams-released-dashboard.sjson"
08:46:41 INFO  cmd bootstrap.go:871 Fetching Juju Dashboard 0.8.1
08:46:41 DEBUG juju.kubernetes.provider k8s.go:476 controller pod config: 
&{Tags:map[] Bootstrap:0xc0002ae380 DisableSSLHostnameVerification:false ProxySettings:{Http: Https: Ftp: NoProxy:127.0.0.1,localhost,::1 AutoNoProxy:} Controller:map[agent-logfile-max-backups:2 agent-logfile-max-size:100M api-port:17070 api-port-open-delay:2s audit-log-capture-args:false audit-log-exclude-methods:[ReadOnlyMethods] audit-log-max-backups:10 audit-log-max-size:300M auditing-enabled:true batch-raft-fsm:false ca-cert:-----BEGIN CERTIFICATE-----
MIIEEzCCAnugAwIBAgIVAMBJr9XQC1dqGeX6F++KUrUyqFBYMA0GCSqGSIb3DQEB
CwUAMCExDTALBgNVBAoTBEp1anUxEDAOBgNVBAMTB2p1anUtY2EwHhcNMjMxMTIx
MDg0MTQxWhcNMzMxMTIxMDg0NjQxWjAhMQ0wCwYDVQQKEwRKdWp1MRAwDgYDVQQD
EwdqdWp1LWNhMIIBojANBgkqhkiG9w0BAQEFAAOCAY8AMIIBigKCAYEA2xdzp/JA
9zXFccytQfw7PcyVTfGIPzsHL+zS5o+lqXqG61ZW9UtGIJCXnJ9dS8eSvM/kTLyW
SxTLL8YpW0Jzw2kTRv7OH5KseNESrk7QA+jwIDrht+5+YQoO4PXbw+jgRDeTC7+6
4bguttonDBr+nqHCB/XeiqQb+cCgzjphiLHejo5pa0SZO2b0oqLpfxA1nmq8Hblc
eK5bQZ68dRrmUkQ7BQOfKDoc2kRA6SVrYAXr2XaiHV9XC91d8OhugANRNEq58OVH
++8t4PorGGHHORyFpw4+PQhAOTV923ngIt0qSlR17xMuN2dD+QfGKyT5GO6q9egB
lJYKVVDN8JbrAQqZ/MagJIbewQcZY5Yt3DbxrRZ34gSMcbUpegBooCKYuR6MuMFn
6gknJEKPuX8VGZuW+aCdbqBV7slSdyvcZHaybkaRkZrmVb8UclXuTBhU6ta5dWyz
gbSInGF4sbkAqz6ApWmllhoqsqzBIz52f7fQZqy3V7W7rwdzXNHQgRZFAgMBAAGj
QjBAMA4GA1UdDwEB/wQEAwICpDAPBgNVHRMBAf8EBTADAQH/MB0GA1UdDgQWBBQy
ESU3+QqC3LDd5+36X1xNvgmfHTANBgkqhkiG9w0BAQsFAAOCAYEAvPUpqjY9p1Nl
wtrc4JKYc8+JbHlklWN3ysawoVaez5wXZX4P1sdcNhMeUKSIQfm5G462q9qzgwRl
tRWRDcuXDuni4gH3k3qY0k9x21AJfjZG517XzMUEOLGRxxzW3PuA+J5NIbG1WpE+
e3uoYsJwMo8jopWquCLwUV7LROpg2TT6udpFsOpdaDoPfVmNwO9aWiQCvDchvNdY
DvmiO6M/H/LShyq/n40QXqvTmm+1dX65y1IlI8Kk1H64ZKKvtAZXHgu0zom/as7X
pUlIcGVuDbRHxQzY/utFHHWQaDhxtmyteRUhNISCFnq8iSBnKGYZjO8EEY/j6Cm1
YE+5jc4eYLNYjn3ZA2lPULPVL5a+SJ991WvU6WSGdY0KynFQYrc0G5KwNgLeBgRQ
bMq9w9tqsHwWFG6tMaKyDAbcCjhiKnXE9d5yMow4fXFVbRYpDE6VE/UgPkXbUvW/
EVrb9OaLCil9BgTLZLkXg66BDfoG/AWFquKq82Qaqf1xyOkb1+Y5
-----END CERTIFICATE-----
 charmstore-url:https://api.jujucharms.com/charmstore controller-name:osm controller-uuid:26eaaedd-2d0c-4e4a-8f77-44891c106097 juju-db-snap-channel:4.4/stable max-agent-state-size:524288 max-charm-state-size:2097152 max-debug-log-duration:24h0m0s max-prune-txn-batch-size:1000000 max-prune-txn-passes:100 max-txn-log-size:10M metering-url:https://api.jujucharms.com/omnibus/v3 migration-agent-wait-time:15m0s model-logfile-max-backups:2 model-logfile-max-size:10M model-logs-size:20M mongo-memory-profile:default non-synced-writes-to-raft-log:false prune-txn-query-count:1000 prune-txn-sleep-time:10ms set-numa-control-policy:false state-port:37017] APIInfo:0xc0002c2a80 ControllerTag:controller-26eaaedd-2d0c-4e4a-8f77-44891c106097 ControllerName:osm JujuVersion:2.9.43 DataDir:/var/lib/juju LogDir:/var/log/juju MetricsSpoolDir:/var/lib/juju/metricspool ControllerId:0 AgentEnvironment:map[PROVIDER_TYPE:kubernetes]}
08:46:41 INFO  cmd bootstrap.go:395 Creating k8s resources for controller "controller-osm"
08:46:41 DEBUG juju.kubernetes.provider bootstrap.go:628 creating controller service: 
&Service{ObjectMeta:{controller-service  controller-osm    0 0001-01-01 00:00:00 +0000 UTC <nil> <nil> map[app.kubernetes.io/managed-by:juju app.kubernetes.io/name:controller] map[controller.juju.is/id:26eaaedd-2d0c-4e4a-8f77-44891c106097] [] []  []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:api-server,Protocol:,Port:17070,TargetPort:{0 17070 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{app.kubernetes.io/name: controller,},ClusterIP:,Type:LoadBalancer,ExternalIPs:[],SessionAffinity:,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:nil,ClusterIPs:[],IPFamilies:[],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}
08:46:43 DEBUG juju.kubernetes.provider configmap.go:84 updating configmap "controller-configmap"
08:46:43 DEBUG juju.kubernetes.provider configmap.go:84 updating configmap "controller-configmap"
08:46:44 DEBUG juju.kubernetes.provider bootstrap.go:1210 mongodb container args:
printf 'args="--dbpath=/var/lib/juju/db --sslPEMKeyFile=/var/lib/juju/server.pem --sslPEMKeyPassword=ignored --sslMode=requireSSL --port=37017 --journal --replSet=juju --quiet --oplogSize=1024 --auth --keyFile=/var/lib/juju/shared-secret --storageEngine=wiredTiger --bind_ip_all"\nipv6Disabled=$(sysctl net.ipv6.conf.all.disable_ipv6 -n)\nif [ $ipv6Disabled -eq 0 ]; then\n  args="${args} --ipv6"\nfi\nwhile [ ! -f "/var/lib/juju/server.pem" ]; do\n  echo "Waiting for /var/lib/juju/server.pem to be created..."\n  sleep 1\ndone\nexec mongod ${args}\n'>/root/mongo.sh && chmod a+x /root/mongo.sh && /root/mongo.sh
08:46:44 DEBUG juju.kubernetes.provider k8s.go:2021 selecting units "app.kubernetes.io/name=controller" to watch
08:46:44 DEBUG juju.kubernetes.provider.watcher k8swatcher.go:114 fire notify watcher for controller-0
08:46:45 DEBUG juju.kubernetes.provider.watcher k8swatcher.go:114 fire notify watcher for controller
08:46:54 DEBUG juju.kubernetes.provider.watcher k8swatcher.go:114 fire notify watcher for controller
08:46:54 DEBUG juju.kubernetes.provider bootstrap.go:959 Successfully assigned controller-osm/controller-0 to osmtest202311210839
08:46:54 DEBUG juju.kubernetes.provider bootstrap.go:959 Downloading images
08:46:54 INFO  cmd bootstrap.go:961 Downloading images
08:46:54 DEBUG juju.kubernetes.provider.watcher k8swatcher.go:114 fire notify watcher for controller-0
08:47:01 DEBUG juju.kubernetes.provider.watcher k8swatcher.go:114 fire notify watcher for controller-0
08:47:01 DEBUG juju.kubernetes.provider bootstrap.go:959 Pulled images
08:47:01 DEBUG juju.kubernetes.provider bootstrap.go:959 Created container mongodb
08:47:01 DEBUG juju.kubernetes.provider bootstrap.go:959 Started mongodb container
08:47:10 DEBUG juju.kubernetes.provider.watcher k8swatcher.go:114 fire notify watcher for controller-0
08:47:10 DEBUG juju.kubernetes.provider bootstrap.go:959 Created container api-server
08:47:10 DEBUG juju.kubernetes.provider bootstrap.go:959 Started controller container
08:47:10 DEBUG juju.kubernetes.provider.watcher k8swatcher.go:114 fire notify watcher for controller
08:47:10 INFO  cmd bootstrap.go:1047 Starting controller pod
08:47:10 INFO  cmd bootstrap.go:715 Bootstrap agent now started
08:47:10 INFO  juju.juju api.go:340 API endpoints changed from [] to [172.21.23.5:17070]
08:47:10 INFO  cmd controller.go:88 Contacting Juju controller at 172.21.23.5 to verify accessibility...
08:47:10 INFO  juju.juju api.go:86 connecting to API addresses: [172.21.23.5:17070]
08:47:14 INFO  cmd controller.go:141 Still waiting for API to become available: unable to connect to API: dial tcp 172.21.23.5:17070: connect: connection refused
08:47:17 INFO  juju.juju api.go:86 connecting to API addresses: [172.21.23.5:17070]
08:47:17 DEBUG juju.api apiclient.go:1151 successfully dialed "wss://172.21.23.5:17070/model/1d5f2b8f-6da1-4a5c-8d8d-fc88d5b7e213/api"
08:47:17 INFO  juju.api apiclient.go:686 connection established to "wss://172.21.23.5:17070/model/1d5f2b8f-6da1-4a5c-8d8d-fc88d5b7e213/api"
08:47:18 DEBUG juju.api monitor.go:35 RPC connection died
08:47:18 INFO  cmd controller.go:108 
Bootstrap complete, controller "osm" is now available in namespace "controller-osm"
08:47:18 INFO  cmd bootstrap.go:597 
Now you can run
	juju add-model <model-name>
to create a new model to deploy k8s workloads.
08:47:18 INFO  cmd supercommand.go:544 command finished
-----
To start your first container, try: lxc launch ubuntu:22.04
Or for a virtual machine: lxc launch ubuntu:22.04 --vm

Cloud "lxd-cloud" added to controller "osm".
WARNING loading credentials: credentials for cloud lxd-cloud not found
To upload a credential to the controller for cloud "lxd-cloud", use 
* 'add-model' with --credential option or
* 'add-credential -c lxd-cloud'.
Using cloud "lxd-cloud" from the controller to verify credentials.
Controller credential "lxd-cloud" for user "admin" for cloud "lxd-cloud" on controller "osm" added.
For more information, see ‘juju show-credential lxd-cloud lxd-cloud’.

Checking required packages: iptables-persistent
    Not installed.
Installing iptables-persistent requires root privileges
Reading package lists...
Building dependency tree...
Reading state information...
The following additional packages will be installed:
  netfilter-persistent
The following NEW packages will be installed:
  iptables-persistent netfilter-persistent
0 upgraded, 2 newly installed, 0 to remove and 4 not upgraded.
Need to get 13.9 kB of archives.
After this operation, 93.2 kB of additional disk space will be used.
Get:1 http://azure.archive.ubuntu.com/ubuntu jammy/universe amd64 netfilter-persistent all 1.0.16 [7440 B]
Get:2 http://azure.archive.ubuntu.com/ubuntu jammy/universe amd64 iptables-persistent all 1.0.16 [6488 B]
debconf: unable to initialize frontend: Dialog
debconf: (Dialog frontend will not work on a dumb terminal, an emacs shell buffer, or without a controlling terminal.)
debconf: falling back to frontend: Readline
debconf: unable to initialize frontend: Readline
debconf: (This frontend requires a controlling tty.)
debconf: falling back to frontend: Teletype
dpkg-preconfigure: unable to re-open stdin: 
Fetched 13.9 kB in 0s (490 kB/s)
Selecting previously unselected package netfilter-persistent.
(Reading database ... 62658 files and directories currently installed.)
Preparing to unpack .../netfilter-persistent_1.0.16_all.deb ...
Unpacking netfilter-persistent (1.0.16) ...
Selecting previously unselected package iptables-persistent.
Preparing to unpack .../iptables-persistent_1.0.16_all.deb ...
Unpacking iptables-persistent (1.0.16) ...
Setting up netfilter-persistent (1.0.16) ...
Created symlink /etc/systemd/system/multi-user.target.wants/netfilter-persistent.service → /lib/systemd/system/netfilter-persistent.service.
Setting up iptables-persistent (1.0.16) ...
update-alternatives: using /lib/systemd/system/netfilter-persistent.service to provide /lib/systemd/system/iptables.service (iptables.service) in auto mode
debconf: unable to initialize frontend: Dialog
debconf: (Dialog frontend will not work on a dumb terminal, an emacs shell buffer, or without a controlling terminal.)
debconf: falling back to frontend: Readline
Processing triggers for man-db (2.10.2-1) ...

Running kernel seems to be up-to-date.

No services need to be restarted.

No containers need to be restarted.

No user sessions are running outdated binaries.

No VM guests are running outdated hypervisor (qemu) binaries on this host.
iptables v1.8.7 (nf_tables): option "--to-destination" requires an argument
Try `iptables -h' or 'iptables --help' for more information.
iptables v1.8.7 (nf_tables): option "--to-destination" requires an argument
Try `iptables -h' or 'iptables --help' for more information.
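The two errors above indicate a DNAT rule was assembled with an empty `--to-destination` value (typically an unset shell variable in the installer script); a well-formed rule carries a target address, e.g. (address and ports are illustrative only, root required):

```shell
# DNAT needs an explicit target after --to-destination; an empty value makes
# iptables abort exactly as logged above. Hypothetical example rule:
sudo iptables -t nat -A PREROUTING -p tcp --dport 17070 \
  -j DNAT --to-destination 172.21.23.5:17070
```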
run-parts: executing /usr/share/netfilter-persistent/plugins.d/15-ip4tables save
run-parts: executing /usr/share/netfilter-persistent/plugins.d/25-ip6tables save
Track juju juju_controller_ok: https://osm.etsi.org/InstallLog.php?&installation_id=1700556187-NchhxCXLVFgga2VP&local_ts=1700556454&event=juju&operation=juju_controller_ok&value=&comment=&tags=
Track juju juju_ok: https://osm.etsi.org/InstallLog.php?&installation_id=1700556187-NchhxCXLVFgga2VP&local_ts=1700556454&event=juju&operation=juju_ok&value=&comment=&tags=
Track docker_images docker_images_ok: https://osm.etsi.org/InstallLog.php?&installation_id=1700556187-NchhxCXLVFgga2VP&local_ts=1700556455&event=docker_images&operation=docker_images_ok&value=&comment=&tags=
DEBUG_INSTALL=
OSM_DEVOPS=/usr/share/osm-devops
OSM_DOCKER_TAG=testing-daily
OSM_HELM_WORK_DIR=/etc/osm/helm
"bitnami" has been added to your repositories
Hang tight while we grab the latest from your chart repositories...
...Successfully got an update from the "metallb" chart repository
...Successfully got an update from the "openebs" chart repository
...Successfully got an update from the "jetstack" chart repository
...Successfully got an update from the "bitnami" chart repository
...Successfully got an update from the "stable" chart repository
Update Complete. ⎈Happy Helming!⎈
Release "mongodb-k8s" does not exist. Installing it now.
NAME: mongodb-k8s
LAST DEPLOYED: Tue Nov 21 08:47:38 2023
NAMESPACE: osm
STATUS: deployed
REVISION: 1
TEST SUITE: None
NOTES:
CHART NAME: mongodb
CHART VERSION: 13.9.4
APP VERSION: 6.0.5

** Please be patient while the chart is being deployed **

MongoDB® can be accessed on the following DNS name(s) and ports from within your cluster:

    mongodb-k8s-0.mongodb-k8s-headless.osm.svc.cluster.local:27017
    mongodb-k8s-1.mongodb-k8s-headless.osm.svc.cluster.local:27017

To connect to your database, create a MongoDB® client container:

    kubectl run --namespace osm mongodb-k8s-client --rm --tty -i --restart='Never' --env="MONGODB_ROOT_PASSWORD=$MONGODB_ROOT_PASSWORD" --image docker.io/bitnami/mongodb:6.0.5-debian-11-r4 --command -- bash

Then, run the following command:
    mongosh admin --host "mongodb-k8s-0.mongodb-k8s-headless.osm.svc.cluster.local:27017,mongodb-k8s-1.mongodb-k8s-headless.osm.svc.cluster.local:27017"
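The client command above references `$MONGODB_ROOT_PASSWORD` without showing where it comes from; for Bitnami MongoDB charts it normally lives in a release-named Secret (the secret and key names below follow that convention and should be verified with `kubectl get secret -n osm`):

```shell
# Fetch the MongoDB root password from the chart's Secret (cluster access
# required; secret/key names assume the usual Bitnami layout).
export MONGODB_ROOT_PASSWORD=$(kubectl get secret --namespace osm mongodb-k8s \
  -o jsonpath="{.data.mongodb-root-password}" | base64 -d)
```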
Track deploy_osm deploy_mongodb_ok: https://osm.etsi.org/InstallLog.php?&installation_id=1700556187-NchhxCXLVFgga2VP&local_ts=1700556459&event=deploy_osm&operation=deploy_mongodb_ok&value=&comment=&tags=
helm -n osm install osm /usr/share/osm-devops/installers/helm/osm -f /etc/osm/helm/osm-values.yaml  --set global.image.repositoryBase=opensourcemano --set mysql.dbHostPath=/var/lib/osm/osm --set vca.host=172.21.23.5 --set vca.secret=ffbc367bd0d177fa0d55700106f8ef93 --set vca.cacert=LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUVFekNDQW51Z0F3SUJBZ0lWQU1CSnI5WFFDMWRxR2VYNkYrK0tVclV5cUZCWU1BMEdDU3FHU0liM0RRRUIKQ3dVQU1DRXhEVEFMQmdOVkJBb1RCRXAxYW5VeEVEQU9CZ05WQkFNVEIycDFhblV0WTJFd0hoY05Nak14TVRJeApNRGcwTVRReFdoY05Nek14TVRJeE1EZzBOalF4V2pBaE1RMHdDd1lEVlFRS0V3UktkV3AxTVJBd0RnWURWUVFECkV3ZHFkV3AxTFdOaE1JSUJvakFOQmdrcWhraUc5dzBCQVFFRkFBT0NBWThBTUlJQmlnS0NBWUVBMnhkenAvSkEKOXpYRmNjeXRRZnc3UGN5VlRmR0lQenNITCt6UzVvK2xxWHFHNjFaVzlVdEdJSkNYbko5ZFM4ZVN2TS9rVEx5VwpTeFRMTDhZcFcwSnp3MmtUUnY3T0g1S3NlTkVTcms3UUErandJRHJodCs1K1lRb080UFhidytqZ1JEZVRDNys2CjRiZ3V0dG9uREJyK25xSENCL1hlaXFRYitjQ2d6anBoaUxIZWpvNXBhMFNaTzJiMG9xTHBmeEExbm1xOEhibGMKZUs1YlFaNjhkUnJtVWtRN0JRT2ZLRG9jMmtSQTZTVnJZQVhyMlhhaUhWOVhDOTFkOE9odWdBTlJORXE1OE9WSAorKzh0NFBvckdHSEhPUnlGcHc0K1BRaEFPVFY5MjNuZ0l0MHFTbFIxN3hNdU4yZEQrUWZHS3lUNUdPNnE5ZWdCCmxKWUtWVkROOEpickFRcVovTWFnSkliZXdRY1pZNVl0M0RieHJSWjM0Z1NNY2JVcGVnQm9vQ0tZdVI2TXVNRm4KNmdrbkpFS1B1WDhWR1p1VythQ2RicUJWN3NsU2R5dmNaSGF5YmthUmtacm1WYjhVY2xYdVRCaFU2dGE1ZFd5egpnYlNJbkdGNHNia0FxejZBcFdtbGxob3FzcXpCSXo1MmY3ZlFacXkzVjdXN3J3ZHpYTkhRZ1JaRkFnTUJBQUdqClFqQkFNQTRHQTFVZER3RUIvd1FFQXdJQ3BEQVBCZ05WSFJNQkFmOEVCVEFEQVFIL01CMEdBMVVkRGdRV0JCUXkKRVNVMytRcUMzTERkNSszNlgxeE52Z21mSFRBTkJna3Foa2lHOXcwQkFRc0ZBQU9DQVlFQXZQVXBxalk5cDFObAp3dHJjNEpLWWM4K0piSGxrbFdOM3lzYXdvVmFlejV3WFpYNFAxc2RjTmhNZVVLU0lRZm01RzQ2MnE5cXpnd1JsCnRSV1JEY3VYRHVuaTRnSDNrM3FZMGs5eDIxQUpmalpHNTE3WHpNVUVPTEdSeHh6VzNQdUErSjVOSWJHMVdwRSsKZTN1b1lzSndNbzhqb3BXcXVDTHdVVjdMUk9wZzJUVDZ1ZHBGc09wZGFEb1BmVm1Od085YVdpUUN2RGNodk5kWQpEdm1pTzZNL0gvTFNoeXEvbjQwUVhxdlRtbSsxZFg2NXkxSWxJOEtrMUg2NFpLS3Z0QVpYSGd1MHpvbS9hczdYCnBVbEljR1Z1RGJSSHhRelkvdXRGSEhXUWFEaHh0bXl0ZVJVaE5JU0NGbnE4aVNCbktHWVpqTzhFRVkvajZDbTEKWUUrNWpjNGVZTE5Zam4zWkEybFBVTFBWT
DVhK1NKOTkxV3ZVNldTR2RZMEt5bkZRWXJjMEc1S3dOZ0xlQmdSUQpiTXE5dzl0cXNId1dGRzZ0TWFLeURBYmNDamhpS25YRTlkNXlNb3c0ZlhGVmJSWXBERTZWRS9VZ1BrWGJVdlcvCkVWcmI5T2FMQ2lsOUJnVExaTGtYZzY2QkRmb0cvQVdGcXVLcTgyUWFxZjF4eU9rYjErWTUKLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQoK
NAME: osm
LAST DEPLOYED: Tue Nov 21 08:47:39 2023
NAMESPACE: osm
STATUS: deployed
REVISION: 1
TEST SUITE: None
NOTES:
1. Get the application URL by running these commands:
  export NODE_PORT=$(kubectl get --namespace osm -o jsonpath="{.spec.ports[0].nodePort}" services nbi)
  export NODE_IP=$(kubectl get nodes --namespace osm -o jsonpath="{.items[0].status.addresses[0].address}")
  echo http://$NODE_IP:$NODE_PORT
USER-SUPPLIED VALUES:
global:
  image:
    repositoryBase: opensourcemano
mysql:
  dbHostPath: /var/lib/osm/osm
vca:
  cacert: LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUVFekNDQW51Z0F3SUJBZ0lWQU1CSnI5WFFDMWRxR2VYNkYrK0tVclV5cUZCWU1BMEdDU3FHU0liM0RRRUIKQ3dVQU1DRXhEVEFMQmdOVkJBb1RCRXAxYW5VeEVEQU9CZ05WQkFNVEIycDFhblV0WTJFd0hoY05Nak14TVRJeApNRGcwTVRReFdoY05Nek14TVRJeE1EZzBOalF4V2pBaE1RMHdDd1lEVlFRS0V3UktkV3AxTVJBd0RnWURWUVFECkV3ZHFkV3AxTFdOaE1JSUJvakFOQmdrcWhraUc5dzBCQVFFRkFBT0NBWThBTUlJQmlnS0NBWUVBMnhkenAvSkEKOXpYRmNjeXRRZnc3UGN5VlRmR0lQenNITCt6UzVvK2xxWHFHNjFaVzlVdEdJSkNYbko5ZFM4ZVN2TS9rVEx5VwpTeFRMTDhZcFcwSnp3MmtUUnY3T0g1S3NlTkVTcms3UUErandJRHJodCs1K1lRb080UFhidytqZ1JEZVRDNys2CjRiZ3V0dG9uREJyK25xSENCL1hlaXFRYitjQ2d6anBoaUxIZWpvNXBhMFNaTzJiMG9xTHBmeEExbm1xOEhibGMKZUs1YlFaNjhkUnJtVWtRN0JRT2ZLRG9jMmtSQTZTVnJZQVhyMlhhaUhWOVhDOTFkOE9odWdBTlJORXE1OE9WSAorKzh0NFBvckdHSEhPUnlGcHc0K1BRaEFPVFY5MjNuZ0l0MHFTbFIxN3hNdU4yZEQrUWZHS3lUNUdPNnE5ZWdCCmxKWUtWVkROOEpickFRcVovTWFnSkliZXdRY1pZNVl0M0RieHJSWjM0Z1NNY2JVcGVnQm9vQ0tZdVI2TXVNRm4KNmdrbkpFS1B1WDhWR1p1VythQ2RicUJWN3NsU2R5dmNaSGF5YmthUmtacm1WYjhVY2xYdVRCaFU2dGE1ZFd5egpnYlNJbkdGNHNia0FxejZBcFdtbGxob3FzcXpCSXo1MmY3ZlFacXkzVjdXN3J3ZHpYTkhRZ1JaRkFnTUJBQUdqClFqQkFNQTRHQTFVZER3RUIvd1FFQXdJQ3BEQVBCZ05WSFJNQkFmOEVCVEFEQVFIL01CMEdBMVVkRGdRV0JCUXkKRVNVMytRcUMzTERkNSszNlgxeE52Z21mSFRBTkJna3Foa2lHOXcwQkFRc0ZBQU9DQVlFQXZQVXBxalk5cDFObAp3dHJjNEpLWWM4K0piSGxrbFdOM3lzYXdvVmFlejV3WFpYNFAxc2RjTmhNZVVLU0lRZm01RzQ2MnE5cXpnd1JsCnRSV1JEY3VYRHVuaTRnSDNrM3FZMGs5eDIxQUpmalpHNTE3WHpNVUVPTEdSeHh6VzNQdUErSjVOSWJHMVdwRSsKZTN1b1lzSndNbzhqb3BXcXVDTHdVVjdMUk9wZzJUVDZ1ZHBGc09wZGFEb1BmVm1Od085YVdpUUN2RGNodk5kWQpEdm1pTzZNL0gvTFNoeXEvbjQwUVhxdlRtbSsxZFg2NXkxSWxJOEtrMUg2NFpLS3Z0QVpYSGd1MHpvbS9hczdYCnBVbEljR1Z1RGJSSHhRelkvdXRGSEhXUWFEaHh0bXl0ZVJVaE5JU0NGbnE4aVNCbktHWVpqTzhFRVkvajZDbTEKWUUrNWpjNGVZTE5Zam4zWkEybFBVTFBWTDVhK1NKOTkxV3ZVNldTR2RZMEt5bkZRWXJjMEc1S3dOZ0xlQmdSUQpiTXE5dzl0cXNId1dGRzZ0TWFLeURBYmNDamhpS25YRTlkNXlNb3c0ZlhGVmJSWXBERTZWRS9VZ1BrWGJVdlcvCkVWcmI5T2FMQ2lsOUJnVExaTGtYZzY2QkRmb0cvQVdGcXVLcTgyUWFxZjF4eU9rYjErWTUKLS0tLS1FTkQgQ0VSVElGSUNBVEUtLS0tLQoK
  host: 172.21.23.5
  pubkey: ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDvpDiTwlyo92AsDuy7FPekyFKdjuALJ70ygAl9NIirfhH78MTvZGuD76JagTLaPYKGg35er6ttAZ+rUYTjPQ/ns7gMypCn3fHAf5TEzoBRZoJx/7ZXoLZqKGnnuvlF2181SV2smjCJ6uzr769FZCiz70kFYJAh8xtPqCbXnX2odIFZQAgOprzJ+bGgqcJ79NkBA7xVdiDqRiN1Zi/ZpuId1B2MXw0c7wK/3A7WNSZ4cLN+W1iWRR+XbOfj59qnSpbcrb0LAHGcPg21BdI9dCaxeTQNZwFlAtS3/2iAYeICn475D2BS0GSqAE7vyBy8MzzOqcbBup+cA+wxYdx1BV8J
    juju-client-key
  secret: ffbc367bd0d177fa0d55700106f8ef93
Track deploy_osm deploy_osm_services_k8s_ok: https://osm.etsi.org/InstallLog.php?&installation_id=1700556187-NchhxCXLVFgga2VP&local_ts=1700556461&event=deploy_osm&operation=deploy_osm_services_k8s_ok&value=&comment=&tags=
DEBUG_INSTALL=
OSM_DEVOPS=/usr/share/osm-devops
OSM_DOCKER_TAG=testing-daily
OSM_HELM_WORK_DIR=/etc/osm/helm
Updating Helm values file helm/values/airflow-values.yaml to use defaultAirflowTag: testing-daily
Updating Helm values file helm/values/airflow-values.yaml to use defaultAirflowRepository: opensourcemano/airflow
"apache-airflow" has been added to your repositories
Hang tight while we grab the latest from your chart repositories...
...Successfully got an update from the "metallb" chart repository
...Successfully got an update from the "apache-airflow" chart repository
...Successfully got an update from the "openebs" chart repository
...Successfully got an update from the "jetstack" chart repository
...Successfully got an update from the "stable" chart repository
...Successfully got an update from the "bitnami" chart repository
Update Complete. ⎈Happy Helming!⎈
Release "airflow" does not exist. Installing it now.
NAME: airflow
LAST DEPLOYED: Tue Nov 21 08:47:44 2023
NAMESPACE: osm
STATUS: deployed
REVISION: 1
TEST SUITE: None
NOTES:
Thank you for installing Apache Airflow 2.5.3!

Your release is named airflow.
You can now access your dashboard(s) by executing the following command(s) and visiting the corresponding port at localhost in your browser:

Airflow Webserver:     kubectl port-forward svc/airflow-webserver 8080:8080 --namespace osm
Default Webserver (Airflow UI) Login credentials:
    username: admin
    password: admin
Default Postgres connection credentials:
    username: postgres
    password: postgres
    port: 5432

You can get Fernet Key value by running the following:

    echo Fernet Key: $(kubectl get secret --namespace osm airflow-fernet-key -o jsonpath="{.data.fernet-key}" | base64 --decode)
Track deploy_osm airflow_ok: https://osm.etsi.org/InstallLog.php?&installation_id=1700556187-NchhxCXLVFgga2VP&local_ts=1700556702&event=deploy_osm&operation=airflow_ok&value=&comment=&tags=
"prometheus-community" has been added to your repositories
Hang tight while we grab the latest from your chart repositories...
...Successfully got an update from the "metallb" chart repository
...Successfully got an update from the "apache-airflow" chart repository
...Successfully got an update from the "openebs" chart repository
...Successfully got an update from the "jetstack" chart repository
...Successfully got an update from the "prometheus-community" chart repository
...Successfully got an update from the "bitnami" chart repository
...Successfully got an update from the "stable" chart repository
Update Complete. ⎈Happy Helming!⎈
Release "pushgateway" does not exist. Installing it now.
NAME: pushgateway
LAST DEPLOYED: Tue Nov 21 08:51:46 2023
NAMESPACE: osm
STATUS: deployed
REVISION: 1
TEST SUITE: None
NOTES:
1. Get the application URL by running these commands:
  export POD_NAME=$(kubectl get pods --namespace osm -l "app=prometheus-pushgateway,release=pushgateway" -o jsonpath="{.items[0].metadata.name}")
  echo "Visit http://127.0.0.1:8080 to use your application"
  kubectl port-forward $POD_NAME 8080:80
Track deploy_osm pushgateway_ok: https://osm.etsi.org/InstallLog.php?&installation_id=1700556187-NchhxCXLVFgga2VP&local_ts=1700556707&event=deploy_osm&operation=pushgateway_ok&value=&comment=&tags=
"prometheus-community" already exists with the same configuration, skipping
Hang tight while we grab the latest from your chart repositories...
...Successfully got an update from the "metallb" chart repository
...Successfully got an update from the "apache-airflow" chart repository
...Successfully got an update from the "openebs" chart repository
...Successfully got an update from the "jetstack" chart repository
...Successfully got an update from the "prometheus-community" chart repository
...Successfully got an update from the "bitnami" chart repository
...Successfully got an update from the "stable" chart repository
Update Complete. ⎈Happy Helming!⎈
Release "alertmanager" does not exist. Installing it now.
NAME: alertmanager
LAST DEPLOYED: Tue Nov 21 08:51:51 2023
NAMESPACE: osm
STATUS: deployed
REVISION: 1
NOTES:
1. Get the application URL by running these commands:
  export NODE_PORT=$(kubectl get --namespace osm -o jsonpath="{.spec.ports[0].nodePort}" services alertmanager)
  export NODE_IP=$(kubectl get nodes --namespace osm -o jsonpath="{.items[0].status.addresses[0].address}")
  echo http://$NODE_IP:$NODE_PORT
Track deploy_osm alertmanager_ok: https://osm.etsi.org/InstallLog.php?&installation_id=1700556187-NchhxCXLVFgga2VP&local_ts=1700556711&event=deploy_osm&operation=alertmanager_ok&value=&comment=&tags=
Track deploy_osm install_osm_ngsa_ok: https://osm.etsi.org/InstallLog.php?&installation_id=1700556187-NchhxCXLVFgga2VP&local_ts=1700556711&event=deploy_osm&operation=install_osm_ngsa_ok&value=&comment=&tags=
OK
Hit:1 http://azure.archive.ubuntu.com/ubuntu jammy InRelease
Hit:2 http://azure.archive.ubuntu.com/ubuntu jammy-updates InRelease
Hit:3 http://azure.archive.ubuntu.com/ubuntu jammy-backports InRelease
Hit:4 http://azure.archive.ubuntu.com/ubuntu jammy-security InRelease
Hit:5 https://download.docker.com/linux/ubuntu jammy InRelease
Hit:6 https://osm-download.etsi.org/repository/osm/debian/testing-daily testing InRelease
Hit:7 https://packages.cloud.google.com/apt kubernetes-xenial InRelease
Get:8 https://osm-download.etsi.org/repository/osm/debian/testing-daily testing/osmclient amd64 Packages [483 B]
Get:9 https://osm-download.etsi.org/repository/osm/debian/testing-daily testing/IM amd64 Packages [859 B]
Fetched 1342 B in 1s (1030 B/s)
Reading package lists...
W: https://osm-download.etsi.org/repository/osm/debian/testing-daily/dists/testing/InRelease: Key is stored in legacy trusted.gpg keyring (/etc/apt/trusted.gpg), see the DEPRECATION section in apt-key(8) for details.
W: Conflicting distribution: https://osm-download.etsi.org/repository/osm/debian/testing-daily testing InRelease (expected testing but got )
Repository: 'deb [arch=amd64] https://osm-download.etsi.org/repository/osm/debian/testing-daily testing osmclient IM'
Description:
Archive for codename: testing components: osmclient,IM
More info: https://osm-download.etsi.org/repository/osm/debian/testing-daily
Adding repository.
Adding deb entry to /etc/apt/sources.list.d/archive_uri-https_osm-download_etsi_org_repository_osm_debian_testing-daily-jammy.list
Adding disabled deb-src entry to /etc/apt/sources.list.d/archive_uri-https_osm-download_etsi_org_repository_osm_debian_testing-daily-jammy.list
Hit:1 http://azure.archive.ubuntu.com/ubuntu jammy InRelease
Hit:2 http://azure.archive.ubuntu.com/ubuntu jammy-updates InRelease
Hit:3 http://azure.archive.ubuntu.com/ubuntu jammy-backports InRelease
Hit:4 http://azure.archive.ubuntu.com/ubuntu jammy-security InRelease
Hit:5 https://download.docker.com/linux/ubuntu jammy InRelease
Hit:6 https://osm-download.etsi.org/repository/osm/debian/testing-daily testing InRelease
Get:7 https://packages.cloud.google.com/apt kubernetes-xenial InRelease [8993 B]
Fetched 8993 B in 1s (6994 B/s)
Reading package lists...
W: https://osm-download.etsi.org/repository/osm/debian/testing-daily/dists/testing/InRelease: Key is stored in legacy trusted.gpg keyring (/etc/apt/trusted.gpg), see the DEPRECATION section in apt-key(8) for details.
W: Conflicting distribution: https://osm-download.etsi.org/repository/osm/debian/testing-daily testing InRelease (expected testing but got )
Reading package lists...
Building dependency tree...
Reading state information...
The following additional packages will be installed:
  build-essential bzip2 cpp cpp-11 dpkg-dev fakeroot fontconfig-config
  fonts-dejavu-core g++ g++-11 gcc gcc-11 gcc-11-base javascript-common
  libalgorithm-diff-perl libalgorithm-diff-xs-perl libalgorithm-merge-perl
  libasan6 libatomic1 libc-dev-bin libc-devtools libc6-dev libcc1-0
  libcrypt-dev libdeflate0 libdpkg-perl libexpat1-dev libfakeroot
  libfile-fcntllock-perl libfontconfig1 libgcc-11-dev libgd3 libgomp1 libisl23
  libitm1 libjbig0 libjpeg-turbo8 libjpeg8 libjs-jquery libjs-sphinxdoc
  libjs-underscore liblsan0 libmpc3 libnsl-dev libpython3-dev
  libpython3.10-dev libquadmath0 libstdc++-11-dev libtiff5 libtirpc-dev
  libtsan0 libubsan1 libwebp7 libxpm4 linux-libc-dev lto-disabled-list make
  manpages-dev python3-dev python3-wheel python3.10-dev rpcsvc-proto
  zlib1g-dev
Suggested packages:
  bzip2-doc cpp-doc gcc-11-locales debian-keyring g++-multilib g++-11-multilib
  gcc-11-doc gcc-multilib autoconf automake libtool flex bison gdb gcc-doc
  gcc-11-multilib apache2 | lighttpd | httpd glibc-doc bzr libgd-tools
  libstdc++-11-doc make-doc
The following NEW packages will be installed:
  build-essential bzip2 cpp cpp-11 dpkg-dev fakeroot fontconfig-config
  fonts-dejavu-core g++ g++-11 gcc gcc-11 gcc-11-base javascript-common
  libalgorithm-diff-perl libalgorithm-diff-xs-perl libalgorithm-merge-perl
  libasan6 libatomic1 libc-dev-bin libc-devtools libc6-dev libcc1-0
  libcrypt-dev libdeflate0 libdpkg-perl libexpat1-dev libfakeroot
  libfile-fcntllock-perl libfontconfig1 libgcc-11-dev libgd3 libgomp1 libisl23
  libitm1 libjbig0 libjpeg-turbo8 libjpeg8 libjs-jquery libjs-sphinxdoc
  libjs-underscore liblsan0 libmpc3 libnsl-dev libpython3-dev
  libpython3.10-dev libquadmath0 libstdc++-11-dev libtiff5 libtirpc-dev
  libtsan0 libubsan1 libwebp7 libxpm4 linux-libc-dev lto-disabled-list make
  manpages-dev python3-dev python3-pip python3-wheel python3.10-dev
  rpcsvc-proto zlib1g-dev
0 upgraded, 64 newly installed, 0 to remove and 4 not upgraded.
Need to get 71.3 MB of archives.
After this operation, 239 MB of additional disk space will be used.
Get:1 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 libc-dev-bin amd64 2.35-0ubuntu3.4 [20.3 kB]
Get:2 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 linux-libc-dev amd64 5.15.0-89.99 [1338 kB]
Get:3 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 libcrypt-dev amd64 1:4.4.27-1 [112 kB]
Get:4 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 rpcsvc-proto amd64 1.4.2-0ubuntu6 [68.5 kB]
Get:5 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 libtirpc-dev amd64 1.3.2-2ubuntu0.1 [192 kB]
Get:6 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 libnsl-dev amd64 1.3.0-2build2 [71.3 kB]
Get:7 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 libc6-dev amd64 2.35-0ubuntu3.4 [2100 kB]
Get:8 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 gcc-11-base amd64 11.4.0-1ubuntu1~22.04 [20.2 kB]
Get:9 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 libisl23 amd64 0.24-2build1 [727 kB]
Get:10 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 libmpc3 amd64 1.2.1-2build1 [46.9 kB]
Get:11 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 cpp-11 amd64 11.4.0-1ubuntu1~22.04 [10.0 MB]
Get:12 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 cpp amd64 4:11.2.0-1ubuntu1 [27.7 kB]
Get:13 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 libcc1-0 amd64 12.3.0-1ubuntu1~22.04 [48.3 kB]
Get:14 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 libgomp1 amd64 12.3.0-1ubuntu1~22.04 [126 kB]
Get:15 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 libitm1 amd64 12.3.0-1ubuntu1~22.04 [30.2 kB]
Get:16 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 libatomic1 amd64 12.3.0-1ubuntu1~22.04 [10.4 kB]
Get:17 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 libasan6 amd64 11.4.0-1ubuntu1~22.04 [2282 kB]
Get:18 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 liblsan0 amd64 12.3.0-1ubuntu1~22.04 [1069 kB]
Get:19 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 libtsan0 amd64 11.4.0-1ubuntu1~22.04 [2260 kB]
Get:20 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 libubsan1 amd64 12.3.0-1ubuntu1~22.04 [976 kB]
Get:21 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 libquadmath0 amd64 12.3.0-1ubuntu1~22.04 [154 kB]
Get:22 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 libgcc-11-dev amd64 11.4.0-1ubuntu1~22.04 [2517 kB]
Get:23 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 gcc-11 amd64 11.4.0-1ubuntu1~22.04 [20.1 MB]
Get:24 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 gcc amd64 4:11.2.0-1ubuntu1 [5112 B]
Get:25 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 libstdc++-11-dev amd64 11.4.0-1ubuntu1~22.04 [2101 kB]
Get:26 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 g++-11 amd64 11.4.0-1ubuntu1~22.04 [11.4 MB]
Get:27 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 g++ amd64 4:11.2.0-1ubuntu1 [1412 B]
Get:28 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 make amd64 4.3-4.1build1 [180 kB]
Get:29 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 libdpkg-perl all 1.21.1ubuntu2.2 [237 kB]
Get:30 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 bzip2 amd64 1.0.8-5build1 [34.8 kB]
Get:31 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 lto-disabled-list all 24 [12.5 kB]
Get:32 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 dpkg-dev all 1.21.1ubuntu2.2 [922 kB]
Get:33 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 build-essential amd64 12.9ubuntu3 [4744 B]
Get:34 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 libfakeroot amd64 1.28-1ubuntu1 [31.5 kB]
Get:35 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 fakeroot amd64 1.28-1ubuntu1 [60.4 kB]
Get:36 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 fonts-dejavu-core all 2.37-2build1 [1041 kB]
Get:37 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 fontconfig-config all 2.13.1-4.2ubuntu5 [29.1 kB]
Get:38 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 javascript-common all 11+nmu1 [5936 B]
Get:39 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 libalgorithm-diff-perl all 1.201-1 [41.8 kB]
Get:40 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 libalgorithm-diff-xs-perl amd64 0.04-6build3 [11.9 kB]
Get:41 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 libalgorithm-merge-perl all 0.08-3 [12.0 kB]
Get:42 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 libfontconfig1 amd64 2.13.1-4.2ubuntu5 [131 kB]
Get:43 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 libjpeg-turbo8 amd64 2.1.2-0ubuntu1 [134 kB]
Get:44 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 libjpeg8 amd64 8c-2ubuntu10 [2264 B]
Get:45 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 libdeflate0 amd64 1.10-2 [70.9 kB]
Get:46 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 libjbig0 amd64 2.1-3.1ubuntu0.22.04.1 [29.2 kB]
Get:47 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 libwebp7 amd64 1.2.2-2ubuntu0.22.04.2 [206 kB]
Get:48 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 libtiff5 amd64 4.3.0-6ubuntu0.6 [183 kB]
Get:49 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 libxpm4 amd64 1:3.5.12-1ubuntu0.22.04.2 [36.7 kB]
Get:50 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 libgd3 amd64 2.3.0-2ubuntu2 [129 kB]
Get:51 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 libc-devtools amd64 2.35-0ubuntu3.4 [28.9 kB]
Get:52 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 libexpat1-dev amd64 2.4.7-1ubuntu0.2 [147 kB]
Get:53 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 libfile-fcntllock-perl amd64 0.22-3build7 [33.9 kB]
Get:54 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 libjs-jquery all 3.6.0+dfsg+~3.5.13-1 [321 kB]
Get:55 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 libjs-underscore all 1.13.2~dfsg-2 [118 kB]
Get:56 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 libjs-sphinxdoc all 4.3.2-1 [139 kB]
Get:57 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 zlib1g-dev amd64 1:1.2.11.dfsg-2ubuntu9.2 [164 kB]
Get:58 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 libpython3.10-dev amd64 3.10.12-1~22.04.2 [4764 kB]
Get:59 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 libpython3-dev amd64 3.10.6-1~22.04 [7166 B]
Get:60 http://azure.archive.ubuntu.com/ubuntu jammy/main amd64 manpages-dev all 5.10-1ubuntu1 [2309 kB]
Get:61 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3.10-dev amd64 3.10.12-1~22.04.2 [507 kB]
Get:62 http://azure.archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-dev amd64 3.10.6-1~22.04 [26.0 kB]
Get:63 http://azure.archive.ubuntu.com/ubuntu jammy-updates/universe amd64 python3-wheel all 0.37.1-2ubuntu0.22.04.1 [32.0 kB]
Get:64 http://azure.archive.ubuntu.com/ubuntu jammy-updates/universe amd64 python3-pip all 22.0.2+dfsg-1ubuntu0.4 [1305 kB]
Fetched 71.3 MB in 1s (67.5 MB/s)
Selecting previously unselected package libc-dev-bin.
(Reading database ... 62680 files and directories currently installed.)
Preparing to unpack .../00-libc-dev-bin_2.35-0ubuntu3.4_amd64.deb ...
Unpacking libc-dev-bin (2.35-0ubuntu3.4) ...
Selecting previously unselected package linux-libc-dev:amd64.
Preparing to unpack .../01-linux-libc-dev_5.15.0-89.99_amd64.deb ...
Unpacking linux-libc-dev:amd64 (5.15.0-89.99) ...
Selecting previously unselected package libcrypt-dev:amd64.
Preparing to unpack .../02-libcrypt-dev_1%3a4.4.27-1_amd64.deb ...
Unpacking libcrypt-dev:amd64 (1:4.4.27-1) ...
Selecting previously unselected package rpcsvc-proto.
Preparing to unpack .../03-rpcsvc-proto_1.4.2-0ubuntu6_amd64.deb ...
Unpacking rpcsvc-proto (1.4.2-0ubuntu6) ...
Selecting previously unselected package libtirpc-dev:amd64.
Preparing to unpack .../04-libtirpc-dev_1.3.2-2ubuntu0.1_amd64.deb ...
Unpacking libtirpc-dev:amd64 (1.3.2-2ubuntu0.1) ...
Selecting previously unselected package libnsl-dev:amd64.
Preparing to unpack .../05-libnsl-dev_1.3.0-2build2_amd64.deb ...
Unpacking libnsl-dev:amd64 (1.3.0-2build2) ...
Selecting previously unselected package libc6-dev:amd64.
Preparing to unpack .../06-libc6-dev_2.35-0ubuntu3.4_amd64.deb ...
Unpacking libc6-dev:amd64 (2.35-0ubuntu3.4) ...
Selecting previously unselected package gcc-11-base:amd64.
Preparing to unpack .../07-gcc-11-base_11.4.0-1ubuntu1~22.04_amd64.deb ...
Unpacking gcc-11-base:amd64 (11.4.0-1ubuntu1~22.04) ...
Selecting previously unselected package libisl23:amd64.
Preparing to unpack .../08-libisl23_0.24-2build1_amd64.deb ...
Unpacking libisl23:amd64 (0.24-2build1) ...
Selecting previously unselected package libmpc3:amd64.
Preparing to unpack .../09-libmpc3_1.2.1-2build1_amd64.deb ...
Unpacking libmpc3:amd64 (1.2.1-2build1) ...
Selecting previously unselected package cpp-11.
Preparing to unpack .../10-cpp-11_11.4.0-1ubuntu1~22.04_amd64.deb ...
Unpacking cpp-11 (11.4.0-1ubuntu1~22.04) ...
Selecting previously unselected package cpp.
Preparing to unpack .../11-cpp_4%3a11.2.0-1ubuntu1_amd64.deb ...
Unpacking cpp (4:11.2.0-1ubuntu1) ...
Selecting previously unselected package libcc1-0:amd64.
Preparing to unpack .../12-libcc1-0_12.3.0-1ubuntu1~22.04_amd64.deb ...
Unpacking libcc1-0:amd64 (12.3.0-1ubuntu1~22.04) ...
Selecting previously unselected package libgomp1:amd64.
Preparing to unpack .../13-libgomp1_12.3.0-1ubuntu1~22.04_amd64.deb ...
Unpacking libgomp1:amd64 (12.3.0-1ubuntu1~22.04) ...
Selecting previously unselected package libitm1:amd64.
Preparing to unpack .../14-libitm1_12.3.0-1ubuntu1~22.04_amd64.deb ...
Unpacking libitm1:amd64 (12.3.0-1ubuntu1~22.04) ...
Selecting previously unselected package libatomic1:amd64.
Preparing to unpack .../15-libatomic1_12.3.0-1ubuntu1~22.04_amd64.deb ...
Unpacking libatomic1:amd64 (12.3.0-1ubuntu1~22.04) ...
Selecting previously unselected package libasan6:amd64.
Preparing to unpack .../16-libasan6_11.4.0-1ubuntu1~22.04_amd64.deb ...
Unpacking libasan6:amd64 (11.4.0-1ubuntu1~22.04) ...
Selecting previously unselected package liblsan0:amd64.
Preparing to unpack .../17-liblsan0_12.3.0-1ubuntu1~22.04_amd64.deb ...
Unpacking liblsan0:amd64 (12.3.0-1ubuntu1~22.04) ...
Selecting previously unselected package libtsan0:amd64.
Preparing to unpack .../18-libtsan0_11.4.0-1ubuntu1~22.04_amd64.deb ...
Unpacking libtsan0:amd64 (11.4.0-1ubuntu1~22.04) ...
Selecting previously unselected package libubsan1:amd64.
Preparing to unpack .../19-libubsan1_12.3.0-1ubuntu1~22.04_amd64.deb ...
Unpacking libubsan1:amd64 (12.3.0-1ubuntu1~22.04) ...
Selecting previously unselected package libquadmath0:amd64.
Preparing to unpack .../20-libquadmath0_12.3.0-1ubuntu1~22.04_amd64.deb ...
Unpacking libquadmath0:amd64 (12.3.0-1ubuntu1~22.04) ...
Selecting previously unselected package libgcc-11-dev:amd64.
Preparing to unpack .../21-libgcc-11-dev_11.4.0-1ubuntu1~22.04_amd64.deb ...
Unpacking libgcc-11-dev:amd64 (11.4.0-1ubuntu1~22.04) ...
Selecting previously unselected package gcc-11.
Preparing to unpack .../22-gcc-11_11.4.0-1ubuntu1~22.04_amd64.deb ...
Unpacking gcc-11 (11.4.0-1ubuntu1~22.04) ...
Selecting previously unselected package gcc.
Preparing to unpack .../23-gcc_4%3a11.2.0-1ubuntu1_amd64.deb ...
Unpacking gcc (4:11.2.0-1ubuntu1) ...
Selecting previously unselected package libstdc++-11-dev:amd64.
Preparing to unpack .../24-libstdc++-11-dev_11.4.0-1ubuntu1~22.04_amd64.deb ...
Unpacking libstdc++-11-dev:amd64 (11.4.0-1ubuntu1~22.04) ...
Selecting previously unselected package g++-11.
Preparing to unpack .../25-g++-11_11.4.0-1ubuntu1~22.04_amd64.deb ...
Unpacking g++-11 (11.4.0-1ubuntu1~22.04) ...
Selecting previously unselected package g++.
Preparing to unpack .../26-g++_4%3a11.2.0-1ubuntu1_amd64.deb ...
Unpacking g++ (4:11.2.0-1ubuntu1) ...
Selecting previously unselected package make.
Preparing to unpack .../27-make_4.3-4.1build1_amd64.deb ...
Unpacking make (4.3-4.1build1) ...
Selecting previously unselected package libdpkg-perl.
Preparing to unpack .../28-libdpkg-perl_1.21.1ubuntu2.2_all.deb ...
Unpacking libdpkg-perl (1.21.1ubuntu2.2) ...
Selecting previously unselected package bzip2.
Preparing to unpack .../29-bzip2_1.0.8-5build1_amd64.deb ...
Unpacking bzip2 (1.0.8-5build1) ...
Selecting previously unselected package lto-disabled-list.
Preparing to unpack .../30-lto-disabled-list_24_all.deb ...
Unpacking lto-disabled-list (24) ...
Selecting previously unselected package dpkg-dev.
Preparing to unpack .../31-dpkg-dev_1.21.1ubuntu2.2_all.deb ...
Unpacking dpkg-dev (1.21.1ubuntu2.2) ...
Selecting previously unselected package build-essential.
Preparing to unpack .../32-build-essential_12.9ubuntu3_amd64.deb ...
Unpacking build-essential (12.9ubuntu3) ...
Selecting previously unselected package libfakeroot:amd64.
Preparing to unpack .../33-libfakeroot_1.28-1ubuntu1_amd64.deb ...
Unpacking libfakeroot:amd64 (1.28-1ubuntu1) ...
Selecting previously unselected package fakeroot.
Preparing to unpack .../34-fakeroot_1.28-1ubuntu1_amd64.deb ...
Unpacking fakeroot (1.28-1ubuntu1) ...
Selecting previously unselected package fonts-dejavu-core.
Preparing to unpack .../35-fonts-dejavu-core_2.37-2build1_all.deb ...
Unpacking fonts-dejavu-core (2.37-2build1) ...
Selecting previously unselected package fontconfig-config.
Preparing to unpack .../36-fontconfig-config_2.13.1-4.2ubuntu5_all.deb ...
Unpacking fontconfig-config (2.13.1-4.2ubuntu5) ...
Selecting previously unselected package javascript-common.
Preparing to unpack .../37-javascript-common_11+nmu1_all.deb ...
Unpacking javascript-common (11+nmu1) ...
Selecting previously unselected package libalgorithm-diff-perl.
Preparing to unpack .../38-libalgorithm-diff-perl_1.201-1_all.deb ...
Unpacking libalgorithm-diff-perl (1.201-1) ...
Selecting previously unselected package libalgorithm-diff-xs-perl.
Preparing to unpack .../39-libalgorithm-diff-xs-perl_0.04-6build3_amd64.deb ...
Unpacking libalgorithm-diff-xs-perl (0.04-6build3) ...
Selecting previously unselected package libalgorithm-merge-perl.
Preparing to unpack .../40-libalgorithm-merge-perl_0.08-3_all.deb ...
Unpacking libalgorithm-merge-perl (0.08-3) ...
Selecting previously unselected package libfontconfig1:amd64.
Preparing to unpack .../41-libfontconfig1_2.13.1-4.2ubuntu5_amd64.deb ...
Unpacking libfontconfig1:amd64 (2.13.1-4.2ubuntu5) ...
Selecting previously unselected package libjpeg-turbo8:amd64.
Preparing to unpack .../42-libjpeg-turbo8_2.1.2-0ubuntu1_amd64.deb ...
Unpacking libjpeg-turbo8:amd64 (2.1.2-0ubuntu1) ...
Selecting previously unselected package libjpeg8:amd64.
Preparing to unpack .../43-libjpeg8_8c-2ubuntu10_amd64.deb ...
Unpacking libjpeg8:amd64 (8c-2ubuntu10) ...
Selecting previously unselected package libdeflate0:amd64.
Preparing to unpack .../44-libdeflate0_1.10-2_amd64.deb ...
Unpacking libdeflate0:amd64 (1.10-2) ...
Selecting previously unselected package libjbig0:amd64.
Preparing to unpack .../45-libjbig0_2.1-3.1ubuntu0.22.04.1_amd64.deb ...
Unpacking libjbig0:amd64 (2.1-3.1ubuntu0.22.04.1) ...
Selecting previously unselected package libwebp7:amd64.
Preparing to unpack .../46-libwebp7_1.2.2-2ubuntu0.22.04.2_amd64.deb ...
Unpacking libwebp7:amd64 (1.2.2-2ubuntu0.22.04.2) ...
Selecting previously unselected package libtiff5:amd64.
Preparing to unpack .../47-libtiff5_4.3.0-6ubuntu0.6_amd64.deb ...
Unpacking libtiff5:amd64 (4.3.0-6ubuntu0.6) ...
Selecting previously unselected package libxpm4:amd64.
Preparing to unpack .../48-libxpm4_1%3a3.5.12-1ubuntu0.22.04.2_amd64.deb ...
Unpacking libxpm4:amd64 (1:3.5.12-1ubuntu0.22.04.2) ...
Selecting previously unselected package libgd3:amd64.
Preparing to unpack .../49-libgd3_2.3.0-2ubuntu2_amd64.deb ...
Unpacking libgd3:amd64 (2.3.0-2ubuntu2) ...
Selecting previously unselected package libc-devtools.
Preparing to unpack .../50-libc-devtools_2.35-0ubuntu3.4_amd64.deb ...
Unpacking libc-devtools (2.35-0ubuntu3.4) ...
Selecting previously unselected package libexpat1-dev:amd64.
Preparing to unpack .../51-libexpat1-dev_2.4.7-1ubuntu0.2_amd64.deb ...
Unpacking libexpat1-dev:amd64 (2.4.7-1ubuntu0.2) ...
Selecting previously unselected package libfile-fcntllock-perl.
Preparing to unpack .../52-libfile-fcntllock-perl_0.22-3build7_amd64.deb ...
Unpacking libfile-fcntllock-perl (0.22-3build7) ...
Selecting previously unselected package libjs-jquery.
Preparing to unpack .../53-libjs-jquery_3.6.0+dfsg+~3.5.13-1_all.deb ...
Unpacking libjs-jquery (3.6.0+dfsg+~3.5.13-1) ...
Selecting previously unselected package libjs-underscore.
Preparing to unpack .../54-libjs-underscore_1.13.2~dfsg-2_all.deb ...
Unpacking libjs-underscore (1.13.2~dfsg-2) ...
Selecting previously unselected package libjs-sphinxdoc.
Preparing to unpack .../55-libjs-sphinxdoc_4.3.2-1_all.deb ...
Unpacking libjs-sphinxdoc (4.3.2-1) ...
Selecting previously unselected package zlib1g-dev:amd64.
Preparing to unpack .../56-zlib1g-dev_1%3a1.2.11.dfsg-2ubuntu9.2_amd64.deb ...
Unpacking zlib1g-dev:amd64 (1:1.2.11.dfsg-2ubuntu9.2) ...
Selecting previously unselected package libpython3.10-dev:amd64.
Preparing to unpack .../57-libpython3.10-dev_3.10.12-1~22.04.2_amd64.deb ...
Unpacking libpython3.10-dev:amd64 (3.10.12-1~22.04.2) ...
Selecting previously unselected package libpython3-dev:amd64.
Preparing to unpack .../58-libpython3-dev_3.10.6-1~22.04_amd64.deb ...
Unpacking libpython3-dev:amd64 (3.10.6-1~22.04) ...
Selecting previously unselected package manpages-dev.
Preparing to unpack .../59-manpages-dev_5.10-1ubuntu1_all.deb ...
Unpacking manpages-dev (5.10-1ubuntu1) ...
Selecting previously unselected package python3.10-dev.
Preparing to unpack .../60-python3.10-dev_3.10.12-1~22.04.2_amd64.deb ...
Unpacking python3.10-dev (3.10.12-1~22.04.2) ...
Selecting previously unselected package python3-dev.
Preparing to unpack .../61-python3-dev_3.10.6-1~22.04_amd64.deb ...
Unpacking python3-dev (3.10.6-1~22.04) ...
Selecting previously unselected package python3-wheel.
Preparing to unpack .../62-python3-wheel_0.37.1-2ubuntu0.22.04.1_all.deb ...
Unpacking python3-wheel (0.37.1-2ubuntu0.22.04.1) ...
Selecting previously unselected package python3-pip.
Preparing to unpack .../63-python3-pip_22.0.2+dfsg-1ubuntu0.4_all.deb ...
Unpacking python3-pip (22.0.2+dfsg-1ubuntu0.4) ...
Setting up javascript-common (11+nmu1) ...
Setting up gcc-11-base:amd64 (11.4.0-1ubuntu1~22.04) ...
Setting up manpages-dev (5.10-1ubuntu1) ...
Setting up lto-disabled-list (24) ...
Setting up libxpm4:amd64 (1:3.5.12-1ubuntu0.22.04.2) ...
Setting up libfile-fcntllock-perl (0.22-3build7) ...
Setting up libalgorithm-diff-perl (1.201-1) ...
Setting up libdeflate0:amd64 (1.10-2) ...
Setting up linux-libc-dev:amd64 (5.15.0-89.99) ...
Setting up libgomp1:amd64 (12.3.0-1ubuntu1~22.04) ...
Setting up bzip2 (1.0.8-5build1) ...
Setting up python3-wheel (0.37.1-2ubuntu0.22.04.1) ...
Setting up libjbig0:amd64 (2.1-3.1ubuntu0.22.04.1) ...
Setting up libfakeroot:amd64 (1.28-1ubuntu1) ...
Setting up libasan6:amd64 (11.4.0-1ubuntu1~22.04) ...
Setting up fakeroot (1.28-1ubuntu1) ...
update-alternatives: using /usr/bin/fakeroot-sysv to provide /usr/bin/fakeroot (fakeroot) in auto mode
Setting up libtirpc-dev:amd64 (1.3.2-2ubuntu0.1) ...
Setting up rpcsvc-proto (1.4.2-0ubuntu6) ...
Setting up make (4.3-4.1build1) ...
Setting up libquadmath0:amd64 (12.3.0-1ubuntu1~22.04) ...
Setting up libmpc3:amd64 (1.2.1-2build1) ...
Setting up libatomic1:amd64 (12.3.0-1ubuntu1~22.04) ...
Setting up fonts-dejavu-core (2.37-2build1) ...
Setting up python3-pip (22.0.2+dfsg-1ubuntu0.4) ...
Setting up libjpeg-turbo8:amd64 (2.1.2-0ubuntu1) ...
Setting up libdpkg-perl (1.21.1ubuntu2.2) ...
Setting up libwebp7:amd64 (1.2.2-2ubuntu0.22.04.2) ...
Setting up libubsan1:amd64 (12.3.0-1ubuntu1~22.04) ...
Setting up libnsl-dev:amd64 (1.3.0-2build2) ...
Setting up libcrypt-dev:amd64 (1:4.4.27-1) ...
Setting up libjs-jquery (3.6.0+dfsg+~3.5.13-1) ...
Setting up libisl23:amd64 (0.24-2build1) ...
Setting up libc-dev-bin (2.35-0ubuntu3.4) ...
Setting up libalgorithm-diff-xs-perl (0.04-6build3) ...
Setting up libcc1-0:amd64 (12.3.0-1ubuntu1~22.04) ...
Setting up liblsan0:amd64 (12.3.0-1ubuntu1~22.04) ...
Setting up libitm1:amd64 (12.3.0-1ubuntu1~22.04) ...
Setting up libjs-underscore (1.13.2~dfsg-2) ...
Setting up libalgorithm-merge-perl (0.08-3) ...
Setting up libtsan0:amd64 (11.4.0-1ubuntu1~22.04) ...
Setting up libjpeg8:amd64 (8c-2ubuntu10) ...
Setting up cpp-11 (11.4.0-1ubuntu1~22.04) ...
Setting up fontconfig-config (2.13.1-4.2ubuntu5) ...
Setting up dpkg-dev (1.21.1ubuntu2.2) ...
Setting up libjs-sphinxdoc (4.3.2-1) ...
Setting up libgcc-11-dev:amd64 (11.4.0-1ubuntu1~22.04) ...
Setting up gcc-11 (11.4.0-1ubuntu1~22.04) ...
Setting up cpp (4:11.2.0-1ubuntu1) ...
Setting up libc6-dev:amd64 (2.35-0ubuntu3.4) ...
Setting up libtiff5:amd64 (4.3.0-6ubuntu0.6) ...
Setting up libfontconfig1:amd64 (2.13.1-4.2ubuntu5) ...
Setting up gcc (4:11.2.0-1ubuntu1) ...
Setting up libexpat1-dev:amd64 (2.4.7-1ubuntu0.2) ...
Setting up libgd3:amd64 (2.3.0-2ubuntu2) ...
Setting up libstdc++-11-dev:amd64 (11.4.0-1ubuntu1~22.04) ...
Setting up zlib1g-dev:amd64 (1:1.2.11.dfsg-2ubuntu9.2) ...
Setting up libc-devtools (2.35-0ubuntu3.4) ...
Setting up g++-11 (11.4.0-1ubuntu1~22.04) ...
Setting up libpython3.10-dev:amd64 (3.10.12-1~22.04.2) ...
Setting up python3.10-dev (3.10.12-1~22.04.2) ...
Setting up g++ (4:11.2.0-1ubuntu1) ...
update-alternatives: using /usr/bin/g++ to provide /usr/bin/c++ (c++) in auto mode
Setting up build-essential (12.9ubuntu3) ...
Setting up libpython3-dev:amd64 (3.10.6-1~22.04) ...
Setting up python3-dev (3.10.6-1~22.04) ...
Processing triggers for man-db (2.10.2-1) ...
Processing triggers for libc-bin (2.35-0ubuntu3.4) ...
NEEDRESTART-VER: 3.5
NEEDRESTART-KCUR: 6.2.0-1016-azure
NEEDRESTART-KEXP: 6.2.0-1016-azure
NEEDRESTART-KSTA: 1
Requirement already satisfied: pip in /usr/lib/python3/dist-packages (22.0.2)
Collecting pip
  Downloading pip-23.3.1-py3-none-any.whl (2.1 MB)
Installing collected packages: pip
  Attempting uninstall: pip
    Found existing installation: pip 22.0.2
    Not uninstalling pip at /usr/lib/python3/dist-packages, outside environment /usr
    Can't uninstall 'pip'. No files were found to uninstall.
Successfully installed pip-23.3.1
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
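The warning above can be avoided as pip suggests. A minimal sketch of the virtual-environment approach it recommends (the `~/osm-venv` path is an arbitrary example, not taken from this log):

```shell
# Sketch: isolate pip installs from the system package manager in a venv.
# The ~/osm-venv location is an assumption for illustration only.
python3 -m venv "$HOME/osm-venv"     # create an isolated environment
. "$HOME/osm-venv/bin/activate"      # activate it in the current shell
command -v python                    # now resolves to the venv interpreter
```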
Reading package lists...
Building dependency tree...
Reading state information...
The following NEW packages will be installed:
  python3-osm-im python3-osmclient
0 upgraded, 2 newly installed, 0 to remove and 4 not upgraded.
Need to get 244 kB of archives.
After this operation, 8383 kB of additional disk space will be used.
Get:1 https://osm-download.etsi.org/repository/osm/debian/testing-daily testing/IM amd64 python3-osm-im all 13.0.0.post18+gef549b0-1 [175 kB]
Get:2 https://osm-download.etsi.org/repository/osm/debian/testing-daily testing/osmclient amd64 python3-osmclient all 11.0.0rc1.post56+g34ef701-1 [68.2 kB]
Fetched 244 kB in 0s (1017 kB/s)
Selecting previously unselected package python3-osm-im.
(Reading database ... 69467 files and directories currently installed.)
Preparing to unpack .../python3-osm-im_13.0.0.post18+gef549b0-1_all.deb ...
Unpacking python3-osm-im (13.0.0.post18+gef549b0-1) ...
Selecting previously unselected package python3-osmclient.
Preparing to unpack .../python3-osmclient_11.0.0rc1.post56+g34ef701-1_all.deb ...
Unpacking python3-osmclient (11.0.0rc1.post56+g34ef701-1) ...
Setting up python3-osmclient (11.0.0rc1.post56+g34ef701-1) ...
Setting up python3-osm-im (13.0.0.post18+gef549b0-1) ...
NEEDRESTART-VER: 3.5
NEEDRESTART-KCUR: 6.2.0-1016-azure
NEEDRESTART-KEXP: 6.2.0-1016-azure
NEEDRESTART-KSTA: 1
Defaulting to user installation because normal site-packages is not writeable
Collecting enum34==1.1.10 (from -r /usr/lib/python3/dist-packages/osm_im/requirements.txt (line 17))
  Downloading enum34-1.1.10-py3-none-any.whl (11 kB)
Collecting lxml==4.9.3 (from -r /usr/lib/python3/dist-packages/osm_im/requirements.txt (line 19))
  Downloading lxml-4.9.3-cp310-cp310-manylinux_2_28_x86_64.whl.metadata (3.8 kB)
Collecting pyang==2.5.3 (from -r /usr/lib/python3/dist-packages/osm_im/requirements.txt (line 23))
  Downloading pyang-2.5.3-py2.py3-none-any.whl (592 kB)
Collecting pyangbind==0.8.3.post1 (from -r /usr/lib/python3/dist-packages/osm_im/requirements.txt (line 27))
  Downloading pyangbind-0.8.3.post1-py3-none-any.whl.metadata (4.2 kB)
Collecting pyyaml==6.0.1 (from -r /usr/lib/python3/dist-packages/osm_im/requirements.txt (line 29))
  Downloading PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (2.1 kB)
Collecting regex==2023.8.8 (from -r /usr/lib/python3/dist-packages/osm_im/requirements.txt (line 31))
  Downloading regex-2023.8.8-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (40 kB)
Requirement already satisfied: six==1.16.0 in /usr/lib/python3/dist-packages (from -r /usr/lib/python3/dist-packages/osm_im/requirements.txt (line 33)) (1.16.0)
Downloading lxml-4.9.3-cp310-cp310-manylinux_2_28_x86_64.whl (7.9 MB)
Downloading pyangbind-0.8.3.post1-py3-none-any.whl (51 kB)
Downloading PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (705 kB)
Downloading regex-2023.8.8-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (771 kB)
Installing collected packages: enum34, regex, pyyaml, lxml, pyang, pyangbind
Successfully installed enum34-1.1.10 lxml-4.9.3 pyang-2.5.3 pyangbind-0.8.3.post1 pyyaml-6.0.1 regex-2023.8.8
Reading package lists...
Building dependency tree...
Reading state information...
libmagic1 is already the newest version (1:5.41-3ubuntu0.1).
libmagic1 set to manually installed.
0 upgraded, 0 newly installed, 0 to remove and 4 not upgraded.
Defaulting to user installation because normal site-packages is not writeable
Collecting certifi==2023.7.22 (from -r /usr/lib/python3/dist-packages/osmclient/requirements.txt (line 17))
  Downloading certifi-2023.7.22-py3-none-any.whl.metadata (2.2 kB)
Collecting charset-normalizer==3.2.0 (from -r /usr/lib/python3/dist-packages/osmclient/requirements.txt (line 19))
  Downloading charset_normalizer-3.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (31 kB)
Collecting click==8.1.7 (from -r /usr/lib/python3/dist-packages/osmclient/requirements.txt (line 21))
  Downloading click-8.1.7-py3-none-any.whl.metadata (3.0 kB)
Collecting idna==3.4 (from -r /usr/lib/python3/dist-packages/osmclient/requirements.txt (line 23))
  Downloading idna-3.4-py3-none-any.whl (61 kB)
Collecting jinja2==3.1.2 (from -r /usr/lib/python3/dist-packages/osmclient/requirements.txt (line 25))
  Downloading Jinja2-3.1.2-py3-none-any.whl (133 kB)
Collecting jsonpath-ng==1.6.0 (from -r /usr/lib/python3/dist-packages/osmclient/requirements.txt (line 27))
  Downloading jsonpath_ng-1.6.0-py3-none-any.whl.metadata (17 kB)
Collecting markupsafe==2.1.3 (from -r /usr/lib/python3/dist-packages/osmclient/requirements.txt (line 29))
  Downloading MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (3.0 kB)
Collecting packaging==23.1 (from -r /usr/lib/python3/dist-packages/osmclient/requirements.txt (line 31))
  Downloading packaging-23.1-py3-none-any.whl (48 kB)
Collecting ply==3.11 (from -r /usr/lib/python3/dist-packages/osmclient/requirements.txt (line 33))
  Downloading ply-3.11-py2.py3-none-any.whl (49 kB)
Collecting prettytable==3.9.0 (from -r /usr/lib/python3/dist-packages/osmclient/requirements.txt (line 35))
  Downloading prettytable-3.9.0-py3-none-any.whl.metadata (26 kB)
Collecting python-magic==0.4.27 (from -r /usr/lib/python3/dist-packages/osmclient/requirements.txt (line 37))
  Downloading python_magic-0.4.27-py2.py3-none-any.whl (13 kB)
Requirement already satisfied: pyyaml==6.0.1 in ./.local/lib/python3.10/site-packages (from -r /usr/lib/python3/dist-packages/osmclient/requirements.txt (line 39)) (6.0.1)
Collecting requests==2.31.0 (from -r /usr/lib/python3/dist-packages/osmclient/requirements.txt (line 41))
  Downloading requests-2.31.0-py3-none-any.whl.metadata (4.6 kB)
Collecting urllib3==2.0.5 (from -r /usr/lib/python3/dist-packages/osmclient/requirements.txt (line 43))
  Downloading urllib3-2.0.5-py3-none-any.whl.metadata (6.6 kB)
Collecting verboselogs==1.7 (from -r /usr/lib/python3/dist-packages/osmclient/requirements.txt (line 45))
  Downloading verboselogs-1.7-py2.py3-none-any.whl (11 kB)
Collecting wcwidth==0.2.6 (from -r /usr/lib/python3/dist-packages/osmclient/requirements.txt (line 47))
  Downloading wcwidth-0.2.6-py2.py3-none-any.whl (29 kB)
Downloading certifi-2023.7.22-py3-none-any.whl (158 kB)
Downloading charset_normalizer-3.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (201 kB)
Downloading click-8.1.7-py3-none-any.whl (97 kB)
Downloading jsonpath_ng-1.6.0-py3-none-any.whl (29 kB)
Downloading MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (25 kB)
Downloading prettytable-3.9.0-py3-none-any.whl (27 kB)
Downloading requests-2.31.0-py3-none-any.whl (62 kB)
Downloading urllib3-2.0.5-py3-none-any.whl (123 kB)
Installing collected packages: wcwidth, verboselogs, ply, urllib3, python-magic, prettytable, packaging, markupsafe, jsonpath-ng, idna, click, charset-normalizer, certifi, requests, jinja2
  WARNING: The script jsonpath_ng is installed in '/home/ubuntu/.local/bin' which is not on PATH.
  Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
  WARNING: The script normalizer is installed in '/home/ubuntu/.local/bin' which is not on PATH.
  Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
Successfully installed certifi-2023.7.22 charset-normalizer-3.2.0 click-8.1.7 idna-3.4 jinja2-3.1.2 jsonpath-ng-1.6.0 markupsafe-2.1.3 packaging-23.1 ply-3.11 prettytable-3.9.0 python-magic-0.4.27 requests-2.31.0 urllib3-2.0.5 verboselogs-1.7 wcwidth-0.2.6
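The PATH warnings above (for the `jsonpath_ng` and `normalizer` scripts) can be resolved as pip suggests. A minimal sketch:

```shell
# Sketch: prepend pip's per-user script directory to PATH so entry points
# such as jsonpath_ng and normalizer resolve without an absolute path.
export PATH="$HOME/.local/bin:$PATH"
echo "$PATH" | cut -d: -f1           # first PATH entry is now ~/.local/bin
```

To make this permanent, the same `export` line would typically go into `~/.bashrc` or `~/.profile`.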

OSM client installed
The OSM client assumes that the OSM host is running on localhost (127.0.0.1).
To interact with a different OSM host, set this environment variable in your .bashrc file:
     export OSM_HOSTNAME=<OSM_host>
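A sketch of that configuration step, with the variable also persisted to `.bashrc` (192.0.2.10 is a documentation placeholder address, not a host from this install):

```shell
# Sketch: point osmclient at a remote OSM host and persist the setting.
# 192.0.2.10 is an assumed placeholder; substitute your real OSM host.
export OSM_HOSTNAME=192.0.2.10
grep -q '^export OSM_HOSTNAME=' ~/.bashrc 2>/dev/null || \
  echo "export OSM_HOSTNAME=${OSM_HOSTNAME}" >> ~/.bashrc
```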
Track osmclient osmclient_ok: https://osm.etsi.org/InstallLog.php?&installation_id=1700556187-NchhxCXLVFgga2VP&local_ts=1700556761&event=osmclient&operation=osmclient_ok&value=&comment=&tags=
Checking OSM health state...
helm -n osm list
NAME        	NAMESPACE	REVISION	UPDATED                                	STATUS  	CHART                        	APP VERSION
airflow     	osm      	1       	2023-11-21 08:47:44.793923648 +0000 UTC	deployed	airflow-1.9.0                	2.5.3      
alertmanager	osm      	1       	2023-11-21 08:51:51.539308328 +0000 UTC	deployed	alertmanager-0.22.0          	v0.24.0    
mongodb-k8s 	osm      	1       	2023-11-21 08:47:38.688240733 +0000 UTC	deployed	mongodb-13.9.4               	6.0.5      
osm         	osm      	1       	2023-11-21 08:47:39.277890243 +0000 UTC	deployed	osm-0.0.1                    	14         
pushgateway 	osm      	1       	2023-11-21 08:51:46.822442539 +0000 UTC	deployed	prometheus-pushgateway-1.18.2	1.4.2      
helm -n osm status osm
NAME: osm
LAST DEPLOYED: Tue Nov 21 08:47:39 2023
NAMESPACE: osm
STATUS: deployed
REVISION: 1
TEST SUITE: None
NOTES:
1. Get the application URL by running these commands:
  export NODE_PORT=$(kubectl get --namespace osm -o jsonpath="{.spec.ports[0].nodePort}" services nbi)
  export NODE_IP=$(kubectl get nodes --namespace osm -o jsonpath="{.items[0].status.addresses[0].address}")
  echo http://$NODE_IP:$NODE_PORT
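The chart NOTES above assemble the NBI URL from the node's address and the service's NodePort. A self-contained sketch of that assembly; with a live cluster the two `kubectl` commands shown in the NOTES would fill in the values, so the placeholders below are assumptions:

```shell
# Sketch of the URL assembly from the chart NOTES. Placeholder values stand in
# for the kubectl lookups so the snippet runs without a cluster.
NODE_PORT=9999        # e.g. from: kubectl get --namespace osm -o jsonpath="{.spec.ports[0].nodePort}" services nbi
NODE_IP=127.0.0.1     # e.g. from: kubectl get nodes --namespace osm -o jsonpath="{.items[0].status.addresses[0].address}"
echo "http://${NODE_IP}:${NODE_PORT}"
```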
===> Successful checks: 1/24
===> Successful checks: 24/24
SYSTEM IS READY
Track healthchecks after_healthcheck_ok: https://osm.etsi.org/InstallLog.php?&installation_id=1700556187-NchhxCXLVFgga2VP&local_ts=1700556885&event=healthchecks&operation=after_healthcheck_ok&value=&comment=&tags=
721852d8-521d-4627-8b22-0e3177c9bd53
c4a2e115-e6b4-41c5-b82a-c74eda37ee79
Track final_ops add_local_k8scluster_ok: https://osm.etsi.org/InstallLog.php?&installation_id=1700556187-NchhxCXLVFgga2VP&local_ts=1700556888&event=final_ops&operation=add_local_k8scluster_ok&value=&comment=&tags=
Fixing firewall so docker and LXD can share the same host without affecting each other.
# Generated by iptables-save v1.8.7 on Tue Nov 21 08:54:48 2023
*mangle
:PREROUTING ACCEPT [0:0]
:INPUT ACCEPT [0:0]
:FORWARD ACCEPT [0:0]
:OUTPUT ACCEPT [0:0]
:POSTROUTING ACCEPT [0:0]
:KUBE-IPTABLES-HINT - [0:0]
:KUBE-KUBELET-CANARY - [0:0]
:KUBE-PROXY-CANARY - [0:0]
COMMIT
# Completed on Tue Nov 21 08:54:48 2023
# Generated by iptables-save v1.8.7 on Tue Nov 21 08:54:48 2023
*security
:INPUT ACCEPT [0:0]
:FORWARD ACCEPT [0:0]
:OUTPUT ACCEPT [285131:119178192]
-A OUTPUT -d 168.63.129.16/32 -p tcp -m tcp --dport 53 -j ACCEPT
-A OUTPUT -d 168.63.129.16/32 -p tcp -m owner --uid-owner 0 -j ACCEPT
-A OUTPUT -d 168.63.129.16/32 -p tcp -m conntrack --ctstate INVALID,NEW -j DROP
COMMIT
# Completed on Tue Nov 21 08:54:48 2023
# Generated by iptables-save v1.8.7 on Tue Nov 21 08:54:48 2023
*filter
:INPUT ACCEPT [0:0]
:FORWARD ACCEPT [0:0]
:OUTPUT ACCEPT [0:0]
:DOCKER - [0:0]
:DOCKER-ISOLATION-STAGE-1 - [0:0]
:DOCKER-ISOLATION-STAGE-2 - [0:0]
:DOCKER-USER - [0:0]
:FLANNEL-FWD - [0:0]
:KUBE-EXTERNAL-SERVICES - [0:0]
:KUBE-FIREWALL - [0:0]
:KUBE-FORWARD - [0:0]
:KUBE-KUBELET-CANARY - [0:0]
:KUBE-NODEPORTS - [0:0]
:KUBE-PROXY-CANARY - [0:0]
:KUBE-PROXY-FIREWALL - [0:0]
:KUBE-SERVICES - [0:0]
-A INPUT -m conntrack --ctstate NEW -m comment --comment "kubernetes load balancer firewall" -j KUBE-PROXY-FIREWALL
-A INPUT -m comment --comment "kubernetes health check service ports" -j KUBE-NODEPORTS
-A INPUT -m conntrack --ctstate NEW -m comment --comment "kubernetes externally-visible service portals" -j KUBE-EXTERNAL-SERVICES
-A INPUT -j KUBE-FIREWALL
-A FORWARD -m conntrack --ctstate NEW -m comment --comment "kubernetes load balancer firewall" -j KUBE-PROXY-FIREWALL
-A FORWARD -m comment --comment "kubernetes forwarding rules" -j KUBE-FORWARD
-A FORWARD -m conntrack --ctstate NEW -m comment --comment "kubernetes service portals" -j KUBE-SERVICES
-A FORWARD -m conntrack --ctstate NEW -m comment --comment "kubernetes externally-visible service portals" -j KUBE-EXTERNAL-SERVICES
-A FORWARD -j DOCKER-USER
-A FORWARD -j DOCKER-ISOLATION-STAGE-1
-A FORWARD -o docker0 -m conntrack --ctstate RELATED,ESTABLISHED -j ACCEPT
-A FORWARD -o docker0 -j DOCKER
-A FORWARD -i docker0 ! -o docker0 -j ACCEPT
-A FORWARD -i docker0 -o docker0 -j ACCEPT
-A FORWARD -m comment --comment "flanneld forward" -j FLANNEL-FWD
-A OUTPUT -m conntrack --ctstate NEW -m comment --comment "kubernetes load balancer firewall" -j KUBE-PROXY-FIREWALL
-A OUTPUT -m conntrack --ctstate NEW -m comment --comment "kubernetes service portals" -j KUBE-SERVICES
-A OUTPUT -j KUBE-FIREWALL
-A DOCKER-ISOLATION-STAGE-1 -i docker0 ! -o docker0 -j DOCKER-ISOLATION-STAGE-2
-A DOCKER-ISOLATION-STAGE-1 -j RETURN
-A DOCKER-ISOLATION-STAGE-2 -o docker0 -j DROP
-A DOCKER-ISOLATION-STAGE-2 -j RETURN
-A DOCKER-USER -j ACCEPT
-A DOCKER-USER -j RETURN
-A FLANNEL-FWD -s 10.244.0.0/16 -m comment --comment "flanneld forward" -j ACCEPT
-A FLANNEL-FWD -d 10.244.0.0/16 -m comment --comment "flanneld forward" -j ACCEPT
-A KUBE-FIREWALL ! -s 127.0.0.0/8 -d 127.0.0.0/8 -m comment --comment "block incoming localnet connections" -m conntrack ! --ctstate RELATED,ESTABLISHED,DNAT -j DROP
-A KUBE-FIREWALL -m comment --comment "kubernetes firewall for dropping marked packets" -m mark --mark 0x8000/0x8000 -j DROP
-A KUBE-FORWARD -m conntrack --ctstate INVALID -j DROP
-A KUBE-FORWARD -m comment --comment "kubernetes forwarding rules" -m mark --mark 0x4000/0x4000 -j ACCEPT
-A KUBE-FORWARD -m comment --comment "kubernetes forwarding conntrack rule" -m conntrack --ctstate RELATED,ESTABLISHED -j ACCEPT
COMMIT
# Completed on Tue Nov 21 08:54:48 2023
# Generated by iptables-save v1.8.7 on Tue Nov 21 08:54:48 2023
*nat
:PREROUTING ACCEPT [0:0]
:INPUT ACCEPT [0:0]
:OUTPUT ACCEPT [0:0]
:POSTROUTING ACCEPT [0:0]
:DOCKER - [0:0]
:FLANNEL-POSTRTG - [0:0]
:KUBE-EXT-BF2LB5WJRBPLA42J - [0:0]
:KUBE-EXT-FQUXG555W5IVIWW3 - [0:0]
:KUBE-EXT-GZN4S7ND4PF6YXD6 - [0:0]
:KUBE-EXT-MD4PSIBW5SPF2DB3 - [0:0]
:KUBE-EXT-PQIZCPF63EFIBBJY - [0:0]
:KUBE-EXT-VTS3SY4BGCZLE3HK - [0:0]
:KUBE-EXT-XUD4OEZNIHB47KQL - [0:0]
:KUBE-EXT-YA74QX5VY2UAABIX - [0:0]
:KUBE-KUBELET-CANARY - [0:0]
:KUBE-MARK-DROP - [0:0]
:KUBE-MARK-MASQ - [0:0]
:KUBE-NODEPORTS - [0:0]
:KUBE-POSTROUTING - [0:0]
:KUBE-PROXY-CANARY - [0:0]
:KUBE-SEP-22JYILAGAWOEZ2V7 - [0:0]
:KUBE-SEP-2EKL53EI67NMIYCL - [0:0]
:KUBE-SEP-2KV723JD5URYWNXM - [0:0]
:KUBE-SEP-5D2WEBEDBSLEFQZO - [0:0]
:KUBE-SEP-5ME4XGOFV45WR7Q5 - [0:0]
:KUBE-SEP-EELVSR32NONIBZHH - [0:0]
:KUBE-SEP-FOB2KBXV2FRZH6LW - [0:0]
:KUBE-SEP-FYSEASATA4IFOBM5 - [0:0]
:KUBE-SEP-HJ7EWOW62IX6GL6R - [0:0]
:KUBE-SEP-HRQVWSF7IHI77K53 - [0:0]
:KUBE-SEP-ISHFB2I4PMRIVC6G - [0:0]
:KUBE-SEP-JFFC7E2KKG7F5OP4 - [0:0]
:KUBE-SEP-JIWKU7LWBAE46CYF - [0:0]
:KUBE-SEP-K5ETGAT6VOT2SO74 - [0:0]
:KUBE-SEP-MHWK5N6X4ER6N5YC - [0:0]
:KUBE-SEP-OP4AXEAS4OXHBEQX - [0:0]
:KUBE-SEP-PTRZXBEV4JI54VRO - [0:0]
:KUBE-SEP-PUHFDAMRBZWCPADU - [0:0]
:KUBE-SEP-QTEADDRAHUZGD5RY - [0:0]
:KUBE-SEP-R7EMXN5TTQQVP4UW - [0:0]
:KUBE-SEP-SDEOVHTVPO7QT7XS - [0:0]
:KUBE-SEP-SF3LG62VAE5ALYDV - [0:0]
:KUBE-SEP-WXWGHGKZOCNYRYI7 - [0:0]
:KUBE-SERVICES - [0:0]
:KUBE-SVC-5QOWUZVRO3UICSLI - [0:0]
:KUBE-SVC-BF2LB5WJRBPLA42J - [0:0]
:KUBE-SVC-ERIFXISQEP7F7OF4 - [0:0]
:KUBE-SVC-FQUXG555W5IVIWW3 - [0:0]
:KUBE-SVC-GZ25SP4UFGF7SAVL - [0:0]
:KUBE-SVC-GZN4S7ND4PF6YXD6 - [0:0]
:KUBE-SVC-JD5MR3NA4I4DYORP - [0:0]
:KUBE-SVC-MD4PSIBW5SPF2DB3 - [0:0]
:KUBE-SVC-NPX46M4PTMTKRN6Y - [0:0]
:KUBE-SVC-O36IMWM6WEZJKHBK - [0:0]
:KUBE-SVC-PQIZCPF63EFIBBJY - [0:0]
:KUBE-SVC-QE77U7R3P7AE7O5U - [0:0]
:KUBE-SVC-TCOU7JCQXEZGVUNU - [0:0]
:KUBE-SVC-TTTQGL2HNUNQKPOG - [0:0]
:KUBE-SVC-USIDOZAE2VTXK5OJ - [0:0]
:KUBE-SVC-VTS3SY4BGCZLE3HK - [0:0]
:KUBE-SVC-XUD4OEZNIHB47KQL - [0:0]
:KUBE-SVC-YA74QX5VY2UAABIX - [0:0]
:KUBE-SVC-YO4WYSW77CAYSOKE - [0:0]
:KUBE-SVC-ZUD4L6KQKCHD52W4 - [0:0]
-A PREROUTING -m comment --comment "kubernetes service portals" -j KUBE-SERVICES
-A PREROUTING -m addrtype --dst-type LOCAL -j DOCKER
-A OUTPUT -m comment --comment "kubernetes service portals" -j KUBE-SERVICES
-A OUTPUT ! -d 127.0.0.0/8 -m addrtype --dst-type LOCAL -j DOCKER
-A POSTROUTING -m comment --comment "kubernetes postrouting rules" -j KUBE-POSTROUTING
-A POSTROUTING -s 172.17.0.0/16 ! -o docker0 -j MASQUERADE
-A POSTROUTING -m comment --comment "flanneld masq" -j FLANNEL-POSTRTG
-A DOCKER -i docker0 -j RETURN
-A FLANNEL-POSTRTG -m comment --comment "flanneld masq" -j RETURN
-A FLANNEL-POSTRTG -s 10.244.0.0/24 -d 10.244.0.0/16 -m comment --comment "flanneld masq" -j RETURN
-A FLANNEL-POSTRTG -s 10.244.0.0/16 -d 10.244.0.0/24 -m comment --comment "flanneld masq" -j RETURN
-A FLANNEL-POSTRTG ! -s 10.244.0.0/16 -d 10.244.0.0/24 -m comment --comment "flanneld masq" -j RETURN
-A FLANNEL-POSTRTG -s 10.244.0.0/16 ! -d 224.0.0.0/4 -m comment --comment "flanneld masq" -j MASQUERADE --random-fully
-A FLANNEL-POSTRTG ! -s 10.244.0.0/16 -d 10.244.0.0/16 -m comment --comment "flanneld masq" -j MASQUERADE --random-fully
-A KUBE-EXT-BF2LB5WJRBPLA42J -m comment --comment "masquerade traffic for osm/airflow-webserver:airflow-ui external destinations" -j KUBE-MARK-MASQ
-A KUBE-EXT-BF2LB5WJRBPLA42J -j KUBE-SVC-BF2LB5WJRBPLA42J
-A KUBE-EXT-FQUXG555W5IVIWW3 -m comment --comment "masquerade traffic for osm/nbi external destinations" -j KUBE-MARK-MASQ
-A KUBE-EXT-FQUXG555W5IVIWW3 -j KUBE-SVC-FQUXG555W5IVIWW3
-A KUBE-EXT-GZN4S7ND4PF6YXD6 -m comment --comment "masquerade traffic for osm/alertmanager:http external destinations" -j KUBE-MARK-MASQ
-A KUBE-EXT-GZN4S7ND4PF6YXD6 -j KUBE-SVC-GZN4S7ND4PF6YXD6
-A KUBE-EXT-MD4PSIBW5SPF2DB3 -m comment --comment "masquerade traffic for osm/prometheus external destinations" -j KUBE-MARK-MASQ
-A KUBE-EXT-MD4PSIBW5SPF2DB3 -j KUBE-SVC-MD4PSIBW5SPF2DB3
-A KUBE-EXT-PQIZCPF63EFIBBJY -m comment --comment "masquerade traffic for osm/grafana:service external destinations" -j KUBE-MARK-MASQ
-A KUBE-EXT-PQIZCPF63EFIBBJY -j KUBE-SVC-PQIZCPF63EFIBBJY
-A KUBE-EXT-VTS3SY4BGCZLE3HK -m comment --comment "masquerade traffic for controller-osm/controller-service:api-server external destinations" -j KUBE-MARK-MASQ
-A KUBE-EXT-VTS3SY4BGCZLE3HK -j KUBE-SVC-VTS3SY4BGCZLE3HK
-A KUBE-EXT-XUD4OEZNIHB47KQL -m comment --comment "masquerade traffic for osm/webhook-translator external destinations" -j KUBE-MARK-MASQ
-A KUBE-EXT-XUD4OEZNIHB47KQL -j KUBE-SVC-XUD4OEZNIHB47KQL
-A KUBE-EXT-YA74QX5VY2UAABIX -m comment --comment "masquerade traffic for osm/ng-ui external destinations" -j KUBE-MARK-MASQ
-A KUBE-EXT-YA74QX5VY2UAABIX -j KUBE-SVC-YA74QX5VY2UAABIX
-A KUBE-MARK-DROP -j MARK --set-xmark 0x8000/0x8000
-A KUBE-MARK-MASQ -j MARK --set-xmark 0x4000/0x4000
-A KUBE-NODEPORTS -p tcp -m comment --comment "osm/airflow-webserver:airflow-ui" -m tcp --dport 14436 -j KUBE-EXT-BF2LB5WJRBPLA42J
-A KUBE-NODEPORTS -p tcp -m comment --comment "osm/grafana:service" -m tcp --dport 3000 -j KUBE-EXT-PQIZCPF63EFIBBJY
-A KUBE-NODEPORTS -p tcp -m comment --comment "osm/nbi" -m tcp --dport 9999 -j KUBE-EXT-FQUXG555W5IVIWW3
-A KUBE-NODEPORTS -p tcp -m comment --comment "controller-osm/controller-service:api-server" -m tcp --dport 12043 -j KUBE-EXT-VTS3SY4BGCZLE3HK
-A KUBE-NODEPORTS -p tcp -m comment --comment "osm/ng-ui" -m tcp --dport 80 -j KUBE-EXT-YA74QX5VY2UAABIX
-A KUBE-NODEPORTS -p tcp -m comment --comment "osm/alertmanager:http" -m tcp --dport 9093 -j KUBE-EXT-GZN4S7ND4PF6YXD6
-A KUBE-NODEPORTS -p tcp -m comment --comment "osm/webhook-translator" -m tcp --dport 9998 -j KUBE-EXT-XUD4OEZNIHB47KQL
-A KUBE-NODEPORTS -p tcp -m comment --comment "osm/prometheus" -m tcp --dport 9091 -j KUBE-EXT-MD4PSIBW5SPF2DB3
-A KUBE-POSTROUTING -m mark ! --mark 0x4000/0x4000 -j RETURN
-A KUBE-POSTROUTING -j MARK --set-xmark 0x4000/0x0
-A KUBE-POSTROUTING -m comment --comment "kubernetes service traffic requiring SNAT" -j MASQUERADE --random-fully
-A KUBE-SEP-22JYILAGAWOEZ2V7 -s 10.244.0.5/32 -m comment --comment "metallb-system/metallb-webhook-service" -j KUBE-MARK-MASQ
-A KUBE-SEP-22JYILAGAWOEZ2V7 -p tcp -m comment --comment "metallb-system/metallb-webhook-service" -m tcp -j DNAT --to-destination 10.244.0.5:9443
-A KUBE-SEP-2EKL53EI67NMIYCL -s 10.244.0.28/32 -m comment --comment "osm/airflow-statsd:statsd-scrape" -j KUBE-MARK-MASQ
-A KUBE-SEP-2EKL53EI67NMIYCL -p tcp -m comment --comment "osm/airflow-statsd:statsd-scrape" -m tcp -j DNAT --to-destination 10.244.0.28:9102
-A KUBE-SEP-2KV723JD5URYWNXM -s 10.244.0.29/32 -m comment --comment "osm/airflow-webserver:airflow-ui" -j KUBE-MARK-MASQ
-A KUBE-SEP-2KV723JD5URYWNXM -p tcp -m comment --comment "osm/airflow-webserver:airflow-ui" -m tcp -j DNAT --to-destination 10.244.0.29:8080
-A KUBE-SEP-5D2WEBEDBSLEFQZO -s 10.244.0.38/32 -m comment --comment "osm/airflow-postgresql:tcp-postgresql" -j KUBE-MARK-MASQ
-A KUBE-SEP-5D2WEBEDBSLEFQZO -p tcp -m comment --comment "osm/airflow-postgresql:tcp-postgresql" -m tcp -j DNAT --to-destination 10.244.0.38:5432
-A KUBE-SEP-5ME4XGOFV45WR7Q5 -s 10.244.0.9/32 -m comment --comment "cert-manager/cert-manager-webhook:https" -j KUBE-MARK-MASQ
-A KUBE-SEP-5ME4XGOFV45WR7Q5 -p tcp -m comment --comment "cert-manager/cert-manager-webhook:https" -m tcp -j DNAT --to-destination 10.244.0.9:10250
-A KUBE-SEP-EELVSR32NONIBZHH -s 10.244.0.17/32 -m comment --comment "osm/prometheus" -j KUBE-MARK-MASQ
-A KUBE-SEP-EELVSR32NONIBZHH -p tcp -m comment --comment "osm/prometheus" -m tcp -j DNAT --to-destination 10.244.0.17:9090
-A KUBE-SEP-FOB2KBXV2FRZH6LW -s 10.244.0.43/32 -m comment --comment "osm/pushgateway-prometheus-pushgateway:http" -j KUBE-MARK-MASQ
-A KUBE-SEP-FOB2KBXV2FRZH6LW -p tcp -m comment --comment "osm/pushgateway-prometheus-pushgateway:http" -m tcp -j DNAT --to-destination 10.244.0.43:9091
-A KUBE-SEP-FYSEASATA4IFOBM5 -s 10.244.0.25/32 -m comment --comment "osm/grafana:service" -j KUBE-MARK-MASQ
-A KUBE-SEP-FYSEASATA4IFOBM5 -p tcp -m comment --comment "osm/grafana:service" -m tcp -j DNAT --to-destination 10.244.0.25:3000
-A KUBE-SEP-HJ7EWOW62IX6GL6R -s 10.244.0.10/32 -m comment --comment "kube-system/kube-dns:metrics" -j KUBE-MARK-MASQ
-A KUBE-SEP-HJ7EWOW62IX6GL6R -p tcp -m comment --comment "kube-system/kube-dns:metrics" -m tcp -j DNAT --to-destination 10.244.0.10:9153
-A KUBE-SEP-HRQVWSF7IHI77K53 -s 172.21.23.5/32 -m comment --comment "default/kubernetes:https" -j KUBE-MARK-MASQ
-A KUBE-SEP-HRQVWSF7IHI77K53 -p tcp -m comment --comment "default/kubernetes:https" -m tcp -j DNAT --to-destination 172.21.23.5:6443
-A KUBE-SEP-ISHFB2I4PMRIVC6G -s 10.244.0.13/32 -m comment --comment "controller-osm/modeloperator" -j KUBE-MARK-MASQ
-A KUBE-SEP-ISHFB2I4PMRIVC6G -p tcp -m comment --comment "controller-osm/modeloperator" -m tcp -j DNAT --to-destination 10.244.0.13:17071
-A KUBE-SEP-JFFC7E2KKG7F5OP4 -s 10.244.0.26/32 -m comment --comment "osm/nbi" -j KUBE-MARK-MASQ
-A KUBE-SEP-JFFC7E2KKG7F5OP4 -p tcp -m comment --comment "osm/nbi" -m tcp -j DNAT --to-destination 10.244.0.26:9999
-A KUBE-SEP-JIWKU7LWBAE46CYF -s 10.244.0.37/32 -m comment --comment "osm/airflow-redis:redis-db" -j KUBE-MARK-MASQ
-A KUBE-SEP-JIWKU7LWBAE46CYF -p tcp -m comment --comment "osm/airflow-redis:redis-db" -m tcp -j DNAT --to-destination 10.244.0.37:6379
-A KUBE-SEP-K5ETGAT6VOT2SO74 -s 10.244.0.45/32 -m comment --comment "osm/alertmanager:http" -j KUBE-MARK-MASQ
-A KUBE-SEP-K5ETGAT6VOT2SO74 -p tcp -m comment --comment "osm/alertmanager:http" -m tcp -j DNAT --to-destination 10.244.0.45:9093
-A KUBE-SEP-MHWK5N6X4ER6N5YC -s 10.244.0.21/32 -m comment --comment "osm/ng-ui" -j KUBE-MARK-MASQ
-A KUBE-SEP-MHWK5N6X4ER6N5YC -p tcp -m comment --comment "osm/ng-ui" -m tcp -j DNAT --to-destination 10.244.0.21:80
-A KUBE-SEP-OP4AXEAS4OXHBEQX -s 10.244.0.10/32 -m comment --comment "kube-system/kube-dns:dns-tcp" -j KUBE-MARK-MASQ
-A KUBE-SEP-OP4AXEAS4OXHBEQX -p tcp -m comment --comment "kube-system/kube-dns:dns-tcp" -m tcp -j DNAT --to-destination 10.244.0.10:53
-A KUBE-SEP-PTRZXBEV4JI54VRO -s 10.244.0.18/32 -m comment --comment "osm/webhook-translator" -j KUBE-MARK-MASQ
-A KUBE-SEP-PTRZXBEV4JI54VRO -p tcp -m comment --comment "osm/webhook-translator" -m tcp -j DNAT --to-destination 10.244.0.18:9998
-A KUBE-SEP-PUHFDAMRBZWCPADU -s 10.244.0.4/32 -m comment --comment "kube-system/kube-dns:metrics" -j KUBE-MARK-MASQ
-A KUBE-SEP-PUHFDAMRBZWCPADU -p tcp -m comment --comment "kube-system/kube-dns:metrics" -m tcp -j DNAT --to-destination 10.244.0.4:9153
-A KUBE-SEP-QTEADDRAHUZGD5RY -s 10.244.0.12/32 -m comment --comment "controller-osm/controller-service:api-server" -j KUBE-MARK-MASQ
-A KUBE-SEP-QTEADDRAHUZGD5RY -p tcp -m comment --comment "controller-osm/controller-service:api-server" -m tcp -j DNAT --to-destination 10.244.0.12:17070
-A KUBE-SEP-R7EMXN5TTQQVP4UW -s 10.244.0.10/32 -m comment --comment "kube-system/kube-dns:dns" -j KUBE-MARK-MASQ
-A KUBE-SEP-R7EMXN5TTQQVP4UW -p udp -m comment --comment "kube-system/kube-dns:dns" -m udp -j DNAT --to-destination 10.244.0.10:53
-A KUBE-SEP-SDEOVHTVPO7QT7XS -s 10.244.0.28/32 -m comment --comment "osm/airflow-statsd:statsd-ingest" -j KUBE-MARK-MASQ
-A KUBE-SEP-SDEOVHTVPO7QT7XS -p udp -m comment --comment "osm/airflow-statsd:statsd-ingest" -m udp -j DNAT --to-destination 10.244.0.28:9125
-A KUBE-SEP-SF3LG62VAE5ALYDV -s 10.244.0.4/32 -m comment --comment "kube-system/kube-dns:dns-tcp" -j KUBE-MARK-MASQ
-A KUBE-SEP-SF3LG62VAE5ALYDV -p tcp -m comment --comment "kube-system/kube-dns:dns-tcp" -m tcp -j DNAT --to-destination 10.244.0.4:53
-A KUBE-SEP-WXWGHGKZOCNYRYI7 -s 10.244.0.4/32 -m comment --comment "kube-system/kube-dns:dns" -j KUBE-MARK-MASQ
-A KUBE-SEP-WXWGHGKZOCNYRYI7 -p udp -m comment --comment "kube-system/kube-dns:dns" -m udp -j DNAT --to-destination 10.244.0.4:53
-A KUBE-SERVICES -d 10.97.94.37/32 -p tcp -m comment --comment "osm/airflow-webserver:airflow-ui cluster IP" -m tcp --dport 8080 -j KUBE-SVC-BF2LB5WJRBPLA42J
-A KUBE-SERVICES -d 10.101.193.82/32 -p tcp -m comment --comment "osm/airflow-postgresql:tcp-postgresql cluster IP" -m tcp --dport 5432 -j KUBE-SVC-QE77U7R3P7AE7O5U
-A KUBE-SERVICES -d 10.96.0.1/32 -p tcp -m comment --comment "default/kubernetes:https cluster IP" -m tcp --dport 443 -j KUBE-SVC-NPX46M4PTMTKRN6Y
-A KUBE-SERVICES -d 10.96.0.10/32 -p tcp -m comment --comment "kube-system/kube-dns:dns-tcp cluster IP" -m tcp --dport 53 -j KUBE-SVC-ERIFXISQEP7F7OF4
-A KUBE-SERVICES -d 10.98.129.88/32 -p tcp -m comment --comment "controller-osm/modeloperator cluster IP" -m tcp --dport 17071 -j KUBE-SVC-YO4WYSW77CAYSOKE
-A KUBE-SERVICES -d 10.102.233.155/32 -p tcp -m comment --comment "osm/grafana:service cluster IP" -m tcp --dport 3000 -j KUBE-SVC-PQIZCPF63EFIBBJY
-A KUBE-SERVICES -d 10.102.111.79/32 -p tcp -m comment --comment "osm/nbi cluster IP" -m tcp --dport 9999 -j KUBE-SVC-FQUXG555W5IVIWW3
-A KUBE-SERVICES -d 10.108.13.27/32 -p tcp -m comment --comment "osm/airflow-redis:redis-db cluster IP" -m tcp --dport 6379 -j KUBE-SVC-USIDOZAE2VTXK5OJ
-A KUBE-SERVICES -d 10.98.140.170/32 -p tcp -m comment --comment "osm/pushgateway-prometheus-pushgateway:http cluster IP" -m tcp --dport 9091 -j KUBE-SVC-5QOWUZVRO3UICSLI
-A KUBE-SERVICES -d 10.96.0.10/32 -p tcp -m comment --comment "kube-system/kube-dns:metrics cluster IP" -m tcp --dport 9153 -j KUBE-SVC-JD5MR3NA4I4DYORP
-A KUBE-SERVICES -d 10.108.190.103/32 -p tcp -m comment --comment "controller-osm/controller-service:api-server cluster IP" -m tcp --dport 17070 -j KUBE-SVC-VTS3SY4BGCZLE3HK
-A KUBE-SERVICES -d 172.21.23.5/32 -p tcp -m comment --comment "controller-osm/controller-service:api-server loadbalancer IP" -m tcp --dport 17070 -j KUBE-EXT-VTS3SY4BGCZLE3HK
-A KUBE-SERVICES -d 10.107.24.122/32 -p tcp -m comment --comment "osm/ng-ui cluster IP" -m tcp --dport 80 -j KUBE-SVC-YA74QX5VY2UAABIX
-A KUBE-SERVICES -d 10.106.238.168/32 -p tcp -m comment --comment "osm/airflow-statsd:statsd-scrape cluster IP" -m tcp --dport 9102 -j KUBE-SVC-TTTQGL2HNUNQKPOG
-A KUBE-SERVICES -d 10.104.247.154/32 -p tcp -m comment --comment "osm/alertmanager:http cluster IP" -m tcp --dport 9093 -j KUBE-SVC-GZN4S7ND4PF6YXD6
-A KUBE-SERVICES -d 10.96.0.10/32 -p udp -m comment --comment "kube-system/kube-dns:dns cluster IP" -m udp --dport 53 -j KUBE-SVC-TCOU7JCQXEZGVUNU
-A KUBE-SERVICES -d 10.98.142.214/32 -p tcp -m comment --comment "metallb-system/metallb-webhook-service cluster IP" -m tcp --dport 443 -j KUBE-SVC-GZ25SP4UFGF7SAVL
-A KUBE-SERVICES -d 10.96.75.19/32 -p tcp -m comment --comment "cert-manager/cert-manager-webhook:https cluster IP" -m tcp --dport 443 -j KUBE-SVC-ZUD4L6KQKCHD52W4
-A KUBE-SERVICES -d 10.106.94.220/32 -p tcp -m comment --comment "osm/webhook-translator cluster IP" -m tcp --dport 9998 -j KUBE-SVC-XUD4OEZNIHB47KQL
-A KUBE-SERVICES -d 10.102.181.87/32 -p tcp -m comment --comment "osm/prometheus cluster IP" -m tcp --dport 9090 -j KUBE-SVC-MD4PSIBW5SPF2DB3
-A KUBE-SERVICES -d 10.106.238.168/32 -p udp -m comment --comment "osm/airflow-statsd:statsd-ingest cluster IP" -m udp --dport 9125 -j KUBE-SVC-O36IMWM6WEZJKHBK
-A KUBE-SERVICES -m comment --comment "kubernetes service nodeports; NOTE: this must be the last rule in this chain" -m addrtype --dst-type LOCAL -j KUBE-NODEPORTS
-A KUBE-SVC-5QOWUZVRO3UICSLI ! -s 10.244.0.0/16 -d 10.98.140.170/32 -p tcp -m comment --comment "osm/pushgateway-prometheus-pushgateway:http cluster IP" -m tcp --dport 9091 -j KUBE-MARK-MASQ
-A KUBE-SVC-5QOWUZVRO3UICSLI -m comment --comment "osm/pushgateway-prometheus-pushgateway:http -> 10.244.0.43:9091" -j KUBE-SEP-FOB2KBXV2FRZH6LW
-A KUBE-SVC-BF2LB5WJRBPLA42J ! -s 10.244.0.0/16 -d 10.97.94.37/32 -p tcp -m comment --comment "osm/airflow-webserver:airflow-ui cluster IP" -m tcp --dport 8080 -j KUBE-MARK-MASQ
-A KUBE-SVC-BF2LB5WJRBPLA42J -m comment --comment "osm/airflow-webserver:airflow-ui -> 10.244.0.29:8080" -j KUBE-SEP-2KV723JD5URYWNXM
-A KUBE-SVC-ERIFXISQEP7F7OF4 ! -s 10.244.0.0/16 -d 10.96.0.10/32 -p tcp -m comment --comment "kube-system/kube-dns:dns-tcp cluster IP" -m tcp --dport 53 -j KUBE-MARK-MASQ
-A KUBE-SVC-ERIFXISQEP7F7OF4 -m comment --comment "kube-system/kube-dns:dns-tcp -> 10.244.0.10:53" -m statistic --mode random --probability 0.50000000000 -j KUBE-SEP-OP4AXEAS4OXHBEQX
-A KUBE-SVC-ERIFXISQEP7F7OF4 -m comment --comment "kube-system/kube-dns:dns-tcp -> 10.244.0.4:53" -j KUBE-SEP-SF3LG62VAE5ALYDV
-A KUBE-SVC-FQUXG555W5IVIWW3 ! -s 10.244.0.0/16 -d 10.102.111.79/32 -p tcp -m comment --comment "osm/nbi cluster IP" -m tcp --dport 9999 -j KUBE-MARK-MASQ
-A KUBE-SVC-FQUXG555W5IVIWW3 -m comment --comment "osm/nbi -> 10.244.0.26:9999" -j KUBE-SEP-JFFC7E2KKG7F5OP4
-A KUBE-SVC-GZ25SP4UFGF7SAVL ! -s 10.244.0.0/16 -d 10.98.142.214/32 -p tcp -m comment --comment "metallb-system/metallb-webhook-service cluster IP" -m tcp --dport 443 -j KUBE-MARK-MASQ
-A KUBE-SVC-GZ25SP4UFGF7SAVL -m comment --comment "metallb-system/metallb-webhook-service -> 10.244.0.5:9443" -j KUBE-SEP-22JYILAGAWOEZ2V7
-A KUBE-SVC-GZN4S7ND4PF6YXD6 ! -s 10.244.0.0/16 -d 10.104.247.154/32 -p tcp -m comment --comment "osm/alertmanager:http cluster IP" -m tcp --dport 9093 -j KUBE-MARK-MASQ
-A KUBE-SVC-GZN4S7ND4PF6YXD6 -m comment --comment "osm/alertmanager:http -> 10.244.0.45:9093" -j KUBE-SEP-K5ETGAT6VOT2SO74
-A KUBE-SVC-JD5MR3NA4I4DYORP ! -s 10.244.0.0/16 -d 10.96.0.10/32 -p tcp -m comment --comment "kube-system/kube-dns:metrics cluster IP" -m tcp --dport 9153 -j KUBE-MARK-MASQ
-A KUBE-SVC-JD5MR3NA4I4DYORP -m comment --comment "kube-system/kube-dns:metrics -> 10.244.0.10:9153" -m statistic --mode random --probability 0.50000000000 -j KUBE-SEP-HJ7EWOW62IX6GL6R
-A KUBE-SVC-JD5MR3NA4I4DYORP -m comment --comment "kube-system/kube-dns:metrics -> 10.244.0.4:9153" -j KUBE-SEP-PUHFDAMRBZWCPADU
-A KUBE-SVC-MD4PSIBW5SPF2DB3 ! -s 10.244.0.0/16 -d 10.102.181.87/32 -p tcp -m comment --comment "osm/prometheus cluster IP" -m tcp --dport 9090 -j KUBE-MARK-MASQ
-A KUBE-SVC-MD4PSIBW5SPF2DB3 -m comment --comment "osm/prometheus -> 10.244.0.17:9090" -j KUBE-SEP-EELVSR32NONIBZHH
-A KUBE-SVC-NPX46M4PTMTKRN6Y ! -s 10.244.0.0/16 -d 10.96.0.1/32 -p tcp -m comment --comment "default/kubernetes:https cluster IP" -m tcp --dport 443 -j KUBE-MARK-MASQ
-A KUBE-SVC-NPX46M4PTMTKRN6Y -m comment --comment "default/kubernetes:https -> 172.21.23.5:6443" -j KUBE-SEP-HRQVWSF7IHI77K53
-A KUBE-SVC-O36IMWM6WEZJKHBK ! -s 10.244.0.0/16 -d 10.106.238.168/32 -p udp -m comment --comment "osm/airflow-statsd:statsd-ingest cluster IP" -m udp --dport 9125 -j KUBE-MARK-MASQ
-A KUBE-SVC-O36IMWM6WEZJKHBK -m comment --comment "osm/airflow-statsd:statsd-ingest -> 10.244.0.28:9125" -j KUBE-SEP-SDEOVHTVPO7QT7XS
-A KUBE-SVC-PQIZCPF63EFIBBJY ! -s 10.244.0.0/16 -d 10.102.233.155/32 -p tcp -m comment --comment "osm/grafana:service cluster IP" -m tcp --dport 3000 -j KUBE-MARK-MASQ
-A KUBE-SVC-PQIZCPF63EFIBBJY -m comment --comment "osm/grafana:service -> 10.244.0.25:3000" -j KUBE-SEP-FYSEASATA4IFOBM5
-A KUBE-SVC-QE77U7R3P7AE7O5U ! -s 10.244.0.0/16 -d 10.101.193.82/32 -p tcp -m comment --comment "osm/airflow-postgresql:tcp-postgresql cluster IP" -m tcp --dport 5432 -j KUBE-MARK-MASQ
-A KUBE-SVC-QE77U7R3P7AE7O5U -m comment --comment "osm/airflow-postgresql:tcp-postgresql -> 10.244.0.38:5432" -j KUBE-SEP-5D2WEBEDBSLEFQZO
-A KUBE-SVC-TCOU7JCQXEZGVUNU ! -s 10.244.0.0/16 -d 10.96.0.10/32 -p udp -m comment --comment "kube-system/kube-dns:dns cluster IP" -m udp --dport 53 -j KUBE-MARK-MASQ
-A KUBE-SVC-TCOU7JCQXEZGVUNU -m comment --comment "kube-system/kube-dns:dns -> 10.244.0.10:53" -m statistic --mode random --probability 0.50000000000 -j KUBE-SEP-R7EMXN5TTQQVP4UW
-A KUBE-SVC-TCOU7JCQXEZGVUNU -m comment --comment "kube-system/kube-dns:dns -> 10.244.0.4:53" -j KUBE-SEP-WXWGHGKZOCNYRYI7
-A KUBE-SVC-TTTQGL2HNUNQKPOG ! -s 10.244.0.0/16 -d 10.106.238.168/32 -p tcp -m comment --comment "osm/airflow-statsd:statsd-scrape cluster IP" -m tcp --dport 9102 -j KUBE-MARK-MASQ
-A KUBE-SVC-TTTQGL2HNUNQKPOG -m comment --comment "osm/airflow-statsd:statsd-scrape -> 10.244.0.28:9102" -j KUBE-SEP-2EKL53EI67NMIYCL
-A KUBE-SVC-USIDOZAE2VTXK5OJ ! -s 10.244.0.0/16 -d 10.108.13.27/32 -p tcp -m comment --comment "osm/airflow-redis:redis-db cluster IP" -m tcp --dport 6379 -j KUBE-MARK-MASQ
-A KUBE-SVC-USIDOZAE2VTXK5OJ -m comment --comment "osm/airflow-redis:redis-db -> 10.244.0.37:6379" -j KUBE-SEP-JIWKU7LWBAE46CYF
-A KUBE-SVC-VTS3SY4BGCZLE3HK ! -s 10.244.0.0/16 -d 10.108.190.103/32 -p tcp -m comment --comment "controller-osm/controller-service:api-server cluster IP" -m tcp --dport 17070 -j KUBE-MARK-MASQ
-A KUBE-SVC-VTS3SY4BGCZLE3HK -m comment --comment "controller-osm/controller-service:api-server -> 10.244.0.12:17070" -j KUBE-SEP-QTEADDRAHUZGD5RY
-A KUBE-SVC-XUD4OEZNIHB47KQL ! -s 10.244.0.0/16 -d 10.106.94.220/32 -p tcp -m comment --comment "osm/webhook-translator cluster IP" -m tcp --dport 9998 -j KUBE-MARK-MASQ
-A KUBE-SVC-XUD4OEZNIHB47KQL -m comment --comment "osm/webhook-translator -> 10.244.0.18:9998" -j KUBE-SEP-PTRZXBEV4JI54VRO
-A KUBE-SVC-YA74QX5VY2UAABIX ! -s 10.244.0.0/16 -d 10.107.24.122/32 -p tcp -m comment --comment "osm/ng-ui cluster IP" -m tcp --dport 80 -j KUBE-MARK-MASQ
-A KUBE-SVC-YA74QX5VY2UAABIX -m comment --comment "osm/ng-ui -> 10.244.0.21:80" -j KUBE-SEP-MHWK5N6X4ER6N5YC
-A KUBE-SVC-YO4WYSW77CAYSOKE ! -s 10.244.0.0/16 -d 10.98.129.88/32 -p tcp -m comment --comment "controller-osm/modeloperator cluster IP" -m tcp --dport 17071 -j KUBE-MARK-MASQ
-A KUBE-SVC-YO4WYSW77CAYSOKE -m comment --comment "controller-osm/modeloperator -> 10.244.0.13:17071" -j KUBE-SEP-ISHFB2I4PMRIVC6G
-A KUBE-SVC-ZUD4L6KQKCHD52W4 ! -s 10.244.0.0/16 -d 10.96.75.19/32 -p tcp -m comment --comment "cert-manager/cert-manager-webhook:https cluster IP" -m tcp --dport 443 -j KUBE-MARK-MASQ
-A KUBE-SVC-ZUD4L6KQKCHD52W4 -m comment --comment "cert-manager/cert-manager-webhook:https -> 10.244.0.9:10250" -j KUBE-SEP-5ME4XGOFV45WR7Q5
COMMIT
# Completed on Tue Nov 21 08:54:48 2023
# Generated by ip6tables-save v1.8.7 on Tue Nov 21 08:54:48 2023
*mangle
:PREROUTING ACCEPT [0:0]
:INPUT ACCEPT [0:0]
:FORWARD ACCEPT [0:0]
:OUTPUT ACCEPT [0:0]
:POSTROUTING ACCEPT [0:0]
:KUBE-IPTABLES-HINT - [0:0]
:KUBE-KUBELET-CANARY - [0:0]
:KUBE-PROXY-CANARY - [0:0]
COMMIT
# Completed on Tue Nov 21 08:54:48 2023
# Generated by ip6tables-save v1.8.7 on Tue Nov 21 08:54:48 2023
*filter
:INPUT ACCEPT [0:0]
:FORWARD ACCEPT [0:0]
:OUTPUT ACCEPT [0:0]
:KUBE-EXTERNAL-SERVICES - [0:0]
:KUBE-FIREWALL - [0:0]
:KUBE-FORWARD - [0:0]
:KUBE-KUBELET-CANARY - [0:0]
:KUBE-NODEPORTS - [0:0]
:KUBE-PROXY-CANARY - [0:0]
:KUBE-PROXY-FIREWALL - [0:0]
:KUBE-SERVICES - [0:0]
-A INPUT -m conntrack --ctstate NEW -m comment --comment "kubernetes load balancer firewall" -j KUBE-PROXY-FIREWALL
-A INPUT -m comment --comment "kubernetes health check service ports" -j KUBE-NODEPORTS
-A INPUT -m conntrack --ctstate NEW -m comment --comment "kubernetes externally-visible service portals" -j KUBE-EXTERNAL-SERVICES
-A FORWARD -m conntrack --ctstate NEW -m comment --comment "kubernetes load balancer firewall" -j KUBE-PROXY-FIREWALL
-A FORWARD -m comment --comment "kubernetes forwarding rules" -j KUBE-FORWARD
-A FORWARD -m conntrack --ctstate NEW -m comment --comment "kubernetes service portals" -j KUBE-SERVICES
-A FORWARD -m conntrack --ctstate NEW -m comment --comment "kubernetes externally-visible service portals" -j KUBE-EXTERNAL-SERVICES
-A OUTPUT -m conntrack --ctstate NEW -m comment --comment "kubernetes load balancer firewall" -j KUBE-PROXY-FIREWALL
-A OUTPUT -m conntrack --ctstate NEW -m comment --comment "kubernetes service portals" -j KUBE-SERVICES
-A KUBE-FIREWALL -m comment --comment "kubernetes firewall for dropping marked packets" -m mark --mark 0x8000/0x8000 -j DROP
-A KUBE-FORWARD -m conntrack --ctstate INVALID -j DROP
-A KUBE-FORWARD -m comment --comment "kubernetes forwarding rules" -m mark --mark 0x4000/0x4000 -j ACCEPT
-A KUBE-FORWARD -m comment --comment "kubernetes forwarding conntrack rule" -m conntrack --ctstate RELATED,ESTABLISHED -j ACCEPT
COMMIT
# Completed on Tue Nov 21 08:54:48 2023
# Generated by ip6tables-save v1.8.7 on Tue Nov 21 08:54:48 2023
*nat
:PREROUTING ACCEPT [0:0]
:INPUT ACCEPT [0:0]
:OUTPUT ACCEPT [0:0]
:POSTROUTING ACCEPT [0:0]
:KUBE-KUBELET-CANARY - [0:0]
:KUBE-MARK-DROP - [0:0]
:KUBE-MARK-MASQ - [0:0]
:KUBE-NODEPORTS - [0:0]
:KUBE-POSTROUTING - [0:0]
:KUBE-PROXY-CANARY - [0:0]
:KUBE-SERVICES - [0:0]
-A PREROUTING -m comment --comment "kubernetes service portals" -j KUBE-SERVICES
-A OUTPUT -m comment --comment "kubernetes service portals" -j KUBE-SERVICES
-A POSTROUTING -m comment --comment "kubernetes postrouting rules" -j KUBE-POSTROUTING
-A KUBE-MARK-DROP -j MARK --set-xmark 0x8000/0x8000
-A KUBE-MARK-MASQ -j MARK --set-xmark 0x4000/0x4000
-A KUBE-POSTROUTING -m mark ! --mark 0x4000/0x4000 -j RETURN
-A KUBE-POSTROUTING -j MARK --set-xmark 0x4000/0x0
-A KUBE-POSTROUTING -m comment --comment "kubernetes service traffic requiring SNAT" -j MASQUERADE --random-fully
-A KUBE-SERVICES ! -d ::1/128 -m comment --comment "kubernetes service nodeports; NOTE: this must be the last rule in this chain" -m addrtype --dst-type LOCAL -j KUBE-NODEPORTS
COMMIT
# Completed on Tue Nov 21 08:54:48 2023
Track end end: https://osm.etsi.org/InstallLog.php?&installation_id=1700556187-NchhxCXLVFgga2VP&local_ts=1700556888&event=end&operation=end&value=&comment=&tags=
/etc/osm
/etc/osm/helm
/etc/osm/helm/alertmanager-values.yaml
/etc/osm/helm/airflow-values.yaml
/etc/osm/helm/mongodb-values.yaml
/etc/osm/helm/osm-values.yaml
/etc/osm/metallb-ipaddrpool.yaml
/etc/osm/kubeadm-config.yaml

DONE
+ set +eux
Warning: Permanently added '172.21.23.5' (ED25519) to the list of known hosts.
Defaulted container "lcm" out of: lcm, kafka-ro-mongo-test (init)
Unable to use a TTY - input is not a terminal or the right kind of file
+ export JUJU_PASSWORD=ffbc367bd0d177fa0d55700106f8ef93
+ JUJU_PASSWORD=ffbc367bd0d177fa0d55700106f8ef93
+ cat
+ echo Environment was updated at /robot-systest/results/osm_environment.rc
Environment was updated at /robot-systest/results/osm_environment.rc
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Add VIM and K8s cluster to OSM)
[Pipeline] sh
[azure_robot_tests] Running shell script
+ . /robot-systest/results/osm_environment.rc
+ export CLOUD_TYPE=azure
+ export OSM_HOSTNAME=172.21.23.5
+ export OSM_IMAGE_NAME=osmtest202311210839
+ export JUJU_PASSWORD=ffbc367bd0d177fa0d55700106f8ef93
+ . /robot-systest/results/k8s_environment.rc
+ export CLOUD_TYPE=azure
+ export USE_PAAS_K8S=FALSE
+ export K8S_IP=172.21.23.11
+ export K8S_IMAGE_NAME=k8stest202311210832
+ export K8S_CREDENTIALS=/robot-systest/results/kubeconfig.yaml
+ osm version
Server version: 9.0.0.post107+g5cdcb80 2020-04-17
Client version: 11.0.0rc1.post56+g34ef701
+ set +x
Adding VIM to OSM
dddffb4a-2a02-4d0a-977e-a327a05e4b9a
+-----------------+--------------------------------------------------------------+
| key             | attribute                                                    |
+-----------------+--------------------------------------------------------------+
| _id             | "dddffb4a-2a02-4d0a-977e-a327a05e4b9a"                       |
| name            | "azure-etsi"                                                 |
| vim_type        | "azure"                                                      |
| description     | "None"                                                       |
| vim_url         | "http://www.azure.com"                                       |
| vim_user        | "7c5ba2e6-2013-49a0-bf9a-f2592030f7ff"                       |
| vim_password    | "********"                                                   |
| vim_tenant_name | "e6746ab5-ebdc-4e9d-821b-a71bdaf63d9b"                       |
| config          | {                                                            |
|                 |   "region_name": "westeurope",                               |
|                 |   "resource_group": "OSM_CICD_GROUP",                        |
|                 |   "subscription_id": "8fb7e78d-097b-413d-bc65-41d29be6bab1", |
|                 |   "vnet_name": "OSM-CICD-net",                               |
|                 |   "flavors_pattern": "^Standard"                             |
|                 | }                                                            |
| _admin          | {                                                            |
|                 |   "created": 1700556891.6048152,                             |
|                 |   "modified": 1700556891.6048152,                            |
|                 |   "projects_read": [                                         |
|                 |     "4ba6ae61-e275-4620-9767-94cb15ba9a12"                   |
|                 |   ],                                                         |
|                 |   "projects_write": [                                        |
|                 |     "4ba6ae61-e275-4620-9767-94cb15ba9a12"                   |
|                 |   ],                                                         |
|                 |   "operationalState": "ENABLED",                             |
|                 |   "operations": [                                            |
|                 |     {                                                        |
|                 |       "lcmOperationType": "create",                          |
|                 |       "operationState": "COMPLETED",                         |
|                 |       "startTime": 1700556891.6049385,                       |
|                 |       "statusEnteredTime": 1700556891.6256092,               |
|                 |       "operationParams": null                                |
|                 |     }                                                        |
|                 |   ],                                                         |
|                 |   "current_operation": null,                                 |
|                 |   "detailed-status": ""                                      |
|                 | }                                                            |
| schema_version  | "1.11"                                                       |
| admin           | {                                                            |
|                 |   "current_operation": 0                                     |
|                 | }                                                            |
+-----------------+--------------------------------------------------------------+
Adding K8s cluster to OSM
7c6f776b-16e5-439c-bf17-56400f91e949
+----------------+------------------------------------------------------------------------------------------------------+
| key            | attribute                                                                                            |
+----------------+------------------------------------------------------------------------------------------------------+
| _id            | "7c6f776b-16e5-439c-bf17-56400f91e949"                                                               |
| name           | "k8stest202311210832"                                                                                |
| credentials    | {                                                                                                    |
|                |   "apiVersion": "v1",                                                                                |
|                |   "clusters": [                                                                                      |
|                |     {                                                                                                |
|                |       "cluster": {                                                                                   |
|                |         "certificate-authority-data": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUREekNDQWZlZ0F3SUJBZ |
|                | 0lVSGxnbm9GN3dwWXE0SmlxYXMxQmZSellMREdRd0RRWUpLb1pJaHZjTkFRRUwKQlFBd0Z6RVZNQk1HQTFVRUF3d01NVEF1TVRVe |
|                | UxqRTRNeTR4TUI0WERUSXpNVEV5TVRBNE16Y3lNbG9YRFRNegpNVEV4T0RBNE16Y3lNbG93RnpFVk1CTUdBMVVFQXd3TU1UQXVNV |
|                | FV5TGpFNE15NHhNSUlCSWpBTkJna3Foa2lHCjl3MEJBUUVGQUFPQ0FROEFNSUlCQ2dLQ0FRRUFzMjVIdk5mcGp2am8zTnlpOUR4Q |
|                | zFzcWJWVTdnc2t3cmVzYS8KQjB1K3ZINXhTU1V6NUJ2YXRRUGhNenlzd1U4TUFpNmxVYmsxdFdBcTdYZWZ3M3ZTbGtsWE90NTlwW |
|                | kNPVEV1QQp0SXZaZXdrdkxUOXBsSVlaalh5MjFJVkUwdzIyM3lhVmpWUnNWRCtBWVdPS1FjTXV1cUVNSXBtYVJCL2wxSm1JClpzT |
|                | m5zK3hKZzJubDRYQWkweDZlSHllTzNYSU9UemM0ZmFoN2Y0dkc0R0hUT1RONGNlK1ErbWFwSjNQcmdTTCsKL2lGTGFXbFpzRUdZS |
|                | WNFMzZnNEJOM29XSHl6eG5ENDkrSHdIc0JsdE9tYUM1M29kclYxVjcvOWVGdG5XZFFSWAo2V0QxV3g0TFZSekV3S0U4YWVjZ1F5N |
|                | XVhajFodGI5dndKUEVkQW9JVWp0enpzUjFBUUlEQVFBQm8xTXdVVEFkCkJnTlZIUTRFRmdRVVd2MFgxWnJwR3lYaWphY1A0RW0xU |
|                | VhnNEJOY3dId1lEVlIwakJCZ3dGb0FVV3YwWDFacnAKR3lYaWphY1A0RW0xUVhnNEJOY3dEd1lEVlIwVEFRSC9CQVV3QXdFQi96Q |
|                | U5CZ2txaGtpRzl3MEJBUXNGQUFPQwpBUUVBQ0cvZXE2MmtseTgwUUxMMXRtY05WNzllektpSFljNkZWbHFvMTIwQXdiQSsrelRjR |
|                | HdWdW9tUHoxbjNrCmRmMHFFc1ZNZFRlMlQ2ZTRZVmx2ZzBpVmJrZlZhYldkSlNBZmY5MWJsRW1QRVlhM3V3YW8xOUhJVVBmSDNVN |
|                | lcKNGwxZHNHeVhOYkhOQlp2UW84b3ZyNGd4YXhmTE1mZUJ0WkJFRXFMYS9nalgrZndIMkQrRnRnNUZIQWFCRk9VYQpEY2RtZk56V |
|                | 1Ird1REYUU3cUFCMEpmQ25ZbWY4VlBjU1BkZ3VOVnJyWC9mWGlnMkJkVFJ4QWhOekY5Y21hcE0zCjVWOVl6dTAxR1N2d3AvRXBLK |
|                | 2xQY1dNdDlmTzFoaXBIN3RHS0pJdlhCdW9FQVloWVdCUkdqS29Kd1FuTVUwdVkKRmdDYjZML1BKRlF0LzZ4M3pkUmpQUVR3Tmc9P |
|                | QotLS0tLUVORCBDRVJUSUZJQ0FURS0tLS0tCg==",                                                            |
|                |         "server": "https://172.21.23.11:16443"                                                       |
|                |       },                                                                                             |
|                |       "name": "microk8s-cluster"                                                                     |
|                |     }                                                                                                |
|                |   ],                                                                                                 |
|                |   "contexts": [                                                                                      |
|                |     {                                                                                                |
|                |       "context": {                                                                                   |
|                |         "cluster": "microk8s-cluster",                                                               |
|                |         "user": "admin"                                                                              |
|                |       },                                                                                             |
|                |       "name": "microk8s"                                                                             |
|                |     }                                                                                                |
|                |   ],                                                                                                 |
|                |   "current-context": "microk8s",                                                                     |
|                |   "kind": "Config",                                                                                  |
|                |   "preferences": {},                                                                                 |
|                |   "users": [                                                                                         |
|                |     {                                                                                                |
|                |       "name": "admin",                                                                               |
|                |       "user": {                                                                                      |
|                |         "client-certificate-data": "LS0tLS1CRUdJTiBDRVJUSUZJQ0FURS0tLS0tCk1JSUN6RENDQWJTZ0F3SUJBZ0lV |
|                | SXQvL20vNmRUeWx2SDlMcmR6WVZFcGVaeEwwd0RRWUpLb1pJaHZjTkFRRUwKQlFBd0Z6RVZNQk1HQTFVRUF3d01NVEF1TVRVeUxq |
|                | RTRNeTR4TUI0WERUSXpNVEV5TVRBNE16Y3lORm9YRFRNegpNVEV4T0RBNE16Y3lORm93S1RFT01Bd0dBMVVFQXd3RllXUnRhVzR4 |
|                | RnpBVkJnTlZCQW9NRG5ONWMzUmxiVHB0CllYTjBaWEp6TUlJQklqQU5CZ2txaGtpRzl3MEJBUUVGQUFPQ0FROEFNSUlCQ2dLQ0FR |
|                | RUF0SUkrTTJCWnhrdlcKK3pKUTlUUHh6WDFCSEpXbWFkMW1yaUsrNUhuT04xWlhhTXBpNTJiR0NxdkZoNmQzclIzOWNXMzlJMUt1 |
|                | N0c1UQp5Z2pib0ZzcC9UY2RobDIvRVRoa0lDZVUycDQySXFaZXFwNk9QOXFEMmN0WVpsamw2ellDaDJ0bks5MEQyRFFpCjhZVmVU |
|                | eVlVRlo0aWhKRExNWDBOQnhUMlliWllvTUh0QzVxT25LQzVqVkxiOWg5NFd6UUJRLzZ0ZFFobE9ocUMKZ0YrUFVUSmd0OHVxN3R3 |
|                | d0JIU3hZS0xrWk5lQWVhMC9HMEdQN2lTNG1ESURab0ZEcG1tS0Fib2V4cGJTczNZdwp5VDQ2em5hMzk5WXNld3ZUUzM5T3NMR3V0 |
|                | UlR4a0hEMVZES1JQWFlod2g5cHRqT3VUVE5oYTgrRzFwK1lUWWd5CkQ1NXFVYnJKZ3dJREFRQUJNQTBHQ1NxR1NJYjNEUUVCQ3dV |
|                | QUE0SUJBUUNXZzFVeXpoVFcwa291VG44TmN1NW0KTnJoMG5LVUNkMWplTU9pZ1IydWh6Y21QU2kzZURZZC9kZVQreitGbFVQdnp3 |
|                | Zk9wSEFoWmUweVBoa2IxbDlKbwo4bHJIZ1JYK2pMTENCQUdUTjVSMDc4anF0RUFJVFdiNWs4amp2ZC9tVlk1cnQySDZoV2pDZFhR |
|                | RFFoWlpGS1pvCkE2Ky9aYnl5SzBoTmZXeUpJN0JhSnFXbzQxK05tMDk2K0ZSTE1OUkVxRWo0WHFyQ1QwV1VNV0lhU0tGZW9XeVkK |
|                | dzBQWWtXbWlRRXlzd1EvcHIvNzVPekRIVFpvanFEQVBwbjU0ZnF3Z2Q3dXRvYUFCYjltd3h3K2N5Q0xINDRqdgpSVHF3SExoUHIw |
|                | ZVdFalFpWUdmUWpqQWF0QnpieXY5YllyYkNMcTdQaUFFRlM2czBhNDlVZ1dLZVd6bEd1Sy9UCi0tLS0tRU5EIENFUlRJRklDQVRF |
|                | LS0tLS0K",                                                                                           |
|                |         "client-key-data": "LS0tLS1CRUdJTiBSU0EgUFJJVkFURSBLRVktLS0tLQpNSUlFcEFJQkFBS0NBUUVBdElJK00y |
|                | Qlp4a3ZXK3pKUTlUUHh6WDFCSEpXbWFkMW1yaUsrNUhuT04xWlhhTXBpCjUyYkdDcXZGaDZkM3JSMzljVzM5STFLdTdHNVF5Z2pi |
|                | b0ZzcC9UY2RobDIvRVRoa0lDZVUycDQySXFaZXFwNk8KUDlxRDJjdFlabGpsNnpZQ2gydG5LOTBEMkRRaThZVmVUeVlVRlo0aWhK |
|                | RExNWDBOQnhUMlliWllvTUh0QzVxTwpuS0M1alZMYjloOTRXelFCUS82dGRRaGxPaHFDZ0YrUFVUSmd0OHVxN3R3d0JIU3hZS0xr |
|                | Wk5lQWVhMC9HMEdQCjdpUzRtRElEWm9GRHBtbUtBYm9leHBiU3MzWXd5VDQ2em5hMzk5WXNld3ZUUzM5T3NMR3V0UlR4a0hEMVZE |
|                | S1IKUFhZaHdoOXB0ak91VFROaGE4K0cxcCtZVFlneUQ1NXFVYnJKZ3dJREFRQUJBb0lCQUUzWS9jcGFlMTB4eC8xOQpyYnlTTW01 |
|                | VTNvZmRZbSsxdEZyVlVVNW4yYSt6NU82a21oTUNGOWJ5VGJZN1d6bWwvVysxNTc0Y0lxU0V6d0xTCjN5VkJLOUt3dW1NQWV5NktD |
|                | MlJ5clplN004MnR3cFlDMjRaZVR3N1d5OEVZQmRCeThwUkVsYkl0RjBOZStOTUEKbWJPNDBHQ2lXbkNob1lBdVZqZi9xbUpwcWI5 |
|                | cUtjc0JIaFJUQ2xZOG1GQ2h2VFBGNkNlcTVzS09NcVNrM3M5YwpQYXJqWWpzQmRrNEthSytDZmhaSlRUenNIckJiM2F0RVZsMkNv |
|                | ZGh4cXlRT3h6eUJvV2x3RENRS2wzVmFmdEZqCkRDZlZYSWluN2x3TnM1RThNb1N6QUZ5Z3dHOXA4TlJ3azVqRDdLT0xyMEJlam9T |
|                | ME92aVkvQkh1dHdraFRhMWMKUVNBS01jRUNnWUVBMldaZEtRWVFrTDIrT0IrVUNuOW52YzZpTDRlS3NNY3JoQXRQaTUrOXJFWExG |
|                | engvRUN5dApDNGJjV0NlOHpxVWFuQ1dteldaZURNcjUzVjZrQS9UbVE0SWlRdG5qdUd0cDNPSlV2V3A4S29EQXdJZ011Um9vCjRy |
|                | cTN3cHQ3c1hlMGFvK1I0OTVtMWRLcjNkOTZLQ1F0b2ZnKy9SUnE2YStMRThGSlByc0V3c1VDZ1lFQTFJOEsKY3Fkdm94eDRVa1lL |
|                | blk5QW1TQ3pZcUlwR095K0gvZkFqWmVCZTk1dzRLZEZYb0RNcTFtK3NMTzFYUU1SbW8vYQpYVndsZEVVSktZZUZVMHpieEhwYzVo |
|                | cTVGY29ZZHBNMFNscFQ3YU1aUlYxVW03QVNXUno3WmVoMS9UY0F5LzhYCkFPZTVGOWQyMnFOZTR4aXhzcFVUM1g3ZGs2bmVlbzhx |
|                | cG5FQWY2Y0NnWUVBdTl3dzlKcEJCbEp3Wlo4a0ZrNHkKSDFySDI3cU1wbVgxdkl6cG1BWFpxVnhHT3pZVHRxNWtlbWwzVHRaSDJX |
|                | bmlyTHY4ZjlITnc5QkNTb0RWeW5WOQo1U2cyNHYzV1FpN1B0QWdBb2cyNmJpQUVjRnNnWTNPdmtQMDRmZnBOMFBWWCtoMUdQRnFi |
|                | RW5xUitaQjd4dVE4CmFwSlNHKy9nMHo3V042UDIzelpNenVFQ2dZRUFuNlp3YW4xb1ZGTGw3S1cvQTJpOVZFWkRkaW5tUkZwTTB5 |
|                | WDYKOGw2Qld0QlFaK3c3SmlnSGtndnVOanBFVm9BZmtML0xlNlBpWGRvY2I1emlmeGFUV3BldGZ3ZERUU0ptRFdtegpOTVZ6bEdH |
|                | VWNZMFFKMyt2eUU4RVpCNmR3SHEzbG9FYWZndU44bUpiV3d5cUVGaFQzNVlwUFl3MFVOeWR4ZUFTCllNRXZackVDZ1lBazZXdzJ6 |
|                | QjBOcERUMk5jT1JuRDhxQkhxVHN1MjNwY0JJY2hrSHpzVkF5NmZSbFhydHZvOS8KYnMybWZuT0IyNDdrVnMrUDlhMlRlbHYwNmNQ |
|                | eVdkSENpZHQvNEFNTXl1ZGZNT0FsTUNJKzRwMVdrK29UQmc0LwppR0NtdzF0YSt1d2xVYmQrSHBUdlk1K1VkTVVyK25HZXVVeFlG |
|                | OHlBbmNNY0xybmJYUERvQlE9PQotLS0tLUVORCBSU0EgUFJJVkFURSBLRVktLS0tLQo="                                |
|                |       }                                                                                              |
|                |     }                                                                                                |
|                |   ]                                                                                                  |
|                | }                                                                                                    |
| k8s_version    | "v1"                                                                                                 |
| vim_account    | "dddffb4a-2a02-4d0a-977e-a327a05e4b9a"                                                               |
| nets           | {                                                                                                    |
|                |   "net1": "OSM-CICD-subnet"                                                                          |
|                | }                                                                                                    |
| description    | "Robot cluster"                                                                                      |
| namespace      | "kube-system"                                                                                        |
| _admin         | {                                                                                                    |
|                |   "created": 1700556906.9514124,                                                                     |
|                |   "modified": 1700556906.9514124,                                                                    |
|                |   "projects_read": [                                                                                 |
|                |     "4ba6ae61-e275-4620-9767-94cb15ba9a12"                                                           |
|                |   ],                                                                                                 |
|                |   "projects_write": [                                                                                |
|                |     "4ba6ae61-e275-4620-9767-94cb15ba9a12"                                                           |
|                |   ],                                                                                                 |
|                |   "operationalState": "PROCESSING",                                                                  |
|                |   "operations": [                                                                                    |
|                |     {                                                                                                |
|                |       "lcmOperationType": "create",                                                                  |
|                |       "operationState": "PROCESSING",                                                                |
|                |       "startTime": 1700556906.9514399,                                                               |
|                |       "statusEnteredTime": 1700556906.9514399,                                                       |
|                |       "detailed-status": "",                                                                         |
|                |       "operationParams": null                                                                        |
|                |     }                                                                                                |
|                |   ],                                                                                                 |
|                |   "current_operation": null,                                                                         |
|                |   "helm_chart_repos": [],                                                                            |
|                |   "juju_bundle_repos": []                                                                            |
|                | }                                                                                                    |
| schema_version | "1.11"                                                                                               |
+----------------+------------------------------------------------------------------------------------------------------+
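A record like the one above is normally created with the OSM client's `k8scluster-add` command. The sketch below reconstructs that call from the fields shown in the table (the cluster name `robot-cluster` is an assumption; the VIM id, nets, version and description are taken from the record). It defaults to a dry run that only prints the command, since the `osm` client may not be installed wherever this log is read:

```shell
#!/bin/sh
# Hypothetical reconstruction of the registration behind the table above.
# The cluster name "robot-cluster" is assumed; the vim id, nets, version
# and description come from the record. DRY_RUN=1 (default) only prints.
DRY_RUN=${DRY_RUN:-1}
cmd="osm k8scluster-add robot-cluster \
  --creds /robot-systest/results/kubeconfig.yaml \
  --version v1 \
  --vim dddffb4a-2a02-4d0a-977e-a327a05e4b9a \
  --k8s-nets '{\"net1\": \"OSM-CICD-subnet\"}' \
  --description 'Robot cluster'"
if [ "$DRY_RUN" = "1" ]; then
  echo "$cmd"
else
  eval "$cmd"
fi
```

The `_admin.operations` entry with `"operationState": "PROCESSING"` in the table is what the NBI records immediately after such a call, before the LCM finishes initializing the cluster.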
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Run Robot tests)
[Pipeline] sh
[azure_robot_tests] Running shell script
+ . /robot-systest/results/osm_environment.rc
+ export CLOUD_TYPE=azure
+ export OSM_HOSTNAME=172.21.23.5
+ export OSM_IMAGE_NAME=osmtest202311210839
+ export JUJU_PASSWORD=ffbc367bd0d177fa0d55700106f8ef93
+ . /robot-systest/results/k8s_environment.rc
+ export CLOUD_TYPE=azure
+ export USE_PAAS_K8S=FALSE
+ export K8S_IP=172.21.23.11
+ export K8S_IMAGE_NAME=k8stest202311210832
+ export K8S_CREDENTIALS=/robot-systest/results/kubeconfig.yaml
+ /robot-systest/run_test.sh -t hackfest_basic
==============================================================================
Testsuite                                                                     
==============================================================================
Testsuite.Hackfest Basic :: [HACKFEST-BASIC] Basic NS with a single-VDU VNF   
==============================================================================
Create Hackfest Basic VNF Descriptor                                  | PASS |
------------------------------------------------------------------------------
Create Hackfest Basic NS Descriptor                                   | PASS |
------------------------------------------------------------------------------
Network Service Instance Test                                         | PASS |
------------------------------------------------------------------------------
Get Vnf Ip Address                                                    | PASS |
------------------------------------------------------------------------------
Test Ping                                                             | PASS |
------------------------------------------------------------------------------
Test SSH Access                                                       | PASS |
------------------------------------------------------------------------------
Delete NS Instance Test                                               | PASS |
------------------------------------------------------------------------------
Delete NS Descriptor Test                                             | PASS |
------------------------------------------------------------------------------
Delete VNF Descriptor Test                                            | PASS |
------------------------------------------------------------------------------
Testsuite.Hackfest Basic :: [HACKFEST-BASIC] Basic NS with a singl... | PASS |
9 tests, 9 passed, 0 failed
==============================================================================
Testsuite                                                             | PASS |
9 tests, 9 passed, 0 failed
==============================================================================
Output:  /robot-systest/reports/output.xml
Log:     /robot-systest/reports/log.html
Report:  /robot-systest/reports/report.html
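The two `.rc` files sourced at the top of this stage are plain shell export files; the test runner picks everything up from the environment. Reconstructed from the values echoed above (shown only to illustrate the format; `CLOUD_TYPE` is exported by both files in this run):

```shell
# osm_environment.rc (values echoed by this run)
export CLOUD_TYPE=azure
export OSM_HOSTNAME=172.21.23.5
export OSM_IMAGE_NAME=osmtest202311210839
export JUJU_PASSWORD=ffbc367bd0d177fa0d55700106f8ef93

# k8s_environment.rc (values echoed by this run)
export USE_PAAS_K8S=FALSE
export K8S_IP=172.21.23.11
export K8S_IMAGE_NAME=k8stest202311210832
export K8S_CREDENTIALS=/robot-systest/results/kubeconfig.yaml
```

Sourcing both files and then running `/robot-systest/run_test.sh -t hackfest_basic` reproduces this stage on the test container.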
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Declarative: Post Actions)
[Pipeline] echo
Retrieve container logs
[Pipeline] sh
[azure_robot_tests] Running shell script
+ . /robot-systest/results/osm_environment.rc
+ export CLOUD_TYPE=azure
+ export OSM_HOSTNAME=172.21.23.5
+ export OSM_IMAGE_NAME=osmtest202311210839
+ export JUJU_PASSWORD=ffbc367bd0d177fa0d55700106f8ef93
+ /robot-systest/cloud-scripts/remote-extract-logs.sh
Saving grafana logs...
Warning: Permanently added '172.21.23.5' (ED25519) to the list of known hosts.
Saving keystone logs...
Warning: Permanently added '172.21.23.5' (ED25519) to the list of known hosts.
Saving lcm logs...
Warning: Permanently added '172.21.23.5' (ED25519) to the list of known hosts.
Saving mon logs...
Warning: Permanently added '172.21.23.5' (ED25519) to the list of known hosts.
Saving nbi logs...
Warning: Permanently added '172.21.23.5' (ED25519) to the list of known hosts.
Saving pol logs...
Warning: Permanently added '172.21.23.5' (ED25519) to the list of known hosts.
Saving ro logs...
Warning: Permanently added '172.21.23.5' (ED25519) to the list of known hosts.
Saving ngui logs...
Warning: Permanently added '172.21.23.5' (ED25519) to the list of known hosts.
Saving airflow-scheduler logs...
Warning: Permanently added '172.21.23.5' (ED25519) to the list of known hosts.
Saving pushgateway-prometheus-pushgateway logs...
Warning: Permanently added '172.21.23.5' (ED25519) to the list of known hosts.
Saving webhook-translator logs...
Warning: Permanently added '172.21.23.5' (ED25519) to the list of known hosts.
Saving kafka logs...
Warning: Permanently added '172.21.23.5' (ED25519) to the list of known hosts.
Saving mongo logs...
Warning: Permanently added '172.21.23.5' (ED25519) to the list of known hosts.
Saving mysql logs...
Warning: Permanently added '172.21.23.5' (ED25519) to the list of known hosts.
Saving prometheus logs...
Warning: Permanently added '172.21.23.5' (ED25519) to the list of known hosts.
Saving zookeeper logs...
Warning: Permanently added '172.21.23.5' (ED25519) to the list of known hosts.
Saving alertmanager logs...
Warning: Permanently added '172.21.23.5' (ED25519) to the list of known hosts.

All logs saved to /robot-systest/results/
[Pipeline] echo
Save results
[Pipeline] sh
[azure_robot_tests] Running shell script
+ rm -rf results
[Pipeline] sh
[azure_robot_tests] Running shell script
+ cp -var /robot-systest/results /robot-systest/reports/log.html /robot-systest/reports/output.xml /robot-systest/reports/report.html .
'/robot-systest/results' -> './results'
'/robot-systest/results/k8s_environment.rc' -> './results/k8s_environment.rc'
'/robot-systest/results/osm-deploy-grafana.log' -> './results/osm-deploy-grafana.log'
'/robot-systest/results/osm-deploy-keystone.log' -> './results/osm-deploy-keystone.log'
'/robot-systest/results/osm-deploy-lcm.log' -> './results/osm-deploy-lcm.log'
'/robot-systest/results/osm-deploy-mon.log' -> './results/osm-deploy-mon.log'
'/robot-systest/results/osm-deploy-nbi.log' -> './results/osm-deploy-nbi.log'
'/robot-systest/results/osm-deploy-pol.log' -> './results/osm-deploy-pol.log'
'/robot-systest/results/osm-deploy-ro.log' -> './results/osm-deploy-ro.log'
'/robot-systest/results/osm-deploy-ngui.log' -> './results/osm-deploy-ngui.log'
'/robot-systest/results/osm-deploy-airflow-scheduler.log' -> './results/osm-deploy-airflow-scheduler.log'
'/robot-systest/results/osm-deploy-pushgateway-prometheus-pushgateway.log' -> './results/osm-deploy-pushgateway-prometheus-pushgateway.log'
'/robot-systest/results/osm-deploy-webhook-translator.log' -> './results/osm-deploy-webhook-translator.log'
'/robot-systest/results/osm-sts-kafka.log' -> './results/osm-sts-kafka.log'
'/robot-systest/results/osm-sts-mongo.log' -> './results/osm-sts-mongo.log'
'/robot-systest/results/osm-sts-mysql.log' -> './results/osm-sts-mysql.log'
'/robot-systest/results/osm-sts-prometheus.log' -> './results/osm-sts-prometheus.log'
'/robot-systest/results/osm-sts-zookeeper.log' -> './results/osm-sts-zookeeper.log'
'/robot-systest/results/osm-sts-alertmanager.log' -> './results/osm-sts-alertmanager.log'
'/robot-systest/results/kubeconfig.yaml' -> './results/kubeconfig.yaml'
'/robot-systest/results/osm_environment.rc' -> './results/osm_environment.rc'
'/robot-systest/reports/log.html' -> './log.html'
'/robot-systest/reports/output.xml' -> './output.xml'
'/robot-systest/reports/report.html' -> './report.html'
[Pipeline] step
Archiving artifacts
Recording fingerprints
[Pipeline] echo
Update the Robot dashboard in Jenkins
[Pipeline] robot
Robot results publisher started...
-Parsing output xml:
Done!
-Copying log files to build dir:
Done!
-Assigning results to build:
Done!
-Checking thresholds:
Done!
Done publishing Robot results.
[Pipeline] echo
Destroy the K8s cluster
[Pipeline] sh
[azure_robot_tests] Running shell script
+ . /robot-systest/results/k8s_environment.rc
+ export CLOUD_TYPE=azure
+ export USE_PAAS_K8S=FALSE
+ export K8S_IP=172.21.23.11
+ export K8S_IMAGE_NAME=k8stest202311210832
+ export K8S_CREDENTIALS=/robot-systest/results/kubeconfig.yaml
+ /robot-systest/cloud-scripts/delete-k8s.sh
Deleting IaaS k8s cluster in azure
++ az vm show --resource-group OSM_CICD_GROUP --name k8stest202311210832 --query 'networkProfile.networkInterfaces[0].id'
+ INTERFACE_ID='"/subscriptions/8fb7e78d-097b-413d-bc65-41d29be6bab1/resourceGroups/OSM_CICD_GROUP/providers/Microsoft.Network/networkInterfaces/k8stest202311210832VMNic"'
+ INTERFACE_ID=/subscriptions/8fb7e78d-097b-413d-bc65-41d29be6bab1/resourceGroups/OSM_CICD_GROUP/providers/Microsoft.Network/networkInterfaces/k8stest202311210832VMNic
++ az vm show --resource-group OSM_CICD_GROUP --name k8stest202311210832 --query storageProfile.osDisk.managedDisk.id
+ OS_DISK_ID='"/subscriptions/8fb7e78d-097b-413d-bc65-41d29be6bab1/resourceGroups/OSM_CICD_GROUP/providers/Microsoft.Compute/disks/k8stest202311210832_OsDisk_1_479863dcfb4d4f9db294832fdee8f45d"'
+ OS_DISK_ID=/subscriptions/8fb7e78d-097b-413d-bc65-41d29be6bab1/resourceGroups/OSM_CICD_GROUP/providers/Microsoft.Compute/disks/k8stest202311210832_OsDisk_1_479863dcfb4d4f9db294832fdee8f45d
++ az network nic show --id /subscriptions/8fb7e78d-097b-413d-bc65-41d29be6bab1/resourceGroups/OSM_CICD_GROUP/providers/Microsoft.Network/networkInterfaces/k8stest202311210832VMNic --query networkSecurityGroup.id
+ SECURITY_GROUP_ID='"/subscriptions/8fb7e78d-097b-413d-bc65-41d29be6bab1/resourceGroups/OSM_CICD_GROUP/providers/Microsoft.Network/networkSecurityGroups/k8stest202311210832NSG"'
+ SECURITY_GROUP_ID=/subscriptions/8fb7e78d-097b-413d-bc65-41d29be6bab1/resourceGroups/OSM_CICD_GROUP/providers/Microsoft.Network/networkSecurityGroups/k8stest202311210832NSG
++ az network nic show --id /subscriptions/8fb7e78d-097b-413d-bc65-41d29be6bab1/resourceGroups/OSM_CICD_GROUP/providers/Microsoft.Network/networkInterfaces/k8stest202311210832VMNic --query 'ipConfigurations[0].publicIpAddress.id'
+ PUBLIC_IP_ID=
+ PUBLIC_IP_ID=
+ az vm delete --resource-group OSM_CICD_GROUP --name k8stest202311210832 --yes
+ az network nic delete --id /subscriptions/8fb7e78d-097b-413d-bc65-41d29be6bab1/resourceGroups/OSM_CICD_GROUP/providers/Microsoft.Network/networkInterfaces/k8stest202311210832VMNic
+ az disk delete --id /subscriptions/8fb7e78d-097b-413d-bc65-41d29be6bab1/resourceGroups/OSM_CICD_GROUP/providers/Microsoft.Compute/disks/k8stest202311210832_OsDisk_1_479863dcfb4d4f9db294832fdee8f45d --yes
+ az network nsg delete --id /subscriptions/8fb7e78d-097b-413d-bc65-41d29be6bab1/resourceGroups/OSM_CICD_GROUP/providers/Microsoft.Network/networkSecurityGroups/k8stest202311210832NSG
+ '[' -n '' ']'
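Both `delete-k8s.sh` above and `delete-osm-vm.sh` below follow the same pattern: look up the dependent resource ids *before* deleting the VM (the VM record is the only handle on them), then remove the NIC, OS disk and NSG that `az vm delete` leaves behind; a public IP, if one existed, would be deleted the same way (that is the `[ -n '' ]` test at the end). A dry-run sketch of that order, with an `az` stub so it prints instead of deleting (drop the stub to run it against Azure; `-o tsv` also sidesteps the quoted JSON ids the real script has to strip):

```shell
#!/bin/sh
# Dry-run sketch of the teardown order used by delete-k8s.sh and
# delete-osm-vm.sh. The az stub below only echoes each command, so the
# sketch is safe to run anywhere; remove it to perform real deletions.
az() { echo "az $*"; }

delete_vm_and_deps() {
  group=$1; name=$2
  # Capture dependent resource ids while the VM record still exists.
  nic_id=$(az vm show -g "$group" -n "$name" \
    --query 'networkProfile.networkInterfaces[0].id' -o tsv)
  disk_id=$(az vm show -g "$group" -n "$name" \
    --query storageProfile.osDisk.managedDisk.id -o tsv)
  nsg_id=$(az network nic show --id "$nic_id" \
    --query networkSecurityGroup.id -o tsv)
  # Delete the VM first, then the resources it no longer holds.
  az vm delete -g "$group" -n "$name" --yes
  az network nic delete --id "$nic_id"
  az disk delete --id "$disk_id" --yes
  az network nsg delete --id "$nsg_id"
}

teardown_log=$(delete_vm_and_deps OSM_CICD_GROUP k8stest202311210832)
echo "$teardown_log"
```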
[Pipeline] echo
Destroy the OSM host
[Pipeline] sh
[azure_robot_tests] Running shell script
+ . /robot-systest/results/osm_environment.rc
+ export CLOUD_TYPE=azure
+ export OSM_HOSTNAME=172.21.23.5
+ export OSM_IMAGE_NAME=osmtest202311210839
+ export JUJU_PASSWORD=ffbc367bd0d177fa0d55700106f8ef93
+ /robot-systest/cloud-scripts/delete-osm-vm.sh
++ az vm show --resource-group OSM_CICD_GROUP --name osmtest202311210839 --query 'networkProfile.networkInterfaces[0].id'
+ INTERFACE_ID='"/subscriptions/8fb7e78d-097b-413d-bc65-41d29be6bab1/resourceGroups/OSM_CICD_GROUP/providers/Microsoft.Network/networkInterfaces/osmtest202311210839VMNic"'
+ INTERFACE_ID=/subscriptions/8fb7e78d-097b-413d-bc65-41d29be6bab1/resourceGroups/OSM_CICD_GROUP/providers/Microsoft.Network/networkInterfaces/osmtest202311210839VMNic
++ az vm show --resource-group OSM_CICD_GROUP --name osmtest202311210839 --query storageProfile.osDisk.managedDisk.id
+ OS_DISK_ID='"/subscriptions/8fb7e78d-097b-413d-bc65-41d29be6bab1/resourceGroups/OSM_CICD_GROUP/providers/Microsoft.Compute/disks/osmtest202311210839_OsDisk_1_b01a0c77aff64950a54bb3cb7e932015"'
+ OS_DISK_ID=/subscriptions/8fb7e78d-097b-413d-bc65-41d29be6bab1/resourceGroups/OSM_CICD_GROUP/providers/Microsoft.Compute/disks/osmtest202311210839_OsDisk_1_b01a0c77aff64950a54bb3cb7e932015
++ az network nic show --id /subscriptions/8fb7e78d-097b-413d-bc65-41d29be6bab1/resourceGroups/OSM_CICD_GROUP/providers/Microsoft.Network/networkInterfaces/osmtest202311210839VMNic --query networkSecurityGroup.id
+ SECURITY_GROUP_ID='"/subscriptions/8fb7e78d-097b-413d-bc65-41d29be6bab1/resourceGroups/OSM_CICD_GROUP/providers/Microsoft.Network/networkSecurityGroups/osmtest202311210839NSG"'
+ SECURITY_GROUP_ID=/subscriptions/8fb7e78d-097b-413d-bc65-41d29be6bab1/resourceGroups/OSM_CICD_GROUP/providers/Microsoft.Network/networkSecurityGroups/osmtest202311210839NSG
++ az network nic show --id /subscriptions/8fb7e78d-097b-413d-bc65-41d29be6bab1/resourceGroups/OSM_CICD_GROUP/providers/Microsoft.Network/networkInterfaces/osmtest202311210839VMNic --query 'ipConfigurations[0].publicIpAddress.id'
+ PUBLIC_IP_ID=
+ PUBLIC_IP_ID=
+ az vm delete --resource-group OSM_CICD_GROUP --name osmtest202311210839 --yes
+ az network nic delete --id /subscriptions/8fb7e78d-097b-413d-bc65-41d29be6bab1/resourceGroups/OSM_CICD_GROUP/providers/Microsoft.Network/networkInterfaces/osmtest202311210839VMNic
+ az disk delete --id /subscriptions/8fb7e78d-097b-413d-bc65-41d29be6bab1/resourceGroups/OSM_CICD_GROUP/providers/Microsoft.Compute/disks/osmtest202311210839_OsDisk_1_b01a0c77aff64950a54bb3cb7e932015 --yes
+ az network nsg delete --id /subscriptions/8fb7e78d-097b-413d-bc65-41d29be6bab1/resourceGroups/OSM_CICD_GROUP/providers/Microsoft.Network/networkSecurityGroups/osmtest202311210839NSG
+ '[' -n '' ']'
[Pipeline] sh
[azure_robot_tests] Running shell script
+ az vm list -o table
Name                 ResourceGroup       Location    Zones
-------------------  ------------------  ----------  -------
osmtest202311201121  OSM_CICD_GROUP      westeurope
vm-CICD-Host         OSM_CICD_GROUP      westeurope  1
vm-VPN-Host          OSM_GROUP           westeurope
VPN-Gateway          OSM_GROUP           westeurope
vm-Hackfest-Host     OSM_HACKFEST_GROUP  westeurope
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
$ docker stop --time=1 3fa63d4a331f3495c5062cb5cce8100cd64009f1f3aec8b9caba9c6ddfeea1b6
$ docker rm -f 3fa63d4a331f3495c5062cb5cce8100cd64009f1f3aec8b9caba9c6ddfeea1b6
[Pipeline] // withDockerContainer
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS