*
* ==> Audit <==
* |---------|-------------------------------------------------|----------|------|---------|--------------------------------|--------------------------------|
| Command | Args | Profile | User | Version | Start Time | End Time |
|---------|-------------------------------------------------|----------|------|---------|--------------------------------|--------------------------------|
| ip | | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:11:22 NZST | Tue, 07 Sep 2021 16:11:23 NZST |
| ip | | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:11:41 NZST | Tue, 07 Sep 2021 16:11:41 NZST |
| ip | | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:11:54 NZST | Tue, 07 Sep 2021 16:11:55 NZST |
| ip | | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:12:04 NZST | Tue, 07 Sep 2021 16:12:05 NZST |
| ip | | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:12:18 NZST | Tue, 07 Sep 2021 16:12:19 NZST |
| ip | | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:12:24 NZST | Tue, 07 Sep 2021 16:12:24 NZST |
| ip | | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:12:32 NZST | Tue, 07 Sep 2021 16:12:32 NZST |
| ip | | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:13:38 NZST | Tue, 07 Sep 2021 16:13:39 NZST |
| ssh | | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:13:42 NZST | Tue, 07 Sep 2021 16:13:52 NZST |
| ssh | grep host.minikube.internal | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:15:11 NZST | Tue, 07 Sep 2021 16:15:12 NZST |
| | /etc/hosts | cut -f1 | | | | | |
| ip | | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:25:15 NZST | Tue, 07 Sep 2021 16:25:16 NZST |
| ssh | grep host.minikube.internal | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:27:34 NZST | Tue, 07 Sep 2021 16:27:35 NZST |
| | /etc/hosts | cut -f1 | | | | | |
| image | load | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:30:01 NZST | Tue, 07 Sep 2021 16:30:11 NZST |
| | quay.io/jetstack/cert-manager-cainjector:v1.3.1 | | | | | |
| image | load | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:30:11 NZST | Tue, 07 Sep 2021 16:30:20 NZST |
| | quay.io/jetstack/cert-manager-webhook:v1.3.1 | | | | | |
| ip | | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:32:48 NZST | Tue, 07 Sep 2021 16:32:48 NZST |
| ssh | grep host.minikube.internal | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:34:05 NZST | Tue, 07 Sep 2021 16:34:06 NZST |
| | /etc/hosts | cut -f1 | | | | | |
| ip | | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:34:33 NZST | Tue, 07 Sep 2021 16:34:34 NZST |
| addons | enable ingress-dns | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:35:00 NZST | Tue, 07 Sep 2021 16:35:01 NZST |
| ip | | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:35:45 NZST | Tue, 07 Sep 2021 16:35:45 NZST |
| tunnel | | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:36:03 NZST | Tue, 07 Sep 2021 16:36:13 NZST |
| ip | | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:36:29 NZST | Tue, 07 Sep 2021 16:36:30 NZST |
| ip | | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:46:36 NZST | Tue, 07 Sep 2021 16:46:36 NZST |
| ip | | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:51:11 NZST | Tue, 07 Sep 2021 16:51:12 NZST |
| addons | enable ingress | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:51:48 NZST | Tue, 07 Sep 2021 16:51:50 NZST |
| help | tunnerl | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:52:07 NZST | Tue, 07 Sep 2021 16:52:07 NZST |
| help | tunnel | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:52:10 NZST | Tue, 07 Sep 2021 16:52:10 NZST |
| addons | enable ingress-dns | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:55:00 NZST | Tue, 07 Sep 2021 16:55:00 NZST |
| profile | list | minikube | root | v1.23.0 | Tue, 07 Sep 2021 16:58:11 NZST | Tue, 07 Sep 2021 16:58:12 NZST |
| stop | | minikube | root | v1.23.0 | Tue, 07 Sep 2021 16:58:13 NZST | Tue, 07 Sep 2021 16:58:26 NZST |
| delete | | minikube | root | v1.23.0 | Tue, 07 Sep 2021 16:58:26 NZST | Tue, 07 Sep 2021 16:58:40 NZST |
| config | get disk-size | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:58:40 NZST | Tue, 07 Sep 2021 16:58:40 NZST |
| config | get disk-size | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:58:40 NZST | Tue, 07 Sep 2021 16:58:40 NZST |
| start | --vm=true | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 16:58:41 NZST | Tue, 07 Sep 2021 17:03:52 NZST |
| ip | | minikube | root | v1.23.0 | Tue, 07 Sep 2021 17:03:52 NZST | Tue, 07 Sep 2021 17:03:52 NZST |
| addons | configure registry-creds | minikube | root | v1.23.0 | Tue, 07 Sep 2021 17:03:53 NZST | Tue, 07 Sep 2021 17:03:53 NZST |
| addons | enable registry-creds | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 17:03:54 NZST | Tue, 07 Sep 2021 17:03:54 NZST |
| addons | enable ingress | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 17:03:54 NZST | Tue, 07 Sep 2021 17:03:55 NZST |
| addons | enable ingress-dns | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 17:03:55 NZST | Tue, 07 Sep 2021 17:03:55 NZST |
| image | load | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 19:10:50 NZST | Tue, 07 Sep 2021 19:11:00 NZST |
| | quay.io/jetstack/cert-manager-cainjector:v1.3.1 | | | | | |
| image | load | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 19:11:00 NZST | Tue, 07 Sep 2021 19:11:14 NZST |
| | quay.io/jetstack/cert-manager-webhook:v1.3.1 | | | | | |
| image | load | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 19:19:21 NZST | Tue, 07 Sep 2021 19:19:29 NZST |
| | quay.io/jetstack/cert-manager-cainjector:v1.3.1 | | | | | |
| image | load | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 19:19:29 NZST | Tue, 07 Sep 2021 19:19:36 NZST |
| | quay.io/jetstack/cert-manager-webhook:v1.3.1 | | | | | |
| ssh | grep host.minikube.internal | minikube | noah | v1.23.0 | Tue, 07 Sep 2021 19:23:44 NZST | Tue, 07 Sep 2021 19:23:45 NZST |
| | /etc/hosts | cut -f1 | | | | | |
| ssh | grep host.minikube.internal | minikube | noah | v1.23.0 | Wed, 08 Sep 2021 08:04:20 NZST | Wed, 08 Sep 2021 08:04:21 NZST |
| | /etc/hosts | cut -f1 | | | | | |
| ip | | minikube | noah | v1.23.0 | Wed, 08 Sep 2021 08:13:23 NZST | Wed, 08 Sep 2021 08:13:23 NZST |
| stop | | minikube | noah | v1.23.0 | Wed, 08 Sep 2021 08:17:18 NZST | Wed, 08 Sep 2021 08:17:26 NZST |
| delete | | minikube | noah | v1.23.0 | Wed, 08 Sep 2021 08:17:35 NZST | Wed, 08 Sep 2021 08:17:38 NZST |
| start | | minikube | noah | v1.23.0 | Wed, 08 Sep 2021 08:17:53 NZST | Wed, 08 Sep 2021 08:23:28 NZST |
| addons | enable ingress-dns | minikube | noah | v1.23.0 | Wed, 08 Sep 2021 08:24:39 NZST | Wed, 08 Sep 2021 08:24:39 NZST |
| ip | | minikube | noah | v1.23.0 | Wed, 08 Sep 2021 08:24:51 NZST | Wed, 08 Sep 2021 08:24:52 NZST |
| ip | | minikube | noah | v1.23.0 | Wed, 08 Sep 2021 08:25:45 NZST | Wed, 08 Sep 2021 08:25:45 NZST |
| ip | | minikube | noah | v1.23.0 | Wed, 08 Sep 2021 08:25:57 NZST | Wed, 08 Sep 2021 08:25:58 NZST |
| ip | | minikube | noah | v1.23.0 | Wed, 08 Sep 2021 08:28:52 NZST | Wed, 08 Sep 2021 08:28:52 NZST |
| ip | | minikube | noah | v1.23.0 | Wed, 08 Sep 2021 08:28:59 NZST | Wed, 08 Sep 2021 08:29:00 NZST |
| delete | | minikube | noah | v1.23.0 | Wed, 08 Sep 2021 08:30:42 NZST | Wed, 08 Sep 2021 08:30:53 NZST |
| start | --vm=true | minikube | noah | v1.23.0 | Wed, 08 Sep 2021 08:32:08 NZST | Wed, 08 Sep 2021 08:36:26 NZST |
| addons | enable ingress-dns | minikube | noah | v1.23.0 | Wed, 08 Sep 2021 08:41:03 NZST | Wed, 08 Sep 2021 08:41:03 NZST |
| addons | enable ingress | minikube | noah | v1.23.0 | Wed, 08 Sep 2021 08:41:06 NZST | Wed, 08 Sep 2021 08:42:42 NZST |
| ip | | minikube | noah | v1.23.0 | Wed, 08 Sep 2021 08:43:20 NZST | Wed, 08 Sep 2021 08:43:20 NZST |
| ip | | minikube | noah | v1.23.0 | Wed, 08 Sep 2021 08:46:33 NZST | Wed, 08 Sep 2021 08:46:33 NZST |
|---------|-------------------------------------------------|----------|------|---------|--------------------------------|--------------------------------|
*
* ==> Last Start <==
* Log file created at: 2021/09/08 08:32:08
Running on machine: starship
Binary: Built with gc go1.17 for darwin/amd64
Log line format: [IWEF]mmdd hh:mm:ss.uuuuuu threadid file:line] msg
I0908 08:32:08.719192 20823 out.go:298] Setting OutFile to fd 1 ...
I0908 08:32:08.719428 20823 out.go:350] isatty.IsTerminal(1) = true
I0908 08:32:08.719431 20823 out.go:311] Setting ErrFile to fd 2...
I0908 08:32:08.719435 20823 out.go:350] isatty.IsTerminal(2) = true
I0908 08:32:08.719532 20823 root.go:313] Updating PATH: /Users/noah/.minikube/bin
I0908 08:32:08.719926 20823 out.go:305] Setting JSON to false
I0908 08:32:08.765529 20823 start.go:111] hostinfo: {"hostname":"starship.localdomain","uptime":778418,"bootTime":1630268310,"procs":836,"os":"darwin","platform":"darwin","platformFamily":"Standalone Workstation","platformVersion":"11.5.2","kernelVersion":"20.6.0","kernelArch":"x86_64","virtualizationSystem":"","virtualizationRole":"","hostId":"fecbf22b-fbbe-36de-9664-f12a7dd41d3d"}
W0908 08:32:08.765661 20823 start.go:119] gopshost.Virtualization returned error: not implemented yet
I0908 08:32:08.785993 20823 out.go:177] 😄  minikube v1.23.0 on Darwin 11.5.2
I0908 08:32:08.786238 20823 notify.go:169] Checking for updates...
I0908 08:32:08.786959 20823 driver.go:343] Setting default libvirt URI to qemu:///system
I0908 08:32:08.787014 20823 global.go:111] Querying for installed drivers using PATH=/Users/noah/.minikube/bin:/Users/noah/.nvm/versions/node/v15.8.0/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin:/Users/noah/.local/bin:/Users/noah/Library/Python/3.9/bin:/Users/noah/dev/misc-scripts/bin
I0908 08:32:08.813657 20823 global.go:119] hyperkit default: true priority: 8, state: {Installed:true Healthy:true Running:true NeedsImprovement:false Error: Reason: Fix: Doc:}
I0908 08:32:08.813847 20823 global.go:119] parallels default: true priority: 7, state: {Installed:false Healthy:false Running:false NeedsImprovement:false Error:exec: "prlctl": executable file not found in $PATH Reason: Fix:Install Parallels Desktop for Mac Doc:https://minikube.sigs.k8s.io/docs/drivers/parallels/}
I0908 08:32:08.814486 20823 global.go:119] podman default: true priority: 3, state: {Installed:false Healthy:false Running:false NeedsImprovement:false Error:exec: "podman": executable file not found in $PATH Reason: Fix:Install Podman Doc:https://minikube.sigs.k8s.io/docs/drivers/podman/}
I0908 08:32:08.814502 20823 global.go:119] ssh default: false priority: 4, state: {Installed:true Healthy:true Running:false NeedsImprovement:false Error: Reason: Fix: Doc:}
I0908 08:32:08.814597 20823 global.go:119] virtualbox default: true priority: 6, state: {Installed:false Healthy:false Running:false NeedsImprovement:false Error:unable to find VBoxManage in $PATH Reason: Fix:Install VirtualBox Doc:https://minikube.sigs.k8s.io/docs/reference/drivers/virtualbox/}
I0908 08:32:08.814678 20823 global.go:119] vmware default: true priority: 7, state: {Installed:false Healthy:false Running:false NeedsImprovement:false Error:exec: "docker-machine-driver-vmware": executable file not found in $PATH Reason: Fix:Install docker-machine-driver-vmware Doc:https://minikube.sigs.k8s.io/docs/reference/drivers/vmware/}
I0908 08:32:08.814699 20823 global.go:119] vmwarefusion default: false priority: 1, state: {Installed:false Healthy:false Running:false NeedsImprovement:false Error:the 'vmwarefusion' driver is no longer available Reason: Fix:Switch to the newer 'vmware' driver by using '--driver=vmware'.
This may require first deleting your existing cluster Doc:https://minikube.sigs.k8s.io/docs/drivers/vmware/} I0908 08:32:08.971162 20823 docker.go:132] docker version: linux-20.10.8 I0908 08:32:08.971282 20823 cli_runner.go:115] Run: docker system info --format "{{json .}}" I0908 08:32:09.603933 20823 info.go:263] docker info: {ID:F2OM:D4CD:QY7U:QEBQ:WC7B:5ZVW:CFBJ:O34G:RVM3:HKTF:RCNO:4BOJ Containers:36 ContainersRunning:0 ContainersPaused:0 ContainersStopped:36 Images:191 Driver:overlay2 DriverStatus:[[Backing Filesystem extfs] [Supports d_type true] [Native Overlay Diff true] [userxattr false]] SystemStatus: Plugins:{Volume:[local] Network:[bridge host ipvlan macvlan null overlay] Authorization: Log:[awslogs fluentd gcplogs gelf journald json-file local logentries splunk syslog]} MemoryLimit:true SwapLimit:true KernelMemory:true KernelMemoryTCP:true CPUCfsPeriod:true CPUCfsQuota:true CPUShares:true CPUSet:true PidsLimit:true IPv4Forwarding:true BridgeNfIptables:true BridgeNfIP6Tables:true Debug:false NFd:44 OomKillDisable:true NGoroutines:47 SystemTime:2021-09-07 20:32:09.1223312 +0000 UTC LoggingDriver:json-file CgroupDriver:cgroupfs NEventsListener:4 KernelVersion:5.10.47-linuxkit OperatingSystem:Docker Desktop OSType:linux Architecture:x86_64 IndexServerAddress:https://index.docker.io/v1/ RegistryConfig:{AllowNondistributableArtifactsCIDRs:[] AllowNondistributableArtifactsHostnames:[] InsecureRegistryCIDRs:[127.0.0.0/8] IndexConfigs:{DockerIo:{Name:docker.io Mirrors:[] Secure:true Official:true}} Mirrors:[]} NCPU:8 MemTotal:16789286912 GenericResources: DockerRootDir:/var/lib/docker HTTPProxy:http.docker.internal:3128 HTTPSProxy:http.docker.internal:3128 NoProxy: Name:docker-desktop Labels:[] ExperimentalBuild:false ServerVersion:20.10.8 ClusterStore: ClusterAdvertise: Runtimes:{Runc:{Path:runc}} DefaultRuntime:runc Swarm:{NodeID: NodeAddr: LocalNodeState:inactive ControlAvailable:false Error: RemoteManagers:} LiveRestoreEnabled:false Isolation: InitBinary:docker-init ContainerdCommit:{ID:e25210fe30a0a703442421b0f60afac609f950a3 Expected:e25210fe30a0a703442421b0f60afac609f950a3} RuncCommit:{ID:v1.0.1-0-g4144b63 Expected:v1.0.1-0-g4144b63} InitCommit:{ID:de40ad0 Expected:de40ad0} SecurityOptions:[name=seccomp,profile=default] ProductLicense: Warnings: ServerErrors:[] ClientInfo:{Debug:false Plugins:[map[Name:buildx Path:/usr/local/lib/docker/cli-plugins/docker-buildx SchemaVersion:0.1.0 ShortDescription:Build with BuildKit Vendor:Docker Inc. Version:v0.6.1-docker] map[Name:compose Path:/usr/local/lib/docker/cli-plugins/docker-compose SchemaVersion:0.1.0 ShortDescription:Docker Compose Vendor:Docker Inc. Version:v2.0.0-rc.1] map[Name:scan Path:/usr/local/lib/docker/cli-plugins/docker-scan SchemaVersion:0.1.0 ShortDescription:Docker Scan Vendor:Docker Inc. 
Version:v0.8.0]] Warnings:}} I0908 08:32:09.604014 20823 global.go:119] docker default: true priority: 9, state: {Installed:true Healthy:true Running:false NeedsImprovement:false Error: Reason: Fix: Doc:} I0908 08:32:09.604025 20823 driver.go:278] not recommending "ssh" due to default: false I0908 08:32:09.604039 20823 driver.go:313] Picked: hyperkit I0908 08:32:09.604042 20823 driver.go:314] Alternatives: [ssh] I0908 08:32:09.604044 20823 driver.go:315] Rejects: [parallels virtualbox vmware vmwarefusion] I0908 08:32:09.624350 20823 out.go:177] ✨ Automatically selected the hyperkit driver I0908 08:32:09.624405 20823 start.go:278] selected driver: hyperkit I0908 08:32:09.624415 20823 start.go:751] validating driver "hyperkit" against I0908 08:32:09.624435 20823 start.go:762] status for hyperkit: {Installed:true Healthy:true Running:true NeedsImprovement:false Error: Reason: Fix: Doc:} I0908 08:32:09.625312 20823 install.go:52] acquiring lock: {Name:mk4023283b30b374c3f04c8805d539e68824c0b8 Clock:{} Delay:500ms Timeout:10m0s Cancel:} I0908 08:32:09.625812 20823 install.go:117] Validating docker-machine-driver-hyperkit, PATH=/Users/noah/.minikube/bin:/Users/noah/.nvm/versions/node/v15.8.0/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin:/Users/noah/.local/bin:/Users/noah/Library/Python/3.9/bin:/Users/noah/dev/misc-scripts/bin I0908 08:32:09.679637 20823 install.go:137] /Users/noah/.minikube/bin/docker-machine-driver-hyperkit version is 1.17.1 I0908 08:32:09.688626 20823 install.go:79] stdout: /Users/noah/.minikube/bin/docker-machine-driver-hyperkit I0908 08:32:09.688642 20823 install.go:81] /Users/noah/.minikube/bin/docker-machine-driver-hyperkit looks good I0908 08:32:09.688700 20823 start_flags.go:264] no existing cluster config was found, will generate one from the flags I0908 08:32:09.689071 20823 start_flags.go:719] Wait components to verify : map[apiserver:true system_pods:true] I0908 08:32:09.689087 20823 cni.go:93] Creating CNI manager for "" I0908 08:32:09.689094 20823 cni.go:167] CNI unnecessary in this configuration, recommending no CNI I0908 08:32:09.689102 20823 start_flags.go:278] config: {Name:minikube KeepContext:false EmbedCerts:false MinikubeISO: KicBaseImage:gcr.io/k8s-minikube/kicbase:v0.0.26@sha256:d4aa14fbdc3a28a60632c24af937329ec787b02c89983c6f5498d346860a848c Memory:8192 CPUs:4 DiskSize:81920 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.99.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.22.1 ClusterName:minikube Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop: ExposedPorts:[] ListenAddress: Network: 
MultiNodeRequested:false ExtraDisks:0} I0908 08:32:09.689214 20823 iso.go:123] acquiring lock: {Name:mkee12d011877d41c3b144e274227b9da19c1a4c Clock:{} Delay:500ms Timeout:10m0s Cancel:} I0908 08:32:09.729278 20823 out.go:177] 👍 Starting control plane node minikube in cluster minikube I0908 08:32:09.729341 20823 preload.go:131] Checking if preload exists for k8s version v1.22.1 and runtime docker I0908 08:32:09.729475 20823 preload.go:147] Found local preload: /Users/noah/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v12-v1.22.1-docker-overlay2-amd64.tar.lz4 I0908 08:32:09.729538 20823 cache.go:56] Caching tarball of preloaded images I0908 08:32:09.729917 20823 preload.go:173] Found /Users/noah/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v12-v1.22.1-docker-overlay2-amd64.tar.lz4 in cache, skipping download I0908 08:32:09.729946 20823 cache.go:59] Finished verifying existence of preloaded tar for v1.22.1 on docker I0908 08:32:09.730517 20823 profile.go:148] Saving config to /Users/noah/.minikube/profiles/minikube/config.json ... I0908 08:32:09.730561 20823 lock.go:36] WriteFile acquiring /Users/noah/.minikube/profiles/minikube/config.json: {Name:mk620181772c61343fb150938d52565f3588325d Clock:{} Delay:500ms Timeout:1m0s Cancel:} I0908 08:32:09.731309 20823 cache.go:205] Successfully downloaded all kic artifacts I0908 08:32:09.731348 20823 start.go:313] acquiring machines lock for minikube: {Name:mk1835fa11daf47a7a40dfffee6b42690e90cd46 Clock:{} Delay:500ms Timeout:13m0s Cancel:} I0908 08:32:09.731466 20823 start.go:317] acquired machines lock for "minikube" in 105.569µs I0908 08:32:09.731504 20823 start.go:89] Provisioning new machine with config: &{Name:minikube KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube/iso/minikube-v1.23.0.iso KicBaseImage:gcr.io/k8s-minikube/kicbase:v0.0.26@sha256:d4aa14fbdc3a28a60632c24af937329ec787b02c89983c6f5498d346860a848c Memory:8192 CPUs:4 DiskSize:81920 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.99.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.22.1 ClusterName:minikube Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP: Port:8443 KubernetesVersion:v1.22.1 ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop: ExposedPorts:[] ListenAddress: Network: MultiNodeRequested:false ExtraDisks:0} &{Name: IP: Port:8443 KubernetesVersion:v1.22.1 ControlPlane:true Worker:true} I0908 08:32:09.731581 20823 start.go:126] createHost starting for "" (driver="hyperkit") I0908 08:32:09.769801 20823 out.go:204] 🔥 Creating hyperkit VM (CPUs=4, Memory=8192MB, 
Disk=81920MB) ... I0908 08:32:09.770565 20823 main.go:130] libmachine: Found binary path at /Users/noah/.minikube/bin/docker-machine-driver-hyperkit I0908 08:32:09.770632 20823 main.go:130] libmachine: Launching plugin server for driver hyperkit I0908 08:32:09.783617 20823 main.go:130] libmachine: Plugin server listening at address 127.0.0.1:59956 I0908 08:32:09.786068 20823 main.go:130] libmachine: () Calling .GetVersion I0908 08:32:09.788475 20823 main.go:130] libmachine: Using API Version 1 I0908 08:32:09.788486 20823 main.go:130] libmachine: () Calling .SetConfigRaw I0908 08:32:09.789991 20823 main.go:130] libmachine: () Calling .GetMachineName I0908 08:32:09.790156 20823 main.go:130] libmachine: (minikube) Calling .GetMachineName I0908 08:32:09.790283 20823 main.go:130] libmachine: (minikube) Calling .DriverName I0908 08:32:09.790419 20823 start.go:160] libmachine.API.Create for "minikube" (driver="hyperkit") I0908 08:32:09.790461 20823 client.go:168] LocalClient.Create starting I0908 08:32:09.790571 20823 main.go:130] libmachine: Reading certificate data from /Users/noah/.minikube/certs/ca.pem I0908 08:32:09.790915 20823 main.go:130] libmachine: Decoding PEM data... I0908 08:32:09.790952 20823 main.go:130] libmachine: Parsing certificate... I0908 08:32:09.791054 20823 main.go:130] libmachine: Reading certificate data from /Users/noah/.minikube/certs/cert.pem I0908 08:32:09.791373 20823 main.go:130] libmachine: Decoding PEM data... I0908 08:32:09.791414 20823 main.go:130] libmachine: Parsing certificate... I0908 08:32:09.791430 20823 main.go:130] libmachine: Running pre-create checks... I0908 08:32:09.791437 20823 main.go:130] libmachine: (minikube) Calling .PreCreateCheck I0908 08:32:09.791607 20823 main.go:130] libmachine: (minikube) DBG | exe=/Users/noah/.minikube/bin/docker-machine-driver-hyperkit uid=0 I0908 08:32:09.791839 20823 main.go:130] libmachine: (minikube) Calling .GetConfigRaw I0908 08:32:09.792689 20823 main.go:130] libmachine: Creating machine... I0908 08:32:09.792696 20823 main.go:130] libmachine: (minikube) Calling .Create I0908 08:32:09.792807 20823 main.go:130] libmachine: (minikube) DBG | exe=/Users/noah/.minikube/bin/docker-machine-driver-hyperkit uid=0 I0908 08:32:09.793311 20823 main.go:130] libmachine: (minikube) DBG | I0908 08:32:09.792789 20841 common.go:101] Making disk image using store path: /Users/noah/.minikube I0908 08:32:09.793390 20823 main.go:130] libmachine: (minikube) Downloading /Users/noah/.minikube/cache/boot2docker.iso from file:///Users/noah/.minikube/cache/iso/minikube-v1.23.0.iso... I0908 08:32:10.019628 20823 main.go:130] libmachine: (minikube) DBG | I0908 08:32:10.019565 20841 common.go:108] Creating ssh key: /Users/noah/.minikube/machines/minikube/id_rsa... I0908 08:32:10.127686 20823 main.go:130] libmachine: (minikube) DBG | I0908 08:32:10.127594 20841 common.go:114] Creating raw disk image: /Users/noah/.minikube/machines/minikube/minikube.rawdisk... I0908 08:32:10.127695 20823 main.go:130] libmachine: (minikube) DBG | Writing magic tar header I0908 08:32:10.127981 20823 main.go:130] libmachine: (minikube) DBG | Writing SSH key tar header I0908 08:32:10.128606 20823 main.go:130] libmachine: (minikube) DBG | I0908 08:32:10.128527 20841 common.go:128] Fixing permissions on /Users/noah/.minikube/machines/minikube ... 
I0908 08:32:10.302552 20823 main.go:130] libmachine: (minikube) DBG | exe=/Users/noah/.minikube/bin/docker-machine-driver-hyperkit uid=0 I0908 08:32:10.302568 20823 main.go:130] libmachine: (minikube) DBG | clean start, hyperkit pid file doesn't exist: /Users/noah/.minikube/machines/minikube/hyperkit.pid I0908 08:32:10.302604 20823 main.go:130] libmachine: (minikube) DBG | Using UUID ace7a274-101a-11ec-a102-acde48001122 I0908 08:32:10.643079 20823 main.go:130] libmachine: (minikube) DBG | Generated MAC c6:37:a0:ac:f6:0 I0908 08:32:10.643106 20823 main.go:130] libmachine: (minikube) DBG | Starting with cmdline: loglevel=3 console=ttyS0 console=tty0 noembed nomodeset norestore waitusb=10 systemd.legacy_systemd_cgroup_controller=yes random.trust_cpu=on hw_rng_model=virtio base host=minikube I0908 08:32:10.645511 20823 main.go:130] libmachine: (minikube) DBG | Attempt 0 I0908 08:32:10.645525 20823 main.go:130] libmachine: (minikube) DBG | exe=/Users/noah/.minikube/bin/docker-machine-driver-hyperkit uid=0 I0908 08:32:10.645728 20823 main.go:130] libmachine: (minikube) DBG | hyperkit pid from json: 20844 I0908 08:32:10.649067 20823 main.go:130] libmachine: (minikube) DBG | Searching for c6:37:a0:ac:f6:0 in /var/db/dhcpd_leases ... I0908 08:32:10.649903 20823 main.go:130] libmachine: (minikube) DBG | Found 34 entries in /var/db/dhcpd_leases! I0908 08:32:10.649916 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:ae:82:8f:a1:7c:9d ID:1,ae:82:8f:a1:7c:9d Lease:0x61384b00} I0908 08:32:10.649939 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:c2:ac:f1:f7:42:5a ID:1,c2:ac:f1:f7:42:5a Lease:0x613830db} I0908 08:32:10.649946 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:d6:3c:dd:ea:6b:5f ID:1,d6:3c:dd:ea:6b:5f Lease:0x6128736d} I0908 08:32:10.649953 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:fa:43:ee:fe:aa:47 ID:1,fa:43:ee:fe:aa:47 Lease:0x61296bd1} I0908 08:32:10.649958 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:6e:23:b4:35:92:8e ID:1,6e:23:b4:35:92:8e Lease:0x611b1538} I0908 08:32:10.649964 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:c6:66:fd:d5:25:dc ID:1,c6:66:fd:d5:25:dc Lease:0x6109beb3} I0908 08:32:10.649973 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:42:16:71:a3:e2:59 ID:1,42:16:71:a3:e2:59 Lease:0x6109bdb1} I0908 08:32:10.649986 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:2e:12:32:da:2e:fe ID:1,2e:12:32:da:2e:fe Lease:0x61086c20} I0908 08:32:10.649994 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:82:e1:93:2c:38:3 ID:1,82:e1:93:2c:38:3 Lease:0x6109ad1d} I0908 08:32:10.650002 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:6e:bc:1a:97:37:49 ID:1,6e:bc:1a:97:37:49 Lease:0x60f8bcd9} I0908 08:32:10.650007 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:82:b:5e:e4:bd:df ID:1,82:b:5e:e4:bd:df Lease:0x60f8ace7} I0908 08:32:10.650017 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube 
IPAddress:192.168.64.24 HWAddress:aa:4e:38:b6:c7:1d ID:1,aa:4e:38:b6:c7:1d Lease:0x60f881c3} I0908 08:32:10.650023 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:32:c8:45:59:88:6f ID:1,32:c8:45:59:88:6f Lease:0x609d9bbf} I0908 08:32:10.650029 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:62:9b:7b:7:e5:2 ID:1,62:9b:7b:7:e5:2 Lease:0x608a3191} I0908 08:32:10.650040 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:1e:a7:17:fb:a9:b ID:1,1e:a7:17:fb:a9:b Lease:0x60617c59} I0908 08:32:10.650046 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:d6:d2:56:21:24:d8 ID:1,d6:d2:56:21:24:d8 Lease:0x60591404} I0908 08:32:10.650051 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:6e:ae:ae:3d:62:6a ID:1,6e:ae:ae:3d:62:6a Lease:0x60500e55} I0908 08:32:10.650057 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:8a:10:91:83:42:85 ID:1,8a:10:91:83:42:85 Lease:0x604ebcb5} I0908 08:32:10.650067 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:b2:b4:93:17:d5:b8 ID:1,b2:b4:93:17:d5:b8 Lease:0x604eba8b} I0908 08:32:10.650073 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:62:27:be:29:f9:91 ID:1,62:27:be:29:f9:91 Lease:0x604eba5b} I0908 08:32:10.650082 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:46:6d:f4:8f:a5:33 ID:1,46:6d:f4:8f:a5:33 Lease:0x604eb9f4} I0908 08:32:10.650088 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:fa:e9:22:4e:6d:5e ID:1,fa:e9:22:4e:6d:5e Lease:0x604eb784} I0908 08:32:10.650093 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:ce:35:ec:a7:c0:aa ID:1,ce:35:ec:a7:c0:aa Lease:0x604eb755} I0908 08:32:10.650102 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:26:94:46:30:e6:40 ID:1,26:94:46:30:e6:40 Lease:0x604eb702} I0908 08:32:10.650109 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:8e:17:ef:de:42:21 ID:1,8e:17:ef:de:42:21 Lease:0x604eb6d0} I0908 08:32:10.650120 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:e2:f2:63:e7:99:99 ID:1,e2:f2:63:e7:99:99 Lease:0x604eb69f} I0908 08:32:10.650128 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:ae:bd:a2:10:3f:c3 ID:1,ae:bd:a2:10:3f:c3 Lease:0x604ead78} I0908 08:32:10.650136 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:7e:71:b3:32:dc:e4 ID:1,7e:71:b3:32:dc:e4 Lease:0x604eac91} I0908 08:32:10.650143 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:92:4b:89:ab:e8:34 ID:1,92:4b:89:ab:e8:34 Lease:0x604ea9be} I0908 08:32:10.650149 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:b2:27:3a:9c:df:e0 ID:1,b2:27:3a:9c:df:e0 Lease:0x604ea98f} I0908 08:32:10.650162 20823 main.go:130] 
libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:8a:1d:c4:b6:af:6a ID:1,8a:1d:c4:b6:af:6a Lease:0x604ff8e9} I0908 08:32:10.650171 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:f6:b0:3a:f0:b6:67 ID:1,f6:b0:3a:f0:b6:67 Lease:0x604c6934} I0908 08:32:10.650177 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:76:16:70:93:1a:a1 ID:1,76:16:70:93:1a:a1 Lease:0x604b176b} I0908 08:32:10.650182 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:8e:93:b6:dd:b2:5 ID:1,8e:93:b6:dd:b2:5 Lease:0x604acead} I0908 08:32:10.651773 20823 main.go:130] libmachine: (minikube) DBG | 2021/09/08 08:32:10 Using fd 5 for I/O notifications I0908 08:32:10.709514 20823 main.go:130] libmachine: (minikube) DBG | 2021/09/08 08:32:10 /Users/noah/.minikube/machines/minikube/boot2docker.iso: fcntl(F_PUNCHHOLE) Operation not permitted: block device will not support TRIM/DISCARD I0908 08:32:10.709677 20823 main.go:130] libmachine: (minikube) DBG | 2021/09/08 08:32:10 vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0 I0908 08:32:10.709686 20823 main.go:130] libmachine: (minikube) DBG | 2021/09/08 08:32:10 vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0 I0908 08:32:10.709696 20823 main.go:130] libmachine: (minikube) DBG | 2021/09/08 08:32:10 vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0 I0908 08:32:11.342569 20823 main.go:130] libmachine: (minikube) DBG | 2021/09/08 08:32:11 rdmsr to register 0x140 on vcpu 0 I0908 08:32:11.453526 20823 main.go:130] libmachine: (minikube) DBG | 2021/09/08 08:32:11 vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0 I0908 08:32:11.453573 20823 main.go:130] libmachine: (minikube) DBG | 2021/09/08 08:32:11 vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0 I0908 08:32:11.453586 20823 main.go:130] libmachine: (minikube) DBG | 2021/09/08 08:32:11 vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0 I0908 08:32:11.454256 20823 main.go:130] libmachine: (minikube) DBG | 2021/09/08 08:32:11 rdmsr to register 0x140 on vcpu 1 I0908 08:32:11.533910 20823 main.go:130] libmachine: (minikube) DBG | 2021/09/08 08:32:11 vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0 I0908 08:32:11.533922 20823 main.go:130] libmachine: (minikube) DBG | 2021/09/08 08:32:11 vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0 I0908 08:32:11.533977 20823 main.go:130] libmachine: (minikube) DBG | 2021/09/08 08:32:11 vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0 I0908 08:32:11.534906 20823 main.go:130] libmachine: (minikube) DBG | 2021/09/08 08:32:11 rdmsr to register 0x140 on vcpu 2 I0908 08:32:11.615036 20823 main.go:130] libmachine: (minikube) DBG | 2021/09/08 08:32:11 vmx_set_ctlreg: cap_field: 4 bit: 12 unspecified don't care: bit is 0 I0908 08:32:11.615073 20823 main.go:130] libmachine: (minikube) DBG | 2021/09/08 08:32:11 vmx_set_ctlreg: cap_field: 4 bit: 20 unspecified don't care: bit is 0 I0908 08:32:11.615147 20823 main.go:130] libmachine: (minikube) DBG | 2021/09/08 08:32:11 vmx_set_ctlreg: cap_field: 3 bit: 13 unspecified don't care: bit is 0 I0908 08:32:11.615947 20823 main.go:130] libmachine: (minikube) DBG | 2021/09/08 08:32:11 rdmsr to register 0x140 on vcpu 3 I0908 08:32:12.652236 20823 main.go:130] libmachine: (minikube) DBG 
| Attempt 1 I0908 08:32:12.652271 20823 main.go:130] libmachine: (minikube) DBG | exe=/Users/noah/.minikube/bin/docker-machine-driver-hyperkit uid=0 I0908 08:32:12.652475 20823 main.go:130] libmachine: (minikube) DBG | hyperkit pid from json: 20844 I0908 08:32:12.655972 20823 main.go:130] libmachine: (minikube) DBG | Searching for c6:37:a0:ac:f6:0 in /var/db/dhcpd_leases ... I0908 08:32:12.656098 20823 main.go:130] libmachine: (minikube) DBG | Found 34 entries in /var/db/dhcpd_leases! I0908 08:32:12.656107 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:ae:82:8f:a1:7c:9d ID:1,ae:82:8f:a1:7c:9d Lease:0x61384b00} I0908 08:32:12.656120 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:c2:ac:f1:f7:42:5a ID:1,c2:ac:f1:f7:42:5a Lease:0x613830db} I0908 08:32:12.656130 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:d6:3c:dd:ea:6b:5f ID:1,d6:3c:dd:ea:6b:5f Lease:0x6128736d} I0908 08:32:12.656145 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:fa:43:ee:fe:aa:47 ID:1,fa:43:ee:fe:aa:47 Lease:0x61296bd1} I0908 08:32:12.656153 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:6e:23:b4:35:92:8e ID:1,6e:23:b4:35:92:8e Lease:0x611b1538} I0908 08:32:12.656164 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:c6:66:fd:d5:25:dc ID:1,c6:66:fd:d5:25:dc Lease:0x6109beb3} I0908 08:32:12.656170 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:42:16:71:a3:e2:59 ID:1,42:16:71:a3:e2:59 Lease:0x6109bdb1} I0908 08:32:12.656176 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:2e:12:32:da:2e:fe ID:1,2e:12:32:da:2e:fe Lease:0x61086c20} I0908 08:32:12.656181 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:82:e1:93:2c:38:3 ID:1,82:e1:93:2c:38:3 Lease:0x6109ad1d} I0908 08:32:12.656197 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:6e:bc:1a:97:37:49 ID:1,6e:bc:1a:97:37:49 Lease:0x60f8bcd9} I0908 08:32:12.656227 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:82:b:5e:e4:bd:df ID:1,82:b:5e:e4:bd:df Lease:0x60f8ace7} I0908 08:32:12.656234 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:aa:4e:38:b6:c7:1d ID:1,aa:4e:38:b6:c7:1d Lease:0x60f881c3} I0908 08:32:12.656241 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:32:c8:45:59:88:6f ID:1,32:c8:45:59:88:6f Lease:0x609d9bbf} I0908 08:32:12.656265 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:62:9b:7b:7:e5:2 ID:1,62:9b:7b:7:e5:2 Lease:0x608a3191} I0908 08:32:12.656270 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:1e:a7:17:fb:a9:b ID:1,1e:a7:17:fb:a9:b Lease:0x60617c59} I0908 08:32:12.656276 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:d6:d2:56:21:24:d8 ID:1,d6:d2:56:21:24:d8 Lease:0x60591404} I0908 08:32:12.656303 20823 
main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:6e:ae:ae:3d:62:6a ID:1,6e:ae:ae:3d:62:6a Lease:0x60500e55} I0908 08:32:12.656308 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:8a:10:91:83:42:85 ID:1,8a:10:91:83:42:85 Lease:0x604ebcb5} I0908 08:32:12.656340 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:b2:b4:93:17:d5:b8 ID:1,b2:b4:93:17:d5:b8 Lease:0x604eba8b} I0908 08:32:12.656375 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:62:27:be:29:f9:91 ID:1,62:27:be:29:f9:91 Lease:0x604eba5b} I0908 08:32:12.656382 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:46:6d:f4:8f:a5:33 ID:1,46:6d:f4:8f:a5:33 Lease:0x604eb9f4} I0908 08:32:12.656408 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:fa:e9:22:4e:6d:5e ID:1,fa:e9:22:4e:6d:5e Lease:0x604eb784} I0908 08:32:12.656434 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:ce:35:ec:a7:c0:aa ID:1,ce:35:ec:a7:c0:aa Lease:0x604eb755} I0908 08:32:12.656439 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:26:94:46:30:e6:40 ID:1,26:94:46:30:e6:40 Lease:0x604eb702} I0908 08:32:12.656443 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:8e:17:ef:de:42:21 ID:1,8e:17:ef:de:42:21 Lease:0x604eb6d0} I0908 08:32:12.656476 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:e2:f2:63:e7:99:99 ID:1,e2:f2:63:e7:99:99 Lease:0x604eb69f} I0908 08:32:12.656508 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:ae:bd:a2:10:3f:c3 ID:1,ae:bd:a2:10:3f:c3 Lease:0x604ead78} I0908 08:32:12.656537 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:7e:71:b3:32:dc:e4 ID:1,7e:71:b3:32:dc:e4 Lease:0x604eac91} I0908 08:32:12.656543 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:92:4b:89:ab:e8:34 ID:1,92:4b:89:ab:e8:34 Lease:0x604ea9be} I0908 08:32:12.656551 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:b2:27:3a:9c:df:e0 ID:1,b2:27:3a:9c:df:e0 Lease:0x604ea98f} I0908 08:32:12.656559 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:8a:1d:c4:b6:af:6a ID:1,8a:1d:c4:b6:af:6a Lease:0x604ff8e9} I0908 08:32:12.656585 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:f6:b0:3a:f0:b6:67 ID:1,f6:b0:3a:f0:b6:67 Lease:0x604c6934} I0908 08:32:12.656614 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:76:16:70:93:1a:a1 ID:1,76:16:70:93:1a:a1 Lease:0x604b176b} I0908 08:32:12.656647 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:8e:93:b6:dd:b2:5 ID:1,8e:93:b6:dd:b2:5 Lease:0x604acead} I0908 08:32:14.657861 20823 main.go:130] libmachine: (minikube) DBG | Attempt 2 I0908 08:32:14.657870 20823 main.go:130] libmachine: (minikube) DBG | 
exe=/Users/noah/.minikube/bin/docker-machine-driver-hyperkit uid=0 I0908 08:32:14.658034 20823 main.go:130] libmachine: (minikube) DBG | hyperkit pid from json: 20844 I0908 08:32:14.660388 20823 main.go:130] libmachine: (minikube) DBG | Searching for c6:37:a0:ac:f6:0 in /var/db/dhcpd_leases ... I0908 08:32:14.660505 20823 main.go:130] libmachine: (minikube) DBG | Found 34 entries in /var/db/dhcpd_leases! I0908 08:32:14.660533 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:ae:82:8f:a1:7c:9d ID:1,ae:82:8f:a1:7c:9d Lease:0x61384b00} I0908 08:32:14.660548 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:c2:ac:f1:f7:42:5a ID:1,c2:ac:f1:f7:42:5a Lease:0x613830db} I0908 08:32:14.660556 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:d6:3c:dd:ea:6b:5f ID:1,d6:3c:dd:ea:6b:5f Lease:0x6128736d} I0908 08:32:14.660562 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:fa:43:ee:fe:aa:47 ID:1,fa:43:ee:fe:aa:47 Lease:0x61296bd1} I0908 08:32:14.660567 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:6e:23:b4:35:92:8e ID:1,6e:23:b4:35:92:8e Lease:0x611b1538} I0908 08:32:14.660571 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:c6:66:fd:d5:25:dc ID:1,c6:66:fd:d5:25:dc Lease:0x6109beb3} I0908 08:32:14.660577 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:42:16:71:a3:e2:59 ID:1,42:16:71:a3:e2:59 Lease:0x6109bdb1} I0908 08:32:14.660591 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:2e:12:32:da:2e:fe ID:1,2e:12:32:da:2e:fe Lease:0x61086c20} I0908 08:32:14.660605 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:82:e1:93:2c:38:3 ID:1,82:e1:93:2c:38:3 Lease:0x6109ad1d} I0908 08:32:14.660613 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:6e:bc:1a:97:37:49 ID:1,6e:bc:1a:97:37:49 Lease:0x60f8bcd9} I0908 08:32:14.660626 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:82:b:5e:e4:bd:df ID:1,82:b:5e:e4:bd:df Lease:0x60f8ace7} I0908 08:32:14.660636 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:aa:4e:38:b6:c7:1d ID:1,aa:4e:38:b6:c7:1d Lease:0x60f881c3} I0908 08:32:14.660643 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:32:c8:45:59:88:6f ID:1,32:c8:45:59:88:6f Lease:0x609d9bbf} I0908 08:32:14.660651 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:62:9b:7b:7:e5:2 ID:1,62:9b:7b:7:e5:2 Lease:0x608a3191} I0908 08:32:14.660656 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:1e:a7:17:fb:a9:b ID:1,1e:a7:17:fb:a9:b Lease:0x60617c59} I0908 08:32:14.660661 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:d6:d2:56:21:24:d8 ID:1,d6:d2:56:21:24:d8 Lease:0x60591404} I0908 08:32:14.660671 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube 
IPAddress:192.168.64.19 HWAddress:6e:ae:ae:3d:62:6a ID:1,6e:ae:ae:3d:62:6a Lease:0x60500e55} I0908 08:32:14.660679 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:8a:10:91:83:42:85 ID:1,8a:10:91:83:42:85 Lease:0x604ebcb5} I0908 08:32:14.660684 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:b2:b4:93:17:d5:b8 ID:1,b2:b4:93:17:d5:b8 Lease:0x604eba8b} I0908 08:32:14.660691 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:62:27:be:29:f9:91 ID:1,62:27:be:29:f9:91 Lease:0x604eba5b} I0908 08:32:14.660697 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:46:6d:f4:8f:a5:33 ID:1,46:6d:f4:8f:a5:33 Lease:0x604eb9f4} I0908 08:32:14.660704 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:fa:e9:22:4e:6d:5e ID:1,fa:e9:22:4e:6d:5e Lease:0x604eb784} I0908 08:32:14.660709 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:ce:35:ec:a7:c0:aa ID:1,ce:35:ec:a7:c0:aa Lease:0x604eb755} I0908 08:32:14.660716 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:26:94:46:30:e6:40 ID:1,26:94:46:30:e6:40 Lease:0x604eb702} I0908 08:32:14.660721 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:8e:17:ef:de:42:21 ID:1,8e:17:ef:de:42:21 Lease:0x604eb6d0} I0908 08:32:14.660736 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:e2:f2:63:e7:99:99 ID:1,e2:f2:63:e7:99:99 Lease:0x604eb69f} I0908 08:32:14.660741 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:ae:bd:a2:10:3f:c3 ID:1,ae:bd:a2:10:3f:c3 Lease:0x604ead78} I0908 08:32:14.660747 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:7e:71:b3:32:dc:e4 ID:1,7e:71:b3:32:dc:e4 Lease:0x604eac91} I0908 08:32:14.660754 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:92:4b:89:ab:e8:34 ID:1,92:4b:89:ab:e8:34 Lease:0x604ea9be} I0908 08:32:14.660761 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:b2:27:3a:9c:df:e0 ID:1,b2:27:3a:9c:df:e0 Lease:0x604ea98f} I0908 08:32:14.660767 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:8a:1d:c4:b6:af:6a ID:1,8a:1d:c4:b6:af:6a Lease:0x604ff8e9} I0908 08:32:14.660773 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:f6:b0:3a:f0:b6:67 ID:1,f6:b0:3a:f0:b6:67 Lease:0x604c6934} I0908 08:32:14.660778 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:76:16:70:93:1a:a1 ID:1,76:16:70:93:1a:a1 Lease:0x604b176b} I0908 08:32:14.660785 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:8e:93:b6:dd:b2:5 ID:1,8e:93:b6:dd:b2:5 Lease:0x604acead} I0908 08:32:15.561916 20823 main.go:130] libmachine: (minikube) DBG | 2021/09/08 08:32:15 rdmsr to register 0x64e on vcpu 1 I0908 08:32:15.561929 20823 main.go:130] libmachine: (minikube) DBG | 2021/09/08 08:32:15 rdmsr to register 0x34 on 
vcpu 1 I0908 08:32:16.664251 20823 main.go:130] libmachine: (minikube) DBG | Attempt 3 I0908 08:32:16.664263 20823 main.go:130] libmachine: (minikube) DBG | exe=/Users/noah/.minikube/bin/docker-machine-driver-hyperkit uid=0 I0908 08:32:16.664443 20823 main.go:130] libmachine: (minikube) DBG | hyperkit pid from json: 20844 I0908 08:32:16.666485 20823 main.go:130] libmachine: (minikube) DBG | Searching for c6:37:a0:ac:f6:0 in /var/db/dhcpd_leases ... I0908 08:32:16.666619 20823 main.go:130] libmachine: (minikube) DBG | Found 34 entries in /var/db/dhcpd_leases! I0908 08:32:16.666629 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.35 HWAddress:ae:82:8f:a1:7c:9d ID:1,ae:82:8f:a1:7c:9d Lease:0x61384b00} I0908 08:32:16.666654 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.34 HWAddress:c2:ac:f1:f7:42:5a ID:1,c2:ac:f1:f7:42:5a Lease:0x613830db} I0908 08:32:16.666662 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.33 HWAddress:d6:3c:dd:ea:6b:5f ID:1,d6:3c:dd:ea:6b:5f Lease:0x6128736d} I0908 08:32:16.666710 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.32 HWAddress:fa:43:ee:fe:aa:47 ID:1,fa:43:ee:fe:aa:47 Lease:0x61296bd1} I0908 08:32:16.666737 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.31 HWAddress:6e:23:b4:35:92:8e ID:1,6e:23:b4:35:92:8e Lease:0x611b1538} I0908 08:32:16.666766 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.30 HWAddress:c6:66:fd:d5:25:dc ID:1,c6:66:fd:d5:25:dc Lease:0x6109beb3} I0908 08:32:16.666771 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.29 HWAddress:42:16:71:a3:e2:59 ID:1,42:16:71:a3:e2:59 Lease:0x6109bdb1} I0908 08:32:16.666776 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.28 HWAddress:2e:12:32:da:2e:fe ID:1,2e:12:32:da:2e:fe Lease:0x61086c20} I0908 08:32:16.666781 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.27 HWAddress:82:e1:93:2c:38:3 ID:1,82:e1:93:2c:38:3 Lease:0x6109ad1d} I0908 08:32:16.666797 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.26 HWAddress:6e:bc:1a:97:37:49 ID:1,6e:bc:1a:97:37:49 Lease:0x60f8bcd9} I0908 08:32:16.666807 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.25 HWAddress:82:b:5e:e4:bd:df ID:1,82:b:5e:e4:bd:df Lease:0x60f8ace7} I0908 08:32:16.666813 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.24 HWAddress:aa:4e:38:b6:c7:1d ID:1,aa:4e:38:b6:c7:1d Lease:0x60f881c3} I0908 08:32:16.666817 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.23 HWAddress:32:c8:45:59:88:6f ID:1,32:c8:45:59:88:6f Lease:0x609d9bbf} I0908 08:32:16.666823 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.22 HWAddress:62:9b:7b:7:e5:2 ID:1,62:9b:7b:7:e5:2 Lease:0x608a3191} I0908 08:32:16.666827 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.21 HWAddress:1e:a7:17:fb:a9:b ID:1,1e:a7:17:fb:a9:b Lease:0x60617c59} I0908 08:32:16.666839 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.20 HWAddress:d6:d2:56:21:24:d8 
ID:1,d6:d2:56:21:24:d8 Lease:0x60591404} I0908 08:32:16.666850 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.19 HWAddress:6e:ae:ae:3d:62:6a ID:1,6e:ae:ae:3d:62:6a Lease:0x60500e55} I0908 08:32:16.666857 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.18 HWAddress:8a:10:91:83:42:85 ID:1,8a:10:91:83:42:85 Lease:0x604ebcb5} I0908 08:32:16.666878 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.17 HWAddress:b2:b4:93:17:d5:b8 ID:1,b2:b4:93:17:d5:b8 Lease:0x604eba8b} I0908 08:32:16.666885 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.16 HWAddress:62:27:be:29:f9:91 ID:1,62:27:be:29:f9:91 Lease:0x604eba5b} I0908 08:32:16.666891 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.15 HWAddress:46:6d:f4:8f:a5:33 ID:1,46:6d:f4:8f:a5:33 Lease:0x604eb9f4} I0908 08:32:16.666895 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.14 HWAddress:fa:e9:22:4e:6d:5e ID:1,fa:e9:22:4e:6d:5e Lease:0x604eb784} I0908 08:32:16.666923 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.13 HWAddress:ce:35:ec:a7:c0:aa ID:1,ce:35:ec:a7:c0:aa Lease:0x604eb755} I0908 08:32:16.666954 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.12 HWAddress:26:94:46:30:e6:40 ID:1,26:94:46:30:e6:40 Lease:0x604eb702} I0908 08:32:16.666960 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.11 HWAddress:8e:17:ef:de:42:21 ID:1,8e:17:ef:de:42:21 Lease:0x604eb6d0} I0908 08:32:16.666967 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.10 HWAddress:e2:f2:63:e7:99:99 ID:1,e2:f2:63:e7:99:99 Lease:0x604eb69f} I0908 08:32:16.666972 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.9 HWAddress:ae:bd:a2:10:3f:c3 ID:1,ae:bd:a2:10:3f:c3 Lease:0x604ead78} I0908 08:32:16.666979 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.8 HWAddress:7e:71:b3:32:dc:e4 ID:1,7e:71:b3:32:dc:e4 Lease:0x604eac91} I0908 08:32:16.666993 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.7 HWAddress:92:4b:89:ab:e8:34 ID:1,92:4b:89:ab:e8:34 Lease:0x604ea9be} I0908 08:32:16.667000 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.6 HWAddress:b2:27:3a:9c:df:e0 ID:1,b2:27:3a:9c:df:e0 Lease:0x604ea98f} I0908 08:32:16.667025 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.5 HWAddress:8a:1d:c4:b6:af:6a ID:1,8a:1d:c4:b6:af:6a Lease:0x604ff8e9} I0908 08:32:16.667029 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.4 HWAddress:f6:b0:3a:f0:b6:67 ID:1,f6:b0:3a:f0:b6:67 Lease:0x604c6934} I0908 08:32:16.667034 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.3 HWAddress:76:16:70:93:1a:a1 ID:1,76:16:70:93:1a:a1 Lease:0x604b176b} I0908 08:32:16.667040 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.2 HWAddress:8e:93:b6:dd:b2:5 ID:1,8e:93:b6:dd:b2:5 Lease:0x604acead} I0908 08:32:18.671696 20823 main.go:130] libmachine: (minikube) DBG | Attempt 4 I0908 08:32:18.672372 
20823 main.go:130] libmachine: (minikube) DBG | exe=/Users/noah/.minikube/bin/docker-machine-driver-hyperkit uid=0 I0908 08:32:18.672626 20823 main.go:130] libmachine: (minikube) DBG | hyperkit pid from json: 20844 I0908 08:32:18.674850 20823 main.go:130] libmachine: (minikube) DBG | Searching for c6:37:a0:ac:f6:0 in /var/db/dhcpd_leases ... I0908 08:32:18.674887 20823 main.go:130] libmachine: (minikube) DBG | Found 35 entries in /var/db/dhcpd_leases! I0908 08:32:18.674902 20823 main.go:130] libmachine: (minikube) DBG | dhcp entry: {Name:minikube IPAddress:192.168.64.36 HWAddress:c6:37:a0:ac:f6:0 ID:1,c6:37:a0:ac:f6:0 Lease:0x61391dd2} I0908 08:32:18.674910 20823 main.go:130] libmachine: (minikube) DBG | Found match: c6:37:a0:ac:f6:0 I0908 08:32:18.693327 20823 main.go:130] libmachine: (minikube) DBG | IP: 192.168.64.36 I0908 08:32:18.693392 20823 main.go:130] libmachine: (minikube) Calling .GetConfigRaw I0908 08:32:18.752566 20823 main.go:130] libmachine: (minikube) Calling .DriverName I0908 08:32:18.752917 20823 main.go:130] libmachine: (minikube) Calling .DriverName I0908 08:32:18.753167 20823 main.go:130] libmachine: Waiting for machine to be running, this may take a few minutes... I0908 08:32:18.753201 20823 main.go:130] libmachine: (minikube) Calling .GetState I0908 08:32:18.753400 20823 main.go:130] libmachine: (minikube) DBG | exe=/Users/noah/.minikube/bin/docker-machine-driver-hyperkit uid=0 I0908 08:32:18.753916 20823 main.go:130] libmachine: (minikube) DBG | hyperkit pid from json: 20844 I0908 08:32:18.756333 20823 main.go:130] libmachine: Detecting operating system of created instance... I0908 08:32:18.756338 20823 main.go:130] libmachine: Waiting for SSH to be available... I0908 08:32:18.756341 20823 main.go:130] libmachine: Getting to WaitForSSH function... I0908 08:32:18.756345 20823 main.go:130] libmachine: (minikube) Calling .GetSSHHostname I0908 08:32:18.756546 20823 main.go:130] libmachine: (minikube) Calling .GetSSHPort I0908 08:32:18.756741 20823 main.go:130] libmachine: (minikube) Calling .GetSSHKeyPath I0908 08:32:18.756899 20823 main.go:130] libmachine: (minikube) Calling .GetSSHKeyPath I0908 08:32:18.757055 20823 main.go:130] libmachine: (minikube) Calling .GetSSHUsername I0908 08:32:18.757233 20823 main.go:130] libmachine: Using SSH client type: native I0908 08:32:18.757401 20823 main.go:130] libmachine: &{{{ 0 [] [] []} docker [0x43a0020] 0x43a3100 [] 0s} 192.168.64.36 22 } I0908 08:32:18.757406 20823 main.go:130] libmachine: About to run SSH command: exit 0 I0908 08:32:19.853286 20823 main.go:130] libmachine: SSH cmd err, output: : I0908 08:32:19.853297 20823 main.go:130] libmachine: Detecting the provisioner... 
I0908 08:32:19.853302 20823 main.go:130] libmachine: (minikube) Calling .GetSSHHostname I0908 08:32:19.853547 20823 main.go:130] libmachine: (minikube) Calling .GetSSHPort I0908 08:32:19.853776 20823 main.go:130] libmachine: (minikube) Calling .GetSSHKeyPath I0908 08:32:19.854019 20823 main.go:130] libmachine: (minikube) Calling .GetSSHKeyPath I0908 08:32:19.854114 20823 main.go:130] libmachine: (minikube) Calling .GetSSHUsername I0908 08:32:19.854378 20823 main.go:130] libmachine: Using SSH client type: native I0908 08:32:19.854492 20823 main.go:130] libmachine: &{{{ 0 [] [] []} docker [0x43a0020] 0x43a3100 [] 0s} 192.168.64.36 22 } I0908 08:32:19.854496 20823 main.go:130] libmachine: About to run SSH command: cat /etc/os-release I0908 08:32:19.923098 20823 main.go:130] libmachine: SSH cmd err, output: : NAME=Buildroot VERSION=2021.02.4 ID=buildroot VERSION_ID=2021.02.4 PRETTY_NAME="Buildroot 2021.02.4" I0908 08:32:19.924230 20823 main.go:130] libmachine: found compatible host: buildroot I0908 08:32:19.924235 20823 main.go:130] libmachine: Provisioning with buildroot... I0908 08:32:19.924240 20823 main.go:130] libmachine: (minikube) Calling .GetMachineName I0908 08:32:19.924410 20823 buildroot.go:166] provisioning hostname "minikube" I0908 08:32:19.924419 20823 main.go:130] libmachine: (minikube) Calling .GetMachineName I0908 08:32:19.924537 20823 main.go:130] libmachine: (minikube) Calling .GetSSHHostname I0908 08:32:19.924659 20823 main.go:130] libmachine: (minikube) Calling .GetSSHPort I0908 08:32:19.924769 20823 main.go:130] libmachine: (minikube) Calling .GetSSHKeyPath I0908 08:32:19.924933 20823 main.go:130] libmachine: (minikube) Calling .GetSSHKeyPath I0908 08:32:19.925052 20823 main.go:130] libmachine: (minikube) Calling .GetSSHUsername I0908 08:32:19.925322 20823 main.go:130] libmachine: Using SSH client type: native I0908 08:32:19.925495 20823 main.go:130] libmachine: &{{{ 0 [] [] []} docker [0x43a0020] 0x43a3100 [] 0s} 192.168.64.36 22 } I0908 08:32:19.925501 20823 main.go:130] libmachine: About to run SSH command: sudo hostname minikube && echo "minikube" | sudo tee /etc/hostname I0908 08:32:19.999752 20823 main.go:130] libmachine: SSH cmd err, output: : minikube I0908 08:32:19.999765 20823 main.go:130] libmachine: (minikube) Calling .GetSSHHostname I0908 08:32:19.999931 20823 main.go:130] libmachine: (minikube) Calling .GetSSHPort I0908 08:32:20.000049 20823 main.go:130] libmachine: (minikube) Calling .GetSSHKeyPath I0908 08:32:20.000171 20823 main.go:130] libmachine: (minikube) Calling .GetSSHKeyPath I0908 08:32:20.000291 20823 main.go:130] libmachine: (minikube) Calling .GetSSHUsername I0908 08:32:20.000457 20823 main.go:130] libmachine: Using SSH client type: native I0908 08:32:20.000589 20823 main.go:130] libmachine: &{{{ 0 [] [] []} docker [0x43a0020] 0x43a3100 [] 0s} 192.168.64.36 22 } I0908 08:32:20.000598 20823 main.go:130] libmachine: About to run SSH command: if ! 
grep -xq '.*\sminikube' /etc/hosts; then if grep -xq '127.0.1.1\s.*' /etc/hosts; then sudo sed -i 's/^127.0.1.1\s.*/127.0.1.1 minikube/g' /etc/hosts; else echo '127.0.1.1 minikube' | sudo tee -a /etc/hosts; fi fi I0908 08:32:20.064561 20823 main.go:130] libmachine: SSH cmd err, output: : I0908 08:32:20.064573 20823 buildroot.go:172] set auth options {CertDir:/Users/noah/.minikube CaCertPath:/Users/noah/.minikube/certs/ca.pem CaPrivateKeyPath:/Users/noah/.minikube/certs/ca-key.pem CaCertRemotePath:/etc/docker/ca.pem ServerCertPath:/Users/noah/.minikube/machines/server.pem ServerKeyPath:/Users/noah/.minikube/machines/server-key.pem ClientKeyPath:/Users/noah/.minikube/certs/key.pem ServerCertRemotePath:/etc/docker/server.pem ServerKeyRemotePath:/etc/docker/server-key.pem ClientCertPath:/Users/noah/.minikube/certs/cert.pem ServerCertSANs:[] StorePath:/Users/noah/.minikube} I0908 08:32:20.064592 20823 buildroot.go:174] setting up certificates I0908 08:32:20.064602 20823 provision.go:83] configureAuth start I0908 08:32:20.064607 20823 main.go:130] libmachine: (minikube) Calling .GetMachineName I0908 08:32:20.064768 20823 main.go:130] libmachine: (minikube) Calling .GetIP I0908 08:32:20.064874 20823 main.go:130] libmachine: (minikube) Calling .GetSSHHostname I0908 08:32:20.064986 20823 provision.go:138] copyHostCerts I0908 08:32:20.065104 20823 exec_runner.go:145] found /Users/noah/.minikube/ca.pem, removing ... I0908 08:32:20.065111 20823 exec_runner.go:208] rm: /Users/noah/.minikube/ca.pem I0908 08:32:20.065231 20823 exec_runner.go:152] cp: /Users/noah/.minikube/certs/ca.pem --> /Users/noah/.minikube/ca.pem (1070 bytes) I0908 08:32:20.065816 20823 exec_runner.go:145] found /Users/noah/.minikube/cert.pem, removing ... I0908 08:32:20.065820 20823 exec_runner.go:208] rm: /Users/noah/.minikube/cert.pem I0908 08:32:20.065909 20823 exec_runner.go:152] cp: /Users/noah/.minikube/certs/cert.pem --> /Users/noah/.minikube/cert.pem (1115 bytes) I0908 08:32:20.066267 20823 exec_runner.go:145] found /Users/noah/.minikube/key.pem, removing ... 
I0908 08:32:20.066271 20823 exec_runner.go:208] rm: /Users/noah/.minikube/key.pem I0908 08:32:20.066398 20823 exec_runner.go:152] cp: /Users/noah/.minikube/certs/key.pem --> /Users/noah/.minikube/key.pem (1675 bytes) I0908 08:32:20.066661 20823 provision.go:112] generating server cert: /Users/noah/.minikube/machines/server.pem ca-key=/Users/noah/.minikube/certs/ca.pem private-key=/Users/noah/.minikube/certs/ca-key.pem org=noah.minikube san=[192.168.64.36 192.168.64.36 localhost 127.0.0.1 minikube minikube] I0908 08:32:20.177958 20823 provision.go:172] copyRemoteCerts I0908 08:32:20.178028 20823 ssh_runner.go:152] Run: sudo mkdir -p /etc/docker /etc/docker /etc/docker I0908 08:32:20.178050 20823 main.go:130] libmachine: (minikube) Calling .GetSSHHostname I0908 08:32:20.178357 20823 main.go:130] libmachine: (minikube) Calling .GetSSHPort I0908 08:32:20.178612 20823 main.go:130] libmachine: (minikube) Calling .GetSSHKeyPath I0908 08:32:20.178783 20823 main.go:130] libmachine: (minikube) Calling .GetSSHUsername I0908 08:32:20.178990 20823 sshutil.go:53] new ssh client: &{IP:192.168.64.36 Port:22 SSHKeyPath:/Users/noah/.minikube/machines/minikube/id_rsa Username:docker} I0908 08:32:20.215642 20823 ssh_runner.go:319] scp /Users/noah/.minikube/machines/server-key.pem --> /etc/docker/server-key.pem (1679 bytes) I0908 08:32:20.232304 20823 ssh_runner.go:319] scp /Users/noah/.minikube/certs/ca.pem --> /etc/docker/ca.pem (1070 bytes) I0908 08:32:20.247157 20823 ssh_runner.go:319] scp /Users/noah/.minikube/machines/server.pem --> /etc/docker/server.pem (1196 bytes) I0908 08:32:20.261251 20823 provision.go:86] duration metric: configureAuth took 196.63885ms I0908 08:32:20.261274 20823 buildroot.go:189] setting minikube options for container-runtime I0908 08:32:20.262129 20823 config.go:177] Loaded profile config "minikube": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.22.1 I0908 08:32:20.262139 20823 main.go:130] libmachine: (minikube) Calling .DriverName I0908 08:32:20.262347 20823 main.go:130] libmachine: (minikube) Calling .GetSSHHostname I0908 08:32:20.262509 20823 main.go:130] libmachine: (minikube) Calling .GetSSHPort I0908 08:32:20.262608 20823 main.go:130] libmachine: (minikube) Calling .GetSSHKeyPath I0908 08:32:20.262775 20823 main.go:130] libmachine: (minikube) Calling .GetSSHKeyPath I0908 08:32:20.262934 20823 main.go:130] libmachine: (minikube) Calling .GetSSHUsername I0908 08:32:20.263149 20823 main.go:130] libmachine: Using SSH client type: native I0908 08:32:20.263244 20823 main.go:130] libmachine: &{{{ 0 [] [] []} docker [0x43a0020] 0x43a3100 [] 0s} 192.168.64.36 22 } I0908 08:32:20.263249 20823 main.go:130] libmachine: About to run SSH command: df --output=fstype / | tail -n 1 I0908 08:32:20.328321 20823 main.go:130] libmachine: SSH cmd err, output: : tmpfs I0908 08:32:20.328329 20823 buildroot.go:70] root file system type: tmpfs I0908 08:32:20.328854 20823 provision.go:309] Updating docker unit: /lib/systemd/system/docker.service ... 
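
The provisioning step above generates a Docker server certificate whose SANs cover the VM IP, localhost, 127.0.0.1 and the minikube name. A sketch for confirming what actually ended up in that certificate, assuming the default ~/.minikube layout shown in the log (~ standing in for /Users/noah):

    # List the Subject Alternative Names baked into the generated server cert.
    openssl x509 -in ~/.minikube/machines/server.pem -noout -text \
      | grep -A1 'Subject Alternative Name'
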
I0908 08:32:20.328881 20823 main.go:130] libmachine: (minikube) Calling .GetSSHHostname I0908 08:32:20.329111 20823 main.go:130] libmachine: (minikube) Calling .GetSSHPort I0908 08:32:20.329365 20823 main.go:130] libmachine: (minikube) Calling .GetSSHKeyPath I0908 08:32:20.329532 20823 main.go:130] libmachine: (minikube) Calling .GetSSHKeyPath I0908 08:32:20.329672 20823 main.go:130] libmachine: (minikube) Calling .GetSSHUsername I0908 08:32:20.329887 20823 main.go:130] libmachine: Using SSH client type: native I0908 08:32:20.330044 20823 main.go:130] libmachine: &{{{ 0 [] [] []} docker [0x43a0020] 0x43a3100 [] 0s} 192.168.64.36 22 } I0908 08:32:20.330089 20823 main.go:130] libmachine: About to run SSH command: sudo mkdir -p /lib/systemd/system && printf %!s(MISSING) "[Unit] Description=Docker Application Container Engine Documentation=https://docs.docker.com After=network.target minikube-automount.service docker.socket Requires= minikube-automount.service docker.socket StartLimitBurst=3 StartLimitIntervalSec=60 [Service] Type=notify Restart=on-failure # This file is a systemd drop-in unit that inherits from the base dockerd configuration. # The base configuration already specifies an 'ExecStart=...' command. The first directive # here is to clear out that command inherited from the base configuration. Without this, # the command from the base configuration and the command specified here are treated as # a sequence of commands, which is not the desired behavior, nor is it valid -- systemd # will catch this invalid input and refuse to start the service with an error like: # Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services. # NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other # container runtimes. If left unlimited, it may result in OOM issues with MySQL. ExecStart= ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 ExecReload=/bin/kill -s HUP \$MAINPID # Having non-zero Limit*s causes performance problems due to accounting overhead # in the kernel. We recommend using cgroups to do container-local accounting. LimitNOFILE=infinity LimitNPROC=infinity LimitCORE=infinity # Uncomment TasksMax if your systemd version supports it. # Only systemd 226 and above support this version. TasksMax=infinity TimeoutStartSec=0 # set delegate yes so that systemd does not reset the cgroups of docker containers Delegate=yes # kill only the docker process, not all processes in the cgroup KillMode=process [Install] WantedBy=multi-user.target " | sudo tee /lib/systemd/system/docker.service.new I0908 08:32:20.403228 20823 main.go:130] libmachine: SSH cmd err, output: : [Unit] Description=Docker Application Container Engine Documentation=https://docs.docker.com After=network.target minikube-automount.service docker.socket Requires= minikube-automount.service docker.socket StartLimitBurst=3 StartLimitIntervalSec=60 [Service] Type=notify Restart=on-failure # This file is a systemd drop-in unit that inherits from the base dockerd configuration. # The base configuration already specifies an 'ExecStart=...' command. The first directive # here is to clear out that command inherited from the base configuration. 
Without this, # the command from the base configuration and the command specified here are treated as # a sequence of commands, which is not the desired behavior, nor is it valid -- systemd # will catch this invalid input and refuse to start the service with an error like: # Service has more than one ExecStart= setting, which is only allowed for Type=oneshot services. # NOTE: default-ulimit=nofile is set to an arbitrary number for consistency with other # container runtimes. If left unlimited, it may result in OOM issues with MySQL. ExecStart= ExecStart=/usr/bin/dockerd -H tcp://0.0.0.0:2376 -H unix:///var/run/docker.sock --default-ulimit=nofile=1048576:1048576 --tlsverify --tlscacert /etc/docker/ca.pem --tlscert /etc/docker/server.pem --tlskey /etc/docker/server-key.pem --label provider=hyperkit --insecure-registry 10.96.0.0/12 ExecReload=/bin/kill -s HUP $MAINPID # Having non-zero Limit*s causes performance problems due to accounting overhead # in the kernel. We recommend using cgroups to do container-local accounting. LimitNOFILE=infinity LimitNPROC=infinity LimitCORE=infinity # Uncomment TasksMax if your systemd version supports it. # Only systemd 226 and above support this version. TasksMax=infinity TimeoutStartSec=0 # set delegate yes so that systemd does not reset the cgroups of docker containers Delegate=yes # kill only the docker process, not all processes in the cgroup KillMode=process [Install] WantedBy=multi-user.target I0908 08:32:20.403244 20823 main.go:130] libmachine: (minikube) Calling .GetSSHHostname I0908 08:32:20.403423 20823 main.go:130] libmachine: (minikube) Calling .GetSSHPort I0908 08:32:20.403606 20823 main.go:130] libmachine: (minikube) Calling .GetSSHKeyPath I0908 08:32:20.403749 20823 main.go:130] libmachine: (minikube) Calling .GetSSHKeyPath I0908 08:32:20.403946 20823 main.go:130] libmachine: (minikube) Calling .GetSSHUsername I0908 08:32:20.404097 20823 main.go:130] libmachine: Using SSH client type: native I0908 08:32:20.404237 20823 main.go:130] libmachine: &{{{ 0 [] [] []} docker [0x43a0020] 0x43a3100 [] 0s} 192.168.64.36 22 } I0908 08:32:20.404248 20823 main.go:130] libmachine: About to run SSH command: sudo diff -u /lib/systemd/system/docker.service /lib/systemd/system/docker.service.new || { sudo mv /lib/systemd/system/docker.service.new /lib/systemd/system/docker.service; sudo systemctl -f daemon-reload && sudo systemctl -f enable docker && sudo systemctl -f restart docker; } I0908 08:32:20.951646 20823 main.go:130] libmachine: SSH cmd err, output: : diff: can't stat '/lib/systemd/system/docker.service': No such file or directory Created symlink /etc/systemd/system/multi-user.target.wants/docker.service → /usr/lib/systemd/system/docker.service. I0908 08:32:20.951665 20823 main.go:130] libmachine: Checking connection to Docker... I0908 08:32:20.951670 20823 main.go:130] libmachine: (minikube) Calling .GetURL I0908 08:32:20.951933 20823 main.go:130] libmachine: Docker is up and running! I0908 08:32:20.951938 20823 main.go:130] libmachine: Reticulating splines... 
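
With the unit written and Docker reported as up, the daemon is listening on tcp://0.0.0.0:2376 using the TLS material staged earlier. A sketch of two ways to verify this from the host; the cert paths assume the default ~/.minikube layout in this log, and `minikube docker-env` would normally export them for you:

    # Talk to the daemon over TLS on 2376 with the client certs copied above.
    docker --tlsverify \
      --tlscacert ~/.minikube/certs/ca.pem \
      --tlscert   ~/.minikube/certs/cert.pem \
      --tlskey    ~/.minikube/certs/key.pem \
      -H tcp://192.168.64.36:2376 version

    # Or read back the unit file that was just installed and enabled.
    minikube ssh -- sudo systemctl cat docker.service
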
I0908 08:32:20.951941 20823 client.go:171] LocalClient.Create took 11.161359687s I0908 08:32:20.951971 20823 start.go:168] duration metric: libmachine.API.Create for "minikube" took 11.161436396s I0908 08:32:20.951979 20823 start.go:267] post-start starting for "minikube" (driver="hyperkit") I0908 08:32:20.951981 20823 start.go:277] creating required directories: [/etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs] I0908 08:32:20.951989 20823 main.go:130] libmachine: (minikube) Calling .DriverName I0908 08:32:20.952230 20823 ssh_runner.go:152] Run: sudo mkdir -p /etc/kubernetes/addons /etc/kubernetes/manifests /var/tmp/minikube /var/lib/minikube /var/lib/minikube/certs /var/lib/minikube/images /var/lib/minikube/binaries /tmp/gvisor /usr/share/ca-certificates /etc/ssl/certs I0908 08:32:20.952249 20823 main.go:130] libmachine: (minikube) Calling .GetSSHHostname I0908 08:32:20.952399 20823 main.go:130] libmachine: (minikube) Calling .GetSSHPort I0908 08:32:20.952505 20823 main.go:130] libmachine: (minikube) Calling .GetSSHKeyPath I0908 08:32:20.952614 20823 main.go:130] libmachine: (minikube) Calling .GetSSHUsername I0908 08:32:20.952813 20823 sshutil.go:53] new ssh client: &{IP:192.168.64.36 Port:22 SSHKeyPath:/Users/noah/.minikube/machines/minikube/id_rsa Username:docker} I0908 08:32:20.993471 20823 ssh_runner.go:152] Run: cat /etc/os-release I0908 08:32:20.996225 20823 info.go:137] Remote host: Buildroot 2021.02.4 I0908 08:32:20.996235 20823 filesync.go:126] Scanning /Users/noah/.minikube/addons for local assets ... I0908 08:32:20.996325 20823 filesync.go:126] Scanning /Users/noah/.minikube/files for local assets ... I0908 08:32:20.996371 20823 start.go:270] post-start completed in 44.388769ms I0908 08:32:20.996393 20823 main.go:130] libmachine: (minikube) Calling .GetConfigRaw I0908 08:32:20.997017 20823 main.go:130] libmachine: (minikube) Calling .GetIP I0908 08:32:20.997156 20823 profile.go:148] Saving config to /Users/noah/.minikube/profiles/minikube/config.json ... 
I0908 08:32:20.997489 20823 start.go:129] duration metric: createHost completed in 11.265777173s I0908 08:32:20.997493 20823 start.go:80] releasing machines lock for "minikube", held for 11.265901363s I0908 08:32:20.997509 20823 main.go:130] libmachine: (minikube) Calling .DriverName I0908 08:32:20.997608 20823 main.go:130] libmachine: (minikube) Calling .GetIP I0908 08:32:20.997692 20823 main.go:130] libmachine: (minikube) Calling .DriverName I0908 08:32:20.997844 20823 main.go:130] libmachine: (minikube) Calling .DriverName I0908 08:32:20.998153 20823 main.go:130] libmachine: (minikube) Calling .DriverName I0908 08:32:20.998359 20823 ssh_runner.go:152] Run: curl -sS -m 2 https://k8s.gcr.io/ I0908 08:32:20.998432 20823 ssh_runner.go:152] Run: systemctl --version I0908 08:32:20.998443 20823 main.go:130] libmachine: (minikube) Calling .GetSSHHostname I0908 08:32:20.998552 20823 main.go:130] libmachine: (minikube) Calling .GetSSHPort I0908 08:32:20.998638 20823 main.go:130] libmachine: (minikube) Calling .GetSSHKeyPath I0908 08:32:20.998715 20823 main.go:130] libmachine: (minikube) Calling .GetSSHUsername I0908 08:32:20.998853 20823 sshutil.go:53] new ssh client: &{IP:192.168.64.36 Port:22 SSHKeyPath:/Users/noah/.minikube/machines/minikube/id_rsa Username:docker} I0908 08:32:20.998916 20823 main.go:130] libmachine: (minikube) Calling .GetSSHHostname I0908 08:32:20.999023 20823 main.go:130] libmachine: (minikube) Calling .GetSSHPort I0908 08:32:20.999159 20823 main.go:130] libmachine: (minikube) Calling .GetSSHKeyPath I0908 08:32:20.999293 20823 main.go:130] libmachine: (minikube) Calling .GetSSHUsername I0908 08:32:20.999395 20823 sshutil.go:53] new ssh client: &{IP:192.168.64.36 Port:22 SSHKeyPath:/Users/noah/.minikube/machines/minikube/id_rsa Username:docker} I0908 08:32:21.039764 20823 preload.go:131] Checking if preload exists for k8s version v1.22.1 and runtime docker I0908 08:32:21.039987 20823 ssh_runner.go:152] Run: docker images --format {{.Repository}}:{{.Tag}} I0908 08:32:21.620261 20823 docker.go:558] Got preloaded images: I0908 08:32:21.620269 20823 docker.go:564] k8s.gcr.io/kube-apiserver:v1.22.1 wasn't preloaded I0908 08:32:21.620343 20823 ssh_runner.go:152] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json I0908 08:32:21.627955 20823 ssh_runner.go:152] Run: which lz4 I0908 08:32:21.630877 20823 ssh_runner.go:152] Run: stat -c "%!s(MISSING) %!y(MISSING)" /preloaded.tar.lz4 I0908 08:32:21.633827 20823 ssh_runner.go:309] existence check for /preloaded.tar.lz4: stat -c "%!s(MISSING) %!y(MISSING)" /preloaded.tar.lz4: Process exited with status 1 stdout: stderr: stat: cannot statx '/preloaded.tar.lz4': No such file or directory I0908 08:32:21.633851 20823 ssh_runner.go:319] scp /Users/noah/.minikube/cache/preloaded-tarball/preloaded-images-k8s-v12-v1.22.1-docker-overlay2-amd64.tar.lz4 --> /preloaded.tar.lz4 (540060231 bytes) I0908 08:32:23.701510 20823 docker.go:523] Took 2.070980 seconds to copy over tarball I0908 08:32:23.701638 20823 ssh_runner.go:152] Run: sudo tar -I lz4 -C /var -xf /preloaded.tar.lz4 I0908 08:32:29.609934 20823 ssh_runner.go:192] Completed: sudo tar -I lz4 -C /var -xf /preloaded.tar.lz4: (5.908202535s) I0908 08:32:29.609973 20823 ssh_runner.go:103] rm: /preloaded.tar.lz4 I0908 08:32:29.646876 20823 ssh_runner.go:152] Run: sudo cat /var/lib/docker/image/overlay2/repositories.json I0908 08:32:29.654880 20823 ssh_runner.go:319] scp memory --> /var/lib/docker/image/overlay2/repositories.json (3149 bytes) I0908 08:32:29.667251 20823 ssh_runner.go:152] 
Run: sudo systemctl daemon-reload I0908 08:32:29.767658 20823 ssh_runner.go:152] Run: sudo systemctl restart docker I0908 08:32:32.684362 20823 ssh_runner.go:192] Completed: sudo systemctl restart docker: (2.916659751s) I0908 08:32:32.684500 20823 ssh_runner.go:152] Run: sudo systemctl is-active --quiet service containerd I0908 08:32:32.694524 20823 ssh_runner.go:152] Run: sudo systemctl cat docker.service I0908 08:32:32.707026 20823 ssh_runner.go:152] Run: sudo systemctl is-active --quiet service containerd I0908 08:32:32.716595 20823 ssh_runner.go:152] Run: sudo systemctl is-active --quiet service crio I0908 08:32:32.725458 20823 ssh_runner.go:152] Run: sudo systemctl stop -f crio I0908 08:32:32.753699 20823 ssh_runner.go:152] Run: sudo systemctl is-active --quiet service crio I0908 08:32:32.763282 20823 ssh_runner.go:152] Run: /bin/bash -c "sudo mkdir -p /etc && printf %!s(MISSING) "runtime-endpoint: unix:///var/run/dockershim.sock image-endpoint: unix:///var/run/dockershim.sock " | sudo tee /etc/crictl.yaml" I0908 08:32:32.777002 20823 ssh_runner.go:152] Run: sudo systemctl unmask docker.service I0908 08:32:32.873503 20823 ssh_runner.go:152] Run: sudo systemctl enable docker.socket I0908 08:32:32.966229 20823 ssh_runner.go:152] Run: sudo systemctl daemon-reload I0908 08:32:33.067036 20823 ssh_runner.go:152] Run: sudo systemctl start docker I0908 08:32:33.076467 20823 ssh_runner.go:152] Run: docker version --format {{.Server.Version}} I0908 08:32:33.107401 20823 ssh_runner.go:152] Run: docker version --format {{.Server.Version}} I0908 08:32:33.184433 20823 out.go:204] 🐳 Preparing Kubernetes v1.22.1 on Docker 20.10.8 ... I0908 08:32:33.184654 20823 ssh_runner.go:152] Run: grep 192.168.64.1 host.minikube.internal$ /etc/hosts I0908 08:32:33.189056 20823 ssh_runner.go:152] Run: /bin/bash -c "{ grep -v $'\thost.minikube.internal$' "/etc/hosts"; echo "192.168.64.1 host.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts"" I0908 08:32:33.200845 20823 preload.go:131] Checking if preload exists for k8s version v1.22.1 and runtime docker I0908 08:32:33.200923 20823 ssh_runner.go:152] Run: docker images --format {{.Repository}}:{{.Tag}} I0908 08:32:33.229195 20823 docker.go:558] Got preloaded images: -- stdout -- k8s.gcr.io/kube-apiserver:v1.22.1 k8s.gcr.io/kube-scheduler:v1.22.1 k8s.gcr.io/kube-proxy:v1.22.1 k8s.gcr.io/kube-controller-manager:v1.22.1 k8s.gcr.io/etcd:3.5.0-0 k8s.gcr.io/coredns/coredns:v1.8.4 gcr.io/k8s-minikube/storage-provisioner:v5 k8s.gcr.io/pause:3.5 kubernetesui/dashboard:v2.1.0 kubernetesui/metrics-scraper:v1.0.4 -- /stdout -- I0908 08:32:33.229203 20823 docker.go:489] Images already preloaded, skipping extraction I0908 08:32:33.229291 20823 ssh_runner.go:152] Run: docker images --format {{.Repository}}:{{.Tag}} I0908 08:32:33.252449 20823 docker.go:558] Got preloaded images: -- stdout -- k8s.gcr.io/kube-apiserver:v1.22.1 k8s.gcr.io/kube-controller-manager:v1.22.1 k8s.gcr.io/kube-proxy:v1.22.1 k8s.gcr.io/kube-scheduler:v1.22.1 k8s.gcr.io/etcd:3.5.0-0 k8s.gcr.io/coredns/coredns:v1.8.4 gcr.io/k8s-minikube/storage-provisioner:v5 k8s.gcr.io/pause:3.5 kubernetesui/dashboard:v2.1.0 kubernetesui/metrics-scraper:v1.0.4 -- /stdout -- I0908 08:32:33.252466 20823 cache_images.go:78] Images are preloaded, skipping loading I0908 08:32:33.252565 20823 ssh_runner.go:152] Run: docker info --format {{.CgroupDriver}} I0908 08:32:33.285971 20823 cni.go:93] Creating CNI manager for "" I0908 08:32:33.285979 20823 cni.go:167] CNI unnecessary in this configuration, recommending no CNI 
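
After the preload tarball is extracted and Docker restarted, the image inventory above should list the full v1.22.1 control-plane set. The same check can be reproduced from the host (a sketch using the exact format string the log runs inside the VM):

    # Re-run the image inventory; the k8s.gcr.io/*:v1.22.1 images, etcd, CoreDNS,
    # pause and the storage-provisioner should all be present.
    minikube ssh -- docker images --format '{{.Repository}}:{{.Tag}}'
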
I0908 08:32:33.285992 20823 kubeadm.go:87] Using pod CIDR: 10.244.0.0/16 I0908 08:32:33.286000 20823 kubeadm.go:153] kubeadm options: {CertDir:/var/lib/minikube/certs ServiceCIDR:10.96.0.0/12 PodSubnet:10.244.0.0/16 AdvertiseAddress:192.168.64.36 APIServerPort:8443 KubernetesVersion:v1.22.1 EtcdDataDir:/var/lib/minikube/etcd EtcdExtraArgs:map[] ClusterName:minikube NodeName:minikube DNSDomain:cluster.local CRISocket:/var/run/dockershim.sock ImageRepository: ComponentOptions:[{Component:apiServer ExtraArgs:map[enable-admission-plugins:NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota] Pairs:map[certSANs:["127.0.0.1", "localhost", "192.168.64.36"]]} {Component:controllerManager ExtraArgs:map[allocate-node-cidrs:true leader-elect:false] Pairs:map[]} {Component:scheduler ExtraArgs:map[leader-elect:false] Pairs:map[]}] FeatureArgs:map[] NoTaintMaster:true NodeIP:192.168.64.36 CgroupDriver:systemd ClientCAFile:/var/lib/minikube/certs/ca.crt StaticPodPath:/etc/kubernetes/manifests ControlPlaneAddress:control-plane.minikube.internal KubeProxyOptions:map[]} I0908 08:32:33.286100 20823 kubeadm.go:157] kubeadm config: apiVersion: kubeadm.k8s.io/v1beta2 kind: InitConfiguration localAPIEndpoint: advertiseAddress: 192.168.64.36 bindPort: 8443 bootstrapTokens: - groups: - system:bootstrappers:kubeadm:default-node-token ttl: 24h0m0s usages: - signing - authentication nodeRegistration: criSocket: /var/run/dockershim.sock name: "minikube" kubeletExtraArgs: node-ip: 192.168.64.36 taints: [] --- apiVersion: kubeadm.k8s.io/v1beta2 kind: ClusterConfiguration apiServer: certSANs: ["127.0.0.1", "localhost", "192.168.64.36"] extraArgs: enable-admission-plugins: "NamespaceLifecycle,LimitRanger,ServiceAccount,DefaultStorageClass,DefaultTolerationSeconds,NodeRestriction,MutatingAdmissionWebhook,ValidatingAdmissionWebhook,ResourceQuota" controllerManager: extraArgs: allocate-node-cidrs: "true" leader-elect: "false" scheduler: extraArgs: leader-elect: "false" certificatesDir: /var/lib/minikube/certs clusterName: mk controlPlaneEndpoint: control-plane.minikube.internal:8443 dns: type: CoreDNS etcd: local: dataDir: /var/lib/minikube/etcd extraArgs: proxy-refresh-interval: "70000" kubernetesVersion: v1.22.1 networking: dnsDomain: cluster.local podSubnet: "10.244.0.0/16" serviceSubnet: 10.96.0.0/12 --- apiVersion: kubelet.config.k8s.io/v1beta1 kind: KubeletConfiguration authentication: x509: clientCAFile: /var/lib/minikube/certs/ca.crt cgroupDriver: systemd clusterDomain: "cluster.local" # disable disk resource management by default imageGCHighThresholdPercent: 100 evictionHard: nodefs.available: "0%!"(MISSING) nodefs.inodesFree: "0%!"(MISSING) imagefs.available: "0%!"(MISSING) failSwapOn: false staticPodPath: /etc/kubernetes/manifests --- apiVersion: kubeproxy.config.k8s.io/v1alpha1 kind: KubeProxyConfiguration clusterCIDR: "10.244.0.0/16" metricsBindAddress: 0.0.0.0:10249 conntrack: maxPerCore: 0 # Skip setting "net.netfilter.nf_conntrack_tcp_timeout_established" tcpEstablishedTimeout: 0s # Skip setting "net.netfilter.nf_conntrack_tcp_timeout_close" tcpCloseWaitTimeout: 0s I0908 08:32:33.286172 20823 kubeadm.go:909] kubelet [Unit] Wants=docker.socket [Service] ExecStart= ExecStart=/var/lib/minikube/binaries/v1.22.1/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --config=/var/lib/kubelet/config.yaml --container-runtime=docker --hostname-override=minikube 
--kubeconfig=/etc/kubernetes/kubelet.conf --node-ip=192.168.64.36 [Install] config: {KubernetesVersion:v1.22.1 ClusterName:minikube Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} I0908 08:32:33.286255 20823 ssh_runner.go:152] Run: sudo ls /var/lib/minikube/binaries/v1.22.1 I0908 08:32:33.292697 20823 binaries.go:44] Found k8s binaries, skipping transfer I0908 08:32:33.292758 20823 ssh_runner.go:152] Run: sudo mkdir -p /etc/systemd/system/kubelet.service.d /lib/systemd/system /var/tmp/minikube I0908 08:32:33.299057 20823 ssh_runner.go:319] scp memory --> /etc/systemd/system/kubelet.service.d/10-kubeadm.conf (335 bytes) I0908 08:32:33.312270 20823 ssh_runner.go:319] scp memory --> /lib/systemd/system/kubelet.service (352 bytes) I0908 08:32:33.325030 20823 ssh_runner.go:319] scp memory --> /var/tmp/minikube/kubeadm.yaml.new (2053 bytes) I0908 08:32:33.336806 20823 ssh_runner.go:152] Run: grep 192.168.64.36 control-plane.minikube.internal$ /etc/hosts I0908 08:32:33.339276 20823 ssh_runner.go:152] Run: /bin/bash -c "{ grep -v $'\tcontrol-plane.minikube.internal$' "/etc/hosts"; echo "192.168.64.36 control-plane.minikube.internal"; } > /tmp/h.$$; sudo cp /tmp/h.$$ "/etc/hosts"" I0908 08:32:33.347192 20823 certs.go:52] Setting up /Users/noah/.minikube/profiles/minikube for IP: 192.168.64.36 I0908 08:32:33.347334 20823 certs.go:179] skipping minikubeCA CA generation: /Users/noah/.minikube/ca.key I0908 08:32:33.347410 20823 certs.go:179] skipping proxyClientCA CA generation: /Users/noah/.minikube/proxy-client-ca.key I0908 08:32:33.347456 20823 certs.go:297] generating minikube-user signed cert: /Users/noah/.minikube/profiles/minikube/client.key I0908 08:32:33.347462 20823 crypto.go:69] Generating cert /Users/noah/.minikube/profiles/minikube/client.crt with IP's: [] I0908 08:32:33.515093 20823 crypto.go:157] Writing cert to /Users/noah/.minikube/profiles/minikube/client.crt ... I0908 08:32:33.515105 20823 lock.go:36] WriteFile acquiring /Users/noah/.minikube/profiles/minikube/client.crt: {Name:mk67d0a2e6108ce0a52e2befe9325b04fd3c55ab Clock:{} Delay:500ms Timeout:1m0s Cancel:} I0908 08:32:33.557910 20823 crypto.go:165] Writing key to /Users/noah/.minikube/profiles/minikube/client.key ... I0908 08:32:33.557935 20823 lock.go:36] WriteFile acquiring /Users/noah/.minikube/profiles/minikube/client.key: {Name:mk6538b0e481f56a8cc8d225f0ed117ae243eb84 Clock:{} Delay:500ms Timeout:1m0s Cancel:} I0908 08:32:33.613748 20823 certs.go:297] generating minikube signed cert: /Users/noah/.minikube/profiles/minikube/apiserver.key.6ad7f37f I0908 08:32:33.613766 20823 crypto.go:69] Generating cert /Users/noah/.minikube/profiles/minikube/apiserver.crt.6ad7f37f with IP's: [192.168.64.36 10.96.0.1 127.0.0.1 10.0.0.1] I0908 08:32:33.744431 20823 crypto.go:157] Writing cert to /Users/noah/.minikube/profiles/minikube/apiserver.crt.6ad7f37f ... I0908 08:32:33.744440 20823 lock.go:36] WriteFile acquiring /Users/noah/.minikube/profiles/minikube/apiserver.crt.6ad7f37f: {Name:mk5ad687cf046dcdbdf5c94269c715ca73b15501 Clock:{} Delay:500ms Timeout:1m0s Cancel:} I0908 08:32:33.744760 20823 crypto.go:165] Writing key to /Users/noah/.minikube/profiles/minikube/apiserver.key.6ad7f37f ... 
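
The /etc/hosts step above injects a control-plane.minikube.internal record pointing at the node IP before kubeadm runs. A quick sketch for checking that record landed, using the same grep the log performs (minus the exit-status handling):

    # Expected to print the 192.168.64.36 -> control-plane.minikube.internal mapping.
    minikube ssh -- grep control-plane.minikube.internal /etc/hosts
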
I0908 08:32:33.744764 20823 lock.go:36] WriteFile acquiring /Users/noah/.minikube/profiles/minikube/apiserver.key.6ad7f37f: {Name:mkfb72c769043c60e0eb48de30a78e5e9d6f957a Clock:{} Delay:500ms Timeout:1m0s Cancel:} I0908 08:32:33.745014 20823 certs.go:308] copying /Users/noah/.minikube/profiles/minikube/apiserver.crt.6ad7f37f -> /Users/noah/.minikube/profiles/minikube/apiserver.crt I0908 08:32:33.745182 20823 certs.go:312] copying /Users/noah/.minikube/profiles/minikube/apiserver.key.6ad7f37f -> /Users/noah/.minikube/profiles/minikube/apiserver.key I0908 08:32:33.745333 20823 certs.go:297] generating aggregator signed cert: /Users/noah/.minikube/profiles/minikube/proxy-client.key I0908 08:32:33.745336 20823 crypto.go:69] Generating cert /Users/noah/.minikube/profiles/minikube/proxy-client.crt with IP's: [] I0908 08:32:33.874689 20823 crypto.go:157] Writing cert to /Users/noah/.minikube/profiles/minikube/proxy-client.crt ... I0908 08:32:33.874698 20823 lock.go:36] WriteFile acquiring /Users/noah/.minikube/profiles/minikube/proxy-client.crt: {Name:mkcc6cb5de47a8771820fba37c68b62299b2b025 Clock:{} Delay:500ms Timeout:1m0s Cancel:} I0908 08:32:33.874978 20823 crypto.go:165] Writing key to /Users/noah/.minikube/profiles/minikube/proxy-client.key ... I0908 08:32:33.874982 20823 lock.go:36] WriteFile acquiring /Users/noah/.minikube/profiles/minikube/proxy-client.key: {Name:mk167ff83858672ed484a813386d81ca12c0a5d8 Clock:{} Delay:500ms Timeout:1m0s Cancel:} I0908 08:32:33.875418 20823 certs.go:376] found cert: /Users/noah/.minikube/certs/Users/noah/.minikube/certs/ca-key.pem (1675 bytes) I0908 08:32:33.876009 20823 certs.go:376] found cert: /Users/noah/.minikube/certs/Users/noah/.minikube/certs/ca.pem (1070 bytes) I0908 08:32:33.876054 20823 certs.go:376] found cert: /Users/noah/.minikube/certs/Users/noah/.minikube/certs/cert.pem (1115 bytes) I0908 08:32:33.876340 20823 certs.go:376] found cert: /Users/noah/.minikube/certs/Users/noah/.minikube/certs/key.pem (1675 bytes) I0908 08:32:33.877245 20823 ssh_runner.go:319] scp /Users/noah/.minikube/profiles/minikube/apiserver.crt --> /var/lib/minikube/certs/apiserver.crt (1399 bytes) I0908 08:32:33.894456 20823 ssh_runner.go:319] scp /Users/noah/.minikube/profiles/minikube/apiserver.key --> /var/lib/minikube/certs/apiserver.key (1675 bytes) I0908 08:32:33.912150 20823 ssh_runner.go:319] scp /Users/noah/.minikube/profiles/minikube/proxy-client.crt --> /var/lib/minikube/certs/proxy-client.crt (1147 bytes) I0908 08:32:33.930575 20823 ssh_runner.go:319] scp /Users/noah/.minikube/profiles/minikube/proxy-client.key --> /var/lib/minikube/certs/proxy-client.key (1679 bytes) I0908 08:32:33.947787 20823 ssh_runner.go:319] scp /Users/noah/.minikube/ca.crt --> /var/lib/minikube/certs/ca.crt (1111 bytes) I0908 08:32:33.966259 20823 ssh_runner.go:319] scp /Users/noah/.minikube/ca.key --> /var/lib/minikube/certs/ca.key (1675 bytes) I0908 08:32:33.985427 20823 ssh_runner.go:319] scp /Users/noah/.minikube/proxy-client-ca.crt --> /var/lib/minikube/certs/proxy-client-ca.crt (1119 bytes) I0908 08:32:34.004660 20823 ssh_runner.go:319] scp /Users/noah/.minikube/proxy-client-ca.key --> /var/lib/minikube/certs/proxy-client-ca.key (1675 bytes) I0908 08:32:34.023961 20823 ssh_runner.go:319] scp /Users/noah/.minikube/ca.crt --> /usr/share/ca-certificates/minikubeCA.pem (1111 bytes) I0908 08:32:34.041775 20823 ssh_runner.go:319] scp memory --> /var/lib/minikube/kubeconfig (740 bytes) I0908 08:32:34.054366 20823 ssh_runner.go:152] Run: openssl version I0908 08:32:34.058376 20823 
ssh_runner.go:152] Run: sudo /bin/bash -c "test -s /usr/share/ca-certificates/minikubeCA.pem && ln -fs /usr/share/ca-certificates/minikubeCA.pem /etc/ssl/certs/minikubeCA.pem" I0908 08:32:34.065441 20823 ssh_runner.go:152] Run: ls -la /usr/share/ca-certificates/minikubeCA.pem I0908 08:32:34.068560 20823 certs.go:419] hashing: -rw-r--r-- 1 root root 1111 Feb 15 2021 /usr/share/ca-certificates/minikubeCA.pem I0908 08:32:34.068676 20823 ssh_runner.go:152] Run: openssl x509 -hash -noout -in /usr/share/ca-certificates/minikubeCA.pem I0908 08:32:34.072201 20823 ssh_runner.go:152] Run: sudo /bin/bash -c "test -L /etc/ssl/certs/b5213941.0 || ln -fs /etc/ssl/certs/minikubeCA.pem /etc/ssl/certs/b5213941.0" I0908 08:32:34.080096 20823 kubeadm.go:390] StartCluster: {Name:minikube KeepContext:false EmbedCerts:false MinikubeISO:https://storage.googleapis.com/minikube/iso/minikube-v1.23.0.iso KicBaseImage:gcr.io/k8s-minikube/kicbase:v0.0.26@sha256:d4aa14fbdc3a28a60632c24af937329ec787b02c89983c6f5498d346860a848c Memory:8192 CPUs:4 DiskSize:81920 VMDriver: Driver:hyperkit HyperkitVpnKitSock: HyperkitVSockPorts:[] DockerEnv:[] ContainerVolumeMounts:[] InsecureRegistry:[] RegistryMirror:[] HostOnlyCIDR:192.168.99.1/24 HypervVirtualSwitch: HypervUseExternalSwitch:false HypervExternalAdapter: KVMNetwork:default KVMQemuURI:qemu:///system KVMGPU:false KVMHidden:false KVMNUMACount:1 DockerOpt:[] DisableDriverMounts:false NFSShare:[] NFSSharesRoot:/nfsshares UUID: NoVTXCheck:false DNSProxy:false HostDNSResolver:true HostOnlyNicType:virtio NatNicType:virtio SSHIPAddress: SSHUser:root SSHKey: SSHPort:22 KubernetesConfig:{KubernetesVersion:v1.22.1 ClusterName:minikube Namespace:default APIServerName:minikubeCA APIServerNames:[] APIServerIPs:[] DNSDomain:cluster.local ContainerRuntime:docker CRISocket: NetworkPlugin: FeatureGates: ServiceCIDR:10.96.0.0/12 ImageRepository: LoadBalancerStartIP: LoadBalancerEndIP: CustomIngressCert: ExtraOptions:[] ShouldLoadCachedImages:true EnableDefaultCNI:false CNI: NodeIP: NodePort:8443 NodeName:} Nodes:[{Name: IP:192.168.64.36 Port:8443 KubernetesVersion:v1.22.1 ControlPlane:true Worker:true}] Addons:map[] CustomAddonImages:map[] CustomAddonRegistries:map[] VerifyComponents:map[apiserver:true system_pods:true] StartHostTimeout:6m0s ScheduledStop: ExposedPorts:[] ListenAddress: Network: MultiNodeRequested:false ExtraDisks:0} I0908 08:32:34.080203 20823 ssh_runner.go:152] Run: docker ps --filter status=paused --filter=name=k8s_.*_(kube-system)_ --format={{.ID}} I0908 08:32:34.106026 20823 ssh_runner.go:152] Run: sudo ls /var/lib/kubelet/kubeadm-flags.env /var/lib/kubelet/config.yaml /var/lib/minikube/etcd I0908 08:32:34.113015 20823 ssh_runner.go:152] Run: sudo cp /var/tmp/minikube/kubeadm.yaml.new /var/tmp/minikube/kubeadm.yaml I0908 08:32:34.119833 20823 ssh_runner.go:152] Run: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf I0908 08:32:34.126878 20823 kubeadm.go:151] config check failed, skipping stale config cleanup: sudo ls -la /etc/kubernetes/admin.conf /etc/kubernetes/kubelet.conf /etc/kubernetes/controller-manager.conf /etc/kubernetes/scheduler.conf: Process exited with status 2 stdout: stderr: ls: cannot access '/etc/kubernetes/admin.conf': No such file or directory ls: cannot access '/etc/kubernetes/kubelet.conf': No such file or directory ls: cannot access '/etc/kubernetes/controller-manager.conf': No such file or directory ls: cannot access '/etc/kubernetes/scheduler.conf': No such 
file or directory I0908 08:32:34.126900 20823 ssh_runner.go:243] Start: /bin/bash -c "sudo env PATH=/var/lib/minikube/binaries/v1.22.1:$PATH kubeadm init --config /var/tmp/minikube/kubeadm.yaml --ignore-preflight-errors=DirAvailable--etc-kubernetes-manifests,DirAvailable--var-lib-minikube,DirAvailable--var-lib-minikube-etcd,FileAvailable--etc-kubernetes-manifests-kube-scheduler.yaml,FileAvailable--etc-kubernetes-manifests-kube-apiserver.yaml,FileAvailable--etc-kubernetes-manifests-kube-controller-manager.yaml,FileAvailable--etc-kubernetes-manifests-etcd.yaml,Port-10250,Swap,Mem" I0908 08:32:34.526604 20823 out.go:204] ▪ Generating certificates and keys ... I0908 08:32:37.072361 20823 out.go:204] ▪ Booting up control plane ... I0908 08:32:46.605244 20823 out.go:204] ▪ Configuring RBAC rules ... I0908 08:32:47.034645 20823 cni.go:93] Creating CNI manager for "" I0908 08:32:47.034653 20823 cni.go:167] CNI unnecessary in this configuration, recommending no CNI I0908 08:32:47.034674 20823 ssh_runner.go:152] Run: /bin/bash -c "cat /proc/$(pgrep kube-apiserver)/oom_adj" I0908 08:32:47.034767 20823 ssh_runner.go:152] Run: sudo /var/lib/minikube/binaries/v1.22.1/kubectl create clusterrolebinding minikube-rbac --clusterrole=cluster-admin --serviceaccount=kube-system:default --kubeconfig=/var/lib/minikube/kubeconfig I0908 08:32:47.034768 20823 ssh_runner.go:152] Run: sudo /var/lib/minikube/binaries/v1.22.1/kubectl label nodes minikube.k8s.io/version=v1.23.0 minikube.k8s.io/commit=5931455374810b1bbeb222a9713ae2c756daee10 minikube.k8s.io/name=minikube minikube.k8s.io/updated_at=2021_09_08T08_32_47_0700 --all --overwrite --kubeconfig=/var/lib/minikube/kubeconfig I0908 08:32:47.043695 20823 ops.go:34] apiserver oom_adj: -16 I0908 08:32:47.175383 20823 kubeadm.go:985] duration metric: took 140.699771ms to wait for elevateKubeSystemPrivileges. I0908 08:32:47.175404 20823 kubeadm.go:392] StartCluster complete in 13.095177543s I0908 08:32:47.175416 20823 settings.go:142] acquiring lock: {Name:mk2a468bb13097cd7a5caa247e50780d702c8025 Clock:{} Delay:500ms Timeout:1m0s Cancel:} I0908 08:32:47.175545 20823 settings.go:150] Updating kubeconfig: /Users/noah/.kube/config I0908 08:32:47.176746 20823 lock.go:36] WriteFile acquiring /Users/noah/.kube/config: {Name:mkcb8cb36804aad5a35efa61ab8126ef5ac46154 Clock:{} Delay:500ms Timeout:1m0s Cancel:} I0908 08:32:47.698328 20823 kapi.go:244] deployment "coredns" in namespace "kube-system" and context "minikube" rescaled to 1 I0908 08:32:47.698357 20823 start.go:226] Will wait 6m0s for node &{Name: IP:192.168.64.36 Port:8443 KubernetesVersion:v1.22.1 ControlPlane:true Worker:true} I0908 08:32:47.698402 20823 addons.go:404] enableAddons start: toEnable=map[], additional=[] I0908 08:32:47.739339 20823 out.go:177] 🔎 Verifying Kubernetes components... 
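
kubeadm init completed in roughly 13 seconds, the node was labelled with the minikube.k8s.io/* metadata above, and ~/.kube/config was updated to point at the new cluster. A short verification sketch from the host:

    # Confirm the node registered with its minikube.k8s.io labels and that the
    # static control-plane pods are coming up.
    kubectl get nodes --show-labels
    kubectl -n kube-system get pods
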
I0908 08:32:47.698696 20823 ssh_runner.go:152] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.22.1/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml" I0908 08:32:47.739482 20823 addons.go:65] Setting default-storageclass=true in profile "minikube" I0908 08:32:47.739485 20823 addons.go:65] Setting storage-provisioner=true in profile "minikube" I0908 08:32:47.699046 20823 config.go:177] Loaded profile config "minikube": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.22.1 I0908 08:32:47.739557 20823 addons.go:153] Setting addon storage-provisioner=true in "minikube" I0908 08:32:47.739558 20823 addons_storage_classes.go:33] enableOrDisableStorageClasses default-storageclass=true on "minikube" W0908 08:32:47.739564 20823 addons.go:165] addon storage-provisioner should already be in state true I0908 08:32:47.739601 20823 host.go:66] Checking if "minikube" exists ... I0908 08:32:47.739619 20823 ssh_runner.go:152] Run: sudo systemctl is-active --quiet service kubelet I0908 08:32:47.739804 20823 cache.go:108] acquiring lock: {Name:mk8a9877486370d82e3a081a0860a9bf68c21084 Clock:{} Delay:500ms Timeout:10m0s Cancel:} I0908 08:32:47.739787 20823 cache.go:108] acquiring lock: {Name:mkbf77a40c6a71bc03f0f2cc6dc782ca69fe1ba3 Clock:{} Delay:500ms Timeout:10m0s Cancel:} I0908 08:32:47.739791 20823 cache.go:108] acquiring lock: {Name:mk9acbecb1f6f9b0a6de2a807ae2ea10d717df76 Clock:{} Delay:500ms Timeout:10m0s Cancel:} I0908 08:32:47.739837 20823 cache.go:108] acquiring lock: {Name:mk625df2b4da9bca5f1f9606df5381ecb19e099f Clock:{} Delay:500ms Timeout:10m0s Cancel:} I0908 08:32:47.739854 20823 cache.go:108] acquiring lock: {Name:mk6a618cb9d2fd3da473abb01701a1434fc019ef Clock:{} Delay:500ms Timeout:10m0s Cancel:} I0908 08:32:47.739838 20823 cache.go:108] acquiring lock: {Name:mk46c23d8746bc2e6bc6c573ff4ebe4879315606 Clock:{} Delay:500ms Timeout:10m0s Cancel:} I0908 08:32:47.739904 20823 cache.go:108] acquiring lock: {Name:mk069005e752b105e6b2e6c32796ad204189f6c4 Clock:{} Delay:500ms Timeout:10m0s Cancel:} I0908 08:32:47.739960 20823 cache.go:108] acquiring lock: {Name:mk5ad10c13c7709c62e0fb7d4c297d18a319231b Clock:{} Delay:500ms Timeout:10m0s Cancel:} I0908 08:32:47.739962 20823 cache.go:108] acquiring lock: {Name:mkc06aa0aee276ed8159db368fab45855414b56d Clock:{} Delay:500ms Timeout:10m0s Cancel:} I0908 08:32:47.739969 20823 cache.go:108] acquiring lock: {Name:mk680f29065e9f8ee042bbd3b67773431b82202d Clock:{} Delay:500ms Timeout:10m0s Cancel:} I0908 08:32:47.740055 20823 cache.go:116] /Users/noah/.minikube/cache/images/jettech/kube-webhook-certgen_v1.3.0 exists I0908 08:32:47.740029 20823 cache.go:108] acquiring lock: {Name:mkbed71862ac3c35d089202f44c34542935de034 Clock:{} Delay:500ms Timeout:10m0s Cancel:} I0908 08:32:47.740078 20823 cache.go:97] cache image "jettech/kube-webhook-certgen:v1.3.0" -> "/Users/noah/.minikube/cache/images/jettech/kube-webhook-certgen_v1.3.0" took 277.863µs I0908 08:32:47.740095 20823 cache.go:81] save to tar file jettech/kube-webhook-certgen:v1.3.0 -> /Users/noah/.minikube/cache/images/jettech/kube-webhook-certgen_v1.3.0 succeeded I0908 08:32:47.740136 20823 cache.go:108] acquiring lock: {Name:mk1f615c156c00f3287788ee0ecdf790daaef787 Clock:{} Delay:500ms Timeout:10m0s Cancel:} I0908 08:32:47.740177 20823 cache.go:116] /Users/noah/.minikube/cache/images/local-repl-5 exists I0908 08:32:47.740196 20823 cache.go:97] cache image "local-repl-5:latest" -> "/Users/noah/.minikube/cache/images/local-repl-5" took 360.793µs 
I0908 08:32:47.740206 20823 cache.go:81] save to tar file local-repl-5 -> /Users/noah/.minikube/cache/images/local-repl-5 succeeded I0908 08:32:47.740206 20823 cache.go:116] /Users/noah/.minikube/cache/images/local-proxy-1 exists I0908 08:32:47.740212 20823 cache.go:116] /Users/noah/.minikube/cache/images/local-repl-1 exists I0908 08:32:47.740230 20823 cache.go:97] cache image "local-proxy-1:latest" -> "/Users/noah/.minikube/cache/images/local-proxy-1" took 394.643µs I0908 08:32:47.740235 20823 cache.go:97] cache image "local-repl-1:latest" -> "/Users/noah/.minikube/cache/images/local-repl-1" took 340.859µs I0908 08:32:47.740243 20823 cache.go:81] save to tar file local-proxy-1 -> /Users/noah/.minikube/cache/images/local-proxy-1 succeeded I0908 08:32:47.740247 20823 cache.go:81] save to tar file local-repl-1 -> /Users/noah/.minikube/cache/images/local-repl-1 succeeded I0908 08:32:47.740252 20823 cache.go:116] /Users/noah/.minikube/cache/images/local-repl-4 exists I0908 08:32:47.740275 20823 cache.go:97] cache image "local-repl-4:latest" -> "/Users/noah/.minikube/cache/images/local-repl-4" took 500.39µs I0908 08:32:47.740286 20823 cache.go:81] save to tar file local-repl-4 -> /Users/noah/.minikube/cache/images/local-repl-4 succeeded I0908 08:32:47.740332 20823 cache.go:116] /Users/noah/.minikube/cache/images/local-proxy-3 exists I0908 08:32:47.740357 20823 cache.go:116] /Users/noah/.minikube/cache/images/local-proxy-2 exists I0908 08:32:47.740356 20823 cache.go:97] cache image "local-proxy-3:latest" -> "/Users/noah/.minikube/cache/images/local-proxy-3" took 576.799µs I0908 08:32:47.740375 20823 cache.go:97] cache image "local-proxy-2:latest" -> "/Users/noah/.minikube/cache/images/local-proxy-2" took 428.21µs I0908 08:32:47.740382 20823 cache.go:81] save to tar file local-proxy-3 -> /Users/noah/.minikube/cache/images/local-proxy-3 succeeded I0908 08:32:47.740386 20823 cache.go:81] save to tar file local-proxy-2 -> /Users/noah/.minikube/cache/images/local-proxy-2 succeeded I0908 08:32:47.740427 20823 cache.go:116] /Users/noah/.minikube/cache/images/local-repl-2 exists I0908 08:32:47.740432 20823 cache.go:116] /Users/noah/.minikube/cache/images/local-repl-3 exists I0908 08:32:47.740437 20823 cache.go:116] /Users/noah/.minikube/cache/images/jettech/kube-webhook-certgen_v1.2.2 exists I0908 08:32:47.740446 20823 cache.go:97] cache image "local-repl-2:latest" -> "/Users/noah/.minikube/cache/images/local-repl-2" took 505.465µs I0908 08:32:47.740448 20823 cache.go:97] cache image "local-repl-3:latest" -> "/Users/noah/.minikube/cache/images/local-repl-3" took 381.458µs I0908 08:32:47.740454 20823 cache.go:81] save to tar file local-repl-2 -> /Users/noah/.minikube/cache/images/local-repl-2 succeeded I0908 08:32:47.740456 20823 cache.go:81] save to tar file local-repl-3 -> /Users/noah/.minikube/cache/images/local-repl-3 succeeded I0908 08:32:47.740479 20823 cache.go:97] cache image "jettech/kube-webhook-certgen:v1.2.2" -> "/Users/noah/.minikube/cache/images/jettech/kube-webhook-certgen_v1.2.2" took 469.585µs I0908 08:32:47.740491 20823 cache.go:81] save to tar file jettech/kube-webhook-certgen:v1.2.2 -> /Users/noah/.minikube/cache/images/jettech/kube-webhook-certgen_v1.2.2 succeeded I0908 08:32:47.740695 20823 main.go:130] libmachine: Found binary path at /Users/noah/.minikube/bin/docker-machine-driver-hyperkit I0908 08:32:47.740734 20823 main.go:130] libmachine: Launching plugin server for driver hyperkit I0908 08:32:47.740757 20823 main.go:130] libmachine: Found binary path at 
/Users/noah/.minikube/bin/docker-machine-driver-hyperkit I0908 08:32:47.740797 20823 main.go:130] libmachine: Launching plugin server for driver hyperkit I0908 08:32:47.740885 20823 cache.go:116] /Users/noah/.minikube/cache/images/cryptexlabs/minikube-ingress-dns_0.3.0 exists I0908 08:32:47.740905 20823 cache.go:97] cache image "cryptexlabs/minikube-ingress-dns:0.3.0" -> "/Users/noah/.minikube/cache/images/cryptexlabs/minikube-ingress-dns_0.3.0" took 1.061793ms I0908 08:32:47.740921 20823 cache.go:81] save to tar file cryptexlabs/minikube-ingress-dns:0.3.0 -> /Users/noah/.minikube/cache/images/cryptexlabs/minikube-ingress-dns_0.3.0 succeeded I0908 08:32:47.740929 20823 cache.go:116] /Users/noah/.minikube/cache/images/us.gcr.io/k8s-artifacts-prod/ingress-nginx/controller_v0.40.2 exists I0908 08:32:47.740952 20823 cache.go:97] cache image "us.gcr.io/k8s-artifacts-prod/ingress-nginx/controller:v0.40.2" -> "/Users/noah/.minikube/cache/images/us.gcr.io/k8s-artifacts-prod/ingress-nginx/controller_v0.40.2" took 1.020906ms I0908 08:32:47.740965 20823 cache.go:81] save to tar file us.gcr.io/k8s-artifacts-prod/ingress-nginx/controller:v0.40.2 -> /Users/noah/.minikube/cache/images/us.gcr.io/k8s-artifacts-prod/ingress-nginx/controller_v0.40.2 succeeded I0908 08:32:47.740971 20823 cache.go:88] Successfully saved all images to host disk. I0908 08:32:47.744915 20823 config.go:177] Loaded profile config "minikube": Driver=hyperkit, ContainerRuntime=docker, KubernetesVersion=v1.22.1 I0908 08:32:47.747285 20823 main.go:130] libmachine: Found binary path at /Users/noah/.minikube/bin/docker-machine-driver-hyperkit I0908 08:32:47.747338 20823 main.go:130] libmachine: Launching plugin server for driver hyperkit I0908 08:32:47.756569 20823 main.go:130] libmachine: Plugin server listening at address 127.0.0.1:60006 I0908 08:32:47.756918 20823 main.go:130] libmachine: Plugin server listening at address 127.0.0.1:60010 I0908 08:32:47.757283 20823 main.go:130] libmachine: () Calling .GetVersion I0908 08:32:47.757472 20823 main.go:130] libmachine: () Calling .GetVersion I0908 08:32:47.757936 20823 main.go:130] libmachine: Using API Version 1 I0908 08:32:47.757948 20823 main.go:130] libmachine: () Calling .SetConfigRaw I0908 08:32:47.758076 20823 main.go:130] libmachine: Using API Version 1 I0908 08:32:47.758087 20823 main.go:130] libmachine: () Calling .SetConfigRaw I0908 08:32:47.758372 20823 main.go:130] libmachine: () Calling .GetMachineName I0908 08:32:47.758457 20823 main.go:130] libmachine: () Calling .GetMachineName I0908 08:32:47.758636 20823 main.go:130] libmachine: (minikube) Calling .GetState I0908 08:32:47.758812 20823 main.go:130] libmachine: (minikube) DBG | exe=/Users/noah/.minikube/bin/docker-machine-driver-hyperkit uid=0 I0908 08:32:47.758863 20823 main.go:130] libmachine: Plugin server listening at address 127.0.0.1:60014 I0908 08:32:47.758948 20823 main.go:130] libmachine: Found binary path at /Users/noah/.minikube/bin/docker-machine-driver-hyperkit I0908 08:32:47.758976 20823 main.go:130] libmachine: Launching plugin server for driver hyperkit I0908 08:32:47.759985 20823 main.go:130] libmachine: (minikube) DBG | hyperkit pid from json: 20844 I0908 08:32:47.760468 20823 main.go:130] libmachine: () Calling .GetVersion I0908 08:32:47.761747 20823 main.go:130] libmachine: Using API Version 1 I0908 08:32:47.761761 20823 main.go:130] libmachine: () Calling .SetConfigRaw I0908 08:32:47.762090 20823 main.go:130] libmachine: () Calling .GetMachineName I0908 08:32:47.762360 20823 main.go:130] libmachine: 
(minikube) Calling .GetState I0908 08:32:47.762496 20823 main.go:130] libmachine: (minikube) DBG | exe=/Users/noah/.minikube/bin/docker-machine-driver-hyperkit uid=0 I0908 08:32:47.762808 20823 main.go:130] libmachine: (minikube) DBG | hyperkit pid from json: 20844 I0908 08:32:47.766082 20823 main.go:130] libmachine: Found binary path at /Users/noah/.minikube/bin/docker-machine-driver-hyperkit I0908 08:32:47.766141 20823 main.go:130] libmachine: Launching plugin server for driver hyperkit I0908 08:32:47.770887 20823 main.go:130] libmachine: Plugin server listening at address 127.0.0.1:60018 I0908 08:32:47.771459 20823 main.go:130] libmachine: () Calling .GetVersion I0908 08:32:47.772022 20823 main.go:130] libmachine: Using API Version 1 I0908 08:32:47.772034 20823 main.go:130] libmachine: () Calling .SetConfigRaw I0908 08:32:47.772527 20823 main.go:130] libmachine: () Calling .GetMachineName I0908 08:32:47.772720 20823 main.go:130] libmachine: (minikube) Calling .GetState I0908 08:32:47.772900 20823 main.go:130] libmachine: (minikube) DBG | exe=/Users/noah/.minikube/bin/docker-machine-driver-hyperkit uid=0 I0908 08:32:47.773271 20823 main.go:130] libmachine: (minikube) DBG | hyperkit pid from json: 20844 I0908 08:32:47.774603 20823 addons.go:153] Setting addon default-storageclass=true in "minikube" W0908 08:32:47.774609 20823 addons.go:165] addon default-storageclass should already be in state true I0908 08:32:47.774625 20823 host.go:66] Checking if "minikube" exists ... I0908 08:32:47.774959 20823 main.go:130] libmachine: Found binary path at /Users/noah/.minikube/bin/docker-machine-driver-hyperkit I0908 08:32:47.774995 20823 main.go:130] libmachine: Launching plugin server for driver hyperkit I0908 08:32:47.777970 20823 main.go:130] libmachine: (minikube) Calling .DriverName I0908 08:32:47.778380 20823 main.go:130] libmachine: Plugin server listening at address 127.0.0.1:60022 I0908 08:32:47.778966 20823 main.go:130] libmachine: () Calling .GetVersion I0908 08:32:47.797233 20823 out.go:177] ▪ Using image gcr.io/k8s-minikube/storage-provisioner:v5 I0908 08:32:47.785822 20823 main.go:130] libmachine: Plugin server listening at address 127.0.0.1:60026 I0908 08:32:47.797388 20823 addons.go:337] installing /etc/kubernetes/addons/storage-provisioner.yaml I0908 08:32:47.797394 20823 ssh_runner.go:319] scp memory --> /etc/kubernetes/addons/storage-provisioner.yaml (2676 bytes) I0908 08:32:47.797409 20823 main.go:130] libmachine: (minikube) Calling .GetSSHHostname I0908 08:32:47.797636 20823 main.go:130] libmachine: (minikube) Calling .GetSSHPort I0908 08:32:47.797824 20823 main.go:130] libmachine: Using API Version 1 I0908 08:32:47.797831 20823 main.go:130] libmachine: () Calling .SetConfigRaw I0908 08:32:47.797885 20823 main.go:130] libmachine: (minikube) Calling .GetSSHKeyPath I0908 08:32:47.797919 20823 main.go:130] libmachine: () Calling .GetVersion I0908 08:32:47.798021 20823 main.go:130] libmachine: (minikube) Calling .GetSSHUsername I0908 08:32:47.798123 20823 main.go:130] libmachine: () Calling .GetMachineName I0908 08:32:47.798149 20823 sshutil.go:53] new ssh client: &{IP:192.168.64.36 Port:22 SSHKeyPath:/Users/noah/.minikube/machines/minikube/id_rsa Username:docker} I0908 08:32:47.798274 20823 main.go:130] libmachine: (minikube) Calling .DriverName I0908 08:32:47.798367 20823 main.go:130] libmachine: Using API Version 1 I0908 08:32:47.798377 20823 main.go:130] libmachine: () Calling .SetConfigRaw I0908 08:32:47.798470 20823 ssh_runner.go:152] Run: docker images --format 
{{.Repository}}:{{.Tag}} I0908 08:32:47.798479 20823 main.go:130] libmachine: (minikube) Calling .GetSSHHostname I0908 08:32:47.798590 20823 main.go:130] libmachine: (minikube) Calling .GetSSHPort I0908 08:32:47.798674 20823 main.go:130] libmachine: () Calling .GetMachineName I0908 08:32:47.798730 20823 main.go:130] libmachine: (minikube) Calling .GetSSHKeyPath I0908 08:32:47.798899 20823 main.go:130] libmachine: (minikube) Calling .GetSSHUsername I0908 08:32:47.799016 20823 sshutil.go:53] new ssh client: &{IP:192.168.64.36 Port:22 SSHKeyPath:/Users/noah/.minikube/machines/minikube/id_rsa Username:docker} I0908 08:32:47.799186 20823 main.go:130] libmachine: Found binary path at /Users/noah/.minikube/bin/docker-machine-driver-hyperkit I0908 08:32:47.799210 20823 main.go:130] libmachine: Launching plugin server for driver hyperkit I0908 08:32:47.809170 20823 ssh_runner.go:152] Run: /bin/bash -c "sudo /var/lib/minikube/binaries/v1.22.1/kubectl --kubeconfig=/var/lib/minikube/kubeconfig -n kube-system get configmap coredns -o yaml | sed '/^ forward . \/etc\/resolv.conf.*/i \ hosts {\n 192.168.64.1 host.minikube.internal\n fallthrough\n }' | sudo /var/lib/minikube/binaries/v1.22.1/kubectl --kubeconfig=/var/lib/minikube/kubeconfig replace -f -" I0908 08:32:47.809225 20823 main.go:130] libmachine: Plugin server listening at address 127.0.0.1:60032 I0908 08:32:47.809718 20823 main.go:130] libmachine: () Calling .GetVersion I0908 08:32:47.810155 20823 main.go:130] libmachine: Using API Version 1 I0908 08:32:47.810162 20823 main.go:130] libmachine: () Calling .SetConfigRaw I0908 08:32:47.810409 20823 main.go:130] libmachine: () Calling .GetMachineName I0908 08:32:47.810509 20823 main.go:130] libmachine: (minikube) Calling .GetState I0908 08:32:47.810626 20823 main.go:130] libmachine: (minikube) DBG | exe=/Users/noah/.minikube/bin/docker-machine-driver-hyperkit uid=0 I0908 08:32:47.810926 20823 main.go:130] libmachine: (minikube) DBG | hyperkit pid from json: 20844 I0908 08:32:47.812925 20823 api_server.go:50] waiting for apiserver process to appear ... 
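
The sed pipeline above splices a hosts {} stanza into the CoreDNS Corefile so host.minikube.internal resolves to the host-side gateway (192.168.64.1). A sketch for checking the result once the replace lands:

    # The Corefile in the coredns ConfigMap should now carry the injected block.
    kubectl -n kube-system get configmap coredns -o yaml | grep -A4 'hosts {'
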
I0908 08:32:47.812982 20823 ssh_runner.go:152] Run: sudo pgrep -xnf kube-apiserver.*minikube.* I0908 08:32:47.813536 20823 main.go:130] libmachine: (minikube) Calling .DriverName I0908 08:32:47.813723 20823 addons.go:337] installing /etc/kubernetes/addons/storageclass.yaml I0908 08:32:47.813730 20823 ssh_runner.go:319] scp memory --> /etc/kubernetes/addons/storageclass.yaml (271 bytes) I0908 08:32:47.813737 20823 main.go:130] libmachine: (minikube) Calling .GetSSHHostname I0908 08:32:47.813846 20823 main.go:130] libmachine: (minikube) Calling .GetSSHPort I0908 08:32:47.813957 20823 main.go:130] libmachine: (minikube) Calling .GetSSHKeyPath I0908 08:32:47.814065 20823 main.go:130] libmachine: (minikube) Calling .GetSSHUsername I0908 08:32:47.814479 20823 sshutil.go:53] new ssh client: &{IP:192.168.64.36 Port:22 SSHKeyPath:/Users/noah/.minikube/machines/minikube/id_rsa Username:docker} I0908 08:32:47.856229 20823 ssh_runner.go:152] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.22.1/kubectl apply -f /etc/kubernetes/addons/storage-provisioner.yaml I0908 08:32:47.877329 20823 ssh_runner.go:152] Run: sudo KUBECONFIG=/var/lib/minikube/kubeconfig /var/lib/minikube/binaries/v1.22.1/kubectl apply -f /etc/kubernetes/addons/storageclass.yaml I0908 08:32:48.021141 20823 start.go:729] {"host.minikube.internal": 192.168.64.1} host record injected into CoreDNS I0908 08:32:48.021163 20823 api_server.go:70] duration metric: took 322.789693ms to wait for apiserver process to appear ... I0908 08:32:48.021170 20823 api_server.go:86] waiting for apiserver healthz status ... I0908 08:32:48.021197 20823 api_server.go:239] Checking apiserver healthz at https://192.168.64.36:8443/healthz ... I0908 08:32:48.021267 20823 docker.go:558] Got preloaded images: -- stdout -- k8s.gcr.io/kube-apiserver:v1.22.1 k8s.gcr.io/kube-scheduler:v1.22.1 k8s.gcr.io/kube-proxy:v1.22.1 k8s.gcr.io/kube-controller-manager:v1.22.1 k8s.gcr.io/etcd:3.5.0-0 k8s.gcr.io/coredns/coredns:v1.8.4 gcr.io/k8s-minikube/storage-provisioner:v5 k8s.gcr.io/pause:3.5 kubernetesui/dashboard:v2.1.0 kubernetesui/metrics-scraper:v1.0.4 -- /stdout -- I0908 08:32:48.021272 20823 docker.go:564] local-repl-3 wasn't preloaded I0908 08:32:48.021277 20823 cache_images.go:82] LoadImages start: [local-repl-3 jettech/kube-webhook-certgen:v1.2.2 jettech/kube-webhook-certgen:v1.3.0 local-proxy-1 local-proxy-2 local-proxy-3 local-repl-1 local-repl-2 cryptexlabs/minikube-ingress-dns:0.3.0 local-repl-5 us.gcr.io/k8s-artifacts-prod/ingress-nginx/controller:v0.40.2 local-repl-4] I0908 08:32:48.027926 20823 api_server.go:265] https://192.168.64.36:8443/healthz returned 200: ok I0908 08:32:48.029380 20823 api_server.go:139] control plane version: v1.22.1 I0908 08:32:48.029391 20823 api_server.go:129] duration metric: took 8.218249ms to wait for apiserver health ... I0908 08:32:48.029401 20823 system_pods.go:43] waiting for kube-system pods to appear ... 
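
Note: the sed pipeline a few entries above rewrites the coredns ConfigMap so its Corefile gains a hosts block mapping 192.168.64.1 to host.minikube.internal, and the "host record injected into CoreDNS" entry confirms the replace succeeded. A minimal way to double-check the injected stanza, assuming kubectl on the host is still pointed at this cluster, is:

    kubectl -n kube-system get configmap coredns -o yaml | grep -A 3 'hosts {'
    #   hosts {
    #      192.168.64.1 host.minikube.internal
    #      fallthrough
    #   }
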
I0908 08:32:48.035400 20823 image.go:135] retrieving image: jettech/kube-webhook-certgen:v1.3.0 I0908 08:32:48.035415 20823 image.go:141] checking repository: index.docker.io/jettech/kube-webhook-certgen I0908 08:32:48.035419 20823 image.go:135] retrieving image: jettech/kube-webhook-certgen:v1.2.2 I0908 08:32:48.035426 20823 image.go:141] checking repository: index.docker.io/jettech/kube-webhook-certgen I0908 08:32:48.035534 20823 image.go:135] retrieving image: local-repl-3 I0908 08:32:48.035543 20823 image.go:141] checking repository: index.docker.io/library/local-repl-3 I0908 08:32:48.035933 20823 image.go:135] retrieving image: local-proxy-3 I0908 08:32:48.035952 20823 image.go:141] checking repository: index.docker.io/library/local-proxy-3 I0908 08:32:48.036865 20823 image.go:135] retrieving image: local-repl-5 I0908 08:32:48.036872 20823 image.go:141] checking repository: index.docker.io/library/local-repl-5 I0908 08:32:48.037683 20823 image.go:135] retrieving image: cryptexlabs/minikube-ingress-dns:0.3.0 I0908 08:32:48.037688 20823 image.go:141] checking repository: index.docker.io/cryptexlabs/minikube-ingress-dns I0908 08:32:48.038709 20823 image.go:135] retrieving image: local-repl-1 I0908 08:32:48.038716 20823 image.go:141] checking repository: index.docker.io/library/local-repl-1 I0908 08:32:48.039794 20823 system_pods.go:59] 4 kube-system pods found I0908 08:32:48.039806 20823 system_pods.go:61] "etcd-minikube" [30d01ec6-5c52-4838-bcd7-dac943f185d9] Pending I0908 08:32:48.039809 20823 system_pods.go:61] "kube-apiserver-minikube" [0b16991b-2460-4833-ac71-40fa1ceb773b] Pending I0908 08:32:48.039812 20823 system_pods.go:61] "kube-controller-manager-minikube" [280ab489-a6cc-40f9-b50e-c1cae34460b9] Pending I0908 08:32:48.039814 20823 system_pods.go:61] "kube-scheduler-minikube" [ffe7ca6a-de2b-4b60-82c4-1aaed10abbff] Pending I0908 08:32:48.039816 20823 system_pods.go:74] duration metric: took 10.412249ms to wait for pod list to return data ... I0908 08:32:48.039816 20823 image.go:135] retrieving image: local-proxy-2 I0908 08:32:48.039823 20823 image.go:141] checking repository: index.docker.io/library/local-proxy-2 I0908 08:32:48.039823 20823 kubeadm.go:547] duration metric: took 341.447714ms to wait for : map[apiserver:true system_pods:true] ... I0908 08:32:48.039831 20823 node_conditions.go:102] verifying NodePressure condition ... I0908 08:32:48.040316 20823 image.go:135] retrieving image: local-repl-2 I0908 08:32:48.040323 20823 image.go:141] checking repository: index.docker.io/library/local-repl-2 I0908 08:32:48.041383 20823 image.go:135] retrieving image: local-proxy-1 I0908 08:32:48.041391 20823 image.go:141] checking repository: index.docker.io/library/local-proxy-1 I0908 08:32:48.042139 20823 image.go:135] retrieving image: local-repl-4 I0908 08:32:48.042146 20823 image.go:141] checking repository: index.docker.io/library/local-repl-4 I0908 08:32:48.042182 20823 image.go:135] retrieving image: us.gcr.io/k8s-artifacts-prod/ingress-nginx/controller:v0.40.2 I0908 08:32:48.045794 20823 node_conditions.go:122] node storage ephemeral capacity is 72863488Ki I0908 08:32:48.045806 20823 node_conditions.go:123] node cpu capacity is 4 I0908 08:32:48.045811 20823 node_conditions.go:105] duration metric: took 5.978922ms to run NodePressure ... I0908 08:32:48.045817 20823 start.go:231] waiting for startup goroutines ... 
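
Note: the bare image names in the entries above (local-repl-1, local-proxy-3, and so on) carry no registry prefix, so the repository check falls back to index.docker.io/library/<name>; that is what produces the 401 Unauthorized responses further down, after which the images are served from minikube's local cache instead. As a sketch only (local-repl-1 is just the name used in this log, and the exact flags depend on the minikube release), an image is normally staged into that cache with:

    minikube cache add local-repl-1      # older syntax
    minikube image load local-repl-1     # current syntax
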
I0908 08:32:48.046866 20823 image.go:177] daemon lookup for us.gcr.io/k8s-artifacts-prod/ingress-nginx/controller:v0.40.2: Error response from daemon: reference does not exist I0908 08:32:48.099646 20823 main.go:130] libmachine: Making call to close driver server I0908 08:32:48.099711 20823 main.go:130] libmachine: (minikube) Calling .Close I0908 08:32:48.099760 20823 main.go:130] libmachine: Making call to close driver server I0908 08:32:48.099786 20823 main.go:130] libmachine: (minikube) Calling .Close I0908 08:32:48.100029 20823 main.go:130] libmachine: Successfully made call to close driver server I0908 08:32:48.100048 20823 main.go:130] libmachine: Making call to close connection to plugin binary I0908 08:32:48.100079 20823 main.go:130] libmachine: Making call to close driver server I0908 08:32:48.100085 20823 main.go:130] libmachine: (minikube) Calling .Close I0908 08:32:48.100134 20823 main.go:130] libmachine: (minikube) DBG | Closing plugin on server side I0908 08:32:48.100179 20823 main.go:130] libmachine: Successfully made call to close driver server I0908 08:32:48.100192 20823 main.go:130] libmachine: Making call to close connection to plugin binary I0908 08:32:48.100205 20823 main.go:130] libmachine: Making call to close driver server I0908 08:32:48.100209 20823 main.go:130] libmachine: (minikube) Calling .Close I0908 08:32:48.100318 20823 main.go:130] libmachine: Successfully made call to close driver server I0908 08:32:48.100326 20823 main.go:130] libmachine: Making call to close connection to plugin binary I0908 08:32:48.100339 20823 main.go:130] libmachine: (minikube) DBG | Closing plugin on server side I0908 08:32:48.100451 20823 main.go:130] libmachine: (minikube) DBG | Closing plugin on server side I0908 08:32:48.100564 20823 main.go:130] libmachine: Successfully made call to close driver server I0908 08:32:48.100574 20823 main.go:130] libmachine: Making call to close connection to plugin binary I0908 08:32:48.100591 20823 main.go:130] libmachine: Making call to close driver server I0908 08:32:48.100608 20823 main.go:130] libmachine: (minikube) Calling .Close I0908 08:32:48.100880 20823 main.go:130] libmachine: Successfully made call to close driver server I0908 08:32:48.100886 20823 main.go:130] libmachine: Making call to close connection to plugin binary I0908 08:32:48.100900 20823 main.go:130] libmachine: (minikube) DBG | Closing plugin on server side I0908 08:32:48.121343 20823 out.go:177] 🌟 Enabled addons: storage-provisioner, default-storageclass I0908 08:32:48.121369 20823 addons.go:406] enableAddons completed in 422.977526ms I0908 08:32:50.273587 20823 image.go:145] canonical name: docker.io/cryptexlabs/minikube-ingress-dns:0.3.0 I0908 08:32:50.277440 20823 image.go:177] daemon lookup for cryptexlabs/minikube-ingress-dns:0.3.0: Error response from daemon: reference does not exist I0908 08:32:50.525783 20823 image.go:145] canonical name: docker.io/jettech/kube-webhook-certgen:v1.3.0 I0908 08:32:50.531989 20823 image.go:177] daemon lookup for jettech/kube-webhook-certgen:v1.3.0: Error response from daemon: reference does not exist W0908 08:32:50.715657 20823 image.go:148] remote: HEAD https://index.docker.io/v2/library/local-proxy-3/manifests/latest: unexpected status code 401 Unauthorized (HEAD responses have no body, use GET for details) I0908 08:32:50.715700 20823 image.go:149] short name: local-proxy-3 W0908 08:32:50.717808 20823 image.go:148] remote: HEAD https://index.docker.io/v2/library/local-repl-5/manifests/latest: unexpected status code 401 Unauthorized 
(HEAD responses have no body, use GET for details) I0908 08:32:50.717844 20823 image.go:149] short name: local-repl-5 I0908 08:32:50.719529 20823 image.go:177] daemon lookup for local-proxy-3: Error response from daemon: reference does not exist I0908 08:32:50.723145 20823 image.go:177] daemon lookup for local-repl-5: Error response from daemon: reference does not exist I0908 08:32:50.748935 20823 ssh_runner.go:152] Run: docker image inspect --format {{.Id}} us.gcr.io/k8s-artifacts-prod/ingress-nginx/controller:v0.40.2 I0908 08:32:50.759487 20823 image.go:145] canonical name: docker.io/jettech/kube-webhook-certgen:v1.2.2 I0908 08:32:50.763070 20823 image.go:177] daemon lookup for jettech/kube-webhook-certgen:v1.2.2: Error response from daemon: reference does not exist I0908 08:32:50.772746 20823 cache_images.go:110] "us.gcr.io/k8s-artifacts-prod/ingress-nginx/controller:v0.40.2" needs transfer: "us.gcr.io/k8s-artifacts-prod/ingress-nginx/controller:v0.40.2" does not exist at hash "4b26fa2d90ae3bd47a8b9b49cf6a49cf195234ed32e4e0fef4787cd679218ff4" in container runtime I0908 08:32:50.772778 20823 docker.go:239] Removing image: us.gcr.io/k8s-artifacts-prod/ingress-nginx/controller:v0.40.2 I0908 08:32:50.772841 20823 ssh_runner.go:152] Run: docker rmi us.gcr.io/k8s-artifacts-prod/ingress-nginx/controller:v0.40.2 I0908 08:32:50.796089 20823 cache_images.go:280] Loading image from: /Users/noah/.minikube/cache/images/us.gcr.io/k8s-artifacts-prod/ingress-nginx/controller_v0.40.2 I0908 08:32:50.796287 20823 ssh_runner.go:152] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/images/controller_v0.40.2 I0908 08:32:50.799274 20823 ssh_runner.go:309] existence check for /var/lib/minikube/images/controller_v0.40.2: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/images/controller_v0.40.2: Process exited with status 1 stdout: stderr: stat: cannot statx '/var/lib/minikube/images/controller_v0.40.2': No such file or directory I0908 08:32:50.799304 20823 ssh_runner.go:319] scp /Users/noah/.minikube/cache/images/us.gcr.io/k8s-artifacts-prod/ingress-nginx/controller_v0.40.2 --> /var/lib/minikube/images/controller_v0.40.2 (103873024 bytes) W0908 08:32:50.931370 20823 image.go:148] remote: HEAD https://index.docker.io/v2/library/local-repl-1/manifests/latest: unexpected status code 401 Unauthorized (HEAD responses have no body, use GET for details) I0908 08:32:50.931384 20823 image.go:149] short name: local-repl-1 I0908 08:32:50.935273 20823 image.go:177] daemon lookup for local-repl-1: Error response from daemon: reference does not exist W0908 08:32:51.116051 20823 image.go:148] remote: HEAD https://index.docker.io/v2/library/local-repl-3/manifests/latest: unexpected status code 401 Unauthorized (HEAD responses have no body, use GET for details) I0908 08:32:51.116081 20823 image.go:149] short name: local-repl-3 I0908 08:32:51.119947 20823 image.go:177] daemon lookup for local-repl-3: Error response from daemon: reference does not exist W0908 08:32:51.143840 20823 image.go:148] remote: HEAD https://index.docker.io/v2/library/local-repl-2/manifests/latest: unexpected status code 401 Unauthorized (HEAD responses have no body, use GET for details) I0908 08:32:51.143889 20823 image.go:149] short name: local-repl-2 I0908 08:32:51.153241 20823 image.go:177] daemon lookup for local-repl-2: Error response from daemon: reference does not exist I0908 08:32:51.180946 20823 docker.go:206] Loading image: /var/lib/minikube/images/controller_v0.40.2 I0908 08:32:51.180996 20823 ssh_runner.go:152] Run: /bin/bash 
-c "sudo cat /var/lib/minikube/images/controller_v0.40.2 | docker load" W0908 08:32:51.310531 20823 image.go:148] remote: HEAD https://index.docker.io/v2/library/local-proxy-2/manifests/latest: unexpected status code 401 Unauthorized (HEAD responses have no body, use GET for details) I0908 08:32:51.310551 20823 image.go:149] short name: local-proxy-2 I0908 08:32:51.315459 20823 image.go:177] daemon lookup for local-proxy-2: Error response from daemon: reference does not exist W0908 08:32:51.351152 20823 image.go:148] remote: HEAD https://index.docker.io/v2/library/local-proxy-1/manifests/latest: unexpected status code 401 Unauthorized (HEAD responses have no body, use GET for details) I0908 08:32:51.351178 20823 image.go:149] short name: local-proxy-1 I0908 08:32:51.355675 20823 image.go:177] daemon lookup for local-proxy-1: Error response from daemon: reference does not exist W0908 08:32:51.529458 20823 image.go:148] remote: HEAD https://index.docker.io/v2/library/local-repl-4/manifests/latest: unexpected status code 401 Unauthorized (HEAD responses have no body, use GET for details) I0908 08:32:51.529485 20823 image.go:149] short name: local-repl-4 I0908 08:32:51.532819 20823 image.go:177] daemon lookup for local-repl-4: Error response from daemon: reference does not exist I0908 08:32:52.307038 20823 ssh_runner.go:152] Run: docker image inspect --format {{.Id}} cryptexlabs/minikube-ingress-dns:0.3.0 W0908 08:32:52.454178 20823 image.go:187] authn lookup for local-repl-5 (trying anon): GET https://index.docker.io/v2/library/local-repl-5/manifests/latest: UNAUTHORIZED: authentication required; [map[Action:pull Class: Name:library/local-repl-5 Type:repository]] W0908 08:32:52.471467 20823 image.go:187] authn lookup for local-proxy-3 (trying anon): GET https://index.docker.io/v2/library/local-proxy-3/manifests/latest: UNAUTHORIZED: authentication required; [map[Action:pull Class: Name:library/local-proxy-3 Type:repository]] W0908 08:32:52.477947 20823 image.go:187] authn lookup for local-repl-1 (trying anon): GET https://index.docker.io/v2/library/local-repl-1/manifests/latest: UNAUTHORIZED: authentication required; [map[Action:pull Class: Name:library/local-repl-1 Type:repository]] W0908 08:32:52.635690 20823 image.go:187] authn lookup for local-repl-3 (trying anon): GET https://index.docker.io/v2/library/local-repl-3/manifests/latest: UNAUTHORIZED: authentication required; [map[Action:pull Class: Name:library/local-repl-3 Type:repository]] W0908 08:32:52.672698 20823 image.go:187] authn lookup for local-repl-2 (trying anon): GET https://index.docker.io/v2/library/local-repl-2/manifests/latest: UNAUTHORIZED: authentication required; [map[Action:pull Class: Name:library/local-repl-2 Type:repository]] W0908 08:32:52.741014 20823 image.go:187] authn lookup for local-proxy-2 (trying anon): GET https://index.docker.io/v2/library/local-proxy-2/manifests/latest: UNAUTHORIZED: authentication required; [map[Action:pull Class: Name:library/local-proxy-2 Type:repository]] I0908 08:32:52.741690 20823 ssh_runner.go:152] Run: docker image inspect --format {{.Id}} jettech/kube-webhook-certgen:v1.3.0 I0908 08:32:52.754608 20823 ssh_runner.go:152] Run: docker image inspect --format {{.Id}} jettech/kube-webhook-certgen:v1.2.2 W0908 08:32:52.860861 20823 image.go:187] authn lookup for local-proxy-1 (trying anon): GET https://index.docker.io/v2/library/local-proxy-1/manifests/latest: UNAUTHORIZED: authentication required; [map[Action:pull Class: Name:library/local-proxy-1 Type:repository]] W0908 
08:32:52.971994 20823 image.go:187] authn lookup for local-repl-4 (trying anon): GET https://index.docker.io/v2/library/local-repl-4/manifests/latest: UNAUTHORIZED: authentication required; [map[Action:pull Class: Name:library/local-repl-4 Type:repository]] I0908 08:32:54.380869 20823 image.go:191] remote lookup for local-proxy-3: GET https://index.docker.io/v2/library/local-proxy-3/manifests/latest: UNAUTHORIZED: authentication required; [map[Action:pull Class: Name:library/local-proxy-3 Type:repository]] I0908 08:32:54.380891 20823 image.go:94] error retrieve Image local-proxy-3 ref GET https://index.docker.io/v2/library/local-proxy-3/manifests/latest: UNAUTHORIZED: authentication required; [map[Action:pull Class: Name:library/local-proxy-3 Type:repository]] I0908 08:32:54.380910 20823 cache_images.go:110] "local-proxy-3" needs transfer: got empty img digest "" for local-proxy-3 I0908 08:32:54.380947 20823 docker.go:239] Removing image: local-proxy-3 I0908 08:32:54.381066 20823 ssh_runner.go:152] Run: docker rmi local-proxy-3 I0908 08:32:54.391552 20823 image.go:191] remote lookup for local-repl-5: GET https://index.docker.io/v2/library/local-repl-5/manifests/latest: UNAUTHORIZED: authentication required; [map[Action:pull Class: Name:library/local-repl-5 Type:repository]] I0908 08:32:54.391580 20823 image.go:94] error retrieve Image local-repl-5 ref GET https://index.docker.io/v2/library/local-repl-5/manifests/latest: UNAUTHORIZED: authentication required; [map[Action:pull Class: Name:library/local-repl-5 Type:repository]] I0908 08:32:54.391601 20823 cache_images.go:110] "local-repl-5" needs transfer: got empty img digest "" for local-repl-5 I0908 08:32:54.391619 20823 docker.go:239] Removing image: local-repl-5 I0908 08:32:54.391691 20823 ssh_runner.go:152] Run: docker rmi local-repl-5 I0908 08:32:54.413507 20823 image.go:191] remote lookup for local-repl-1: GET https://index.docker.io/v2/library/local-repl-1/manifests/latest: UNAUTHORIZED: authentication required; [map[Action:pull Class: Name:library/local-repl-1 Type:repository]] I0908 08:32:54.413531 20823 image.go:94] error retrieve Image local-repl-1 ref GET https://index.docker.io/v2/library/local-repl-1/manifests/latest: UNAUTHORIZED: authentication required; [map[Action:pull Class: Name:library/local-repl-1 Type:repository]] I0908 08:32:54.413550 20823 cache_images.go:110] "local-repl-1" needs transfer: got empty img digest "" for local-repl-1 I0908 08:32:54.413567 20823 docker.go:239] Removing image: local-repl-1 I0908 08:32:54.413644 20823 ssh_runner.go:152] Run: docker rmi local-repl-1 I0908 08:32:54.493170 20823 image.go:191] remote lookup for local-repl-3: GET https://index.docker.io/v2/library/local-repl-3/manifests/latest: UNAUTHORIZED: authentication required; [map[Action:pull Class: Name:library/local-repl-3 Type:repository]] I0908 08:32:54.493194 20823 image.go:94] error retrieve Image local-repl-3 ref GET https://index.docker.io/v2/library/local-repl-3/manifests/latest: UNAUTHORIZED: authentication required; [map[Action:pull Class: Name:library/local-repl-3 Type:repository]] I0908 08:32:54.493218 20823 cache_images.go:110] "local-repl-3" needs transfer: got empty img digest "" for local-repl-3 I0908 08:32:54.493235 20823 docker.go:239] Removing image: local-repl-3 I0908 08:32:54.493325 20823 ssh_runner.go:152] Run: docker rmi local-repl-3 I0908 08:32:54.592222 20823 image.go:191] remote lookup for local-repl-2: GET https://index.docker.io/v2/library/local-repl-2/manifests/latest: UNAUTHORIZED: authentication required; 
[map[Action:pull Class: Name:library/local-repl-2 Type:repository]] I0908 08:32:54.592248 20823 image.go:94] error retrieve Image local-repl-2 ref GET https://index.docker.io/v2/library/local-repl-2/manifests/latest: UNAUTHORIZED: authentication required; [map[Action:pull Class: Name:library/local-repl-2 Type:repository]] I0908 08:32:54.592270 20823 cache_images.go:110] "local-repl-2" needs transfer: got empty img digest "" for local-repl-2 I0908 08:32:54.592286 20823 docker.go:239] Removing image: local-repl-2 I0908 08:32:54.592354 20823 ssh_runner.go:152] Run: docker rmi local-repl-2 I0908 08:32:54.638207 20823 image.go:191] remote lookup for local-proxy-2: GET https://index.docker.io/v2/library/local-proxy-2/manifests/latest: UNAUTHORIZED: authentication required; [map[Action:pull Class: Name:library/local-proxy-2 Type:repository]] I0908 08:32:54.638227 20823 image.go:94] error retrieve Image local-proxy-2 ref GET https://index.docker.io/v2/library/local-proxy-2/manifests/latest: UNAUTHORIZED: authentication required; [map[Action:pull Class: Name:library/local-proxy-2 Type:repository]] I0908 08:32:54.638242 20823 cache_images.go:110] "local-proxy-2" needs transfer: got empty img digest "" for local-proxy-2 I0908 08:32:54.638260 20823 docker.go:239] Removing image: local-proxy-2 I0908 08:32:54.638329 20823 ssh_runner.go:152] Run: docker rmi local-proxy-2 I0908 08:32:54.687225 20823 image.go:191] remote lookup for local-proxy-1: GET https://index.docker.io/v2/library/local-proxy-1/manifests/latest: UNAUTHORIZED: authentication required; [map[Action:pull Class: Name:library/local-proxy-1 Type:repository]] I0908 08:32:54.687250 20823 image.go:94] error retrieve Image local-proxy-1 ref GET https://index.docker.io/v2/library/local-proxy-1/manifests/latest: UNAUTHORIZED: authentication required; [map[Action:pull Class: Name:library/local-proxy-1 Type:repository]] I0908 08:32:54.687290 20823 cache_images.go:110] "local-proxy-1" needs transfer: got empty img digest "" for local-proxy-1 I0908 08:32:54.687305 20823 docker.go:239] Removing image: local-proxy-1 I0908 08:32:54.687376 20823 ssh_runner.go:152] Run: docker rmi local-proxy-1 I0908 08:32:54.846677 20823 image.go:191] remote lookup for local-repl-4: GET https://index.docker.io/v2/library/local-repl-4/manifests/latest: UNAUTHORIZED: authentication required; [map[Action:pull Class: Name:library/local-repl-4 Type:repository]] I0908 08:32:54.846751 20823 image.go:94] error retrieve Image local-repl-4 ref GET https://index.docker.io/v2/library/local-repl-4/manifests/latest: UNAUTHORIZED: authentication required; [map[Action:pull Class: Name:library/local-repl-4 Type:repository]] I0908 08:32:54.846787 20823 cache_images.go:110] "local-repl-4" needs transfer: got empty img digest "" for local-repl-4 I0908 08:32:54.846803 20823 docker.go:239] Removing image: local-repl-4 I0908 08:32:54.846861 20823 ssh_runner.go:152] Run: docker rmi local-repl-4 I0908 08:32:56.920528 20823 ssh_runner.go:192] Completed: /bin/bash -c "sudo cat /var/lib/minikube/images/controller_v0.40.2 | docker load": (5.739460303s) I0908 08:32:56.920555 20823 cache_images.go:309] Transferred and loaded /Users/noah/.minikube/cache/images/us.gcr.io/k8s-artifacts-prod/ingress-nginx/controller_v0.40.2 from cache I0908 08:32:56.920574 20823 ssh_runner.go:192] Completed: docker image inspect --format {{.Id}} cryptexlabs/minikube-ingress-dns:0.3.0: (4.613473981s) I0908 08:32:56.920593 20823 cache_images.go:110] "cryptexlabs/minikube-ingress-dns:0.3.0" needs transfer: 
"cryptexlabs/minikube-ingress-dns:0.3.0" does not exist at hash "98c804e339f5e241d7024967e9849a8d3d039f00f64641c4eaafbf1e2739f05e" in container runtime I0908 08:32:56.920607 20823 docker.go:239] Removing image: cryptexlabs/minikube-ingress-dns:0.3.0 I0908 08:32:56.920647 20823 ssh_runner.go:192] Completed: docker image inspect --format {{.Id}} jettech/kube-webhook-certgen:v1.3.0: (4.178870194s) I0908 08:32:56.920684 20823 cache_images.go:110] "jettech/kube-webhook-certgen:v1.3.0" needs transfer: "jettech/kube-webhook-certgen:v1.3.0" does not exist at hash "4d4f44df9f905cc6fcd61eb1e068a2fb296581980b19a7812e6e897ff7991612" in container runtime I0908 08:32:56.920692 20823 ssh_runner.go:192] Completed: docker image inspect --format {{.Id}} jettech/kube-webhook-certgen:v1.2.2: (4.166028904s) I0908 08:32:56.920723 20823 docker.go:239] Removing image: jettech/kube-webhook-certgen:v1.3.0 I0908 08:32:56.920725 20823 ssh_runner.go:152] Run: docker rmi cryptexlabs/minikube-ingress-dns:0.3.0 I0908 08:32:56.920726 20823 ssh_runner.go:192] Completed: docker rmi local-proxy-3: (2.539629401s) I0908 08:32:56.920730 20823 cache_images.go:110] "jettech/kube-webhook-certgen:v1.2.2" needs transfer: "jettech/kube-webhook-certgen:v1.2.2" does not exist at hash "5693ebf5622ae858533c04eaa60c723efdacf8dcb75af968653015646380883f" in container runtime I0908 08:32:56.920738 20823 cache_images.go:280] Loading image from: /Users/noah/.minikube/cache/images/local-proxy-3 I0908 08:32:56.920740 20823 docker.go:239] Removing image: jettech/kube-webhook-certgen:v1.2.2 I0908 08:32:56.920769 20823 ssh_runner.go:152] Run: docker rmi jettech/kube-webhook-certgen:v1.3.0 I0908 08:32:56.920785 20823 ssh_runner.go:152] Run: docker rmi jettech/kube-webhook-certgen:v1.2.2 I0908 08:32:56.920788 20823 ssh_runner.go:192] Completed: docker rmi local-repl-5: (2.529061841s) I0908 08:32:56.920815 20823 cache_images.go:280] Loading image from: /Users/noah/.minikube/cache/images/local-repl-5 I0908 08:32:56.920892 20823 ssh_runner.go:152] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/images/local-proxy-3 I0908 08:32:56.920896 20823 ssh_runner.go:192] Completed: docker rmi local-repl-1: (2.507216546s) I0908 08:32:56.920956 20823 ssh_runner.go:192] Completed: docker rmi local-repl-3: (2.42759741s) I0908 08:32:56.920964 20823 cache_images.go:280] Loading image from: /Users/noah/.minikube/cache/images/local-repl-1 I0908 08:32:56.920965 20823 cache_images.go:280] Loading image from: /Users/noah/.minikube/cache/images/local-repl-3 I0908 08:32:56.921024 20823 ssh_runner.go:192] Completed: docker rmi local-repl-2: (2.328638035s) I0908 08:32:56.921038 20823 cache_images.go:280] Loading image from: /Users/noah/.minikube/cache/images/local-repl-2 I0908 08:32:56.921056 20823 ssh_runner.go:152] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/images/local-repl-5 I0908 08:32:56.921074 20823 ssh_runner.go:192] Completed: docker rmi local-proxy-2: (2.282714874s) I0908 08:32:56.921087 20823 cache_images.go:280] Loading image from: /Users/noah/.minikube/cache/images/local-proxy-2 I0908 08:32:56.921143 20823 ssh_runner.go:152] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/images/local-repl-3 I0908 08:32:56.921144 20823 ssh_runner.go:192] Completed: docker rmi local-proxy-1: (2.233725297s) I0908 08:32:56.921171 20823 ssh_runner.go:152] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/images/local-repl-1 I0908 08:32:56.921177 20823 ssh_runner.go:192] Completed: docker rmi local-repl-4: (2.074287974s) I0908 
08:32:56.921195 20823 cache_images.go:280] Loading image from: /Users/noah/.minikube/cache/images/local-proxy-1 I0908 08:32:56.921199 20823 cache_images.go:280] Loading image from: /Users/noah/.minikube/cache/images/local-repl-4 I0908 08:32:56.921208 20823 ssh_runner.go:152] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/images/local-repl-2 I0908 08:32:56.921231 20823 ssh_runner.go:152] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/images/local-proxy-2 I0908 08:32:56.921368 20823 ssh_runner.go:152] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/images/local-repl-4 I0908 08:32:56.921418 20823 ssh_runner.go:152] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/images/local-proxy-1 I0908 08:32:56.950164 20823 ssh_runner.go:309] existence check for /var/lib/minikube/images/local-proxy-3: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/images/local-proxy-3: Process exited with status 1 stdout: stderr: stat: cannot statx '/var/lib/minikube/images/local-proxy-3': No such file or directory I0908 08:32:56.950219 20823 cache_images.go:280] Loading image from: /Users/noah/.minikube/cache/images/cryptexlabs/minikube-ingress-dns_0.3.0 I0908 08:32:56.950244 20823 ssh_runner.go:319] scp /Users/noah/.minikube/cache/images/local-proxy-3 --> /var/lib/minikube/images/local-proxy-3 (447795712 bytes) I0908 08:32:56.950485 20823 ssh_runner.go:152] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/images/minikube-ingress-dns_0.3.0 I0908 08:32:56.967248 20823 cache_images.go:280] Loading image from: /Users/noah/.minikube/cache/images/jettech/kube-webhook-certgen_v1.2.2 I0908 08:32:56.967270 20823 cache_images.go:280] Loading image from: /Users/noah/.minikube/cache/images/jettech/kube-webhook-certgen_v1.3.0 I0908 08:32:56.967294 20823 ssh_runner.go:309] existence check for /var/lib/minikube/images/local-repl-1: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/images/local-repl-1: Process exited with status 1 stdout: stderr: stat: cannot statx '/var/lib/minikube/images/local-repl-1': No such file or directory I0908 08:32:56.967335 20823 ssh_runner.go:319] scp /Users/noah/.minikube/cache/images/local-repl-1 --> /var/lib/minikube/images/local-repl-1 (421766656 bytes) I0908 08:32:56.967347 20823 ssh_runner.go:309] existence check for /var/lib/minikube/images/local-repl-3: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/images/local-repl-3: Process exited with status 1 stdout: stderr: stat: cannot statx '/var/lib/minikube/images/local-repl-3': No such file or directory I0908 08:32:56.967374 20823 ssh_runner.go:319] scp /Users/noah/.minikube/cache/images/local-repl-3 --> /var/lib/minikube/images/local-repl-3 (1806964224 bytes) I0908 08:32:56.967395 20823 ssh_runner.go:309] existence check for /var/lib/minikube/images/local-repl-5: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/images/local-repl-5: Process exited with status 1 stdout: stderr: stat: cannot statx '/var/lib/minikube/images/local-repl-5': No such file or directory I0908 08:32:56.967424 20823 ssh_runner.go:309] existence check for /var/lib/minikube/images/local-proxy-2: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/images/local-proxy-2: Process exited with status 1 stdout: stderr: stat: cannot statx '/var/lib/minikube/images/local-proxy-2': No such file or directory I0908 08:32:56.967437 20823 ssh_runner.go:152] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/images/kube-webhook-certgen_v1.3.0 I0908 08:32:56.967444 20823 ssh_runner.go:319] scp 
/Users/noah/.minikube/cache/images/local-repl-5 --> /var/lib/minikube/images/local-repl-5 (1794524672 bytes) I0908 08:32:56.967447 20823 ssh_runner.go:152] Run: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/images/kube-webhook-certgen_v1.2.2 I0908 08:32:56.967463 20823 ssh_runner.go:319] scp /Users/noah/.minikube/cache/images/local-proxy-2 --> /var/lib/minikube/images/local-proxy-2 (448508416 bytes) I0908 08:32:56.967475 20823 ssh_runner.go:309] existence check for /var/lib/minikube/images/local-repl-2: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/images/local-repl-2: Process exited with status 1 stdout: stderr: stat: cannot statx '/var/lib/minikube/images/local-repl-2': No such file or directory I0908 08:32:56.967499 20823 ssh_runner.go:309] existence check for /var/lib/minikube/images/local-proxy-1: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/images/local-proxy-1: Process exited with status 1 stdout: stderr: stat: cannot statx '/var/lib/minikube/images/local-proxy-1': No such file or directory I0908 08:32:56.967497 20823 ssh_runner.go:319] scp /Users/noah/.minikube/cache/images/local-repl-2 --> /var/lib/minikube/images/local-repl-2 (1806963200 bytes) I0908 08:32:56.967515 20823 ssh_runner.go:319] scp /Users/noah/.minikube/cache/images/local-proxy-1 --> /var/lib/minikube/images/local-proxy-1 (448507904 bytes) I0908 08:32:56.967527 20823 ssh_runner.go:309] existence check for /var/lib/minikube/images/local-repl-4: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/images/local-repl-4: Process exited with status 1 stdout: stderr: stat: cannot statx '/var/lib/minikube/images/local-repl-4': No such file or directory I0908 08:32:56.967552 20823 ssh_runner.go:319] scp /Users/noah/.minikube/cache/images/local-repl-4 --> /var/lib/minikube/images/local-repl-4 (1804025344 bytes) I0908 08:32:56.973480 20823 ssh_runner.go:309] existence check for /var/lib/minikube/images/minikube-ingress-dns_0.3.0: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/images/minikube-ingress-dns_0.3.0: Process exited with status 1 stdout: stderr: stat: cannot statx '/var/lib/minikube/images/minikube-ingress-dns_0.3.0': No such file or directory I0908 08:32:56.973525 20823 ssh_runner.go:319] scp /Users/noah/.minikube/cache/images/cryptexlabs/minikube-ingress-dns_0.3.0 --> /var/lib/minikube/images/minikube-ingress-dns_0.3.0 (57170944 bytes) I0908 08:32:57.017078 20823 ssh_runner.go:309] existence check for /var/lib/minikube/images/kube-webhook-certgen_v1.3.0: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/images/kube-webhook-certgen_v1.3.0: Process exited with status 1 stdout: stderr: stat: cannot statx '/var/lib/minikube/images/kube-webhook-certgen_v1.3.0': No such file or directory I0908 08:32:57.017164 20823 ssh_runner.go:319] scp /Users/noah/.minikube/cache/images/jettech/kube-webhook-certgen_v1.3.0 --> /var/lib/minikube/images/kube-webhook-certgen_v1.3.0 (23712256 bytes) I0908 08:32:57.018893 20823 ssh_runner.go:309] existence check for /var/lib/minikube/images/kube-webhook-certgen_v1.2.2: stat -c "%!s(MISSING) %!y(MISSING)" /var/lib/minikube/images/kube-webhook-certgen_v1.2.2: Process exited with status 1 stdout: stderr: stat: cannot statx '/var/lib/minikube/images/kube-webhook-certgen_v1.2.2': No such file or directory I0908 08:32:57.018923 20823 ssh_runner.go:319] scp /Users/noah/.minikube/cache/images/jettech/kube-webhook-certgen_v1.2.2 --> /var/lib/minikube/images/kube-webhook-certgen_v1.2.2 (23566848 bytes) I0908 08:32:58.077183 20823 docker.go:206] Loading image: 
/var/lib/minikube/images/kube-webhook-certgen_v1.2.2 I0908 08:32:58.077207 20823 ssh_runner.go:152] Run: /bin/bash -c "sudo cat /var/lib/minikube/images/kube-webhook-certgen_v1.2.2 | docker load" I0908 08:33:02.246093 20823 ssh_runner.go:192] Completed: /bin/bash -c "sudo cat /var/lib/minikube/images/kube-webhook-certgen_v1.2.2 | docker load": (4.168827153s) I0908 08:33:02.246106 20823 cache_images.go:309] Transferred and loaded /Users/noah/.minikube/cache/images/jettech/kube-webhook-certgen_v1.2.2 from cache I0908 08:33:02.246127 20823 docker.go:206] Loading image: /var/lib/minikube/images/kube-webhook-certgen_v1.3.0 I0908 08:33:02.246139 20823 ssh_runner.go:152] Run: /bin/bash -c "sudo cat /var/lib/minikube/images/kube-webhook-certgen_v1.3.0 | docker load" I0908 08:33:04.386054 20823 ssh_runner.go:192] Completed: /bin/bash -c "sudo cat /var/lib/minikube/images/kube-webhook-certgen_v1.3.0 | docker load": (2.139870365s) I0908 08:33:04.386069 20823 cache_images.go:309] Transferred and loaded /Users/noah/.minikube/cache/images/jettech/kube-webhook-certgen_v1.3.0 from cache I0908 08:33:04.386090 20823 docker.go:206] Loading image: /var/lib/minikube/images/minikube-ingress-dns_0.3.0 I0908 08:33:04.386102 20823 ssh_runner.go:152] Run: /bin/bash -c "sudo cat /var/lib/minikube/images/minikube-ingress-dns_0.3.0 | docker load" I0908 08:33:15.613614 20823 ssh_runner.go:192] Completed: /bin/bash -c "sudo cat /var/lib/minikube/images/minikube-ingress-dns_0.3.0 | docker load": (11.227379184s) I0908 08:33:15.613624 20823 cache_images.go:309] Transferred and loaded /Users/noah/.minikube/cache/images/cryptexlabs/minikube-ingress-dns_0.3.0 from cache I0908 08:33:15.613640 20823 docker.go:206] Loading image: /var/lib/minikube/images/local-repl-1 I0908 08:33:15.613658 20823 ssh_runner.go:152] Run: /bin/bash -c "sudo cat /var/lib/minikube/images/local-repl-1 | docker load" I0908 08:33:43.426589 20823 ssh_runner.go:192] Completed: /bin/bash -c "sudo cat /var/lib/minikube/images/local-repl-1 | docker load": (27.812613207s) I0908 08:33:43.426621 20823 cache_images.go:309] Transferred and loaded /Users/noah/.minikube/cache/images/local-repl-1 from cache I0908 08:33:43.426652 20823 docker.go:206] Loading image: /var/lib/minikube/images/local-proxy-1 I0908 08:33:43.426699 20823 ssh_runner.go:152] Run: /bin/bash -c "sudo cat /var/lib/minikube/images/local-proxy-1 | docker load" I0908 08:34:04.807309 20823 ssh_runner.go:192] Completed: /bin/bash -c "sudo cat /var/lib/minikube/images/local-proxy-1 | docker load": (21.380366867s) I0908 08:34:04.807336 20823 cache_images.go:309] Transferred and loaded /Users/noah/.minikube/cache/images/local-proxy-1 from cache I0908 08:34:04.807374 20823 docker.go:206] Loading image: /var/lib/minikube/images/local-proxy-3 I0908 08:34:04.807382 20823 ssh_runner.go:152] Run: /bin/bash -c "sudo cat /var/lib/minikube/images/local-proxy-3 | docker load" I0908 08:34:09.208235 20823 ssh_runner.go:192] Completed: /bin/bash -c "sudo cat /var/lib/minikube/images/local-proxy-3 | docker load": (4.400796659s) I0908 08:34:09.208264 20823 cache_images.go:309] Transferred and loaded /Users/noah/.minikube/cache/images/local-proxy-3 from cache I0908 08:34:09.208303 20823 docker.go:206] Loading image: /var/lib/minikube/images/local-proxy-2 I0908 08:34:09.208328 20823 ssh_runner.go:152] Run: /bin/bash -c "sudo cat /var/lib/minikube/images/local-proxy-2 | docker load" I0908 08:34:10.118142 20823 cache_images.go:309] Transferred and loaded /Users/noah/.minikube/cache/images/local-proxy-2 from cache I0908 
08:34:10.118170 20823 docker.go:206] Loading image: /var/lib/minikube/images/local-repl-4
I0908 08:34:10.118230 20823 ssh_runner.go:152] Run: /bin/bash -c "sudo cat /var/lib/minikube/images/local-repl-4 | docker load"
I0908 08:35:13.191148 20823 ssh_runner.go:192] Completed: /bin/bash -c "sudo cat /var/lib/minikube/images/local-repl-4 | docker load": (1m3.072245977s)
I0908 08:35:13.191172 20823 cache_images.go:309] Transferred and loaded /Users/noah/.minikube/cache/images/local-repl-4 from cache
I0908 08:35:13.191212 20823 docker.go:206] Loading image: /var/lib/minikube/images/local-repl-3
I0908 08:35:13.191262 20823 ssh_runner.go:152] Run: /bin/bash -c "sudo cat /var/lib/minikube/images/local-repl-3 | docker load"
I0908 08:36:13.829995 20823 ssh_runner.go:192] Completed: /bin/bash -c "sudo cat /var/lib/minikube/images/local-repl-3 | docker load": (1m0.638081054s)
I0908 08:36:13.830004 20823 cache_images.go:309] Transferred and loaded /Users/noah/.minikube/cache/images/local-repl-3 from cache
I0908 08:36:13.830019 20823 docker.go:206] Loading image: /var/lib/minikube/images/local-repl-2
I0908 08:36:13.830028 20823 ssh_runner.go:152] Run: /bin/bash -c "sudo cat /var/lib/minikube/images/local-repl-2 | docker load"
I0908 08:36:20.766033 20823 ssh_runner.go:192] Completed: /bin/bash -c "sudo cat /var/lib/minikube/images/local-repl-2 | docker load": (6.935922786s)
I0908 08:36:20.766043 20823 cache_images.go:309] Transferred and loaded /Users/noah/.minikube/cache/images/local-repl-2 from cache
I0908 08:36:20.766069 20823 docker.go:206] Loading image: /var/lib/minikube/images/local-repl-5
I0908 08:36:20.766089 20823 ssh_runner.go:152] Run: /bin/bash -c "sudo cat /var/lib/minikube/images/local-repl-5 | docker load"
I0908 08:36:26.387434 20823 ssh_runner.go:192] Completed: /bin/bash -c "sudo cat /var/lib/minikube/images/local-repl-5 | docker load": (5.6212677s)
I0908 08:36:26.387445 20823 cache_images.go:309] Transferred and loaded /Users/noah/.minikube/cache/images/local-repl-5 from cache
I0908 08:36:26.387481 20823 cache_images.go:117] Successfully loaded all cached images
I0908 08:36:26.387483 20823 cache_images.go:86] LoadImages completed in 3m38.363908338s
I0908 08:36:26.387487 20823 cache_images.go:256] succeeded pushing to: minikube
I0908 08:36:26.387488 20823 cache_images.go:257] failed pushing to:
I0908 08:36:26.387529 20823 main.go:130] libmachine: Making call to close driver server
I0908 08:36:26.387535 20823 main.go:130] libmachine: (minikube) Calling .Close
I0908 08:36:26.387707 20823 main.go:130] libmachine: Successfully made call to close driver server
I0908 08:36:26.387717 20823 main.go:130] libmachine: Making call to close connection to plugin binary
I0908 08:36:26.387722 20823 main.go:130] libmachine: Making call to close driver server
I0908 08:36:26.387726 20823 main.go:130] libmachine: (minikube) Calling .Close
I0908 08:36:26.387886 20823 main.go:130] libmachine: Successfully made call to close driver server
I0908 08:36:26.387893 20823 main.go:130] libmachine: Making call to close connection to plugin binary
I0908 08:36:26.387907 20823 main.go:130] libmachine: (minikube) DBG | Closing plugin on server side
I0908 08:36:26.623450 20823 start.go:462] kubectl: 1.22.1, cluster: 1.22.1 (minor skew: 0)
I0908 08:36:26.660464 20823 out.go:177] 🏄 Done! kubectl is now configured to use "minikube" cluster and "default" namespace by default
*
* ==> Docker <==
*
-- Journal begins at Tue 2021-09-07 20:32:17 UTC, ends at Tue 2021-09-07 20:47:47 UTC.
-- Sep 07 20:32:32 minikube systemd[1]: Started Docker Application Container Engine. Sep 07 20:32:32 minikube dockerd[2245]: time="2021-09-07T20:32:32.332389026Z" level=info msg="API listen on [::]:2376" Sep 07 20:32:32 minikube dockerd[2245]: time="2021-09-07T20:32:32.339557693Z" level=info msg="API listen on /var/run/docker.sock" Sep 07 20:32:39 minikube dockerd[2253]: time="2021-09-07T20:32:39.330645916Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/228bb6c2419021689fb1fe503609400e4306cc5bc6324a283272fe623fb7927d pid=3223 Sep 07 20:32:39 minikube dockerd[2253]: time="2021-09-07T20:32:39.346690382Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/8c6f0b71a346a946eb85dc2637bc58c4b7b5e131a5074fd9ccc0b76e29213fe2 pid=3254 Sep 07 20:32:39 minikube dockerd[2253]: time="2021-09-07T20:32:39.349084743Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/a8d8549ace855cd37c9126bd048880bd4722645754cc174d2a47dc461115a95a pid=3270 Sep 07 20:32:39 minikube dockerd[2253]: time="2021-09-07T20:32:39.350005789Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/243f17f7a7fbb8e5fe313ce76cc98f1b090e6aba20cad530b813f99ddd5f7570 pid=3271 Sep 07 20:32:39 minikube dockerd[2253]: time="2021-09-07T20:32:39.824396524Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/329527d534fe06633490749f62768bd0b6e24395f3cf67b51599ba7a6ed4221d pid=3417 Sep 07 20:32:39 minikube dockerd[2253]: time="2021-09-07T20:32:39.828899360Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/7776fe7de1ebef6f09e0d0809fb470eb0e2ba34bf4b0f1c547391c5bb9b2aabf pid=3439 Sep 07 20:32:39 minikube dockerd[2253]: time="2021-09-07T20:32:39.829787308Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/b9d34792ddd77f5365a31726108145b54e7c0f1c9be7d6e6c3e167cc11777073 pid=3445 Sep 07 20:32:39 minikube dockerd[2253]: time="2021-09-07T20:32:39.841385904Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/275d4050620cfafa771faa83c35c726a4d22c9b4d20d5dee1692919913a9ce2f pid=3473 Sep 07 20:33:02 minikube dockerd[2253]: time="2021-09-07T20:33:02.730288417Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/42917dba1174f5e9b50e4957a30f6d184e363effef4b72baf9c27efe02c77922 pid=4766 Sep 07 20:33:02 minikube dockerd[2253]: time="2021-09-07T20:33:02.758683189Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/d413025e04818372c0df1d624b215c0dd4219875924c3b9afe7421b9585b606c pid=4762 Sep 07 20:33:02 minikube dockerd[2253]: time="2021-09-07T20:33:02.788546542Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/1ee13061cfbf6aaffb6604cd5f372c1d4a93b7d0d5040b6a554c3afa716e72b8 pid=4789 Sep 07 20:33:03 minikube dockerd[2253]: time="2021-09-07T20:33:03.657467701Z" level=info msg="starting signal loop" namespace=moby 
path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/6f568f46ea279a7950dd5594830bc44ea9409f05023187b5a9a3292a33c4807a pid=4920 Sep 07 20:33:04 minikube dockerd[2253]: time="2021-09-07T20:33:04.074449520Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/cb0ef374ef5db2b328c9ab049c269d0200a93cb4e8eec94a4d109bafaf773b94 pid=4966 Sep 07 20:33:04 minikube dockerd[2253]: time="2021-09-07T20:33:04.257636575Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/9361b36c06f72839ab7aac5479b623486687df18a18e70e3d3d5d0be13c9666b pid=5036 Sep 07 20:33:34 minikube dockerd[2253]: time="2021-09-07T20:33:34.483206580Z" level=info msg="shim disconnected" id=cb0ef374ef5db2b328c9ab049c269d0200a93cb4e8eec94a4d109bafaf773b94 Sep 07 20:33:34 minikube dockerd[2253]: time="2021-09-07T20:33:34.483499331Z" level=error msg="copy shim log" error="read /proc/self/fd/63: file already closed" Sep 07 20:33:34 minikube dockerd[2245]: time="2021-09-07T20:33:34.483762689Z" level=info msg="ignoring event" container=cb0ef374ef5db2b328c9ab049c269d0200a93cb4e8eec94a4d109bafaf773b94 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete" Sep 07 20:33:36 minikube dockerd[2253]: time="2021-09-07T20:33:36.481084097Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/adb5a4a0e0c6b50e6d769ccddc460449ceff68f9cfe705ea3616e71fc49d203a pid=5480 Sep 07 20:41:04 minikube dockerd[2245]: time="2021-09-07T20:41:04.270103671Z" level=warning msg="Published ports are discarded when using host network mode" Sep 07 20:41:04 minikube dockerd[2245]: time="2021-09-07T20:41:04.300226105Z" level=warning msg="Published ports are discarded when using host network mode" Sep 07 20:41:04 minikube dockerd[2253]: time="2021-09-07T20:41:04.322532601Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/8a5c9aa642c8d8d460e0c3b7bb6d1fbf50705cd77d94909f68c7b2de9979d5e5 pid=8854 Sep 07 20:41:05 minikube dockerd[2245]: time="2021-09-07T20:41:05.607547961Z" level=warning msg="reference for unknown type: " digest="sha256:e252d2a4c704027342b303cc563e95d2e71d2a0f1404f55d676390e28d5093ab" remote="docker.io/cryptexlabs/minikube-ingress-dns@sha256:e252d2a4c704027342b303cc563e95d2e71d2a0f1404f55d676390e28d5093ab" Sep 07 20:41:07 minikube dockerd[2253]: time="2021-09-07T20:41:07.889814095Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/507422330db1e9b48db8e4455a1b9346571662575af71489d5737ae01aa3809a pid=8974 Sep 07 20:41:07 minikube dockerd[2253]: time="2021-09-07T20:41:07.912690593Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/ae1b342bc3610df11da60a9fee7ed1d623edd56b78346328a7a86eb0505bd8b3 pid=9002 Sep 07 20:41:07 minikube dockerd[2253]: time="2021-09-07T20:41:07.986515114Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/a7c7124a55b208c5353c600f0dafb781b943863fbf29fdba8050b9519d4e9a2b pid=9066 Sep 07 20:41:08 minikube dockerd[2245]: time="2021-09-07T20:41:08.923495375Z" level=warning msg="reference for unknown type: " digest="sha256:f3b6b39a6062328c095337b4cadcefd1612348fdd5190b1dcbcb9b9e90bd8068" 
remote="k8s.gcr.io/ingress-nginx/kube-webhook-certgen@sha256:f3b6b39a6062328c095337b4cadcefd1612348fdd5190b1dcbcb9b9e90bd8068" Sep 07 20:41:24 minikube dockerd[2253]: time="2021-09-07T20:41:24.129678525Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/cd24545b8788c224f4cbff12fb444a9b12a3f499c03359e17b7802bcf2fe64ad pid=9263 Sep 07 20:41:24 minikube dockerd[2245]: time="2021-09-07T20:41:24.279500233Z" level=info msg="ignoring event" container=cd24545b8788c224f4cbff12fb444a9b12a3f499c03359e17b7802bcf2fe64ad module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete" Sep 07 20:41:24 minikube dockerd[2253]: time="2021-09-07T20:41:24.279690611Z" level=info msg="shim disconnected" id=cd24545b8788c224f4cbff12fb444a9b12a3f499c03359e17b7802bcf2fe64ad Sep 07 20:41:24 minikube dockerd[2253]: time="2021-09-07T20:41:24.279734088Z" level=error msg="copy shim log" error="read /proc/self/fd/87: file already closed" Sep 07 20:41:24 minikube dockerd[2253]: time="2021-09-07T20:41:24.473221711Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/851004f28d8155a241e9b7b197726e9784ff3d1bdcf6df5a91e4843b1a438227 pid=9328 Sep 07 20:41:24 minikube dockerd[2245]: time="2021-09-07T20:41:24.629044230Z" level=info msg="ignoring event" container=851004f28d8155a241e9b7b197726e9784ff3d1bdcf6df5a91e4843b1a438227 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete" Sep 07 20:41:24 minikube dockerd[2253]: time="2021-09-07T20:41:24.629366727Z" level=info msg="shim disconnected" id=851004f28d8155a241e9b7b197726e9784ff3d1bdcf6df5a91e4843b1a438227 Sep 07 20:41:24 minikube dockerd[2253]: time="2021-09-07T20:41:24.629515281Z" level=error msg="copy shim log" error="read /proc/self/fd/87: file already closed" Sep 07 20:41:24 minikube dockerd[2253]: time="2021-09-07T20:41:24.843077811Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/18769bc0991c125238c169ce929f1673e6ee35758a8ea103805bb6971e438693 pid=9392 Sep 07 20:41:24 minikube dockerd[2245]: time="2021-09-07T20:41:24.883162075Z" level=info msg="ignoring event" container=ae1b342bc3610df11da60a9fee7ed1d623edd56b78346328a7a86eb0505bd8b3 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete" Sep 07 20:41:24 minikube dockerd[2253]: time="2021-09-07T20:41:24.883312797Z" level=info msg="shim disconnected" id=ae1b342bc3610df11da60a9fee7ed1d623edd56b78346328a7a86eb0505bd8b3 Sep 07 20:41:24 minikube dockerd[2253]: time="2021-09-07T20:41:24.883373018Z" level=error msg="copy shim log" error="read /proc/self/fd/78: file already closed" Sep 07 20:41:25 minikube dockerd[2245]: time="2021-09-07T20:41:25.002425024Z" level=info msg="ignoring event" container=18769bc0991c125238c169ce929f1673e6ee35758a8ea103805bb6971e438693 module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete" Sep 07 20:41:25 minikube dockerd[2253]: time="2021-09-07T20:41:25.002911720Z" level=info msg="shim disconnected" id=18769bc0991c125238c169ce929f1673e6ee35758a8ea103805bb6971e438693 Sep 07 20:41:25 minikube dockerd[2253]: time="2021-09-07T20:41:25.003005151Z" level=error msg="copy shim log" error="read /proc/self/fd/87: file already closed" Sep 07 20:41:25 minikube dockerd[2245]: time="2021-09-07T20:41:25.877626933Z" level=info msg="ignoring event" 
container=507422330db1e9b48db8e4455a1b9346571662575af71489d5737ae01aa3809a module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete" Sep 07 20:41:25 minikube dockerd[2253]: time="2021-09-07T20:41:25.877865719Z" level=info msg="shim disconnected" id=507422330db1e9b48db8e4455a1b9346571662575af71489d5737ae01aa3809a Sep 07 20:41:25 minikube dockerd[2253]: time="2021-09-07T20:41:25.877905269Z" level=error msg="copy shim log" error="read /proc/self/fd/75: file already closed" Sep 07 20:41:39 minikube dockerd[2253]: time="2021-09-07T20:41:39.616164250Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/412608d4d1487febe9da405f628f1bf1eb2c7c2f5c0752a698fb6a4818cb2b2c pid=9588 Sep 07 20:41:40 minikube dockerd[2245]: time="2021-09-07T20:41:40.442188412Z" level=warning msg="reference for unknown type: " digest="sha256:44a7a06b71187a4529b0a9edee5cc22bdf71b414470eff696c3869ea8d90a695" remote="k8s.gcr.io/ingress-nginx/controller@sha256:44a7a06b71187a4529b0a9edee5cc22bdf71b414470eff696c3869ea8d90a695" Sep 07 20:42:41 minikube dockerd[2253]: time="2021-09-07T20:42:41.764380058Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/061897b3a8b7ee92dd6f9252e0c74d6093b1be70adfeb4b68bfee89283b2c5b9 pid=10031 Sep 07 20:44:25 minikube dockerd[2253]: time="2021-09-07T20:44:25.693693516Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/9e9425c7c29066574338a1515c434f65122d1b39ca71e322a94beee156831c06 pid=10928 Sep 07 20:44:31 minikube dockerd[2253]: time="2021-09-07T20:44:31.378028172Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/71dffcff93e7781820f36396d918e303ee3576b642d82e7c4d2db2658442d8fa pid=11183 Sep 07 20:46:34 minikube dockerd[2245]: time="2021-09-07T20:46:34.035992811Z" level=info msg="ignoring event" container=a7c7124a55b208c5353c600f0dafb781b943863fbf29fdba8050b9519d4e9a2b module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete" Sep 07 20:46:34 minikube dockerd[2253]: time="2021-09-07T20:46:34.036282858Z" level=info msg="shim disconnected" id=a7c7124a55b208c5353c600f0dafb781b943863fbf29fdba8050b9519d4e9a2b Sep 07 20:46:34 minikube dockerd[2253]: time="2021-09-07T20:46:34.036321378Z" level=error msg="copy shim log" error="read /proc/self/fd/81: file already closed" Sep 07 20:46:34 minikube dockerd[2253]: time="2021-09-07T20:46:34.286716850Z" level=info msg="starting signal loop" namespace=moby path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/58d369b8cbb6b61487cc3cba9003c48b3e0fd347d8508fbd2741766f26d7998d pid=11738 Sep 07 20:46:39 minikube dockerd[2253]: time="2021-09-07T20:46:39.023550197Z" level=info msg="shim disconnected" id=58d369b8cbb6b61487cc3cba9003c48b3e0fd347d8508fbd2741766f26d7998d Sep 07 20:46:39 minikube dockerd[2253]: time="2021-09-07T20:46:39.023621301Z" level=error msg="copy shim log" error="read /proc/self/fd/81: file already closed" Sep 07 20:46:39 minikube dockerd[2245]: time="2021-09-07T20:46:39.025238350Z" level=info msg="ignoring event" container=58d369b8cbb6b61487cc3cba9003c48b3e0fd347d8508fbd2741766f26d7998d module=libcontainerd namespace=moby topic=/tasks/delete type="*events.TaskDelete" Sep 07 20:46:50 minikube dockerd[2253]: time="2021-09-07T20:46:50.402871657Z" level=info msg="starting signal loop" namespace=moby 
path=/run/docker/containerd/daemon/io.containerd.runtime.v2.task/moby/b657a7249591c51867bf1f73a674e5cce5fb71da587566af25d0a7f33311ac57 pid=11869
*
* ==> container status <==
*
CONTAINER  IMAGE  CREATED  STATE  NAME  ATTEMPT  POD ID
b657a7249591c  98c804e339f5e  57 seconds ago  Running  minikube-ingress-dns  2  8a5c9aa642c8d
58d369b8cbb6b  98c804e339f5e  About a minute ago  Exited  minikube-ingress-dns  1  8a5c9aa642c8d
71dffcff93e77  gcr.io/google-samples/hello-app@sha256:60699bc165368192d6c7295b3f837a996a94812d36ef6e7feb2f9c77a558f813  3 minutes ago  Running  hello-world-app  0  9e9425c7c2906
061897b3a8b7e  k8s.gcr.io/ingress-nginx/controller@sha256:44a7a06b71187a4529b0a9edee5cc22bdf71b414470eff696c3869ea8d90a695  5 minutes ago  Running  controller  0  412608d4d1487
18769bc0991c1  17e55ec30f203  6 minutes ago  Exited  patch  1  507422330db1e
851004f28d815  k8s.gcr.io/ingress-nginx/kube-webhook-certgen@sha256:f3b6b39a6062328c095337b4cadcefd1612348fdd5190b1dcbcb9b9e90bd8068  6 minutes ago  Exited  create  0  ae1b342bc3610
adb5a4a0e0c6b  6e38f40d628db  14 minutes ago  Running  storage-provisioner  1  42917dba1174f
9361b36c06f72  8d147537fb7d1  14 minutes ago  Running  coredns  0  1ee13061cfbf6
cb0ef374ef5db  6e38f40d628db  14 minutes ago  Exited  storage-provisioner  0  42917dba1174f
6f568f46ea279  36c4ebbc9d979  14 minutes ago  Running  kube-proxy  0  d413025e04818
275d4050620cf  0048118155842  15 minutes ago  Running  etcd  0  a8d8549ace855
b9d34792ddd77  6e002eb89a881  15 minutes ago  Running  kube-controller-manager  0  243f17f7a7fbb
329527d534fe0  aca5ededae9c8  15 minutes ago  Running  kube-scheduler  0  8c6f0b71a346a
7776fe7de1ebe  f30469a2491a5  15 minutes ago  Running  kube-apiserver  0  228bb6c241902
*
* ==> coredns [9361b36c06f7] <==
*
.:53
[INFO] plugin/reload: Running configuration MD5 = 08e2b174e0f0a30a2e82df9c995f4a34
CoreDNS-1.8.4
linux/amd64, go1.16.4, 053c4d5
*
* ==> describe nodes <==
*
Name:               minikube
Roles:              control-plane,master
Labels:             beta.kubernetes.io/arch=amd64
                    beta.kubernetes.io/os=linux
                    kubernetes.io/arch=amd64
                    kubernetes.io/hostname=minikube
                    kubernetes.io/os=linux
                    minikube.k8s.io/commit=5931455374810b1bbeb222a9713ae2c756daee10
                    minikube.k8s.io/name=minikube
                    minikube.k8s.io/updated_at=2021_09_08T08_32_47_0700
                    minikube.k8s.io/version=v1.23.0
                    node-role.kubernetes.io/control-plane=
                    node-role.kubernetes.io/master=
                    node.kubernetes.io/exclude-from-external-load-balancers=
Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: /var/run/dockershim.sock
                    node.alpha.kubernetes.io/ttl: 0
                    volumes.kubernetes.io/controller-managed-attach-detach: true
CreationTimestamp:  Tue, 07 Sep 2021 20:32:43 +0000
Taints:             <none>
Unschedulable:      false
Lease:
  HolderIdentity:  minikube
  AcquireTime:     <unset>
  RenewTime:       Tue, 07 Sep 2021 20:47:47 +0000
Conditions:
  Type            Status  LastHeartbeatTime                LastTransitionTime               Reason                      Message
  ----            ------  -----------------                ------------------               ------                      -------
  MemoryPressure  False   Tue, 07 Sep 2021 20:44:51 +0000  Tue, 07 Sep 2021 20:32:40 +0000  KubeletHasSufficientMemory  kubelet has sufficient memory available
  DiskPressure    False   Tue, 07 Sep 2021 20:44:51 +0000  Tue, 07 Sep 2021 20:32:40 +0000  KubeletHasNoDiskPressure    kubelet has no disk pressure
  PIDPressure     False   Tue, 07 Sep 2021 20:44:51 +0000  Tue, 07 Sep 2021 20:32:40 +0000  KubeletHasSufficientPID     kubelet has sufficient PID available
  Ready           True    Tue, 07 Sep 2021 20:44:51 +0000  Tue, 07 Sep 2021 20:32:43 +0000  KubeletReady                kubelet is posting ready status
Addresses:
  InternalIP:  192.168.64.36
  Hostname:    minikube
Capacity:
  cpu:                4
  ephemeral-storage:  72863488Ki
  hugepages-2Mi:      0
  memory:             8161916Ki
  pods:               110
Allocatable:
  cpu:                4
*
* ==> describe nodes <==
*
Name:               minikube
Roles:              control-plane,master
Labels:             beta.kubernetes.io/arch=amd64
                    beta.kubernetes.io/os=linux
                    kubernetes.io/arch=amd64
                    kubernetes.io/hostname=minikube
                    kubernetes.io/os=linux
                    minikube.k8s.io/commit=5931455374810b1bbeb222a9713ae2c756daee10
                    minikube.k8s.io/name=minikube
                    minikube.k8s.io/updated_at=2021_09_08T08_32_47_0700
                    minikube.k8s.io/version=v1.23.0
                    node-role.kubernetes.io/control-plane=
                    node-role.kubernetes.io/master=
                    node.kubernetes.io/exclude-from-external-load-balancers=
Annotations:        kubeadm.alpha.kubernetes.io/cri-socket: /var/run/dockershim.sock
                    node.alpha.kubernetes.io/ttl: 0
                    volumes.kubernetes.io/controller-managed-attach-detach: true
CreationTimestamp:  Tue, 07 Sep 2021 20:32:43 +0000
Taints:
Unschedulable:      false
Lease:
  HolderIdentity:  minikube
  AcquireTime:
  RenewTime:       Tue, 07 Sep 2021 20:47:47 +0000
Conditions:
  Type            Status  LastHeartbeatTime                LastTransitionTime               Reason                      Message
  ----            ------  -----------------                ------------------               ------                      -------
  MemoryPressure  False   Tue, 07 Sep 2021 20:44:51 +0000  Tue, 07 Sep 2021 20:32:40 +0000  KubeletHasSufficientMemory  kubelet has sufficient memory available
  DiskPressure    False   Tue, 07 Sep 2021 20:44:51 +0000  Tue, 07 Sep 2021 20:32:40 +0000  KubeletHasNoDiskPressure    kubelet has no disk pressure
  PIDPressure     False   Tue, 07 Sep 2021 20:44:51 +0000  Tue, 07 Sep 2021 20:32:40 +0000  KubeletHasSufficientPID     kubelet has sufficient PID available
  Ready           True    Tue, 07 Sep 2021 20:44:51 +0000  Tue, 07 Sep 2021 20:32:43 +0000  KubeletReady                kubelet is posting ready status
Addresses:
  InternalIP:  192.168.64.36
  Hostname:    minikube
Capacity:
  cpu:                4
  ephemeral-storage:  72863488Ki
  hugepages-2Mi:      0
  memory:             8161916Ki
  pods:               110
Allocatable:
  cpu:                4
  ephemeral-storage:  72863488Ki
  hugepages-2Mi:      0
  memory:             8161916Ki
  pods:               110
System Info:
  Machine ID:                 ebae7540d04147bd912fef00174b7d53
  System UUID:                ace711ec-0000-0000-a102-acde48001122
  Boot ID:                    aa8d1558-f049-4d53-a9cb-a148bcedbb62
  Kernel Version:             4.19.202
  OS Image:                   Buildroot 2021.02.4
  Operating System:           linux
  Architecture:               amd64
  Container Runtime Version:  docker://20.10.8
  Kubelet Version:            v1.22.1
  Kube-Proxy Version:         v1.22.1
PodCIDR:                      10.244.0.0/24
PodCIDRs:                     10.244.0.0/24
Non-terminated Pods:          (10 in total)
  Namespace      Name                                       CPU Requests  CPU Limits  Memory Requests  Memory Limits  Age
  ---------      ----                                       ------------  ----------  ---------------  -------------  ---
  default        hello-world-app-7b9bf45d65-wfccx           0 (0%)        0 (0%)      0 (0%)           0 (0%)         3m23s
  ingress-nginx  ingress-nginx-controller-69bdbc4d57-cx4ch  100m (2%)     0 (0%)      90Mi (1%)        0 (0%)         6m41s
  kube-system    coredns-78fcd69978-hssdt                   100m (2%)     0 (0%)      70Mi (0%)        170Mi (2%)     14m
  kube-system    etcd-minikube                              100m (2%)     0 (0%)      100Mi (1%)       0 (0%)         15m
  kube-system    kube-apiserver-minikube                    250m (6%)     0 (0%)      0 (0%)           0 (0%)         15m
  kube-system    kube-controller-manager-minikube           200m (5%)     0 (0%)      0 (0%)           0 (0%)         15m
  kube-system    kube-ingress-dns-minikube                  0 (0%)        0 (0%)      0 (0%)           0 (0%)         6m45s
  kube-system    kube-proxy-z47zd                           0 (0%)        0 (0%)      0 (0%)           0 (0%)         14m
  kube-system    kube-scheduler-minikube                    100m (2%)     0 (0%)      0 (0%)           0 (0%)         15m
  kube-system    storage-provisioner                        0 (0%)        0 (0%)      0 (0%)           0 (0%)         15m
Allocated resources:
  (Total limits may be over 100 percent, i.e., overcommitted.)
  Resource           Requests    Limits
  --------           --------    ------
  cpu                850m (21%)  0 (0%)
  memory             260Mi (3%)  170Mi (2%)
  ephemeral-storage  0 (0%)      0 (0%)
  hugepages-2Mi      0 (0%)      0 (0%)
Events:
  Type    Reason                   Age                From     Message
  ----    ------                   ----               ----     -------
  Normal  NodeHasNoDiskPressure    15m (x4 over 15m)  kubelet  Node minikube status is now: NodeHasNoDiskPressure
  Normal  NodeHasSufficientPID     15m (x4 over 15m)  kubelet  Node minikube status is now: NodeHasSufficientPID
  Normal  NodeAllocatableEnforced  15m                kubelet  Updated Node Allocatable limit across pods
  Normal  NodeHasSufficientMemory  15m (x5 over 15m)  kubelet  Node minikube status is now: NodeHasSufficientMemory
  Normal  Starting                 15m                kubelet  Starting kubelet.
  Normal  NodeHasSufficientMemory  15m                kubelet  Node minikube status is now: NodeHasSufficientMemory
  Normal  NodeHasNoDiskPressure    15m                kubelet  Node minikube status is now: NodeHasNoDiskPressure
  Normal  NodeHasSufficientPID     15m                kubelet  Node minikube status is now: NodeHasSufficientPID
  Normal  NodeAllocatableEnforced  15m                kubelet  Updated Node Allocatable limit across pods
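The node summary above is ordinary describe-node output and can be refreshed at any time; a minimal sketch, assuming kubectl is pointed at this profile and the node name minikube reported in this log:

  # Re-generate the conditions, capacity and per-pod request tables shown above
  kubectl describe node minikube
  # Quick cross-check of the same workloads, their restart counts and ages
  kubectl get pods -A -o wide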
*
* ==> dmesg <==
*
[Sep 7 20:32] ERROR: earlyprintk= earlyser already used
[ +0.000000] You have booted with nomodeset. This means your GPU drivers are DISABLED
[ +0.000001] Any video related functionality will be severely degraded, and you may not even be able to suspend the system properly
[ +0.000000] Unless you actually understand what nomodeset does, you should reboot without enabling it
[ +0.107290] ACPI BIOS Warning (bug): Incorrect checksum in table [DSDT] - 0xBE, should be 0x1B (20180810/tbprint-173)
[ +4.280049] ACPI Error: Could not enable RealTimeClock event (20180810/evxfevnt-182)
[ +0.000003] ACPI Warning: Could not enable fixed event - RealTimeClock (4) (20180810/evxface-618)
[ +0.010995] platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
[ +2.044246] systemd-fstab-generator[1138]: Ignoring "noauto" for root device
[ +0.028815] systemd[1]: system-getty.slice: unit configures an IP firewall, but the local system does not support BPF/cgroup firewalling.
[ +0.000001] systemd[1]: (This warning is only shown for the first unit using IP firewalling.)
[ +0.618253] SELinux: unrecognized netlink message: protocol=0 nlmsg_type=106 sclass=netlink_route_socket pid=1674 comm=systemd-network
[ +0.336965] NFSD: the nfsdcld client tracking upcall will be removed in 3.10. Please transition to using nfsdcltrack.
[ +0.889114] vboxguest: loading out-of-tree module taints kernel.
[ +0.003238] vboxguest: PCI device not found, probably running on physical hardware.
[ +0.916690] systemd-fstab-generator[2045]: Ignoring "noauto" for root device
[ +0.121053] systemd-fstab-generator[2056]: Ignoring "noauto" for root device
[ +9.087746] systemd-fstab-generator[2236]: Ignoring "noauto" for root device
[ +2.853694] kauditd_printk_skb: 68 callbacks suppressed
[ +0.252584] systemd-fstab-generator[2403]: Ignoring "noauto" for root device
[ +0.096724] systemd-fstab-generator[2414]: Ignoring "noauto" for root device
[ +0.101758] systemd-fstab-generator[2425]: Ignoring "noauto" for root device
[ +3.973905] systemd-fstab-generator[2674]: Ignoring "noauto" for root device
[ +0.576571] kauditd_printk_skb: 137 callbacks suppressed
[ +9.140146] systemd-fstab-generator[3957]: Ignoring "noauto" for root device
[Sep 7 20:33] kauditd_printk_skb: 8 callbacks suppressed
[Sep 7 20:34] NFSD: Unable to end grace period: -110
[Sep 7 20:41] kauditd_printk_skb: 59 callbacks suppressed
[ +17.554220] kauditd_printk_skb: 8 callbacks suppressed
[Sep 7 20:42] kauditd_printk_skb: 11 callbacks suppressed
[ +14.938250] kauditd_printk_skb: 2 callbacks suppressed
[Sep 7 20:44] kauditd_printk_skb: 2 callbacks suppressed
[ +6.827741] kauditd_printk_skb: 5 callbacks suppressed
*
* ==> etcd [275d4050620c] <==
*
{"level":"info","ts":"2021-09-07T20:35:11.991Z","caller":"traceutil/trace.go:171","msg":"trace[424894570] range","detail":"{range_begin:/registry/health; range_end:; response_count:0; response_revision:543; }","duration":"216.662306ms","start":"2021-09-07T20:35:11.774Z","end":"2021-09-07T20:35:11.991Z","steps":["trace[424894570] 'range keys from in-memory index tree' (duration: 216.536629ms)"],"step_count":1}
{"level":"info","ts":"2021-09-07T20:35:15.874Z","caller":"traceutil/trace.go:171","msg":"trace[1512411018] linearizableReadLoop","detail":"{readStateIndex:585; appliedIndex:585; }","duration":"472.711003ms","start":"2021-09-07T20:35:15.402Z","end":"2021-09-07T20:35:15.874Z","steps":["trace[1512411018] 'read index received' (duration: 472.705692ms)","trace[1512411018] 'applied index is now lower than readState.Index' (duration: 4.652µs)"],"step_count":2}
{"level":"warn","ts":"2021-09-07T20:35:15.887Z","caller":"etcdserver/util.go:166","msg":"apply request took too long","took":"115.226379ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/health\" ","response":"range_response_count:0 size:5"} {"level":"warn","ts":"2021-09-07T20:35:15.887Z","caller":"etcdserver/util.go:166","msg":"apply request took too long","took":"485.739313ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/namespaces/default\" ","response":"range_response_count:1 size:343"} {"level":"info","ts":"2021-09-07T20:35:15.887Z","caller":"traceutil/trace.go:171","msg":"trace[1346058263] range","detail":"{range_begin:/registry/health; range_end:; response_count:0; response_revision:545; }","duration":"115.356277ms","start":"2021-09-07T20:35:15.772Z","end":"2021-09-07T20:35:15.887Z","steps":["trace[1346058263] 'agreement among raft nodes before linearized reading' (duration: 109.36923ms)"],"step_count":1} {"level":"info","ts":"2021-09-07T20:35:15.887Z","caller":"traceutil/trace.go:171","msg":"trace[1257201228] range","detail":"{range_begin:/registry/namespaces/default; range_end:; response_count:1; response_revision:545; }","duration":"485.772885ms","start":"2021-09-07T20:35:15.402Z","end":"2021-09-07T20:35:15.887Z","steps":["trace[1257201228] 'agreement among raft nodes before linearized reading' (duration: 475.033837ms)"],"step_count":1} {"level":"warn","ts":"2021-09-07T20:35:15.887Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2021-09-07T20:35:15.402Z","time spent":"485.82788ms","remote":"127.0.0.1:50386","response type":"/etcdserverpb.KV/Range","request count":0,"request size":30,"response count":1,"response size":366,"request content":"key:\"/registry/namespaces/default\" "} {"level":"info","ts":"2021-09-07T20:35:18.859Z","caller":"traceutil/trace.go:171","msg":"trace[186675712] linearizableReadLoop","detail":"{readStateIndex:589; appliedIndex:589; }","duration":"306.704979ms","start":"2021-09-07T20:35:18.552Z","end":"2021-09-07T20:35:18.859Z","steps":["trace[186675712] 'read index received' (duration: 306.698669ms)","trace[186675712] 'applied index is now lower than readState.Index' (duration: 5.016µs)"],"step_count":2} {"level":"warn","ts":"2021-09-07T20:35:18.922Z","caller":"etcdserver/util.go:166","msg":"apply request took too long","took":"148.401061ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/health\" ","response":"range_response_count:0 size:5"} {"level":"warn","ts":"2021-09-07T20:35:18.922Z","caller":"etcdserver/util.go:166","msg":"apply request took too long","took":"370.028364ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/storageclasses/\" range_end:\"/registry/storageclasses0\" count_only:true ","response":"range_response_count:0 size:7"} {"level":"info","ts":"2021-09-07T20:35:18.922Z","caller":"traceutil/trace.go:171","msg":"trace[258825908] range","detail":"{range_begin:/registry/health; range_end:; response_count:0; response_revision:548; }","duration":"148.468471ms","start":"2021-09-07T20:35:18.774Z","end":"2021-09-07T20:35:18.922Z","steps":["trace[258825908] 'agreement among raft nodes before linearized reading' (duration: 85.243444ms)","trace[258825908] 'range keys from in-memory index tree' (duration: 63.147085ms)"],"step_count":2} {"level":"warn","ts":"2021-09-07T20:35:18.922Z","caller":"etcdserver/util.go:166","msg":"apply request took too 
long","took":"361.038889ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/networkpolicies/\" range_end:\"/registry/networkpolicies0\" count_only:true ","response":"range_response_count:0 size:5"} {"level":"info","ts":"2021-09-07T20:35:18.923Z","caller":"traceutil/trace.go:171","msg":"trace[510607448] range","detail":"{range_begin:/registry/storageclasses/; range_end:/registry/storageclasses0; response_count:0; response_revision:548; }","duration":"370.076298ms","start":"2021-09-07T20:35:18.552Z","end":"2021-09-07T20:35:18.922Z","steps":["trace[510607448] 'agreement among raft nodes before linearized reading' (duration: 306.793712ms)","trace[510607448] 'count revisions from in-memory index tree' (duration: 63.225382ms)"],"step_count":2} {"level":"info","ts":"2021-09-07T20:35:18.923Z","caller":"traceutil/trace.go:171","msg":"trace[816090448] range","detail":"{range_begin:/registry/networkpolicies/; range_end:/registry/networkpolicies0; response_count:0; response_revision:548; }","duration":"361.068639ms","start":"2021-09-07T20:35:18.561Z","end":"2021-09-07T20:35:18.923Z","steps":["trace[816090448] 'agreement among raft nodes before linearized reading' (duration: 297.775376ms)","trace[816090448] 'count revisions from in-memory index tree' (duration: 63.241077ms)"],"step_count":2} {"level":"warn","ts":"2021-09-07T20:35:18.923Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2021-09-07T20:35:18.561Z","time spent":"361.09478ms","remote":"127.0.0.1:50434","response type":"/etcdserverpb.KV/Range","request count":0,"request size":58,"response count":0,"response size":28,"request content":"key:\"/registry/networkpolicies/\" range_end:\"/registry/networkpolicies0\" count_only:true "} {"level":"warn","ts":"2021-09-07T20:35:18.923Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2021-09-07T20:35:18.552Z","time spent":"370.115451ms","remote":"127.0.0.1:50484","response type":"/etcdserverpb.KV/Range","request count":0,"request size":56,"response count":1,"response size":30,"request content":"key:\"/registry/storageclasses/\" range_end:\"/registry/storageclasses0\" count_only:true "} {"level":"info","ts":"2021-09-07T20:35:23.630Z","caller":"traceutil/trace.go:171","msg":"trace[383798680] linearizableReadLoop","detail":"{readStateIndex:594; appliedIndex:594; }","duration":"203.019078ms","start":"2021-09-07T20:35:23.427Z","end":"2021-09-07T20:35:23.630Z","steps":["trace[383798680] 'read index received' (duration: 203.013826ms)","trace[383798680] 'applied index is now lower than readState.Index' (duration: 4.56µs)"],"step_count":2} {"level":"warn","ts":"2021-09-07T20:35:23.631Z","caller":"etcdserver/util.go:166","msg":"apply request took too long","took":"204.093249ms","expected-duration":"100ms","prefix":"read-only range ","request":"limit:1 keys_only:true ","response":"range_response_count:0 size:5"} {"level":"info","ts":"2021-09-07T20:35:23.631Z","caller":"traceutil/trace.go:171","msg":"trace[1517050857] range","detail":"{range_begin:; range_end:; response_count:0; response_revision:552; }","duration":"204.176352ms","start":"2021-09-07T20:35:23.427Z","end":"2021-09-07T20:35:23.631Z","steps":["trace[1517050857] 'agreement among raft nodes before linearized reading' (duration: 203.095843ms)"],"step_count":1} {"level":"info","ts":"2021-09-07T20:35:51.574Z","caller":"traceutil/trace.go:171","msg":"trace[1015866603] linearizableReadLoop","detail":"{readStateIndex:619; appliedIndex:619; 
}","duration":"108.130391ms","start":"2021-09-07T20:35:51.466Z","end":"2021-09-07T20:35:51.574Z","steps":["trace[1015866603] 'read index received' (duration: 108.120105ms)","trace[1015866603] 'applied index is now lower than readState.Index' (duration: 4.022µs)"],"step_count":2} {"level":"warn","ts":"2021-09-07T20:35:51.575Z","caller":"etcdserver/util.go:166","msg":"apply request took too long","took":"108.681407ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/health\" ","response":"range_response_count:0 size:5"} {"level":"info","ts":"2021-09-07T20:35:51.575Z","caller":"traceutil/trace.go:171","msg":"trace[1719059992] range","detail":"{range_begin:/registry/health; range_end:; response_count:0; response_revision:571; }","duration":"108.718228ms","start":"2021-09-07T20:35:51.466Z","end":"2021-09-07T20:35:51.575Z","steps":["trace[1719059992] 'agreement among raft nodes before linearized reading' (duration: 108.194709ms)"],"step_count":1} {"level":"info","ts":"2021-09-07T20:36:15.871Z","caller":"traceutil/trace.go:171","msg":"trace[1965871818] linearizableReadLoop","detail":"{readStateIndex:641; appliedIndex:640; }","duration":"312.132549ms","start":"2021-09-07T20:36:15.559Z","end":"2021-09-07T20:36:15.871Z","steps":["trace[1965871818] 'read index received' (duration: 311.296947ms)","trace[1965871818] 'applied index is now lower than readState.Index' (duration: 834.546µs)"],"step_count":2} {"level":"info","ts":"2021-09-07T20:36:15.871Z","caller":"traceutil/trace.go:171","msg":"trace[276086747] transaction","detail":"{read_only:false; response_revision:588; number_of_response:1; }","duration":"463.450967ms","start":"2021-09-07T20:36:15.408Z","end":"2021-09-07T20:36:15.871Z","steps":["trace[276086747] 'process raft request' (duration: 462.61536ms)"],"step_count":1} {"level":"warn","ts":"2021-09-07T20:36:15.872Z","caller":"etcdserver/util.go:166","msg":"apply request took too long","took":"312.309405ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/services/endpoints/kube-system/k8s.io-minikube-hostpath\" ","response":"range_response_count:1 size:602"} {"level":"warn","ts":"2021-09-07T20:36:15.872Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2021-09-07T20:36:15.408Z","time spent":"463.525437ms","remote":"127.0.0.1:50368","response type":"/etcdserverpb.KV/Txn","request count":1,"request size":120,"response count":0,"response size":39,"request content":"compare: success:> failure: >"} {"level":"info","ts":"2021-09-07T20:36:15.872Z","caller":"traceutil/trace.go:171","msg":"trace[550466789] range","detail":"{range_begin:/registry/services/endpoints/kube-system/k8s.io-minikube-hostpath; range_end:; response_count:1; response_revision:588; }","duration":"312.341855ms","start":"2021-09-07T20:36:15.559Z","end":"2021-09-07T20:36:15.872Z","steps":["trace[550466789] 'agreement among raft nodes before linearized reading' (duration: 312.265616ms)"],"step_count":1} {"level":"warn","ts":"2021-09-07T20:36:15.872Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2021-09-07T20:36:15.559Z","time spent":"312.372529ms","remote":"127.0.0.1:50388","response type":"/etcdserverpb.KV/Range","request count":0,"request size":67,"response count":1,"response size":625,"request content":"key:\"/registry/services/endpoints/kube-system/k8s.io-minikube-hostpath\" "} {"level":"warn","ts":"2021-09-07T20:36:19.107Z","caller":"wal/wal.go:802","msg":"slow 
fdatasync","took":"1.215220674s","expected-duration":"1s"} {"level":"info","ts":"2021-09-07T20:36:19.107Z","caller":"traceutil/trace.go:171","msg":"trace[1215500377] linearizableReadLoop","detail":"{readStateIndex:643; appliedIndex:643; }","duration":"333.213452ms","start":"2021-09-07T20:36:18.774Z","end":"2021-09-07T20:36:19.107Z","steps":["trace[1215500377] 'read index received' (duration: 333.208398ms)","trace[1215500377] 'applied index is now lower than readState.Index' (duration: 4.258µs)"],"step_count":2} {"level":"warn","ts":"2021-09-07T20:36:19.109Z","caller":"etcdserver/util.go:166","msg":"apply request took too long","took":"334.808685ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/health\" ","response":"range_response_count:0 size:5"} {"level":"info","ts":"2021-09-07T20:36:19.109Z","caller":"traceutil/trace.go:171","msg":"trace[1354146955] range","detail":"{range_begin:/registry/health; range_end:; response_count:0; response_revision:590; }","duration":"334.849334ms","start":"2021-09-07T20:36:18.774Z","end":"2021-09-07T20:36:19.109Z","steps":["trace[1354146955] 'agreement among raft nodes before linearized reading' (duration: 333.33105ms)"],"step_count":1} {"level":"warn","ts":"2021-09-07T20:36:19.109Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2021-09-07T20:36:18.774Z","time spent":"334.913123ms","remote":"127.0.0.1:50520","response type":"/etcdserverpb.KV/Range","request count":0,"request size":18,"response count":0,"response size":28,"request content":"key:\"/registry/health\" "} {"level":"info","ts":"2021-09-07T20:36:19.109Z","caller":"traceutil/trace.go:171","msg":"trace[1277279286] transaction","detail":"{read_only:false; response_revision:591; number_of_response:1; }","duration":"305.253344ms","start":"2021-09-07T20:36:18.804Z","end":"2021-09-07T20:36:19.109Z","steps":["trace[1277279286] 'process raft request' (duration: 303.48827ms)"],"step_count":1} {"level":"warn","ts":"2021-09-07T20:36:19.109Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2021-09-07T20:36:18.804Z","time spent":"305.354463ms","remote":"127.0.0.1:50390","response type":"/etcdserverpb.KV/Txn","request count":1,"request size":4923,"response count":0,"response size":39,"request content":"compare: success:> failure: >"} {"level":"warn","ts":"2021-09-07T20:36:23.928Z","caller":"etcdserver/v3_server.go:815","msg":"waiting for ReadIndex response took too long, retrying","sent-request-id":6873192054314738874,"retry-timeout":"500ms"} {"level":"warn","ts":"2021-09-07T20:36:24.429Z","caller":"etcdserver/v3_server.go:815","msg":"waiting for ReadIndex response took too long, retrying","sent-request-id":6873192054314738874,"retry-timeout":"500ms"} {"level":"warn","ts":"2021-09-07T20:36:24.661Z","caller":"wal/wal.go:802","msg":"slow fdatasync","took":"1.28403628s","expected-duration":"1s"} {"level":"info","ts":"2021-09-07T20:36:24.661Z","caller":"traceutil/trace.go:171","msg":"trace[679000217] linearizableReadLoop","detail":"{readStateIndex:648; appliedIndex:648; }","duration":"1.233605259s","start":"2021-09-07T20:36:23.428Z","end":"2021-09-07T20:36:24.661Z","steps":["trace[679000217] 'read index received' (duration: 1.233599538s)","trace[679000217] 'applied index is now lower than readState.Index' (duration: 5.121µs)"],"step_count":2} {"level":"warn","ts":"2021-09-07T20:36:24.662Z","caller":"etcdserver/util.go:166","msg":"apply request took too long","took":"887.888865ms","expected-duration":"100ms","prefix":"read-only 
range ","request":"key:\"/registry/health\" ","response":"range_response_count:0 size:5"} {"level":"info","ts":"2021-09-07T20:36:24.662Z","caller":"traceutil/trace.go:171","msg":"trace[965819449] range","detail":"{range_begin:/registry/health; range_end:; response_count:0; response_revision:594; }","duration":"887.942659ms","start":"2021-09-07T20:36:23.774Z","end":"2021-09-07T20:36:24.662Z","steps":["trace[965819449] 'agreement among raft nodes before linearized reading' (duration: 887.525908ms)"],"step_count":1} {"level":"warn","ts":"2021-09-07T20:36:24.662Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2021-09-07T20:36:23.774Z","time spent":"887.983443ms","remote":"127.0.0.1:50520","response type":"/etcdserverpb.KV/Range","request count":0,"request size":18,"response count":0,"response size":28,"request content":"key:\"/registry/health\" "} {"level":"warn","ts":"2021-09-07T20:36:24.662Z","caller":"etcdserver/util.go:166","msg":"apply request took too long","took":"109.937233ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/replicasets/\" range_end:\"/registry/replicasets0\" count_only:true ","response":"range_response_count:0 size:7"} {"level":"info","ts":"2021-09-07T20:36:24.663Z","caller":"traceutil/trace.go:171","msg":"trace[1674039776] range","detail":"{range_begin:/registry/replicasets/; range_end:/registry/replicasets0; response_count:0; response_revision:594; }","duration":"110.356473ms","start":"2021-09-07T20:36:24.552Z","end":"2021-09-07T20:36:24.663Z","steps":["trace[1674039776] 'agreement among raft nodes before linearized reading' (duration: 109.335281ms)"],"step_count":1} {"level":"warn","ts":"2021-09-07T20:36:24.662Z","caller":"etcdserver/util.go:166","msg":"apply request took too long","took":"293.08895ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/persistentvolumeclaims/\" range_end:\"/registry/persistentvolumeclaims0\" count_only:true ","response":"range_response_count:0 size:5"} {"level":"info","ts":"2021-09-07T20:36:24.663Z","caller":"traceutil/trace.go:171","msg":"trace[886767263] range","detail":"{range_begin:/registry/persistentvolumeclaims/; range_end:/registry/persistentvolumeclaims0; response_count:0; response_revision:594; }","duration":"293.613691ms","start":"2021-09-07T20:36:24.369Z","end":"2021-09-07T20:36:24.663Z","steps":["trace[886767263] 'agreement among raft nodes before linearized reading' (duration: 292.47052ms)"],"step_count":1} {"level":"warn","ts":"2021-09-07T20:36:24.662Z","caller":"etcdserver/util.go:166","msg":"apply request took too long","took":"1.234360117s","expected-duration":"100ms","prefix":"read-only range ","request":"limit:1 keys_only:true ","response":"range_response_count:0 size:5"} {"level":"warn","ts":"2021-09-07T20:36:24.662Z","caller":"etcdserver/util.go:166","msg":"apply request took too long","took":"1.161547137s","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/priorityclasses/\" range_end:\"/registry/priorityclasses0\" count_only:true ","response":"range_response_count:0 size:7"} {"level":"info","ts":"2021-09-07T20:36:24.663Z","caller":"traceutil/trace.go:171","msg":"trace[1890003046] range","detail":"{range_begin:/registry/priorityclasses/; range_end:/registry/priorityclasses0; response_count:0; response_revision:594; }","duration":"1.162694725s","start":"2021-09-07T20:36:23.501Z","end":"2021-09-07T20:36:24.663Z","steps":["trace[1890003046] 'agreement among raft nodes before linearized 
reading' (duration: 1.160862978s)"],"step_count":1} {"level":"warn","ts":"2021-09-07T20:36:24.663Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2021-09-07T20:36:23.501Z","time spent":"1.162726189s","remote":"127.0.0.1:50472","response type":"/etcdserverpb.KV/Range","request count":0,"request size":58,"response count":2,"response size":30,"request content":"key:\"/registry/priorityclasses/\" range_end:\"/registry/priorityclasses0\" count_only:true "} {"level":"warn","ts":"2021-09-07T20:36:24.662Z","caller":"etcdserver/util.go:166","msg":"apply request took too long","took":"741.03865ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/services/endpoints/kube-system/k8s.io-minikube-hostpath\" ","response":"range_response_count:1 size:602"} {"level":"info","ts":"2021-09-07T20:36:24.664Z","caller":"traceutil/trace.go:171","msg":"trace[464784293] range","detail":"{range_begin:/registry/services/endpoints/kube-system/k8s.io-minikube-hostpath; range_end:; response_count:1; response_revision:594; }","duration":"742.283551ms","start":"2021-09-07T20:36:23.921Z","end":"2021-09-07T20:36:24.664Z","steps":["trace[464784293] 'agreement among raft nodes before linearized reading' (duration: 740.318663ms)"],"step_count":1} {"level":"info","ts":"2021-09-07T20:36:24.663Z","caller":"traceutil/trace.go:171","msg":"trace[331848861] range","detail":"{range_begin:; range_end:; response_count:0; response_revision:594; }","duration":"1.235579164s","start":"2021-09-07T20:36:23.428Z","end":"2021-09-07T20:36:24.663Z","steps":["trace[331848861] 'agreement among raft nodes before linearized reading' (duration: 1.233681821s)"],"step_count":1} {"level":"warn","ts":"2021-09-07T20:36:24.664Z","caller":"v3rpc/interceptor.go:197","msg":"request stats","start time":"2021-09-07T20:36:23.921Z","time spent":"742.310343ms","remote":"127.0.0.1:50388","response type":"/etcdserverpb.KV/Range","request count":0,"request size":67,"response count":1,"response size":625,"request content":"key:\"/registry/services/endpoints/kube-system/k8s.io-minikube-hostpath\" "} {"level":"warn","ts":"2021-09-07T20:42:41.713Z","caller":"etcdserver/util.go:166","msg":"apply request took too long","took":"208.13497ms","expected-duration":"100ms","prefix":"read-only range ","request":"key:\"/registry/pods/ingress-nginx/\" range_end:\"/registry/pods/ingress-nginx0\" ","response":"range_response_count:3 size:12902"} {"level":"info","ts":"2021-09-07T20:42:41.713Z","caller":"traceutil/trace.go:171","msg":"trace[1888217815] range","detail":"{range_begin:/registry/pods/ingress-nginx/; range_end:/registry/pods/ingress-nginx0; response_count:3; response_revision:967; }","duration":"208.252347ms","start":"2021-09-07T20:42:41.505Z","end":"2021-09-07T20:42:41.713Z","steps":["trace[1888217815] 'range keys from in-memory index tree' (duration: 207.995589ms)"],"step_count":1} {"level":"info","ts":"2021-09-07T20:42:41.742Z","caller":"mvcc/index.go:214","msg":"compact tree index","revision":650} {"level":"info","ts":"2021-09-07T20:42:41.760Z","caller":"mvcc/kvstore_compaction.go:57","msg":"finished scheduled compaction","compact-revision":650,"took":"14.677097ms"} {"level":"info","ts":"2021-09-07T20:47:41.749Z","caller":"mvcc/index.go:214","msg":"compact tree index","revision":968} {"level":"info","ts":"2021-09-07T20:47:41.750Z","caller":"mvcc/kvstore_compaction.go:57","msg":"finished scheduled compaction","compact-revision":968,"took":"669.754µs"} * * ==> kernel <== * 20:47:48 up 15 min, 0 users, load average: 
0.24, 0.31, 0.40 Linux minikube 4.19.202 #1 SMP Thu Sep 2 18:19:24 UTC 2021 x86_64 GNU/Linux PRETTY_NAME="Buildroot 2021.02.4" * * ==> kube-apiserver [7776fe7de1eb] <== * I0907 20:32:43.387983 1 customresource_discovery_controller.go:209] Starting DiscoveryController I0907 20:32:43.388022 1 apf_controller.go:299] Starting API Priority and Fairness config controller I0907 20:32:43.388044 1 apiservice_controller.go:97] Starting APIServiceRegistrationController I0907 20:32:43.388048 1 cache.go:32] Waiting for caches to sync for APIServiceRegistrationController controller I0907 20:32:43.388153 1 controller.go:85] Starting OpenAPI controller I0907 20:32:43.388179 1 naming_controller.go:291] Starting NamingConditionController I0907 20:32:43.388217 1 establishing_controller.go:76] Starting EstablishingController I0907 20:32:43.388231 1 nonstructuralschema_controller.go:192] Starting NonStructuralSchemaConditionController I0907 20:32:43.388246 1 apiapproval_controller.go:186] Starting KubernetesAPIApprovalPolicyConformantConditionController I0907 20:32:43.388255 1 crd_finalizer.go:266] Starting CRDFinalizer I0907 20:32:43.388329 1 controller.go:83] Starting OpenAPI AggregationController I0907 20:32:43.388343 1 cluster_authentication_trust_controller.go:440] Starting cluster_authentication_trust_controller controller I0907 20:32:43.388351 1 shared_informer.go:240] Waiting for caches to sync for cluster_authentication_trust_controller I0907 20:32:43.388372 1 dynamic_cafile_content.go:155] "Starting controller" name="client-ca-bundle::/var/lib/minikube/certs/ca.crt" I0907 20:32:43.388396 1 dynamic_cafile_content.go:155] "Starting controller" name="request-header::/var/lib/minikube/certs/front-proxy-ca.crt" E0907 20:32:43.391064 1 controller.go:152] Unable to remove old endpoints from kubernetes service: StorageError: key not found, Code: 1, Key: /registry/masterleases/192.168.64.36, ResourceVersion: 0, AdditionalErrorMsg: I0907 20:32:43.464770 1 shared_informer.go:247] Caches are synced for node_authorizer I0907 20:32:43.487742 1 cache.go:39] Caches are synced for autoregister controller I0907 20:32:43.487744 1 cache.go:39] Caches are synced for AvailableConditionController controller I0907 20:32:43.488249 1 shared_informer.go:247] Caches are synced for crd-autoregister I0907 20:32:43.488362 1 cache.go:39] Caches are synced for APIServiceRegistrationController controller I0907 20:32:43.488614 1 apf_controller.go:304] Running API Priority and Fairness config worker I0907 20:32:43.488393 1 shared_informer.go:247] Caches are synced for cluster_authentication_trust_controller I0907 20:32:43.718718 1 controller.go:611] quota admission added evaluator for: namespaces I0907 20:32:44.386983 1 controller.go:132] OpenAPI AggregationController: action for item : Nothing (removed from the queue). I0907 20:32:44.387046 1 controller.go:132] OpenAPI AggregationController: action for item k8s_internal_local_delegation_chain_0000000000: Nothing (removed from the queue). I0907 20:32:44.393604 1 storage_scheduling.go:132] created PriorityClass system-node-critical with value 2000001000 I0907 20:32:44.397426 1 storage_scheduling.go:132] created PriorityClass system-cluster-critical with value 2000000000 I0907 20:32:44.397463 1 storage_scheduling.go:148] all system priority classes are created successfully or already exist. 
I0907 20:32:44.763925 1 controller.go:611] quota admission added evaluator for: roles.rbac.authorization.k8s.io I0907 20:32:44.791916 1 controller.go:611] quota admission added evaluator for: rolebindings.rbac.authorization.k8s.io W0907 20:32:44.853224 1 lease.go:233] Resetting endpoints for master service "kubernetes" to [192.168.64.36] I0907 20:32:44.854152 1 controller.go:611] quota admission added evaluator for: endpoints I0907 20:32:44.856876 1 controller.go:611] quota admission added evaluator for: endpointslices.discovery.k8s.io I0907 20:32:45.440708 1 controller.go:611] quota admission added evaluator for: serviceaccounts I0907 20:32:46.509859 1 controller.go:611] quota admission added evaluator for: deployments.apps I0907 20:32:46.537395 1 controller.go:611] quota admission added evaluator for: daemonsets.apps I0907 20:32:46.756465 1 controller.go:611] quota admission added evaluator for: leases.coordination.k8s.io I0907 20:32:58.963698 1 trace.go:205] Trace[1279159571]: "Get" url:/api/v1/namespaces/kube-system/serviceaccounts/horizontal-pod-autoscaler,user-agent:kube-controller-manager/v1.22.1 (linux/amd64) kubernetes/632ed30/tokens-controller,audit-id:701f046b-7dbe-4e77-950b-925024ae8187,client:192.168.64.36,accept:application/vnd.kubernetes.protobuf, */*,protocol:HTTP/2.0 (07-Sep-2021 20:32:58.344) (total time: 619ms): Trace[1279159571]: ---"About to write a response" 619ms (20:32:58.963) Trace[1279159571]: [619.293527ms] [619.293527ms] END I0907 20:32:58.963851 1 trace.go:205] Trace[276843952]: "Get" url:/api/v1/namespaces/kube-system/serviceaccounts/horizontal-pod-autoscaler,user-agent:kube-controller-manager/v1.22.1 (linux/amd64) kubernetes/632ed30/kube-controller-manager,audit-id:872fa4fb-82d9-464a-8af3-019a81f1c200,client:192.168.64.36,accept:application/vnd.kubernetes.protobuf, */*,protocol:HTTP/2.0 (07-Sep-2021 20:32:58.390) (total time: 573ms): Trace[276843952]: ---"About to write a response" 573ms (20:32:58.963) Trace[276843952]: [573.13274ms] [573.13274ms] END I0907 20:32:59.066604 1 controller.go:611] quota admission added evaluator for: controllerrevisions.apps I0907 20:32:59.747615 1 controller.go:611] quota admission added evaluator for: replicasets.apps I0907 20:33:01.986721 1 trace.go:205] Trace[1268753703]: "Create" url:/api/v1/namespaces/kube-system/serviceaccounts/coredns/token,user-agent:kubelet/v1.22.1 (linux/amd64) kubernetes/632ed30,audit-id:0f16b4ac-5206-4bfd-81b1-c2b496f630e7,client:192.168.64.36,accept:application/vnd.kubernetes.protobuf,application/json,protocol:HTTP/2.0 (07-Sep-2021 20:33:00.036) (total time: 1950ms): Trace[1268753703]: ---"Object stored in database" 1950ms (20:33:01.986) Trace[1268753703]: [1.950657965s] [1.950657965s] END I0907 20:33:19.060130 1 trace.go:205] Trace[100097011]: "GuaranteedUpdate etcd3" type:*coordination.Lease (07-Sep-2021 20:33:18.199) (total time: 860ms): Trace[100097011]: ---"Transaction committed" 860ms (20:33:19.060) Trace[100097011]: [860.52441ms] [860.52441ms] END I0907 20:33:19.060252 1 trace.go:205] Trace[971704205]: "Update" url:/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/minikube,user-agent:kubelet/v1.22.1 (linux/amd64) kubernetes/632ed30,audit-id:e04349e7-8595-4ed0-8463-4d3295331eb9,client:192.168.64.36,accept:application/vnd.kubernetes.protobuf,application/json,protocol:HTTP/2.0 (07-Sep-2021 20:33:18.199) (total time: 860ms): Trace[971704205]: ---"Object stored in database" 860ms (20:33:19.060) Trace[971704205]: [860.786466ms] [860.786466ms] END I0907 20:36:24.664999 1 trace.go:205] 
Trace[335053641]: "Get" url:/api/v1/namespaces/kube-system/endpoints/k8s.io-minikube-hostpath,user-agent:storage-provisioner/v0.0.0 (linux/amd64) kubernetes/$Format,audit-id:12ca75b1-1539-4269-8db6-4086b55e7c5e,client:192.168.64.36,accept:application/json, */*,protocol:HTTP/2.0 (07-Sep-2021 20:36:23.921) (total time: 743ms): Trace[335053641]: ---"About to write a response" 743ms (20:36:24.664) Trace[335053641]: [743.698739ms] [743.698739ms] END I0907 20:41:07.422754 1 controller.go:611] quota admission added evaluator for: jobs.batch I0907 20:44:25.360293 1 controller.go:611] quota admission added evaluator for: ingresses.networking.k8s.io * * ==> kube-controller-manager [b9d34792ddd7] <== * I0907 20:32:59.055863 1 shared_informer.go:247] Caches are synced for daemon sets I0907 20:32:59.060599 1 shared_informer.go:247] Caches are synced for taint I0907 20:32:59.060656 1 node_lifecycle_controller.go:1398] Initializing eviction metric for zone: W0907 20:32:59.060695 1 node_lifecycle_controller.go:1013] Missing timestamp for Node minikube. Assuming now as a timestamp. I0907 20:32:59.060714 1 node_lifecycle_controller.go:1214] Controller detected that zone is now in state Normal. I0907 20:32:59.060859 1 taint_manager.go:187] "Starting NoExecuteTaintManager" I0907 20:32:59.060983 1 event.go:291] "Event occurred" object="minikube" kind="Node" apiVersion="v1" type="Normal" reason="RegisteredNode" message="Node minikube event: Registered Node minikube in Controller" I0907 20:32:59.069640 1 shared_informer.go:247] Caches are synced for endpoint I0907 20:32:59.071057 1 shared_informer.go:247] Caches are synced for HPA I0907 20:32:59.078728 1 shared_informer.go:247] Caches are synced for certificate-csrapproving I0907 20:32:59.081577 1 shared_informer.go:247] Caches are synced for cronjob I0907 20:32:59.088144 1 shared_informer.go:247] Caches are synced for TTL I0907 20:32:59.091589 1 shared_informer.go:247] Caches are synced for crt configmap I0907 20:32:59.092787 1 shared_informer.go:247] Caches are synced for attach detach I0907 20:32:59.092790 1 shared_informer.go:247] Caches are synced for job I0907 20:32:59.093462 1 shared_informer.go:247] Caches are synced for ephemeral I0907 20:32:59.098493 1 shared_informer.go:247] Caches are synced for service account I0907 20:32:59.101975 1 shared_informer.go:247] Caches are synced for bootstrap_signer I0907 20:32:59.109638 1 event.go:291] "Event occurred" object="kube-system/kube-proxy" kind="DaemonSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: kube-proxy-z47zd" I0907 20:32:59.109769 1 shared_informer.go:247] Caches are synced for certificate-csrsigning-kubelet-client I0907 20:32:59.109865 1 shared_informer.go:247] Caches are synced for certificate-csrsigning-kubelet-serving I0907 20:32:59.111023 1 shared_informer.go:247] Caches are synced for namespace I0907 20:32:59.193360 1 shared_informer.go:247] Caches are synced for ReplicationController I0907 20:32:59.239710 1 shared_informer.go:247] Caches are synced for stateful set I0907 20:32:59.242926 1 shared_informer.go:247] Caches are synced for ClusterRoleAggregator I0907 20:32:59.251957 1 shared_informer.go:247] Caches are synced for resource quota I0907 20:32:59.267293 1 shared_informer.go:247] Caches are synced for ReplicaSet I0907 20:32:59.274998 1 shared_informer.go:247] Caches are synced for disruption I0907 20:32:59.275028 1 disruption.go:371] Sending events to api server. 
I0907 20:32:59.289553 1 shared_informer.go:247] Caches are synced for deployment I0907 20:32:59.305772 1 shared_informer.go:247] Caches are synced for resource quota I0907 20:32:59.725149 1 shared_informer.go:247] Caches are synced for garbage collector I0907 20:32:59.749550 1 event.go:291] "Event occurred" object="kube-system/coredns" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set coredns-78fcd69978 to 1" I0907 20:32:59.797161 1 shared_informer.go:247] Caches are synced for garbage collector I0907 20:32:59.797178 1 garbagecollector.go:151] Garbage collector: all resource monitors have synced. Proceeding to collect garbage I0907 20:32:59.854254 1 event.go:291] "Event occurred" object="kube-system/coredns-78fcd69978" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: coredns-78fcd69978-hssdt" I0907 20:41:07.351444 1 event.go:291] "Event occurred" object="ingress-nginx/ingress-nginx-controller" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set ingress-nginx-controller-69bdbc4d57 to 1" I0907 20:41:07.360948 1 event.go:291] "Event occurred" object="ingress-nginx/ingress-nginx-controller-69bdbc4d57" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: ingress-nginx-controller-69bdbc4d57-cx4ch" I0907 20:41:07.428256 1 job_controller.go:406] enqueueing job ingress-nginx/ingress-nginx-admission-create I0907 20:41:07.432130 1 job_controller.go:406] enqueueing job ingress-nginx/ingress-nginx-admission-patch I0907 20:41:07.459678 1 job_controller.go:406] enqueueing job ingress-nginx/ingress-nginx-admission-patch I0907 20:41:07.459736 1 job_controller.go:406] enqueueing job ingress-nginx/ingress-nginx-admission-create I0907 20:41:07.459784 1 event.go:291] "Event occurred" object="ingress-nginx/ingress-nginx-admission-create" kind="Job" apiVersion="batch/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: ingress-nginx-admission-create--1-wv2j7" I0907 20:41:07.459960 1 event.go:291] "Event occurred" object="ingress-nginx/ingress-nginx-admission-patch" kind="Job" apiVersion="batch/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: ingress-nginx-admission-patch--1-c22gc" I0907 20:41:07.464251 1 job_controller.go:406] enqueueing job ingress-nginx/ingress-nginx-admission-create I0907 20:41:07.464772 1 job_controller.go:406] enqueueing job ingress-nginx/ingress-nginx-admission-patch I0907 20:41:07.472689 1 job_controller.go:406] enqueueing job ingress-nginx/ingress-nginx-admission-create I0907 20:41:07.482035 1 job_controller.go:406] enqueueing job ingress-nginx/ingress-nginx-admission-patch I0907 20:41:07.486473 1 job_controller.go:406] enqueueing job ingress-nginx/ingress-nginx-admission-create I0907 20:41:07.495169 1 job_controller.go:406] enqueueing job ingress-nginx/ingress-nginx-admission-patch I0907 20:41:24.796160 1 job_controller.go:406] enqueueing job ingress-nginx/ingress-nginx-admission-patch I0907 20:41:24.811502 1 job_controller.go:406] enqueueing job ingress-nginx/ingress-nginx-admission-create I0907 20:41:24.811768 1 event.go:291] "Event occurred" object="ingress-nginx/ingress-nginx-admission-create" kind="Job" apiVersion="batch/v1" type="Normal" reason="Completed" message="Job completed" I0907 20:41:24.819116 1 job_controller.go:406] enqueueing job ingress-nginx/ingress-nginx-admission-create I0907 20:41:25.827808 1 job_controller.go:406] 
enqueueing job ingress-nginx/ingress-nginx-admission-patch I0907 20:41:25.828243 1 event.go:291] "Event occurred" object="ingress-nginx/ingress-nginx-admission-patch" kind="Job" apiVersion="batch/v1" type="Normal" reason="Completed" message="Job completed" I0907 20:41:25.836343 1 job_controller.go:406] enqueueing job ingress-nginx/ingress-nginx-admission-patch I0907 20:41:26.853636 1 job_controller.go:406] enqueueing job ingress-nginx/ingress-nginx-admission-patch I0907 20:44:25.298107 1 event.go:291] "Event occurred" object="default/hello-world-app" kind="Deployment" apiVersion="apps/v1" type="Normal" reason="ScalingReplicaSet" message="Scaled up replica set hello-world-app-7b9bf45d65 to 1" I0907 20:44:25.307766 1 event.go:291] "Event occurred" object="default/hello-world-app-7b9bf45d65" kind="ReplicaSet" apiVersion="apps/v1" type="Normal" reason="SuccessfulCreate" message="Created pod: hello-world-app-7b9bf45d65-wfccx" * * ==> kube-proxy [6f568f46ea27] <== * I0907 20:33:04.160112 1 node.go:172] Successfully retrieved node IP: 192.168.64.36 I0907 20:33:04.160210 1 server_others.go:140] Detected node IP 192.168.64.36 W0907 20:33:04.160245 1 server_others.go:565] Unknown proxy mode "", assuming iptables proxy W0907 20:33:04.260642 1 server_others.go:197] No iptables support for IPv6: exit status 3 I0907 20:33:04.260700 1 server_others.go:208] kube-proxy running in single-stack IPv4 mode I0907 20:33:04.260713 1 server_others.go:212] Using iptables Proxier. I0907 20:33:04.261801 1 server.go:649] Version: v1.22.1 I0907 20:33:04.263155 1 config.go:315] Starting service config controller I0907 20:33:04.263189 1 shared_informer.go:240] Waiting for caches to sync for service config I0907 20:33:04.263221 1 config.go:224] Starting endpoint slice config controller I0907 20:33:04.263228 1 shared_informer.go:240] Waiting for caches to sync for endpoint slice config E0907 20:33:04.266639 1 event_broadcaster.go:253] Server rejected event '&v1.Event{TypeMeta:v1.TypeMeta{Kind:"", APIVersion:""}, ObjectMeta:v1.ObjectMeta{Name:"minikube.16a2a4689c06821c", GenerateName:"", Namespace:"default", SelfLink:"", UID:"", ResourceVersion:"", Generation:0, CreationTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeletionTimestamp:(*v1.Time)(nil), DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string(nil), Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ClusterName:"", ManagedFields:[]v1.ManagedFieldsEntry(nil)}, EventTime:v1.MicroTime{Time:time.Time{wall:0xc04611000fa97dd0, ext:233520154, loc:(*time.Location)(0x2d81340)}}, Series:(*v1.EventSeries)(nil), ReportingController:"kube-proxy", ReportingInstance:"kube-proxy-minikube", Action:"StartKubeProxy", Reason:"Starting", Regarding:v1.ObjectReference{Kind:"Node", Namespace:"", Name:"minikube", UID:"minikube", APIVersion:"", ResourceVersion:"", FieldPath:""}, Related:(*v1.ObjectReference)(nil), Note:"", Type:"Normal", DeprecatedSource:v1.EventSource{Component:"", Host:""}, DeprecatedFirstTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeprecatedLastTimestamp:v1.Time{Time:time.Time{wall:0x0, ext:0, loc:(*time.Location)(nil)}}, DeprecatedCount:0}': 'Event "minikube.16a2a4689c06821c" is invalid: involvedObject.namespace: Invalid value: "": does not match event.namespace' (will not retry!) 
I0907 20:33:04.365894 1 shared_informer.go:247] Caches are synced for endpoint slice config I0907 20:33:04.366081 1 shared_informer.go:247] Caches are synced for service config * * ==> kube-scheduler [329527d534fe] <== * I0907 20:32:40.750539 1 serving.go:347] Generated self-signed cert in-memory W0907 20:32:43.417908 1 requestheader_controller.go:193] Unable to get configmap/extension-apiserver-authentication in kube-system. Usually fixed by 'kubectl create rolebinding -n kube-system ROLEBINDING_NAME --role=extension-apiserver-authentication-reader --serviceaccount=YOUR_NS:YOUR_SA' W0907 20:32:43.417940 1 authentication.go:345] Error looking up in-cluster authentication configuration: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot get resource "configmaps" in API group "" in the namespace "kube-system" W0907 20:32:43.417976 1 authentication.go:346] Continuing without authentication configuration. This may treat all requests as anonymous. W0907 20:32:43.417981 1 authentication.go:347] To require authentication configuration lookup to succeed, set --authentication-tolerate-lookup-failure=false I0907 20:32:43.437736 1 secure_serving.go:200] Serving securely on 127.0.0.1:10259 I0907 20:32:43.439011 1 configmap_cafile_content.go:201] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file" I0907 20:32:43.439064 1 shared_informer.go:240] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file I0907 20:32:43.439237 1 tlsconfig.go:240] "Starting DynamicServingCertificateController" E0907 20:32:43.439467 1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PodDisruptionBudget: failed to list *v1.PodDisruptionBudget: poddisruptionbudgets.policy is forbidden: User "system:kube-scheduler" cannot list resource "poddisruptionbudgets" in API group "policy" at the cluster scope E0907 20:32:43.442332 1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope E0907 20:32:43.442636 1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope E0907 20:32:43.443865 1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Pod: failed to list *v1.Pod: pods is forbidden: User "system:kube-scheduler" cannot list resource "pods" in API group "" at the cluster scope E0907 20:32:43.444302 1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolume: failed to list *v1.PersistentVolume: persistentvolumes is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumes" in API group "" at the cluster scope E0907 20:32:43.444375 1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1beta1.CSIStorageCapacity: failed to list *v1beta1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csistoragecapacities" in API group "storage.k8s.io" at the cluster scope E0907 20:32:43.444737 1 reflector.go:138] 
k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope E0907 20:32:43.445284 1 reflector.go:138] k8s.io/apiserver/pkg/server/dynamiccertificates/configmap_cafile_content.go:205: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system" E0907 20:32:43.446428 1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicationController: failed to list *v1.ReplicationController: replicationcontrollers is forbidden: User "system:kube-scheduler" cannot list resource "replicationcontrollers" in API group "" at the cluster scope E0907 20:32:43.446777 1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Node: failed to list *v1.Node: nodes is forbidden: User "system:kube-scheduler" cannot list resource "nodes" in API group "" at the cluster scope E0907 20:32:43.446767 1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.CSINode: failed to list *v1.CSINode: csinodes.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csinodes" in API group "storage.k8s.io" at the cluster scope E0907 20:32:43.447138 1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.ReplicaSet: failed to list *v1.ReplicaSet: replicasets.apps is forbidden: User "system:kube-scheduler" cannot list resource "replicasets" in API group "apps" at the cluster scope E0907 20:32:43.447262 1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope E0907 20:32:43.447304 1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User "system:kube-scheduler" cannot list resource "services" in API group "" at the cluster scope E0907 20:32:43.447435 1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.Namespace: failed to list *v1.Namespace: namespaces is forbidden: User "system:kube-scheduler" cannot list resource "namespaces" in API group "" at the cluster scope E0907 20:32:44.285009 1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StatefulSet: failed to list *v1.StatefulSet: statefulsets.apps is forbidden: User "system:kube-scheduler" cannot list resource "statefulsets" in API group "apps" at the cluster scope E0907 20:32:44.449888 1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.StorageClass: failed to list *v1.StorageClass: storageclasses.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "storageclasses" in API group "storage.k8s.io" at the cluster scope E0907 20:32:44.562535 1 reflector.go:138] k8s.io/client-go/informers/factory.go:134: Failed to watch *v1.PersistentVolumeClaim: failed to list *v1.PersistentVolumeClaim: persistentvolumeclaims is forbidden: User "system:kube-scheduler" cannot list resource "persistentvolumeclaims" in API group "" at the cluster scope E0907 20:32:44.572567 1 reflector.go:138] 
k8s.io/client-go/informers/factory.go:134: Failed to watch *v1beta1.CSIStorageCapacity: failed to list *v1beta1.CSIStorageCapacity: csistoragecapacities.storage.k8s.io is forbidden: User "system:kube-scheduler" cannot list resource "csistoragecapacities" in API group "storage.k8s.io" at the cluster scope
E0907 20:32:44.628940 1 reflector.go:138] k8s.io/apiserver/pkg/server/dynamiccertificates/configmap_cafile_content.go:205: Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "extension-apiserver-authentication" is forbidden: User "system:kube-scheduler" cannot list resource "configmaps" in API group "" in the namespace "kube-system"
I0907 20:32:47.639890 1 shared_informer.go:247] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
*
* ==> kubelet <==
* -- Journal begins at Tue 2021-09-07 20:32:17 UTC, ends at Tue 2021-09-07 20:47:48 UTC. --
Sep 07 20:33:03 minikube kubelet[3964]: I0907 20:33:03.136468 3964 pod_container_deletor.go:79] "Container not found in pod's containers" containerID="d413025e04818372c0df1d624b215c0dd4219875924c3b9afe7421b9585b606c"
Sep 07 20:33:03 minikube kubelet[3964]: I0907 20:33:03.935705 3964 docker_sandbox.go:401] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for kube-system/coredns-78fcd69978-hssdt through plugin: invalid network status for"
Sep 07 20:33:04 minikube kubelet[3964]: I0907 20:33:04.431237 3964 docker_sandbox.go:401] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for kube-system/coredns-78fcd69978-hssdt through plugin: invalid network status for"
Sep 07 20:33:05 minikube kubelet[3964]: I0907 20:33:05.602035 3964 docker_sandbox.go:401] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for kube-system/coredns-78fcd69978-hssdt through plugin: invalid network status for"
Sep 07 20:33:35 minikube kubelet[3964]: I0907 20:33:35.914300 3964 scope.go:110] "RemoveContainer" containerID="cb0ef374ef5db2b328c9ab049c269d0200a93cb4e8eec94a4d109bafaf773b94"
Sep 07 20:41:03 minikube kubelet[3964]: I0907 20:41:03.943115 3964 topology_manager.go:200] "Topology Admit Handler"
Sep 07 20:41:03 minikube kubelet[3964]: I0907 20:41:03.946091 3964 reconciler.go:224] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8kh6\" (UniqueName: \"kubernetes.io/projected/0c06f492-f9e5-4546-b55b-12646f379f46-kube-api-access-h8kh6\") pod \"kube-ingress-dns-minikube\" (UID: \"0c06f492-f9e5-4546-b55b-12646f379f46\") "
Sep 07 20:41:04 minikube kubelet[3964]: I0907 20:41:04.639364 3964 pod_container_deletor.go:79] "Container not found in pod's containers" containerID="8a5c9aa642c8d8d460e0c3b7bb6d1fbf50705cd77d94909f68c7b2de9979d5e5"
Sep 07 20:41:07 minikube kubelet[3964]: I0907 20:41:07.371138 3964 topology_manager.go:200] "Topology Admit Handler"
Sep 07 20:41:07 minikube kubelet[3964]: I0907 20:41:07.463508 3964 topology_manager.go:200] "Topology Admit Handler"
Sep 07 20:41:07 minikube kubelet[3964]: I0907 20:41:07.464593 3964 topology_manager.go:200] "Topology Admit Handler"
Sep 07 20:41:07 minikube kubelet[3964]: I0907 20:41:07.470940 3964 reconciler.go:224] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2bac54d1-12e6-47e6-ae74-ef1699827f10-webhook-cert\") pod \"ingress-nginx-controller-69bdbc4d57-cx4ch\" (UID: \"2bac54d1-12e6-47e6-ae74-ef1699827f10\") "
Sep 07 20:41:07 minikube kubelet[3964]: I0907 20:41:07.470986 3964 reconciler.go:224] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjqdj\" (UniqueName: \"kubernetes.io/projected/2bac54d1-12e6-47e6-ae74-ef1699827f10-kube-api-access-fjqdj\") pod \"ingress-nginx-controller-69bdbc4d57-cx4ch\" (UID: \"2bac54d1-12e6-47e6-ae74-ef1699827f10\") "
Sep 07 20:41:07 minikube kubelet[3964]: I0907 20:41:07.471013 3964 reconciler.go:224] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrqpf\" (UniqueName: \"kubernetes.io/projected/059bd77b-d0e7-483e-b41c-0e53351d974f-kube-api-access-lrqpf\") pod \"ingress-nginx-admission-create--1-wv2j7\" (UID: \"059bd77b-d0e7-483e-b41c-0e53351d974f\") "
Sep 07 20:41:07 minikube kubelet[3964]: I0907 20:41:07.471032 3964 reconciler.go:224] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwrnz\" (UniqueName: \"kubernetes.io/projected/172fdcf8-b64e-4a4b-9a46-22295f5ad993-kube-api-access-pwrnz\") pod \"ingress-nginx-admission-patch--1-c22gc\" (UID: \"172fdcf8-b64e-4a4b-9a46-22295f5ad993\") "
Sep 07 20:41:07 minikube kubelet[3964]: E0907 20:41:07.572927 3964 secret.go:195] Couldn't get secret ingress-nginx/ingress-nginx-admission: secret "ingress-nginx-admission" not found
Sep 07 20:41:07 minikube kubelet[3964]: E0907 20:41:07.573091 3964 nestedpendingoperations.go:301] Operation for "{volumeName:kubernetes.io/secret/2bac54d1-12e6-47e6-ae74-ef1699827f10-webhook-cert podName:2bac54d1-12e6-47e6-ae74-ef1699827f10 nodeName:}" failed. No retries permitted until 2021-09-07 20:41:08.073062922 +0000 UTC m=+501.030893293 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/2bac54d1-12e6-47e6-ae74-ef1699827f10-webhook-cert") pod "ingress-nginx-controller-69bdbc4d57-cx4ch" (UID: "2bac54d1-12e6-47e6-ae74-ef1699827f10") : secret "ingress-nginx-admission" not found
Sep 07 20:41:08 minikube kubelet[3964]: E0907 20:41:08.076225 3964 secret.go:195] Couldn't get secret ingress-nginx/ingress-nginx-admission: secret "ingress-nginx-admission" not found
Sep 07 20:41:08 minikube kubelet[3964]: E0907 20:41:08.076298 3964 nestedpendingoperations.go:301] Operation for "{volumeName:kubernetes.io/secret/2bac54d1-12e6-47e6-ae74-ef1699827f10-webhook-cert podName:2bac54d1-12e6-47e6-ae74-ef1699827f10 nodeName:}" failed. No retries permitted until 2021-09-07 20:41:09.076283188 +0000 UTC m=+502.034113561 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/2bac54d1-12e6-47e6-ae74-ef1699827f10-webhook-cert") pod "ingress-nginx-controller-69bdbc4d57-cx4ch" (UID: "2bac54d1-12e6-47e6-ae74-ef1699827f10") : secret "ingress-nginx-admission" not found
Sep 07 20:41:08 minikube kubelet[3964]: I0907 20:41:08.376496 3964 docker_sandbox.go:401] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for ingress-nginx/ingress-nginx-admission-patch--1-c22gc through plugin: invalid network status for"
Sep 07 20:41:08 minikube kubelet[3964]: I0907 20:41:08.404295 3964 docker_sandbox.go:401] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for ingress-nginx/ingress-nginx-admission-create--1-wv2j7 through plugin: invalid network status for"
Sep 07 20:41:08 minikube kubelet[3964]: I0907 20:41:08.675485 3964 docker_sandbox.go:401] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for ingress-nginx/ingress-nginx-admission-patch--1-c22gc through plugin: invalid network status for"
Sep 07 20:41:08 minikube kubelet[3964]: I0907 20:41:08.678801 3964 docker_sandbox.go:401] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for ingress-nginx/ingress-nginx-admission-create--1-wv2j7 through plugin: invalid network status for"
Sep 07 20:41:09 minikube kubelet[3964]: E0907 20:41:09.085227 3964 secret.go:195] Couldn't get secret ingress-nginx/ingress-nginx-admission: secret "ingress-nginx-admission" not found
Sep 07 20:41:09 minikube kubelet[3964]: E0907 20:41:09.085357 3964 nestedpendingoperations.go:301] Operation for "{volumeName:kubernetes.io/secret/2bac54d1-12e6-47e6-ae74-ef1699827f10-webhook-cert podName:2bac54d1-12e6-47e6-ae74-ef1699827f10 nodeName:}" failed. No retries permitted until 2021-09-07 20:41:11.085335698 +0000 UTC m=+504.043166092 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/2bac54d1-12e6-47e6-ae74-ef1699827f10-webhook-cert") pod "ingress-nginx-controller-69bdbc4d57-cx4ch" (UID: "2bac54d1-12e6-47e6-ae74-ef1699827f10") : secret "ingress-nginx-admission" not found
Sep 07 20:41:11 minikube kubelet[3964]: E0907 20:41:11.100683 3964 secret.go:195] Couldn't get secret ingress-nginx/ingress-nginx-admission: secret "ingress-nginx-admission" not found
Sep 07 20:41:11 minikube kubelet[3964]: E0907 20:41:11.100787 3964 nestedpendingoperations.go:301] Operation for "{volumeName:kubernetes.io/secret/2bac54d1-12e6-47e6-ae74-ef1699827f10-webhook-cert podName:2bac54d1-12e6-47e6-ae74-ef1699827f10 nodeName:}" failed. No retries permitted until 2021-09-07 20:41:15.100767784 +0000 UTC m=+508.058598183 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/2bac54d1-12e6-47e6-ae74-ef1699827f10-webhook-cert") pod "ingress-nginx-controller-69bdbc4d57-cx4ch" (UID: "2bac54d1-12e6-47e6-ae74-ef1699827f10") : secret "ingress-nginx-admission" not found
Sep 07 20:41:15 minikube kubelet[3964]: E0907 20:41:15.136856 3964 secret.go:195] Couldn't get secret ingress-nginx/ingress-nginx-admission: secret "ingress-nginx-admission" not found
Sep 07 20:41:15 minikube kubelet[3964]: E0907 20:41:15.136967 3964 nestedpendingoperations.go:301] Operation for "{volumeName:kubernetes.io/secret/2bac54d1-12e6-47e6-ae74-ef1699827f10-webhook-cert podName:2bac54d1-12e6-47e6-ae74-ef1699827f10 nodeName:}" failed. No retries permitted until 2021-09-07 20:41:23.136949192 +0000 UTC m=+516.094779579 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/2bac54d1-12e6-47e6-ae74-ef1699827f10-webhook-cert") pod "ingress-nginx-controller-69bdbc4d57-cx4ch" (UID: "2bac54d1-12e6-47e6-ae74-ef1699827f10") : secret "ingress-nginx-admission" not found
Sep 07 20:41:23 minikube kubelet[3964]: E0907 20:41:23.198782 3964 secret.go:195] Couldn't get secret ingress-nginx/ingress-nginx-admission: secret "ingress-nginx-admission" not found
Sep 07 20:41:23 minikube kubelet[3964]: E0907 20:41:23.198880 3964 nestedpendingoperations.go:301] Operation for "{volumeName:kubernetes.io/secret/2bac54d1-12e6-47e6-ae74-ef1699827f10-webhook-cert podName:2bac54d1-12e6-47e6-ae74-ef1699827f10 nodeName:}" failed. No retries permitted until 2021-09-07 20:41:39.198864458 +0000 UTC m=+532.156694809 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/2bac54d1-12e6-47e6-ae74-ef1699827f10-webhook-cert") pod "ingress-nginx-controller-69bdbc4d57-cx4ch" (UID: "2bac54d1-12e6-47e6-ae74-ef1699827f10") : secret "ingress-nginx-admission" not found
Sep 07 20:41:24 minikube kubelet[3964]: I0907 20:41:24.785656 3964 docker_sandbox.go:401] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for ingress-nginx/ingress-nginx-admission-patch--1-c22gc through plugin: invalid network status for"
Sep 07 20:41:24 minikube kubelet[3964]: I0907 20:41:24.788570 3964 scope.go:110] "RemoveContainer" containerID="cd24545b8788c224f4cbff12fb444a9b12a3f499c03359e17b7802bcf2fe64ad"
Sep 07 20:41:24 minikube kubelet[3964]: I0907 20:41:24.797414 3964 docker_sandbox.go:401] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for ingress-nginx/ingress-nginx-admission-create--1-wv2j7 through plugin: invalid network status for"
Sep 07 20:41:24 minikube kubelet[3964]: I0907 20:41:24.805706 3964 scope.go:110] "RemoveContainer" containerID="851004f28d8155a241e9b7b197726e9784ff3d1bdcf6df5a91e4843b1a438227"
Sep 07 20:41:25 minikube kubelet[3964]: I0907 20:41:25.813694 3964 docker_sandbox.go:401] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for ingress-nginx/ingress-nginx-admission-patch--1-c22gc through plugin: invalid network status for"
Sep 07 20:41:25 minikube kubelet[3964]: I0907 20:41:25.818666 3964 scope.go:110] "RemoveContainer" containerID="cd24545b8788c224f4cbff12fb444a9b12a3f499c03359e17b7802bcf2fe64ad"
Sep 07 20:41:25 minikube kubelet[3964]: I0907 20:41:25.818878 3964 scope.go:110] "RemoveContainer" containerID="18769bc0991c125238c169ce929f1673e6ee35758a8ea103805bb6971e438693"
Sep 07 20:41:25 minikube kubelet[3964]: I0907 20:41:25.831291 3964 pod_container_deletor.go:79] "Container not found in pod's containers" containerID="ae1b342bc3610df11da60a9fee7ed1d623edd56b78346328a7a86eb0505bd8b3"
Sep 07 20:41:26 minikube kubelet[3964]: I0907 20:41:26.844715 3964 pod_container_deletor.go:79] "Container not found in pod's containers" containerID="507422330db1e9b48db8e4455a1b9346571662575af71489d5737ae01aa3809a"
Sep 07 20:41:26 minikube kubelet[3964]: I0907 20:41:26.935241 3964 reconciler.go:196] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrqpf\" (UniqueName: \"kubernetes.io/projected/059bd77b-d0e7-483e-b41c-0e53351d974f-kube-api-access-lrqpf\") pod \"059bd77b-d0e7-483e-b41c-0e53351d974f\" (UID: \"059bd77b-d0e7-483e-b41c-0e53351d974f\") "
Sep 07 20:41:26 minikube kubelet[3964]: I0907 20:41:26.943260 3964 operation_generator.go:866] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/059bd77b-d0e7-483e-b41c-0e53351d974f-kube-api-access-lrqpf" (OuterVolumeSpecName: "kube-api-access-lrqpf") pod "059bd77b-d0e7-483e-b41c-0e53351d974f" (UID: "059bd77b-d0e7-483e-b41c-0e53351d974f"). InnerVolumeSpecName "kube-api-access-lrqpf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 07 20:41:27 minikube kubelet[3964]: I0907 20:41:27.036594 3964 reconciler.go:319] "Volume detached for volume \"kube-api-access-lrqpf\" (UniqueName: \"kubernetes.io/projected/059bd77b-d0e7-483e-b41c-0e53351d974f-kube-api-access-lrqpf\") on node \"minikube\" DevicePath \"\""
Sep 07 20:41:27 minikube kubelet[3964]: I0907 20:41:27.944846 3964 reconciler.go:196] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwrnz\" (UniqueName: \"kubernetes.io/projected/172fdcf8-b64e-4a4b-9a46-22295f5ad993-kube-api-access-pwrnz\") pod \"172fdcf8-b64e-4a4b-9a46-22295f5ad993\" (UID: \"172fdcf8-b64e-4a4b-9a46-22295f5ad993\") "
Sep 07 20:41:27 minikube kubelet[3964]: I0907 20:41:27.954162 3964 operation_generator.go:866] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/172fdcf8-b64e-4a4b-9a46-22295f5ad993-kube-api-access-pwrnz" (OuterVolumeSpecName: "kube-api-access-pwrnz") pod "172fdcf8-b64e-4a4b-9a46-22295f5ad993" (UID: "172fdcf8-b64e-4a4b-9a46-22295f5ad993"). InnerVolumeSpecName "kube-api-access-pwrnz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 07 20:41:28 minikube kubelet[3964]: I0907 20:41:28.046339 3964 reconciler.go:319] "Volume detached for volume \"kube-api-access-pwrnz\" (UniqueName: \"kubernetes.io/projected/172fdcf8-b64e-4a4b-9a46-22295f5ad993-kube-api-access-pwrnz\") on node \"minikube\" DevicePath \"\""
Sep 07 20:41:40 minikube kubelet[3964]: I0907 20:41:40.017310 3964 pod_container_deletor.go:79] "Container not found in pod's containers" containerID="412608d4d1487febe9da405f628f1bf1eb2c7c2f5c0752a698fb6a4818cb2b2c"
Sep 07 20:41:40 minikube kubelet[3964]: I0907 20:41:40.021619 3964 docker_sandbox.go:401] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for ingress-nginx/ingress-nginx-controller-69bdbc4d57-cx4ch through plugin: invalid network status for"
Sep 07 20:41:41 minikube kubelet[3964]: I0907 20:41:41.026610 3964 docker_sandbox.go:401] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for ingress-nginx/ingress-nginx-controller-69bdbc4d57-cx4ch through plugin: invalid network status for"
Sep 07 20:42:42 minikube kubelet[3964]: I0907 20:42:42.443074 3964 docker_sandbox.go:401] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for ingress-nginx/ingress-nginx-controller-69bdbc4d57-cx4ch through plugin: invalid network status for"
Sep 07 20:44:25 minikube kubelet[3964]: I0907 20:44:25.314195 3964 topology_manager.go:200] "Topology Admit Handler"
Sep 07 20:44:25 minikube kubelet[3964]: I0907 20:44:25.390811 3964 reconciler.go:224] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4br6\" (UniqueName: \"kubernetes.io/projected/977cb8e8-c3e6-4934-8e11-82860a3fbfbc-kube-api-access-f4br6\") pod \"hello-world-app-7b9bf45d65-wfccx\" (UID: \"977cb8e8-c3e6-4934-8e11-82860a3fbfbc\") "
Sep 07 20:44:26 minikube kubelet[3964]: I0907 20:44:26.133008 3964 docker_sandbox.go:401] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for default/hello-world-app-7b9bf45d65-wfccx through plugin: invalid network status for"
Sep 07 20:44:26 minikube kubelet[3964]: I0907 20:44:26.193949 3964 docker_sandbox.go:401] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for default/hello-world-app-7b9bf45d65-wfccx through plugin: invalid network status for"
Sep 07 20:44:32 minikube kubelet[3964]: I0907 20:44:32.237685 3964 docker_sandbox.go:401] "Failed to read pod IP from plugin/docker" err="Couldn't find network status for default/hello-world-app-7b9bf45d65-wfccx through plugin: invalid network status for"
Sep 07 20:46:34 minikube kubelet[3964]: I0907 20:46:34.206626 3964 scope.go:110] "RemoveContainer" containerID="a7c7124a55b208c5353c600f0dafb781b943863fbf29fdba8050b9519d4e9a2b"
Sep 07 20:46:39 minikube kubelet[3964]: I0907 20:46:39.269725 3964 scope.go:110] "RemoveContainer" containerID="a7c7124a55b208c5353c600f0dafb781b943863fbf29fdba8050b9519d4e9a2b"
Sep 07 20:46:39 minikube kubelet[3964]: I0907 20:46:39.270893 3964 scope.go:110] "RemoveContainer" containerID="58d369b8cbb6b61487cc3cba9003c48b3e0fd347d8508fbd2741766f26d7998d"
Sep 07 20:46:39 minikube kubelet[3964]: E0907 20:46:39.271324 3964 pod_workers.go:747] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"minikube-ingress-dns\" with CrashLoopBackOff: \"back-off 10s restarting failed container=minikube-ingress-dns pod=kube-ingress-dns-minikube_kube-system(0c06f492-f9e5-4546-b55b-12646f379f46)\"" pod="kube-system/kube-ingress-dns-minikube" podUID=0c06f492-f9e5-4546-b55b-12646f379f46
Sep 07 20:46:50 minikube kubelet[3964]: I0907 20:46:50.346610 3964 scope.go:110] "RemoveContainer" containerID="58d369b8cbb6b61487cc3cba9003c48b3e0fd347d8508fbd2741766f26d7998d"
*
* ==> storage-provisioner [adb5a4a0e0c6] <==
* I0907 20:33:36.612786 1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
I0907 20:33:36.650038 1 storage_provisioner.go:141] Storage provisioner initialized, now starting service!
I0907 20:33:36.650990 1 leaderelection.go:243] attempting to acquire leader lease kube-system/k8s.io-minikube-hostpath...
I0907 20:33:36.666618 1 leaderelection.go:253] successfully acquired lease kube-system/k8s.io-minikube-hostpath
I0907 20:33:36.669239 1 controller.go:835] Starting provisioner controller k8s.io/minikube-hostpath_minikube_a78f9d18-e12a-4783-a237-1134e734e544!
I0907 20:33:36.669706 1 event.go:282] Event(v1.ObjectReference{Kind:"Endpoints", Namespace:"kube-system", Name:"k8s.io-minikube-hostpath", UID:"bee20a79-a686-4860-838c-2433dc71680e", APIVersion:"v1", ResourceVersion:"473", FieldPath:""}): type: 'Normal' reason: 'LeaderElection' minikube_a78f9d18-e12a-4783-a237-1134e734e544 became leader
I0907 20:33:36.770232 1 controller.go:884] Started provisioner controller k8s.io/minikube-hostpath_minikube_a78f9d18-e12a-4783-a237-1134e734e544!
*
* ==> storage-provisioner [cb0ef374ef5d] <==
* I0907 20:33:04.433973 1 storage_provisioner.go:116] Initializing the minikube storage provisioner...
F0907 20:33:34.442850 1 main.go:39] error getting server version: Get "https://10.96.0.1:443/version?timeout=32s": dial tcp 10.96.0.1:443: i/o timeout
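
A few follow-up checks that could confirm what the log entries above suggest. This is only a minimal sketch, assuming kubectl is pointed at the same minikube profile; the namespaces, pod names, and secret names are taken from the logs themselves:

# The repeated webhook-cert mount failures only last until the ingress-nginx
# admission jobs have run and created the ingress-nginx-admission secret:
kubectl -n ingress-nginx get jobs,secrets

# The RemoveContainer / CrashLoopBackOff entries point at the ingress-dns pod;
# its previous container log usually explains the restarts:
kubectl -n kube-system logs kube-ingress-dns-minikube --previous

# The F0907 line from the older storage-provisioner container is an i/o timeout
# to the apiserver service IP (10.96.0.1:443); the endpoints of the kubernetes
# service show whether that address is currently backed by the apiserver:
kubectl -n default get endpoints kubernetes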