Envoy Proxy L7 (with CA) OpenSSL error:SSL routines:ssl3_get_record:wrong version number #32567

Closed
sjoukedv opened this issue May 16, 2024 · 15 comments
Labels
info-completed: The GH issue has received a reply from the author
kind/bug: This is a bug in the Cilium logic.
kind/community-report: This was reported by a user in the Cilium community, eg via Slack.
needs/triage: This issue requires triaging to establish severity and next steps.

Comments

sjoukedv commented May 16, 2024

Is there an existing issue for this?

  • I have searched the existing issues

What happened?

I am trying to enforce L7 policies on an external endpoint and to add our own CA (as in this example):

apiVersion: "cilium.io/v2"
kind: CiliumNetworkPolicy
metadata:
  name: "client-ca"
  namespace: cilium-poc
spec:
  endpointSelector:
    matchLabels:
      "k8s:app": client-ca
          
  egress:

  - toFQDNs:
        - matchName: dummyjson.com
    toPorts:
    - ports:
      - port: '443'
        protocol: TCP  
        
      # adding the rules section below redirects to envoy proxy and that's when the TLS handshake cannot be performed
      rules:
        http:
        - {}

The above FQDN has a valid certificate (it's a public API), but I also cannot get this to work with an endpoint that has a self-signed certificate, even when adding:

      originatingTLS:
        secret:
          namespace: "cilium-secrets"
          name: "my-cat"

Removing the rules.http section makes it work, presumably because traffic then no longer goes through the Envoy proxy.
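
For reference, the TLS visibility example in the Cilium docs pairs originatingTLS with a terminatingTLS secret, since the proxy needs a certificate to present back to the pod when it intercepts the TLS session. A minimal sketch of such an egress rule, with placeholder secret names (not taken from my setup):

  egress:
  - toFQDNs:
    - matchName: dummyjson.com
    toPorts:
    - ports:
      - port: "443"
        protocol: TCP
      # certificate/key pair the proxy presents to the pod (placeholder secret)
      terminatingTLS:
        secret:
          namespace: "cilium-secrets"
          name: "dummyjson-tls-data"
      # CA bundle used to validate the upstream server (placeholder secret)
      originatingTLS:
        secret:
          namespace: "cilium-secrets"
          name: "my-ca"
      rules:
        http:
        - {}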

curl indicates the problem is in the OpenSSL layer:

# curl -v https://dummyjson.com/products/1
....
curl: (35) OpenSSL/3.1.4: error:0A00010B:SSL routines::wrong version number

Confirmed by using OpenSSL directly:

openssl s_client -connect dummyjson.com:443
CONNECTED(00000003)
139701035984704:error:1408F10B:SSL routines:ssl3_get_record:wrong version number:ssl/record/ssl3_record.c:332:
...

Please let me know what further information I can provide.

Cilium Version

1.16.0-pre.2

Kernel Version

Linux ip-172-x-x-x.<region>.compute.internal 5.10.215-203.850.amzn2.x86_64 #1 SMP Tue Apr 23 20:32:19 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux

Kubernetes Version

Server Version: v1.29.3-eks-adc7111

Regression

No response

Sysdump

Can provide on request in private.

Relevant log output

No response

Anything else?

Helm values:
cluster:
  name: mycluster
  id: 4

image:
  repository: private-registry/quay/cilium/cilium

kubeProxyReplacement: true 
k8sServiceHost: "my-api-server.eks.amazonaws.com"
k8sServicePort: "443" 

loadBalancer:
  algorithm: maglev 

debug:
  enabled: true

eni:
  enabled: true
  updateEC2AdapterLimitViaAPI: false
  awsReleaseExcessIPs: true
  awsEnablePrefixDelegation: true
  first-interface-index: 0

  # all private-* subnets must have this tag set
  subnetTagsFilter: ["kubernetes.io/role/internal-elb=1"]

  eniTags:
    cilium-managed: 'true'

ipam:
  mode: eni

egressMasqueradeInterfaces: eth0
tunnel: disabled

nodeinit:
  enabled: true

operator:
  image:
    repository: private-registry/quay/cilium/operator
  prometheus:
    enabled: true
    serviceMonitor:
      enabled: true

hubble:
  relay:
    image:
      repository: private-registry/quay/cilium/hubble-relay
    enabled: true
    tls:
      server:
        enabled: false

  redact:
    enabled: true
    http:
      urlQuery: true
      userInfo: true  
    kafka:
      apiKey: true
      

  tls:
    enabled: false
  ui:
    enabled: true
    backend:
      image:
        repository: private-registry/quay/cilium/hubble-ui-backend
    frontend:
      image:
        repository: private-registry/quay/cilium/hubble-ui
      server:
        ipv6:
          enabled: false

envoy:
  enabled: true
  image:
    repository: private-registry/quay/cilium/cilium-envoy

  xffNumTrustedHopsL7PolicyEgress: 2
  xffNumTrustedHopsL7PolicyIngress: 2

  debug:
    admin:
      enabled: true
  
encryption:
  enabled: false
  type: ipsec

authentication:
  mutual:
    spire:
      enabled: true
      install:
        enabled: true

prometheus:
  enabled: true
  serviceMonitor:
    enabled: true

hostFirewall:
  enabled: true

Cilium Users Document

  • Are you a user of Cilium? Please add yourself to the Users doc

Code of Conduct

  • I agree to follow this project's Code of Conduct
@sjoukedv sjoukedv added kind/bug This is a bug in the Cilium logic. kind/community-report This was reported by a user in the Cilium community, eg via Slack. needs/triage This issue requires triaging to establish severity and next steps. labels May 16, 2024
squeed (Contributor) commented May 17, 2024

Usually the wrong_version_number indicates that TLS is not actually being used -- can you check and see if that is the case?

Another thing I see wrong is that there are no port 53 rules to trigger DNS interception, which is also required for ToFQDN rules to work.
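
For reference, the DNS interception rule in the Cilium docs usually looks roughly like this (a sketch; the kube-dns labels are the common upstream defaults):

  egress:
  - toEndpoints:
    - matchLabels:
        "k8s:io.kubernetes.pod.namespace": kube-system
        "k8s:k8s-app": kube-dns
    toPorts:
    - ports:
      - port: "53"
        protocol: ANY
      rules:
        dns:
        # intercept and allow all DNS names
        - matchPattern: "*"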

Lastly, I recommend checking the cilium-envoy pod logs on the same node to see if there are any error messages.

@squeed squeed added the need-more-info More information is required to further debug or fix the issue. label May 17, 2024
sjoukedv (Author) commented May 21, 2024

Sorry for the delay in replying, and thank you for reviewing my issue @squeed

I left out other items in the spec.egress of the CiliumNetworkPolicy, including the rule you are referring to:

  - toPorts:
    - ports:
      - port: "53"
        protocol: ANY
      rules:
        dns:
        - matchPattern: "*"

I now test with a policy that has only two (egress) rules, one for DNS and one for the L7 filtering.

You mention that it's a prerequisite for ToFQDN rules to work properly. I did notice that specifying the exact hostname in the DNS rule does not work:

  - toPorts:
    - ports:
      - port: "53"
        protocol: ANY
      rules:
        dns:
        - matchPattern: "dummyjson.com" # or - matchName: dummyjson.com
        # only this works:
        # - matchPattern: "*"
cilium-envoy logs on the same node
[2024-05-21 04:33:12.165][8][info][main] [external/envoy/source/server/server.cc:430] initializing epoch 0 (base id=0, hot restart version=11.104)
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:432] statically linked extensions:
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.tls.cert_validator: envoy.tls.cert_validator.default
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.config_mux: envoy.config_mux.delta_grpc_mux_factory, envoy.config_mux.grpc_mux_factory, envoy.config_mux.new_grpc_mux_factory, envoy.config_mux.sotw_grpc_mux_factory
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   network.connection.client: default, envoy_internal
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.route.early_data_policy: envoy.route.early_data_policy.default
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.resolvers: envoy.ip
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.grpc_credentials: envoy.grpc_credentials.default
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.quic.connection_id_generator: envoy.quic.deterministic_connection_id_generator
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.http.original_ip_detection: envoy.http.original_ip_detection.xff
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.upstreams: envoy.filters.connection_pools.tcp.generic
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.request_id: envoy.request_id.uuid
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.filters.network: cilium.network, cilium.network.websocket.client, cilium.network.websocket.server, envoy.ext_authz, envoy.filters.network.connection_limit, envoy.filters.network.ext_authz, envoy.filters.network.http_connection_manager, envoy.filters.network.local_ratelimit, envoy.filters.network.mongo_proxy, envoy.filters.network.mysql_proxy, envoy.filters.network.ratelimit, envoy.filters.network.redis_proxy, envoy.filters.network.sni_cluster, envoy.filters.network.sni_dynamic_forward_proxy, envoy.filters.network.tcp_proxy, envoy.filters.network.thrift_proxy, envoy.filters.network.wasm, envoy.http_connection_manager, envoy.mongo_proxy, envoy.ratelimit, envoy.redis_proxy, envoy.tcp_proxy
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.access_loggers.extension_filters: envoy.access_loggers.extension_filters.cel
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.rate_limit_descriptors: envoy.rate_limit_descriptors.expr
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.transport_sockets.downstream: cilium.tls_wrapper, envoy.transport_sockets.quic, envoy.transport_sockets.raw_buffer, envoy.transport_sockets.tls, raw_buffer, tls
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.thrift_proxy.protocols: auto, binary, binary/non-strict, compact, twitter
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.matching.http.input: envoy.matching.inputs.cel_data_input, envoy.matching.inputs.request_headers, envoy.matching.inputs.request_trailers, envoy.matching.inputs.response_headers, envoy.matching.inputs.response_trailers, query_params
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.clusters: envoy.cluster.eds, envoy.cluster.logical_dns, envoy.cluster.original_dst, envoy.cluster.static, envoy.cluster.strict_dns, envoy.clusters.dynamic_forward_proxy
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.health_checkers: envoy.health_checkers.grpc, envoy.health_checkers.http, envoy.health_checkers.redis, envoy.health_checkers.tcp, envoy.health_checkers.thrift
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.connection_handler: envoy.connection_handler.default
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   quic.http_server_connection: quic.http_server_connection.default
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.filters.listener: cilium.bpf_metadata, envoy.filters.listener.http_inspector, envoy.filters.listener.local_ratelimit, envoy.filters.listener.proxy_protocol, envoy.filters.listener.tls_inspector, envoy.listener.http_inspector, envoy.listener.proxy_protocol, envoy.listener.tls_inspector
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.http.early_header_mutation: envoy.http.early_header_mutation.header_mutation
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.regex_engines: envoy.regex_engines.google_re2
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.stats_sinks: envoy.metrics_service, envoy.open_telemetry_stat_sink, envoy.stat_sinks.metrics_service, envoy.stat_sinks.open_telemetry, envoy.stat_sinks.wasm
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.resource_monitors: envoy.resource_monitors.downstream_connections
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.bootstrap: envoy.bootstrap.internal_listener, envoy.bootstrap.wasm, envoy.extensions.network.socket_interface.default_socket_interface
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.matching.action: rate_limit_quota
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.http.stateful_header_formatters: envoy.http.stateful_header_formatters.preserve_case, preserve_case
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.access_loggers: envoy.access_loggers.file, envoy.access_loggers.http_grpc, envoy.access_loggers.open_telemetry, envoy.access_loggers.stderr, envoy.access_loggers.stdout, envoy.access_loggers.tcp_grpc, envoy.access_loggers.wasm, envoy.file_access_log, envoy.http_grpc_access_log, envoy.open_telemetry_access_log, envoy.stderr_access_log, envoy.stdout_access_log, envoy.tcp_grpc_access_log, envoy.wasm_access_log
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.path.rewrite: envoy.path.rewrite.uri_template.uri_template_rewriter
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.thrift_proxy.filters: envoy.filters.thrift.router
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.transport_sockets.upstream: cilium.tls_wrapper, envoy.transport_sockets.internal_upstream, envoy.transport_sockets.quic, envoy.transport_sockets.raw_buffer, envoy.transport_sockets.tls, raw_buffer, tls
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.upstream_options: envoy.extensions.upstreams.http.v3.HttpProtocolOptions, envoy.extensions.upstreams.tcp.v3.TcpProtocolOptions, envoy.upstreams.http.http_protocol_options, envoy.upstreams.tcp.tcp_protocol_options
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.network.dns_resolver: envoy.network.dns_resolver.cares
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.load_balancing_policies: envoy.load_balancing_policies.cluster_provided, envoy.load_balancing_policies.least_request, envoy.load_balancing_policies.maglev, envoy.load_balancing_policies.random, envoy.load_balancing_policies.ring_hash, envoy.load_balancing_policies.round_robin, envoy.load_balancing_policies.subset
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.filters.http: cilium.l7policy, envoy.ext_authz, envoy.filters.http.dynamic_forward_proxy, envoy.filters.http.ext_authz, envoy.filters.http.grpc_stats, envoy.filters.http.grpc_web, envoy.filters.http.health_check, envoy.filters.http.jwt_authn, envoy.filters.http.local_ratelimit, envoy.filters.http.match_delegate, envoy.filters.http.oauth2, envoy.filters.http.rate_limit_quota, envoy.filters.http.ratelimit, envoy.filters.http.router, envoy.filters.http.set_metadata, envoy.filters.http.wasm, envoy.grpc_web, envoy.health_check, envoy.local_rate_limit, envoy.rate_limit, envoy.router
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.filters.http.upstream: envoy.filters.http.upstream_codec
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.quic.proof_source: envoy.quic.proof_source.filter_chain
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.guarddog_actions: envoy.watchdog.abort_action
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.quic.server.crypto_stream: envoy.quic.crypto_stream.server.quiche
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.http.header_validators: envoy.http.header_validators.envoy_default
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.upstream.local_address_selector: envoy.upstream.local_address_selector.default_local_address_selector
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.health_check.event_sinks: cilium.health_check.event_sink.pipe, envoy.health_check.event_sink.file
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.config_subscription: envoy.config_subscription.ads, envoy.config_subscription.ads_collection, envoy.config_subscription.aggregated_grpc_collection, envoy.config_subscription.delta_grpc, envoy.config_subscription.delta_grpc_collection, envoy.config_subscription.filesystem, envoy.config_subscription.filesystem_collection, envoy.config_subscription.grpc
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   filter_state.object: envoy.network.application_protocols, envoy.network.transport_socket.original_dst_address, envoy.network.upstream_server_name, envoy.network.upstream_subject_alt_names, envoy.tcp_proxy.cluster, envoy.tcp_proxy.disable_tunneling, envoy.upstream.dynamic_host, envoy.upstream.dynamic_port
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.wasm.runtime: envoy.wasm.runtime.null, envoy.wasm.runtime.v8
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.listener_manager_impl: envoy.listener_manager_impl.default, envoy.listener_manager_impl.validation
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.thrift_proxy.transports: auto, framed, header, unframed
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.udp_packet_writer: envoy.udp_packet_writer.default, envoy.udp_packet_writer.gso
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.matching.input_matchers: envoy.matching.matchers.cel_matcher
[2024-05-21 04:33:12.166][8][info][main] [external/envoy/source/server/server.cc:434]   envoy.path.match: envoy.path.match.uri_template.uri_template_matcher
[2024-05-21 04:33:12.174][8][info][main] [external/envoy/source/server/server.cc:488] HTTP header map info:
[2024-05-21 04:33:12.179][8][info][main] [external/envoy/source/server/server.cc:491]   request header map: 592 bytes: :authority,:method,:path,:protocol,:scheme,accept,access-control-request-method,authorization,connection,content-length,content-type,expect,grpc-accept-encoding,grpc-timeout,keep-alive,origin,proxy-connection,proxy-status,referer,sec-websocket-extensions,sec-websocket-key,sec-websocket-protocol,sec-websocket-version,te,transfer-encoding,upgrade,user-agent,via,x-client-trace-id,x-envoy-attempt-count,x-envoy-decorator-operation,x-envoy-downstream-service-cluster,x-envoy-downstream-service-node,x-envoy-expected-rq-timeout-ms,x-envoy-external-address,x-envoy-force-trace,x-envoy-hedge-on-per-try-timeout,x-envoy-internal,x-envoy-ip-tags,x-envoy-is-timeout-retry,x-envoy-max-retries,x-envoy-original-dst-host,x-envoy-original-path,x-envoy-original-url,x-envoy-retriable-header-names,x-envoy-retriable-status-codes,x-envoy-retry-grpc-on,x-envoy-retry-on,x-envoy-upstream-alt-stat-name,x-envoy-upstream-rq-per-try-timeout-ms,x-envoy-upstream-rq-timeout-alt-response,x-envoy-upstream-rq-timeout-ms,x-envoy-upstream-stream-duration-ms,x-forwarded-client-cert,x-forwarded-for,x-forwarded-host,x-forwarded-port,x-forwarded-proto,x-request-id
[2024-05-21 04:33:12.179][8][info][main] [external/envoy/source/server/server.cc:491]   request trailer map: 120 bytes: 
[2024-05-21 04:33:12.179][8][info][main] [external/envoy/source/server/server.cc:491]   response header map: 328 bytes: :status,connection,content-length,content-type,date,grpc-message,grpc-status,keep-alive,location,proxy-connection,proxy-status,sec-websocket-accept,server,transfer-encoding,upgrade,via,x-envoy-attempt-count,x-envoy-decorator-operation,x-envoy-degraded,x-envoy-immediate-health-check-fail,x-envoy-ratelimited,x-envoy-upstream-canary,x-envoy-upstream-healthchecked-cluster,x-envoy-upstream-service-time,x-request-id
[2024-05-21 04:33:12.179][8][info][main] [external/envoy/source/server/server.cc:491]   response trailer map: 144 bytes: grpc-message,grpc-status
[2024-05-21 04:33:12.192][8][info][main] [external/envoy/source/server/server.cc:845] runtime: layers:
  - name: static_layer_0
    static_layer:
      overload:
        global_downstream_max_connections: 50000
[2024-05-21 04:33:12.192][8][info][admin] [external/envoy/source/server/admin/admin.cc:66] admin address: /var/run/cilium/envoy/sockets/admin.sock
[2024-05-21 04:33:12.192][8][info][config] [external/envoy/source/server/configuration_impl.cc:160] loading tracing configuration
[2024-05-21 04:33:12.192][8][info][config] [external/envoy/source/server/configuration_impl.cc:119] loading 0 static secret(s)
[2024-05-21 04:33:12.192][8][info][config] [external/envoy/source/server/configuration_impl.cc:125] loading 6 cluster(s)
[2024-05-21 04:33:12.207][8][info][config] [external/envoy/source/server/configuration_impl.cc:129] loading 3 listener(s)
[2024-05-21 04:33:12.216][8][info][config] [external/envoy/source/server/configuration_impl.cc:146] loading stats configuration
[2024-05-21 04:33:12.217][8][info][runtime] [external/envoy/source/common/runtime/runtime_impl.cc:584] RTDS has finished initialization
[2024-05-21 04:33:12.217][8][info][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:236] cm init: initializing cds
[2024-05-21 04:33:12.226][8][info][main] [external/envoy/source/server/server.cc:951] starting main dispatch loop
[2024-05-21 04:33:27.217][8][warning][config] [external/envoy/source/extensions/config_subscription/grpc/grpc_subscription_impl.cc:130] gRPC config: initial fetch timed out for type.googleapis.com/envoy.config.cluster.v3.Cluster
[2024-05-21 04:33:27.217][8][info][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:240] cm init: all clusters initialized
[2024-05-21 04:33:27.217][8][info][main] [external/envoy/source/server/server.cc:932] all clusters initialized. initializing init manager
[2024-05-21 04:33:34.866][8][info][upstream] [external/envoy/source/common/upstream/cds_api_helper.cc:32] cds: add 0 cluster(s), remove 6 cluster(s)
[2024-05-21 04:33:34.866][8][info][upstream] [external/envoy/source/common/upstream/cds_api_helper.cc:71] cds: added/updated 0 cluster(s), skipped 0 unmodified cluster(s)
[2024-05-21 04:33:38.627][8][info][config] [external/envoy/source/common/listener_manager/listener_manager_impl.cc:926] all dependencies initialized. starting workers
[2024-05-21 04:34:17.746][8][info][upstream] [external/envoy/source/common/listener_manager/lds_api.cc:99] lds: add/update listener 'cilium-http-egress:17881'
[2024-05-21 04:48:38.588][8][info][main] [external/envoy/source/server/drain_manager_impl.cc:208] shutting down parent after drain

I also notice "stream reset: reset reason: local reset, response details: http1.codec_error" when I turn on debug mode.

cilium-envoy debug logs
[2024-05-21 05:18:49.874][16][debug][router] [external/envoy/source/common/router/router.cc:1514] [Tags: "ConnectionId":"729","StreamId":"13711340128884287025"] upstream headers complete: end_stream=false
[2024-05-21 05:18:49.874][16][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:1803] [Tags: "ConnectionId":"729","StreamId":"13711340128884287025"] closing connection due to connection close header
[2024-05-21 05:18:49.874][16][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:1869] [Tags: "ConnectionId":"729","StreamId":"13711340128884287025"] encoding headers via codec (end_stream=false):
':status', '200'
'content-type', 'text/plain; charset=UTF-8'
'cache-control', 'no-cache, max-age=0'
'x-content-type-options', 'nosniff'
'date', 'Tue, 21 May 2024 05:18:49 GMT'
'server', 'envoy'
'x-envoy-upstream-service-time', '0'
'connection', 'close'

[2024-05-21 05:18:49.874][16][debug][client] [external/envoy/source/common/http/codec_client.cc:128] [Tags: "ConnectionId":"15"] response complete
[2024-05-21 05:18:49.874][16][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:1974] [Tags: "ConnectionId":"729","StreamId":"13711340128884287025"] Codec completed encoding stream.
[2024-05-21 05:18:49.874][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:146] [Tags: "ConnectionId":"729"] closing data_to_write=277 type=0
[2024-05-21 05:18:49.874][16][debug][connection] [external/envoy/source/common/network/connection_impl_base.cc:47] [Tags: "ConnectionId":"729"] setting delayed close timer with timeout 1000 ms
[2024-05-21 05:18:49.874][16][debug][pool] [external/envoy/source/common/http/http1/conn_pool.cc:53] [Tags: "ConnectionId":"15"] response complete
[2024-05-21 05:18:49.874][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:215] [Tags: "ConnectionId":"15"] destroying stream: 0 remaining
[2024-05-21 05:18:49.874][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:788] [Tags: "ConnectionId":"729"] write flush complete
[2024-05-21 05:18:49.874][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:278] [Tags: "ConnectionId":"729"] closing socket: 1
[2024-05-21 05:18:49.874][16][debug][conn_handler] [external/envoy/source/common/listener_manager/active_stream_listener_base.cc:136] [Tags: "ConnectionId":"729"] adding to cleanup list
[2024-05-21 05:18:52.660][8][debug][main] [external/envoy/source/server/server.cc:239] flushing stats
[2024-05-21 05:18:53.421][15][debug][filter] [external/envoy/source/extensions/filters/listener/tls_inspector/tls_inspector.cc:137] tls:onServerName(), requestedServerName: pps-token.svc.tst.tkp
[2024-05-21 05:18:53.421][15][debug][misc] [cilium/bpf_metadata.cc:297] EGRESS POD IP: 172.x.x.x, destination IP: 172.x.x.x
[2024-05-21 05:18:53.421][15][debug][filter] [cilium/conntrack.cc:178] cilium.bpf_metadata: Using conntrack map global
[2024-05-21 05:18:53.421][15][debug][filter] [cilium/ipcache.cc:90] cilium.ipcache: 172.x.x.x has ID 16777219
[2024-05-21 05:18:53.421][15][debug][filter] [./cilium/socket_option.h:239] Cilium SocketOption(): source_identity: 285520, ingress: false, port: 443, pod_ip: 172.x.x.x, source_addresses: //, mark: 5b500b04 (magic mark: b00, cluster: 4, ID: 23376), proxy_id: 17881
[2024-05-21 05:18:53.421][15][debug][filter] [cilium/ipcache.cc:90] cilium.ipcache: 172.x.x.x has ID 16777219
[2024-05-21 05:18:53.421][15][debug][misc] [cilium/tls_wrapper.cc:86] cilium.tls_wrapper: Could not get server TLS context for port 443, defaulting to raw socket
[2024-05-21 05:18:53.421][15][debug][filter] [cilium/network_filter.cc:76] cilium.network: onNewConnection
[2024-05-21 05:18:53.422][15][debug][conn_handler] [external/envoy/source/common/listener_manager/active_tcp_listener.cc:160] [Tags: "ConnectionId":"730"] new connection from 172.x.x.x:35460
[2024-05-21 05:18:53.422][15][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:393] [Tags: "ConnectionId":"730"] new stream
[2024-05-21 05:18:53.422][15][debug][http] [external/envoy/source/common/http/filter_manager.cc:1065] [Tags: "ConnectionId":"730","StreamId":"11083364593934732890"] Sending local reply with details http1.codec_error
[2024-05-21 05:18:53.422][15][debug][router] [cilium/uds_client.cc:56] Cilium access log resetting socket due to error: Broken pipe
[2024-05-21 05:18:53.422][15][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:1803] [Tags: "ConnectionId":"730","StreamId":"11083364593934732890"] closing connection due to connection close header
[2024-05-21 05:18:53.422][15][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:1869] [Tags: "ConnectionId":"730","StreamId":"11083364593934732890"] encoding headers via codec (end_stream=false):
':status', '400'
'content-length', '11'
'content-type', 'text/plain'
'date', 'Tue, 21 May 2024 05:18:53 GMT'
'server', 'envoy'
'connection', 'close'

[2024-05-21 05:18:53.422][15][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:1974] [Tags: "ConnectionId":"730","StreamId":"11083364593934732890"] Codec completed encoding stream.
[2024-05-21 05:18:53.422][15][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:247] [Tags: "ConnectionId":"730","StreamId":"11083364593934732890"] doEndStream() resetting stream
[2024-05-21 05:18:53.423][15][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:1938] [Tags: "ConnectionId":"730","StreamId":"11083364593934732890"] stream reset: reset reason: local reset, response details: http1.codec_error
[2024-05-21 05:18:53.423][15][debug][connection] [external/envoy/source/common/network/connection_impl.cc:146] [Tags: "ConnectionId":"730"] closing data_to_write=156 type=2
[2024-05-21 05:18:53.423][15][debug][connection] [external/envoy/source/common/network/connection_impl_base.cc:47] [Tags: "ConnectionId":"730"] setting delayed close timer with timeout 1000 ms
[2024-05-21 05:18:53.423][15][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:439] [Tags: "ConnectionId":"730"] dispatch error: http/1.1 protocol error: HPE_INVALID_METHOD
[2024-05-21 05:18:53.423][15][debug][connection] [external/envoy/source/common/network/connection_impl.cc:146] [Tags: "ConnectionId":"730"] closing data_to_write=156 type=2
[2024-05-21 05:18:53.423][15][debug][connection] [external/envoy/source/common/network/connection_impl.cc:788] [Tags: "ConnectionId":"730"] write flush complete
[2024-05-21 05:18:53.453][15][debug][connection] [external/envoy/source/common/network/connection_impl.cc:788] [Tags: "ConnectionId":"730"] write flush complete
[2024-05-21 05:18:54.420][15][debug][connection] [external/envoy/source/common/network/connection_impl_base.cc:69] [Tags: "ConnectionId":"730"] triggered delayed close
[2024-05-21 05:18:54.420][15][debug][connection] [external/envoy/source/common/network/connection_impl.cc:278] [Tags: "ConnectionId":"730"] closing socket: 1
[2024-05-21 05:18:54.420][15][debug][conn_handler] [external/envoy/source/common/listener_manager/active_stream_listener_base.cc:136] [Tags: "ConnectionId":"730"] adding to cleanup list
[2024-05-21 05:18:57.660][8][debug][main] [external/envoy/source/server/server.cc:239] flushing stats

My nodes currently run the AWS EKS managed AMIs, version 1.29.3-20240514.

These are the TCP packets I see with Hubble:

[screenshot: Hubble flow view of the TCP packets]

Edit:

hubble observe

$ hubble observe --type trace:to-proxy
May 21 10:23:46.777: cilium-poc/client-7655c9f8d6-rktrm:56996 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: ACK, PSH)
May 21 10:23:46.809: cilium-poc/client-7655c9f8d6-rktrm:56996 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: ACK, RST)
May 21 10:23:49.821: cilium-poc/client-7655c9f8d6-rktrm:51846 (ID:285520) -> kube-system/coredns-66f55b6cf5-pbh9c:53 (ID:262381) to-proxy FORWARDED (UDP)
May 21 10:23:49.839: cilium-poc/client-7655c9f8d6-rktrm:51846 (ID:285520) <- kube-system/coredns-66f55b6cf5-pbh9c:53 (ID:262381) to-proxy FORWARDED (UDP)
May 21 10:23:49.854: cilium-poc/client-7655c9f8d6-rktrm:57004 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: SYN)
May 21 10:23:49.854: cilium-poc/client-7655c9f8d6-rktrm:57004 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: ACK)
May 21 10:23:49.858: cilium-poc/client-7655c9f8d6-rktrm:57004 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: ACK, PSH)
May 21 10:23:49.901: cilium-poc/client-7655c9f8d6-rktrm:57004 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: ACK, RST)
May 21 10:23:52.909: cilium-poc/client-7655c9f8d6-rktrm:33220 (ID:285520) -> kube-system/coredns-66f55b6cf5-lwv7w:53 (ID:262381) to-proxy FORWARDED (UDP)
May 21 10:23:52.912: cilium-poc/client-7655c9f8d6-rktrm:33220 (ID:285520) <- kube-system/coredns-66f55b6cf5-lwv7w:53 (ID:262381) to-proxy FORWARDED (UDP)
May 21 10:23:52.921: cilium-poc/client-7655c9f8d6-rktrm:57010 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: SYN)
May 21 10:23:52.921: cilium-poc/client-7655c9f8d6-rktrm:57010 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: ACK)
May 21 10:23:52.927: cilium-poc/client-7655c9f8d6-rktrm:57010 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: ACK, PSH)
May 21 10:23:52.954: cilium-poc/client-7655c9f8d6-rktrm:57010 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: ACK, RST)
May 21 10:23:55.960: cilium-poc/client-7655c9f8d6-rktrm:56158 (ID:285520) -> kube-system/coredns-66f55b6cf5-pbh9c:53 (ID:262381) to-proxy FORWARDED (UDP)
May 21 10:23:55.969: cilium-poc/client-7655c9f8d6-rktrm:56158 (ID:285520) <- kube-system/coredns-66f55b6cf5-pbh9c:53 (ID:262381) to-proxy FORWARDED (UDP)
May 21 10:23:55.986: cilium-poc/client-7655c9f8d6-rktrm:42466 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: SYN)
May 21 10:23:55.986: cilium-poc/client-7655c9f8d6-rktrm:42466 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: ACK)
May 21 10:23:55.991: cilium-poc/client-7655c9f8d6-rktrm:42466 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: ACK, PSH)
May 21 10:23:56.020: cilium-poc/client-7655c9f8d6-rktrm:42466 (ID:285520) -> dummyjson.com:443 (ID:16777321) to-proxy FORWARDED (TCP Flags: ACK, RST)

Nothing shows up with the HTTP protocol:

$ hubble observe --protocol http --from-pod cilium-poc/client-7655c9f8d6-rktrm

From the cilium-agent running on the same node:

time="2024-05-21T10:03:34Z" level=debug msg="Received DNS response to proxied lookup" DNSRequestID=2423 dnsName=dummyjson.com. endpointID=2734 identity=285520 ipAddr="172.x.x.x:41150" resp=";; opcode: QUERY, status: NOERROR, id: 8720\n;; flags: qr aa rd ra; QUERY: 1, ANSWER: 0, AUTHORITY: 1, ADDITIONAL: 1\n\n;; OPT PSEUDOSECTION:\n; EDNS: version 0; flags: ; udp: 1232\n\n;; QUESTION SECTION:\n;dummyjson.com.\tIN\t AAAA\n\n;; AUTHORITY SECTION:\ndummyjson.com.\t20\tIN\tSOA\tdns1.registrar-servers.com. hostmaster.registrar-servers.com. 1704487896 43200 3600 604800 3601\n" subsys=fqdn/dnsproxy
time="2024-05-21T10:03:34Z" level=debug msg="Notifying with DNS response to original DNS query" DNSRequestID=2423 dnsName=dummyjson.com. endpointID=2734 identity=285520 ipAddr="172.x.x.x:41150" subsys=fqdn/dnsproxy
time="2024-05-21T10:03:34Z" level=debug msg="Responding to original DNS query" DNSRequestID=2423 dnsName=dummyjson.com. endpointID=2734 identity=285520 ipAddr="172.x.x.x:41150" subsys=fqdn/dnsproxy
time="2024-05-21T10:03:34Z" level=debug msg="dnsproxy: Wrote DNS response (149/149 bytes) from 172.x.x.x:53 to 172.x.x.x:41150" subsys=fqdn/dnsproxy

Running cilium connectivity test works fine, except for some Hubble errors.

@github-actions github-actions bot added info-completed The GH issue has received a reply from the author and removed need-more-info More information is required to further debug or fix the issue. labels May 21, 2024
lmb (Contributor) commented May 21, 2024

Can you try a plaintext curl to port 443 instead? The version error is most likely because you're getting a plain text error message back since TLS is not configured correctly. Something like curl -v http://dummyjson.com:443/products/1.

@lmb lmb added the need-more-info More information is required to further debug or fix the issue. label May 21, 2024
sjoukedv (Author) commented May 22, 2024

Traffic going to-stack without the http rule, which works fine (so we are able to connect from our pod when bypassing the Envoy proxy):
May 22 05:21:28.568: cilium-poc/client-7655c9f8d6-2h7gm (ID:296361) <> kube-system/kube-dns:53 (world) pre-xlate-fwd TRACED (UDP)
May 22 05:21:28.568: cilium-poc/client-7655c9f8d6-2h7gm (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) post-xlate-fwd TRANSLATED (UDP)
May 22 05:21:28.568: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) policy-verdict:L4-Only EGRESS ALLOWED (UDP)
May 22 05:21:28.568: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) to-proxy FORWARDED (UDP)
May 22 05:21:28.569: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.569: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.570: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.cilium-poc.svc.cluster.local. AAAA)
May 22 05:21:28.570: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) to-endpoint FORWARDED (UDP)
May 22 05:21:28.570: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.571: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.cilium-poc.svc.cluster.local. A)
May 22 05:21:28.571: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.574: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.574: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.575: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.svc.cluster.local. AAAA)
May 22 05:21:28.575: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.576: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.svc.cluster.local. A)
May 22 05:21:28.576: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.578: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.cluster.local. A)
May 22 05:21:28.578: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.579: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.cluster.local. AAAA)
May 22 05:21:28.579: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.579: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.580: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.581: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.eu-west-1.compute.internal. A)
May 22 05:21:28.581: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.582: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.583: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.eu-west-1.compute.internal. AAAA)
May 22 05:21:28.583: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.583: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.587: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.587: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.587: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com. AAAA)
May 22 05:21:28.587: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com. A)
May 22 05:21:28.588: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.588: cilium-poc/client-7655c9f8d6-2h7gm:49626 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:21:28.595: cilium-poc/client-7655c9f8d6-2h7gm:51280 (ID:296361) -> dummyjson.com:443 (ID:16777320) policy-verdict:L3-L4 EGRESS ALLOWED (TCP Flags: SYN)
May 22 05:21:28.596: cilium-poc/client-7655c9f8d6-2h7gm:51280 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-stack FORWARDED (TCP Flags: SYN)
May 22 05:21:28.733: cilium-poc/client-7655c9f8d6-2h7gm:51280 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-stack FORWARDED (TCP Flags: ACK)
May 22 05:21:28.752: cilium-poc/client-7655c9f8d6-2h7gm:51280 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-stack FORWARDED (TCP Flags: ACK, PSH)
May 22 05:21:29.062: cilium-poc/client-7655c9f8d6-2h7gm:51280 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-stack FORWARDED (TCP Flags: ACK, FIN)
May 22 05:21:29.197: cilium-poc/client-7655c9f8d6-2h7gm:51280 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-stack FORWARDED (TCP Flags: RST)
May 22 05:21:29.197: cilium-poc/client-7655c9f8d6-2h7gm:51280 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-stack FORWARDED (TCP Flags: RST)
May 22 05:21:29.198: cilium-poc/client-7655c9f8d6-2h7gm:51280 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-stack FORWARDED (TCP Flags: RST)
With the http rule:
May 22 05:23:47.348: cilium-poc/client-7655c9f8d6-2h7gm (ID:296361) <> kube-system/kube-dns:53 (world) pre-xlate-fwd TRACED (UDP)
May 22 05:23:47.348: cilium-poc/client-7655c9f8d6-2h7gm (ID:296361) <> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) post-xlate-fwd TRANSLATED (UDP)
May 22 05:23:47.348: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) policy-verdict:L4-Only EGRESS ALLOWED (UDP)
May 22 05:23:47.348: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) to-proxy FORWARDED (UDP)
May 22 05:23:47.348: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.349: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.cilium-poc.svc.cluster.local. A)
May 22 05:23:47.349: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) to-endpoint FORWARDED (UDP)
May 22 05:23:47.349: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> kube-system/coredns-6ff88dd996-9xqw5 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.350: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.351: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.cilium-poc.svc.cluster.local. AAAA)
May 22 05:23:47.351: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> kube-system/coredns-6ff88dd996-9xqw5 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.353: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.svc.cluster.local. A)
May 22 05:23:47.353: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.353: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> kube-system/coredns-6ff88dd996-9xqw5 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.353: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.353: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.svc.cluster.local. AAAA)
May 22 05:23:47.354: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> kube-system/coredns-6ff88dd996-9xqw5 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.356: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.356: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.356: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.cluster.local. AAAA)
May 22 05:23:47.356: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.cluster.local. A)
May 22 05:23:47.357: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> kube-system/coredns-6ff88dd996-9xqw5 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.357: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> kube-system/coredns-6ff88dd996-9xqw5 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.358: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.358: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.eu-west-1.compute.internal. A)
May 22 05:23:47.358: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.358: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> kube-system/coredns-6ff88dd996-9xqw5 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.359: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.eu-west-1.compute.internal. AAAA)
May 22 05:23:47.359: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> kube-system/coredns-6ff88dd996-9xqw5 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.361: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.361: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.361: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com. A)
May 22 05:23:47.362: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) -> kube-system/coredns-6ff88dd996-9xqw5:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com. AAAA)
May 22 05:23:47.362: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> kube-system/coredns-6ff88dd996-9xqw5 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.362: cilium-poc/client-7655c9f8d6-2h7gm:59795 (ID:296361) <> kube-system/coredns-6ff88dd996-9xqw5 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:23:47.365: cilium-poc/client-7655c9f8d6-2h7gm:48372 (ID:296361) -> dummyjson.com:443 (ID:16777320) policy-verdict:L3-L4 EGRESS ALLOWED (TCP Flags: SYN)
May 22 05:23:47.365: cilium-poc/client-7655c9f8d6-2h7gm:48372 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-proxy FORWARDED (TCP Flags: SYN)
May 22 05:23:47.365: cilium-poc/client-7655c9f8d6-2h7gm:48372 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-proxy FORWARDED (TCP Flags: ACK)
May 22 05:23:47.365: cilium-poc/client-7655c9f8d6-2h7gm:48372 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (TCP)
May 22 05:23:47.369: cilium-poc/client-7655c9f8d6-2h7gm:48372 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-proxy FORWARDED (TCP Flags: ACK, PSH)
May 22 05:23:47.416: cilium-poc/client-7655c9f8d6-2h7gm:48372 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-proxy FORWARDED (TCP Flags: ACK, RST)
May 22 05:23:47.954: cilium-poc/client-7655c9f8d6-2h7gm:42846 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-proxy FORWARDED (TCP Flags: RST)
Trying with http:// on port 443 (with the http rule enabled, so traffic goes through the proxy):
/ # curl -v --max-time 5 http://dummyjson.com:443/products/1
* Host dummyjson.com:443 was resolved.
* IPv6: (none)
* IPv4: 104.196.232.237, 104.196.232.237
*   Trying 104.196.232.237:443...
* Connected to dummyjson.com (104.196.232.237) port 443
> GET /products/1 HTTP/1.1
> Host: dummyjson.com:443
> User-Agent: curl/8.5.0
> Accept: */*
> 
< HTTP/1.1 503 Service Unavailable
< content-length: 118
< content-type: text/plain
< date: Wed, 22 May 2024 05:24:40 GMT
< server: envoy
< 
* Connection #0 to host dummyjson.com left intact
upstream connect error or disconnect/reset before headers. retried and the latest reset reason: connection termination/
Hubble flows for the http://:443 call
May 22 05:34:37.485: cilium-poc/client-74987657df-4t9jr (ID:296361) <> kube-system/kube-dns:53 (world) pre-xlate-fwd TRACED (UDP)
May 22 05:34:37.485: cilium-poc/client-74987657df-4t9jr (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) post-xlate-fwd TRANSLATED (UDP)
May 22 05:34:37.486: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) policy-verdict:L4-Only EGRESS ALLOWED (UDP)
May 22 05:34:37.486: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) to-proxy FORWARDED (UDP)
May 22 05:34:37.487: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.494: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.505: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.cilium-poc.svc.cluster.local. A)
May 22 05:34:37.505: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.cilium-poc.svc.cluster.local. AAAA)
May 22 05:34:37.510: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.510: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.510: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) to-endpoint FORWARDED (UDP)
May 22 05:34:37.518: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.518: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.519: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.svc.cluster.local. AAAA)
May 22 05:34:37.519: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.519: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.svc.cluster.local. A)
May 22 05:34:37.521: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.522: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.522: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.522: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.cluster.local. AAAA)
May 22 05:34:37.522: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.523: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.cluster.local. A)
May 22 05:34:37.524: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.525: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.eu-west-1.compute.internal. AAAA)
May 22 05:34:37.526: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com.eu-west-1.compute.internal. A)
May 22 05:34:37.526: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.526: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.526: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.527: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.530: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.530: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.530: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com. A)
May 22 05:34:37.531: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) -> kube-system/coredns-6ff88dd996-vjtb4:53 (ID:262381) dns-request proxy FORWARDED (DNS Query dummyjson.com. AAAA)
May 22 05:34:37.531: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.531: cilium-poc/client-74987657df-4t9jr:42445 (ID:296361) <> kube-system/coredns-6ff88dd996-vjtb4 (ID:262381) pre-xlate-rev TRACED (UDP)
May 22 05:34:37.541: cilium-poc/client-74987657df-4t9jr:34812 (ID:296361) -> dummyjson.com:443 (ID:16777320) policy-verdict:L3-L4 EGRESS ALLOWED (TCP Flags: SYN)
May 22 05:34:37.541: cilium-poc/client-74987657df-4t9jr:34812 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-proxy FORWARDED (TCP Flags: SYN)
May 22 05:34:37.541: cilium-poc/client-74987657df-4t9jr:34812 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-proxy FORWARDED (TCP Flags: ACK)
May 22 05:34:37.541: cilium-poc/client-74987657df-4t9jr:34812 (ID:296361) <> 172.x.x.x (host) pre-xlate-rev TRACED (TCP)
May 22 05:34:37.542: cilium-poc/client-74987657df-4t9jr:34812 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-proxy FORWARDED (TCP Flags: ACK, PSH)
May 22 05:34:37.725: cilium-poc/client-74987657df-4t9jr:34812 (ID:296361) -> dummyjson.com:443 (ID:16777320) http-request FORWARDED (HTTP/1.1 GET http://dummyjson.com:443/products/1)
May 22 05:34:38.007: cilium-poc/client-74987657df-4t9jr:34812 (ID:296361) -> dummyjson.com:443 (ID:16777320) http-request FORWARDED (HTTP/1.1 GET http://dummyjson.com:443/products/1)
May 22 05:34:38.319: cilium-poc/client-74987657df-4t9jr:34812 (ID:296361) -> dummyjson.com:443 (ID:16777320) http-request FORWARDED (HTTP/1.1 GET http://dummyjson.com:443/products/1)
May 22 05:34:38.649: cilium-poc/client-74987657df-4t9jr:34812 (ID:296361) -> dummyjson.com:443 (ID:16777320) http-request FORWARDED (HTTP/1.1 GET http://dummyjson.com:443/products/1)
May 22 05:34:38.788: cilium-poc/client-74987657df-4t9jr:34812 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-proxy FORWARDED (TCP Flags: ACK, FIN)
May 22 05:34:38.788: cilium-poc/client-74987657df-4t9jr:34812 (ID:296361) -> dummyjson.com:443 (ID:16777320) to-proxy FORWARDED (TCP Flags: ACK)
Debug logs of the envoy-proxy for the http://<addr>:443 call:
[2024-05-22 06:12:42.629][14][trace][filter] [cilium/network_filter.cc:99] [Tags: "ConnectionId":"4225"] cilium.network: SNI: dummyjson.com

Related to the call:

[2024-05-22 05:28:04.102][16][trace][filter] [external/envoy/source/extensions/filters/listener/tls_inspector/tls_inspector.cc:106] tls inspector: new connection accepted
[2024-05-22 05:28:04.103][16][trace][misc] [external/envoy/source/common/network/tcp_listener_impl.cc:116] TcpListener accepted 1 new connections.
[2024-05-22 05:28:04.103][16][trace][filter] [external/envoy/source/common/network/listener_filter_buffer_impl.cc:95] onFileEvent: 1
[2024-05-22 05:28:04.103][16][trace][filter] [external/envoy/source/common/network/listener_filter_buffer_impl.cc:60] recv returned: 90
[2024-05-22 05:28:04.104][16][trace][filter] [external/envoy/source/extensions/filters/listener/tls_inspector/tls_inspector.cc:146] tls inspector: recv: 90
[2024-05-22 05:28:04.105][16][debug][misc] [cilium/bpf_metadata.cc:297] EGRESS POD IP: 172.x.x.x, destination IP: 104.196.232.237
[2024-05-22 05:28:04.105][16][debug][filter] [cilium/conntrack.cc:178] cilium.bpf_metadata: Using conntrack map global
[2024-05-22 05:28:04.105][16][trace][filter] [cilium/conntrack.cc:197] cilium.bpf_metadata: Looking up key: 68c4e8ed, ac1f756f, 1bb, c7ea, 6, 0
[2024-05-22 05:28:04.106][16][trace][filter] [cilium/ipcache.cc:82] cilium.ipcache: Looking up key: 68c4e8ed, prefixlen: 32
[2024-05-22 05:28:04.107][16][debug][filter] [cilium/ipcache.cc:90] cilium.ipcache: 104.196.232.237 has ID 16777320
[2024-05-22 05:28:04.107][16][debug][filter] [./cilium/socket_option.h:239] Cilium SocketOption(): source_identity: 296361, ingress: false, port: 443, pod_ip: 172.x.x.x, source_addresses: //, mark: 85a90b04 (magic mark: b00, cluster: 4, ID: 34217), proxy_id: 0
[2024-05-22 05:28:04.107][16][trace][misc] [external/envoy/source/common/event/scaled_range_timer_manager_impl.cc:60] enableTimer called on 0x6e37f4968c0 for 3600000ms, min is 3600000ms
[2024-05-22 05:28:04.107][16][debug][filter] [cilium/network_filter.cc:76] cilium.network: onNewConnection
[2024-05-22 05:28:04.107][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:469] [Tags: "ConnectionId":"4015"] raising connection event 2
[2024-05-22 05:28:04.107][16][debug][conn_handler] [external/envoy/source/extensions/listener_managers/listener_manager/active_tcp_listener.cc:159] [Tags: "ConnectionId":"4015"] new connection from 172.x.x.x:51178
[2024-05-22 05:28:04.108][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=1)
[2024-05-22 05:28:04.108][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:124] clearing deferred deletion list (size=1)
[2024-05-22 05:28:04.108][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4015"] socket event: 3
[2024-05-22 05:28:04.108][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4015"] write ready
[2024-05-22 05:28:04.108][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:654] [Tags: "ConnectionId":"4015"] read ready. dispatch_buffered_data=0
[2024-05-22 05:28:04.108][16][trace][connection] [external/envoy/source/common/network/raw_buffer_socket.cc:25] [Tags: "ConnectionId":"4015"] read returns: 90
[2024-05-22 05:28:04.108][16][trace][connection] [external/envoy/source/common/network/raw_buffer_socket.cc:39] [Tags: "ConnectionId":"4015"] read error: Resource temporarily unavailable, code: 0
[2024-05-22 05:28:04.108][16][trace][filter] [cilium/network_filter.cc:196] cilium.network: onData 90 bytes, end_stream: false
[2024-05-22 05:28:04.108][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:648] [Tags: "ConnectionId":"4015"] parsing 90 bytes
[2024-05-22 05:28:04.109][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:590] [Tags: "ConnectionId":"4015"] message begin
[2024-05-22 05:28:04.109][16][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:391] [Tags: "ConnectionId":"4015"] new stream
[2024-05-22 05:28:04.109][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:547] [Tags: "ConnectionId":"4015"] completed header: key=Host value=dummyjson.com:443
[2024-05-22 05:28:04.109][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:547] [Tags: "ConnectionId":"4015"] completed header: key=User-Agent value=curl/8.5.0
[2024-05-22 05:28:04.109][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:841] [Tags: "ConnectionId":"4015"] onHeadersCompleteImpl
[2024-05-22 05:28:04.109][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:547] [Tags: "ConnectionId":"4015"] completed header: key=Accept value=*/*
[2024-05-22 05:28:04.109][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:1195] [Tags: "ConnectionId":"4015"] Server: onHeadersComplete size=3
[2024-05-22 05:28:04.109][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:945] [Tags: "ConnectionId":"4015"] message complete
[2024-05-22 05:28:04.109][16][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:1194] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] request headers complete (end_stream=true):
':authority', 'dummyjson.com:443'
':path', '/products/1'
':method', 'GET'
'user-agent', 'curl/8.5.0'
'accept', '*/*'

[2024-05-22 05:28:04.109][16][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:1177] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] request end stream
[2024-05-22 05:28:04.110][16][debug][connection] [external/envoy/source/common/network/connection_impl.h:98] [Tags: "ConnectionId":"4015"] current connecting state: false
[2024-05-22 05:28:04.111][16][trace][http] [external/envoy/source/common/http/filter_manager.cc:572] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] decode headers called: filter=cilium.l7policy status=0
[2024-05-22 05:28:04.111][16][debug][router] [external/envoy/source/common/router/router.cc:520] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] cluster 'egress-cluster' match for URL '/products/1'
[2024-05-22 05:28:04.111][16][debug][upstream] [external/envoy/source/common/upstream/upstream_impl.cc:426] transport socket match, socket default selected for host with address 104.196.232.237:443
[2024-05-22 05:28:04.111][16][debug][upstream] [external/envoy/source/extensions/clusters/original_dst/original_dst_cluster.cc:81] Created host egress-cluster104.196.232.237:443 104.196.232.237:443.
[2024-05-22 05:28:04.111][16][trace][filter] [cilium/network_filter.cc:123] cilium.network: in upstream callback
[2024-05-22 05:28:04.111][16][trace][filter] [cilium/ipcache.cc:82] cilium.ipcache: Looking up key: 68c4e8ed, prefixlen: 32
[2024-05-22 05:28:04.118][16][debug][filter] [cilium/ipcache.cc:90] cilium.ipcache: 104.196.232.237 has ID 16777320
[2024-05-22 05:28:04.118][16][debug][router] [external/envoy/source/common/router/router.cc:740] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] router decoding headers:
':authority', 'dummyjson.com:443'
':path', '/products/1'
':method', 'GET'
':scheme', 'http'
'user-agent', 'curl/8.5.0'
'accept', '*/*'
'x-forwarded-proto', 'http'
'x-envoy-internal', 'true'
'x-request-id', '7515c927-a261-4f73-a9e2-b3d18f3f6490'
'x-envoy-expected-rq-timeout-ms', '3600000'

[2024-05-22 05:28:04.119][16][debug][pool] [external/envoy/source/common/http/conn_pool_base.cc:78] queueing stream due to no available connections (ready=0 busy=0 connecting=0)
[2024-05-22 05:28:04.119][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:291] trying to create new connection
[2024-05-22 05:28:04.119][16][trace][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:292] ConnPoolImplBase 0x6e37fdcf0e0, ready_clients_.size(): 0, busy_clients_.size(): 0, connecting_clients_.size(): 0, connecting_stream_capacity_: 0, num_active_streams_: 0, pending_streams_.size(): 1 per upstream preconnect ratio: 1
[2024-05-22 05:28:04.119][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:145] creating a new connection (connecting=0)
[2024-05-22 05:28:04.118][7][debug][upstream] [external/envoy/source/extensions/clusters/original_dst/original_dst_cluster.cc:231] addHost() adding egress-cluster104.196.232.237:443 104.196.232.237:443.
[2024-05-22 05:28:04.121][16][trace][filter] [./cilium/socket_option.h:146] Set socket (228) option SO_REUSEPORT
[2024-05-22 05:28:04.121][16][trace][filter] [./cilium/socket_option.h:159] Set socket (228) option SO_MARK to 85a90b04 (magic mark: b00, id: 34217, cluster: 4), src: 
[2024-05-22 05:28:04.121][16][debug][connection] [external/envoy/source/common/network/connection_impl.h:98] [Tags: "ConnectionId":"4016"] current connecting state: true
[2024-05-22 05:28:04.121][16][debug][client] [external/envoy/source/common/http/codec_client.cc:57] [Tags: "ConnectionId":"4016"] connecting
[2024-05-22 05:28:04.121][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:1009] [Tags: "ConnectionId":"4016"] connecting to 104.196.232.237:443
[2024-05-22 05:28:04.121][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:1028] [Tags: "ConnectionId":"4016"] connection in progress
[2024-05-22 05:28:04.121][16][trace][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:131] not creating a new connection, shouldCreateNewConnection returned false.
[2024-05-22 05:28:04.122][16][trace][http] [external/envoy/source/common/http/filter_manager.cc:572] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] decode headers called: filter=envoy.filters.http.upstream_codec status=4
[2024-05-22 05:28:04.122][16][trace][http] [external/envoy/source/common/http/filter_manager.cc:572] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] decode headers called: filter=envoy.filters.http.router status=1
[2024-05-22 05:28:04.122][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:698] [Tags: "ConnectionId":"4015"] parsed 90 bytes
[2024-05-22 05:28:04.125][7][debug][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:1474] membership update for TLS cluster egress-cluster added 1 removed 0
[2024-05-22 05:28:04.125][14][debug][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:1474] membership update for TLS cluster egress-cluster added 1 removed 0
[2024-05-22 05:28:04.126][14][debug][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:1480] re-creating local LB for TLS cluster egress-cluster
[2024-05-22 05:28:04.126][7][debug][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:1480] re-creating local LB for TLS cluster egress-cluster
[2024-05-22 05:28:04.125][16][debug][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:1474] membership update for TLS cluster egress-cluster added 1 removed 0
[2024-05-22 05:28:04.127][16][debug][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:1480] re-creating local LB for TLS cluster egress-cluster
[2024-05-22 05:28:04.258][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4016"] socket event: 2
[2024-05-22 05:28:04.258][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4016"] write ready
[2024-05-22 05:28:04.258][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:746] [Tags: "ConnectionId":"4016"] connected
[2024-05-22 05:28:04.258][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:469] [Tags: "ConnectionId":"4016"] raising connection event 2
[2024-05-22 05:28:04.258][16][debug][client] [external/envoy/source/common/http/codec_client.cc:88] [Tags: "ConnectionId":"4016"] connected
[2024-05-22 05:28:04.258][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:328] [Tags: "ConnectionId":"4016"] attaching to next stream
[2024-05-22 05:28:04.258][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:182] [Tags: "ConnectionId":"4016"] creating stream
[2024-05-22 05:28:04.258][16][debug][router] [external/envoy/source/common/router/upstream_request.cc:581] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] pool ready
[2024-05-22 05:28:04.258][16][trace][filter] [cilium/ipcache.cc:82] cilium.ipcache: Looking up key: 68c4e8ed, prefixlen: 32
[2024-05-22 05:28:04.258][16][debug][filter] [cilium/ipcache.cc:90] cilium.ipcache: 104.196.232.237 has ID 16777320
[2024-05-22 05:28:04.258][16][trace][config] [cilium/network_policy.cc:597] Cilium L7 PortNetworkPolicyRules(): ALLOWED
[2024-05-22 05:28:04.258][16][trace][config] [cilium/network_policy.cc:606] Cilium L7 PortNetworkPolicyRules(): returning true
[2024-05-22 05:28:04.258][16][debug][filter] [cilium/l7policy.cc:152] cilium.l7policy: egress (296361->16777320) policy lookup for endpoint 172.x.x.x for port 443: ALLOW
[2024-05-22 05:28:04.258][16][trace][router] [external/envoy/source/common/router/upstream_codec_filter.cc:70] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] proxying headers
[2024-05-22 05:28:04.258][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:529] [Tags: "ConnectionId":"4016"] writing 232 bytes, end_stream false
[2024-05-22 05:28:04.258][16][debug][client] [external/envoy/source/common/http/codec_client.cc:141] [Tags: "ConnectionId":"4016"] encode complete
[2024-05-22 05:28:04.258][16][trace][http] [external/envoy/source/common/http/filter_manager.cc:68] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] continuing filter chain: filter=0x6e37f5cc000
[2024-05-22 05:28:04.258][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4016"] write ready
[2024-05-22 05:28:04.258][16][trace][connection] [external/envoy/source/common/network/raw_buffer_socket.cc:70] [Tags: "ConnectionId":"4016"] write returns: 232
[2024-05-22 05:28:04.258][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4016"] socket event: 2
[2024-05-22 05:28:04.258][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4016"] write ready
[2024-05-22 05:28:04.395][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4016"] socket event: 3
[2024-05-22 05:28:04.395][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4016"] write ready
[2024-05-22 05:28:04.396][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:654] [Tags: "ConnectionId":"4016"] read ready. dispatch_buffered_data=0
[2024-05-22 05:28:04.396][16][trace][connection] [external/envoy/source/common/network/raw_buffer_socket.cc:25] [Tags: "ConnectionId":"4016"] read returns: 0
[2024-05-22 05:28:04.397][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:714] [Tags: "ConnectionId":"4016"] remote close
[2024-05-22 05:28:04.399][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:278] [Tags: "ConnectionId":"4016"] closing socket: 0
[2024-05-22 05:28:04.401][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:469] [Tags: "ConnectionId":"4016"] raising connection event 0
[2024-05-22 05:28:04.402][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:648] [Tags: "ConnectionId":"4016"] parsing 0 bytes
[2024-05-22 05:28:04.402][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:698] [Tags: "ConnectionId":"4016"] parsed 0 bytes
[2024-05-22 05:28:04.403][16][debug][client] [external/envoy/source/common/http/codec_client.cc:107] [Tags: "ConnectionId":"4016"] disconnect. resetting 1 pending requests
[2024-05-22 05:28:04.404][16][debug][client] [external/envoy/source/common/http/codec_client.cc:158] [Tags: "ConnectionId":"4016"] request reset
[2024-05-22 05:28:04.405][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=1)
[2024-05-22 05:28:04.406][16][debug][router] [external/envoy/source/common/router/router.cc:1332] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] upstream reset: reset reason: connection termination, transport failure reason: 
[2024-05-22 05:28:04.406][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=2)
[2024-05-22 05:28:04.407][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=3)
[2024-05-22 05:28:04.408][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:484] [Tags: "ConnectionId":"4016"] client disconnected, failure reason: 
[2024-05-22 05:28:04.408][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=4)
[2024-05-22 05:28:04.409][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=5)
[2024-05-22 05:28:04.409][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:454] invoking idle callbacks - is_draining_for_deletion_=false
[2024-05-22 05:28:04.410][16][trace][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:2030] Erasing idle pool for host egress-cluster104.196.232.237:443
[2024-05-22 05:28:04.411][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=6)
[2024-05-22 05:28:04.411][16][trace][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:2037] Pool container empty for host egress-cluster104.196.232.237:443, erasing host entry
[2024-05-22 05:28:04.411][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:124] clearing deferred deletion list (size=6)
[2024-05-22 05:28:04.411][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:215] [Tags: "ConnectionId":"4016"] destroying stream: 0 remaining
[2024-05-22 05:28:04.425][16][debug][router] [external/envoy/source/common/router/router.cc:1912] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] performing retry
[2024-05-22 05:28:04.425][16][trace][upstream] [external/envoy/source/extensions/clusters/original_dst/original_dst_cluster.cc:65] Using existing host egress-cluster104.196.232.237:443 104.196.232.237:443.
[2024-05-22 05:28:04.425][16][debug][pool] [external/envoy/source/common/http/conn_pool_base.cc:78] queueing stream due to no available connections (ready=0 busy=0 connecting=0)
[2024-05-22 05:28:04.426][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:291] trying to create new connection
[2024-05-22 05:28:04.426][16][trace][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:292] ConnPoolImplBase 0x6e37fdcf0e0, ready_clients_.size(): 0, busy_clients_.size(): 0, connecting_clients_.size(): 0, connecting_stream_capacity_: 0, num_active_streams_: 0, pending_streams_.size(): 1 per upstream preconnect ratio: 1
[2024-05-22 05:28:04.426][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:145] creating a new connection (connecting=0)
[2024-05-22 05:28:04.426][16][trace][filter] [./cilium/socket_option.h:146] Set socket (228) option SO_REUSEPORT
[2024-05-22 05:28:04.426][16][trace][filter] [./cilium/socket_option.h:159] Set socket (228) option SO_MARK to 85a90b04 (magic mark: b00, id: 34217, cluster: 4), src: 
[2024-05-22 05:28:04.427][16][debug][connection] [external/envoy/source/common/network/connection_impl.h:98] [Tags: "ConnectionId":"4017"] current connecting state: true
[2024-05-22 05:28:04.427][16][debug][client] [external/envoy/source/common/http/codec_client.cc:57] [Tags: "ConnectionId":"4017"] connecting
[2024-05-22 05:28:04.427][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:1009] [Tags: "ConnectionId":"4017"] connecting to 104.196.232.237:443
[2024-05-22 05:28:04.429][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:1028] [Tags: "ConnectionId":"4017"] connection in progress
[2024-05-22 05:28:04.429][16][trace][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:131] not creating a new connection, shouldCreateNewConnection returned false.
[2024-05-22 05:28:04.429][16][trace][http] [external/envoy/source/common/http/filter_manager.cc:572] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] decode headers called: filter=envoy.filters.http.upstream_codec status=4
[2024-05-22 05:28:04.565][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4017"] socket event: 2
[2024-05-22 05:28:04.565][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4017"] write ready
[2024-05-22 05:28:04.565][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:746] [Tags: "ConnectionId":"4017"] connected
[2024-05-22 05:28:04.565][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:469] [Tags: "ConnectionId":"4017"] raising connection event 2
[2024-05-22 05:28:04.565][16][debug][client] [external/envoy/source/common/http/codec_client.cc:88] [Tags: "ConnectionId":"4017"] connected
[2024-05-22 05:28:04.565][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:328] [Tags: "ConnectionId":"4017"] attaching to next stream
[2024-05-22 05:28:04.565][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:182] [Tags: "ConnectionId":"4017"] creating stream
[2024-05-22 05:28:04.565][16][debug][router] [external/envoy/source/common/router/upstream_request.cc:581] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] pool ready
[2024-05-22 05:28:04.565][16][trace][filter] [cilium/ipcache.cc:82] cilium.ipcache: Looking up key: 68c4e8ed, prefixlen: 32
[2024-05-22 05:28:04.567][16][debug][filter] [cilium/ipcache.cc:90] cilium.ipcache: 104.196.232.237 has ID 16777320
[2024-05-22 05:28:04.567][16][trace][config] [cilium/network_policy.cc:597] Cilium L7 PortNetworkPolicyRules(): ALLOWED
[2024-05-22 05:28:04.567][16][trace][config] [cilium/network_policy.cc:606] Cilium L7 PortNetworkPolicyRules(): returning true
[2024-05-22 05:28:04.567][16][debug][filter] [cilium/l7policy.cc:152] cilium.l7policy: egress (296361->16777320) policy lookup for endpoint 172.x.x.x for port 443: ALLOW
[2024-05-22 05:28:04.568][16][trace][router] [external/envoy/source/common/router/upstream_codec_filter.cc:70] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] proxying headers
[2024-05-22 05:28:04.569][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:529] [Tags: "ConnectionId":"4017"] writing 232 bytes, end_stream false
[2024-05-22 05:28:04.569][16][debug][client] [external/envoy/source/common/http/codec_client.cc:141] [Tags: "ConnectionId":"4017"] encode complete
[2024-05-22 05:28:04.569][16][trace][http] [external/envoy/source/common/http/filter_manager.cc:68] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] continuing filter chain: filter=0x6e37f5cc000
[2024-05-22 05:28:04.569][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4017"] write ready
[2024-05-22 05:28:04.570][16][trace][connection] [external/envoy/source/common/network/raw_buffer_socket.cc:70] [Tags: "ConnectionId":"4017"] write returns: 232
[2024-05-22 05:28:04.570][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4017"] socket event: 2
[2024-05-22 05:28:04.570][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4017"] write ready
[2024-05-22 05:28:04.706][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4017"] socket event: 3
[2024-05-22 05:28:04.707][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4017"] write ready
[2024-05-22 05:28:04.707][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:654] [Tags: "ConnectionId":"4017"] read ready. dispatch_buffered_data=0
[2024-05-22 05:28:04.707][16][trace][connection] [external/envoy/source/common/network/raw_buffer_socket.cc:25] [Tags: "ConnectionId":"4017"] read returns: 0
[2024-05-22 05:28:04.708][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:714] [Tags: "ConnectionId":"4017"] remote close
[2024-05-22 05:28:04.711][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:278] [Tags: "ConnectionId":"4017"] closing socket: 0
[2024-05-22 05:28:04.711][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:469] [Tags: "ConnectionId":"4017"] raising connection event 0
[2024-05-22 05:28:04.712][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:648] [Tags: "ConnectionId":"4017"] parsing 0 bytes
[2024-05-22 05:28:04.713][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:698] [Tags: "ConnectionId":"4017"] parsed 0 bytes
[2024-05-22 05:28:04.714][16][debug][client] [external/envoy/source/common/http/codec_client.cc:107] [Tags: "ConnectionId":"4017"] disconnect. resetting 1 pending requests
[2024-05-22 05:28:04.715][16][debug][client] [external/envoy/source/common/http/codec_client.cc:158] [Tags: "ConnectionId":"4017"] request reset
[2024-05-22 05:28:04.715][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=1)
[2024-05-22 05:28:04.722][16][debug][router] [external/envoy/source/common/router/router.cc:1332] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] upstream reset: reset reason: connection termination, transport failure reason: 
[2024-05-22 05:28:04.722][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=2)
[2024-05-22 05:28:04.723][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=3)
[2024-05-22 05:28:04.723][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:484] [Tags: "ConnectionId":"4017"] client disconnected, failure reason: 
[2024-05-22 05:28:04.723][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=4)
[2024-05-22 05:28:04.723][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=5)
[2024-05-22 05:28:04.723][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:454] invoking idle callbacks - is_draining_for_deletion_=false
[2024-05-22 05:28:04.723][16][trace][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:2030] Erasing idle pool for host egress-cluster104.196.232.237:443
[2024-05-22 05:28:04.723][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=6)
[2024-05-22 05:28:04.723][16][trace][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:2037] Pool container empty for host egress-cluster104.196.232.237:443, erasing host entry
[2024-05-22 05:28:04.723][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:124] clearing deferred deletion list (size=6)
[2024-05-22 05:28:04.723][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:215] [Tags: "ConnectionId":"4017"] destroying stream: 0 remaining
[2024-05-22 05:28:04.743][16][debug][router] [external/envoy/source/common/router/router.cc:1912] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] performing retry
[2024-05-22 05:28:04.743][16][trace][upstream] [external/envoy/source/extensions/clusters/original_dst/original_dst_cluster.cc:65] Using existing host egress-cluster104.196.232.237:443 104.196.232.237:443.
[2024-05-22 05:28:04.743][16][debug][pool] [external/envoy/source/common/http/conn_pool_base.cc:78] queueing stream due to no available connections (ready=0 busy=0 connecting=0)
[2024-05-22 05:28:04.743][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:291] trying to create new connection
[2024-05-22 05:28:04.743][16][trace][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:292] ConnPoolImplBase 0x6e37fdcf0e0, ready_clients_.size(): 0, busy_clients_.size(): 0, connecting_clients_.size(): 0, connecting_stream_capacity_: 0, num_active_streams_: 0, pending_streams_.size(): 1 per upstream preconnect ratio: 1
[2024-05-22 05:28:04.743][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:145] creating a new connection (connecting=0)
[2024-05-22 05:28:04.743][16][trace][filter] [./cilium/socket_option.h:146] Set socket (228) option SO_REUSEPORT
[2024-05-22 05:28:04.743][16][trace][filter] [./cilium/socket_option.h:159] Set socket (228) option SO_MARK to 85a90b04 (magic mark: b00, id: 34217, cluster: 4), src: 
[2024-05-22 05:28:04.743][16][debug][connection] [external/envoy/source/common/network/connection_impl.h:98] [Tags: "ConnectionId":"4018"] current connecting state: true
[2024-05-22 05:28:04.743][16][debug][client] [external/envoy/source/common/http/codec_client.cc:57] [Tags: "ConnectionId":"4018"] connecting
[2024-05-22 05:28:04.743][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:1009] [Tags: "ConnectionId":"4018"] connecting to 104.196.232.237:443
[2024-05-22 05:28:04.743][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:1028] [Tags: "ConnectionId":"4018"] connection in progress
[2024-05-22 05:28:04.744][16][trace][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:131] not creating a new connection, shouldCreateNewConnection returned false.
[2024-05-22 05:28:04.744][16][trace][http] [external/envoy/source/common/http/filter_manager.cc:572] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] decode headers called: filter=envoy.filters.http.upstream_codec status=4
[2024-05-22 05:28:04.881][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4018"] socket event: 2
[2024-05-22 05:28:04.881][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4018"] write ready
[2024-05-22 05:28:04.881][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:746] [Tags: "ConnectionId":"4018"] connected
[2024-05-22 05:28:04.881][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:469] [Tags: "ConnectionId":"4018"] raising connection event 2
[2024-05-22 05:28:04.881][16][debug][client] [external/envoy/source/common/http/codec_client.cc:88] [Tags: "ConnectionId":"4018"] connected
[2024-05-22 05:28:04.881][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:328] [Tags: "ConnectionId":"4018"] attaching to next stream
[2024-05-22 05:28:04.881][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:182] [Tags: "ConnectionId":"4018"] creating stream
[2024-05-22 05:28:04.881][16][debug][router] [external/envoy/source/common/router/upstream_request.cc:581] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] pool ready
[2024-05-22 05:28:04.881][16][trace][filter] [cilium/ipcache.cc:82] cilium.ipcache: Looking up key: 68c4e8ed, prefixlen: 32
[2024-05-22 05:28:04.882][16][debug][filter] [cilium/ipcache.cc:90] cilium.ipcache: 104.196.232.237 has ID 16777320
[2024-05-22 05:28:04.882][16][trace][config] [cilium/network_policy.cc:597] Cilium L7 PortNetworkPolicyRules(): ALLOWED
[2024-05-22 05:28:04.883][16][trace][config] [cilium/network_policy.cc:606] Cilium L7 PortNetworkPolicyRules(): returning true
[2024-05-22 05:28:04.883][16][debug][filter] [cilium/l7policy.cc:152] cilium.l7policy: egress (296361->16777320) policy lookup for endpoint 172.x.x.x for port 443: ALLOW
[2024-05-22 05:28:04.885][16][trace][router] [external/envoy/source/common/router/upstream_codec_filter.cc:70] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] proxying headers
[2024-05-22 05:28:04.885][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:529] [Tags: "ConnectionId":"4018"] writing 232 bytes, end_stream false
[2024-05-22 05:28:04.885][16][debug][client] [external/envoy/source/common/http/codec_client.cc:141] [Tags: "ConnectionId":"4018"] encode complete
[2024-05-22 05:28:04.886][16][trace][http] [external/envoy/source/common/http/filter_manager.cc:68] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] continuing filter chain: filter=0x6e37f5cc000
[2024-05-22 05:28:04.886][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4018"] write ready
[2024-05-22 05:28:04.886][16][trace][connection] [external/envoy/source/common/network/raw_buffer_socket.cc:70] [Tags: "ConnectionId":"4018"] write returns: 232
[2024-05-22 05:28:04.886][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4018"] socket event: 2
[2024-05-22 05:28:04.886][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4018"] write ready
[2024-05-22 05:28:05.023][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4018"] socket event: 3
[2024-05-22 05:28:05.023][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4018"] write ready
[2024-05-22 05:28:05.023][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:654] [Tags: "ConnectionId":"4018"] read ready. dispatch_buffered_data=0
[2024-05-22 05:28:05.023][16][trace][connection] [external/envoy/source/common/network/raw_buffer_socket.cc:25] [Tags: "ConnectionId":"4018"] read returns: 0
[2024-05-22 05:28:05.023][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:714] [Tags: "ConnectionId":"4018"] remote close
[2024-05-22 05:28:05.023][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:278] [Tags: "ConnectionId":"4018"] closing socket: 0
[2024-05-22 05:28:05.023][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:469] [Tags: "ConnectionId":"4018"] raising connection event 0
[2024-05-22 05:28:05.023][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:648] [Tags: "ConnectionId":"4018"] parsing 0 bytes
[2024-05-22 05:28:05.023][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:698] [Tags: "ConnectionId":"4018"] parsed 0 bytes
[2024-05-22 05:28:05.023][16][debug][client] [external/envoy/source/common/http/codec_client.cc:107] [Tags: "ConnectionId":"4018"] disconnect. resetting 1 pending requests
[2024-05-22 05:28:05.024][16][debug][client] [external/envoy/source/common/http/codec_client.cc:158] [Tags: "ConnectionId":"4018"] request reset
[2024-05-22 05:28:05.024][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=1)
[2024-05-22 05:28:05.024][16][debug][router] [external/envoy/source/common/router/router.cc:1332] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] upstream reset: reset reason: connection termination, transport failure reason: 
[2024-05-22 05:28:05.024][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=2)
[2024-05-22 05:28:05.024][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=3)
[2024-05-22 05:28:05.024][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:484] [Tags: "ConnectionId":"4018"] client disconnected, failure reason: 
[2024-05-22 05:28:05.024][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=4)
[2024-05-22 05:28:05.024][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=5)
[2024-05-22 05:28:05.024][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:454] invoking idle callbacks - is_draining_for_deletion_=false
[2024-05-22 05:28:05.024][16][trace][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:2030] Erasing idle pool for host egress-cluster104.196.232.237:443
[2024-05-22 05:28:05.024][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=6)
[2024-05-22 05:28:05.024][16][trace][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:2037] Pool container empty for host egress-cluster104.196.232.237:443, erasing host entry
[2024-05-22 05:28:05.024][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:124] clearing deferred deletion list (size=6)
[2024-05-22 05:28:05.024][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:215] [Tags: "ConnectionId":"4018"] destroying stream: 0 remaining
[2024-05-22 05:28:05.061][16][debug][router] [external/envoy/source/common/router/router.cc:1912] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] performing retry
[2024-05-22 05:28:05.061][16][trace][upstream] [external/envoy/source/extensions/clusters/original_dst/original_dst_cluster.cc:65] Using existing host egress-cluster104.196.232.237:443 104.196.232.237:443.
[2024-05-22 05:28:05.063][16][debug][pool] [external/envoy/source/common/http/conn_pool_base.cc:78] queueing stream due to no available connections (ready=0 busy=0 connecting=0)
[2024-05-22 05:28:05.063][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:291] trying to create new connection
[2024-05-22 05:28:05.063][16][trace][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:292] ConnPoolImplBase 0x6e37fdcf0e0, ready_clients_.size(): 0, busy_clients_.size(): 0, connecting_clients_.size(): 0, connecting_stream_capacity_: 0, num_active_streams_: 0, pending_streams_.size(): 1 per upstream preconnect ratio: 1
[2024-05-22 05:28:05.063][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:145] creating a new connection (connecting=0)
[2024-05-22 05:28:05.064][16][trace][filter] [./cilium/socket_option.h:146] Set socket (228) option SO_REUSEPORT
[2024-05-22 05:28:05.065][16][trace][filter] [./cilium/socket_option.h:159] Set socket (228) option SO_MARK to 85a90b04 (magic mark: b00, id: 34217, cluster: 4), src: 
[2024-05-22 05:28:05.065][16][debug][connection] [external/envoy/source/common/network/connection_impl.h:98] [Tags: "ConnectionId":"4019"] current connecting state: true
[2024-05-22 05:28:05.065][16][debug][client] [external/envoy/source/common/http/codec_client.cc:57] [Tags: "ConnectionId":"4019"] connecting
[2024-05-22 05:28:05.065][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:1009] [Tags: "ConnectionId":"4019"] connecting to 104.196.232.237:443
[2024-05-22 05:28:05.065][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:1028] [Tags: "ConnectionId":"4019"] connection in progress
[2024-05-22 05:28:05.065][16][trace][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:131] not creating a new connection, shouldCreateNewConnection returned false.
[2024-05-22 05:28:05.065][16][trace][http] [external/envoy/source/common/http/filter_manager.cc:572] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] decode headers called: filter=envoy.filters.http.upstream_codec status=4
[2024-05-22 05:28:05.203][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4019"] socket event: 2
[2024-05-22 05:28:05.204][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4019"] write ready
[2024-05-22 05:28:05.205][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:746] [Tags: "ConnectionId":"4019"] connected
[2024-05-22 05:28:05.206][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:469] [Tags: "ConnectionId":"4019"] raising connection event 2
[2024-05-22 05:28:05.206][16][debug][client] [external/envoy/source/common/http/codec_client.cc:88] [Tags: "ConnectionId":"4019"] connected
[2024-05-22 05:28:05.207][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:328] [Tags: "ConnectionId":"4019"] attaching to next stream
[2024-05-22 05:28:05.208][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:182] [Tags: "ConnectionId":"4019"] creating stream
[2024-05-22 05:28:05.209][16][debug][router] [external/envoy/source/common/router/upstream_request.cc:581] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] pool ready
[2024-05-22 05:28:05.209][16][trace][filter] [cilium/ipcache.cc:82] cilium.ipcache: Looking up key: 68c4e8ed, prefixlen: 32
[2024-05-22 05:28:05.210][16][debug][filter] [cilium/ipcache.cc:90] cilium.ipcache: 104.196.232.237 has ID 16777320
[2024-05-22 05:28:05.211][16][trace][config] [cilium/network_policy.cc:597] Cilium L7 PortNetworkPolicyRules(): ALLOWED
[2024-05-22 05:28:05.212][16][trace][config] [cilium/network_policy.cc:606] Cilium L7 PortNetworkPolicyRules(): returning true
[2024-05-22 05:28:05.212][16][debug][filter] [cilium/l7policy.cc:152] cilium.l7policy: egress (296361->16777320) policy lookup for endpoint 172.x.x.x for port 443: ALLOW
[2024-05-22 05:28:05.214][16][trace][router] [external/envoy/source/common/router/upstream_codec_filter.cc:70] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] proxying headers
[2024-05-22 05:28:05.215][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:529] [Tags: "ConnectionId":"4019"] writing 232 bytes, end_stream false
[2024-05-22 05:28:05.216][16][debug][client] [external/envoy/source/common/http/codec_client.cc:141] [Tags: "ConnectionId":"4019"] encode complete
[2024-05-22 05:28:05.216][16][trace][http] [external/envoy/source/common/http/filter_manager.cc:68] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] continuing filter chain: filter=0x6e37f5cc000
[2024-05-22 05:28:05.216][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4019"] write ready
[2024-05-22 05:28:05.217][16][trace][connection] [external/envoy/source/common/network/raw_buffer_socket.cc:70] [Tags: "ConnectionId":"4019"] write returns: 232
[2024-05-22 05:28:05.217][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4019"] socket event: 2
[2024-05-22 05:28:05.217][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4019"] write ready
[2024-05-22 05:28:05.354][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4019"] socket event: 3
[2024-05-22 05:28:05.354][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4019"] write ready
[2024-05-22 05:28:05.354][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:654] [Tags: "ConnectionId":"4019"] read ready. dispatch_buffered_data=0
[2024-05-22 05:28:05.354][16][trace][connection] [external/envoy/source/common/network/raw_buffer_socket.cc:25] [Tags: "ConnectionId":"4019"] read returns: 0
[2024-05-22 05:28:05.354][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:714] [Tags: "ConnectionId":"4019"] remote close
[2024-05-22 05:28:05.354][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:278] [Tags: "ConnectionId":"4019"] closing socket: 0
[2024-05-22 05:28:05.354][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:469] [Tags: "ConnectionId":"4019"] raising connection event 0
[2024-05-22 05:28:05.354][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:648] [Tags: "ConnectionId":"4019"] parsing 0 bytes
[2024-05-22 05:28:05.354][16][trace][http] [external/envoy/source/common/http/http1/codec_impl.cc:698] [Tags: "ConnectionId":"4019"] parsed 0 bytes
[2024-05-22 05:28:05.354][16][debug][client] [external/envoy/source/common/http/codec_client.cc:107] [Tags: "ConnectionId":"4019"] disconnect. resetting 1 pending requests
[2024-05-22 05:28:05.354][16][debug][client] [external/envoy/source/common/http/codec_client.cc:158] [Tags: "ConnectionId":"4019"] request reset
[2024-05-22 05:28:05.354][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=1)
[2024-05-22 05:28:05.354][16][debug][router] [external/envoy/source/common/router/router.cc:1332] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] upstream reset: reset reason: connection termination, transport failure reason: 
[2024-05-22 05:28:05.354][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=2)
[2024-05-22 05:28:05.354][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=3)
[2024-05-22 05:28:05.354][16][debug][http] [external/envoy/source/common/http/filter_manager.cc:1035] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] Sending local reply with details upstream_reset_before_response_started{connection_termination}
[2024-05-22 05:28:05.354][16][trace][http] [external/envoy/source/common/http/filter_manager.cc:1208] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] encode headers called: filter=cilium.l7policy status=0
[2024-05-22 05:28:05.354][16][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:1863] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] encoding headers via codec (end_stream=false):
':status', '503'
'content-length', '118'
'content-type', 'text/plain'
'date', 'Wed, 22 May 2024 05:28:04 GMT'
'server', 'envoy'

[2024-05-22 05:28:05.354][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:529] [Tags: "ConnectionId":"4015"] writing 135 bytes, end_stream false
[2024-05-22 05:28:05.354][16][trace][http] [external/envoy/source/common/http/filter_manager.cc:1393] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] encode data called: filter=cilium.l7policy status=0
[2024-05-22 05:28:05.355][16][trace][http] [external/envoy/source/common/http/conn_manager_impl.cc:1886] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] encoding data via codec (size=118 end_stream=true)
[2024-05-22 05:28:05.355][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:529] [Tags: "ConnectionId":"4015"] writing 118 bytes, end_stream false
[2024-05-22 05:28:05.355][16][debug][http] [external/envoy/source/common/http/conn_manager_impl.cc:1968] [Tags: "ConnectionId":"4015","StreamId":"13691756278743497070"] Codec completed encoding stream.
[2024-05-22 05:28:05.355][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=4)
[2024-05-22 05:28:05.355][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=5)
[2024-05-22 05:28:05.355][16][trace][misc] [external/envoy/source/common/event/scaled_range_timer_manager_impl.cc:60] enableTimer called on 0x6e37f4968c0 for 3600000ms, min is 3600000ms
[2024-05-22 05:28:05.355][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:484] [Tags: "ConnectionId":"4019"] client disconnected, failure reason: 
[2024-05-22 05:28:05.355][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=6)
[2024-05-22 05:28:05.355][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=7)
[2024-05-22 05:28:05.355][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:454] invoking idle callbacks - is_draining_for_deletion_=false
[2024-05-22 05:28:05.355][16][trace][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:2030] Erasing idle pool for host egress-cluster104.196.232.237:443
[2024-05-22 05:28:05.355][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=8)
[2024-05-22 05:28:05.355][16][trace][upstream] [external/envoy/source/common/upstream/cluster_manager_impl.cc:2037] Pool container empty for host egress-cluster104.196.232.237:443, erasing host entry
[2024-05-22 05:28:05.355][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:124] clearing deferred deletion list (size=8)
[2024-05-22 05:28:05.355][16][debug][pool] [external/envoy/source/common/conn_pool/conn_pool_base.cc:215] [Tags: "ConnectionId":"4019"] destroying stream: 0 remaining
[2024-05-22 05:28:05.355][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4015"] socket event: 2
[2024-05-22 05:28:05.355][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4015"] write ready
[2024-05-22 05:28:05.355][16][trace][connection] [external/envoy/source/common/network/raw_buffer_socket.cc:70] [Tags: "ConnectionId":"4015"] write returns: 253
[2024-05-22 05:28:05.359][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:614] [Tags: "ConnectionId":"4015"] socket event: 3
[2024-05-22 05:28:05.359][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:737] [Tags: "ConnectionId":"4015"] write ready
[2024-05-22 05:28:05.359][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:654] [Tags: "ConnectionId":"4015"] read ready. dispatch_buffered_data=0
[2024-05-22 05:28:05.359][16][trace][connection] [external/envoy/source/common/network/raw_buffer_socket.cc:25] [Tags: "ConnectionId":"4015"] read returns: 0
[2024-05-22 05:28:05.360][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:714] [Tags: "ConnectionId":"4015"] remote close
[2024-05-22 05:28:05.360][16][debug][connection] [external/envoy/source/common/network/connection_impl.cc:278] [Tags: "ConnectionId":"4015"] closing socket: 0
[2024-05-22 05:28:05.360][16][trace][connection] [external/envoy/source/common/network/connection_impl.cc:469] [Tags: "ConnectionId":"4015"] raising connection event 0
[2024-05-22 05:28:05.360][16][trace][conn_handler] [external/envoy/source/extensions/listener_managers/listener_manager/active_stream_listener_base.cc:125] [Tags: "ConnectionId":"4015"] tcp connection on event 0
[2024-05-22 05:28:05.360][16][debug][conn_handler] [external/envoy/source/extensions/listener_managers/listener_manager/active_stream_listener_base.cc:135] [Tags: "ConnectionId":"4015"] adding to cleanup list
[2024-05-22 05:28:05.360][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=1)
[2024-05-22 05:28:05.360][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:242] item added to deferred deletion list (size=2)
[2024-05-22 05:28:05.360][16][trace][main] [external/envoy/source/common/event/dispatcher_impl.cc:124] clearing deferred deletion list (size=2)
[2024-05-22 05:28:05.759][7][trace][upstream] [external/envoy/source/extensions/clusters/original_dst/original_dst_cluster.cc:250] Cleaning up stale original dst hosts.
[2024-05-22 05:28:05.759][7][trace][upstream] [external/envoy/source/extensions/clusters/original_dst/original_dst_cluster.cc:289] Keeping active address 104.196.232.237:443.
[2024-05-22 05:28:06.626][7][debug][main] [external/envoy/source/server/server.cc:263] flushing stats

The multiple DNS lookups happen because ndots defaults to 5; that shouldn't be a problem, as the same lookups also occur without the http rule. Moreover, we do not have IPv6 enabled, and in both cases the AAAA lookup is performed but returns an empty answer.
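
For completeness, the search-domain expansion itself could be suppressed with a standard Kubernetes dnsConfig override on the pod; a minimal sketch (plain Kubernetes fields, unrelated to the TLS failure):

# Hypothetical pod-spec fragment: resolve dummyjson.com as an absolute
# name first instead of expanding it through the search domains.
spec:
  dnsConfig:
    options:
    - name: ndots
      value: "1"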

Edit:

Notably, Hubble records an http-request FORWARDED event for the http://<addr>:443 call, which does not appear when using plain https.

I agree this looks like a TLS misconfiguration somewhere in the Envoy proxy, but that configuration is exactly what the Cilium agent is supposed to derive from my CiliumNetworkPolicy definition.

The complete CiliumNetworkPolicy:
apiVersion: "cilium.io/v2"
kind: CiliumNetworkPolicy
metadata:
  name: "client"
  namespace: cilium-poc
spec:
  endpointSelector:
    matchLabels:
      "k8s:app": client
          
  egress:
  - toPorts:
    - ports:
      - port: "53"
        protocol: ANY
      rules:
        dns:
        - matchPattern: "*"

  - toFQDNs:
      - matchName: "dummyjson.com"
    toPorts:
    - ports:
      - port: '443'
        protocol: TCP   
      rules:
        http:
        - {}
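
As a sanity check that this manifest is the one the agent acts on, and to watch the L7 verdicts while reproducing the curl call, something like the following can be run (the file name is hypothetical; the hubble flags may vary slightly by CLI version):

$ kubectl -n cilium-poc apply -f client-policy.yaml
$ kubectl -n cilium-poc get cnp client -o yaml
$ hubble observe --namespace cilium-poc --protocol http --follow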

@sjoukedv
Author

sjoukedv commented May 22, 2024

I also see a lot of these:

$ k logs cilium-envoy-8cbsd -n kube-system -f | grep -i 'cilium.tls'
[2024-05-22 05:24:27.535][14][debug][misc] [cilium/tls_wrapper.cc:86] cilium.tls_wrapper: Could not get server TLS context for port 443, defaulting to raw socket
[2024-05-22 05:24:30.602][14][debug][misc] [cilium/tls_wrapper.cc:86] cilium.tls_wrapper: Could not get server TLS context for port 443, defaulting to raw socket
[2024-05-22 05:24:33.688][14][debug][misc] [cilium/tls_wrapper.cc:86] cilium.tls_wrapper: Could not get server TLS context for port 443, defaulting to raw socket
[2024-05-22 05:24:35.845][16][debug][misc] [cilium/tls_wrapper.cc:86] cilium.tls_wrapper: Could not get server TLS context for port 443, defaulting to raw socket
[2024-05-22 05:24:36.758][14][debug][misc] [cilium/tls_wrapper.cc:86] cilium.tls_wrapper: Could not get server TLS context for port 443, defaulting to raw socket

These messages come from https://github.com/cilium/proxy/blob/main/cilium/tls_wrapper.cc#L66-69
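
If I read that code correctly, the message means Envoy found no terminating (server-side) TLS context for port 443 and falls back to a raw socket, which would explain why the TLS handshake never completes. To rule out a missing or misplaced secret, a quick check (names below are placeholders for whatever the policy references):

$ kubectl -n cilium-secrets get secrets
$ kubectl -n cilium-secrets create secret tls dummyjson-tls-data \
    --cert=internal-dummyjson.crt --key=internal-dummyjson.key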

@lmb
Contributor

lmb commented May 22, 2024

What does your terminatingTLS block look like? https://docs.cilium.io/en/stable/security/tls-visibility/#apply-dns-and-tls-aware-egress-policy

@sjoukedv
Author

sjoukedv commented May 22, 2024

@lmb it's empty currently. I tried setting it earlier, but perhaps Cilium did not clean up that configuration from Envoy? I have already tried re-adding the policy and restarting every component afterwards. Are there any more details I can provide?

I have two use cases, both working in our lab, that I need to validate on our cluster:

  1. Egress L7 filtering to an external HTTPS endpoint
  2. Adding our own CA (that's what I tried with originatingTLS)

These are the corresponding Envoy logs for this call:

/ # curl -v --max-time 30 https://dummyjson.com/products/1
* Host dummyjson.com:443 was resolved.
* IPv6: (none)
* IPv4: 104.196.232.237, 104.196.232.237
*   Trying 104.196.232.237:443...
* Connected to dummyjson.com (104.196.232.237) port 443
* ALPN: curl offers h2,http/1.1
* TLSv1.3 (OUT), TLS handshake, Client hello (1):
*  CAfile: /etc/ssl/certs/ca-certificates.crt
*  CApath: /etc/ssl/certs
* OpenSSL/3.1.4: error:0A00010B:SSL routines::wrong version number
* Closing connection
curl: (35) OpenSSL/3.1.4: error:0A00010B:SSL routines::wrong version number
[2024-05-22 09:41:56.311][14][debug][misc] [cilium/bpf_metadata.cc:297] EGRESS POD IP: 172.x.x.x, destination IP: 104.196.232.237
[2024-05-22 09:41:56.311][14][debug][filter] [cilium/conntrack.cc:178] cilium.bpf_metadata: Using conntrack map global
[2024-05-22 09:41:56.311][14][trace][filter] [cilium/conntrack.cc:197] cilium.bpf_metadata: Looking up key: 68c4e8ed, ac1f711c, 1bb, b23c, 6, 0
[2024-05-22 09:41:56.312][14][trace][filter] [cilium/ipcache.cc:82] cilium.ipcache: Looking up key: 68c4e8ed, prefixlen: 32
[2024-05-22 09:41:56.312][14][debug][filter] [cilium/ipcache.cc:90] cilium.ipcache: 104.196.232.237 has ID 16777218
[2024-05-22 09:41:56.313][14][debug][filter] [./cilium/socket_option.h:239] Cilium SocketOption(): source_identity: 291852, ingress: false, port: 443, pod_ip: 172.x.x.x, source_addresses: //, mark: 740c0b04 (magic mark: b00, cluster: 4, ID: 29708), proxy_id: 0
[2024-05-22 09:41:56.313][14][trace][filter] [cilium/ipcache.cc:82] cilium.ipcache: Looking up key: 68c4e8ed, prefixlen: 32
[2024-05-22 09:41:56.314][14][debug][filter] [cilium/ipcache.cc:90] cilium.ipcache: 104.196.232.237 has ID 16777218
[2024-05-22 09:41:56.314][14][debug][misc] [cilium/tls_wrapper.cc:86] cilium.tls_wrapper: Could not get server TLS context for port 443, defaulting to raw socket
[2024-05-22 09:41:56.315][14][debug][filter] [cilium/network_filter.cc:76] cilium.network: onNewConnection
[2024-05-22 09:41:56.315][14][trace][filter] [cilium/network_filter.cc:99] [Tags: "ConnectionId":"63"] cilium.network: SNI: dummyjson.com
[2024-05-22 09:41:56.322][14][trace][filter] [cilium/network_filter.cc:196] cilium.network: onData 517 bytes, end_stream: false
[2024-05-22 09:41:56.323][14][debug][router] [cilium/uds_client.cc:56] Cilium access log resetting socket due to error: Connection reset by peer
[2024-05-22 09:41:56.324][14][trace][http] [external/envoy/source/common/http/filter_manager.cc:1208] [Tags: "ConnectionId":"63","StreamId":"7639742152903267136"] encode headers called: filter=cilium.l7policy status=0
[2024-05-22 09:41:56.326][14][trace][http] [external/envoy/source/common/http/filter_manager.cc:1393] [Tags: "ConnectionId":"63","StreamId":"7639742152903267136"] encode data called: filter=cilium.l7policy status=0

From the http://<addr>:443 call I at least get the server: envoy header back:

/ # curl -v --max-time 30 http://dummyjson.com:443/products/1
* Host dummyjson.com:443 was resolved.
* IPv6: (none)
* IPv4: 104.196.232.237, 104.196.232.237
*   Trying 104.196.232.237:443...
* Connected to dummyjson.com (104.196.232.237) port 443
> GET /products/1 HTTP/1.1
> Host: dummyjson.com:443
> User-Agent: curl/8.5.0
> Accept: */*
> 
< HTTP/1.1 503 Service Unavailable
< content-length: 118
< content-type: text/plain
< date: Wed, 22 May 2024 09:43:19 GMT
< server: envoy
< 
* Connection #0 to host dummyjson.com left intact
upstream connect error or disconnect/reset before headers. retried and the latest reset reason: connection termination/

@github-actions github-actions bot removed the need-more-info More information is required to further debug or fix the issue. label May 22, 2024
@lmb
Contributor

lmb commented May 22, 2024

The problem is that Envoy can't match your incoming request with a server-side TLS certificate and key to use (this is the terminatingTLS part, I believe). originatingTLS is for outbound connections from Envoy to the rest of the world, and I would suggest that you just use ca-certificates as outlined in the doc. My hunch is that you need both stanzas in the manifest.

That said, this seems to me like a configuration / troubleshooting issue and not a bug with Cilium. The issue tracker is not the right venue for this. Either try the Slack or consider an enterprise vendor: https://cilium.io/enterprise/

@lmb closed this as not planned (won't fix, can't repro, duplicate, stale) May 22, 2024
@sjoukedv
Author

@lmb thank you very much for your time and assistance with the issue.

The good thing is that I have managed to get the https://artii.herokuapp.com/ example to work. I want to point out that it has a valid certificate from a trusted CA (Amazon), not a self-signed one. Therefore, adding the CA to the OS bundle and then copying the OS bundle into a secret so we can reference it in the policy should not be necessary in theory (?). It should also work without adding the CA that is created in the example, since that CA is not actually used to validate the outgoing TLS connection.

It would be really great not to have to add the terminatingTLS field, since it means you need to generate a certificate inside the cluster for every hostname you are trying to call (or use wildcard self-signed certificates). The feature we are looking for, which is present in Istio, is that it abstracts away the need to add CAs to the OS bundle. The reason this is not preferable is that we cannot trust the CAs bundled with the OS, as quoted from the "tls visibility example":

> Also, individual applications may leverage their own certificate stores rather than use the OS certificate store

Moreover, when neither terminatingTLS nor originatingTLS is specified, but the rules.http section is present in the policy (forcing traffic through the Envoy proxy), I would expect it to fall back to TLS passthrough, which appears not to be the case?

Once again thank you for your time and effort.

@networkop
Contributor

@sjoukedv have you tried using serverNames?

@sjoukedv
Author

> @sjoukedv have you tried using serverNames?

Thank you for the suggestion. I gave this a try, but I reckon this also sends the traffic through the Envoy proxy. For L4 (TCP on 443) I am able to connect just fine, as the packets are "to-stack". After adding serverNames with the same value as the FQDN, the TCP traffic still works. When trying to apply the serverNames option in combination with the rules.http setting I get:

error="Invalid CiliumNetworkPolicy spec: ServerNames are not allowed with L7 rules without TLS termination"

I have managed to get this to work only by enabling TLS termination as well as specifying the CA for the origin:

egress:
  - toFQDNs:
      - matchName: my-service.tld
    toPorts:
      - ports:
          - port: '443'
            protocol: TCP

        terminatingTLS:
          secret:
            namespace: "kube-system"
            name: "my-service-tls" # key and cert matching the hostname(s) of the FQDN

        originatingTLS:
          secret:
            namespace: "cilium-secrets"
            name: "my-service-ca" # CA not for the termination above, but for connecting to the 'origin'

        serverNames: # optional, also works without
          - "my-service.tld"

        rules:
          http:
            - {}
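
For completeness, the terminating secret referenced above is a standard TLS secret whose certificate must match the FQDN being called. A sketch (names and base64 data are placeholders):

apiVersion: v1
kind: Secret
metadata:
  name: my-service-tls
  namespace: kube-system
type: kubernetes.io/tls
data:
  tls.crt: LS0tLS1CRUdJTi... # base64-encoded certificate issued for my-service.tld
  tls.key: LS0tLS1CRUdJTi... # base64-encoded private key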

I guess the serverNames setting is used to protect against "attacks" when you're exposing multiple services behind a single IP address:

curl -v -H "Host: other-service.tld" https://my-service.tld

But... this is where I notice that the above request also succeeds, even though other-service.tld is not in the list of allowed egress FQDNs.

@networkop
Contributor

I don't think you need originatingTLS or http rules for this to work. This field should just give you the ability to whitelist SNIs.

@networkop
Contributor

Have a look at the example here: https://isovalent.com/blog/post/zero-trust-security-with-cilium/#h-tls-sni

@sjoukedv
Author

@networkop thank you for getting back to me. I have seen the example you shared and that is also what I tried:

egress:
  - toFQDNs:
      - matchName: my-service.tld
    toPorts:
      - ports:
          - port: '443'
            protocol: TCP

        serverNames:
          - "my-service.tld"

which works equally well with or without the serverNames section in terms of reaching the 'origin'. In both cases my 'origin' has a self-signed certificate, so I either need to add the CA to the trust chain on the side I am calling from, or add the originatingTLS / terminatingTLS sections to the policy. And as the error message suggests, if I want to filter this traffic on layer 7 they are even required.

I would expect that, when specifying serverNames, I would not be able to access other-service.tld since it is not in the array, but I still can. Are you saying that is not the expected behaviour?

@networkop
Contributor

oh, if you need to actually do L7 (HTTP) policies, then yes, you'd need originating/terminatingTLS.

serverNames is only for whitelisting SNIs; it does send the traffic through Envoy, but only utilizes the TLS inspector, not the HTTP inspector.
