Last updated 2025-09-04 UTC.

# Troubleshoot batch and session connectivity

This page provides guidance on diagnosing and resolving common network connectivity issues for Serverless for Apache Spark batch workloads and interactive sessions. These issues can prevent your workloads from accessing required data sources, external services, or Google Cloud APIs.

Common symptoms and error messages
----------------------------------

When Serverless for Apache Spark encounters connectivity problems, you might see errors such as:

- `Unable to connect to service_name.googleapis.com`
- `Could not reach required Google APIs`
- `Connection refused`
- `Host unreachable`
- `Operation timed out`
- `Permission denied` (often network-related if API calls are blocked)

You might also encounter errors related to accessing data in Cloud Storage, BigQuery, or other databases.

Common causes and troubleshooting tips
--------------------------------------

This section lists common causes of Serverless for Apache Spark connectivity issues, and provides troubleshooting tips to help you resolve them.

### Network configuration

Network misconfigurations are a frequent cause of connectivity failures. Serverless for Apache Spark workloads and sessions run on VMs with internal IP addresses, with [Private Google Access (PGA)](/vpc/docs/configure-private-google-access) automatically enabled on the workload or session subnet to access Google APIs and services. For more information, see [Serverless for Apache Spark network configuration](/dataproc-serverless/docs/concepts/network).

- Access options:

  - Private Service Connect (PSC): You can [create private endpoints](/vpc/docs/about-accessing-google-apis-endpoints) within your VPC network to access specific Google APIs.

    - In the Google Cloud console, go to [**Private Service Connect > Endpoints**](https://console.cloud.google.com/net-services/psc/list/consumers). Connect endpoints, or confirm that endpoints are connected, for all required APIs, such as `storage.googleapis.com` and `dataproc.googleapis.com`, and confirm that they connect to the batch workload or session Virtual Private Cloud network.
  - Cloud NAT: If your workload needs to access the public internet, you can configure Cloud NAT for your batch workload or session subnet:

    - In the Google Cloud console, go to the [**Cloud NAT**](https://console.cloud.google.com/net-services/nat/list) page. [Configure a gateway](/nat/docs/gce-example), or confirm that a gateway is configured, for the batch workload or session VPC network, region, and subnet. Also make sure firewall rules allow egress to `0.0.0.0/0`. For more information, see [Set up Cloud NAT](/nat/docs/gce-example).
- Firewall rules:

  - Egress firewall rules in your VPC network (or Shared VPC host project, if applicable) must not block outbound traffic to required destinations.
  - If applicable, egress rules must allow traffic to external services, such as public APIs and databases outside of Google Cloud. If your batch workload or session needs internet access, you can use [Cloud NAT](/nat/docs/overview) to provide subnet egress.
  - Although not a common cause of connectivity issues, overly restrictive ingress rules might inadvertently block necessary return traffic or internal communications.
- DNS resolution:

  - DNS resolution must be configured within the VPC network. Workloads and sessions must be able to resolve hostnames for Google APIs, such as `storage.googleapis.com` or `bigquery.googleapis.com`, and for external services.
  - Custom DNS servers and Cloud DNS private zones must forward or resolve queries for Google domains.
  - If you are using Private Service Connect for private access to Google APIs, DNS records for Google services must resolve to private IP addresses within your VPC network using the PSC endpoint.

Troubleshooting tips:

- Identify network and subnet configuration:

  - From Serverless for Apache Spark batch or session details, review the `networkUri` and `subnetUri`.
  - In the Google Cloud console, review the settings for the VPC network and subnet.
- Test connectivity from a proxy VM:

  - Launch a test Compute Engine VM in the batch or session subnet using the batch or session service account.
  - From the test VM, perform the following connectivity tests:
    - `nslookup storage.googleapis.com` to verify DNS resolution. Look up other Google API domains, such as `bigquery.googleapis.com` and `dataproc.googleapis.com`. With Private Google Access, which is automatically enabled on Serverless for Apache Spark subnets, or Private Service Connect, the domains must resolve to private IP addresses.
    - `curl -v https://storage.googleapis.com` to verify HTTPS connectivity to Google APIs. Also try connecting to other Google services.
    - `ping 8.8.8.8` to test internet connectivity, if required by your batch or session. Try `curl -v https://example.com` if Cloud NAT is expected.
  - Run Google Cloud [Network Intelligence Center connectivity tests](/network-intelligence-center/docs/connectivity-tests/concepts/overview) to diagnose network paths from your subnet to relevant endpoints, such as Google APIs and external IP addresses.
- Review Cloud Logging for network errors:

  - Review Logging for your Serverless for Apache Spark workload or session. Look for `ERROR` or `WARNING` messages related to network timeouts, connection refusals, or API call failures. Filter by `jsonPayload.component="driver"` or `jsonPayload.component="executor"` for Spark-specific network issues.

### IAM permissions

Insufficient IAM permissions can prevent workloads or sessions from accessing resources, resulting in network failures if API calls are denied.

The service account used by your batch workload or session must have the required roles:

- **Dataproc Worker role** (`roles/dataproc.worker`).
- Data access roles, such as `roles/storage.objectViewer` or `roles/bigquery.dataViewer`.
- Logging role (`roles/logging.logWriter`).

Troubleshooting tips:

- Identify the batch workload or session [service account](/dataproc-serverless/docs/concepts/service-account). If not specified, it defaults to the [Compute Engine default service account](/compute/docs/access/service-accounts#default_service_account).
- Go to the [**IAM & Admin > IAM**](https://console.cloud.google.com/iam-admin/iam) page in the Google Cloud console, find the batch workload or session service account, and then verify that it has the roles needed for workload operations. Grant any missing roles.

### External service configuration

If your workload connects to databases or services outside of Google Cloud, verify their configuration:

- Verify that the external service firewall or security group allows inbound connections from your VPC network IP ranges. If applicable, check the internal IP addresses used over VPC Network Peering, Cloud VPN, or Cloud Interconnect, or the Cloud NAT IP addresses.
- Review database credentials and connection strings. Check connection details, usernames, and passwords.

What's next
-----------

- Learn about [Serverless for Apache Spark networking](/dataproc-serverless/docs/concepts/network).
- Review [Serverless for Apache Spark service accounts](/dataproc-serverless/docs/concepts/service-account).
- Refer to general network troubleshooting guides:
  - [Troubleshoot Dataproc cluster creation issues](/dataproc/docs/support/troubleshoot-cluster-creation)
  - [Troubleshoot Dataproc Metastore connectivity](/dataproc-metastore/docs/troubleshooting-connectivity)
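As a supplement to the proxy-VM connectivity tests described earlier, the "resolves to a private IP address" expectation can be spot-checked with a small helper. This is a minimal sketch, not part of the product: it assumes the RFC 1918 private ranges plus `199.36.153.0/24`, the block that contains the documented `private.googleapis.com` (`199.36.153.8/30`) and `restricted.googleapis.com` (`199.36.153.4/30`) VIPs, and the sample addresses are illustrative only.

```shell
#!/bin/sh
# Sketch: classify an IP address (for example, the answer that
# `nslookup storage.googleapis.com` returns on the test VM) as private
# or public. Assumed ranges: RFC 1918 (10/8, 172.16/12, 192.168/16)
# plus 199.36.153.0/24, which contains the private.googleapis.com and
# restricted.googleapis.com VIPs.
classify_ip() {
  case "$1" in
    10.*|192.168.*|199.36.153.*) echo "private" ;;
    172.1[6-9].*|172.2[0-9].*|172.3[01].*) echo "private" ;;
    *) echo "public" ;;
  esac
}

classify_ip "199.36.153.8"    # a private Google API VIP
classify_ip "142.250.190.10"  # a public answer
```

Per the DNS expectation described in the connectivity tests, a `public` result for a Google API domain on a subnet that should use Private Google Access or Private Service Connect is a signal to review the subnet's DNS configuration.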