Problems deploying Hadoop on Google Cloud Platform

I'm running into problems deploying Hadoop on Google Cloud Platform from the Linux command line with the following command: ./bdutil -e platforms/hdp/ambari_env.sh deploy

I keep getting this error:

************* gcloud compute stdout *******************
NAME           ZONE           SIZE_GB  TYPE         STATUS
hadoop-w-0-pd  us-central1-a  1500     pd-standard  READY

******************* gcloud compute stderr *******************
Created [https://www.googleapis.com/compute/v1/projects/hadoop-164401/zones/us-central1-a/disks/hadoop-w-0-pd].
INFO: Display format "table(name,
                    zone.basename(),
                    sizeGb,
                    type.basename(),
                    status)".

New disks are unformatted. You must format and mount a disk before it
can be used. You can find instructions on how to do this at:

https://cloud.google.com/compute/docs/disks/add-persistent-disk#formatting

ERROR: (gcloud.compute.disks.create) Could not fetch resource:
 - Quota 'DISKS_TOTAL_GB' exceeded.  Limit: 2048.0

ERROR: (gcloud.compute.disks.create) Could not fetch resource:
 - Quota 'DISKS_TOTAL_GB' exceeded.  Limit: 2048.0


************ ERROR logs from gcloud compute stderr ************
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
ERROR: (gcloud.compute.disks.create) Could not fetch resource:

******************* Exit codes and VM logs *******************
Thu, Apr 13, 2017  9:52:10 PM: Exited 1 : gcloud --project=hadoop-164401 --quiet --verbosity=info compute disks create --size=1500 --type=pd-standard hadoop-w-1-pd --zone=us-central1-a
Thu, Apr 13, 2017  9:52:10 PM: Exited 1 : gcloud --project=hadoop-164401 --quiet --verbosity=info compute disks create --size=1500 --type=pd-standard hadoop-m-pd --zone=us-central1-a

I need help.

Answer 1

Your deployment exceeded the project's persistent disk quota (2048 GB). The log shows bdutil trying to create three 1500 GB disks (hadoop-w-0-pd, hadoop-w-1-pd, and hadoop-m-pd), which is 4500 GB in total; only the first one fit under the limit. To resolve this, you can:

  • Edit the deployment's configuration file and specify a smaller size for the PD disk resources (see the sketch below).

or

  • Submit a quota increase request to raise the PD quota limit for your project (a quota check command follows below).
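
For the first option, the change amounts to lowering the attached persistent disk size in the bdutil configuration before re-running deploy. Here is a minimal sketch; the variable names are assumptions based on bdutil's stock bdutil_env.sh and may differ between versions, so verify them against your own copy:

# In bdutil_env.sh, or in the env file passed via -e
# (here, platforms/hdp/ambari_env.sh), lower the per-disk size
# so all three disks fit under the 2048 GB quota.
# WORKER_ATTACHED_PDS_SIZE_GB is assumed from bdutil's stock
# config -- confirm the exact name in your checkout.
USE_ATTACHED_PDS=true
WORKER_ATTACHED_PDS_SIZE_GB=500   # was 1500; 3 x 500 GB = 1500 GB total
# If your config sizes the master's disk with a separate
# variable, lower that value as well.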
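
For the second option, it helps to confirm your current usage against the limit before filing the request. gcloud can print the regional quotas, including DISKS_TOTAL_GB:

gcloud compute regions describe us-central1

# In the output, look under "quotas:" for the entry with
# metric DISKS_TOTAL_GB; it reports both the limit
# (2048.0 in this case) and the current usage.

The increase request itself can then be filed from the Quotas page in the Cloud Console.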
