@@ -359,8 +359,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l {REGION} -p {PROJECT_ID} {BUCKET_URI}"
]
"! gcloud storage buckets create --location={REGION} --project={PROJECT_ID} {BUCKET_URI}" ]
},
{
"cell_type": "markdown",
@@ -1098,8 +1097,7 @@
" ! bq rm -r -f $PROJECT_ID:$BQ_DATASET_NAME\n",
"# delete the Cloud Storage bucket\n",
"if delete_bucket or os.getenv(\"IS_TESTING\"):\n",
" ! gsutil -m rm -r $BUCKET_URI"
]
" ! gcloud storage rm --recursive $BUCKET_URI" ]
}
],
"metadata": {
18 changes: 6 additions & 12 deletions notebooks/community/migration/UJ13 Data Labeling task.ipynb
@@ -325,8 +325,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $REGION gs://$BUCKET_NAME"
]
"! gcloud storage buckets create --location $REGION gs://$BUCKET_NAME" ]
},
{
"cell_type": "markdown",
@@ -345,8 +344,7 @@
},
"outputs": [],
"source": [
"! gsutil ls -al gs://$BUCKET_NAME"
]
"! gcloud storage ls --all-versions --long gs://$BUCKET_NAME" ]
},
{
"cell_type": "markdown",
@@ -513,8 +511,7 @@
"IMPORT_FILE = \"gs://\" + BUCKET_NAME + \"/labeling.csv\"\n",
"with tf.io.gfile.GFile(IMPORT_FILE, \"w\") as f:\n",
" for lf in LABELING_FILES:\n",
" ! wget {lf} | gsutil cp {lf.split(\"/\")[-1]} gs://{BUCKET_NAME}\n",
" f.write(\"gs://\" + BUCKET_NAME + \"/\" + lf.split(\"/\")[-1] + \"\\n\")"
" ! wget {lf} | gcloud storage cp {lf.split(\"/\")[-1]} gs://{BUCKET_NAME}\n", " f.write(\"gs://\" + BUCKET_NAME + \"/\" + lf.split(\"/\")[-1] + \"\\n\")"
]
},
{
@@ -525,8 +522,7 @@
},
"outputs": [],
"source": [
"! gsutil cat $IMPORT_FILE"
]
"! gcloud storage cat $IMPORT_FILE" ]
},
{
"cell_type": "markdown",
@@ -1007,8 +1003,7 @@
"outputs": [],
"source": [
"# create placeholder file for valid PDF file with instruction for data labeling\n",
"! echo \"this is instruction\" >> instruction.txt | gsutil cp instruction.txt gs://$BUCKET_NAME"
]
"! echo \"this is instruction\" >> instruction.txt | gcloud storage cp instruction.txt gs://$BUCKET_NAME" ]
},
{
"cell_type": "markdown",
@@ -1382,8 +1377,7 @@
"\n",
"\n",
"if delete_bucket and \"BUCKET_NAME\" in globals():\n",
" ! gsutil rm -r gs://$BUCKET_NAME"
]
" ! gcloud storage rm --recursive gs://$BUCKET_NAME" ]
}
],
"metadata": {
@@ -472,8 +472,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $REGION $BUCKET_URI"
]
"! gcloud storage buckets create --location=$REGION $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -492,8 +491,7 @@
},
"outputs": [],
"source": [
"! gsutil ls -al $BUCKET_URI"
]
"! gcloud storage ls --all-versions --long $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -1353,8 +1351,7 @@
"! rm -f custom.tar custom.tar.gz\n",
"! tar cvf custom.tar custom\n",
"! gzip custom.tar\n",
"! gsutil cp custom.tar.gz $BUCKET_URI/trainer.tar.gz"
]
"! gcloud storage cp custom.tar.gz $BUCKET_URI/trainer.tar.gz" ]
},
{
"cell_type": "markdown",
@@ -1554,8 +1551,7 @@
"delete_bucket = False\n",
"\n",
"if delete_bucket or os.getenv(\"IS_TESTING\"):\n",
" ! gsutil rm -rf {BUCKET_URI}"
]
" ! gcloud storage rm --recursive --continue-on-error {BUCKET_URI}" ]
}
],
"metadata": {
@@ -166,11 +166,9 @@
"if BUCKET_URI is None or BUCKET_URI.strip() == \"\" or BUCKET_URI == \"gs://\":\n",
" BUCKET_URI = f\"gs://{PROJECT_ID}-tmp-{now}-{str(uuid.uuid4())[:4]}\"\n",
" BUCKET_NAME = \"/\".join(BUCKET_URI.split(\"/\")[:3])\n",
" ! gsutil mb -l {REGION} {BUCKET_URI}\n",
"else:\n",
" ! gcloud storage buckets create --location={REGION} {BUCKET_URI}\n", "else:\n",
" assert BUCKET_URI.startswith(\"gs://\"), \"BUCKET_URI must start with `gs://`.\"\n",
" shell_output = ! gsutil ls -Lb {BUCKET_NAME} | grep \"Location constraint:\" | sed \"s/Location constraint://\"\n",
" bucket_region = shell_output[0].strip().lower()\n",
" shell_output = ! gcloud storage ls --full --buckets {BUCKET_NAME} | grep \"Location constraint:\" | sed \"s/Location constraint://\"\n", " bucket_region = shell_output[0].strip().lower()\n",
" if bucket_region != REGION:\n",
" raise ValueError(\n",
" \"Bucket region %s is different from notebook region %s\"\n",
@@ -194,8 +192,8 @@
"\n",
"\n",
"# Provision permissions to the SERVICE_ACCOUNT with the GCS bucket\n",
"! gsutil iam ch serviceAccount:{SERVICE_ACCOUNT}:roles/storage.admin $BUCKET_NAME\n",
"\n",
"# Note: Migrating scripts using gsutil iam ch is more complex than get or set. You need to replace the single iam ch command with a series of gcloud storage bucket add-iam-policy-binding and/or gcloud storage bucket remove-iam-policy-binding commands, or replicate the read-modify-write loop.\n",
"! gcloud storage buckets add-iam-policy-binding $BUCKET_NAME --member=serviceAccount:{SERVICE_ACCOUNT} --role=roles/storage.admin\n", "\n",
"! gcloud config set project $PROJECT_ID\n",
"! gcloud projects add-iam-policy-binding --no-user-output-enabled {PROJECT_ID} --member=serviceAccount:{SERVICE_ACCOUNT} --role=\"roles/storage.admin\"\n",
"! gcloud projects add-iam-policy-binding --no-user-output-enabled {PROJECT_ID} --member=serviceAccount:{SERVICE_ACCOUNT} --role=\"roles/aiplatform.user\"\n",
@@ -431,8 +429,7 @@
" \"\"\"\n",
" label_map_filename = os.path.basename(label_map_yaml_filepath)\n",
" subprocess.check_output(\n",
" [\"gsutil\", \"cp\", label_map_yaml_filepath, label_map_filename],\n",
" stderr=subprocess.STDOUT,\n",
" [\"gcloud\", \"storage\", \"cp\", label_map_yaml_filepath, label_map_filename],\n", " stderr=subprocess.STDOUT,\n",
" )\n",
" with open(label_map_filename, \"rb\") as input_file:\n",
" label_map = yaml.safe_load(input_file.read())[\"label_map\"]\n",
@@ -469,8 +466,7 @@
" checkpoint_path = find_checkpoint_in_dir(checkpoint_name)\n",
" checkpoint_path = os.path.relpath(checkpoint_path, checkpoint_name)\n",
"\n",
" ! gsutil cp -r $checkpoint_name $CHECKPOINT_BUCKET/\n",
" checkpoint_uri = os.path.join(CHECKPOINT_BUCKET, checkpoint_name, checkpoint_path)\n",
" ! gcloud storage cp --recursive $checkpoint_name $CHECKPOINT_BUCKET/\n", " checkpoint_uri = os.path.join(CHECKPOINT_BUCKET, checkpoint_name, checkpoint_path)\n",
" print(\"Checkpoint uploaded to\", checkpoint_uri)\n",
" return checkpoint_uri\n",
"\n",
@@ -481,8 +477,7 @@
" destination = os.path.join(CONFIG_DIR, filename)\n",
" print(\"Copy\", url, \"to\", destination)\n",
" ! wget \"$url\" -O \"$filename\"\n",
" ! gsutil cp \"$filename\" \"$destination\"\n",
" return destination\n",
" ! gcloud storage cp \"$filename\" \"$destination\"\n", " return destination\n",
"\n",
"\n",
"train_job_name = common_util.get_job_name_with_datetime(\n",
@@ -625,8 +620,7 @@
" current_trial_best_ckpt_evaluation_filepath = os.path.join(\n",
" current_trial_best_ckpt_dir, \"info.json\"\n",
" )\n",
" ! gsutil cp $current_trial_best_ckpt_evaluation_filepath .\n",
" with open(\"info.json\", \"r\") as f:\n",
" ! gcloud storage cp $current_trial_best_ckpt_evaluation_filepath .\n", " with open(\"info.json\", \"r\") as f:\n",
" eval_metric_results = json.load(f)\n",
" current_performance = eval_metric_results[evaluation_metric]\n",
" if current_performance > best_performance:\n",
@@ -641,8 +635,7 @@
" \"\"\"Finds the best checkpoint path.\"\"\"\n",
" try:\n",
" checkpoint_files = (\n",
" subprocess.check_output([\"gsutil\", \"ls\", checkpoint_dir])\n",
" .decode(\"utf-8\")\n",
" subprocess.check_output([\"gcloud\", \"storage\", \"ls\", checkpoint_dir])\n", " .decode(\"utf-8\")\n",
" .strip()\n",
" )\n",
" for file in checkpoint_files.splitlines():\n",
@@ -864,8 +857,7 @@
"# The label map file was generated from the section above (`Prepare input data for training`).\n",
"\n",
"dir_name = os.path.basename(predict_destination_prefix)\n",
"! gsutil -m cp -R $predict_destination_prefix /tmp\n",
"\n",
"! gcloud storage cp --recursive $predict_destination_prefix /tmp\n", "\n",
"local_path = os.path.join(\"/tmp\", dir_name)\n",
"file_paths = []\n",
"for root, _, files in os.walk(local_path):\n",
@@ -919,8 +911,7 @@
"\n",
"delete_bucket = False # @param {type:\"boolean\"}\n",
"if delete_bucket:\n",
" ! gsutil -m rm -r $BUCKET_NAME"
]
" ! gcloud storage rm --recursive $BUCKET_NAME" ]
}
],
"metadata": {
@@ -157,11 +157,9 @@
"if BUCKET_URI is None or BUCKET_URI.strip() == \"\" or BUCKET_URI == \"gs://\":\n",
" BUCKET_URI = f\"gs://{PROJECT_ID}-tmp-{now}-{str(uuid.uuid4())[:4]}\"\n",
" BUCKET_NAME = \"/\".join(BUCKET_URI.split(\"/\")[:3])\n",
" ! gsutil mb -l {REGION} {BUCKET_URI}\n",
"else:\n",
" ! gcloud storage buckets create --location {REGION} {BUCKET_URI}\n", "else:\n",
" assert BUCKET_URI.startswith(\"gs://\"), \"BUCKET_URI must start with `gs://`.\"\n",
" shell_output = ! gsutil ls -Lb {BUCKET_NAME} | grep \"Location constraint:\" | sed \"s/Location constraint://\"\n",
" bucket_region = shell_output[0].strip().lower()\n",
" shell_output = ! gcloud storage ls --full --buckets {BUCKET_NAME} | grep \"Location constraint:\" | sed \"s/Location constraint://\"\n", " bucket_region = shell_output[0].strip().lower()\n",
" if bucket_region != REGION:\n",
" raise ValueError(\n",
" \"Bucket region %s is different from notebook region %s\"\n",
@@ -185,8 +183,8 @@
"\n",
"\n",
"# Provision permissions to the SERVICE_ACCOUNT with the GCS bucket\n",
"! gsutil iam ch serviceAccount:{SERVICE_ACCOUNT}:roles/storage.admin $BUCKET_NAME\n",
"\n",
"! # Note: Migrating scripts using gsutil iam ch is more complex than get or set. You need to replace the single iam ch command with a series of gcloud storage buckets add-iam-policy-binding and/or gcloud storage buckets remove-iam-policy-binding commands, or replicate the read-modify-write loop.\n",
"! gcloud storage buckets add-iam-policy-binding $BUCKET_NAME --member=serviceAccount:{SERVICE_ACCOUNT} --role=roles/storage.admin\n", "\n",
"! gcloud config set project $PROJECT_ID\n",
"! gcloud projects add-iam-policy-binding --no-user-output-enabled {PROJECT_ID} --member=serviceAccount:{SERVICE_ACCOUNT} --role=\"roles/storage.admin\"\n",
"! gcloud projects add-iam-policy-binding --no-user-output-enabled {PROJECT_ID} --member=serviceAccount:{SERVICE_ACCOUNT} --role=\"roles/aiplatform.user\""
@@ -890,8 +888,7 @@
"\n",
"delete_bucket = False # @param {type:\"boolean\"}\n",
"if delete_bucket:\n",
" ! gsutil -m rm -r $BUCKET_NAME"
]
" ! gcloud storage rm --recursive $BUCKET_NAME" ]
}
],
"metadata": {
@@ -373,8 +373,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $REGION -p $PROJECT_ID $BUCKET_URI"
]
"! gcloud storage buckets create --location=$REGION --project=$PROJECT_ID $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -641,14 +640,11 @@
" + \"/evaluation_metrics\"\n",
" )\n",
" if tf.io.gfile.exists(EXECUTE_OUTPUT):\n",
" ! gsutil cat $EXECUTE_OUTPUT\n",
" return EXECUTE_OUTPUT\n",
" ! gcloud storage cat $EXECUTE_OUTPUT\n", " return EXECUTE_OUTPUT\n",
" elif tf.io.gfile.exists(GCP_RESOURCES):\n",
" ! gsutil cat $GCP_RESOURCES\n",
" return GCP_RESOURCES\n",
" ! gcloud storage cat $GCP_RESOURCES\n", " return GCP_RESOURCES\n",
" elif tf.io.gfile.exists(EVAL_METRICS):\n",
" ! gsutil cat $EVAL_METRICS\n",
" return EVAL_METRICS\n",
" ! gcloud storage cat $EVAL_METRICS\n", " return EVAL_METRICS\n",
"\n",
" return None"
]
@@ -1470,8 +1466,7 @@
"# delete bucket\n",
"delete_bucket = False\n",
"if os.getenv(\"IS_TESTING\") or delete_bucket:\n",
" ! gsutil -m rm -r $BUCKET_URI\n",
"\n",
" ! gcloud storage rm --recursive $BUCKET_URI\n", "\n",
"# Remove local resorces\n",
"delete_local_resources = False\n",
"if delete_local_resources:\n",
@@ -284,8 +284,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $LOCATION $BUCKET_URI"
]
"! gcloud storage buckets create --location $LOCATION $BUCKET_URI" ]
},
{
"cell_type": "markdown",
@@ -381,8 +380,7 @@
"\n",
"# Copy images using gsutil commands directly\n",
"for src, dest in zip(df.iloc[:, 0], df[\"destination_path\"]):\n",
" ! gsutil -m cp {src} {dest}\n",
"\n",
" ! gcloud storage cp {src} {dest}\n", "\n",
"print(f\"Files copied to {BUCKET_URI}\")"
]
},
@@ -462,12 +460,10 @@
"else:\n",
" FILE = IMPORT_FILE\n",
"\n",
"count = ! gsutil cat $FILE | wc -l\n",
"print(\"Number of Examples\", int(count[0]))\n",
"count = ! gcloud storage cat $FILE | wc -l\n", "print(\"Number of Examples\", int(count[0]))\n",
"\n",
"print(\"First 10 rows\")\n",
"! gsutil cat $FILE | head"
]
"! gcloud storage cat $FILE | head" ]
},
{
"cell_type": "markdown",
@@ -675,10 +671,8 @@
},
"outputs": [],
"source": [
"! gsutil ls $model_package\n",
"# Download the model artifacts\n",
"! gsutil cp -r $model_package tflite\n",
"\n",
"! gcloud storage ls $model_package\n", "# Download the model artifacts\n",
"! gcloud storage cp --recursive $model_package tflite\n", "\n",
"tflite_path = \"tflite/model.tflite\""
]
},
@@ -736,8 +730,7 @@
},
"outputs": [],
"source": [
"test_items = ! gsutil cat $IMPORT_FILE | head -n1\n",
"test_item = test_items[0].split(\",\")[0]\n",
"test_items = ! gcloud storage cat $IMPORT_FILE | head -n1\n", "test_item = test_items[0].split(\",\")[0]\n",
"\n",
"with tf.io.gfile.GFile(test_item, \"rb\") as f:\n",
" content = f.read()\n",
@@ -824,8 +817,7 @@
"dag.delete()\n",
"\n",
"if delete_bucket:\n",
" ! gsutil rm -r $BUCKET_URI"
]
" ! gcloud storage rm --recursive $BUCKET_URI" ]
}
],
"metadata": {
@@ -363,8 +363,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l {REGION} -p {PROJECT_ID} {BUCKET_URI}"
]
"! gcloud storage buckets create --location={REGION} --project={PROJECT_ID} {BUCKET_URI}" ]
},
{
"cell_type": "markdown",
@@ -437,10 +436,10 @@
},
"outputs": [],
"source": [
"! gsutil iam ch serviceAccount:{SERVICE_ACCOUNT}:roles/storage.objectCreator $BUCKET_URI\n",
"\n",
"! gsutil iam ch serviceAccount:{SERVICE_ACCOUNT}:roles/storage.objectViewer $BUCKET_URI"
]
"! # Note: Migrating scripts using gsutil iam ch is more complex than get or set. You need to replace the single iam ch command with a series of gcloud storage bucket add-iam-policy-binding and/or gcloud storage bucket remove-iam-policy-binding commands, or replicate the read-modify-write loop.\n",
"! gcloud storage buckets add-iam-policy-binding $BUCKET_URI --member=serviceAccount:{SERVICE_ACCOUNT} --role=roles/storage.objectCreator\n", "\n",
# Note: Migrating scripts using gsutil iam ch is more complex than get or set. You need to replace the single iam ch command with a series of gcloud storage bucket add-iam-policy-binding and/or gcloud storage bucket remove-iam-policy-binding commands, or replicate the read-modify-write loop.
! gcloud storage buckets add-iam-policy-binding $BUCKET_URI --member=serviceAccount:{SERVICE_ACCOUNT} --role=roles/storage.objectViewer ]
Comment on lines +439 to +442
critical

The automated migration has produced a malformed .ipynb file. The source array for this cell contains lines that are not valid JSON strings, which will cause issues when parsing the notebook. Specifically, the comment and the final gcloud command are not properly quoted as strings within the JSON array. Additionally, there's a stray ] at the end of the last command.

        "! # Note: Migrating scripts using gsutil iam ch is more complex than get or set. You need to replace the single iam ch command with a series of gcloud storage bucket add-iam-policy-binding and/or gcloud storage bucket remove-iam-policy-binding commands, or replicate the read-modify-write loop.\n",
        "! gcloud storage buckets add-iam-policy-binding $BUCKET_URI --member=serviceAccount:{SERVICE_ACCOUNT} --role=roles/storage.objectCreator\n",
        "\n",
        "# Note: Migrating scripts using gsutil iam ch is more complex than get or set. You need to replace the single iam ch command with a series of gcloud storage bucket add-iam-policy-binding and/or gcloud storage bucket remove-iam-policy-binding commands, or replicate the read-modify-write loop.\n",
        "! gcloud storage buckets add-iam-policy-binding $BUCKET_URI --member=serviceAccount:{SERVICE_ACCOUNT} --role=roles/storage.objectViewer"

},
{
"cell_type": "markdown",
@@ -1362,8 +1361,7 @@
"\n",
"# delete the Cloud Storage bucket\n",
"if delete_bucket and os.getenv(\"IS_TESTING\"):\n",
" ! gsutil rm -r $BUCKET_URI"
]
" ! gcloud storage rm --recursive $BUCKET_URI" ]
}
],
"metadata": {