Original file line number Diff line number Diff line change
@@ -419,8 +419,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $REGION $BUCKET_NAME"
]
"! gcloud storage buckets create --location=$REGION $BUCKET_NAME" ]
},
{
"cell_type": "markdown",
@@ -439,8 +438,7 @@
},
"outputs": [],
"source": [
"! gsutil ls -al $BUCKET_NAME"
]
"! gcloud storage ls --all-versions --long $BUCKET_NAME" ]
},
{
"cell_type": "markdown",
@@ -1119,8 +1117,7 @@
"! rm -f custom.tar custom.tar.gz\n",
"! tar cvf custom.tar custom\n",
"! gzip custom.tar\n",
"! gsutil cp custom.tar.gz $BUCKET_NAME/trainer_cifar10.tar.gz"
]
"! gcloud storage cp custom.tar.gz $BUCKET_NAME/trainer_cifar10.tar.gz" ]
},
{
"cell_type": "markdown",
@@ -1617,8 +1614,7 @@
"! rm -f custom.tar custom.tar.gz\n",
"! tar cvf custom.tar custom\n",
"! gzip custom.tar\n",
"! gsutil cp custom.tar.gz $BUCKET_NAME/trainer_cifar10.tar.gz"
]
"! gcloud storage cp custom.tar.gz $BUCKET_NAME/trainer_cifar10.tar.gz" ]
},
{
"cell_type": "markdown",
@@ -2062,8 +2058,7 @@
" print(e)\n",
"\n",
"if delete_bucket and \"BUCKET_NAME\" in globals():\n",
" ! gsutil rm -r $BUCKET_NAME"
]
" ! gcloud storage rm --recursive $BUCKET_NAME" ]
}
],
"metadata": {
Original file line number Diff line number Diff line change
@@ -419,8 +419,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $REGION $BUCKET_NAME"
]
"! gcloud storage buckets create --location $REGION $BUCKET_NAME" ]
Contributor

medium

For consistency with other files in this PR and for better script clarity, it's recommended to use --location=$REGION instead of --location $REGION. While gcloud might handle both, using the equals sign is more explicit and less prone to parsing errors in shell scripts.

        "! gcloud storage buckets create --location=$REGION $BUCKET_NAME"

},
{
"cell_type": "markdown",
@@ -439,8 +438,7 @@
},
"outputs": [],
"source": [
"! gsutil ls -al $BUCKET_NAME"
]
"! gcloud storage ls --all-versions --long $BUCKET_NAME" ]
},
{
"cell_type": "markdown",
@@ -1116,8 +1114,7 @@
"! rm -f custom.tar custom.tar.gz\n",
"! tar cvf custom.tar custom\n",
"! gzip custom.tar\n",
"! gsutil cp custom.tar.gz $BUCKET_NAME/trainer_boston.tar.gz"
]
"! gcloud storage cp custom.tar.gz $BUCKET_NAME/trainer_boston.tar.gz" ]
},
{
"cell_type": "markdown",
@@ -1626,8 +1623,7 @@
"! rm -f custom.tar custom.tar.gz\n",
"! tar cvf custom.tar custom\n",
"! gzip custom.tar\n",
"! gsutil cp custom.tar.gz $BUCKET_NAME/trainer_boston.tar.gz"
]
"! gcloud storage cp custom.tar.gz $BUCKET_NAME/trainer_boston.tar.gz" ]
},
{
"cell_type": "markdown",
@@ -2085,8 +2081,7 @@
" print(e)\n",
"\n",
"if delete_bucket and \"BUCKET_NAME\" in globals():\n",
" ! gsutil rm -r $BUCKET_NAME"
]
" ! gcloud storage rm --recursive $BUCKET_NAME" ]
}
],
"metadata": {
Original file line number Diff line number Diff line change
@@ -419,8 +419,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $REGION $BUCKET_NAME"
]
"! gcloud storage buckets create --location=$REGION $BUCKET_NAME" ]
},
{
"cell_type": "markdown",
@@ -439,8 +438,7 @@
},
"outputs": [],
"source": [
"! gsutil ls -al $BUCKET_NAME"
]
"! gcloud storage ls --all-versions --long $BUCKET_NAME" ]
},
{
"cell_type": "markdown",
@@ -1116,8 +1114,7 @@
"! rm -f custom.tar custom.tar.gz\n",
"! tar cvf custom.tar custom\n",
"! gzip custom.tar\n",
"! gsutil cp custom.tar.gz $BUCKET_NAME/trainer_imdb.tar.gz"
]
"! gcloud storage cp custom.tar.gz $BUCKET_NAME/trainer_imdb.tar.gz" ]
},
{
"cell_type": "markdown",
@@ -1618,8 +1615,7 @@
"! rm -f custom.tar custom.tar.gz\n",
"! tar cvf custom.tar custom\n",
"! gzip custom.tar\n",
"! gsutil cp custom.tar.gz $BUCKET_NAME/trainer_imdb.tar.gz"
]
"! gcloud storage cp custom.tar.gz $BUCKET_NAME/trainer_imdb.tar.gz" ]
},
{
"cell_type": "markdown",
@@ -2060,8 +2056,7 @@
" print(e)\n",
"\n",
"if delete_bucket and \"BUCKET_NAME\" in globals():\n",
" ! gsutil rm -r $BUCKET_NAME"
]
" ! gcloud storage rm --recursive $BUCKET_NAME" ]
}
],
"metadata": {
Original file line number Diff line number Diff line change
@@ -325,8 +325,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $REGION gs://$BUCKET_NAME"
]
"! gcloud storage buckets create --location=$REGION gs://$BUCKET_NAME" ]
},
{
"cell_type": "markdown",
@@ -345,8 +344,7 @@
},
"outputs": [],
"source": [
"! gsutil ls -al gs://$BUCKET_NAME"
]
"! gcloud storage ls --all-versions --long gs://$BUCKET_NAME" ]
},
{
"cell_type": "markdown",
@@ -722,8 +720,7 @@
"! rm -f custom.tar custom.tar.gz\n",
"! tar cvf custom.tar custom\n",
"! gzip custom.tar\n",
"! gsutil cp custom.tar.gz gs://$BUCKET_NAME/census.tar.gz"
]
"! gcloud storage cp custom.tar.gz gs://$BUCKET_NAME/census.tar.gz" ]
},
{
"cell_type": "markdown",
@@ -1379,8 +1376,7 @@
" for i in INSTANCES:\n",
" f.write(json.dumps(i) + \"\\n\")\n",
"\n",
"! gsutil cat $gcs_input_uri"
]
"! gcloud storage cat $gcs_input_uri" ]
},
{
"cell_type": "markdown",
@@ -1718,8 +1714,7 @@
"source": [
"def get_latest_predictions(gcs_out_dir):\n",
" \"\"\" Get the latest prediction subfolder using the timestamp in the subfolder name\"\"\"\n",
" folders = !gsutil ls $gcs_out_dir\n",
" latest = \"\"\n",
" folders = !gcloud storage ls $gcs_out_dir\n", " latest = \"\"\n",
" for folder in folders:\n",
" subfolder = folder.split(\"/\")[-2]\n",
" if subfolder.startswith(\"prediction-\"):\n",
@@ -1738,10 +1733,8 @@
" folder = get_latest_predictions(\n",
" response.output_config.gcs_destination.output_uri_prefix\n",
" )\n",
" ! gsutil ls $folder/prediction*\n",
"\n",
" ! gsutil cat -h $folder/prediction*\n",
" break\n",
" ! gcloud storage ls $folder/prediction*\n", "\n",
" ! gcloud storage cat --display-url $folder/prediction*\n", " break\n",
" time.sleep(60)"
]
},
@@ -2630,8 +2623,7 @@
" print(e)\n",
"\n",
"if delete_bucket and \"BUCKET_NAME\" in globals():\n",
" ! gsutil rm -r gs://$BUCKET_NAME"
]
" ! gcloud storage rm --recursive gs://$BUCKET_NAME" ]
}
],
"metadata": {
Original file line number Diff line number Diff line change
@@ -325,8 +325,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $REGION gs://$BUCKET_NAME"
]
"! gcloud storage buckets create --location=$REGION gs://$BUCKET_NAME" ]
},
{
"cell_type": "markdown",
@@ -345,8 +344,7 @@
},
"outputs": [],
"source": [
"! gsutil ls -al gs://$BUCKET_NAME"
]
"! gcloud storage ls --all-versions --long gs://$BUCKET_NAME" ]
},
{
"cell_type": "markdown",
@@ -542,8 +540,7 @@
},
"outputs": [],
"source": [
"! gsutil cat $IMPORT_FILE | head -n 10"
]
"! gcloud storage cat $IMPORT_FILE | head -n 10" ]
},
{
"cell_type": "markdown",
@@ -1386,8 +1383,7 @@
},
"outputs": [],
"source": [
"test_items = ! gsutil cat $IMPORT_FILE | head -n25\n",
"\n",
"test_items = ! gcloud storage cat $IMPORT_FILE | head -n25\n", "\n",
"cols_1 = test_items[0].split(\",\")\n",
"cols_2 = test_items[-1].split(\",\")\n",
"\n",
@@ -1466,8 +1462,7 @@
" f.write(json.dumps(data) + \"\\n\")\n",
"\n",
"print(gcs_input_uri)\n",
"!gsutil cat $gcs_input_uri"
]
"!gcloud storage cat $gcs_input_uri" ]
},
{
"cell_type": "markdown",
@@ -1783,8 +1778,7 @@
"source": [
"def get_latest_predictions(gcs_out_dir):\n",
" \"\"\" Get the latest prediction subfolder using the timestamp in the subfolder name\"\"\"\n",
" folders = !gsutil ls $gcs_out_dir\n",
" latest = \"\"\n",
" folders = !gcloud storage ls $gcs_out_dir\n", " latest = \"\"\n",
" for folder in folders:\n",
" subfolder = folder.split(\"/\")[-2]\n",
" if subfolder.startswith(\"prediction-\"):\n",
@@ -1803,10 +1797,8 @@
" folder = get_latest_predictions(\n",
" response.output_config.gcs_destination.output_uri_prefix\n",
" )\n",
" ! gsutil ls $folder/prediction**\n",
"\n",
" ! gsutil cat $folder/prediction**\n",
" break\n",
" ! gcloud storage ls $folder/prediction**\n", "\n",
" ! gcloud storage cat $folder/prediction**\n", " break\n",
" time.sleep(60)"
]
},
@@ -1882,8 +1874,7 @@
" print(e)\n",
"\n",
"if delete_bucket and \"BUCKET_NAME\" in globals():\n",
" ! gsutil rm -r gs://$BUCKET_NAME"
]
" ! gcloud storage rm --recursive gs://$BUCKET_NAME" ]
}
],
"metadata": {
Original file line number Diff line number Diff line change
@@ -325,8 +325,7 @@
},
"outputs": [],
"source": [
"! gsutil mb -l $REGION gs://$BUCKET_NAME"
]
"! gcloud storage buckets create --location $REGION gs://$BUCKET_NAME" ]
Contributor

medium

For consistency with other files in this PR and for better script clarity, it's recommended to use --location=$REGION instead of --location $REGION. While gcloud might handle both, using the equals sign is more explicit and less prone to parsing errors in shell scripts.

        "! gcloud storage buckets create --location=$REGION gs://$BUCKET_NAME"

},
{
"cell_type": "markdown",
@@ -345,8 +344,7 @@
},
"outputs": [],
"source": [
"! gsutil ls -al gs://$BUCKET_NAME"
]
"! gcloud storage ls --all-versions --long gs://$BUCKET_NAME" ]
},
{
"cell_type": "markdown",
@@ -659,8 +657,7 @@
"! rm -f cifar.tar cifar.tar.gz\n",
"! tar cvf cifar.tar cifar\n",
"! gzip cifar.tar\n",
"! gsutil cp cifar.tar.gz gs://$BUCKET_NAME/trainer_cifar.tar.gz"
]
"! gcloud storage cp cifar.tar.gz gs://$BUCKET_NAME/trainer_cifar.tar.gz" ]
},
{
"cell_type": "markdown",
@@ -1318,8 +1315,7 @@
" b64str = base64.b64encode(bytes.numpy()).decode(\"utf-8\")\n",
" f.write(json.dumps({input_name: {\"b64\": b64str}}) + \"\\n\")\n",
"\n",
"! gsutil cat $gcs_input_uri"
]
"! gcloud storage cat $gcs_input_uri" ]
},
{
"cell_type": "markdown",
@@ -1643,8 +1639,7 @@
"source": [
"def get_latest_predictions(gcs_out_dir):\n",
" \"\"\" Get the latest prediction subfolder using the timestamp in the subfolder name\"\"\"\n",
" folders = !gsutil ls $gcs_out_dir\n",
" latest = \"\"\n",
" folders = !gcloud storage ls $gcs_out_dir\n", " latest = \"\"\n",
" for folder in folders:\n",
" subfolder = folder.split(\"/\")[-2]\n",
" if subfolder.startswith(\"prediction-\"):\n",
@@ -1663,10 +1658,8 @@
" folder = get_latest_predictions(\n",
" response.output_config.gcs_destination.output_uri_prefix\n",
" )\n",
" ! gsutil ls $folder/prediction*\n",
"\n",
" ! gsutil cat $folder/prediction*\n",
" break\n",
" ! gcloud storage ls $folder/prediction*\n", "\n",
" ! gcloud storage cat $folder/prediction*\n", " break\n",
" time.sleep(60)"
]
},
@@ -2234,8 +2227,7 @@
" print(e)\n",
"\n",
"if delete_bucket and \"BUCKET_NAME\" in globals():\n",
" ! gsutil rm -r gs://$BUCKET_NAME"
]
" ! gcloud storage rm --recursive gs://$BUCKET_NAME" ]
}
],
"metadata": {