Cloud Run Functions: 3 Ways: Challenge Lab With Cloud Storage and Cloud Function CLI
Technology Mar 23, 2026


Lab Link: https://www.skills.google/course_templates/696/labs/598833

This lab works with Google Cloud Storage, a Cloud Storage-triggered function, and an HTTP function. I first tried the GCP UI, but Cloud Functions has already changed to Cloud Run functions, which makes it impossible to pass the lab through the UI. We can, however, use the Cloud Functions CLI in Cloud Shell to pass the lab.

Challenge scenario

You are just starting your junior cloud developer role. So far you have been helping teams create and manage Cloud Run functions that respond to and get triggered by specific events in their Google Cloud projects.

You are expected to have the skills and knowledge for these tasks.

Your challenge

You are asked to help a newly formed development team with some of their initial work on a new project. Specifically, they need to automate running code based on specific activities in their Google Cloud project, including HTTP requests and new events in Cloud Storage. You receive a request to complete the following tasks:

  • Create a bucket to upload new project files.

  • Create, deploy, and test a Cloud Storage function that logs new activities in the Cloud Storage bucket.

  • Create and deploy a function that responds to HTTP requests with minimum instances to limit cold starts.

Some standards you should follow:

  • Ensure that any needed APIs (such as Cloud Run functions) are successfully enabled.

  • Ensure that any needed IAM permissions (such as for the Cloud Storage service account) are assigned.

  • Create all resources in the <filled in at lab start> region, unless otherwise directed.

Let's Start

  1. First, open the Cloud Shell terminal.

  2. Next, let's prepare the variables.

export REGION=
export HTTP_FUNCTION=
export FUNCTION_NAME=
export BUCKET="gs://$DEVSHELL_PROJECT_ID"

REGION will be displayed when you start the lab (e.g., us-west1); check the value shown in your lab.

HTTP_FUNCTION will be displayed in Task 3.

FUNCTION_NAME will be displayed in Task 2.
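
For example, if your lab shows us-west1 as the region and names like cs-logger and http-responder for the functions (hypothetical values; use the ones displayed in your own lab), the filled-in exports would look like this:

# Hypothetical values -- replace with the ones shown in your lab
export REGION=us-west1
export HTTP_FUNCTION=http-responder
export FUNCTION_NAME=cs-logger
export BUCKET="gs://$DEVSHELL_PROJECT_ID"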

  3. Next, enable the APIs.

gcloud services enable \
  artifactregistry.googleapis.com \
  cloudfunctions.googleapis.com \
  cloudbuild.googleapis.com \
  eventarc.googleapis.com \
  run.googleapis.com \
  logging.googleapis.com \
  pubsub.googleapis.com
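
If you want to confirm the APIs are active before moving on (a quick sanity check, not required by the lab), you can list the enabled services:

# Check that the key services show up as enabled
gcloud services list --enabled \
  --filter="config.name:cloudfunctions.googleapis.com OR config.name:run.googleapis.com OR config.name:eventarc.googleapis.com"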

  4. Task 1: Create a Cloud Storage Bucket

PROJECT_NUMBER=$(gcloud projects list --filter="project_id:$DEVSHELL_PROJECT_ID" --format='value(project_number)')

SERVICE_ACCOUNT=$(gsutil kms serviceaccount -p $PROJECT_NUMBER)

gcloud projects add-iam-policy-binding $DEVSHELL_PROJECT_ID \
  --member serviceAccount:$SERVICE_ACCOUNT \
  --role roles/pubsub.publisher

gsutil mb -l $REGION gs://$DEVSHELL_PROJECT_ID

The first command searches the Google Cloud environment for the project and extracts the project number. The second gets the email address of the Google-managed Cloud Storage service account for that project. The third modifies the project's IAM policy to grant that service account the Pub/Sub Publisher role, which Cloud Storage needs in order to publish events. The last command creates the bucket (mb = make bucket) with the given region and bucket name.
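
To double-check the result (optional), you can print the bucket's location and confirm that the Pub/Sub Publisher binding landed:

# Show bucket metadata, including its location constraint
gsutil ls -L -b gs://$DEVSHELL_PROJECT_ID | grep -i location

# List members that hold roles/pubsub.publisher on the project
gcloud projects get-iam-policy $DEVSHELL_PROJECT_ID \
  --flatten="bindings[].members" \
  --filter="bindings.role:roles/pubsub.publisher" \
  --format="value(bindings.members)"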

  5. Task 2: Create, deploy, and test a Cloud Storage function (2nd gen)

mkdir ~/$FUNCTION_NAME && cd $_

touch index.js && touch package.json

cat > index.js <<EOF
const functions = require('@google-cloud/functions-framework');
functions.cloudEvent('$FUNCTION_NAME', (cloudevent) => {
  console.log('A new event in your Cloud Storage bucket has been logged!');
  console.log(cloudevent);
});
EOF

cat > package.json <<EOF
{
  "name": "nodejs-functions-gen2-codelab",
  "version": "0.0.1",
  "main": "index.js",
  "dependencies": {
    "@google-cloud/functions-framework": "^2.0.0"
  }
}
EOF

This creates a folder named after the function and writes the source code into index.js and package.json. The index.js code comes from the lab, with the function name substituted from the FUNCTION_NAME variable; the package.json code is also taken from the lab.
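
If you want to smoke-test the function locally before deploying (optional; Cloud Shell has Node.js preinstalled), you can run it with the Functions Framework and send it a minimal CloudEvent like the one below:

npm install    # installs @google-cloud/functions-framework from package.json
npx functions-framework --target=$FUNCTION_NAME --signature-type=cloudevent &

# Send a minimal binary-mode CloudEvent to the local server
curl localhost:8080 \
  -H "Content-Type: application/json" \
  -H "ce-id: 1234" \
  -H "ce-specversion: 1.0" \
  -H "ce-type: google.cloud.storage.object.v1.finalized" \
  -H "ce-source: //storage.googleapis.com/projects/_/buckets/test-bucket" \
  -d '{"name": "test.txt"}'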

  • Creating the Cloud Storage function.

gcloud functions deploy $FUNCTION_NAME \
  --gen2 \
  --runtime nodejs20 \
  --entry-point $FUNCTION_NAME \
  --source . \
  --region $REGION \
  --trigger-bucket $BUCKET \
  --trigger-location $REGION \
  --max-instances 2

This command deploys a Cloud Function from the source code in the current folder. The runtime is set to nodejs20 and the environment is 2nd gen. The entry point is the function name from the FUNCTION_NAME variable, and the trigger is the Cloud Storage bucket in the given region. The last option caps max instances at 2 (this is the bit that is tricky for passing the lab and can't be done via the UI).
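
To test the deployed function (the task also asks you to test it), upload any file to the bucket and then read the function's logs:

echo "hello" > test-file.txt
gsutil cp test-file.txt $BUCKET

# Log entries may take a minute or two to appear after the upload
gcloud functions logs read $FUNCTION_NAME --region $REGION --gen2 --limit 10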

  6. Task 3: Create and deploy an HTTP function (2nd gen) with minimum instances

cd ..

mkdir ~/$HTTP_FUNCTION && cd $_

touch index.js && touch package.json

cat > index.js <<EOF
const functions = require('@google-cloud/functions-framework');
functions.http('$HTTP_FUNCTION', (req, res) => {
  res.status(200).send('subscribe to quikclab');
});
EOF

cat > package.json <<EOF
{
  "name": "nodejs-functions-gen2-codelab",
  "version": "0.0.1",
  "main": "index.js",
  "dependencies": {
    "@google-cloud/functions-framework": "^2.0.0"
  }
}
EOF

This creates a folder named after the HTTP function and writes the source code into index.js and package.json. The index.js code comes from the lab, with the function name substituted from the HTTP_FUNCTION variable; the package.json code is also taken from the lab.
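
As in Task 2, you can optionally run the HTTP function locally before deploying (the default signature type is http, so no extra flag is needed):

npm install
npx functions-framework --target=$HTTP_FUNCTION &
curl localhost:8080    # should respond: subscribe to quikclab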

  • Creating the HTTP function.

gcloud functions deploy $HTTP_FUNCTION \
  --gen2 \
  --runtime nodejs20 \
  --entry-point $HTTP_FUNCTION \
  --source . \
  --region $REGION \
  --trigger-http \
  --timeout 600s \
  --max-instances 2 \
  --min-instances 1

This command deploys a Cloud Function from the source code in the current folder. The runtime is set to nodejs20 and the environment is 2nd gen. The entry point is the function name from the HTTP_FUNCTION variable, and the trigger is HTTP in the given region, with the timeout set to 600s. The last options set min instances to 1 (to limit cold starts) and max instances to 2.
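
To test the deployed HTTP function, you can fetch its URL and call it with curl. Since we didn't pass --allow-unauthenticated, the request may need an identity token:

# Get the gen2 function's URL, then call it with an identity token
HTTP_URL=$(gcloud functions describe $HTTP_FUNCTION \
  --region $REGION --gen2 --format='value(serviceConfig.uri)')

curl -H "Authorization: Bearer $(gcloud auth print-identity-token)" $HTTP_URL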

Ref: https://github.com/atulguptag/Google-Cloud-Challenge-Labs-Soluitons/blob/main/Cloud%20Functions%203%20Ways%20Challenge%20Lab.md
