Mental Health Classification Model - BigQuery Data and Vertex AI AutoML

Harsh

Tasks

Create, train, and deploy a mental health (Depression / SuicideWatch) classification model as a service in Vertex AI AutoML, with data sourced from a BigQuery dataset.

Prerequisites

  • A browser, such as Chrome or Firefox.

  • A Google Cloud project with billing enabled.

Keep these links handy if you wish to follow along:

  1. https://www.abirami.dev/code-vipassana-season2

  2. Docs link

  3. Codelab link

Let's Start

Create a Project

  1. On the project selector page, create a Google Cloud project.

  2. Enable the BigQuery and Vertex AI APIs.

  3. Open the Google Cloud Shell (you can also enable the APIs from there, as sketched below).
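
If you prefer the command line, both APIs can be enabled directly from Cloud Shell. A minimal sketch, assuming the gcloud CLI is already pointed at your new project:

     # Enable the BigQuery and Vertex AI APIs for the current project
     gcloud services enable bigquery.googleapis.com aiplatform.googleapis.com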

Creating and Loading the Dataset

  1. Use the bq mk command to create a dataset called "mental_health":
    NOTE: It's mental_health, not mental-health, because BigQuery dataset names cannot contain hyphens.

     bq mk --location=us-central1 mental_health
    
  2. Clone the repository (for data file) and navigate to the project:

     git clone https://github.com/AbiramiSukumaran/mental-health

     cd mental-health
    
  3. Use the bq load command to load your CSV file into a BigQuery table:

     bq load --source_format=CSV --skip_leading_rows=1 \
       mental_health.mental_state \
       ./mental-health.csv \
       text:string,label:string
    

    The command should finish with the CSV rows loaded into the mental_health.mental_state table.

  4. Using the BigQuery web UI, verify the load by running the query below (a label-distribution sanity check follows this list):

     SELECT text FROM mental_health.mental_state LIMIT 3;
    
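Since this is a classification dataset, it is also worth confirming that both labels are well represented before training. A minimal sketch using the bq CLI (the exact label values depend on the CSV contents):

     # Count rows per label to check class balance
     bq query --use_legacy_sql=false \
       'SELECT label, COUNT(*) AS n FROM mental_health.mental_state GROUP BY label'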

Using BigQuery Data in Vertex AI

  1. Open Vertex AI, select Datasets, and create a new dataset.

  2. Select the third option, i.e., Select a table or view from BigQuery.

  3. Browse to the BigQuery table we created earlier and select it.

  4. Once the dataset is created, you should see the Analyze page (a CLI sketch to confirm the dataset exists follows this list). Click Train New Model and select "Others".

  5. Select Classification as the Objective, select AutoML as the Model training method, and click Continue.

  6. Give your model a name, select "label" as the Target Column from the dropdown, and click Continue.

  7. Set the Budget (Maximum Node Hours) to 1.

  8. Click Start Training to begin training your new model.
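
If you want to confirm from Cloud Shell that the Vertex AI dataset was registered, the gcloud CLI can list datasets per region. A minimal sketch, assuming everything lives in us-central1:

     # List Vertex AI datasets in the region to confirm the new dataset exists
     gcloud ai datasets list --region=us-central1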

Deploy the Model

  1. View and evaluate the training results.

  2. From the Deploy and Test tab, deploy the model to an endpoint, and click Continue.

  3. On the next page, in Model Settings, scroll down to the Explainability Options and make sure they are enabled.

  4. Click Deploy once you have finished the deployment configuration (a CLI sketch for finding the endpoint ID follows this list).
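
The test below needs the ID of the endpoint you just created. You can copy it from the console, or list endpoints from Cloud Shell. A minimal sketch, assuming the model was deployed in us-central1:

     # List endpoints in the region; note the ID of the one you just created
     gcloud ai endpoints list --region=us-central1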

Test the Deployed Model

ENDPOINT_ID="YOUR_ENDPOINT_ID"
PROJECT_ID="YOUR_PROJECT_ID"

curl \
-X POST \
-H "Authorization: Bearer $(gcloud auth print-access-token)" \
-H "Content-Type: application/json" \
https://us-central1-aiplatform.googleapis.com/v1/projects/${PROJECT_ID}/locations/us-central1/endpoints/${ENDPOINT_ID}:predict \
-d '{"instances": [{ "text": "Last week I did the most disgusting thing by cheating on my husband with his best friend, who was the best man at our wedding. We have only been married for a year and a half. We have a 6 month old son. He found out a couple of days after it happened. I feel disgusting and awful and that I devour any happiness he once had. I think about disappearing constantly. Every time I am in the car I have an urge just to drive into a tree. I cannot believe I have done this to such a beautiful person. I feel like I need to remove myself from existence so he can start to heal. I deserve everything bad and terrible. I do not want to because anymore pain. I need to disappear. Cheated on my husband with his best friend"}]}'

You should see a JSON response with the predicted labels and their corresponding confidence scores.
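
To pull out just the prediction payload, you can pipe the response through jq. A minimal sketch (jq must be installed; the sample text here is a hypothetical stand-in, and the exact field names in the response depend on the model type):

     # Send a short hypothetical instance and pretty-print only the predictions
     curl -s -X POST \
       -H "Authorization: Bearer $(gcloud auth print-access-token)" \
       -H "Content-Type: application/json" \
       "https://us-central1-aiplatform.googleapis.com/v1/projects/${PROJECT_ID}/locations/us-central1/endpoints/${ENDPOINT_ID}:predict" \
       -d '{"instances": [{ "text": "I feel hopeless and alone" }]}' \
       | jq '.predictions'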

Clean Up

To avoid any unnecessary charges, shut down or delete the project you created above.
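
If you created a dedicated project for this exercise, deleting the whole project is the simplest way to remove everything. A minimal sketch, assuming the gcloud CLI (replace YOUR_PROJECT_ID):

     # Schedule the entire project for deletion (recoverable only for a limited grace period)
     gcloud projects delete YOUR_PROJECT_ID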

Key Takeaways

  • How to take real-world data and build a simple ML model on GCP using point-and-click tooling.

  • How to create a simple classification model.

  • How to deploy the model to make it available as an API for other applications.
