Installing dependencies for learning PySpark
![Harvey Ducay](https://cdn.hashnode.com/res/hashnode/image/upload/v1727228901972/52b805f8-fec1-4cda-8340-0d7b138b5de8.jpeg?w=500&h=500&fit=crop&crop=entropy&auto=compress,format&format=webp)
![](https://cdn.hashnode.com/res/hashnode/image/upload/v1727227207963/83980cf4-ac65-4892-b944-8985be86d223.png)
I’ve had a lot of issues installing PySpark locally, and even with all the support from online forums such as Stack Overflow, I still wasn’t able to fix the dependency problems that kept PySpark from running. So I told myself: maybe this is the time to start using cloud computing, such as Google Colab, both for learning and for testing some production-ready deployments. At the very least, by using Google Colab I’ll be much closer to where I’ll eventually be deploying, on GCP…