Hybrid Local/Cloud-VM Setups & Solutions for Resource-Intensive Python Projects
Unrooted Android Local Termux Linux Userland PRoot Container of Debian GNU/Linux & Oracle Cloud VM of Debian GNU/Linux
Setting up a hybrid environment that allows your unrooted local Debian PRoot on Termux to interact seamlessly with a cloud virtual machine (VM) is a great way to combine convenience with processing power. This approach enables you to develop and test code locally on your Android tablet whilst offloading resource-intensive tasks to a cloud VM when necessary.
There are a number of ways you can achieve this:
1. Establish Secure Remote Access Between Local and Cloud Environments
SSH Connection
Install OpenSSH Client on Termux:
Make sure you have the SSH client installed in your Termux environment:
pkg install openssh
Connect to Your Cloud VM:
From your Termux terminal, you can establish an SSH connection to your cloud VM:
ssh username@your_cloud_vm_ip
Set Up SSH Key-Based Authentication:
For secure and password-less login:
Generate an SSH key pair on your tablet:
ssh-keygen -t rsa -b 4096
Copy your public key to the cloud VM:
ssh-copy-id username@your_cloud_vm_ip
2. Synchronize Code Between Local and Cloud Environments
Option A: Use Git for Version Control
Initialize a Git Repository Locally:
In your project directory on the tablet:
git init
Use a Remote Repository:
Host your repository on platforms like GitHub or GitLab.
Alternatively, set up a bare Git repository on your cloud VM.
Workflow:
Commit and push changes from your local environment.
Pull the latest code on the cloud VM when you need to run tasks there.
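As a minimal sketch of that round trip (the branch name and paths are placeholders matching the examples below):
# On the tablet: record and publish local changes
git add .
git commit -m "Describe your change"
git push origin master
# On the cloud VM (run over SSH): fetch the latest code before heavy tasks
ssh username@your_cloud_vm_ip 'cd /path/to/remote/project && git pull'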
Option B: Use rsync for Direct Synchronization
Install rsync:
Ensure rsync is installed on both local and cloud environments:
apt-get install rsync
Sync Your Project:
From your tablet, synchronize your project directory to the cloud VM:
rsync -avz /path/to/local/project/ username@your_cloud_vm_ip:/path/to/remote/project/
Automate with Scripts:
Create a Bash script to automate syncing whenever you make changes.
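For example, a minimal sync script might look like the following (paths and host are placeholders to adjust for your setup):
#!/bin/bash
# sync_to_vm.sh - push the local project to the cloud VM
# -a preserves permissions and timestamps, -v is verbose, -z compresses in transit
LOCAL_DIR="/path/to/local/project/"
REMOTE="username@your_cloud_vm_ip:/path/to/remote/project/"
rsync -avz "$LOCAL_DIR" "$REMOTE"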
3. Remote Development Using SSHFS
Mount the Cloud Filesystem Locally:
Install sshfs on Termux:
pkg install sshfs
Create a Mount Point:
mkdir ~/cloud_project
Mount the Remote Directory:
sshfs username@your_cloud_vm_ip:/path/to/remote/project ~/cloud_project
Advantages:
Edit files locally with your preferred tools.
Changes are reflected directly on the cloud VM.
Unmount When Finished:
fusermount -u ~/cloud_project
4. Use Jupyter Notebooks Hosted on the Cloud VM
Set Up Jupyter Notebook on Cloud VM
Install Jupyter on the Cloud VM:
pip install jupyter
Launch Jupyter Notebook:
Run Jupyter Notebook on the cloud, binding it to all IP addresses:
jupyter notebook --ip=0.0.0.0 --no-browser
Access Jupyter Notebook from Your Tablet
Set Up an SSH Tunnel:
From your Termux terminal:
ssh -N -f -L localhost:8888:localhost:8888 username@your_cloud_vm_ip
Connect via Browser:
Open a web browser on your tablet and navigate to http://localhost:8888. You can now interact with the Jupyter Notebook running on the cloud VM.
5. Offload Resource-Intensive Tasks to the Cloud VM
Remote Execution of Scripts
Use SSH to Run Scripts on the Cloud VM:
Execute commands or scripts remotely:
ssh username@your_cloud_vm_ip 'python /path/to/remote/project/script.py'
Automate with Python Libraries:
Fabric:
Install Fabric in your local environment:
pip install fabric
Use Fabric to execute remote tasks:
from fabric import Connection

with Connection('username@your_cloud_vm_ip') as c:
    c.run('python /path/to/remote/project/script.py')
Use Remote Jupyter Notebooks for Data Processing
- Develop notebooks locally and run them on the cloud VM to utilize greater computational resources.
6. Synchronize Environments with Virtual Environments
Using conda Environments
Export Environment from Local Machine:
conda env export > environment.yml
Recreate Environment on Cloud VM:
conda env create -f environment.yml
Using pip and virtualenv
Create Requirements File Locally:
pip freeze > requirements.txt
Install Requirements on Cloud VM:
pip install -r requirements.txt
Benefits:
Ensures consistent environments across local and cloud setups.
Avoids dependency conflicts and version mismatches.
7. Clarify the Role of vcpkg
Understanding vcpkg:
vcpkg is a package manager for C and C++ libraries, primarily used for managing dependencies in C/C++ projects.
Alternative Solutions:
Since Docker is not available on an unrooted Android device, focus on using Python virtual environments (virtualenv, conda).
These tools help isolate your project's Python dependencies without requiring root access or containerization.
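As a quick sketch, an isolated environment can be created with Python's built-in venv module, either inside the Debian PRoot or on the cloud VM (the environment name and requirements file are placeholders):
# On Debian you may first need the venv module: apt-get install python3-venv
python3 -m venv ~/venvs/myproject
source ~/venvs/myproject/bin/activate
# Packages now install into the environment rather than system-wide
pip install -r requirements.txt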
8. Utilize Remote Development Tools
VS Code with Remote Development Extensions
Run VS Code Server (code-server) on Cloud VM:
Install code-server on the cloud VM to access VS Code via a web browser.
Installation instructions can be found on the code-server repository.
Access from Your Tablet:
Use your tablet's browser to interact with the remote VS Code instance.
This provides a rich development environment connected directly to your cloud resources.
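As a hedged sketch, installing and reaching code-server typically looks like this; the install script URL and default port are taken from the code-server project's documentation, so verify them against the repository before relying on them:
# On the cloud VM: install code-server via the project's install script
curl -fsSL https://code-server.dev/install.sh | sh
# Run it bound to localhost only (reachable through an SSH tunnel, not the open internet)
code-server --bind-addr 127.0.0.1:8080
# On the tablet (Termux): forward the port, then browse to http://localhost:8080
ssh -N -L 8080:localhost:8080 username@your_cloud_vm_ip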
Emacs, Neovim or Vim Over SSH
Use Command-Line Editors:
Access powerful editors like Vim, Neovim or Emacs directly on the cloud VM via SSH.
Customize these editors with plugins to enhance productivity.
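For example, you can open a remote file straight from Termux; the -t flag allocates a terminal so the editor runs interactively (the path is a placeholder):
ssh -t username@your_cloud_vm_ip 'vim /path/to/remote/project/script.py'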
9. Implement a Workflow for Project Management
Develop Locally, Execute Remotely
Local Development:
Write and test code on your tablet for immediate feedback.
Use lightweight datasets and test cases to ensure functionality.
Remote Execution:
For tasks that require more resources (e.g., large data processing, model training), run the code on the cloud VM.
This approach balances convenience with performance.
Automation Scripts
Create Bash or Python Scripts:
Automate the process of syncing code, running remote scripts, and retrieving results.
Example Bash script to sync code and execute remotely:
#!/bin/bash
rsync -avz /path/to/local/project/ username@your_cloud_vm_ip:/path/to/remote/project/
ssh username@your_cloud_vm_ip 'python /path/to/remote/project/main.py'
scp username@your_cloud_vm_ip:/path/to/remote/project/output/* /path/to/local/project/output/
10. Manage Data and Outputs Efficiently
Data Storage Best Practices
Store Large Data on Cloud VM:
- Keep big datasets on the cloud to avoid transferring large files over the network.
Transfer Only Necessary Files:
Download results or processed data as needed.
Compress files before transfer to reduce bandwidth usage.
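A minimal sketch of that pattern (paths are placeholders): compress the results on the VM, download a single archive, then unpack locally.
# On the cloud VM: bundle and compress the output directory
ssh username@your_cloud_vm_ip 'tar -czf /tmp/results.tar.gz -C /path/to/remote/project output/'
# Download the single compressed archive instead of many small files
scp username@your_cloud_vm_ip:/tmp/results.tar.gz /path/to/local/project/
# Unpack locally
tar -xzf /path/to/local/project/results.tar.gz -C /path/to/local/project/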
Use Cloud Storage Services
Leverage Services Like AWS S3 or Google Cloud Storage:
Store and retrieve data directly from your scripts.
Can be accessed from both local and cloud environments if needed.
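If you choose AWS S3, for instance, the official AWS CLI can move data from either environment; this is a sketch, the bucket name and paths are placeholders, and the CLI must already be installed and configured with credentials:
# Upload processed results from the cloud VM to a bucket
aws s3 cp /path/to/remote/project/output/results.csv s3://your-bucket-name/results/results.csv
# Download them later on the tablet (or any machine with configured credentials)
aws s3 cp s3://your-bucket-name/results/results.csv ~/project/output/results.csv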
11. Ensure Security and Cost Efficiency
Security Measures
Use Strong SSH Passwords and Keys:
- Protect access to your cloud VM with strong authentication practices.
Keep Software Updated:
- Regularly update packages to patch security vulnerabilities.
Configure Firewalls:
- Use the cloud provider's firewall settings to restrict access to essential ports only (e.g., SSH, Jupyter Notebook port).
Cost Management
Monitor Cloud Resource Usage:
Be aware of CPU, memory, and storage usage to manage costs.
Shut down or resize the VM when not in use.
Use Spot Instances or Reserved Instances:
- Depending on the cloud provider, these options can reduce costs for compute resources.
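As noted above, shutting the VM down when idle is the simplest saving; on Oracle Cloud this can be scripted with the OCI CLI. The following is a sketch that assumes the CLI is installed and configured, and the instance OCID is a placeholder you copy from the console:
# Stop the instance when you finish a work session
oci compute instance action --instance-id <instance_ocid> --action STOP
# Start it again before the next session
oci compute instance action --instance-id <instance_ocid> --action START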
Additional Considerations
Overcoming Limitations Without Docker
Use Ansible for Configuration Management:
Automate the setup of your cloud VM environment using Ansible playbooks.
This ensures consistency and saves time when provisioning new VMs.
By implementing these strategies, you can create a hybrid development environment that leverages both your local setup on the Android tablet and the computational power of a cloud VM. This setup allows you to:
Develop and test code locally with convenience.
Offload heavy computations to the cloud.
Maintain consistency across environments.
Optimize costs and resources effectively.
Oracle Cloud VM of Debian GNU/Linux
Oracle Cloud does offer a free tier that includes a VM instance. You can use this to run a Debian VM. Here’s a quick guide to get you started:
Sign up for an Oracle Cloud account: You’ll need to create an account and verify your email.
Navigate to the Free Tier: Once you’re in, look for the free tier options and select a VM instance.
Create a VM: Choose Debian as your OS and configure the VM according to your needs.
Launch the VM: Start the VM and access it via SSH.
Now let's walk through those steps to set up a Debian VM on Oracle Cloud in more detail:
Step 1: Log in to Oracle Cloud
Go to the Oracle Cloud Console: Open your web browser and go to the Oracle Cloud Console.
Log in: Use the credentials from your Oracle Cloud account to log in.
Step 2: Create a Virtual Machine
Select Compute: In the Oracle Cloud Console, click on the "Compute" section.
Create Instance: Click on "Instances" and then "Create Instance".
Choose Free Tier: Select the "Always Free" tier to use the free resources.
Select Image: Choose "Oracle Linux" as the image (since Debian isn't directly available, we'll convert it later).
Configure Instance: Set the instance name, shape (e.g., VM.Standard.E2.2), and other configurations as needed.
Add SSH Key: Add your SSH public key to the instance for secure access.
Step 3: Launch the Instance
Review and Create: Review the instance details and click "Create".
Wait for Instance: Wait for the instance to be created and go to the "Instances" page to find your new VM.
Step 4: Connect to the Instance
Find Public IP: Go to the "Details" page of your instance and note the public IP address.
SSH into the Instance: Open your terminal and use the following command to connect:
ssh opc@<public_ip_address>
Replace <public_ip_address> with the actual IP address of your instance.
Step 5: Install Debian
Update Packages: Once logged in, update the existing packages:
sudo yum update -y
Install Debian: Download the Debian ISO and install it:
wget https://cdimage.debian.org/debian-cd/current/amd64/iso-cd/debian-11.2.0-amd64-netinst.iso
sudo qemu-kvm -cdrom debian-11.2.0-amd64-netinst.iso -boot d
Follow Installation Steps: Follow the on-screen instructions to install Debian.
Step 6: Verify Installation
Check Debian Version: Once installed, check the Debian version:
cat /etc/debian_version
Install Additional Packages: Install any additional packages you need using apt-get.
That should get you up and running with a Debian VM on Oracle Cloud! If you run into any issues or need further assistance, feel free to ask. Happy cloud computing! 🌥️💻
Setting Up a Hybrid Development Environment with Ansible, Git, SSHFS, and Jupyter Labs
1. Set Up Oracle Cloud VM
Log in to Oracle Cloud: Access the Oracle Cloud Console and log in with your credentials.
Create a New VM Instance:
Go to the Compute section and create a new instance.
Select the "Always Free" tier.
Choose Debian as your operating system (if Debian is unavailable, choose a compatible Linux distribution like Ubuntu).
Configure the instance name, shape, and other settings.
Add your SSH public key for secure access.
Review and create the instance.
2. Provision the VM with Ansible
Create an Ansible Inventory File:
[oracle_vms]
cloud-vm ansible_host=<your_vm_ip> ansible_user=<your_vm_user>
Create an Ansible Playbook:
- hosts: oracle_vms
  become: yes
  tasks:
    - name: Update and upgrade apt packages
      apt:
        update_cache: yes
        upgrade: dist
    - name: Install required packages
      apt:
        name:
          - python3
          - python3-pip
          - git
          - build-essential
          - jupyterlab
        state: present
    - name: Install vcpkg
      git:
        repo: https://github.com/microsoft/vcpkg.git
        dest: /home/<your_vm_user>/vcpkg
      register: vcpkg_cloned
    - name: Bootstrap vcpkg
      command: ./bootstrap-vcpkg.sh
      args:
        chdir: /home/<your_vm_user>/vcpkg
      when: vcpkg_cloned.changed
Run the Ansible Playbook:
ansible-playbook -i inventory setup_vm.yml
3. Set Up SSHFS on Termux
Install SSHFS:
pkg install sshfs
Create a Mount Point:
mkdir -p /mnt/remote
Mount the Remote Filesystem:
sshfs user@cloud-vm:/remote/directory /mnt/remote
4. Set Up Git for Version Control
Initialize Git in Your Project:
git init
Add Your Files:
git add .
Commit Changes:
git commit -m "Initial commit"
Create a Repository on GitHub:
Create a new repository on GitHub.
Link your local project to the remote repository:
git remote add origin https://github.com/your-username/your-repo.git
git push -u origin master
5. Install Conda or Python Virtual Environment
Install Miniconda:
- Follow the installation instructions for your system.
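For a Debian cloud VM, a typical install is to download and run the official installer; the URL below is the standard Miniconda download for x86_64 Linux, but check the Miniconda page for the current link if in doubt:
wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh
bash Miniconda3-latest-Linux-x86_64.sh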
Create a Conda Environment:
conda create -n myenv python=3.8
conda activate myenv
Install Python Packages:
pip install <your-required-packages>
6. Set Up Jupyter Labs
Start Jupyter Labs Locally:
Ensure your current working directory is the mounted project directory:
cd /mnt/remote/your_project_directory
Start Jupyter Labs:
jupyter lab --no-browser --port=8888
Access Jupyter Labs in Your Browser:
- Open your browser and navigate to http://localhost:8888 to access Jupyter Labs working on the project files mounted from your VM.
7. Automate Syncing with SSHFS
By mounting the VM's project directory locally, you work on a single copy of the files, so there is no separate sync step to automate.
Note: SSHFS only shares files; anything you launch from the mounted directory still runs with your local (Termux) Python. To execute with the VM's Debian environment, run the command on the VM over SSH or use the Jupyter tunnel described earlier. Manage dependencies appropriately in both environments.
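A small helper sketch for mounting and unmounting the project (user, host, and paths are placeholders):
#!/bin/bash
# mount_project.sh - mount or unmount the cloud project over SSHFS
REMOTE="username@your_cloud_vm_ip:/path/to/remote/project"
MOUNT_POINT="$HOME/cloud_project"

case "$1" in
  up)
    mkdir -p "$MOUNT_POINT"
    sshfs "$REMOTE" "$MOUNT_POINT"
    echo "Mounted $REMOTE at $MOUNT_POINT"
    ;;
  down)
    fusermount -u "$MOUNT_POINT"
    echo "Unmounted $MOUNT_POINT"
    ;;
  *)
    echo "Usage: $0 up|down"
    ;;
esac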
This guide ensures you utilize the full power of your cloud VM while keeping your workflow efficient and straightforward. Happy coding and computing! 🚀
Integrating Conda and Jupyter Labs
Conda and Jupyter Labs actually work quite well together and don't have inherent conflicts. In fact, managing your Jupyter Lab environments with Conda is a common practice due to its ease of handling dependencies. Let's break it down:
Using Conda and Jupyter Labs Together
Create a Conda Environment:
conda create -n myenv python=3.8
conda activate myenv
Install Jupyter Labs in the Conda Environment:
conda install -c conda-forge jupyterlab
Start Jupyter Labs:
While inside your Conda environment, start Jupyter Labs:
jupyter lab --no-browser --port=8888
This approach ensures that Jupyter Labs runs with the specific Python environment and dependencies defined by Conda, keeping your project setup clean and organized.
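Optionally, you can also register the Conda environment as a named Jupyter kernel so it shows up in the JupyterLab launcher; this sketch assumes the myenv environment from the steps above:
# Inside the activated environment: make it selectable as a kernel in JupyterLab
conda install -c conda-forge ipykernel
python -m ipykernel install --user --name myenv --display-name "Python (myenv)"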
Jupyter Labs and GitHub
Jupyter Labs doesn’t directly push to GitHub, but you can use Jupyter Lab extensions or the terminal within Jupyter Labs to manage your Git operations:
Use the Terminal:
Open a terminal in Jupyter Labs and use standard Git commands:
git add .
git commit -m "Commit message"
git push origin master
Jupyter Lab Git Extensions:
Install the jupyterlab-git extension for a GUI to manage Git operations:
jupyter labextension install @jupyterlab/git
pip install jupyterlab-git
jupyter serverextension enable --py jupyterlab_git
This adds a Git panel to Jupyter Labs where you can perform Git operations.
Putting It All Together
Mount the VM Locally:
- Using SSHFS, mount the remote filesystem to your local directory.
Activate Conda Environment:
Ensure you’re working within the mounted directory:
cd /mnt/remote/your_project_directory
conda activate myenv
Start Jupyter Labs:
Start Jupyter Labs from the mounted directory:
jupyter lab --no-browser --port=8888
Use Git:
- Use the terminal within Jupyter Labs or the Git extension to manage your code with GitHub.
By integrating Conda and Jupyter Labs, and leveraging GitHub for version control, you create a robust and organized development environment.
Steps to Publish Jupyter Notebooks on GitHub
Publishing Jupyter Notebooks alongside your GitHub project is straightforward and a great way to share your work while keeping code and results together. Here’s a step-by-step guide to get you set up:
Create or Navigate to Your GitHub Repository:
If you haven't already, create a new repository on GitHub where your project will reside.
Navigate to your repository’s local directory:
cd /mnt/remote/your_project_directory
git init
Create a Jupyter Notebook:
Open Jupyter Labs:
jupyter lab --no-browser --port=8888
Create a new Jupyter Notebook or open an existing one. Save it within your project directory.
Save and Commit Your Notebook:
In Jupyter Labs, save your notebook.
Use the terminal or Jupyter Lab Git extension to stage and commit the notebook:
git add your_notebook.ipynb git commit -m "Add initial Jupyter Notebook"
Push to GitHub:
Push your changes to your GitHub repository:
git remote add origin https://github.com/your-username/your-repo.git
git push -u origin master
Update Regularly:
Continue working on your notebooks, saving, committing, and pushing changes as you progress:
git add your_notebook.ipynb git commit -m "Update Jupyter Notebook" git push
Benefits of Publishing on GitHub
Version Control: Track changes over time and revert to previous versions if needed.
Collaboration: Others can view, comment, and contribute to your project.
Visibility: Showcase your work and share it with the broader community.
Backup: Safeguard your work against data loss.
By following these steps, you ensure that your Jupyter Notebooks are well-integrated with your GitHub project, making your development process smoother and more organized.