Groqing with Groq's Plug-and-Play AI API

In this fast-paced, AI-driven tech era, everyone wants to get their hands dirty with AI tools and APIs. Devs in particular have a thing or three for APIs, so making calls to an AI API and watching things unfold is a must. But here's the catch: only a few vendors offer their APIs for "free"; the rest come at a fee.
On my expedition to find and try out free AI APIs, I came across Groq. It provides a host of free AI models, including some of its own, ranging from Meta's Llama to Google's Gemma and even OpenAI's open-weight models.
Here’s what you can do to get started using Groq’s API:
Log in to: https://groq.com/
Create an account —> sign up using Google/GitHub. Upon logging in, you can start by clicking on API Keys —> Create API Key.
For Sigma devs, you can figure out the next steps by reading the docs. (It’s fairly simple)
For those unaware who need a bit of spoon-feeding, follow along below.
Go to the API Reference under Documentation and use this snippet as an HTTP request:

```bash
curl https://api.groq.com/openai/v1/chat/completions \
  -s \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $GROQ_API_KEY" \
  -d '{
    "model": "llama-3.3-70b-versatile",
    "messages": [
      {
        "role": "user",
        "content": "Explain the importance of fast language models"
      }
    ]
  }'
```
If you're familiar with creating API endpoints on the server/client side, this should be fairly simple.
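If you'd rather make the same request from JavaScript without curl, here's a minimal sketch using the built-in `fetch` (Node 18+). Note that `buildRequest` is a hypothetical helper name of my own, not part of Groq's SDK:

```javascript
// Build the same request the curl command above sends.
// `buildRequest` is a made-up helper for illustration only.
function buildRequest(apiKey, prompt) {
  return {
    url: "https://api.groq.com/openai/v1/chat/completions",
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
      },
      body: JSON.stringify({
        model: "llama-3.3-70b-versatile",
        messages: [{ role: "user", content: prompt }],
      }),
    },
  };
}

// Usage (requires a real key in GROQ_API_KEY):
// const { url, options } = buildRequest(process.env.GROQ_API_KEY, "Hello!");
// const res = await fetch(url, options);
// console.log((await res.json()).choices[0].message.content);
```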
- For the new baby boomer devs, here's a layer below: create a basic JS file and try out this script (it can be found under Documentation —> Quickstart; select a programming template like Python, JS, etc.):

```javascript
import Groq from "groq-sdk";

// Reads the key from the GROQ_API_KEY environment variable.
const groq = new Groq({ apiKey: process.env.GROQ_API_KEY });

export async function getGroqChatCompletion() {
  return groq.chat.completions.create({
    messages: [
      {
        role: "user",
        content: "Explain the importance of fast language models",
      },
    ],
    model: "llama-3.3-70b-versatile",
  });
}

export async function main() {
  const chatCompletion = await getGroqChatCompletion();
  // Print the completion returned by the LLM.
  console.log(chatCompletion.choices[0]?.message?.content || "");
}
```
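Assuming you've saved the script as `groq-demo.mjs` (the filename is my own choice), a typical run looks like this:

```shell
# Install the Groq SDK the script imports.
npm install groq-sdk

# The script above only defines main(); add a `main();` call at the
# bottom of the file so something actually executes when you run it.
export GROQ_API_KEY="your_key_here"
node groq-demo.mjs
```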
In my next blog post, I'll share my detailed procedure and help you understand how this API call works.
PS: If the script is confusing or unfamiliar to any budding devs, I'd suggest practicing a few API calls in your preferred language first. This directly correlates to understanding Promises and the notorious async/await functions. Write a couple of such functions, and API calls like these will feel like a breeze. 🫡
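For that practice, here's a tiny self-contained sketch of the Promise/async-await pattern these API calls rely on. `fakeApiCall` is a stand-in of my own, not a real endpoint, so it runs without a key or network:

```javascript
// A stand-in for an HTTP request: resolves after a short delay,
// much like fetch() or the Groq SDK client would.
function fakeApiCall(prompt) {
  return new Promise((resolve) => {
    setTimeout(() => resolve(`You asked: ${prompt}`), 100);
  });
}

// `await` pauses this function (not the whole program) until the
// Promise settles, so the code reads top to bottom.
async function run() {
  const reply = await fakeApiCall("Explain fast language models");
  return reply;
}

run().then((reply) => console.log(reply));
```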
Written by Pratikshit Chowdhury
