AI Wow => How #chaicode

Daksh Gartan
2 min read

Wide overview

Hi everyone!!

AI models work on a neural network architecture called the transformer. An AI model is trained on data collected from various sources; that data is fed into a transformer, and the model learns from it. In this blog we're going to understand the under-the-hood working of the transformer, step by step.

Let's understand each step.

INPUT QUERY AND ENCODING

First, the model takes your input and encodes it into numbers based on the vocabulary knowledge built into it.

For example, here "a" is assigned the number 1, and the same goes for every other token. This vocabulary knowledge is fitted inside the model, and on the basis of this vocab the transformer encodes the whole text.
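A rough sketch of this idea in Python (the vocabulary here is completely made up; real models use tens of thousands of sub-word tokens learned from data):

```python
# Toy vocabulary: each known word gets a fixed number.
vocab = {"what": 1, "is": 2, "the": 3, "capital": 4, "of": 5, "india": 6, "?": 7}

def encode(text):
    # split on spaces and map every known word to its ID
    return [vocab[word] for word in text.lower().split() if word in vocab]

print(encode("What is the capital of india ?"))  # [1, 2, 3, 4, 5, 6, 7]
```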

Making relations

Vector Embedding

The embedding layer first transforms the encoded data into vectors (arrays of numbers) and then places these vectors in a multi-dimensional space (imagine a 3D graph). Words with related meanings end up close together, and the model uses these relationships, which get embedded inside it, to predict the output.
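A tiny sketch of what an embedding lookup looks like, with a made-up random table (real models learn these numbers during training and use hundreds of dimensions, not 3):

```python
import numpy as np

# Toy embedding table: one row of 3 numbers per token ID.
embedding_table = np.random.rand(8, 3)  # 8 token IDs, 3 dimensions each

token_ids = [1, 2, 3, 4, 5, 6, 7]    # the encoded input from the previous step
vectors = embedding_table[token_ids]  # look up one vector per token

print(vectors.shape)  # (7, 3) -> 7 tokens, each now a point in 3D space
```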

Here all four words were in the vocabulary knowledge of the model. So when you input "capital of India?", it makes the relation: OK, "Washington DC" sits (x=2, y=3, z=4) away from "USA", and "India" sits a similar offset away from "USA", so the capital of India should definitely sit about the same distance away from "India" as "Washington DC" does from "USA".
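Here is that reasoning as toy vector arithmetic; the numbers are invented just to show the offset trick:

```python
import numpy as np

# Made-up 3D vectors purely to illustrate the idea, not real embeddings.
usa        = np.array([1.0, 1.0, 1.0])
washington = np.array([3.0, 4.0, 5.0])   # usa + (2, 3, 4)
india      = np.array([1.0, -1.0, 1.0])  # a small offset away from usa

# The "capital of" relationship is roughly the same offset everywhere,
# so the model guesses: capital(india) ≈ india + (washington - usa)
predicted_capital = india + (washington - usa)
print(predicted_capital)  # [3. 2. 5.] -> should land near the vector for "New Delhi"
```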

Semantic meaning

At this step the transformer basically aligns the data semantically, arranging the vectors so that words which make more sense together sit closer to each other.
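One common way to check whether two vectors are "semantically close" is cosine similarity. A small sketch with made-up vectors:

```python
import numpy as np

def cosine_similarity(a, b):
    # 1.0 -> vectors point the same way (very similar meaning),
    # near 0 -> unrelated meaning.
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

cat = np.array([0.9, 0.8, 0.1])
dog = np.array([0.85, 0.75, 0.2])
car = np.array([0.1, 0.2, 0.95])

print(cosine_similarity(cat, dog))  # high -> similar meaning
print(cosine_similarity(cat, car))  # low  -> unrelated meaning
```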

MAKING THE ENCODED DATA READY FOR OUTPUT

Multi-head attention

The model creates several attention heads. Each head checks the whole sentence, decides which words are more important to each other, and produces its own arrangement of that information.
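A heavily simplified sketch of the idea; real multi-head attention uses learned weight matrices to build the queries, keys and values, while here each head just reuses a slice of the input vectors:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def attention(q, k, v):
    # every word scores every other word, softmax turns scores into weights,
    # and the output is a weighted mix of the value vectors
    scores = q @ k.T / np.sqrt(k.shape[-1])
    return softmax(scores) @ v

# 7 tokens, each a 6-dimensional vector (made-up numbers)
x = np.random.rand(7, 6)

# Two "heads": each looks at its own slice of the vectors and
# produces its own arrangement of which words matter to which.
num_heads, head_dim = 2, 3
heads = [attention(x[:, i*head_dim:(i+1)*head_dim],
                   x[:, i*head_dim:(i+1)*head_dim],
                   x[:, i*head_dim:(i+1)*head_dim])
         for i in range(num_heads)]

# The heads' outputs are glued back together for the next layer.
out = np.concatenate(heads, axis=-1)
print(out.shape)  # (7, 6)
```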

Softmax

At this step the model turns its final scores into probabilities and chooses the most likely next word.
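A minimal sketch of softmax picking the most likely next word, using made-up scores:

```python
import numpy as np

def softmax(scores):
    e = np.exp(scores - scores.max())
    return e / e.sum()

# Made-up final scores (logits) the model assigned to each candidate next word.
words  = ["delhi", "mumbai", "paris", "banana"]
scores = np.array([4.0, 2.5, 0.5, -1.0])

probs = softmax(scores)
print(dict(zip(words, probs.round(3))))
print("chosen:", words[np.argmax(probs)])  # "delhi" -> the most likely word
```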

Some other jargon

TEMPERATURE (CONTROLS CREATIVITY)

The creativity of a model can be controlled by configuring its temperature. The higher the temperature, the more creative (and random) the answers it will generate.
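A sketch of how temperature changes the softmax probabilities (same made-up scores as above):

```python
import numpy as np

def softmax_with_temperature(scores, temperature):
    scaled = scores / temperature      # low temp sharpens, high temp flattens
    e = np.exp(scaled - scaled.max())
    return e / e.sum()

scores = np.array([4.0, 2.5, 0.5, -1.0])

print(softmax_with_temperature(scores, 0.5))  # one word dominates -> safe, predictable
print(softmax_with_temperature(scores, 2.0))  # probabilities spread out -> more "creative"
```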

KNOWLEDGE CUTOFF

It is the last date up to which the model's training data goes. Anything that happened after that date is not known to the model.
