Access Meta's Llama 3 on Your Phone for FREE!
Hi AI Enthusiasts,
Welcome to our weekly Magic AI, where we bring you exciting updates from the world of Artificial Intelligence (AI).
This week's Magic AI tool puts the power of Meta's Llama 3 in your hands! And the best part: you can also access other top-performing open-source models.
Let's explore this week's AI news together. Stay curious!
In today's Magic AI:
Apple unveils OpenELM for on-device AI
Sanctuary AI presents Gen 7 of its humanoid robot Phoenix
Microsoft releases a new SLM
Magic AI tool of the week
Hand-picked articles of the week
Our Books and recommendations for you
Top AI news of the week
Apple unveils OpenELM for on-device AI
Apple has quietly released OpenELM, an open-source family of small language models (SLMs). OpenELM stands for Open-source Efficient Language Models. It is designed to run locally on devices like an iPhone.
The family consists of eight models (four models are instruction-tuned) with four different parameter sizes (270M, 450M, 1.1B, 3B). According to Apple's research team, "OpenELM outperforms the recent open LLM, OLMo, by 2.36% while requiring 2× fewer pre-training tokens."
Apple also open-sourced CoreNet, the library used to train OpenELM. CoreNet is a library for training deep neural networks.
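If you want to try OpenELM yourself, the models are published on the Hugging Face Hub. Below is a minimal sketch using the transformers library; the Hub model ID, the borrowed Llama tokenizer, and the generation settings are our assumptions based on common Hub conventions, not an official Apple example.

```python
# Minimal sketch (our assumption, not an official Apple example):
# load the smallest instruction-tuned OpenELM model with Hugging Face transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "apple/OpenELM-270M-Instruct"  # assumed Hub ID of the 270M instruct model

# OpenELM does not ship its own tokenizer; a Llama-family tokenizer is typically used.
# Note: this tokenizer repo is gated, so you may need to accept its license on Hugging Face first.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")

# trust_remote_code is required because OpenELM ships its own model code.
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer("Explain on-device AI in one sentence.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```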
Our thoughts
We welcome the fact that Apple is open-sourcing these models. But the models themselves perform poorly: Apple's 3B OpenELM has an MMLU score of 24.8, while Microsoft's latest model, Phi-3-mini, has an MMLU score of 68.8!
So, with an MMLU of 24.8, we think Apple's models are not useful in the real world.
OpenELM uses a layer-wise scaling strategy that leads to enhanced accuracy and efficiency. Perhaps Apple mainly wanted to open-source this approach and published the models for that reason.
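To give a rough intuition for what layer-wise scaling means, here is a toy illustration (our simplified sketch, not Apple's exact formula): instead of giving every transformer layer the same width, configuration values such as the number of attention heads are gradually increased from the early to the late layers, so parameters are distributed non-uniformly across the depth.

```python
# Toy illustration of layer-wise scaling (simplified, not Apple's exact formula):
# the per-layer number of attention heads is interpolated between a small and a
# large value, instead of being identical for every layer.
num_layers = 12
min_heads, max_heads = 4, 16

for i in range(num_layers):
    # linear interpolation from the first to the last layer
    heads = round(min_heads + (max_heads - min_heads) * i / (num_layers - 1))
    print(f"layer {i:2d}: {heads} attention heads")
```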
More information
- OpenELM: An Efficient Language Model Family with Open-source Training and Inference Framework - arXiv
Sanctuary AI presents Gen 7 of its humanoid robot Phoenix
Sanctuary AI has released the seventh generation of its humanoid robot Phoenix. The robot has numerous hardware improvements, and Sanctuary AI also promises the first "human-like intelligence in general-purpose robots".
In addition, the robot can learn new tasks in less than 24 hours. Gen 7 brings longer operating times and significantly lower cost of manufacture. At the beginning of April, the robotics company announced that the global mobility technology company Magna would use the Phoenix robot in production.
Want to learn more about this robot? Then, take a look at the official video.
Our thoughts
We had never heard of this robot before, but we think such robots are well suited to industrial use cases such as logistics.
They can also be helpful in high-risk environments. Even if military use may be hard to prevent, we believe robots should be used for peaceful and beneficial purposes rather than for military ones.
More information
Sanctuary AI Unveils the Next Generation of AI Robotics - Sanctuary AI
Sanctuary AI Expands General Purpose Robot Footprint in Automotive Manufacturing Industry - Sanctuary AI
Microsoft releases a new SLM
As we mentioned above, Microsoft has released the new small language model (SLM) Phi-3-mini with 3.8B parameters. According to Microsoft, "Phi-3 models are the most capable and cost-effective small language models (SLMs) available, outperforming models of the same size and next size up across a variety of language, reasoning, coding, and math benchmarks".
The new model is available on Microsoft Azure, Hugging Face and Ollama. It comes in two context-length variants with 4K and 128K tokens. In addition, the model is ready to use out-of-the-box because it is instruction-tuned.
In the coming weeks, Microsoft plans to add the models Phi-3-small (7B) and Phi-3-medium (14B) to the Phi-3 family. Despite its smaller size, Phi-3-mini's benchmark results compete with those of Mixtral and GPT-3.5. Its small size makes it perfect for on-device AI.
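If you want to try Phi-3-mini yourself, here is a minimal sketch using the transformers text-generation pipeline; the Hub model ID and the chat prompt format are assumptions taken from the model's Hugging Face listing, so check the model card before relying on them.

```python
# Minimal sketch (assumed Hub ID and prompt format, see the Phi-3-mini model card):
# run the instruction-tuned 4K-context variant locally with transformers.
from transformers import pipeline

pipe = pipeline(
    "text-generation",
    model="microsoft/Phi-3-mini-4k-instruct",  # assumed Hub ID
    trust_remote_code=True,
)

# Phi-3 uses a simple chat template with <|user|> / <|assistant|> tags (assumption).
prompt = "<|user|>\nWhy do small language models matter for on-device AI?<|end|>\n<|assistant|>\n"
out = pipe(prompt, max_new_tokens=120, do_sample=False)
print(out[0]["generated_text"])
```

Alternatively, the model is also listed on Ollama, so a single command such as `ollama run phi3` (the model name is our assumption) should be enough to chat with it locally.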
What is your opinion about on-device AI?
Our thoughts
Microsoft shows what you can get out of small language models. The capabilities of Phi-3-mini are an important step towards efficient on-device AI. In our opinion, on-device AI is an important topic, especially for data protection reasons. We are excited to see what Apple will present for on-device AI.
More information
Introducing Phi-3: Redefining what's possible with SLMs - Azure Blog
Phi-3 Technical Report: A Highly Capable Language Model Locally on Your Phone - arXiv
Magic AI tool of the week
HuggingChat App - Access Meta's Llama 3 on Your Phone
Meta's Llama 3 is the most powerful open-source LLM out there, released just last week! With this week's Magic AI Tool, you can access it on your phone.
It's easy and takes only the following steps.
Step-by-Step Guide:
Download the HuggingChat app from the Apple App Store. Unfortunately, there is currently no Android app, but you can also access it in the browser!
Start the app. Sign in or sign up with a Hugging Face account or your Apple ID.
Click on the settings icon and select the model "meta-llama/Meta-Llama-3-70B-Instruct".
Now the model is ready! You can start prompting. Have fun with it!
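No iPhone at hand? Besides the browser version of HuggingChat, you can also query the same model with a few lines of Python via the Hugging Face Inference API. Below is a minimal sketch using the huggingface_hub library; free-tier availability, rate limits, and the need to accept the Llama 3 license are assumptions you should verify.

```python
# Minimal sketch: query Meta-Llama-3-70B-Instruct through the Hugging Face Inference API.
# Requires a Hugging Face access token; model availability and rate limits may vary.
from huggingface_hub import InferenceClient

client = InferenceClient(
    model="meta-llama/Meta-Llama-3-70B-Instruct",
    token="hf_xxx",  # placeholder: replace with your own Hugging Face access token
)

response = client.chat_completion(
    messages=[{"role": "user", "content": "Give me three fun facts about llamas."}],
    max_tokens=200,
)
print(response.choices[0].message.content)
```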
You should also check out our other AI Tool recommendations!
Articles of the week
You might also be interested in these blog articles:
Thanks for reading, and see you next time.
- Tinz Twins
P.S. Have a nice weekend!
* Disclosure: The links are affiliate links, which means we will receive a commission if you purchase through these links. There are no additional costs for you.