This might be the biggest AI hackathon ever:
* >6,300 registrants
* Runs for 2 weeks (Nov. 14-30)
* Open virtually to anyone, anywhere
* $20,000 in cash prizes + $3.5M+ in sponsor credits
Hosted by @Anthropic and @Gradio, along with 10 sponsors. Join the kickoff in 30 minutes 👇
Sun's out, models out. 😎
@IBM & @NASA dropped Surya, an open-source heliophysics model trained on 14 years of observations from NASA’s Solar Dynamics Observatory, and it's 🔥🔥🔥.
Super happy to announce that we are acquiring @pollenrobotics to bring open-source robots to the world! 🤖
Since @RemiCadene joined us from Tesla, we’ve become the most widely used software platform for open robotics thanks to @LeRobotHF and the Hugging Face Hub. Now, we’re https
We are excited to partner with @AIatMeta to welcome Llama 4 Maverick (402B) & Scout (109B), natively multimodal language models, on the Hugging Face Hub with Xet 🤗
Both MoE models were trained on up to 40 trillion tokens, pre-trained on 200 languages, and significantly outperform i
@deepseek_ai Congratulations on the stellar release! 🤩
The model checkpoints and a detailed report - truly Christmas is here!
https://huggingface.co/collections/deepseek-ai/deepseek-v3
We passed 5 million users.
🥳That's 5 million of you who have signed up on the Hub 🚀 thank you for contributing to the ecosystem and making open Machine Learning happen!
We're just getting started 🤗
Hugging Face 🫶 @GoogleColab
With the latest release of huggingface_hub, you don't need to manually log in anymore. Create a secret once and share it with every notebook you run. 🤗
pip install --upgrade huggingface_hub
Check it out! 👇
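Under the hood, the library resolves your token automatically from the environment or a stored secret, so no manual login call is needed. A minimal sketch using today's huggingface_hub API (the token value below is a placeholder, not a real credential):

```python
import os

# Placeholder credential for illustration only. In Colab you would instead add
# the token once under the "Secrets" (🔑) sidebar, and it is shared with every
# notebook you run; an environment variable behaves the same way elsewhere.
os.environ["HF_TOKEN"] = "hf_placeholder_token"

from huggingface_hub import get_token

# get_token() resolves the token automatically (env var, Colab secret, or a
# previously cached login), so downstream calls authenticate transparently.
token = get_token()
print(token)
```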
Code Llama: Now on Hugging Chat 💻🦙
Try out the 34B Instruct model for free with super fast inference!
👉 https://huggingface.co/chat/
TRL 🤗 Hugging Face
Excited to announce that we're doubling down on our efforts to democratize RLHF and reinforcement learning with TRL, a new addition to the @huggingface family, developed and led by team member @lvwerra 🎉🎉
Train your first RLHF model 👉https://github.com/huggingf
Hugging Face is now part of the PyTorch Foundation as a premier member 🤝
We have been collaborating with the PyTorch team for the past four years and are committed to supporting the project.
We share an objective: to lower the barrier of entry to ML.
https://pytorch.org/blog/h
Llama 2: Now on Hugging Chat 🤗🦙
Try out the 70B Chat model for free with super fast inference and web search, all powered by open-source tools!
👉 https://huggingface.co/chat/
At Hugging Face, we are working to enable you to easily build and serve your own LLMs 🧑‍💻👨‍💻👩‍💻
In this blog, we talk about the amazing world of open-source LLMs, the challenges, and how the Hugging Face ecosystem can help you 🪐
Read about them here 👉https://huggingface.co/blog/o
We are looking into an incident where a malicious user took control of the Hub organizations of Meta/Facebook & Intel via reused employee passwords that were compromised in a data breach on another site. We will keep you updated 🤗
📣 Calling all game dev and AI enthusiasts!🎮
Already 400 people have signed up for the first Open Source AI Game Jam, where you'll use AI tools to make a game in a weekend 🔥
Sign up here 👉 https://itch.io/jam/open-source-ai-game-jam
What AI tools? Let's focus today on Audio tools 🔊
⬇
🚨Exciting news! Next week, we’ll be launching a brand-new Audio Course! 🤗
Sign up today (https://huggingface.us17.list-manage.com/subscribe?u=7f57e683fa28b51bfc493d048&id=7869673053) and join us for a LIVE course launch event featuring amazing guests like @DynamicWebPaige, Seokh
🤗 Transformers has been built by, with, and for the community.
Reaching 100k ⭐ on GitHub is a testament to ML's reach and the community's will to innovate and contribute.
To celebrate, we're highlighting 100 incredible projects built around transformers.
https://github.com/huggingfa
The first RNN in transformers! 🤯
Announcing the integration of RWKV models in transformers with @BlinkDL_AI and RWKV community!
RWKV is an attention-free model that combines the best of RNNs and transformers.
Learn more about the model in this blogpost: https://huggingface.co/b
We just released Transformers' boldest feature: Transformers Agents.
This removes the barrier of entry to machine learning
Control 100,000+ HF models by talking to Transformers and Diffusers
Fully multimodal agent: text, images, video, audio, docs...🌎
https://huggingface.co/d
SAM, the groundbreaking segmentation model from @Meta, is now available in 🤗 Transformers!
What does this mean?
1. One line of code to load it, one line to run it
2. Efficient batching support to generate multiple masks
3. pipeline support for easier usage
More details: 🧵 htt
THIS IS BIG! 👀
It's now possible to take any of the >30,000 ML apps from Spaces and run them locally (or on your own infrastructure) with the new "Run with @Docker" feature. 🔥🐳
See an app you like? Run it yourself in just 2 clicks 🤯
Today we are excited to announce a new partnership with @awscloud! 🔥
Together, we will accelerate the availability of open-source machine learning 🤝
Read the post 👉 https://huggingface.co/blog/aws-partnership
It's been an exciting year for 🤗Transformers. We tripled the number of weekly active users over 2022, with over 1M users most weeks now and 300k daily pip installs on average 🤯
Scikit-Learn and 🤗 join forces!
With a growing number of tabular classification & regression checkpoints, we believe statistical ML has its place on the HF Hub.
We're excited to partner with sklearn, the champion of statistical ML, and move forward together.
https://blog.scikit-l
Transformers v4.22 is out, and includes the first VIDEO models! 🎥
💥VideoMAE: masked auto-encoders for video
💥X-CLIP: CLIP for video-language
Other nice goodies:
💥Swin Transformer v2
💥Pegasus-X
💥Donut
💥MobileViT
... and macOS support (device="mps")!
🖌️ Stable Diffusion meets 🧨Diffusers!
Releasing diffusers==0.2.2 with full support for @StabilityAI's Stable Diffusion & schedulers 🔥
Google colab:
👉 https://colab.research.google.com/github/huggingface/notebooks/blob/main/diffusers/stable_diffusion.ipynb
Code snippet 👇
🧨Diffusion models have been powering impressive ML apps like DALL-E and Imagen
Introducing 🤗 diffusers: a modular toolbox for diffusion techniques, with a focus on:
🚄Inference pipelines
⏰Schedulers
🏭Models
📃Training examples
https://github.com/huggingface/diffusers
Last week, @MetaAI introduced NLLB-200: a massive translation model supporting 200 languages.
Models are now available through the Hugging Face Hub, using 🤗Transformers' main branch.
Models on the Hub: https://huggingface.co/facebook/nllb-200-distilled-600M
Learn about NLLB-20
The Technology Behind BLOOM Training🌸
Discover how @BigscienceW used @MSFTResearch DeepSpeed + @nvidia Megatron-LM technologies to train the World's Largest Open Multilingual Language Model (BLOOM):
https://huggingface.co/blog/bloom-megatron-deepspeed
Machine learning demos are increasingly a vital part of releasing a model. Demos allow anyone, not just ML engineers, to try a model, give feedback on predictions, and build trust
That's why we are thrilled to announce @Gradio 3.0: a ground-up redesign of the Gradio library 🥳
Last week @MetaAI publicly released huge LMs, with up to ☄️30B parameters. A great win for open source 🎉
These checkpoints are now in 🤗transformers!
But how to use such big checkpoints?
Introducing Accelerate and
⚡️BIG MODEL INFERENCE⚡️
Load & USE the 30B model in Colab (!)
💫 Perceiver IO by @DeepMind is now available in 🤗 Transformers!
A general purpose deep learning model that works on any modality and combinations thereof
📜text
🖼️ images
🎥 video
🔊 audio
☁️ point clouds
...
Read more in our blog post: https://huggingface.co/blog/perceiver
Transformers v4.13.0 is out and it is *big*:
Vision:
- 🖼️ SegFormer
- 🖨️ ImageGPT
Audio:
- 🔡 Language model support for ASR
Multimodal:
- ⚖️ Vision-Text dual encoders
NLP:
- 🔣 mLUKE
- 🏅 DeBERTa-v3
Trainer:
- 1⃣6⃣ The Trainer now supports BF16/TF32!
🌠New doc frontend 🌠
TODAY'S A BIG DAY
Spaces are now publicly available
Build, host, and share your ML apps on @huggingface in just a few minutes.
There's no limit to what you can build. Be creative, and share what you make with the community.
🙏 @streamlit and @gradio
https://huggingface.co/sp
We're thrilled to partner with https://www.deeplearning.ai/ to create some great new content for their NLP Specialization on Coursera!
With this update, you can access exciting new material and lectures that cover the state of the art in NLP 🧑‍🏫
https://www.coursera.org/specia
EleutherAI's GPT-J is now in 🤗 Transformers: a 6-billion-parameter autoregressive model with crazy generative capabilities!
It shows impressive results in:
- 🧮Arithmetic
- ⌨️Code writing
- 👀NLU
- 📜Paper writing
- ...
Play with it to see how powerful it is:
https://huggingface.co/Eleuth