Welcome to the AIGuys Digest newsletter, where we cover cutting-edge AI advancements and all the top AI news.
Don’t forget to check out my new AI book; it covers lots of AI optimizations and practical code:
Ultimate Neural Network Programming with Python
In this issue:
- Latest advances: This month, it’s all about what’s new in AI and what is just a bunch of old ideas rehashed.
- Monthly AI News: Discover how these stories are revolutionizing industries and impacting everyday life: NVIDIA’s new voice-modifying AI, the challenges of scaling AI, and AI to identify domestic violence.
- Special from the editor: Interesting talks, conferences, and articles we’ve discovered recently.
Let’s embark on this journey of discovery together!
Follow me on Twitter and LinkedIn at RealAIGuys and AIGuysEditor.
Everything is evolving at such a rapid pace, with new models and strategies arriving every few weeks, that it becomes quite difficult to keep up. But if you look closely, you’ll see that little has changed except the scale of the computation and the data.
One way or another, we are still working with ideas that are ten years old. An example I like to give of the field not coming up with new ideas is the continued dominance of XGBoost and other tree-based models: most financial models still run on these, not on deep learning.
We are just evolving AI and not coming up with new ideas
Since the release of LLMs, we have been trying to reduce the memory footprint of our models. Over the years, we have developed many innovations, such as various forms of quantization, dropout, and more. We have even tried completely changing model architectures to solve Transformers’ scaling issues.
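To make the quantization idea concrete, here is a minimal, illustrative sketch of symmetric int8 weight quantization in NumPy. It is a toy example (one scale per tensor, random weights), not any particular library’s implementation, but it shows where the 4x memory saving over float32 comes from:

```python
import numpy as np

# Toy symmetric int8 quantization of a weight matrix (illustrative only).
rng = np.random.default_rng(0)
weights = rng.normal(size=(1024, 1024)).astype(np.float32)

scale = np.abs(weights).max() / 127.0           # one scale for the whole tensor
q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
dequant = q.astype(np.float32) * scale          # approximate reconstruction

print(weights.nbytes // q.nbytes)               # → 4 (int8 is 4x smaller than float32)
print(float(np.abs(weights - dequant).max()))   # small rounding error, at most scale/2
```

Real schemes (per-channel scales, GPTQ, AWQ, etc.) are more sophisticated, but the core trade of precision for memory is the same.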
Research like Flash Attention, RetNet, state space models, and many others shows great potential, but somehow the Transformer remains king. Today we look at brand-new research papers and see what’s happening in this space. Have we made real improvements or not?
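For reference, the bottleneck all of this research attacks is standard scaled dot-product attention. A minimal NumPy sketch (not Flash Attention itself) makes the quadratic cost visible: the n x n score matrix is exactly what Flash Attention avoids materializing in full.

```python
import numpy as np

def attention(Q, K, V):
    # Scaled dot-product attention. The (n x n) `scores` matrix is the
    # quadratic memory cost that Flash Attention tiles away.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # softmax over keys
    return weights @ V

rng = np.random.default_rng(0)
n, d = 8, 16
Q, K, V = rng.normal(size=(3, n, d))
out = attention(Q, K, V)
print(out.shape)  # → (8, 16)
```

State space models and RetNet sidestep this by replacing the softmax lookup with recurrent or linear-time mixing, which is why they scale better in sequence length on paper.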
Are small language models the future of scaling?
Recently we have heard a lot of noise about LLMs hitting a wall. Is this true? Is a new AI winter upon us? Or is it just a pause? To answer that, people need to actually understand what’s going on with scaling laws.
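The shape of the debate follows from the power-law form of scaling laws: loss falls as a power of model size, so each extra order of magnitude buys a smaller absolute improvement. Here is a hedged sketch with made-up constants (the exponent and offsets below are hypothetical, not fitted values from any paper):

```python
# Illustrative power-law scaling curve: loss = a * N^(-b) + c.
# The constants are hypothetical, chosen only to show the shape.
a, b, c = 10.0, 0.3, 1.7

for n_params in [1e8, 1e9, 1e10, 1e11]:
    loss = a * n_params ** (-b) + c
    print(f"{n_params:.0e} params -> loss {loss:.3f}")
```

Run it and the diminishing returns are obvious: loss keeps falling, but each 10x in parameters buys a smaller drop than the last. “Hitting a wall” and “no wall” are both readings of this same curve.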
It’s not difficult to find AI experts with completely opposing views on the future of AI. This reminds me of Kenneth Stanley’s book, “Why Greatness Cannot Be Planned”, which mainly argues that no one knows what it takes to make a breakthrough in a given field; the exact same thing has happened recently with AI.
Over the past few weeks, we’ve seen many major labs and researchers express disappointment over the diminishing returns of AI, while others promote it even more.
Are we reaching the limits of AI?
Nvidia presents an AI model capable of modifying voices and generating new sounds
Nvidia has unveiled Fugatto, an AI model capable of changing voices and generating new sounds, targeting music, movie and game creators. The company is cautious about public release due to possible misuse.
Press article: Click here
The Challenges of Advancing AI
Industry leaders from companies like OpenAI and Nvidia recognize potential slowdowns in AI progress due to limited computing power and data availability. Strategies to overcome these challenges include the use of multimodal data and synthetic data, as well as improving the reasoning capabilities of AI systems.
- The pace of improvement in AI models appears to be slowing, but some technology leaders say there is no wall.
- This has sparked a debate about how businesses can overcome AI bottlenecks.
Press article: Click here
AI can help police predict if someone is at risk of domestic violence
AI tools are being developed to help police predict the risk of domestic violence and analyze responses to specific questions to predict future incidents with high accuracy. This technology aims to strengthen prevention measures and support for people at risk.
AI ‘Lizzy’ gives the probability of physical violence within three months with 84% accuracy and could soon be made available to UK forces.
Press article: Click here
- Visualizing Transformers and Attention | Talk at TNG Big Tech Day ’24 Click here
- Geoff Hinton | Will digital intelligence replace biological intelligence? | Vector’s Remarkable 2024 Click here
- AI Lecture Series: “How Could Machines Achieve Human-Level Intelligence?” by Yann LeCun Click here
- Unreasonably Effective AI with Demis Hassabis: Click here
Join the conversation: Your thoughts and ideas are valuable to us. Share your views and let’s build a community where knowledge and ideas flow freely. Follow us on Twitter and LinkedIn at RealAIGuys and AIGuysEditor.
Thank you for being part of the AIGuys community. Together, we are not only witnessing the AI revolution; we are part of it. Until next time, keep pushing the boundaries of what’s possible.
Your AIGuys Digest team